Spatial representation in the avian hippocampal formation

The representation of space in the hippocampal formation has been studied extensively over the past four decades, culminating in the 2014 Nobel Prize in Physiology or Medicine. However, an important question remained unresolved: Is the role of this brain structure in spatial processing restricted to mammals, or can we find its origins in other classes of vertebrates?

In this project we study two species of birds: the Japanese quail and the barn owl. While the quail is a ground-dwelling bird and an efficient forager, the barn owl is a nocturnal predator, tending to perch on high branches and scan its surroundings from afar, searching for distal visual and auditory cues.

We use techniques for recording from freely behaving animals to measure the neural activity of birds as they forage on the ground in 2D, search from afar, or fly through 3D space. A variety of avian behavioral tasks are devised to search for place cells, spatial-view cells, and other types of space-coding cells.
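As a hedged illustration of the analysis such recordings feed into, the sketch below computes the standard occupancy-normalized firing-rate map used to identify place cells; the function name, bin count, and uniform position-sampling assumption are illustrative choices, not a description of our actual pipeline.

```python
# Minimal sketch (illustrative, not the lab's pipeline) of an
# occupancy-normalized rate map for place-cell analysis.
import numpy as np

def rate_map(spike_times, pos_times, pos_xy, arena_size, n_bins=20):
    """Firing-rate map: spike count per spatial bin / time spent per bin."""
    # Interpolate the bird's position at each spike time.
    spike_x = np.interp(spike_times, pos_times, pos_xy[:, 0])
    spike_y = np.interp(spike_times, pos_times, pos_xy[:, 1])

    edges = [np.linspace(0, arena_size[0], n_bins + 1),
             np.linspace(0, arena_size[1], n_bins + 1)]

    # Spike counts per spatial bin.
    spikes, _, _ = np.histogram2d(spike_x, spike_y, bins=edges)

    # Occupancy: time spent in each bin, assuming uniformly sampled positions.
    dt = np.median(np.diff(pos_times))
    occupancy, _, _ = np.histogram2d(pos_xy[:, 0], pos_xy[:, 1], bins=edges)
    occupancy *= dt

    with np.errstate(invalid="ignore", divide="ignore"):
        return spikes / occupancy  # Hz; NaN where the bird never visited
```

A place cell would show up in such a map as a spatially restricted region of elevated firing rate.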

PROJECT LEADERS: Elhanan Ben Yishay, Ksenia Krivoruchko, Arpit Agarwal, Shaked Ron

Visual Search: Behavioral Experiments

Animals actively scan the environment to collect useful bits of information. This process, known as visual search, is widely studied in humans and other animals. What attracts the visual gaze, and how barn owls search for interesting stimuli, are the main questions asked in this project.

We combine kinematic measurements of head movements with tracking of the gaze point. Experiments are performed in spontaneously behaving owls as well as in owls trained to perform a controlled visual task on a computer screen. To measure the kinematics, we attach infra-red reflectors to the owl's head and track their positions using a Vicon system. To follow the owl's point of gaze in real time, a wireless miniature video camera is mounted on the head. Using these techniques, ongoing experiments address the following topics: combining visual and auditory information for saliency mapping, detection of camouflaged objects, pop-out perception, and the role of active head motions in visual perception. Results from the behavioral experiments are combined with results from physiological experiments conducted in our lab to gain an understanding of visual search mechanisms in barn owls.
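As a hedged illustration of how head pose, and from it gaze direction, can be recovered from the tracked reflectors, the sketch below fits a rigid-body rotation (the standard Kabsch algorithm) between a calibrated head-frame marker template and the marker positions reported by the tracking system; the marker template and the head-frame 'forward' axis are assumptions for illustration, not the lab's actual calibration.

```python
# Hedged sketch: head orientation from tracked infra-red reflectors
# via a rigid-body (Kabsch) fit. Marker template and 'forward' axis
# are illustrative assumptions.
import numpy as np

def head_rotation(template, tracked):
    """Best-fit rotation mapping head-frame marker template to tracked
    marker positions (both (n_markers, 3) arrays), via the Kabsch algorithm."""
    a = template - template.mean(axis=0)   # center both marker sets
    b = tracked - tracked.mean(axis=0)
    u, _, vt = np.linalg.svd(a.T @ b)
    d = np.sign(np.linalg.det(u @ vt))     # guard against reflections
    return (u @ np.diag([1.0, 1.0, d]) @ vt).T

def gaze_direction(template, tracked, forward=np.array([1.0, 0.0, 0.0])):
    """Rotate the head-frame 'forward' axis into world coordinates."""
    return head_rotation(template, tracked) @ forward
```

In practice the template would come from a one-time calibration measuring the reflector positions in a head-fixed coordinate frame, and the head-based gaze estimate would be refined by the head-mounted camera.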

PROJECT LEADERS: Tidhar Lev-Ari, Hadar Beeri

Related article:

Project Poster from the ISFN conference 2016:

Behavioral evidence and neural correlates of motion perceptual grouping in the barn owl


Visual Search: Physiological Experiments

The saliency of a visual object is based on its center-to-background contrast. In particular, an object that differs from the background in a single feature may be perceived as more salient. It is not clear to what extent this so-called 'pop-out' effect, observed in humans and other primates, governs saliency perception in non-primates as well. In this study we searched for neural correlates of pop-out perception in neurons of the optic tectum of the barn owl.
We measured the responses of tectal neurons to a stimulus appearing within the visual receptive field (RF), embedded in a large array of additional stimuli (the background). Responses were compared between contrasting and uniform conditions: in the contrasting condition the center differed from the background, while in the uniform condition it was identical to the background.
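One simple way to quantify such a comparison (an illustrative sketch, not necessarily the analysis used in the study) is a contrast-versus-uniform modulation index computed from trial-averaged spike counts:

```python
# Hedged sketch: pop-out modulation index from per-trial spike counts.
import numpy as np

def popout_index(contrast_counts, uniform_counts):
    """Modulation index in [-1, 1]; positive values indicate stronger
    responses when the center differs from the background."""
    rc = np.mean(contrast_counts)   # mean response, contrasting condition
    ru = np.mean(uniform_counts)    # mean response, uniform condition
    return (rc - ru) / (rc + ru)
```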
We further shuffled the position of the target within the array randomly across trials, so that it only occasionally appeared in the RF, mimicking free-viewing conditions in which owls actively scan their surroundings, to test whether this condition would also give rise to orientation-contrast sensitivity.
We were also interested in the effect of background homogeneity on the response to the target. Responses were compared across different homogeneity levels, searching for a Gestalt-like perception of the stimulus in which the target pops out against the background elements as a whole.
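A natural way to test for such an effect (again a hedged sketch; the statistics actually used in the study may differ) is to compare responses across homogeneity levels, for example with a one-way ANOVA:

```python
# Hedged sketch: does the mean response depend on background homogeneity?
# `counts_by_level` maps a homogeneity level to the spike counts recorded
# at that level (the data structure is an illustrative assumption).
from scipy import stats

def homogeneity_effect(counts_by_level):
    """One-way ANOVA across homogeneity levels; returns (F, p)."""
    groups = [counts for _, counts in sorted(counts_by_level.items())]
    return stats.f_oneway(*groups)
```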

PROJECT LEADERS: Arkadib Dutta, Tidhar Lev-Ari, Yael Zahar, Hadar Beeri