Below is a list of projects and open positions.
For more details and to apply, please contact us.
Research on the physiology of the senses and perception requires knowing when a stimulus causes a perceptual sensation in the experimental animal. To meet this requirement, the animal must be trained to report its sensations through various experimental tasks. However, animal training poses several problems: it takes a long time and does not always yield accurate reports. It is therefore preferable to develop a method for studying the animal's behavior that does not require any training.
Thus, I chose to study a reflexive behavior in the barn owl: reflexive eye movements and pupil dilation in response to unexpected visual or auditory stimuli. The barn owl is a nocturnal predator that survives thanks to its ability to detect small prey in dark and noisy environments. It has frontal eyes and stereo vision, and its strategies for processing spatial information resemble in many respects those used by mammals, including humans (van der Willigen et al., 2002). However, in the barn owl's natural habitat, light levels often fall below the threshold at which it can see its prey; the owl then uses its remarkable sense of hearing to locate the prey dynamically (Payne, 1971). The barn owl's specialization in spatial information processing by both senses, vision and audition, thus makes it an attractive animal model for studying visual-auditory integration and its influence on reflexive behavior.
Previous work in this field showed that pupil dilation can be elicited in an awake, untrained, head-restrained barn owl in response to a sound stimulus. Moreover, the response habituates when the sound is repeated and recovers when the frequency or location of the stimulus is changed (Bala and Takahashi, 2000). My research will focus on the effects of visual pop-out stimuli, as well as combined visual-auditory stimuli, on reflexive pupil dilation and eye movements.
The experimental animals used in my research are barn owls (Tyto alba). All birds are hatched and raised in captivity and kept in large flight cages containing perches and nesting boxes. In all experiments the subjects were head-restrained and lightly anesthetized with isoflurane (2%) and nitrous oxide in oxygen (4:5). Once anesthetized, the bird is placed inside a restrainer and its head is fixed to the restrainer's frame. The restrainer is located at the center of a sound-attenuating booth lined with acoustic foam to suppress echoes. While the bird is inside the booth, the isoflurane is discontinued and the bird is maintained on a fixed mixture of nitrous oxide and oxygen (4:5) (Netser et al., 2010).
For pupil dilation and eye movement measurements, I open one of the bird's eyes by attaching miniature clips to the small feathers on the eyelids. The clips are gently pulled by strings to hold the eyelid open. This procedure does not prevent blinking of the nictitating membrane, which keeps the eye spontaneously moisturized.
For video recording of changes in pupil size, I use a Point Grey video camera with an IEEE-1394 (FireWire) interface. An infra-red LED ring is placed around the lens. Because it is difficult to distinguish the barn owl's pupil from its iris, I project infra-red light into the bird's eye and record its reflection from the retina. Since the barn owl's photoreceptors, like those of humans, are not sensitive to infra-red light, the illumination is not noticeable to the owl and therefore does not influence pupil dilation or eye movements.
Videos acquired during the trials are stored on a computer and processed offline in MATLAB.
Pupil dilation data analysis
I developed automatic tracking software that tracks the pupil throughout an acquired movie (see preliminary results). The algorithm automatically finds the region of interest (ROI) containing the pupil in the first frame, based on intensity differences between the pupil and its surroundings. The ROI is then converted from a Cartesian to a polar representation. In the polar image, I identify a curve representing the pupil border using an intensity threshold and apply filters to smooth it. Artifacts that distort this curve, such as feathers entering the ROI, are removed by applying a threshold to the curve itself. I then fit a polynomial to the smoothed curve, convert the polynomial from polar back to Cartesian coordinates, and fit an ellipse to the resulting curve. Fitting a geometric shape to the curve is necessary because it allows all of the pupil's parameters to be calculated, and hence the change in pupil size to be quantified.
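The geometry of this pipeline can be sketched in Python (the actual analysis is done in MATLAB; this is an illustrative translation, not the software itself). The synthetic frame, the fixed intensity threshold, and the restriction to a centered, axis-aligned ellipse are simplifying assumptions:

```python
import numpy as np

def pupil_radius_profile(frame, center, n_angles=360, thresh=0.5):
    """For each angle around `center`, walk outward until the intensity
    drops below `thresh` (with IR retro-reflection the pupil interior
    appears bright). Returns the pupil border radius per angle."""
    cy, cx = center
    max_r = int(min(cy, cx, frame.shape[0] - cy, frame.shape[1] - cx)) - 1
    radii = np.zeros(n_angles)
    for i, ang in enumerate(np.linspace(0, 2 * np.pi, n_angles, endpoint=False)):
        r = 0
        while r < max_r and frame[int(cy + r * np.sin(ang)), int(cx + r * np.cos(ang))] > thresh:
            r += 1
        radii[i] = r
    return radii

def fit_ellipse_axes(xs, ys):
    """Least-squares semi-axes of a centered, axis-aligned ellipse:
    solve p*x^2 + q*y^2 = 1 for p = 1/a^2, q = 1/b^2."""
    A = np.column_stack([xs ** 2, ys ** 2])
    coef, *_ = np.linalg.lstsq(A, np.ones(len(xs)), rcond=None)
    return 1.0 / np.sqrt(coef[0]), 1.0 / np.sqrt(coef[1])

# Synthetic "bright pupil" frame: an ellipse with semi-axes 40 and 30 px.
frame = np.zeros((200, 200))
yy, xx = np.mgrid[0:200, 0:200]
frame[((xx - 100) / 40.0) ** 2 + ((yy - 100) / 30.0) ** 2 <= 1.0] = 1.0

radii = pupil_radius_profile(frame, center=(100, 100))
# Smooth the polar border curve with a circular moving average.
radii = np.convolve(np.r_[radii[-2:], radii, radii[:2]], np.ones(5) / 5, mode='valid')
angles = np.linspace(0, 2 * np.pi, len(radii), endpoint=False)
a, b = fit_ellipse_axes(radii * np.cos(angles), radii * np.sin(angles))
print(round(a), round(b))  # approximately the true semi-axes (40, 30)
```

In the real recordings the ellipse may be tilted and off-center, so the software fits a general conic rather than this axis-aligned special case.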
Eye movement data analysis
The method I will be using involves a tiny gold-plated neodymium magnet implanted in the bird's eye, under the conjunctiva, and a Honeywell HMC1512 magnetic displacement sensor mounted on the bird's skull, all in a single surgical procedure. When the eye moves, the magnet moves with it and the sensor tracks its position. A single sensor chip can reliably track one dimension of eye movement, so I will start with one chip tracking the horizontal dimension. An alternative method for eye movement tracking is to glue three infra-red reflectors to the cornea. A video camera will record their movement during the trials, and the recordings will be analyzed by tracking software whose basic principle is similar to the one described in the previous section.
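As a rough sketch of how the sensor signal could be converted to eye position: magnetoresistive bridges of this family produce an output that varies approximately sinusoidally with the field angle, so a per-bird calibration is required. The sin(2*theta) signal model and the calibration constants below are assumptions for illustration, not the actual acquisition code:

```python
import numpy as np

V_MAX = 1.0        # bridge output at sin(2*theta) = 1 (hypothetical calibration value)
OFFSET_DEG = 0.0   # angle reading with the eye at rest (hypothetical calibration value)

def eye_angle_deg(v):
    """Invert v = V_MAX * sin(2*theta); unambiguous for |theta| < 45 deg,
    which comfortably covers the owl's small ocular movement range."""
    v = np.clip(v / V_MAX, -1.0, 1.0)  # guard against noise pushing |v| past V_MAX
    return np.degrees(0.5 * np.arcsin(v)) - OFFSET_DEG

# A 10-degree horizontal saccade would appear as a step in the voltage trace:
v_trace = V_MAX * np.sin(2 * np.radians(np.array([0.0, 0.0, 10.0, 10.0])))
print(eye_angle_deg(v_trace))
```

In practice V_MAX and OFFSET_DEG would be measured by rotating a calibration magnet through known angles before the implant is used.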
Plan for first year
Plan for second year
Performing the experiments.
Bimodal oddball paradigm. Oddball paradigms use a long sequence of stimuli containing common stimuli (the distractors) and rare stimuli (the targets) hidden among them. The bimodal oddball paradigm is commonly used to estimate perception in healthy and diseased human subjects. It was found that humans perceive the rarely presented stimulus as the salient one and are more attentive to it (Brown et al., 2006). The bimodal oddball paradigm I plan to use in my experiments will involve two modalities, visual and auditory. I will use it to investigate how information received from these two modalities is integrated in the brain, through its influence on attention. My experiment will include a long sequence of common stimuli, which will be individual visual or auditory stimuli, with a rare stimulus, a combination of the two, embedded among them. Netser et al. (2011) showed in auditory adaptation experiments that a reflexive eye reaction occurred when a new, unpredicted sound was presented to the owl and "surprised" it; I expect to see a similar reflexive reaction to the rarely presented stimulus, i.e. the combined visual-auditory stimulus. I also plan to examine whether the bimodal stimulus is more salient than a unimodal one, e.g. a visual or auditory stimulus alone whose location rarely varies. A unimodal experiment is illustrated in Fig. 2a and the bimodal experiment I will be conducting in Fig. 2b.
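The trial structure of such a bimodal oddball sequence can be sketched as follows; the trial count, target probability, and the constraint that two targets never occur back-to-back are illustrative assumptions, not the actual experimental parameters:

```python
import random

def oddball_sequence(n_trials=200, p_rare=0.1, seed=0):
    """Generate a bimodal oddball trial list: common trials are unimodal
    ('V' for visual, 'A' for auditory), rare targets are the combined
    bimodal stimulus ('VA'). Consecutive targets are disallowed so that
    the rare stimulus stays well separated yet unpredictable."""
    rng = random.Random(seed)
    trials, prev = [], None
    for _ in range(n_trials):
        if prev != 'VA' and rng.random() < p_rare:
            trials.append('VA')                    # rare bimodal target
        else:
            trials.append(rng.choice(['V', 'A']))  # common unimodal distractor
        prev = trials[-1]
    return trials

seq = oddball_sequence()
print(seq[:12])
print('target fraction:', seq.count('VA') / len(seq))
```

Swapping the roles (common 'VA', rare unimodal) gives the control condition for comparing bimodal and unimodal saliency.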
Bala A. D. S. and Takahashi T. T., Pupillary dilation response as an indicator of auditory discrimination in the barn owl. Journal of Comparative Physiology A, Volume 186, Issue 5, Pages 425-434, May 2000.
Brown C. R., Clarke A. R. and Barry R. J., Inter-modal attention: ERPs to auditory targets in an inter-modal oddball task. International Journal of Psychophysiology, Volume 62, Issue 1, Pages 77-86, October 2006.
Netser S., Ohayon S. and Gutfreund Y., Multiple manifestations of microstimulation in the optic tectum: eye movements, pupil dilations, and sensory priming. Journal of Neurophysiology, Volume 104, Issue 1, Pages 108-118, July 2010.
van der Willigen R. F., Frost B. J. and Wagner H., Depth generalization from stereo to motion parallax in the owl. Journal of Comparative Physiology A, Volume 187, Issue 12, Pages 997-1007, January 2002.
We are seeking a highly motivated graduate student or post-doc to lead a project on active vision and audition in barn owls. The project is a collaborative study with the biorobotics lab at the Technion.
Applicants must have a solid background in computation and programming. A background in biology or psychology is a plus.
We are seeking a highly motivated graduate student or post-doc to lead a project on attention and short-term memory using multielectrode recordings in behaving animals. Applicants must have a background in neuroscience. Programming and mathematical skills are a plus.
Habituation is the most basic form of memory, yet very little is known about its underlying mechanisms. Given the assumed role of the gaze control system in stimulus selection (see section 2) and the direct link between habituation and stimulus selection, we assume that the gaze control system is involved in habituation. We therefore study the gaze control system of the barn owl with the aim of exploring mechanisms of habituation. To this end we have developed a video-based system to measure pupil dilation responses and eye movements in barn owls. As previously shown, the pupil of the barn owl dilates in response to surprising auditory stimuli and readily habituates to repeating stimuli. Our initial analysis demonstrates a similar habituation to visual stimuli and shows that eye movements also habituate to repeating stimuli. These initial results allow us to use ocular parameters as a behavioral metric of habituation. To further characterize the habituation of the pupil response, we are currently studying the effects of interactions between visual and auditory stimuli on the habituation process.
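As an illustration of how such habituation could be quantified, one can fit an exponential decay to response amplitudes over repeated trials; the decay model and the numbers below are illustrative assumptions, not measured data:

```python
import numpy as np

def fit_habituation(amplitudes, baseline):
    """Fit R(n) = A * exp(-n / tau) + B to response amplitudes across
    trials, with baseline B known. Subtracting B and taking logs makes
    the fit linear: log(R - B) = log(A) - n / tau."""
    n = np.arange(len(amplitudes))
    y = np.log(np.asarray(amplitudes) - baseline)
    slope, intercept = np.polyfit(n, y, 1)
    return -1.0 / slope, np.exp(intercept)  # tau (in trials), initial amplitude A

# Synthetic pupil-response amplitudes decaying with tau = 5 trials
# toward a baseline of 0.2 (arbitrary units):
trials = np.arange(20)
amps = 1.0 * np.exp(-trials / 5.0) + 0.2
tau, A = fit_habituation(amps, baseline=0.2)
print(round(tau, 2), round(A, 2))  # recovers tau = 5.0, A = 1.0
```

The fitted time constant tau then serves as a single-number summary of how quickly the ocular response habituates under each stimulus condition.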
Project Leader: Shai Netser
Project Poster from the ISFN conference 2009
The neural mechanisms that generate internal representations of auditory space have been extensively studied, indicating the importance of precise integration, in time and space, of information from the two ears and from different frequency bands. However, the important question of how the specific neural connections are formed remains unsolved. Are these connections preprogrammed into the brain, or are they formed selectively by acoustic experience? We have undertaken to explore this question in barn owls. For this purpose we raise young barn owls, from the age of 10 days to the age of 60 days, in continuous broadband acoustic noise. In such an artificial acoustic environment the auditory signals typical of a natural environment are masked, and thus the experience of localizable sounds is substantially reduced. We then carefully map the internal representation of auditory space in these owls and compare it with that of owls raised under normal conditions.
Project Leader: Adi Efrati
Visual and auditory information is integrated in the brain to facilitate the perception of events that can be both seen and heard. The barn owl, with its excellent visual and auditory capabilities, provides an interesting case study for the mechanisms of visual-auditory integration. Moreover, it was shown in barn owls that interactions between visual and auditory signals play an important role in auditory map plasticity. We focus on responses of multisensory neurons in the optic tectum (OT), a midbrain structure which contains aligned auditory and visual maps of space. Our findings corroborate results obtained in mammals: responses to correlated visual and auditory stimuli are enhanced. However, we also found that this enhancement is context dependent, being stronger for stimuli that appeared after a long period of silence than for stimuli embedded in a sequence of similar stimuli. In addition, we have shown that responses to correlated visual and auditory stimuli can be enhanced not only in the number of spikes but also in their ability to follow the temporal modulation of the stimulus (phase locking). In a second project, we studied the tectofugal pathway. Using a series of electrophysiological and pharmacological experiments, we were able to show that this pathway carries auditory information, in addition to visual information, from the optic tectum to the forebrain. Currently we are investigating visual-auditory integration of motion information in the OT. For this we are employing virtual acoustic space techniques to simulate acoustic motion at various speeds and directions.
Project Leaders: Yael Zahar, Amit Reches, Yael Nae, Yonatan Kra
Stimulus-specific adaptation (SSA) is a phenomenon at the neural level, proposed as the neural correlate of novelty detection. In SSA, a rare stimulus elicits a stronger response than a frequent stimulus. Novelty is an important component of stimulus saliency and has been linked with the ability of animals to abruptly attend to events that differ from their background. Because of its assumed importance in target selection, the phenomenon of SSA has recently attracted much scientific attention. We have measured and analyzed responses of neurons in three different brain regions to probabilistic stimuli specifically designed to probe SSA. Our findings reveal robust SSA in high brain areas related to decision and gaze control (the optic tectum and the frontal eye fields) but not in low sensory areas (the inferior colliculus). Remarkably, we have demonstrated similar SSA to four independent acoustic features (sound frequency, the two binaural localization cues and sound intensity) and one visual feature (spatial position). This is in contrast to most previous studies, which have focused on one sensory feature at a time. The manifestation of SSA in such a wide variety of features supports the notion that SSA in the gaze control centers is involved in sensory memory for novelty detection. Currently we are analyzing SSA to bimodal visual-auditory stimuli.
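For reference, one common way to quantify SSA is an index comparing responses to the same stimulus when it is rare versus when it is frequent, SI = (d - s) / (d + s); the sketch below uses this index with hypothetical spike counts, not recorded data:

```python
import numpy as np

def ssa_index(deviant_counts, standard_counts):
    """SI = (d - s) / (d + s), where d is the mean response to a stimulus
    presented as the rare deviant and s the mean response to the same
    stimulus presented as the frequent standard. SI > 0 means the rare
    presentation evokes the stronger response (i.e. SSA is present)."""
    d = np.mean(deviant_counts)
    s = np.mean(standard_counts)
    return (d - s) / (d + s)

# Hypothetical spike counts per trial for one stimulus value:
rare_trials = [12, 15, 11, 14]   # responses when this stimulus was rare
frequent_trials = [6, 5, 7, 6]   # responses when the same stimulus was frequent
si = ssa_index(rare_trials, frequent_trials)
print(round(si, 3))
```

Computing SI separately for each feature (frequency, binaural cues, intensity, spatial position) allows the strength of adaptation to be compared across features and brain regions.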
Project Leader: Amit Reches