Tuesday @ Noon
Learning to Attend Induces an Increased Response to Unattended Stimuli
Anna Byers - UCSD
Traditional perceptual learning tasks employ just a single stimulus feature, making it difficult to disentangle the respective contributions of low-level sensory plasticity (e.g., Seitz et al., 2009) and top-down attentional gain modulation (e.g., Fahle, 2009) to observed changes in behavior and neural activity. Here, we evaluated the relationship between learning and top-down attentional gain using feature-selective fMRI techniques and a task that required discriminating one of ten possible orientations (instead of only a single orientation, as is typical in perceptual learning studies). Given that perceptual learning has been documented to occur without attention (Seitz et al., 2009), we expected an increase in the amplitude of orientation-selective response profiles in V1 after training on both an orientation-attended and an orientation-unattended task. Five subjects participated in an initial fMRI scan session, 10 behavioral training sessions, and a final scan session. In every session, subjects performed four blocks of an orientation discrimination task and four blocks of a rapid serial visual presentation (RSVP) letter task. Before training, the orientation-selective response profile in V1 had a higher amplitude during the orientation-attended task than during the orientation-unattended (RSVP) task. After training, however, the amplitude of the orientation-selective response profile increased, particularly when orientation was ignored (i.e., during the RSVP task). These results indicate that practice improves feature-selective representations of stimuli in early visual cortex, even when the stimulus is not actively attended. Moreover, since our experiment involved multiple orientations, our subjects must have learned to modulate sensory gain in a general sense, rather than optimizing gain to process a single, highly trained stimulus feature.
The Effects of Task Engagement on Neural Responses in the Songbird Auditory Forebrain
Dan Knudsen - UCSD
Sensory systems in the brain evolved to provide a representation of the external world for use in behavior. Though these representations have traditionally been thought to be static in adult vertebrates, it is becoming increasingly clear that neural representations in sensory areas can be modified by exposure to or training with behaviorally meaningful stimuli, and by changes in motivational and attentional states. The songbird auditory system provides a powerful model for studying this type of behaviorally modifiable sensory representation. A major feature of this system is the capacity of representations at all levels of the ascending auditory pathway to change in response to the behavioral demands placed on a bird. While such changes have been demonstrated at the relatively long timescale of song recognition learning, it is unknown if or how the auditory system modifies stimulus representations at shorter timescales, such as those of changes in behavioral state. To determine the relationship between neural response and behavioral state, we have developed methods to record from single units in the auditory forebrain of freely moving starlings as they perform auditory recognition tasks on segments of conspecific song. By comparing neural responses to these songs when a bird is engaged in the task with responses when the bird is not performing the task, we can assess the effects of task engagement on stimulus representation. I will present current findings, including changes in trial-to-trial firing rate variability and stimulus selectivity, and discuss how these types of changes might allow a sensory system to more reliably represent those features of the environment that are relevant to an organism.