Symposium schedule:

Arrival and registration: 9:30am - 10:00am.

Talks will begin at 10:00am and finish at 6:00pm.

Please visit the following page for information on how to get to Brunel University. The meeting will be held in the Hamilton Centre (see the following campus plan) - the entrance can be found to the side of the HSBC Bank. Parking is also available on campus on the day. Please proceed to the Wilfred Brown building where someone will provide you with a parking permit.


Programme Outline:

Andrew T. Smith
Professor of Visual Neuroscience and Director of MRI, Royal Holloway, University of London

The standard fMRI group analysis is based on statistical detection of task-related brain activity that has a consistent location across brains. The use of this technique has revealed a great deal about the organization of the human brain. Arguably, it has revealed more than neuropsychology has, and in a much shorter time. Such studies continue to proliferate but, I shall argue, they have inherent limitations that severely reduce the life expectancy of the approach. One practical limitation is simply that averaging across brains (even when spatially normalised) discards much of the spatial precision that will form the bedrock of fMRI in the future. A more fundamental limitation is that, as has often been pointed out by critics, knowing where something occurs is not the same as knowing how it works. I shall review some promising avenues down which fMRI research may be able to move in order to overcome the limitations of the standard group analysis. I shall then illustrate one of them (the repetition suppression paradigm) with my own work on the processing of optic flow in the occipital cortex.

Masud Husain
Institute of Cognitive Neuroscience, University College London, UK
Command and Control Mechanisms in the Brain

Riitta Salmelin
Brain Research Unit, Low Temperature Laboratory, Helsinki University of Technology, Finland

Language in the brain: timing, location and connectivity
This presentation starts with an overview of the cortical dynamics of speech perception and reading, as revealed by MEG activation studies. By the stage of semantic processing, auditory and visual language perception show marked convergence. The neural correlates of semantic (and phonological) processing as proposed by neurophysiological vs. hemodynamic neuroimaging approaches will then be considered. This brings us to the estimation of long-range neural connectivity in language processing and the relationship between connectivity and activation maps. Finally, based on combined information from activation and connectivity maps in reading, extracted from MEG data, we will consider the specific role of the left inferior occipitotemporal cortex in fluent and impaired reading.

Larry Parsons
Dept. of Psychology, University of Sheffield, Sheffield S10 2TP, UK
New Studies Comparing the Brain Basis of Music, Deduction, and Language
Language, like deductive thought and music, is characterized as uniquely, or most highly, developed in humans. Various researchers have speculated on how these three capacities may or may not share specific functional or computational features. Recent experiments by my colleagues and me, and by others, were designed to explore these issues. One series of functional magnetic resonance imaging studies compared complex deductions to simpler ones with identical linguistic complexity (across different content). The acquired data indicate consistently that deduction relies on a language-independent network. A related recent study indicates that inferences about semantic equivalence in paraphrased sentence pairs do not rely on brain areas isolated for deductive inference. In another line of investigation, corresponding language and music generation tasks were examined with positron emission tomography. The results suggest that corresponding music and language performances can elicit shared as well as distinct localizations of brain activity. The implications of these findings for current models and future research will be considered.

Olaf Blanke
Laboratory of Cognitive Neuroscience, Brain Mind Institute, Swiss Federal Institute of Technology, Lausanne, SWITZERLAND

Neural mechanisms of the embodied self
Although most humans have never had any trouble localizing themselves within their own bodily borders, this sense of self-location or embodiment is a fundamental aspect of self-consciousness and requires specific brain mechanisms. Recent clinical and neuroimaging evidence suggests that multisensory integration of bodily signals and two posterior brain regions, the temporo-parietal junction (TPJ) and the cortex at/near the extrastriate body area (EBA), are crucial in coding embodiment.
In this seminar I will review three lines of research investigating brain correlates of embodiment. (1) Pathological states of embodiment (such as out-of-body experience, autoscopy, and feeling of a presence) due to focal brain damage to temporo-parietal cortex and extrastriate cortex in neurological patients. (2) Recent findings on activations of the temporo-parietal cortex and extrastriate cortex in embodiment-related tasks using mental imagery in healthy subjects. (3) The experimental induction of disembodiment in healthy subjects using multisensory conflict and virtual reality.
I argue that these experimental and clinical findings on embodiment may prove relevant in defining some of the functions and brain structures mediating self-consciousness and subjectivity.

Beatrice de Gelder
Tilburg University, The Netherlands
Considering the emotional body
Many valuable insights into human emotion and its neurobiological bases have been obtained from the study of facial expressions. In comparison, the neurobiological bases of emotional body language are relatively unexplored. Observing emotional behavior often prompts a similar reaction in others. Characteristic fear behavior, like putting the hands in front of the face and running for cover, protects one from imminent danger. However, this automatic transmission of whole-body emotion may be appropriate in response to some emotions, such as joy or fear, but not to others. For example, the most adaptive response to anger might not be to reciprocate the observed anger. The talk will present and discuss neuropsychological and brain-imaging studies investigating perception of facial expressions and of whole-body expressions of emotions.

