Seminar Series at UC Merced, “Music, Movement and Cognition”
Talk by Scott Makeig, Institute for Neural Computation, UCSD
April 21, 2014, 3pm, KL 232
Title: Brain Dynamics of Music, Movement, and Emotion
As conscious and social beings, we face the problem of knowing (a.k.a. ‘seeing,’ ‘hearing,’ or ‘feeling’) the intentions of others from relatively sparse sensory cues. Intentions, and the behavioral decisions that follow from them, are now clearly understood to crucially involve a major and long experimentally underexplored aspect of cognition — affective or emotional feeling and awareness. To ‘see,’ ‘read,’ or ‘intuit’ the feelings of others (the primary source and determiner of their intentions), we need to interpret subtle aspects of their behavior — their facial expressions, their body position and gestural ‘body language,’ their vocal prosody and choice of words. Modern neurophysiology has demonstrated the existence of brain support for such perception, first in the form of brain networks involving neurons that are activated either by motivated self movements or by motivated movements of others. These discoveries suggest that we actually understand the actions of others by experiencing them ‘as if’ they were our own.
We clearly must also understand their feelings in the same way, whether expressed via sparse prosodic, movement, or vocal cues. From our long-evolved and crucial ability to read the feelings of others from their postural expressions, movements, and voice qualities comes our ability to read human feeling expression from (or into) more abstract forms, movements, and sound streams, allowing artistic feeling expression to become a very important human activity. However, ‘mirror neurons’ alone cannot perform and sustain such ‘reading’ and interpretation unless their dynamics are understood as tiny portions of large and complex brain networks.
Neurophysiology was long (~1930-1990) hindered by its fixation on single-cell recording (a.k.a. ‘single neuronology’) and, for human studies, by its invasive nature. Non-invasive electrical brain imaging (EEG), on the other hand, has long been hampered by a failure to use an electrical forward head model to model and visualize the recorded data within the original brain space, its actual origin.
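The role of the forward head model mentioned above can be illustrated with a toy numerical sketch (all quantities here are made up for illustration): a lead-field matrix maps source activity in brain space to scalp channels, and when the model is known and channels outnumber sources, a simple least-squares inverse recovers the source time courses. Real EEG source imaging involves realistic head geometry, noise, and blind source separation (e.g., ICA), none of which this sketch attempts.

```python
import numpy as np

# Toy forward/inverse sketch (hypothetical numbers, noise-free):
# a lead-field matrix L maps source activity S in brain space to
# scalp-channel recordings X = L @ S. With L known and more channels
# than sources, the pseudoinverse of L recovers S exactly.
rng = np.random.default_rng(0)

n_sources, n_channels, n_times = 3, 8, 200
L = rng.normal(size=(n_channels, n_sources))      # stand-in forward head model
S = np.sin(rng.uniform(1, 5, (n_sources, 1)) *
           np.linspace(0, 2 * np.pi, n_times))    # simulated source time courses
X = L @ S                                         # simulated scalp recordings

S_hat = np.linalg.pinv(L) @ X                     # least-squares inverse solution
print(np.allclose(S, S_hat))                      # → True in this noiseless toy
```

With measurement noise or an imperfect head model, the inverse problem becomes ill-posed and requires regularization — one reason accurate forward modeling matters for treating EEG as a true imaging modality.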
At SCCN we have been working to build and disseminate methods for treating high-density EEG data as a true functional brain/cortical imaging modality in which some scales of activity within distributed cortical networks can be seen, measured, and modeled. We are also pioneering the development of a new brain/body imaging concept we call ‘mobile brain/body imaging’ or MoBI. This new paradigm combines high-density EEG and body motion capture with recording of eye gaze and task behavior in subjects who are performing actions — and/or interacting — in natural 3-D space.
We have begun to exploit EEG source imaging and MoBI recording methods to study affective imagination and perception in paradigms involving either subject stasis (eyes-closed imagination) or activity (making emotionally communicative ‘conducting’ gestures in time to heard music). We have also composed and presented a musical composition for trio and brain in which the subject’s (or ‘brainist’s’) imaginative recall of feelings they had spontaneously associated with particular musical chords in pilot testing was used to play the corresponding chord during the piece.
Such experiments only scratch the surface of the rich body of experimental investigation that must occur before we truly understand the process of affective perception, communication, and empathy, or its brain support.