

empathia

Lipreading a viable exercise for some networks?

Just came to mind.


Scalino

Re: Lipreading a viable exercise for some networks?

I'd rather lipkiss lately, sorry... won't be of any help here... :)

Scalino


Alex

Re: Lipreading a viable exercise for some networks?

Hi dudes,
Lipreading obviously has benefits, but until recently there was considerable controversy and confusion about which areas of the brain it uses or improves. It was thought wise to observe MRI explorations and then let the dust settle for a bit before including it in exercises  :  )

Here's a handful of previous research (loadsa reading, get coffee):


Science, 25 April 1997:
Vol. 276, No. 5312, pp. 593-596
Activation of Auditory Cortex During Silent Lipreading
G. A. Calvert, S. D. Iversen, E. T. Bullmore, M. J. Brammer, S. C. R. Williams, P. K. McGuire, P. W. R. Woodruff, R. Campbell, A. S. David.

Abstract
Watching a speaker’s lips during face-to-face conversation (lipreading) markedly improves speech perception, particularly in noisy conditions. With functional magnetic resonance imaging it was found that these linguistic visual cues are sufficient to activate auditory cortex in normal hearing individuals in the absence of auditory speech sounds. Two further experiments suggest that these auditory cortical areas are not engaged when an individual is viewing nonlinguistic facial movements but appear to be activated by silent meaningless speechlike movements (pseudospeech). This supports psycholinguistic evidence that seen speech influences the perception of heard speech at a prelexical stage.
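
Not from the paper, but if you want the fMRI logic made concrete: here's a toy Python sketch of the kind of contrast behind a result like this, comparing mean BOLD signal in an auditory-cortex ROI between silent-lipreading blocks and rest blocks with a paired t-test. Every number below is invented.

[code]
import numpy as np
from scipy import stats

# Mean BOLD signal in a hypothetical auditory-cortex ROI, one value per
# subject per condition (arbitrary units, invented for illustration).
lipreading = np.array([1.92, 2.10, 1.75, 2.30, 1.88, 2.05])
rest       = np.array([1.60, 1.85, 1.58, 1.95, 1.70, 1.77])

# Paired t-test: does the ROI respond more during silent lipreading?
t, p = stats.ttest_rel(lipreading, rest)
print(f"paired t = {t:.2f}, p = {p:.4f}")
[/code]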

********************
November 30, 1998
Brains of Deaf People May Rewire Themselves to "Hear" When Lip-reading
Dean K. Shibata, M.D., in collaboration with Drs. Takashi Yoshiura, Edmund Kwok, Jianhui Zhong, David A. Shrier and Yuji Numaguchi.
Researchers at the University of Rochester have provided new evidence that the brains of deaf people may "rewire" themselves in a manner that may help them communicate with hearing people. The researchers found that a part of the brain normally responsible for hearing and speech - once thought to be relatively inactive in people who are deaf - is actually bustling with activity as deaf people read lips.
The findings shed new light on the brain’s remarkable ability to assign new functions to areas of the brain once thought to be dedicated to other purposes.
"When a hearing person imagines a sound, some activity in the part of the brain that processes sound can be seen with Functional Magnetic Resonance Imaging (FMRI)," says Shibata. "We’re finding similar areas of activation in deaf volunteers when they lip-read. It’s almost as if the brains of these volunteers were trying to create sounds to match the lips they were reading."
Eight deaf volunteers, students at the nearby National Technical Institute for the Deaf, were compared to eight hearing volunteers performing a series of visual tasks while lying within a magnetic resonance scanner. The volunteers watched a silent video of someone speaking as the scanner recorded blood flow throughout regions of each volunteer's brain. The hearing subjects showed a relatively symmetric distribution of activity between the left and right sides of the brain, but deaf volunteers showed a surprising focus of activity on the right side, in a part of the brain called the Superior Temporal Gyrus, an area normally used to process sound and speech. The activity varied depending on the type of task performed. Both hearing and deaf volunteers who tried to identify shapes that were displayed on a screen produced the same brain activity, but lip-reading produced substantial differences.
"When a person is born deaf, the brain is able to wire itself to make sure that the parts usually used for hearing are reassigned to more useful roles," says Shibata. "This study shows that this reconfiguration is very specific. A visual task involving shape identification does not activate these "auditory" areas, but tasks involving movement and language, such as lip-reading, do." This raises the question of why lip-reading in particular is allocated to the lobe responsible for hearing. It could be that the auditory cortex has the kind of processing power needed to understand language – no matter what its form.
"The findings are preliminary, but a better understanding of deaf physiology might eventually help guide strategies for deaf education," Shibata explains. "I’m currently studying how memory works. We know that deaf children use more visual approaches to memory tasks, and this may be reflected in the way their brain is organized."
****************************
2003
Journal of Neurophysiology, Vol. 90, No. 3, pp. 2005-2013
http://jn.physiology.org/content/90/3/2005.full
ABSTRACT
Regional cerebral blood flow (rCBF) PET scans were used to study the physiological bases of lipreading, a natural skill of extracting language from mouth movements, which contributes to speech perception in everyday life. Viewing connected mouth movements that could not be lexically identified and that evoke perception of isolated speech sounds (nonlexical lipreading) was associated with bilateral activation of the auditory association cortex around Wernicke's area, of the left dorsal premotor cortex, and of the opercular-premotor division of the left inferior frontal gyrus (Broca's area). The supplementary motor area was active as well. These areas have all been implicated in phonological processing, speech and mouth motor planning, and execution. In addition, nonlexical lipreading also differentially activated visual motion areas. Lexical access through lipreading was associated with a similar pattern of activation and with additional foci in ventral- and dorsolateral prefrontal cortex bilaterally and in left inferior parietal cortex. Linear regression analysis of cerebral blood flow and proficiency for lexical lipreading further clarified the role of these areas in gaining access to language through lipreading. The results suggest cortical activation circuits for lipreading from action representations that may differentiate lexical access from nonlexical processes.
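
To make the regression step at the end of the abstract concrete, here's a minimal sketch, with invented numbers, of regressing regional blood flow against lipreading proficiency.

[code]
import numpy as np
from scipy import stats

# Invented data: lipreading proficiency (% words correct) and rCBF in a
# candidate region (ml/100g/min), one pair per subject.
proficiency = np.array([35, 48, 52, 60, 66, 71, 80, 88])
rcbf        = np.array([50.1, 51.0, 51.4, 52.2, 52.0, 53.1, 53.8, 54.5])

# Does blood flow in the region scale with lipreading skill?
res = stats.linregress(proficiency, rcbf)
print(f"slope = {res.slope:.3f} per % correct, r = {res.rvalue:.2f}, "
      f"p = {res.pvalue:.4f}")
[/code]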
********************************
Jan 2009
Lip reading involves two cortical mechanisms
K. Okada & G. Hickok (2009). Two cortical mechanisms support the integration of visual and auditory speech: A hypothesis and preliminary data. Neuroscience Letters. DOI: 10.1016/j.neulet.2009.01.060
It is well known that visual speech (lip reading) affects auditory perception of speech. But how? There seem to be two ideas. One idea, dominant among sensory neuroscientists, is that visual speech accesses auditory speech systems via cross sensory integration. The Superior Temporal Sulcus (STS) is a favorite location in this respect. The other, dominant among speech scientists, particularly those with a motor theory bent, is that visual speech accesses motor representations of the perceived gestures which then influences perception.

A paper in Neuroscience Letters by Kai Okada and yours truly proposes that both ideas are correct. Specifically, that there are two routes by which visual speech can influence auditory speech: a "direct" and dominant cross-sensory route involving the STS, and an "indirect" and less dominant sensory-motor route involving sensory-motor circuits. The goal of our paper was to outline existing evidence in favor of a two-mechanism model, and to test one prediction of the model, namely that perceiving visual speech should activate speech-related sensory-motor networks, including the Sylvian parietal-temporal area (Spt).
Short version of our findings: as predicted, viewing speech gestures (baseline = non-speech gestures) activates speech-related sensory-motor areas including Spt as defined by a typical sensory-motor task (listen and reproduce speech). We interpret this as evidence for a sensory-motor route through which visual speech can influence heard speech, possibly via some sort of motor-to-sensory prediction mechanism. Viewing speech also activated a much broader set of regions along the STS, which may reflect the more direct cross sensory route.
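
This isn't the authors' actual analysis, just a toy sketch of the ROI logic as I read it: define Spt functionally from the sensory-motor localizer, then ask whether viewing speech gestures activates those same voxels. The arrays are simulated stand-ins for voxelwise statistic maps.

[code]
import numpy as np

rng = np.random.default_rng(0)
n_voxels = 1000

# Toy voxelwise z-maps. The localizer contrast (listen-and-reproduce
# speech) functionally defines Spt; the test contrast is speech gestures
# minus non-speech gestures. The shared term builds in the predicted
# overlap, purely for illustration.
localizer_z     = rng.normal(0.0, 1.0, n_voxels)
visual_speech_z = 0.5 * localizer_z + rng.normal(0.0, 1.0, n_voxels)

spt_mask = localizer_z > 2.3            # voxels defining "Spt"
roi_resp = visual_speech_z[spt_mask]    # visual-speech response there

print(f"Spt voxels: {spt_mask.sum()}")
print(f"mean visual-speech z inside Spt: {roi_resp.mean():.2f}")
[/code]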

*****************************

March 25, 2011
"Mirrors in Our Brain" -Do They Do for Psychology What DNA Did for Biology? (A 'Galaxy' Most Popular)
A recent paradigm-shattering discovery in neuroscience shows how our minds share actions, emotions, and experience: what we commonly call the "monkey see, monkey do" phenomenon. When we see someone laugh, cry, show disgust, or experience pain, in some sense we share that emotion. When we see someone in distress, we share that distress. When we see a great actor, musician or sportsperson perform at the peak of their abilities, it can feel like we are experiencing something of what they are experiencing.
Only recently, however, with the discovery of mirror neurons, has it become clear just how this powerful sharing of experience is realized within the human brain. In the early 1990's Giacomo Rizzolatti and his colleagues at the University of Parma discovered that some neurons had an amazing property: they responded not only when a subject performed a given action, but also when the subject observed someone else performing that same action.
These results had a deep impact on cognitive neuroscience, leading some of the world's foremost experts to predict that 'mirror neurons would do for psychology what DNA did for biology'.

Vilayanur Ramachandran, a neurologist at the University of California, San Diego and co-author of Phantoms in the Brain: Probing the Mysteries of the Human Mind, writes that "Giacomo Rizzolatti at the University of Parma has elegantly explored the properties of neurons - the so-called "mirror" neurons, or "monkey see, monkey do" neurons. His research indicates that any given cell in this region will fire when a test monkey performs a single, highly specific action with its hand: pulling, pushing, tugging, picking up, grasping, etc. In addition, it appears that different neurons fire in response to different actions."

The astonishing fact is that any given mirror neuron will also fire when the monkey in question observes another monkey (or even the experimenter) performing the same action. "With knowledge of these neurons, you have the basis for understanding a host of very enigmatic aspects of the human mind: imitation learning, intentionality, "mind reading," empathy - even the evolution of language." Ramachandran writes.

"Anytime you watch someone else doing something (or even starting to do something), the corresponding mirror neuron might fire in your brain, thereby allowing you to "read" and understand another's intent, and thus to develop a sophisticated "theory of other minds."

Mirror neurons may also help explain the emergence of language, a problem that has puzzled scholars since the time of Charles Darwin, he adds.

"Is language ability based on a specially purposed language organ that emerged suddenly 'out of the blue,' as suggested by Noam Chomsky and his disciples? Or did language evolve from an earlier, gesture-based protolanguage? No one knows for sure, but a key piece of the puzzle is Rizzolatti's observation that the ventral premotor area may be a homologue of "Broca's area" - a brain center associated with the expressive and syntactic aspects of language. Rizzolatti and Michael Arbib of the University of Southern California suggest that mirror neurons may also be involved in miming lip and tongue movements, an ability that may present the crucial missing link between vision and language."

To test his idea, Ramachandran tested four Broca's aphasia patients - individuals with lesions in their Broca's areas. He presented them with the sound of the syllable "da" spliced to a videotape of a person whose lips were actually producing the sound "ba." Normally, people hear the "da" as "ba" - the so-called "McGurk effect" - because vision dominates over hearing. To his surprise, he writes, "we found that the Broca's patients did not experience this illusion; they heard the syllable correctly as 'da.' Even though their lesions were located in the left frontal region of their brains, they had a visual problem - they ignored the lip movements. Our patients also had great difficulty with simple lip reading. This experiment provides a link between Rizzolatti's mirror neurons and the evolution of human language, and thus it calls into question the strictly modular view of language, which is currently popular."
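
Ramachandran doesn't give the numbers, so here's a hypothetical sketch of how you'd test a group difference like that: tally how often each group reports the visually driven percept versus the true audio, then run Fisher's exact test on the counts. The counts below are invented.

[code]
from scipy import stats

# Invented counts. Rows: controls, Broca's patients.
# Columns: reported "ba" (captured by the lips), reported "da" (true audio).
table = [[18, 2],
         [3, 17]]

# Fisher's exact test: are the patients less susceptible to the illusion?
odds_ratio, p = stats.fisher_exact(table)
print(f"odds ratio = {odds_ratio:.1f}, p = {p:.5f}")
[/code]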

Based on his research, Ramachandran predicted that mirror neurons will do for psychology what DNA did for biology: "they will provide a unifying framework and possibly even explain a host of mental abilities that have hitherto remained mysterious and inaccessible to experiments."

...Betcha wish you'd never asked :)
AR


empathia

Re: Lipreading a viable exercise for some networks?

LOL, how did you know? :)


