A team of researchers at the University of California, San Francisco, said identifying those brain areas has potential implications for developing computer-brain interfaces for artificial speech communication and for the treatment of speech disorders.
Speech is an ability unique to humans among living creatures, yet it remains poorly understood, they said.
"Speaking is so fundamental to who we are as humans -- nearly all of us learn to speak," Edward Chang of the university's Center for Integrative Neuroscience said. "But it's probably the most complex motor activity we do."
Scientists have not previously understood how the movements of distinct "articulators" -- the lips, tongue, jaw and larynx -- are precisely coordinated in the brain.
Chang and colleagues recorded electrical activity directly from the brains of three people undergoing brain surgery to determine the spatial organization of the "speech sensorimotor cortex," creating a map of which parts of the brain control which parts of the vocal tract.
This cortical area has a hierarchical and cyclical structure that exerts a split-second, symphony-like control over the tongue, jaw, larynx and lips, they found.
Speaking demands well-timed action from multiple brain regions within the speech sensorimotor cortex, like musicians in a symphony orchestra timing their playing to one another, they said.