A recent study has uncovered fascinating insights into how parakeets mimic human speech, revealing similarities to the brain functions that underpin human language.
For the first time, researchers at NYU Grossman School of Medicine captured the brain activity of parakeets while they produced sounds, leading to discoveries that connect their vocalizations to patterns previously observed only in humans. This study was published online on March 19 in the journal Nature.
The investigation focused on the brain’s central nucleus of the anterior arcopallium (AAC), a region pivotal in controlling the muscles used for vocalizations. The researchers identified distinct clusters of cells in the AAC that correspond to sounds resembling vowels and consonants, illustrating a sophisticated neural architecture for sound production.
During singing, particular cells fired in response to different pitches, paralleling the way humans articulate speech. The researchers postulate that humans and parakeets share a rare link between complex neural processing and sound generation, one that distinguishes them from other animals studied to date.
Exploring Human and Avian Linguistics
The researchers noted that the capacity for nuanced spoken language in humans arises from intricate brain patterns. To investigate whether these patterns are unique, they recorded from the AAC in budgerigars, a parakeet species renowned for its ability to imitate a vast array of human words.
The study highlighted differences in vocal learning between budgerigars and zebra finches, a songbird known for its complex vocalizations. While zebra finches require extensive practice (upwards of 100,000 trials) to master specific songs through a rigid learning process, budgerigars can adapt their vocalizations rapidly. This flexibility resembles human speech acquisition, in which sounds are freely combined and recombined using an internal cognitive “vocal keyboard.”
The research team plans to delve deeper into the higher-level brain functions that govern vocal choices by examining the signals the AAC receives from other brain regions. Understanding these mechanisms may offer insights into higher cognitive processes in humans and inform the design of artificial intelligence models, such as those powering chatbots.
“Our findings verify that AAC neurons systematically engage in the representation of vocal pitch while exerting detailed control over sound production, showcasing unprecedented similarities with human brain activity,” remarked lead researcher Zetian Yang, a postdoctoral scholar in Dr. Michael Long’s lab. “This positions the parakeet as a crucial model for future studies on speech motor control.”
This groundbreaking research received support from the Simons Collaboration on the Global Brain, highlighting its significance in understanding communication processes.
Source
www.sciencedaily.com