New parents have long used “baby talk” to coax grins and giggles out of their infants.
But that earliest form of communication, particularly the singsong quality and clearer enunciation typical of mothers' speech, has also been linked to faster development of language and speech skills in children with normal hearing.
How hearing-impaired infants develop those skills, and what can be done to help them and their parents maximize the likelihood they'll be able to use language and speech effectively, are the overarching questions behind a nearly $2 million project involving two BGSU faculty members.
Drs. Laura Dilley, psychology and communication disorders, and J. Devin McAuley, psychology, are working with Dr. Tonya Bergeson, the project leader from the Indiana University School of Medicine. The National Institutes of Health’s National Institute on Deafness and Other Communication Disorders is funding the research for five years through mid-2012, with BGSU’s share of the total just under $400,000.
The importance of singsong maternal speech to infants learning language is driving the project, said McAuley, explaining that babies must figure out where words are in speech and “baby talk” aids that recognition. Further affirming its value is babies’ boredom with a monotone delivery. “Baby talk is a good thing,” he said.
Infants with normal hearing are more attentive to the melodious speech and more able to learn language when tuned in to it, added Dilley, calling it “a hallmark of development” across cultures.
Teaching the hearing-impaired baby
Little is known, though, about how hearing-impaired infants, especially those with hearing aids or cochlear implants, develop attention to maternal speech and language ability, she said.
The IU School of Medicine performs cochlear implantation, which has been an option for the profoundly deaf since the 1980s. The implants directly stimulate the auditory nerve, sending signals to the brain and providing some degree of hearing, Dilley pointed out. However, the sound signals are distorted; while that does not prevent adults who already know language from understanding speech, it presents "quite a large hurdle to overcome" for children with limited or no experience with language, she said. Those with cochlear implants, she added, need about a year's experience with the devices before they start reacting to normal speech.
Knowing how hearing-impaired infants respond to speech gives an idea of how they’re learning language, Dilley continued. But how does a mother respond to the challenge of speaking to a hearing-impaired child? Mothers talk to other adults differently than they do to children, she said, and when a child is hearing impaired, research findings have shown further differences—more repetition and simple utterances, and less responsiveness.
The mothers may not realize what they're doing, however, or that it may affect how the child learns, McAuley said. As Dilley put it, they may be unwittingly undercutting the child's chances of learning language because of the subpar input they're providing.
That’s where clinicians come in, she said. Assuming that hearing aids or cochlear implants appear to be helping, the professionals can talk more to the mother and child to maximize the chances of language acquisition.
Assessing the data
In the ongoing project, Bergeson is studying behavior of hearing-impaired infants and how their mothers talk to them. That’s at IU, where, after children with hearing loss undergo fittings and surgeries, they return with their mothers for checkups and stay for a day as project participants. Mothers of infants with normal hearing have been recruited to join the study there.
The collected data is being sent to BGSU for analysis, including acoustic analysis by undergraduate students who have been hired to assist and are learning to use computer software to do the work, Dilley said. Among other things, they measure frequency, timing and amplitude information from the mothers’ recorded speech, McAuley said.
Evidence indicates that adult listeners are good at using timing and temporal aspects of speech, Dilley said, but the question remains whether children can acquire a similar ability. "Temporal cues" include when a sound begins and ends, as well as rhythm. Just as Morse code has a rhythmic pattern, so, too, does speech have rhythmic components, McAuley noted.
The project is long to allow tracking of the participating children's language development, and the grant is large because the work is labor intensive, Dilley said. It also has possibilities for substantial impact, she said, pointing out that the biggest concern of parents with a hearing-impaired child is whether their child will be able to understand and speak language.
“It would be very empowering for those parents” to be able to help their child simply by changing their speech, she said. “This project has the potential to identify which course of action they could be taking to help the child articulate language and understand spoken language.”
As the parents of a 10-month-old daughter, the husband-and-wife researchers have personal, as well as professional, interest in the study. “Having a child definitely brings this home,” McAuley said.