In the nineteenth century, Paul Broca established that the cerebral center for articulate speech resides in the left cerebral hemisphere. Since Broca's discovery, investigators in a multitude of scientific disciplines have localized additional components of human language in areas of the left hemisphere as well as the right. In this connection, psychologists and brain physiologists have developed an important literature on brain lateralization that localizes behavioral and cognitive functions to specific areas of the brain; because specific behavioral and perceptual attributes have been localized in this way, they can be related to proximate cognitive functions about which more extended knowledge exists. However, there is still controversy over strict localizationist models of language and speech, as human communication is not restricted to the verbal message alone but includes an array of nonverbal vocal communication forms as well. These forms have not yet been designated as left or right cerebral functions.
Human vocal communication is a multiplex signal composed of verbal and paraverbal components. The paraverbal component of speech transmits a frequency signal that is independent of the more conventionally known verbal signal and lies below 0.5 kHz in the speech spectrum. This band has been referred to in the literature as the speaking fundamental frequency, or “SFF,” and has been shown in research to be the spectral carrier of a communication function that interacting speakers manipulate to produce social convergence and social status accommodation. Social status accommodation between interacting partners has been found to provide a means whereby persons can mutually adapt their lower voice frequencies to produce an elemental form of social convergence. This convergence is then used to complete social tasks by preparing the communication context for transmission of the verbal information carried in the frequencies above 0.5 kHz. Research involving filtering of the SFF band in dyadic, task-related conversations has shown that the lower frequency band is critically important in human communication and may play an independent role comparable to that of its verbal counterpart.
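The band split described above can be sketched computationally. The following is a minimal illustration, not the procedure used in the cited research: it assumes a generic Butterworth filter pair at the 0.5 kHz boundary, and the function name and parameters are hypothetical. A synthetic two-tone signal stands in for recorded speech.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def split_sff_band(signal, fs, cutoff=500.0, order=4):
    """Split a signal at `cutoff` Hz (the ~0.5 kHz SFF boundary).

    Returns (low, high): the band below the cutoff, which would carry
    the speaking fundamental frequency, and the band above it, which
    would carry most of the verbal (segmental) information.
    Hypothetical helper for illustration only.
    """
    sos_lo = butter(order, cutoff, btype="lowpass", fs=fs, output="sos")
    sos_hi = butter(order, cutoff, btype="highpass", fs=fs, output="sos")
    # Zero-phase filtering so the two bands stay time-aligned.
    low = sosfiltfilt(sos_lo, signal)
    high = sosfiltfilt(sos_hi, signal)
    return low, high

# Synthetic stand-in for speech: a 150 Hz "fundamental" plus a 2 kHz component.
fs = 16000
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 150 * t) + 0.5 * np.sin(2 * np.pi * 2000 * t)
low, high = split_sff_band(x, fs)
```

Filtering out one band and playing back the other is, in essence, the manipulation that the dyadic filtering studies rely on to test each band's independent contribution.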
Past research into tracing or mapping the cerebral location of behavioral functions has involved various invasive and direct, as well as passive and active, techniques. One researcher, Kimura, used dichotic listening techniques in the early 1970s to compare how accurately subjects identified words presented to the right ear versus the left ear. The dichotic listening technique involves the simultaneous presentation of stimuli to both ears, but with a different stimulus delivered to each ear. Rather surprisingly, Kimura found a right-ear advantage: subjects reported stimuli presented to the right ear more accurately. Kimura reasoned that this finding could relate to earlier animal studies by Rosenzweig showing that contralateral (opposite-sided) transmissions from ear to brain (i.e., from one ear to the opposite brain hemisphere) are stronger than ipsilateral transmissions (i.e., from an ear to the same-side brain hemisphere).
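The core of the dichotic paradigm, one stimulus per ear presented simultaneously, amounts to constructing a two-channel audio buffer with different content in each channel. The sketch below is an illustrative assumption, not Kimura's apparatus; the function name is hypothetical and simple tones stand in for recorded word stimuli.

```python
import numpy as np

def dichotic_pair(left_stim, right_stim):
    """Build one dichotic-listening trial: two different mono stimuli
    presented simultaneously, one per ear, as a stereo (n, 2) buffer.

    The shorter stimulus is zero-padded so both channels begin together.
    Hypothetical helper for illustration only.
    """
    n = max(len(left_stim), len(right_stim))
    stereo = np.zeros((n, 2))
    stereo[:len(left_stim), 0] = left_stim    # channel 0 -> left ear
    stereo[:len(right_stim), 1] = right_stim  # channel 1 -> right ear
    return stereo

# Two placeholder "word" stimuli: 0.4 s tones standing in for recordings.
fs = 16000
t = np.arange(int(0.4 * fs)) / fs
word_a = np.sin(2 * np.pi * 440 * t)
word_b = np.sin(2 * np.pi * 330 * t)
trial = dichotic_pair(word_a, word_b)
```

Playing such a buffer over headphones and asking the subject to report what was heard in each ear yields the per-ear accuracy comparison from which the right-ear advantage was inferred.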