A study in 1925 found that children with speech and language difficulties were more likely than their peers to have cross-laterality or ambiguous laterality. Subsequent research supported this finding. When we refer to speech and language, we include the ability to listen to and decode language for writing and comprehension too. We learn to read and write using most of our senses: through listening, looking, speaking and holding a pencil, relying on good proprioceptive feedback and balance as our vestibular system interacts with the visual system.
The impact of cross or ambiguous laterality issues
Cross-laterality means that when we test the hands, feet, eyes and ears for dominance, there will be a mix of left and right results. In some cases a person will be able to perform equally well with either hand or foot, and they would be classified as having ambiguous laterality. Either situation can mean that the brain has to work harder to handle incoming signals, which is why children with learning difficulties often complain of extreme tiredness: sensory information is simply not being processed in the most efficient manner.
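The distinction between uniform, cross and ambiguous laterality can be sketched as a tiny decision rule. This is purely illustrative: the function name, labels and inputs below are assumptions for the sketch, not part of any clinical assessment protocol.

```python
# Illustrative sketch only: classify laterality from simple side-preference
# observations. Labels and rule are assumptions, not a clinical protocol.

def classify_laterality(preferences):
    """preferences: dict mapping 'hand', 'foot', 'eye', 'ear' to
    'left', 'right', or 'either' (no clear preference)."""
    sides = set(preferences.values())
    if "either" in sides:
        return "ambiguous"   # performs equally well with either side
    if len(sides) == 1:
        return "uniform"     # same side chosen for every action
    return "cross"           # a mix of left and right answers

print(classify_laterality({"hand": "right", "foot": "right",
                           "eye": "left", "ear": "right"}))  # cross
```

A real assessment would of course involve many observations per body part; the sketch only captures the final classification step.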
We expect a certain amount of ambiguity in a child up to the age of 7–8 years, but after the synaptic pruning that takes place around this age, when the brain trims away neural connections that are no longer useful, we should see the same choice of side for each action.
The brain is very adaptable and functions with neural plasticity. Although there are locations in the brain with defined sensory processing centres (the left hemisphere, for instance, is home to the major speech and language processing areas), other parts of the brain can also process speech and language.
The role of the leading ear
If your right ear is the leading, or dominant, ear, it prefers to pick up auditory signals, and the information travels along the stronger contralateral pathway straight to the left hemisphere, where the major language areas sit. If your left ear is dominant, the signals received will be processed in the right hemisphere first, where interpretation is adequate; but once the input becomes more complex, the information must cross the corpus callosum to the left hemisphere for full understanding. In practice, signals will be dancing across the hemispheres all the time, as you also have input from the eyes and proprioceptive feedback from your body placement.
There are many pieces to the jigsaw of learning or comprehension dysfunction, and indications of cross-laterality are just some of those we look for in trying to understand what is going on for that person. Some children (and adults) become confused when they feel their comprehension is poor: they can hear well, but they can neither retain details nor cope when a lot of information is fired at them.
How do we find out which is the leading ear?
During an assessment for the Johansen programme, monaural and dichotic audiometric tests are performed to help determine which is the leading ear. On the audiogram for the threshold hearing test, you can often see the plot lines for the two ears cross over each other along the trajectory from 250 to 8000 Hz; this is another indicator that auditory processing is not operating smoothly.
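The cross-over described above can be checked mechanically. Here is a minimal sketch, with made-up threshold values in dB at the standard audiometric frequencies; this is not the Johansen assessment software or its procedure, just an illustration of what "the plot lines cross" means.

```python
# Illustrative only: do left- and right-ear threshold curves swap order
# anywhere across the standard audiometric frequencies? Data are invented.

FREQS = [250, 500, 1000, 2000, 4000, 8000]  # Hz

def curves_cross(left_db, right_db):
    """True if the left/right threshold plots change order
    between any two adjacent frequencies."""
    diffs = [l - r for l, r in zip(left_db, right_db)]
    return any(a * b < 0 for a, b in zip(diffs, diffs[1:]))

left = [15, 10, 5, 10, 20, 25]    # hypothetical left-ear thresholds (dB)
right = [10, 15, 10, 5, 15, 30]   # hypothetical right-ear thresholds (dB)
print(curves_cross(left, right))  # True
```

The sign of the left-minus-right difference flips wherever the curves cross, which is all the check looks for.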
All the results are fed into the specialist music software so that individualised compositions can be made, encouraging new neural pathways to grow and improving the client's processing of sound.
Orton, S. T. (1925), ‘“Word-blindness” in school children’, Archives of Neurology and Psychiatry, 14(5), 581–615.