How do we listen?

How we listen…

This is a story about sound.  How does it get from two hands clapping together, in through our ears and up to our brain, so the brain can tell us, “Yes, that is two hands clapping, and I think it’s coming from over there!”?

So, first and foremost: what is sound? If you want a really great, easy-to-read article about sound, try Everything you should know about Sound, but I’m going to do my best.

As you know, everything is made up of particles, the air we breathe included.  Basically, sound is a vibration of the particles in the air (or water) caused by some event like two hands clapping or the pressure from a car horn. For humans, the most important are probably the pressure waves generated by another person’s vocal cords, lips and tongue, moving in just the right way that you understand them as words and language.

Sound moves through the air (or water, or solids for that matter) in longitudinal waves, meaning the particles compress together under pressure and then spread out again into low-pressure areas. Confused?  Check the link below for an idea of how sound waves move.  Probably the best description is an earthworm: the way it bunches up and then spreads out to move forward.

Sound is measured based on what the wave looks like. Well, not exactly.  This can get a little tricky, but I’ll mention it once just so you know: when we look at the graph of a sound wave, we are looking at the pressure plotted over time, not at how the sound wave actually moves (remember the worm).  OK, I’ve said it; now let’s forget it and move on.
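
Before we move on, if you like to see things rather than just read about them, here is a minimal Python sketch (purely illustrative, assuming an arbitrary 440 Hz pure tone) of the kind of pressure-over-time graph I mean:

```python
# A minimal sketch of a "sound wave" graph: relative pressure over time
# for a pure 440 Hz tone (an assumed example tone, nothing special about it).
import numpy as np
import matplotlib.pyplot as plt

frequency = 440.0      # Hz, the example tone
duration = 0.01        # seconds of signal to draw
sample_rate = 44_100   # samples per second, a standard audio rate

t = np.linspace(0, duration, int(sample_rate * duration), endpoint=False)
pressure = np.sin(2 * np.pi * frequency * t)  # relative pressure, not particle motion

plt.plot(t * 1000, pressure)
plt.xlabel("time (ms)")
plt.ylabel("relative pressure")
plt.title("Pressure over time for a 440 Hz tone")
plt.show()
```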

[Image: sound wave travelling to the ear]
Look at the distance between the peaks. This is the wavelength, and it corresponds to what we describe as the pitch or frequency of the sound: the shorter the distance between peaks, the higher the frequency.  As humans we hear sounds with frequencies from about 20 Hertz (which is long, around 17 metres between peaks) to 20,000 Hz (about 1.7 cm peak to peak).  So, as an example, a drum beat will be a low-frequency sound while a whistle will be high frequency.
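
If you want to check those numbers yourself, here is a rough Python sketch. It assumes sound travels at about 343 metres per second in air at room temperature (the exact speed varies with temperature):

```python
# Wavelength = speed of sound / frequency.
# 343 m/s is an assumed room-temperature value for air.
SPEED_OF_SOUND = 343.0  # metres per second

def wavelength_m(frequency_hz: float) -> float:
    """Distance between pressure peaks for a given frequency."""
    return SPEED_OF_SOUND / frequency_hz

print(wavelength_m(20))       # ~17 m  (a very low rumble)
print(wavelength_m(20_000))   # ~0.017 m, i.e. about 1.7 cm (a very high whistle)
```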

The other important measurement is loudness, or amplitude.  This is the height of the wave.  All we need to remember is that sound is measured in decibels, and that perceived loudness roughly doubles for every 10 dB increase.  So, 20 dB sounds about twice as loud as 10 dB, and 30 dB about four times as loud as 10 dB.  With that in mind, sound at 160 dB (inside a speaker at a rock concert) is probably going to cause permanent hearing loss, and sound at 180 dB will probably lead to death.  Pretty crazy to think a wave of pressure travelling through your body could kill you, but that’s the concept behind the shock waves from nuclear bombs.  But let’s get back to hearing and what happens when sound reaches the ear.
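
For the tinkerers, here is a back-of-the-envelope Python sketch of those decibel rules of thumb. The “twice as loud per 10 dB” figure is a rough perceptual rule; the physical sound energy actually multiplies by ten for every 10 dB:

```python
# Decibel rules of thumb: +10 dB means ten times the physical intensity,
# and (roughly) twice the perceived loudness.
def intensity_ratio(db_difference: float) -> float:
    """How many times more sound energy (physical intensity)."""
    return 10 ** (db_difference / 10)

def loudness_ratio(db_difference: float) -> float:
    """Rough perceived-loudness ratio (the 'doubles every 10 dB' rule)."""
    return 2 ** (db_difference / 10)

print(intensity_ratio(10))  # 10.0 -> ten times the energy
print(loudness_ratio(10))   # 2.0  -> sounds about twice as loud
print(loudness_ratio(20))   # 4.0  -> sounds about four times as loud
```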

The ear – our little transducer.

Now that we understand a little about what sound is, let’s look at how we hear it and turn those waves of pressure into something meaningful.  What better place to start than the ears? (Did you know that even without ears we’d still be able to feel the vibrations of low-frequency sounds through our bones and body, and that different parts of our body pick up different frequencies?)

The outer ear, or pinna, acts like a funnel that amplifies sound and sends it down the ear canal until, bang, it hits the eardrum.  The eardrum vibrates, which moves the three tiny, super-sensitive bones behind it (the malleus, incus and stapes), and their movement leads to vibrations in the fluid of the inner ear, the cochlea.  The cochlea looks pretty much like a spiralling seashell with three semicircular tubes at one end (the vestibular system, which helps us not fall over), all filled with liquid.  Now, here is where things start getting a little hairy (haha, because there are cilia, which are like tiny hairs, in the cochlea… plus it’s complicated).

Long story short, the cochlea’s spiralling tube is divided into three fluid-filled compartments.  When the fluid vibrates, it moves the Organ of Corti, a strip of tissue running along the tube, which bends these little hairs (cilia).  The hair cells turn the energy from the vibrations into electrical signals, which travel along the auditory nerve and into the brain! To be honest it’s a little more complicated than that (there are two types of hair cells, and potassium concentration plays a part), but we don’t need to go into that.  When you look at the spiral of the cochlea, high frequencies are transduced (meaning turned into electrical energy) at the base of the spiral and low frequencies at the apex.  It’s like a frequency map in the ear.  We call this tonotopic organisation.  Picture a piano: that’s roughly what the cochlea would look like if it were rolled out flat.

[Image: tonotopic map of the cochlea]
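
If you like numbers with your piano analogy, one standard model of this frequency map is the Greenwood function. Here is a small Python sketch using the commonly quoted human constants (treat the exact values as an approximation):

```python
# Greenwood function for the human cochlea (approximate constants).
# x is the proportional position along the cochlea: 0 = apex (low notes),
# 1 = base (high notes).
def greenwood_frequency_hz(x: float) -> float:
    """Approximate best frequency at position x along the cochlea."""
    return 165.4 * (10 ** (2.1 * x) - 0.88)

for x in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"position {x:.2f} -> ~{greenwood_frequency_hz(x):,.0f} Hz")
# Runs from roughly 20 Hz at the apex up to about 20,000 Hz at the base:
# low keys at one end, high keys at the other, just like a rolled-up piano.
```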

The Brain

We’re now in the brain, and we will find that same tonotopic organisation here as well.  The signal travels from the cochlea, along the auditory nerve (cranial nerve VIII, the vestibulocochlear nerve), to the ventral cochlear nucleus.  Here the information is split up a little and travels both ipsilaterally and contralaterally (on the same side and the opposite side of the brain) as it heads up to the superior olivary complex (the Super Olive, if you like).  Quite a bit happens between the Super Olive and the primary auditory cortex, but I’m not going to go through all the brain areas involved.  Some of the important things handled by the superior olive and the other interim areas include: detecting interaural differences (important for working out where a sound is coming from), separating high and low frequencies, and reacting to loud noises (which is important and will be discussed in auditory training).
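
To give a feel for those interaural differences, here is a simplified Python sketch of the interaural time difference (ITD), one cue the Super Olive uses. It assumes the ears are about 21 cm apart and uses the simple path-difference approximation ITD ≈ (ear spacing / speed of sound) × sin(angle), which ignores the head getting in the way:

```python
# Simplified interaural time difference (ITD) for a distant sound source.
# Assumes ~21 cm between the ears and ignores diffraction around the head.
import math

EAR_SPACING_M = 0.21     # assumed distance between the two ears
SPEED_OF_SOUND = 343.0   # m/s in air (approximate)

def itd_seconds(angle_degrees: float) -> float:
    """Arrival-time difference between the ears (0 degrees = straight ahead)."""
    return (EAR_SPACING_M / SPEED_OF_SOUND) * math.sin(math.radians(angle_degrees))

for angle in (0, 30, 60, 90):
    print(f"{angle:>2} degrees off centre -> {itd_seconds(angle) * 1e6:.0f} microseconds")
# Straight ahead gives 0 microseconds; a sound directly to one side reaches
# the nearer ear roughly 600 microseconds earlier than the farther one.
```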

So now we get to the primary auditory cortex.  Again, this is tonotopically mapped (like a piano), and it’s the cortical region responsible for basic sound perception such as pitch and rhythm.  The primary auditory cortex is connected to other regions, like the secondary auditory cortex, other areas in the superior temporal gyrus, the superior temporal sulcus and the frontal lobe.  To make it easier, we call the whole thing the frontotemporal system.  In the auditory cortex, adjacent neurons tend to respond to tones of similar frequency.  They can, however, specialize in different combinations of tones. Some respond to pure tones, which are rare in nature, and some to complex sounds like those made by a guitar string. Some respond to long sounds and some to short, and some to sounds that rise or fall in frequency. Other neurons might combine information from these specialist neurons to recognize a word or an instrument.

Sound is processed in different regions of the auditory cortex on both sides of the brain. However, for most people (including most left-handers), the left side is specialized for perceiving and producing speech.  Damage to the left auditory cortex, such as from a stroke, can leave someone able to hear but unable to understand language.

Now that we know the boring stuff, here are a few interesting facts about sound:

  • Our first perception is sound.  Hearing is the first sense to come online, as early as 20 weeks.  Before we can open our eyes, we can hear the world around us, which at that stage is mostly amniotic fluid and whatever our mums are up to.  Before we are born, our inner ears are already adult-sized.
  • There is research suggesting that newborns process their mother’s voice differently from the voices of strangers.
  • As the brain gets information from the ears, it is also sending information to the ears!  When we hear a loud noise, our brain sends a message to tiny muscles in the middle ear, which tighten the eardrum.  This protects the hair cells in the cochlea, which can be damaged by loud noise.
  • The ears still function while we sleep, but our brain blocks the sound out.

So now you know a little more about how sound gets from the lawn mower into our brain (and our awareness).  Please stay tuned for Part 2 of our auditory blogs where we will discuss Auditory Processing, which is how the brain sifts through and makes sense of all that noise.  We will also discuss what can happen when things go wrong with our processing and how sound affects our emotions.
