Archive for January, 2021

What Makes Sense

Friday, January 1st, 2021

As children, we’re taught that the human animal has five senses: vision, hearing, touch, taste, and smell. Since then, I haven’t thought about them very much. But when I recently started to write a post on audio cables and connectors, I was pondering the differences between audio and video, and here’s where I ended up.

Our senses are so instinctive for most of us that we hardly think of them, but they are pretty amazing in many respects. From an engineering perspective, they include redundancy and some functional overlap, but each sense also serves a unique purpose, with a unique set of characteristics.

These descriptions and comments are based mostly on my own observations and interpretations. Occasionally I looked something up, and have so noted, usually by way of a link.

In this post, I look at the senses from the following perspectives:

  1. How motion is involved in the operation of the senses
  2. How we augment the senses
  3. How the senses compare in determining distance and direction
  4. How we store and retrieve sensory information

I conclude by talking briefly about audio as a segue to that previously mentioned post about cables and connectors.

1. Motion

In a broad, external context, all of our senses can be summarized as “the ability of specialized nerve endings to detect movement.” (I’ll leave the details of how sensory information gets from your nerve endings to your brain for another time, and maybe another author.)

Let me explain my thinking.

  • Taste and smell work by detecting molecules which move into contact with the nerve endings in your nose (via air motion) and your mouth (via your eating utensils). We know from experience that these two senses are closely related. In fact, a friend of mine who was recovering from the Coronavirus recently mentioned that, as her sense of smell began to return, it worked better when she opened her mouth, “as if I am tasting the air.”
  • Touch is enabled by nerve endings in your skin, and I think it can reasonably be divided into two sub-senses, the ability to sense contact or force, and the ability to sense temperature. Sensing physical contact occurs when any object moves against your skin. Temperature sensing also occurs through the movement and contact of air with your skin. But other movements also affect the sensing of temperature. The evaporation of moisture produces a decrease in skin temperature, and long-wavelength electromagnetic radiation bumping into the skin produces an increase in skin temperature. It’s pretty miraculous that light waves produced by the Sun can travel 93 million miles through the vacuum of space, through six or so miles of atmospheric gases, and even a few fluffy clouds, to warm your skin as you stand outside.
  • Hearing sound is also triggered by movement; in this case, when a sound occurs near you, the air molecules around your ear move, pressing in on your ear drum, which moves three tiny bones against a small, liquid-filled “canal” (the cochlea). The moving liquid bends tiny nerve endings inside the canal, which send signals to your brain, and you hear speech, music, or noise.
  • Sight occurs when electromagnetic waves in the visible spectrum enter your eye, passing through your cornea, into an opening in the front of your eye (the pupil, whose size is controlled by the iris), through a lens which focuses the waves, through the liquid in your eye (humor me here, please) until they strike the rod- and cone-shaped detectors in your retina. The rods and cones send this information to your brain, and you can see!

There are other movement-related phenomena which don’t usually make the “5-senses” cut, but are also important. When you move your body parts using your muscles, nerves in the muscular structure tell your brain how far, how fast, and how hard you are moving your limbs. This important sense, called proprioception, is what allows you to touch your nose with your eyes closed (assuming you haven’t over-indulged in any sensory-altering activities).

And the nerves in your ear canals also detect which way your head is tilted, and they respond to movements of your body within the Earth’s gravitational field. This sense provides balance, which keeps you from falling over (again, assuming you haven’t over-indulged), and helpfully lets you know when you have ridden something (roller coaster, mountain road, etc.) that causes you to be nauseated.

For the purposes of this commentary I shall omit the aptly-named “extra-sensory perception”, as well as the sense that somehow detects when someone is looking at you, or when you have said something your spouse doesn’t approve of, even when they are behind you. (If you are interested in ESP, let me recommend the book Extraordinary Knowing by the late Elizabeth Mayer.)

To conclude this section on motion, I would note that the very concept of motion is dependent on time, speed, and distance (a drone leaves Boston flying toward Chicago at a rate of 300 kilometers per hour….), and speed is dependent on the medium through which one travels. To keep this diatribe at a sane length, that’s all I’ll say about that.

2. Sensory augmentation

I think of taste and smell as fundamentally based on chemistry, while touch, sound and sight feel more physics-oriented. I know some very fine chemistry folk, so I would not dare malign the discipline, but even simple chemical reactions are complex when compared, for example, to the Newtonian physics of how a pool ball ricochets off a table cushion (one of my favorite parts of the discipline).

Perhaps this difference between chemistry and physics partially explains why we don’t have commercially available products to improve our senses of smell or taste (OK, I see you there, MSG).

But over the years we have figured out how to augment touch, sight, and sound for our benefit.

Touch

For a start, we use clothing to cover our skin sensors. Clothes not only keep us warm, but can protect us from sunburn. Work gloves protect our skin from blisters and abrasion. Shoes and boots protect our feet. And as an extreme example, armor was developed to protect a fighter’s skin from harmful contact, similar to armor’s modern equivalent, the Kevlar vest.

Sight

Ear Trumpets (Wikipedia)

Glasses, contact lenses, and now laser surgery are commonly used to bend light rays before they enter our eyeballs to compensate for variations in our vision.

Hearing

Our ability to hear has been enhanced by a broad spectrum of inventions ranging from the ear trumpet to modern hearing aids, and even to medical devices implanted in the aforementioned ear canal (cochlear implants). In the other direction, ear plugs, ear muffs, and now noise-suppressing headphones protect our ears (and our focus) from destructive and/or annoying sounds.


3. Distance and Direction

Our different senses interact with our brain to provide useful indications of distance and direction.

3.1 Taste and touch

Taste is almost completely based on physical contact; you taste something when it touches your taste buds, so the distance you can taste is effectively zero, and the direction is mostly meaningless. Of course, the distribution of different types of taste buds may provide a small sense of “direction” but it’s relatively minor compared to the other senses.

Skin contact is similar — you are aware of something touching you when it actually touches you. (There are ways to trick the senses, but they are minor factors in this context.) And a single skin contact nerve is not directional; any touch is more or less perpendicular to the skin. However, when your brain assembles the signals from touch points all over the body, it draws some directional conclusions. You can determine which direction even a gentle breeze is blowing by using signals from the different touch sensors around your body. This ability to combine multiple sensors to produce additional information is notable, and is used with other senses, too.

Your skin’s ability to detect heat (or its absence) has a much broader range, as evidenced by the distant sun warming your body. And although that sense for a single nerve is also non-directional, your brain again assembles the sensors around your body to determine whether you are facing the campfire or have turned your back to it, even with your eyes closed.

Let’s focus a little more on the distance aspect. The fact that your skin can feel the heat from the sun is a combination of two things: (1) the ability of your “heat detectors” to sense electromagnetic radiation in a certain spectrum, and (2) the ability of electromagnetic radiation to traverse the vacuum of space as well as the gases which comprise our planet’s atmosphere. We’ll touch on this more as we move through the other senses.

3.2 Smell

Your nose is generally non-directional — a cloud of molecules in your vicinity will make their way into your nostrils as you breathe. You can move your head and body around while sniffing to get a better bead on the source, but it is imprecise by nature.

The distance your nose can smell is almost completely determined by external factors, primarily the strength of the original aroma, and the wind direction. As far as I can tell, it’s pretty much impossible to tell the difference between a faint aroma nearby and a strong smell at a distance.

Somewhat related, loss of smell is one of the curious symptoms of COVID-19. A recently published Harvard paper explains that the problem is not neural, but is a temporary malfunction of the nasal cells that normally detect smells. (The original article is here, probably the first time I have ever cited anything from that particular source.)

3.3 Hearing

Hearing is both distance- and direction-sensitive. The shape and positioning of our outer ear flap efficiently funnels moving air into our ear drum, but it also affects the relative amplitude of the frequencies that make up a sound. High frequencies tend to be directional, meaning that they don’t turn corners as well as low frequencies. Noises behind our ears sound different, and we can sense the difference as we rotate our head toward the sound.

We also get hints about the distance of a sound due to its relative volume combined with its frequency spectrum. Air and humidity tend to attenuate the high frequencies — listening to a neophyte bagpipe player practicing at the other end of a football field is not too painful.

Experienced sound system operators know that the low-frequency sub-woofers and bass drivers can be placed nearly anywhere, since their waves are effectively non-directional, but they have to elevate the high-frequency speakers on stands, stage trusses, or ceiling mounts for clear sound. The rule of thumb is that an audience member must be able to see the speaker to hear it (since your ears and eyes are so close together).

We know that, in practical terms, a loud noise (thunder, explosion, rock band) can be heard several miles away through the air. The distance at which hearing works is limited by its dependency on a fluid to transmit sound waves. (Yes, I tricked you a little when I said “air” above; sound also passes through water, and presumably other fluids.) Air works the best for Earthling ears, but water apparently works for whales, otherwise they wouldn’t sing. And water also transmits sound for boats, ships, and submarines (also serious fishing persons), which (who) use sound waves (called sonar) as a means of locating objects (and fish) under water.
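To make the sonar idea concrete, here’s a little sketch of echo ranging. The ~1,480 m/s sound speed is a rough textbook figure for seawater; the real value varies with temperature, salinity, and depth.

```python
# Sketch of sonar echo ranging; SOUND_SPEED_WATER_M_S is a rough
# figure for seawater (it varies with conditions).
SOUND_SPEED_WATER_M_S = 1480

def echo_distance_m(round_trip_s: float) -> float:
    """Distance to an object, given the round-trip time of a sonar ping."""
    # The ping travels out and back, so divide the total path by two.
    return SOUND_SPEED_WATER_M_S * round_trip_s / 2

# A ping that returns after half a second indicates an object ~370 m away.
print(f"{echo_distance_m(0.5):.0f} m")
```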

Returning to land-based life-forms, having two ears provides us with significant benefits. The volume of a sound (see below) as registered by each ear provides an additional indication of whether the sound is coming from your left, your right, or straight ahead. This definitely feels like a survival skill — it helps you know which way to jump when you hear the snap of a twig, the sound of a snake rattle, or a low growl. In addition, having two ears provides redundancy. If one of your ears stops working for any reason, you may lose some directional indications, but you will still be able to hear something!
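For the curious, the direction-from-two-ears trick can be sketched with a bit of arithmetic. This toy model estimates the tiny arrival-time difference your brain exploits; the 0.18 m ear spacing and the straight-line sound path are my simplifying assumptions.

```python
import math

# Toy model of interaural time difference (ITD): a sound arriving from
# azimuth angle theta (0 = straight ahead) reaches the far ear later by
# roughly d * sin(theta) / c, where d is the ear spacing and c the speed
# of sound in air.
EAR_SPACING_M = 0.18
SOUND_SPEED_AIR_M_S = 343

def itd_seconds(azimuth_deg: float) -> float:
    """Approximate arrival-time difference between the two ears."""
    return EAR_SPACING_M * math.sin(math.radians(azimuth_deg)) / SOUND_SPEED_AIR_M_S

# A sound directly to one side arrives about half a millisecond apart:
print(f"{itd_seconds(90) * 1000:.2f} ms")
```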

The frequency range of human hearing starts at around 10 cycles per second. Air oscillating slower than that is felt by the skin (and your internal organs!) rather than heard with your ears. At the upper extreme, a set of young fresh ears can hear high-frequency sounds up to around 20,000 cycles per second (the practically named “cycle per second” is now known as the hertz, after physicist Heinrich Hertz). Sounds above that certainly exist, but we call them “ultra-sonic” sounds (see chart below).

Sound Frequency Chart (Wikipedia)
Noise Chart (NIOSH)

Returning to volume, the intensity of sound pressure is most commonly measured in decibels. Sustained sounds above a certain limit can cause temporary or permanent loss of the ability to hear, as shown in the adjacent chart from the National Institute for Occupational Safety and Health (NIOSH), part of the Centers for Disease Control and Prevention (CDC).
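Since decibels trip people up, here’s a quick sketch of how sound pressure maps onto the dB scale. The 20-micropascal reference is the standard threshold-of-hearing figure; the room descriptions in the comments are rough characterizations, not measurements.

```python
import math

# Sound pressure level (SPL) in decibels, relative to the standard
# 20-micropascal reference (the nominal threshold of human hearing).
P_REF_PA = 20e-6

def spl_db(pressure_pa: float) -> float:
    """Convert a sound pressure in pascals to decibels SPL."""
    return 20 * math.log10(pressure_pa / P_REF_PA)

# Each tenfold increase in pressure adds 20 dB:
print(f"{spl_db(20e-6):.0f} dB")  # threshold of hearing
print(f"{spl_db(2e-3):.0f} dB")   # a quiet room, give or take
print(f"{spl_db(2.0):.0f} dB")    # loud enough to damage hearing over time
```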

Now let’s compare hearing to vision.

3.4 Sight

Sight is generally considered the most important sense. Paradoxically, it is also the most complex sense. Here are some characteristics of eyes that differ from ears.

  • Your ears are fixed (other than the amusing ability to wiggle them that some people have), while your eyes can move left, right, up and down.
  • Your ears simply process the sound waves that strike them, while the lenses in your eyes can change focal length to better perceive near and distant objects.
  • The iris in your eye opens and closes to control the intensity of light striking your retina, which allows you to see in a variety of illumination levels, while your ears have no such dynamic adjustment capability.
  • Your eyes have two primary kinds of sensors: rods, which see in gray-scale, and cones, which see color in high resolution. The cones are in the middle of the visual field; they see with high resolution, but require relatively bright light. As your eyes adjust to the darkness, the rods take over. When trying to spot a faint object at night, it helps to look away from it slightly, because the rods are denser on the periphery of the visual field. (There is a third type of sensor, recently discovered, described in this article on photoreceptor cells. Who knew? Besides Wikipedia, of course: “These cells are thought not to contribute to sight directly, but have a role in the entrainment of the circadian rhythm and pupillary reflex.”)
  • If a sufficiently bright light hits your eyes, or an object flies toward one, your eyelids close instinctively for protection.
  • Your eyes also converge to allow better focus on objects less than 20 feet away.
  • Your eyes close when you sleep, partly for self-protection, and presumably to give the vision-processing part of your brain a break.

Even more than hearing, sight is distance-sensitive and direction-sensitive. Your youthful eyes’ lenses can focus on an object as close as an inch away. Sometime around middle age your lenses lose their flexibility, which explains the popularity of reading glasses among people who reach that age. The act of focusing on an object is one of several indicators your brain receives to provide you with an indication of the distance of the object. I worked on 3D television a few years ago, and its focus (heh) was on one of the main “binocular” aspects of depth perception, namely the fact that your eyes see slightly different images when you look at a three-dimensional scene. Note that this is another example of your brain combining information from multiple sensors to produce additional information.

Other factors that provide distance and depth perception (relative distance) include:

binocular effects (requiring the use of both eyes)

  • disparity – your left and right eye see slightly different images
  • parallax – when you focus on a near object, the background appears different to your two eyes
  • vergence – when you look at a near object, your eyes converge; when you focus on a more distant object, they diverge

monocular effects (noticeable using just one eye)

  • relative size – when you look at a row of parked cars, the more distant cars appear smaller
  • texture gradient – the texture of near objects is more detailed than that of far objects
  • occlusion – near objects block portions of farther objects
  • perspective – a straight road appears to converge to a point as it nears the horizon; (learning this is an important key to making realistic sketches)
  • contrast differences – similar to texture gradient; objects in the foreground have higher contrast, and
  • motion parallax – when you’re riding in a car looking out the side window, nearer objects seem to pass quickly, while more distant objects (like a far-away tree line) seem to be moving slower
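For readers who like formulas, the disparity cue above can be sketched with the standard pinhole-stereo relation Z = f × B / d. The 0.065 m baseline approximates human eye spacing; the focal length and pixel disparities are made-up numbers for illustration.

```python
# Pinhole-stereo depth relation: depth Z = f * B / d, where f is the
# focal length (in pixels), B the baseline between the two viewpoints,
# and d the disparity (pixel shift) between left and right images.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth in meters of a point, from its left/right image disparity."""
    return focal_px * baseline_m / disparity_px

# Nearer objects produce larger disparities:
print(f"{depth_from_disparity(800, 0.065, 52):.1f} m")  # larger shift, closer
print(f"{depth_from_disparity(800, 0.065, 13):.1f} m")  # smaller shift, farther
```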

The ability to sense distance is important to sports players, who need to perceive distance quickly and accurately in order to kick, hit, or catch a ball. But it also allows you to do such mundane things as pour liquid into a glass at arm’s length without spilling it, and efficiently navigate your world without running into things.

Direction of vision is quite different from direction in hearing. In short, you can hear things around corners, but you can’t see things around a corner (without help, anyway, from a periscope, mirror or fiberscope). Not only can you not see around corners, you can only see things located in front of your eyes. You can’t see things that are too far above, below, or behind them. Fortunately your eyes can move in their sockets, your head can turn, and your peripheral vision is pretty amazing compared to a camera lens, for example.

3.5 Senses in space

As an amusing little side trip, let’s pause to do an “astronaut check” on our senses; what senses work in space?

  • Vision? Fortunately you can see pretty well, since electromagnetic waves (light) can pass through the vacuum of space (assuming you have a clear cover on your space helmet).
  • Hearing? Not so much; your hearing requires a fluid, preferably similar to Earth air, which outer space is notoriously short of; that exploding starship you can see through the spaceship porthole or camera is completely soundless to you, unless some ill-fated person on board the ship is broadcasting the sounds to you via radio waves.
  • Temperature? It turns out that managing the temperature extremes in space is a significant engineering challenge. The energy from a nearby sun is absorbed by your space suit and potentially transmitted through your suit’s pressurized air, but the suit is also an efficient radiator of heat into an airless void. There’s an article here from Quora via Forbes that provides a little more explanation.
  • Touch and smell are hampered by your space suit; you might smell yourself while in your suit, and you can fortunately feel your astronaut tools through the gloves (at least a little bit).
  • And taste, already noted for having a distance of zero, works about the same in space.

4. Storage and retrieval of sensory-based information

Over the centuries, humans have invented methods for storing and retrieving some kinds of information. Consistent with my previous comments about physics and chemistry, we have yet to find a scalable way to store and retrieve smells, tastes, or even touches (that would definitely be creepy), but we do pretty well with sight and hearing. (Although there is promise; as I was writing this, I ran across a Gizmodo article that describes a lickable device that creates different taste sensations on your tongue.)

4.1 Analog storage

In this context, by analog storage I mean any method of capturing sensed events that does not require the use of modern computing equipment.

The recording of visual images probably started with cave dwellers marking their walls for decoration, or record-keeping. Audio “recording” probably started with the verbal transmission of stories and songs, building on a spoken language.

The development of written language led to the ability to record stories and other written information. Monks re-wrote stories as a means to reproduce them.

Crude photographic technology (vision) has been around since the early 1800s, slightly beating out Edison’s invention of the phonograph in 1877 (hearing). Commercially available products for recording and playing back synchronized audio and video became available during my lifetime. While I was a student at Georgia Tech, our department purchased one of the first Sony Video Tape Recorders which used 1/2″-wide magnetic tape.

Modern analog recording devices still capture high resolution video and audio, typically using tape recording based on magnetism. Despite this, the recording and playback process always changes the content in detectable ways. This was pretty obvious with the first photographs — while they served a valid purpose, they would never be confused with the original subjects. The same is true about audio recording. Early recorded music was like an elephant doing ballet: the accomplishment was not how well it did it, but that it did it at all.

Today, recorded audio and video, and amplified audio produce pretty high-quality versions of the original content, but you can always tell a difference. A light-based image or video camera can only capture what it is pointed at. Smartphone cameras have significantly improved the quality of snapshots and home videos, but you don’t confuse them with reality. Since the “is it live or is it Memorex?” days, audio systems have also improved. A top-of-the-line 5.1 amplifier/speaker system (left, center, right, left rear, right rear, and subwoofer) makes for an enjoyable and exciting theater-like experience in your home, but you aren’t likely to confuse the sound with reality. Recordings provide a way for us to share and recall real-life experiences, but they don’t replace them.

4.2 Digital storage

In this context, by digital storage I mean the method of capturing sensed events that uses modern computing equipment using digital encoding techniques. As an example, this Wikipedia article describes digital audio encoding.

Claude Shannon of Bell Labs gets the credit for defining binary digits (bits) as a way to measure information. Appropriately, in our modern digital lives, our file sizes reflect the amount of information required to store and reproduce different types of content.

4.3 Information storage comparison

Text file of lyrics and chords
  • The lyrics of a song in a text file might require 1,000 to 2,000 bytes of character information.
  • A music score which captures the melody, harmony, and timing as symbols might require 30,000 bytes of data.
  • An audio recording of a song, encoded using an efficient compression method such as MP3, might require 2,000,000 to 3,000,000 bytes. This captures the audible vocal nuances and the background instruments.
  • A video recording of the same song might require 100,000,000 to 200,000,000 bytes. This captures a camera’s view of the scene, likely including some close-ups, some wide shots, perhaps closed captioning of the lyrics, and even some Ken-Burns-type effects on promotional posters or other still images mixed in.
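As a back-of-the-envelope check on those numbers, here’s the arithmetic for a 3.5-minute song. The sample rate, bit depth, and 128 kbit/s MP3 rate are typical values, not measurements of any particular file.

```python
# Back-of-the-envelope sizes for a 3.5-minute song:
# CD-quality PCM audio is 44,100 samples/s, 2 channels, 2 bytes/sample.
SAMPLE_RATE = 44_100
CHANNELS = 2
BYTES_PER_SAMPLE = 2
song_seconds = 3.5 * 60

uncompressed_bytes = SAMPLE_RATE * CHANNELS * BYTES_PER_SAMPLE * song_seconds
print(f"Uncompressed PCM: {uncompressed_bytes / 1e6:.0f} MB")

# A typical 128 kbit/s MP3 shrinks that by roughly a factor of ten,
# landing near the 2-3 million byte range above:
mp3_bytes = 128_000 / 8 * song_seconds
print(f"MP3 at 128 kbit/s: {mp3_bytes / 1e6:.1f} MB")
```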

It’s intentionally obvious from my descriptions that each format requires more “information” as you move down the list. It’s also true, although perhaps less obvious, that the creation, editing, storage, and distribution processes get more complicated, too.

Focus on Audio

From my perspective, audio recording and reproduction strike a nice balance between quality and accessibility. I don’t mind editing and formatting video content, but the sheer scale is often daunting. As I was writing this sentence, I was also uploading a 24-minute video to the internet to share with my family. It took me about 6 hours to edit the video, and it took about ten minutes on a decent internet connection to upload it.

For some reason, there are more readily available options for editing, amplifying, and distributing audio content, and most of the older technologies still work pretty well.

Zoom H1

That’s not to say the technology has been stagnant — my 5-year-old shirt-pocket-friendly Zoom H1 stereo recorder has been a faithful friend, capable of recording 12 hours of high-quality stereo audio on one AA battery. It apparently has a new sibling, the H8, which can record 4 channels fed by 4 XLR inputs, in addition to their standard dual microphone capsule. (I’ve been afraid to check the price!)

This focus on audio technology brings me back to the original intent of this discussion, which was to lead into an overview of audio cables and connectors. Faithful readers who have made it this far won’t be surprised that this post has gone way beyond that. But if you happen to be interested in knowing more about audio cables and connectors, (not to mention the names for the 4 different genders of XLR connectors), here’s the link: http://iideaco.com/cables.