Archive for May, 2011


Tuesday, May 24th, 2011

In math (does it show my age that I don’t say “maths”?) and science, a baseline is a reference, a metric against which other measurements may be compared, a sort of “normal” state.

In an initial writing I postulated a baseline, namely, that I have a Peculiar Perspective. Today I found a cartoon that adds another baseline view. Click for larger view. Consider yourself warned.

Age and Music

Tuesday, May 17th, 2011

The May 4 broadcast of Fresh Air featured an interview with James Levine, recently retired orchestra conductor for the New York Metropolitan Opera. His approach to conducting in rehearsal versus conducting during actual performance seemed very logical, yet novel.

But the part of the interview that really caught my ear was when Terry Gross asked him about his conducting debut at The Met in 1971, at the age of 27. (His debut conducting the Cincinnati Symphony occurred 17 years prior, at age 10!) As part of his response, he noted that musicians tend to focus on musical skill and ability rather than age.

Revelation, circa mid-70s

I have made it clear in conversation and in writing that I am not a skilled musician, just a persistent one. I dearly love rehearsing, performing, and just “pickin'” with other people, but I am awful at “woodshedding,” the term applied to the intense solo practice required to improve one’s skills. As a result, it is more accurate to say that I know how to ‘operate’ my instruments rather than how to ‘play’ them.

And yet, I have had some wonderful opportunities to perform with musicians of all ages during all stages of my life. I started out singing with my older parents and my younger sisters around the piano in the living room. When I was a teen, I sang in choirs with some old timers whose voices ranged from the gravelly to the mellow, and began to learn guitar picking from my older cousin (OK, he was just a year older, but that seems like a lot when you are 14).

I’ve performed with a 100-year-old banjo player, and sat in once with a country band of high-schoolers who called themselves “Fenced In.” I played handbells for years in a choir that ranged from teenagers to retirement home denizens. I won’t include my own talented children in this list, since they sort of have to let me pick with them, except to say that one summer Bo did invite me to accompany him in several very pleasant outdoor performances hosted by a Decatur wine and cheese bar.

Through all of these experiences I can’t ever remember feeling like I was being judged by other musicians because of my age. Insufficient skill, weak chops, lack of practice, bad puns, and even lack of shoes on occasion, but not age. My personal musical experiences have confirmed James Levine’s assessment.

OK, maybe there’s one exception. Not too many years ago, my wife helped assemble a band of college-aged musicians for a series of Sunday night contemporary church services, and, almost accidentally, pulled together some of the coolest musicians I’ve ever played with, run sound for, and schlepped equipment with. These guys seemed larger than life, and they went on from that church event to play together as a band for a year or so, write some cool songs, and record a couple of CDs. Though they have each gone their separate ways in music and in life, they are still good friends, and I still cherish, and proudly answer to, the nickname they gave me during their nicknaming phase. During that phase, for example, they called my son who picked with them “Tall,” they called his younger sister “Tallette,” and they called my wife “Hot Mama.”

So what did they call me?


“Filter bubbles”

Wednesday, May 4th, 2011

There is a fascinating and disturbing 10-minute TED talk by Eli Pariser, in which he explains how your favorite internet sources (Facebook, Google, etc.) are controlling the content you see, without your knowledge or permission. This is not a conspiracy theory, but is based on changes Eli has recently observed in his own internet results. He explains how and why this is happening, and why it is not a good thing.

In the middle of the talk, I was reminded of Lawrence Lessig’s 1999 book, Code and Other Laws of Cyberspace, which describes ways in which software implementations can subtly (and not-so-subtly) promote a particular agenda or position.

Pariser’s TED talk is at

Security and cognitive biases

Tuesday, May 3rd, 2011

We are hosting a family of two cats, two dogs, and two humans this week. The cats actually arrived last Thursday, three days before the rest of the bunch. Their behavior surprised me, even though I know cats pretty well. For the first three days, they had a quiet house completely to themselves, and yet they were very shy and skittish, almost totally unapproachable.

But when the rest of the family, including the two very active dogs, showed up, the cats underwent an almost immediate transformation, from shy to confident. They no longer ran from our touch, but would readily let us pet them. They roamed into places they had previously avoided.

For some reason, the presence of their “mommy and daddy,” and their canine “sisters” made them feel more secure than they felt in a quiet home they had all to themselves.

Many of our reactions to modern life seem just as strange to me as the cats’ behavior. Here are a few anecdotes for your consideration:

  • I know someone who carefully blacks out her name and address before throwing out junk mail. When I pointed out that her name and address are readily available public information, she said, “But that doesn’t mean I have to make it easy for them, putting it right there in front of them. Besides, it makes me feel safer.” I’m not sure who “them” is, why they would have access to her trash, or what they could do with her name and address.
  • I also know someone who refuses to wear seat belts. She fears being trapped in a burning car more than she fears being violently projected through the front windshield, even though the odds of the latter are much greater than the former.
  • I know someone who refuses to transact any business via the internet, for fear someone will be able to get all her money somehow. As it turns out, several recent breaches of “secure” corporate databases have made that fear considerably more understandable, perhaps even justified.
  • I know several people with burglar alarms who have decided that the embarrassment of disturbing the neighbors with a false alarm outweighs the benefits of activating the alarm, so they just never turn it on. And it is not unreasonable to assume that the alarm company sign in the yard has as much actual security effect as the alarm system itself.

Bruce Schneier is a nationally recognized cryptographer, computer security specialist, writer and expert on the topic of security. He gave a TED talk last year that was recently posted by the TED folks. His talk explains much of what I observe about our reactions to threats, danger and security. He also has something in common with Chuck Norris.

Here are some points Bruce discussed in his talk.

The term “security” can really refer to two different things: the feeling and the reality. It is possible to feel secure without actually being secure, and it is possible to be secure without feeling secure. Of course, what we all want is to both feel and actually be secure.

Security is almost always a tradeoff. You trade things like money, convenience, capabilities, or fundamental liberties for an increase in security.

And the question then becomes not whether our security efforts make us safer (they almost all have some positive effect), but whether they are worth the tradeoff.

There are rarely any clearly defined right or wrong answers, either. “Should I get a burglar alarm?” Well, it depends. What is your house like? What is your neighborhood like? How valuable are your belongings? How much risk of theft are you willing to accept? Will you even remember to activate it?
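One way to make “is it worth the tradeoff” concrete is a back-of-the-envelope expected-value comparison. This little sketch is mine, not Schneier’s, and every number in it is hypothetical – the point is only to show how “it depends” plays out in arithmetic:

```python
# Hypothetical numbers, purely for illustration: compare the expected
# annual cost of burglary with and without an alarm system.

value_at_risk = 5_000             # assumed value of easily stolen goods ($)
burglary_prob_no_alarm = 0.02     # assumed annual chance of a burglary
burglary_prob_with_alarm = 0.005  # assumed chance if the alarm deters most
alarm_annual_cost = 400           # assumed monitoring fees, etc. ($/year)

expected_loss_without = burglary_prob_no_alarm * value_at_risk
expected_loss_with = burglary_prob_with_alarm * value_at_risk + alarm_annual_cost

print(f"Expected annual cost without alarm: ${expected_loss_without:.0f}")
print(f"Expected annual cost with alarm:    ${expected_loss_with:.0f}")
# With these particular assumptions the alarm costs more than it saves.
# Change any one of them (a pricier house, a riskier neighborhood, a
# cheaper alarm) and the answer flips -- which is exactly the point.
```

Of course, real people don’t run these numbers; they go with their gut, which is where the biases below come in.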

In general, people have a natural intuition for security decisions. The tradeoffs around double-locking the door of your hotel room, or buckling your seat belt, are very reasonable to most people.

Security decisions can also be viewed from an evolutionary perspective. The rabbit in the field hears a noise, and must make a decision: do I keep eating, or should I flee? Make the wrong decision, and you either starve, or you get eaten.

From that we might conclude that humans are good at making correct security decisions. Unfortunately, we are not. And why not? The short answer is that we opt for the feeling of security rather than the reality.

Throughout most of human history, the perception and the reality of safety have been closely aligned. But one could argue that our reactions are still tuned for living in the East African highlands around 100,000 BC. Those instincts may not be so helpful in contemporary New York City.

We have some key biases that color our perceptions:

  1. We tend to exaggerate spectacular and rare risks, and downplay common risks. That’s why flying still seems more dangerous than driving, despite the statistics.
  2. The unknown seems riskier than the familiar. That’s why we tend to fear that our child will be kidnapped by a stranger, when it is far more likely that the kidnapper will be a relative.
  3. Personified risks are perceived to be greater than anonymous risks. That’s why we used to fear Osama Bin Laden more than terrorists in general.
  4. People underestimate risks in situations they do control, and overestimate risks in situations they don’t. If you take up skydiving or smoking, you tend to downplay the risks, compared to a danger that seems outside your control, such as terrorism.

Other cognitive biases also affect our perception of security.

  • One is “availability” – we estimate the probability of something by how easily an example comes to mind. If we hear a lot about tiger attacks, but not much about lion attacks, we rightly fear the tiger more. And this worked fine until the advent of the newspaper industry. Newspapers tend to publicize rare risks out of proportion to reality. Bruce tells people, “If it’s in the news, don’t worry about it.” By definition, things that appear in the news are rare! Common and likely occurrences, such as car crashes or domestic violence, just don’t make the news.
  • Our cultural identity as storytellers produces another bias. [This is one of my favorite hot buttons.] We tend to react to, and remember, stories much more strongly than we do data and statistics.
  • Another cognitive bias that affects our perception of security is basic innumeracy – we are pretty good at visualizing small numbers and small odds, but not at all good with very large numbers (a trillion dollars) or very small odds (one chance in a million).
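The innumeracy point is easy to demonstrate with a little arithmetic of my own (not from the talk). “A million” and “a trillion” read almost identically on the page, but they describe wildly different quantities:

```python
# How hard are large numbers to intuit? Convert a million seconds
# and a trillion seconds into more familiar units.

SECONDS_PER_DAY = 86_400
SECONDS_PER_YEAR = SECONDS_PER_DAY * 365.25

million_seconds_in_days = 1_000_000 / SECONDS_PER_DAY
trillion_seconds_in_years = 1_000_000_000_000 / SECONDS_PER_YEAR

print(f"A million seconds is about {million_seconds_in_days:.1f} days")
print(f"A trillion seconds is about {trillion_seconds_in_years:,.0f} years")
```

A million seconds is a week and a half; a trillion seconds is longer than recorded human history. If that surprises you, you’ve just felt the bias at work.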

All of these cognitive biases act as filters to our perception, and the result is that we often feel more or less secure than we actually are.

Schneier goes on in the talk to discuss “security theatre,” the difficulty of evaluating protection schemes for events that don’t happen very often, or for changes that span decades. He explains why and how mental models help us understand complex situations, and where those models tend to come from. He discusses the various agendas pursued by different security stakeholders, and how confirmation bias makes it very difficult for us to change our models.

Finally, he talks about our reliance on others for safety, for example, in pharmaceuticals, air safety, and building codes, and he describes his ultimate goal, which is to provide people with better security models so they can make better choices.

I think I’ll stop without trying to hammer conclusions into your head. I may revisit the topic later – you might want to find a hard hat.

Meanwhile, if you want to check out Bruce Schneier for yourself, here are some links: