Berra’s Law

That great sage and commentator on the human condition, Yogi Berra, is credited with creating this commandment-like law of technology:

In theory, there’s no difference between theory and practice, but in practice, there is.

Now, of course, Yogi neither intended nor could have foreseen that his statement would become one of the unwritten laws of the AV/IT universe, but it has. You might be pondering precisely how that happened, and why.

Well, ponder no more; the answer lies within.

A Theory Is Created by Deduction and Supposition

Everyone has a beloved theory, maybe a lot of them. Unfortunately, most will follow the legendary console designer Dave Harrison’s law: “There is always an easy answer to every problem — neat, plausible and wrong.” But there is always that outlier theory, the one that actually ends up being either true or at least probable, theoretically (no pun intended).

And that’s where the trouble starts. Because valid or apparently valid theories can also generate or be used to create theory-based models.


Want an everyday example? Weather forecasting! I’m sure we’ve all seen the hurricane prediction model tracks showing where a storm is ‘likely’ to go — but did you notice that there are five, six, seven or more such models and tracks? Each is based on a set of assumptions, or, if you will, theories, grounded in solid, validated observations and documented facts.

The reason we have so many is that, to get within range of something accurate for a 24- to 48-hour window (beyond that, all bets are off), forecasters need to consider that all the models could be wrong, that one could be right or partially right, or that none of them is even close to what will actually happen. After all, they are based on theory — solid, mathematically verified, carefully analyzed and checked, but theory nonetheless. So multiple variations exist and can be compared, and possibly, out of that complex stew of possibilities, one emerges with a higher probability of being correct. Maybe!

More importantly, although they would be hard-pressed to admit it, most forecasters are believers in the well-known “law,” attributed to various pundits, that says: “Under the most carefully controlled conditions of pressure, temperature, humidity and other variables, the system will perform as it damn well pleases.” They instinctively recognize that any theory, and thus any model based on that theory, is likely to be flawed in ways we have not yet detected or do not yet understand. So the models end up as best guesstimates — scientifically calculated, logically derived — but still guesstimates.


We Predict, We Model, But Do We Have a Clue?

So here we are, with estimates, predictions, models, theories and a whole lot of Scientific Wild Ass Guesswork (SWAG) driving our design process and producing our proposals. All the highly researched, data-intensive models and predictive software still produce results that are approximations, not facts.

We have become a data-focused business, relying more and more on models and predictions to guide us to solutions and to give us confidence that what we expect to happen WILL happen. That’s all well and good, except for the intrusion of that ugly reality about the system doing whatever it damn well pleases.

We have focused on ever more complex ways of getting to an answer, creating a proposal and modeling a design, but the real question, lurking just out of sight, is this: Do we actually know, with any greater certainty, that all that ‘stuff’ is producing better results?

Occam’s Razor

I would offer the following proposition for your consideration. The answer to the above question is, optimistically, maybe. What is getting lost, or at least forgotten, in the rush to data-centric methodologies is the unalterable fact that the most obvious answer, the simplest one, is routinely overlooked.

The principle, commonly referred to as Occam’s razor, was articulated by the scholastic philosopher William of Ockham (1285–1347/49). It states, in its crystal simplicity: “When presented with competing hypothetical answers to a problem, one should select the answer that makes the fewest assumptions.”

Somewhere along the path to complex simulations and theoretical modeling, based on theoretical math, based on theoretical assumptions, we lost that concept.

I would respectfully suggest that its return is well past due.

To make this work requires only that you consider the following:

What we need to avoid at all costs is ending up here: if your prediction ends up being disproved, the discovery will not be to your benefit, I suspect.

Focus on the Facts

In order to make sure we don’t get lost in our data, we need to stop and look at each step of a project with an eye toward Occam’s razor. If it’s too complicated to explain, it’s simply too complicated. I recommend the 30-second rule: If you cannot explain the purpose of some piece of software or equipment to your client, or to another non-technical person, in 30 seconds of simple language, then it’s too complex for them to grasp, and it’s more likely to meet resistance because of that complexity.

If, on the other hand, you can present your theory, application and/or solution in a straightforward this-does-that, solves-this-problem format, your chances of success go way, way up.

How do I know? I have been using this reduce-it-to-simple-sentences, cause-and-effect method of explanation for quite a while now, and the number of clients who resist or are reluctant to move forward has dropped off substantially. Simplicity works! What William of Ockham said nearly a millennium ago is still true, and if you focus on it, you can make every project flow without disturbance. Think about it.

Dr. Frederick Ampel

About Dr. Frederick Ampel

Dr. Frederick J. Ampel has been involved in the professional A/V industry for more than 40 years, working as a systems designer, consultant, sales and marketing professional and market researcher. He was the founding editor of Sound & Video Contractor and of SCN, as well as an editorial development consultant for Residential Systems Magazine, Live Sound International and ProSoundWeb. He holds a PhD in Acoustics and an MS in Broadcasting and Electrical Engineering from Boston University. He has been published by the Institute of Acoustics (UK), the Acoustical Society of America, AES, NSCA and InfoComm, and has been quoted in USA Today, The Wall Street Journal and numerous industry publications. He has volunteered with and taught for CEDIA, NSCA, InfoComm and AES. In 1991, Fred founded Technology Visions Analytics, a consultancy and market research firm, which he still runs today. Reach him via email.