Is It a Standard if Nobody Uses It?
The AV/IT industry has a problem, one of its own making and one that is getting worse by the microsecond. There are just too many conflicting and confusing “standards” and best practices. Between those created by trade groups, those developed by technical associations or technology consortia (worse yet, as these tend to be proprietary) and the flavor variations created by various international overlays on existing standards, the pile of paper is approaching a mountain range in height.
Don’t believe me? Do your own web search and see how many listings you get for the terms AV standards, audio standards, video standards and a few other similar word groups. Our results produced well over 90K hits and 20+ pages of non-duplicate listings under just the search term AV standards — and the numbers balloon when you add in the related topics or categories.
How many of these numerous sets of rules and procedures are you using in your business on a daily basis, or for that matter at all? I would wager not a whole hell of a lot, if your company or facility is typical of the industry norm.
Now I need to be clear on something here. I spent more than two calendar years’ worth of time over the last seven to eight years working on two major InfoComm (now AVIXA) performance standards development projects: Audio Coverage Uniformity (AVIXA A102.01:2017, formerly ANSI/INFOCOMM A102.01:2017, Audio Coverage Uniformity in Listener Area) and Spectral Balance (Sound System Spectral Balance, document number ANSI/AVIXA A103.01:201X). I also volunteered and worked on several AES standards efforts, so I am speaking from a perspective that supports the concept of standards.
However, it has become increasingly obvious that this perspective is not as widely valued or accepted as those pushing the standards efforts would have us believe. Having spent literally hundreds of hours in the committee trenches of this world, I often find it very hard to understand why something that appears so logical and useful is not being put into practice in the real world.
Rules Are Complex and Often Expensive
So I took a look at the situation as dispassionately as possible to try to understand why this disconnect occurs between what seems useful and what actually gets used.
Taking a 20,000-foot viewpoint makes it somewhat easier to see where the problem lies. Conceptually, any standard, especially one that is performance focused, requires that the user invest both time and training in its deployment.
Time is valuable. Having to commit a non-trivial amount of it to adding a skill set to a technician’s or engineer’s workload and project responsibilities, as well as incorporating the required hardware and operational instruction into a project, is going to raise a warning flag at the cost/benefit line of the budget.
This is especially true if the standard is developed while a project is already underway, so that its implementation was never planned for or budgeted initially.
It then amounts to what is often a significant, change-order-style cost, and that can create major problems up and down a project’s timeline. Since time ALWAYS equals money, the change will have to be paid for somehow. The question becomes: who is paying, and where in the budget does this unplanned cost end up?
And That, Ladies and Gentlemen…
Is the crux of the matter. There is money on the line, and that means, more than likely, somebody is going to have to get approval and sign-off on the additional cost. With project budgets being as tight as they are in today’s mixed AV/IT world, that might well prove next to impossible.
The only way around this pothole on the project roadmap is to convince the client or end user that the additional expense of applying the standard will deliver a better result, a more efficient one, or some other upside for them. If that can be accomplished, there is a chance of supporting the application of the standard, because there is a measurable ROI from its use.
Realistically
This is a bottom-line problem, because it’s a cost problem. If using the standard were zero cost, none of this would matter, but that is simply NEVER going to be the case. Any standard that requires verification of performance requires a measurement of some kind; measurements mean measurement systems and equipment, training users on how to take those measurements, processing the data from those measurements and so on.
All of that equals dollar signs in the minds of the project managers and the financial-side folks. If the cost cannot produce a direct benefit, as noted above, there is very little chance of the expense being validated; the deployment of the standard is therefore contingent on the ROI it can generate.
Showcasing ROI
Having sat through hundreds of conference-call minutes and listened to endless, often round-robin discussions of samurai-blade-fine points of technical dispute, you soon realize that what is NOT being built into the standards development process, often not even after it is complete, is establishing where that ROI lives and how to value and validate it.
Instead of being treated as integral to the standard’s development from the outset, that aspect is more often than not an afterthought, addressed well after the actual technical work is finished. In my view, this is simply wrong.
If the value of the standard is built into its development, it will stand a far greater chance of being used and of creating the desired results. It must be integral to the whole creation process or it simply doesn’t work. I would strongly urge any standards development group, committee, organization or creator to carefully consider this issue from the earliest possible point. Only then will the industry start to achieve the goals it wants from its standards efforts!
