By Greg Bronson, CTS-D
Whether you are a technology manager or a vendor in the education AV market, much of what we do consists of incremental improvements to an organization’s broader AV communications infrastructure. This work is planned in the context of the latest and greatest technology and within a — perhaps, lately, not so great — budget. So at any given point in time, how does one really know how to prioritize available funds across a broad range of spaces needing technology improvements? Step away from the crimper and think assessment!
Assessment is “the classification of someone or something with respect to its worth.” This certainly is not a new idea for AV in education; I vividly remember my early career contributions to a detailed process for the institution I was employed by in the late ’80s. In fact, odds are your (and/or your client’s) organization has its own existing form of assessment, whether you call it that or not. It’s probably associated with the “cousin” processes of equipment inventory, campus standards, commissioning tests and/or space programming.
To be clear, what I’m referring to is a process that is applied equally across all existing systems, with the overall goal of determining their effectiveness in supporting audiovisual communications. Approached from the Pro AV vantage point, the emphasis is on verifying performance of the subsystems: audio, video and control. Approached from the academic vantage point, the emphasis is more on the physical space, pedagogy and academic programming. The best assessment process maintains an awareness of all these elements.
The process needs to strike a balance between gathering enough data to establish a fair “snapshot” of each space/system and not requiring too much time to actually complete each evaluation. It needs to be anchored to benchmarks, if not actual standards. Spaces that fall below a minimum threshold need to be flagged for upgrades. Likewise, those with elements that are “stars” in their category should be flagged for replication elsewhere.
Above all, make sure the basics are nailed for all spaces. Groovy-looking rolling furniture is great – if everyone in the room can clearly hear what is being said! And, by the same token, creating and adopting new standards is great – if the majority of spaces/organizations have the resources to actually implement them. There is a danger of allowing the process and/or results to be skewed by a halo effect of hype (i.e., being billed as “innovative”) or burdened by detail beyond the resolution needed to facilitate organizational decision-making. Neither can be entertained, especially in these times.
Quite frankly, performing a large-scale assessment project is not very glamorous. It certainly doesn’t stand out like the latest technology innovation. At the same time, it needs to be performed by individuals with a comprehensive background in the elements being assessed. It is work that might be done in-house or outsourced. Either way, once the data is summarized, the organization will be in a much better position to plan future work. And then, what is old is new again.
The views expressed in this column do not necessarily reflect the views of the author’s employer(s), past or present.
Greg Bronson, CTS-D, applies AV technologies in the development of innovative learning spaces for higher education. Greg spent the first 10 years of his career as an AV technician and service manager, and the past 12 years as an AV system designer and project manager. Bronson currently works for Cornell University and has also worked for two SUNY (State University of New York) campuses as well as a regional secondary education service depot. Bronson is the originator of the concept for InfoComm’s Dashboard for Controls and has had completed projects featured in industry publications. You can reach Greg at firstname.lastname@example.org