This month’s topic is a bit personal, but no worries, it’s not that personal. Actually, the full title should be: What’s your test media? Of course, there is no right or wrong answer; it’s a matter of personal taste and often seems to default to whatever is handy. Sure, in the grand scheme of things, the media you pick to test an AV system is not worth losing sleep over. But it came across my radar again recently, while preparing for a seminar titled Technology Manager’s Toolkit 2010 (which I’ve been asked to present at Infocomm), and I felt it worthy of covering here.
Let’s be honest. How many times have you (or someone you observed firsthand) worked on an AV system worth more than the down payment on your house, only to pull out some crummy copy of an unwanted movie to test it with? Adding insult to injury, an arbitrary 20-second clip gets played for a quick go/no-go test. If something resembling a picture appeared and sounds were heard, it was time to move on.
If your answer is some form of “been there, done that…” then I want to be the first to congratulate you! Yes, that’s right: while the AV Club is about to take some issue with how this is done, that shouldn’t overshadow the fact that making the effort to replicate what the real end user does day in and day out (playing real media) is to be commended. This kind of thing goes a long way toward keeping users happy and complaints at bay. Yet, as AV Pros, we expect more of our systems than just “It works!” …Right!?
Practically speaking, though, short of setting up thousands of dollars’ worth of test gear to conduct niche tests each time (which may well be warranted, as determined by the Pros), how is a higher level of expected performance ensured? What more than a go/no-go test can, or should, be accomplished? The answer lies in using the best test instruments an AV Pro has available — our eyes, ears and sense of touch! And, contrary to popular belief, doing so doesn’t require “golden” ears or eyes. What is required is appropriate test media, along with intimate experience of how good (or bad) that same media can look, sound and be manipulated.
Starting with selecting the media, let’s first remind ourselves that we’re in agreement the test media had best be rated PG (or lower). Also, we’re not suggesting that popcorn get popped for some extended viewing in the field disguised as “a test.” OK, with that out of the way, let’s get to some specifics.
The format the media is available in is a practical consideration; ideally, it allows you to use the same, familiar content across the many formats being supported. Next, the selection should be heavily influenced by the degree to which it “exercises” the system. For example, a video clip should have some dark scenes with small differences in contrast; an audio clip should have a transition from a very quiet to a loud passage, as a dynamic-range example. Lastly, pick content you enjoy, since you’ll have lots of chances to play it.
For video, we have two primary content classifications — motion video and graphics. On the motion video side, I like Coral Reef Adventure. It’s available on DVD, Blu-ray and even online (in the form of downloadable digital trailers), not to mention it was natively shot in the ultimate gold standard: IMAX. There are many dark underwater shots that challenge display contrast ratio, as well as vibrant colors that demonstrate color saturation. While not exactly an action thriller, it has a number of scenes sufficient to evaluate motion artifacts. For graphics, there is the good ol’ “poor man’s test pattern” — a simple Word page filled with the capital letter H over and over again. This works really well for a gross evaluation of uniformity of focus and convergence.
For audio, we’re concerned with both speech intelligibility and program audio reproduction. Speech really needs to be a two-person test, with one person speaking into the microphone (being the live “media,” and reading aloud whatever is handy) while the other listens. It’s best to then flip roles and compare impressions (for a summary critique). When another live voice is not available (you can’t talk and listen at the same time), I like some CBS Radio Mystery Theater radio dramas. Picking one with a range of characters (including some with challenging dialects) works really well for judging intelligibility throughout a space. For program audio testing, one of my favorites is a Dire Straits album. They have good (wide) dynamic range and distinct (and wide-ranging) frequency elements in many songs. We’re listening for overall frequency response and minimal distortion here. But the testing gets a bit more interesting when you kick the volume up several notches to check for buzzes and rattles.
A final note relates to the “touch” sense, i.e., how the media is manipulated. Consider the media’s use of both regular and special commands. For example, on a DVD, the basics — stop, play, FF, etc. — must work. But in many cases, the end user needs the ability to access deep menu structures, so the test disc needs to incorporate a full range of menus and features. Don’t forget to make sure it also has closed captions encoded.
So, your test media should be something you know very well, which likely is not some “B” movie that should have been tossed years ago. Ideally, you’ll have the opportunity to really appreciate it on the highest-performing AV systems as well as on some not-so-hot ones. Speaking of high-performing AV, hopefully you’ll be making it to Infocomm. If you do, and also make it to my seminar, be sure to introduce yourself as a fellow AV Clubber!
The views expressed in this column do not necessarily reflect the views of the authors’ employer(s), past or present.
Greg Bronson, CTS-D, applies AV technologies in the development of innovative learning spaces for higher education. Greg spent the first 10 years of his career as an AV technician and service manager, and the past 12+ years as an AV system designer and project manager. Bronson currently works for Cornell University and has also worked for two SUNY (State University of New York) campuses, as well as a regional secondary education service depot. Bronson is the originator of the concept for Infocomm’s Dashboard for Controls and has had completed projects featured in industry publications. You can reach Greg at email@example.com