Survey Says! Surveys and InfoComm’s APEx

I wrote a blog a couple of weeks ago about Why I like InfoComm’s New APEx Program.  The blog drew attention to the fact that the new program will include client surveys that must be completed favorably each year for an integrator to remain in good standing in the program.  That is an idea I have promoted for some time, but also one that seems to be a bit controversial.  Why?

There are many big-ticket items that we buy as consumers that are strongly associated with customer satisfaction surveys.  We are surveyed on everything from our retail experiences (via a phone number on the register receipt promising a gift card or credit for our time) to purchases of new cars and homes.  I bet you can remember a salesman for some product telling you,

“A survey will be coming in the mail and if there is any reason that you can’t answer ‘completely satisfied’ to every question, please call me to resolve it before completing the survey.”

As a consumer I always welcome the chance to share my opinions in hopes that the overall experience of that product and service can be the best it can be.

However, it seems that when the roles are reversed, some in our industry are afraid of what the survey system may reveal.  One commenter on the previous blog mentioned the “negative bias” that is sometimes apparent in surveys and rating systems.  I think it is a point worth addressing.

The Buttes

This is where I used to wait tables in college.

I used to work for what was the Wyndham Buttes Resort in Tempe, AZ when I was in college (it’s now a Marriott).  As you may imagine, hotels are very focused on the customer experience and as a result rely heavily on the survey system.  I remember that part of our training was to try to proactively “uncover” problems that guests might be experiencing.  Many guests had a bad experience but never made the effort to complain to the hotel or its staff.  They instead left without speaking a word, their issues unresolved, while the hotel thought everything had gone just fine.  The problem was that these unhappy guests didn’t stay silent for long.  When they returned from their trip, everyone asked how their vacation went, and all the frustration spilled out in the story of what happened.

In fact, the Wyndham brass liked to remind us that every guest who leaves unhappy tells an average of 10 people about their experience, while the guests that have a good experience share it with only one person.  We needed to be at 90% guest satisfaction just to break even on the word of mouth!
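That break-even figure checks out with a bit of quick math (the 10-to-1 reach ratio is the Wyndham number; the rest below is my own illustrative sketch):

```python
def net_word_of_mouth(p, negative_reach=10, positive_reach=1):
    """Positive minus negative mentions per guest, given satisfaction rate p.

    Each happy guest tells `positive_reach` people; each unhappy guest
    tells `negative_reach` people.
    """
    return p * positive_reach - (1 - p) * negative_reach

# Mentions balance when p * 1 = (1 - p) * 10, i.e. p = 10/11 ~= 0.909,
# which is the "roughly 90% just to break even" rule of thumb.
break_even = 10 / 11
print(round(break_even, 3))               # 0.909
print(net_word_of_mouth(break_even) > 0)  # False: exactly at break-even, not above
```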

You see much of the same with online ratings of products and services.  A jilted customer will take time out to let everyone know he feels he was ripped off, while a satisfied customer rarely feels the same call to arms to promote how great the product was.

With all this said, I am not at all concerned that a ‘negative bias’ will affect the ratings integrators receive in APEx.

First, it seems that integrators will get to choose the clients that receive surveys and prepare them accordingly.  InfoComm is not requiring a survey from every client, nor can a client voluntarily submit a negative review unless they have been selected to receive a survey.  This alone will greatly reduce the chances of a ‘negative bias’ skewing the scores.

In fact, it seems a bigger concern may be a ‘positive bias’ resulting from the potential “cherry picking” of clients for surveys.  This assumes, however, that the integrator has a true picture of how satisfied the customer is.  That may not always be the reality, as consultant Leonard Suskin pointed out in the comments on the previous blog:

“I have seen firms use jobs as part of their portfolio, only to contact someone there and find that the job went so poorly that the client would never even want to look at that particular contractor again. I don’t know if this is cluelessness or deliberate sleight of hand.”

Some cherries may not be as ripe for the picking as the integrator believes.

Secondly, surveys can be written in several ways to elicit different responses and even influence the scores.

When I was in residential AV, I worked with two home builders that used the survey system to great effect.

Shea Homes used surveys to give their customers an ‘evangelical’ score: How likely were their members (this is what they called their homeowners) to proactively promote the builder to their family and friends?  This goes beyond satisfaction.  There are plenty of times we may find a product pleasing but would not stake our reputations on the claim that others would feel exactly the same.

David Weekley Homes used to tailor their surveys specifically to make high scores harder to achieve.  How did they do that?

They started by using a scale of 1-5 as opposed to 1-10.  Most people completing a survey won’t just give the highest mark all the way down the column, even if they are happy with the product or service.  Given that, a survey on a 1-10 scale will still net 90% when the next-highest rating is selected.  A survey on a 1-5 scale, however, will only net 80% if the survey taker follows the same pattern.  To get a 90% or better (something David Weekley is proud to achieve year after year) you have to get some “5”s, and those aren’t easy to get!
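The scale arithmetic is easy to verify; this is just an illustrative sketch of the percentage math, not David Weekley’s actual scoring method:

```python
def score_percent(rating, scale_max):
    """Express a survey rating as a percentage of the scale maximum."""
    return 100 * rating / scale_max

# A respondent who habitually picks the second-highest mark:
print(score_percent(9, 10))  # 90.0 on a 1-10 scale
print(score_percent(4, 5))   # 80.0 on a 1-5 scale
```

The shorter scale makes each step down cost twice as much, which is exactly why a 90%+ average on a 1-5 scale is the harder achievement.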

Next, they changed the language in the surveys themselves.  Instead of describing the top of the scale as “Completely Satisfied,” they described it as “Delighted.”  Think back on your last positive consumer experience: were you completely satisfied?  And if you were, would you describe yourself as “delighted”?  I guarantee a large number of people would say yes to the first but have a hard time saying yes to the second.

It was rumored that any member who gave a score of 3 on any section of the David Weekley Homes survey got a visit from the founder, David Weekley himself, to find out why.  Whether it is true or not I don’t know, but having worked with their team for the brief time I did, as well as with the team at Shea Homes, I was not surprised that they were the only two homebuilders to make the JD Power Top 50 Customer Service Brands.

So there is a science to writing surveys to get the type of answers you want, as well as to either minimize or engage emotions depending on your goals.  InfoComm relates that it has worked with an outside team to create a system that is fair and reflects accurately on integrators and their businesses.  According to InfoComm:

“Because both the ANSI/INFOCOMM Standard Guide for Audiovisual Systems Design and Coordination Processes and the AV System Performance Verification Standard were developed in an open, consensus-based process, approved by an independent outside body, and are aimed at providing proper communication and documentation between the AV industry and the client, using both on several projects is a requirement.”

If that still doesn’t ease your mind, remember who supports InfoComm to the tune of over $40MM per year.  Here’s a hint: it’s not your client 🙂  InfoComm is invested in the integrator, and as such I’m sure the survey program reflects that.  It is not designed to harm businesses but to help create a program that raises the bar for the industry as a whole, and that should be a good thing for all of us!

What are your thoughts?  Chime in below in the comments.