LAVNCH WEEK ProAV Tech Talk: “Using AV to Trigger Biology” with Mark Coxon — The Blog Version

Mark Coxon keynote

In the greater context of AV systems and design, Mark Coxon asked in his Tech Talk presentation for LAVNCH WEEK today, “Why do we bother with all this stuff?” There’s no ONE answer to using audio and video to tell a good story. So how do we do it?

Turns out a lot of the answers are in understanding the way our bodies are wired.

Mark Coxon’s session for ProAV Day of LAVNCH WEEK was called “Triggered! Using AV to Trigger Biology,” and I have to say, this was one of my favorite sessions of the day. I’m a visual person, a colors person, and an emotions-over-everything person. So Coxon’s talk, structured around the psychology of our brains and, thus, how we think about and react to AV as it relates to our biology, felt truly unique to me.

In case you missed it live, here’s the blog recap (or, perhaps, a highlight reel!) of it in all its colorful beauty.

How Do Humans Learn?

First, to understand how to use AV to trigger biology, we need to know a little bit about the human brain.

Coxon helps us understand this through the three most common ways people learn: visually (through their eyes), aurally (through their ears) and kinesthetically (through touch). Yes, most people are primarily visual learners. But that doesn’t mean everyone interprets everything the same way. For instance, if you could see inside the brains of our audience, the word “mountain” would produce something like a Google image search of all the different ways people envision mountains.


“A picture’s worth 1,000 words, but a word is also worth 1,000 pictures,” Coxon says to explain his point.

On the other hand, if you want to show someone something very focused — or you’re trying to get them to experience something very internal — be specific but mindful of the different ways people connect to images.

Memory Matters

Maya Angelou said, “At the end of the day people won’t remember what you said or did, they will remember how you made them feel.”

So, next, Coxon helps us understand the two main types of memory: explicit memory (knowing “what happened”) and implicit memory (our interpretation of “how it felt”).

Our memories are very closely connected to the states we were in when those memories formed. So, crafting your content in a way that triggers those states can help with retention and connection, Coxon explains, which is quite important when we’re building both AV systems and content.

Coxon adds that the way we typically remember things is through, one, the peak (the emotional high or low) and, two, the way the experience ended: two moments that create the “lasting memories” we talk about and strive for.

Head, Heart, Gut, Groin

We then learn about the concept of being “anchored.” Coxon explains that we’re all anchored in certain ways to different things (songs, movies, what have you). The takeaway is: We can use the way we present things to create emotional anchors. If we can recreate these states later, almost as “emotional codecs,” Coxon calls them, we can get across our message — because we’ve now created an emotional relationship beyond the transaction.

In other words, our memories are affected by emotion. Coxon continues: We can approach our audience in four different places: Head, Heart, Gut and Groin.

In AV, we are notorious for going for point #1 (spitting out AV specs, showing off technical prowess) and we’ve had a bit of a reputation for #4 (oversexualizing digital-signage content, hiring what the industry sadly called “booth babes,” which is slowly but surely improving). It’s great that we can talk logic, but we also need to connect points #2 (“pulling at those heartstrings”) and #3 (the “gut punch”). Imagine how effective and compelling our content and messages would be if we could balance all four of those things.

Furthermore, if you’re in a state of happiness and well-being (as opposed to one of fear), it becomes easier to remember information clearly. Yes, there’s still a delta, but the delta is smaller when you’re happy than when you’re afraid, Coxon explains.

Using AV to Activate Brain States

What sticks out to the average viewer in Times Square? What makes us stop and watch something someone else wants to tell us when we have complete control over our own thinking and narratives? Likely nothing.

Use this as inspiration for how to create outcomes in our AV systems. Here are some of Coxon’s explorations on how to use AV to activate brain states:

  • Peripheral vision. Peripheral vision could actually engage more of the brain. (Thus, there’s a higher chance of us absorbing information.) An application for this in AV is being mindful of screen size and the content on that screen — don’t overwhelm the viewer with anything jarring, but activate the peripherals through something like a wide-scale video wall or augmented/virtual reality.
  • Spatial audio. Audio is fun to play with — and spatial audio engages the “fight or flight” response in our brain. Consider that audio, in partnership with your video, can maximize impact. Coxon says, “The more activity centers we engage, the more people remember those experiences.” An application for this in AV is knowing a bit about the technicalities of sound — bass creates a different experience than treble, for instance.
  • Biophilia. People love nature. Any time we can incorporate content that draws upon nature, the brain goes into a more activated state. In AV, if we can recreate this digitally, that’s the goal.
  • Color and mood. Color and mood are intrinsically linked. Yellow is bright and sunny; red is jarring and alarming. Use colors and moods that create energy and calm. The color choices you make directly impact people’s interpretations and feelings around your content.
  • Music and score. Use music or scores to harness emotion. But be careful with music too, as you always run the risk of alienating someone with your musical choices. As Ella Fitzgerald sang, “But how strange the change from major to minor.”
  • Contrast, color, clarity. Think about these three things, in that order. Our eyes are first drawn to contrast. Then, we know that color accuracy is critical; the closer we can get to recreating a real-life experience, the better. In AV, this could be your accuracy in direct-view LEDs or LED-backlit LCD displays. Lastly, clarity makes a huge difference, particularly for high-detailed imagery like 4K and 8K.
  • Eye contact. We’re social animals, and responding to eye contact is wired into the backs of our brains. Eye contact is important to consider in face-to-face events, in group virtual reality, in videoconferencing and more.


I hope all you #AVtweeps found this tech talk as interesting as I did, even if you were only able to read the blog version and didn’t catch it live. To learn more about Mark, feel free to reach out to him on LinkedIn.

Join us for LAVNCH 2.0!

If you’ve enjoyed LAVNCH WEEK so far and are interested in attending LAVNCH 2.0 the week of June 22 (it’s free for attendees!), go ahead and join the list here; spots will fill up fast. Also, you can check out our LAVNCH WEEK microsite here to see all the articles (like this one) and public videos from the week and more.