
The Perception Problem: 4 Reasons Digital Environments May Never Be Perceived as Real

My role as an Experience Design Professional often intersects with my passions for biology, neuroscience and human behavior, and for good reason. To truly create a transportive experience, you need to immerse someone completely in a new world, making them forget that they’re in an alternate reality.

I’ve often reflected on the challenge of making a digital environment indistinguishable from a physical one, thinking about everything from visual inputs and spatial audio to natural user interfaces with haptic feedback.

I have a list as long as my arm (and I’m 6’5″, so my arm is long) of challenges we would need to solve in order to create an alternate reality so compelling the brain doesn’t know the difference. For the purposes of this blog, however, I want to focus on four challenges pertaining to visual inputs that I believe are major hurdles in closing the digital-reality gap.


Source of Light

In the digital world, display technologies deliver light to our eyes in three main ways. In front projection systems, the light we see is reflected light, bouncing off a surface and returning to our eyes. In rear projection systems and LCD displays, the light we see is transmissive light, passing through a substrate before it reaches us. In OLED and DVLED displays, the light we see is emissive light, meaning we are viewing the light source directly.

In the real world, however, most of the light we see is reflected light. When we look at objects, they are not emitting or transmitting light; they are reflecting natural or artificial light sources. Of course, if we look at a lamp or a stoplight, the lighting element we see is emissive, but the surrounding body of the fixture is not. There are very few natural sources of emitted light beyond things like fire, lightning and bioluminescent algae and fish, which, to be honest, feel a little supernatural when you see them.

All of this raises the question of whether the brain, which has learned to interpret the world through visual stimuli based on reflected light, responds on some level differently to emitted light when judging an image as real or artificial. That could make projection the strongest case for acceptance, since it reaches us as reflected light as well, but contrast limitations and ambient light interference can reduce the saturation and vividness of projected images, diminishing their impact.

Pixel Structure

I was geeking out on a SMPTE chat years ago where an engineer quipped, “Somewhere, in the back of the human brain, there is a perception of pixels as different than the natural world.”

It may seem like a nuance of speech, but I truly believe there is a difference between visual acuity and perception. We can reduce pixel structure beyond the limit of visual acuity (the ability to pick out individual pixels), but the structure is still there and may be perceived subconsciously.
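To put a rough number on that acuity limit, here is a quick sketch in Python. It assumes the commonly cited one-arcminute resolution limit of the eye, and the 1.5 mm pitch and viewing distances are purely illustrative values, not figures from any particular display.

import math

# Commonly cited angular resolution limit of the human eye: ~1 arcminute.
ACUITY_ARCMIN = 1.0

def min_resolvable_pitch_mm(viewing_distance_m, acuity_arcmin=ACUITY_ARCMIN):
    """Smallest pixel pitch (in mm) the eye can still resolve at a given distance."""
    theta_rad = math.radians(acuity_arcmin / 60.0)
    return math.tan(theta_rad) * viewing_distance_m * 1000.0

# Illustrative example: a 1.5 mm pitch DVLED wall viewed from 3 m and from 8 m.
for distance_m in (3.0, 8.0):
    limit_mm = min_resolvable_pitch_mm(distance_m)
    verdict = "resolvable" if 1.5 > limit_mm else "below the acuity limit"
    print(f"At {distance_m:.0f} m, pitches under {limit_mm:.2f} mm blur together; 1.5 mm is {verdict}")

Even past the distance where the math says the grid should disappear, the structure is still physically there, which is exactly the point.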

There has long been an argument that film looked more natural in part because of the organic nature of film grain and its variations in grain size and spacing.

Taking this cue, some OLED technologies are even trying to escape the “row and column” pixel structure by arranging dots in more triangular patterns, with different-sized subpixels for each color, allowing pixels to “recruit” elements from neighboring pixel groups and blurring the lines between discrete, grid-like pixels. It’s no surprise that VR headsets were some of the first devices to leverage these new OLED technologies, given the “reality” portion of their moniker.

Long and short, humans are very good at pattern recognition, and the repeated symmetry of pixels may be an ongoing challenge in bridging the digital-reality gap.

Refresh Rates

Electrical light sources have refresh rates. They are an unfortunate reality based on the frequency of the electrical drive itself. In high-end LED displays, these refresh rates can be 3840 or 7680 Hz, which is extremely high and well beyond the eye’s ability to pick out individual cycles.

The sun, however, has no refresh rate to speak of; sunlight is continuous, so in the natural world our brains aren’t subjected to cycle noise in the same way. Does that matter to the perception of digital images as real? I don’t think it’s something easily dismissed.
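For a sense of scale, here is a quick back-of-envelope comparison in Python. The refresh rates are the ones mentioned above; the 90 Hz figure is a commonly cited approximation for the upper end of conscious flicker detection, not a hard number.

# Compare LED refresh (PWM) rates to an approximate flicker-fusion threshold.
pwm_rates_hz = [3840, 7680]
flicker_fusion_hz = 90  # rough upper bound for consciously detectable flicker

for rate_hz in pwm_rates_hz:
    period_ms = 1000.0 / rate_hz
    margin = rate_hz / flicker_fusion_hz
    print(f"{rate_hz} Hz -> {period_ms:.3f} ms per cycle, roughly {margin:.0f}x above conscious flicker detection")

Conscious detection is the key phrase, though; the cycling is still there, and the open question is whether the visual system registers it somewhere below awareness.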

Parallax

In the natural world, objects are perceived in 3D space and in relation to each other, and those spatial relationships change based on the position of the person viewing them. That, in short, is parallax.

Parallax is different from stereopsis, the sense of depth that comes from each eye seeing a slightly different image, and that difference is why traditional “3D” technologies have never hit the mark.

Imagine looking through an open window. As you approach the window, the objects in the foreground get larger, and you start to see an expanded view of the landscape in the background. If you walk left, the foreground objects change in spatial relationship to the background objects, creating a wholly new view.

3D technologies create depth through stereopsis, but they don’t create parallax. Moving to the left of the screen won’t let you see “around” the car in the foreground.
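Here is a small sketch of that difference, with invented distances: a foreground object a few meters away swings through a much larger angle than the distant background when the viewer steps sideways, and that differential shift is the cue a flat stereoscopic image cannot reproduce.

import math

def angular_shift_deg(object_distance_m, lateral_move_m):
    """Change in viewing direction (degrees) after the viewer steps sideways."""
    return math.degrees(math.atan2(lateral_move_m, object_distance_m))

step_m = 0.5  # viewer steps half a meter to the left (illustrative value)
for label, distance_m in (("foreground car", 3.0), ("background hills", 300.0)):
    shift = angular_shift_deg(distance_m, step_m)
    print(f"{label} at {distance_m:.0f} m shifts ~{shift:.2f} degrees")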

Head- or motion-tracking technologies can create parallax by manipulating the digital model to reflect the position of the viewer, but even high frame rate (HFR) applications at 240 fps are limited to about three viewers. Light-field displays and multi-directional pixels show promise for creating parallax in communal environments without tracking each individual, but they have challenges of their own, not the least of which are the data rates needed to support hundreds of simultaneous views, along with limitations in field of view and issues with changes along the Z-axis.
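To see why data rate is such a constraint, here is a back-of-envelope sketch. None of these numbers come from a specific product; 200 views, 4K per view, 8-bit RGB and 60 fps are simply plausible round figures.

# Uncompressed data rate for a light-field display serving many views at once.
views = 200                 # simultaneous directional views (assumed)
width, height = 3840, 2160  # 4K resolution per view (assumed)
bits_per_pixel = 24         # 8-bit RGB
fps = 60

bits_per_second = views * width * height * bits_per_pixel * fps
print(f"~{bits_per_second / 1e12:.1f} Tbps uncompressed, about {bits_per_second / 8 / 1e12:.2f} TB every second")

Even before compression, that is roughly fifty times the 48 Gbps capacity of a single HDMI 2.1 link.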

I believe parallax is one of the most important features of any transportive visual technology, and it is one of the most difficult challenges to navigate.

There is a lot of focus in the display industry on resolution and visual acuity, which are by no means unimportant in creating visually stunning experiences. I’ve long argued that contrast and color space are of equal if not greater importance than resolution, and we are making great gains in those arenas as well. However, as analog beings interpreting our world, faced with a mixture of digital and analog inputs, will our perception of the digital, and our acceptance of it as real, ever equal the way we perceive the natural world we’ve evolved to navigate over millennia?

What does it mean for us to see something as “real?”
