Beware: 4K Capable and 4K Native Resolution Aren’t the Same Thing

Among the plethora of deceptive (intentional or not) 4K specs being thrown into the market, there is one that should be obvious to all of you out there in AV-land, but may not be — so here you go.

There are all sorts of 4K resolution displays appearing in the marketplace, finally! After a year or so of hype, they’re shipping. However, not all 4Ks are the same. For example, 4K signals carry far more complex signal (and image) characteristics than just resolution (most being 3840×2160). In addition to the resolution of the signal, there are important spec considerations like chroma sampling rate (this ultimately determines the color quality of the displayed image), color bit depth (this ultimately determines the color resolution and detail) and frame rate (this determines how well a display handles motion images).

For this blog, I will focus only on the important factors of resolution you need to be concerned with. Over the next few months, I’ll add blogs on chroma sampling, color bit depth and frame rate.

And, now’s a perfect time to understand this as all the 4K gear and displays start shipping this fall.

Now for a resolution lesson:

There are two resolution standards for 4K. (And, before you get upset that there isn’t just one, remember that the old-timey (currently used) standard for HD actually has four resolutions, so having two for 4K is 50 percent fewer than what we had with HD. So insert smiley-face here!)

[Image: 4K Ultra HD logos]

  • 3840×2160 is what 90 percent of 4K displays will be, natively. This is the resolution selected by SMPTE.
  • 4096×2160 is what the DCI (Digital Cinema Initiatives) decided 4K resolution would be in movie theaters. So, displays using this resolution likely use the same imaging chips found in movie theater projectors.

No, the human eye can’t perceive a difference between the two resolutions unless they’re side-by-side. The real difference here is in aspect ratio: DCI 4K is 1.9:1 and SMPTE 4K is 16:9 (exactly the same as all four of our existing HD resolutions). Cue light bulb going off in head, here!
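The aspect-ratio difference is simple arithmetic. Here’s a quick sketch (plain Python, using only the resolutions already mentioned above) that makes the light-bulb moment concrete:

```python
# Aspect ratios of the two 4K standards vs. existing HD (simple division).

def aspect_ratio(width, height):
    """Aspect ratio as a decimal, e.g. 1.78 for 16:9."""
    return round(width / height, 2)

print(aspect_ratio(3840, 2160))  # SMPTE 4K -> 1.78 (i.e., 16:9)
print(aspect_ratio(4096, 2160))  # DCI 4K   -> 1.9
print(aspect_ratio(1920, 1080))  # HD 1080p -> 1.78, same shape as SMPTE 4K
```

In other words, SMPTE 4K is exactly the same shape as your current HD screen; DCI 4K is a slightly wider cinema frame.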

[Image: digital video resolutions compared, VCD to 4K]

So, if this is a clear-cut resolution issue, what’s the big deal?

Well, native 4K resolution displays and 4K “compatible” or “capable” (or even 4K “ready”) displays are very different things, and the difference can come about in one of two ways.

The first way: a display might accept 4K content, but downscale it before displaying it. For example, Samsung has an entire line of TVs it sells at places like Best Buy that are branded 4K, but actually convert all 4K resolution images down to HD. Yes, the image is still stunning, but it’s ultimately still 1080p. You have to read the fine print to see that their native resolution isn’t 4K after all and that they are converting the signal down from 4K to 1080p. One day, those buyers could be either pissed or disappointed.
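To put a number on what downscaling throws away, compare the pixel counts of the two resolutions (straight multiplication, nothing assumed beyond the standard figures):

```python
# Pixel counts: UHD 4K vs. full HD.
uhd_pixels = 3840 * 2160  # 8,294,400 pixels
hd_pixels = 1920 * 1080   # 2,073,600 pixels

print(uhd_pixels / hd_pixels)  # 4.0 -- downscaling discards 3 of every 4 pixels
```

A true 4K panel has exactly four times the pixels of a 1080p panel, which is why the fine print matters.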


In any case, in the ProAV market, there are already a plethora of projectors that are 2560×1600 resolution but are displaying 4K in a unique way (more on that in a moment). Let’s take a look at how those projectors list their resolution specs, and you tell me if this is potentially confusing or even deceptive:

[Image: projector spec sheet screenshot]

Note that the manufacturer shows the 4K resolution first and then, almost as an afterthought, says the actual native resolution is just 2560×1600.

Personally, I totally get it — but I’m an industry-insider. Will clients understand that they aren’t buying a native (actual) 4K resolution projector? And do integrators know this?

The second way displays can be 4K-but-not-really is by using something commonly referred to as wobulation technology.

Wobulation, sometimes also referred to as pixel shifting, works by overlapping pixels. The display generates multiple sub-frames of data while an optical image-shifting mechanism (e.g., the crystal in an LCD) displaces the projected image of each sub-frame by a fraction of a pixel (e.g., one-half or one-third). The sub-frames are then projected in rapid succession and appear to the human eye as if they were being projected simultaneously and superimposed.
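To make the sub-frame idea concrete, here is a toy one-dimensional sketch of my own (not any manufacturer’s algorithm): a panel with four physical pixels shows two sub-frames per frame, the second shifted by half a pixel, and the eye averages them into more distinct on-screen regions than the panel can natively address.

```python
# Toy 1-D pixel-shift demo. Real wobulation works in 2-D (often with a
# diagonal half-pixel shift), but the averaging principle is the same.

def render_subframe(values, shift_halves, upsample=2):
    """Paint panel pixel values onto a screen modeled at half-pixel
    precision, offset by `shift_halves` half-pixels."""
    screen = [0.0] * (len(values) * upsample + 1)
    for i, value in enumerate(values):
        start = i * upsample + shift_halves
        for pos in range(start, start + upsample):
            screen[pos] += value
    return screen

sub_a = [1.0, 0.0, 1.0, 0.0]  # first sub-frame (4 panel pixels)
sub_b = [0.0, 1.0, 0.0, 1.0]  # second sub-frame, shown half a pixel over

frame_a = render_subframe(sub_a, 0)
frame_b = render_subframe(sub_b, 1)

# The eye averages the two rapidly alternating sub-frames:
perceived = [(a + b) / 2 for a, b in zip(frame_a, frame_b)]
print(perceived)  # 9 distinct screen regions from only 4 physical pixels
```

The overlapping half-pixel offsets create finer intensity steps across the screen than the panel’s native grid alone could produce, which is the whole trick.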

[Image: wobulation diagram]

Image Courtesy of Popular Science Magazine

So, basically, you are actually seeing a 4K image displayed, but you’re seeing it via pixels from the LCD crystals or DLP mirrors of the imaging device being shifted and overlapped on screen.

Can the human eye detect the difference? It depends…

There are a plethora of factors that determine whether anyone can tell. For example, those of us in the industry may perceive 4K wobulation as a little bit softer than native 4K resolution imaging. But, on the flip side, wobulation-based imaging tends to hide the edges of the pixels, so you don’t see the squares when sitting close to the projected image. So the trade-off is perceived sharpness vs. visible pixel structure.

Wobulation-based imaging is also significantly less expensive than native 4K imaging. Currently, it’s in the range of 40-60 percent cheaper. So, when the same projection systems come out with the exact same specs, but using native 4K imagers, they will be more expensive.

If you read the specs carefully, you can tell that the display is using wobulation to display the 4K content (as opposed to downscaling) because it will use marketing phrases like “4K enhancement” or even brand the technology. JVC, one of the first manufacturers to use pixel shifting to sell “4K displays” at a lower price point, calls it “e-Shift.”

Thus, both will live in the market side-by-side and be sold according to applications and customer budget.

But the takeaway here is that there is a big difference between native 4K and the “other” 4K. It doesn’t matter which you choose, as long as you are making an educated decision about what you use.