Another 4k “Fantasy” – Why Football Won’t Help UHD

I read this article today (Sunday) on one of the nation’s other AV sites, and its premise was so far off the mark that I couldn’t even wait until tomorrow to write this piece.

The main gist of the article is that Fantasy Football’s popularity has spurred a channel called NFL RedZone.  The channel shows multiple games tiled on the screen at once, with a larger tile used for games where one of the teams is about to score (they’re in the RedZone…get it?).  The article asserts that this type of content, with multiple tiles across the screen, makes a great new argument for 4k.  But does it?

First, I have to make the statement that what we are talking about is 2160p or UHD and not 4k.  I refer the author to my Definitive Guide to 4k.  Nuff said.

Second, I have to say that the article pretty much destroys its own premise halfway through the blog itself.  It states:

“Your television is a fixed resolution and size. If you want to be able to keep an eye on all the games simultaneously but no one is in the red zone, the size of the windows available prevent you from seeing much detail in what’s going on. In one experience with this channel in standard definition, which some people still use due to the increase in cost in the jump to HD, I’ve heard it compared to watching the 8-bit graphics of Tecmo Bowl.”

Let’s explore that for one minute.  Why does the SD signal look so bad on the existing 1080p display?  Because it is being up-scaled.  In this case the 480p signal (854 x 480) is being up-scaled to 1920 x 1080, and it, as the author says, looks pretty bad.  Any of us who have watched 480p content up-scaled on a 1080p display know this to be true.
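To put a number on how lossy that SD-to-HD jump is, here’s a quick back-of-the-napkin sketch (my own illustration, not from the article) using the resolutions mentioned above:

```python
# Why SD-to-HD upscaling looks rough: every SD pixel has to be
# stretched across a non-integer number of HD pixels, so the
# scaler is interpolating detail that was never in the signal.
sd = (854, 480)      # widescreen SD, as quoted above
hd = (1920, 1080)    # the 1080p panel's native resolution

scale_x = hd[0] / sd[0]   # roughly 2.25x horizontally
scale_y = hd[1] / sd[1]   # exactly 2.25x vertically

# Fraction of on-screen pixels that are interpolated, not real data
invented = 1 - (sd[0] * sd[1]) / (hd[0] * hd[1])
print(f"scale: {scale_x:.3f} x {scale_y:.2f}")
print(f"~{invented:.0%} of the displayed pixels are invented by the scaler")
```

Roughly 80% of what you see on the 1080p panel is interpolation, which is exactly why up-scaled SD looks like mush.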

In HD, if the satellite or cable box is set to 1080i, you avoid scaling the image; the set just deinterlaces the signal to show it on the 1080p panel.  This is MUCH cleaner, and any of us who watch HD at home know that the HD channels look exponentially better than their SD counterparts.

So the question is, what does having a 2160p display do to the RedZone?  The answer: it makes it worse, not better.

The 2160p display is still a fixed-pixel device.  The incoming signal is still 1080i or 720p.  So in the case of a UHD display, the content now has to be up-scaled and de-interlaced to be displayed.  You have just recreated the same quality problem we had with converting SD to HD.

(***Warning, Extreme Geekery Ahead*** The above doesn’t address the fact that RedZone is already downscaling each game from its original 1080i single-channel feed into a smaller window within the mosaic.  There is almost certainly an initial loss in quality in that process.  Then choosing to display it on a 2160p display adds more scaling on the back end.  Take a paper sack, wad it up in a ball, and then unfold it.  What does it look like?)
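Sticking with the geekery for a second, here’s a sketch of that double-scaling problem in numbers.  The tile sizes below are my assumption for illustration, not RedZone’s actual mosaic layout:

```python
# Hypothetical tile layout to illustrate the down-then-up scaling chain.
feed = (1920, 1080)         # RedZone's single broadcast feed
tile_in_feed = (960, 540)   # assumed quarter-screen game tile in that feed

# Step 1: the original 1080i game is downscaled into the tile,
# so only half the scan lines of real detail survive.
surviving_lines = tile_in_feed[1] / 1080   # 0.5

# Step 2: a 2160p set upscales the whole feed 2x, so the tile is drawn
# with 1920 x 1080 physical pixels -- but from only 960 x 540 of real data.
tile_on_uhd = (tile_in_feed[0] * 2, tile_in_feed[1] * 2)
real_pixels = tile_in_feed[0] * tile_in_feed[1]
drawn_pixels = tile_on_uhd[0] * tile_on_uhd[1]
print(f"tile drawn at {tile_on_uhd}, only {real_pixels / drawn_pixels:.0%} real data")
```

That’s the paper-sack effect: under these assumed numbers, only a quarter of the pixels drawn in each tile carry information from the original game feed.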

The article tries to use the concept of “multi-window,” a scenario many have deemed advantageous to UHD adoption, to argue that the RedZone content will look better.

The reason this is deceptive is that multi-window refers to taking, for example, four 1080p signals and displaying each of them pixel for pixel on a UHD display, either through a server with a 3840 x 2160 output, or by using 4 discrete 1080p inputs on the display itself.  This means you can essentially use an 84″ UHD display to simultaneously show four 1080p windows in full resolution (it’s like having four 42″ 1080p displays on the wall).  Of course this would look amazing, as the display isn’t doing any scaling in that scenario.
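The arithmetic behind true multi-window is worth spelling out, because it’s the whole reason the scenario works with zero scaling:

```python
# Four native 1080p feeds tile a UHD panel exactly, pixel for pixel.
uhd = (3840, 2160)
window = (1920, 1080)

cols = uhd[0] // window[0]   # 2 windows across
rows = uhd[1] // window[1]   # 2 windows down

assert cols * rows == 4                              # four full-res windows
assert (cols * window[0], rows * window[1]) == uhd   # no leftover pixels, no scaling
print(f"{cols * rows} native 1080p windows fit exactly on a {uhd} panel")
```

Every source pixel maps to exactly one panel pixel, which is the opposite of what happens when a single 1080i RedZone feed gets stretched across the same panel.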

Borrowing the term multi-window for the RedZone is a stretch; in fact, it’s a bigger stretch than the one the 2160p display is doing to the RedZone content.  It’s dangerous in that, like a Dan Brown novel, it takes something we know to be fact (UHD is maximized by multi-window content) and then uses it to make an argument that relies on none of the same logic.

If you are still up in the air on this, or if you are a display manufacturer reading this article, here is my challenge to you.

Set up an 84″ UHD display next to an 84″ 1080p display with the same backlight, brightness, etc.  Feed them both 1080i content out of a satellite box and let’s take a look.  My educated guess is that the picture on the UHD will not look as nice as the one on the 1080p set, though the pixels will indeed be smaller, of course.

In the meantime, as long as the RedZone is broadcast as a single 1080i channel, UHD will not give you any enhancement in the quality of each of those tiled games.

If you do buy a UHD set, buy one with the capability to show 4 different 1080p feeds tiled on the display and feed it 4 different games from 4 satellite or cable boxes.  That would look awesome and be a great pitch for needing an integrator as well as a control system for all those boxes.

But of course that’s not the argument the other article made, is it?