Down the Rabbit Hole – My Return to the CRT Television

I’ve been watching the DF Direct weekly podcast on YouTube for almost exactly two years.  DF is short for Digital Foundry, and they are a (mostly) European gaming press outlet with a mission statement of covering “the latest and greatest in gaming technology–past, present, and future.”  I might be paraphrasing a bit.  Their weekly “Direct” show covers gaming news, but of the three hosts who are typically there, two keep harping on the same points and only one has something original to say each week.  One of those who says the same thing again and again is John Linneman, and he’s been arguing for two years that a CRT (tube) TV has better “motion fluidity” than a flat screen.

If you know how TVs have evolved over the years, skip this paragraph.  There is a lot to unpack in the differences between tube TVs and flat screens.  First, you need to understand “hertz” (Hz) and “frames per second” (FPS).  North American and Japanese tube TVs run at 60Hz, because the alternating current that comes out of the wall in those regions is 60Hz.  Actually, in Japan, the power grid is half 60Hz and half 50Hz, but TV broadcasts had to stick with one standard … so in the 50Hz half of the country, the TVs had internal converters that kept the video at 60Hz.  Europe is entirely a 50Hz electricity region, so European tube TVs are 50Hz.  What does this mean?  In North America and Japan, TVs can display a theoretical maximum of 60 FPS, while in Europe, the TVs could run a maximum of 50 FPS.  It’s more complicated still, because in an effort to bring down the amount of antenna and cable bandwidth needed to broadcast shows, the TVs ran “interlaced” content: if you divide the television’s picture into rows, the odd rows (1, 3, 5, etc.) update during one pass (called a “field”), the even rows update during the next field, and they keep alternating like that.  This wasn’t exactly 30 FPS content (or 25), but it was less fluid than straight 60 FPS or 50 FPS.  Much later (at least relative to the invention of the television) we got something called “progressive scan,” which differs from interlacing in that every row of the picture updates on each pass.  When someone says a flat screen is 1080p or 1440p, the “p” stands for “progressive scan.”
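The interlacing scheme above can be sketched in a few lines of Python (a toy illustration of which rows refresh when, not how TV hardware actually works):

```python
# Toy sketch of interlaced vs. progressive scanning.
# Each "field" of an interlaced signal refreshes only half the rows.

ROWS = 6  # a tiny 6-row "screen" for illustration

def interlaced_fields(num_fields):
    """Yield the row numbers refreshed by each successive field."""
    for field in range(num_fields):
        if field % 2 == 0:
            yield list(range(1, ROWS + 1, 2))  # odd rows: 1, 3, 5
        else:
            yield list(range(2, ROWS + 1, 2))  # even rows: 2, 4, 6

def progressive_frame():
    """A progressive frame refreshes every row in one pass."""
    return list(range(1, ROWS + 1))

for i, rows in enumerate(interlaced_fields(2)):
    print(f"field {i}: rows {rows}")   # field 0: odd rows, field 1: even rows
print(f"progressive: rows {progressive_frame()}")
```

Two consecutive fields together cover the whole picture, which is why interlaced 60Hz video is often (loosely) described as 30 full frames per second.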

Those little converter boxes in Japanese TVs that made televisions in 50Hz regions run at 60Hz were the way of the future. With modern “displays,” we are no longer limited by what the wall can output.  We can still do 50 and 60Hz, but we can also do 120Hz, 144Hz, 165Hz, and beyond.  The latest 4K displays that have yet to hit the market can output 480Hz(!).  So isn’t it better to have a flat-screen TV that occupies less space in your house and weighs less than a tube TV if it can display the same content at the same or a better refresh rate (again, Hz)?  According to John Linneman, the answer is no.  He claims that, due to technological differences that I don’t fully understand (but that I can see), the animation on a flat-screen display is more smudgy and blurry than the animation on a tube TV, even if the flat screen is displaying content at double the framerate.  According to him, a flat screen would need to run content at 1000 FPS to achieve the clarity of motion a 60Hz tube television can manage.  Good God almighty.  I got so sick of hearing Mr. Linneman whine about CRTs being superior each week that I found one on Facebook Marketplace for a low price, and drove across the state border to pick it up.  Here it is: [photo of the CRT].  The reason it’s hard to take a good picture of a CRT TV is that the content is interlaced.  The more you know™
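There is a common back-of-the-envelope way to reason about that 1000 FPS claim (this is generic sample-and-hold blur math, not anything specific to DF): a flat screen holds each frame lit until the next one arrives, so a moving object smears by roughly its speed times the frame time, while a CRT flashes each frame briefly and the smear is tied to the much shorter phosphor-glow time.  The speed and persistence numbers below are illustrative assumptions, not measurements:

```python
# Rough sample-and-hold motion blur estimate:
# blur (in pixels) ~ scroll speed (pixels/sec) * time each frame stays lit (sec).
# The speed and the ~1 ms CRT persistence below are assumed values for illustration.

def blur_px(speed_px_per_sec, persistence_sec):
    return speed_px_per_sec * persistence_sec

speed = 960  # pixels per second, e.g. a fast-scrolling background

# Sample-and-hold (flat screen): each frame stays lit for the whole frame time.
for fps in (60, 120, 1000):
    print(f"{fps:4d} FPS hold-type: ~{blur_px(speed, 1 / fps):.1f} px of smear")

# CRT: the phosphor glow is brief (assume ~1 ms), regardless of the 60 Hz refresh.
print(f"  60 Hz CRT (~1 ms): ~{blur_px(speed, 0.001):.1f} px of smear")
```

Under these assumptions a 60 FPS flat screen smears ~16 pixels where the CRT smears ~1, and only at around 1000 FPS does the hold-type display catch up, which is at least consistent with the number Linneman quotes.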

As the picture shows, I also picked up a Nintendo Wii for cheap (not from the same person) so I could soft-mod it and play a bunch of retro games.  I also own a flat-screen monitor that can handle 165Hz maximum at 1440p resolution.  I have decided I like the look of a flat screen at 90 frames-per-second and above.  As the elder and wiser of the DF Direct hosts likes to say, anything above 90 FPS on a flat screen is fairly “academic.”  In other words, it becomes good enough, and in the 90-120 FPS range, 90 FPS looks about as good as 120 FPS, and that is true for all the FPS values in-between. It only becomes substantially better with monitors capable of very high refresh rates, such as 240 Hz or 360 Hz.

So what does that mean about the tube TV?  It’s very good, but for gamers in the new millennium, things have improved over the years in ways that have nothing to do with display technology.  I noticed the input lag on the Wii was terrible.  In New Super Mario Bros. Wii, I had to press the jump button well in advance of when the jump actually happened.  The retro games fared better, with the least lag, but I am a PC gamer, and the PC has the least input lag overall.  Fun fact ~ the more frames-per-second you are getting, the less input lag there is.  If you’re playing a game at 120 FPS, new frames arrive twice as often as at 60 FPS, so on average you see the cue to press the button sooner (we’re talking fractions of a second).  Also, your button press shows up on screen up to twice as soon at 120 FPS as at 60 FPS.  Additionally, tools like Nvidia Reflex cut latency on the software side, reducing the time between when the game samples your button press and when the frame reflecting it gets rendered.  I guess you could run Nvidia Reflex on an old CRT monitor… like a crazy person.
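The frame-rate half of that is just arithmetic.  A simplified model (which ignores the render pipeline, display processing, and controller polling) is that on average you wait half a frame to see a cue, and up to a full frame for your press to appear in a rendered frame:

```python
# Average input-to-screen delay contributed by the frame rate alone.
# Simplification: ignores render pipelines, display processing,
# and controller polling; this isolates just the frame-time term.

def frame_time_ms(fps):
    return 1000 / fps

for fps in (60, 120):
    ft = frame_time_ms(fps)
    print(f"{fps} FPS: frame time {ft:.1f} ms, "
          f"avg wait to see a cue ~{ft / 2:.1f} ms, "
          f"worst-case wait for your press to render ~{ft:.1f} ms")
```

Doubling the frame rate halves every one of those numbers, which is why high-FPS PC gaming feels so responsive even on an ordinary monitor.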

And that’s part of the problem.  Tube TVs never, to the best of my knowledge, went above 60Hz, aside from very expensive models aimed at skilled professionals, not consumers.  Flat-screen monitors are going into the stratosphere with refresh rates.  There’s also another innovation tube TVs never got: variable refresh rate, or VRR, which is what lets a game run anywhere between, say, 90 and 120 FPS without screen judder.  The idea is that the graphics card tells a VRR-capable display how many frames it can deliver that second, and the display refreshes exactly that many times.  Rinse and repeat for every second the game is running.  When there is a lot of action on screen, the graphics card may manage only 90 frames-per-second.  When the load is light, it may manage 120.

VRR also tames “frame time spikes.”  If a game targets 60 FPS and dips below that target on a non-VRR display, which includes every tube TV out there, the video stutters: with a fixed refresh, any frame that misses the 60Hz deadline gets held on screen for an extra refresh, so the game momentarily behaves like 30 FPS.  This is a bit jarring.  Once you see it, you can’t unsee it.  If the display is capable of VRR, there is no judder, and you barely notice the drop.
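Here’s the arithmetic behind that, using a simplified model of a fixed-refresh display with vsync (a finished frame must wait for the next refresh tick, while a VRR display refreshes whenever the frame is ready):

```python
# Simplified model: on a fixed 60 Hz display with vsync, a frame that takes
# longer than ~16.7 ms misses the refresh tick and is held until the next one,
# so it sits on screen for ~33.3 ms (a momentary 30 FPS). With VRR, the display
# simply waits for the frame, so a 20 ms frame occupies 20 ms.

import math

REFRESH_MS = 1000 / 60  # one 60 Hz refresh interval, ~16.7 ms

def displayed_time_fixed(render_ms):
    """Time the frame occupies on a fixed-refresh, vsync'd display."""
    ticks = math.ceil(render_ms / REFRESH_MS)  # round up to the next tick
    return ticks * REFRESH_MS

def displayed_time_vrr(render_ms):
    """With VRR (inside its supported range), display time = render time."""
    return render_ms

for render in (15.0, 20.0):
    print(f"render {render} ms -> fixed 60 Hz: {displayed_time_fixed(render):.1f} ms, "
          f"VRR: {displayed_time_vrr(render):.1f} ms")
```

The 20 ms frame is only 20% late, but the fixed-refresh display punishes it with a full doubled frame time; that cliff is exactly the judder VRR removes.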

All that said, the tube TV is nice.  It didn’t change my world as much as I thought it would.  I apologize if I wrote this post at a third-grade reading level.  I used to go to a local writers’ group in my neighborhood, and everyone was a bit older than me.  I would read a game review out loud each week, and one of the members would say, “I just have no idea what you’re talking about.”  I’m trying to explain to the layperson what I’ve learned from watching two years of videos largely about modern screen technology.  A game review is coming.  I need to finish the game first.
