I don’t often hear the words “Natural Resources Defense Council” and “Ultra HDTV” in the same sentence. But the NRDC just released a report stating that power consumption in Ultra HD (4K) TVs is about 30% higher than in same-size 1080p TV sets.
The NRDC report goes on to say that one-third of all new TV purchases are for screens 50 inches and larger, which should come as no surprise given how dramatically TV prices have dropped in the past three years. And the NRDC calculates that, if Americans were to replace their older 1080p sets with Ultra HDTVs in screen sizes of 36 inches and up (who has a 36-inch TV??), the additional power consumption could amount to $1 billion annually, equivalent to three times the annual residential power consumption of San Francisco. (I love it when press releases come up with offbeat statistics like that one.)
The substance of this argument should come as no surprise to anyone: An Ultra HDTV has four times as many pixels as a 1080p set (which is why it was originally called Quad HD). So it stands to reason that an Ultra HDTV would use more power, although a 30% increase seems on the low side.
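The “four times as many pixels” figure is easy to verify with plain arithmetic — here’s a quick sketch (the resolutions are the standard Full HD and UHD figures, not numbers from the report):

```python
# Pixel-count arithmetic behind the "Quad HD" name.
HD_1080P = (1920, 1080)   # Full HD resolution
UHD_4K   = (3840, 2160)   # Ultra HD ("4K") resolution

hd_pixels  = HD_1080P[0] * HD_1080P[1]    # 2,073,600 pixels
uhd_pixels = UHD_4K[0] * UHD_4K[1]        # 8,294,400 pixels

print(uhd_pixels / hd_pixels)  # → 4.0
```

Four times the pixels doesn’t automatically mean four times the power, of course — the backlight, not the pixel array, dominates an LCD TV’s draw — which is why a 30% bump is plausible rather than a 300% one.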
The NRDC apparently tested a few models of TVs and found “…there were dramatic differences in the power consumption among UHD models of the same size, indicating the technology already exists to make energy-saving improvements to the most inefficient UHD televisions.”
The report went on to state that “Consumers can cut several hundred dollars off the lifetime energy costs of a new UHD TV by a) buying models with the ENERGY STAR® label, b) ensuring Automatic Brightness Control is enabled, and c) avoiding the quick start feature on Internet-connected televisions that results in significant amounts of wasted standby power.”
Most LCD TVs use amorphous silicon (a-Si) or low-temperature polysilicon (LTPS) thin-film transistors to switch the pixels on and off. These technologies have been around for some time, and they have their disadvantages – high leakage current is one. But the yields are good and predictable.
A solution to the power consumption issue is waiting in the wings. Oxide TFTs, or more accurately, indium gallium zinc oxide TFTs, look like the logical replacement for a-Si and LTPS. IGZO, in development for over 30 years and first commercialized by Sharp, promises low leakage current, smaller transistors (so more light passes through each pixel), faster on/off switching times, and lower power consumption.
That last attribute alone makes IGZO attractive as we move into the worlds of 4K / UHDTV, 5K, 6K, and even 8K displays and TVs. My guess is that most of that 30% power consumption increase would be rolled back by moving to IGZO TFT arrays.
The catch is cost – IGZO is expensive to implement in a consumer television that might sell for all of $700 – $800. Right now, Sharp is implementing IGZO in their line of Ultra HD desktop monitors, small multipurpose displays, and possibly their large (104” and 120”) Ultra HD commercial monitors.
They’re not alone. I was told by LG Display at CES a couple years ago that their OLED TVs also used IGZO TFTs to switch pixels. Given the price of those sets, the added cost of IGZO isn’t as much of a problem: LG’s 55-inch Ultra HD OLED TV currently retails for about $3,000.
The NRDC report didn’t state which models they tested. Were these conventional edge-lit, or full array LED models? Were any quantum dot Ultra HDTVs tested? How about Samsung’s S-series UHDTVs? What picture mode was tested – Dynamic? Standard? Movie/Cinema?
As for the NRDC’s recommendations: all TVs come with Energy Star ratings, so it shouldn’t be difficult to figure out which models are the best penny-pinchers. However, turning on ambient light sensors to dim the screen depending on room light does funny things to picture quality, and I wouldn’t recommend it.
Instead, simply set the contrast to about 80, brightness to 40 – 50, and color temperature to “warm.” (This presumes you’ve already taken the TV out of “dynamic” mode.) If you can adjust the backlight levels, dial them back to about 60 – 70 and see if they are bright enough for everyday viewing. (Turn OFF all of the other image/picture enhancements found in the basic and advanced picture menus, too.)
Granted, turning off the quick start feature will reduce power consumption when you’re not watching. I don’t mind waiting a few seconds for my 46-inch 1080p set to turn on, but it only uses 160 watts to begin with.
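To put that 160 watts in perspective, here’s a back-of-the-envelope cost sketch. The five hours of daily viewing and the $0.13/kWh electric rate are my assumptions for illustration, not figures from the NRDC report:

```python
def annual_energy_cost(watts, hours_per_day=5, rate_per_kwh=0.13):
    """Rough yearly electricity cost for a TV (assumed usage and rate)."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * rate_per_kwh

hd_cost  = annual_energy_cost(160)        # a 160 W 1080p set
uhd_cost = annual_energy_cost(160 * 1.3)  # hypothetical UHD set drawing 30% more

print(round(hd_cost, 2), round(uhd_cost, 2), round(uhd_cost - hd_cost, 2))
```

Under those assumptions the 30% premium works out to roughly $11 a year — real money over a set’s lifetime, but not a reason to panic at the store.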
Here’s one last recommendation: Don’t buy a new TV on Black Friday, or even before Christmas. Wait instead until the two – three week period before Super Bowl 50 (Sunday, February 7, 2016) to make your purchase, and you should score a great deal on a new Ultra HDTV. (And don’t forget to check the Energy Star tag!)
Posted by Pete Putman, November 17, 2015 3:41 PM
About Pete Putman
Peter Putman is the president of ROAM Consulting L.L.C. His company provides training, marketing communications, and product testing/development services to manufacturers, dealers, and end-users of displays, display interfaces, and related products.
Pete edits and publishes HDTVexpert.com, a Web blog focused on digital TV, HDTV, and display technologies. He is also a columnist for Pro AV magazine, the leading trade publication for commercial AV systems integrators.