Cathode-ray-tube TVs, with their familiar protruding backsides, may look bulky and primitive next to today's sleek flat-screen models, but they still have two big advantages: they're cheap, and they're relatively energy efficient. Plasma flat screens, for example, still cost $2,000 or more apiece, and they use up to five times as much electricity as CRTs. They're such energy hogs, in fact, that global adoption of plasma TVs could increase electrical demand noticeably, raising both the chances of blackouts and the volume of greenhouse gases emitted by power plants.
At the very least, plasma screens are sure to put additional stress on the electrical grid, which has lately approached its breaking point in states such as California. A typical plasma display consumes about 1,000 kilowatt-hours of power annually, compared with approximately 233 kilowatt-hours for an average CRT, according to a study by the United Kingdom's Department for Environment, Food and Rural Affairs. And in ten years, it is widely believed, half of all TVs will be flat-panel displays, although it's unclear whether plasma, liquid-crystal displays, or newer technologies will predominate.
If plasma wins, the consequences could be startling. If half of California's 12.7 million households replaced their CRTs with plasma displays, the state's electrical usage would grow by 7.6 billion kilowatt-hours annually, an increase of about 1.3 percent. Supply-and-demand projections from California's state energy commission show that during hot summers, the extra demand from these plasma screens alone could eat up much of the state's reserve generating capacity, increasing the likelihood of rolling blackouts. One solution might be for the state to ask citizens to turn off their televisions in hot weather, but such measures are unlikely to be well received in the homeland of the world's entertainment industry.