As I remember, with plasma it depends on what you are watching:
snow scenes - high energy;
underground darkness - low energy.
The figures manufacturers are obliged to publish are maximum figures, only applicable if all you ever watch is Ski Sunday.
With LCD the energy figure is constant; it cannot vary with picture content.
Yes, but the numbers I quoted are actual measurements, not manufacturers' specs.
So even an LCD that draws 87 W no matter what is on the screen will always use less energy than a plasma that draws 450+ W on the test picture.
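To put those two wattages in perspective, here is a back-of-envelope sketch of annual energy use. The viewing hours and electricity price are illustrative assumptions, not figures from the thread.

```python
# Back-of-envelope annual energy comparison for a constant-draw LCD (87 W)
# vs a plasma at its test-picture draw (450 W).
# HOURS_PER_DAY and PRICE_PER_KWH are assumptions for illustration only.

HOURS_PER_DAY = 5       # assumed daily viewing time
PRICE_PER_KWH = 0.15    # assumed electricity price per kWh

def annual_kwh(watts, hours_per_day=HOURS_PER_DAY):
    """Annual on-mode energy in kWh for a device drawing `watts`."""
    return watts * hours_per_day * 365 / 1000

lcd = annual_kwh(87)
plasma = annual_kwh(450)
print(f"LCD:    {lcd:.0f} kWh/yr (~{lcd * PRICE_PER_KWH:.0f} per year)")
print(f"Plasma: {plasma:.0f} kWh/yr (~{plasma * PRICE_PER_KWH:.0f} per year)")
```

At these assumed figures the plasma uses roughly five times the energy, simply mirroring the ratio of the two wattages.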
Plasma TVs use more power than LCD TVs. In our tests of TVs since the beginning of 2008 plasmas consume, on average, roughly two to three times more electricity to produce an image of the same brightness as LCD. In the last couple of years, plasma TV makers have made some progress--Panasonic claims improvements of 30 percent yearly, for example--but they still can't compete with LCD for energy efficiency. One problem is that in plasma TVs, each pixel is a discrete light source (think of it as a tiny light bulb), so when resolution increases, say from 720p to 1080p, power use goes up as well. The intensity of light from each pixel must be increased to brighten the picture as a whole.
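The resolution point above comes down to pixel counts: 1080p has 2.25 times as many pixels as 720p, so with per-pixel emitters each pixel must be driven accordingly to keep the overall picture equally bright. A quick check of that ratio:

```python
# Pixel counts for the two common HD resolutions mentioned above.
pixels_720p = 1280 * 720     # 921,600 pixels
pixels_1080p = 1920 * 1080   # 2,073,600 pixels

ratio = pixels_1080p / pixels_720p
print(f"1080p has {ratio:.2f}x the pixels of 720p")  # 2.25x
```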
CNET follows the standards outlined in IEC 62087, the same methodology employed by Energy Star, to test TV power usage. Per these methods, TV power draw is tested in the default picture setting and in standby mode. At CNET, we also test two additional picture-setting scenarios: post-calibration and power saver. Unlike Energy Star, we also disable room lighting sensors in default mode, if possible, before running the test. Full details are in the Energy Star 3.0 testing requirements document (PDF). We have not made any changes to our testing with the advent of Energy Star 4.0.
To collect the data, we use a Chroma 66202 Digital Power Meter, which is designed to meet Energy Star/IEC 62301 measurement requirements. The meter automatically averages 1-second interval wattage measurements over the 10-minute test period, and is capable of accurate standby measurements. (In June 2009 we retired our old Watts Up meter, which could not accurately measure the fractional watts drawn by most new HDTVs in standby mode, so most measurements made before then show standby consumption of zero watts.)
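The averaging the meter performs amounts to a simple mean over the test window. A minimal sketch, with synthetic readings standing in for what a real meter would report:

```python
# Sketch of the meter's averaging: the mean of 1-second wattage samples
# over a 10-minute (600-sample) window. The `samples` list here is
# synthetic; a real run would read values from the power meter.

def average_watts(samples):
    """Mean power over the test window, in watts."""
    return sum(samples) / len(samples)

# 600 synthetic 1-second readings wobbling around 150 W
samples = [150 + (i % 3 - 1) * 0.5 for i in range(600)]
print(f"Average draw: {average_watts(samples):.2f} W")
```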
The basic test procedure goes as follows:
* Plug the television into the meter and the meter into a wall outlet.
* Connect a DVD player via an HDMI input.
* Insert the IEC 62087 test DVD, which contains a specific 10-minute clip of program material, into the player.
* Turn on the television.
* If the TV has a "home" mode available as a choice during initial setup, choose it and go through the rest of the initial setup process, making sure not to change any of the picture settings. This keeps the picture setting as close to "default" as possible.
* If the TV's default settings incorporate an automatic brightness control with a room lighting sensor, turn it off. If the control cannot be turned off, make sure the light level striking the sensor is as close to 300 lux as possible.
* Run the IEC test DVD for an hour by setting the 10-minute clip to repeat six times. This warms up the TV and stabilizes its power consumption.
* Default test: After the warm-up period is over, run the official 10-minute clip, making sure to increase the volume to a moderate level the tester can hear clearly.
* Power-saving mode test: If the TV has a power-saver mode, engage its most efficient setting, the one that uses the least power, and repeat the 10-minute clip.
* Post-calibration test: Repeat the 10-minute clip again in the calibrated picture settings, with a light output of 40 footlamberts measured on a specific window test pattern.
* Standby test: Turn the TV off and observe the meter's readout after it settles down into a steady standby power draw, typically after a minute or two.
* The three 10-minute clip tests, along with the reading from standby power, are reported in the Juice box attached to every TV review (see below).
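One common use of readings like these is an annual-energy estimate combining on-mode and standby draw. The sketch below is illustrative only; the wattages and usage split are assumptions, not CNET's published formula.

```python
# Hypothetical annual-energy estimate from an on-mode reading and a
# standby reading, as produced by the procedure above. The 5 hours/day
# split and the sample wattages are assumptions for illustration.

def annual_kwh(on_watts, standby_watts, on_hours_per_day=5.0):
    """Estimated kWh/year: on-mode hours plus standby for the rest of each day."""
    standby_hours = 24.0 - on_hours_per_day
    daily_wh = on_watts * on_hours_per_day + standby_watts * standby_hours
    return daily_wh * 365 / 1000

# e.g. a TV measured at 150 W in its default setting and 0.3 W in standby
print(f"Estimated: {annual_kwh(on_watts=150.0, standby_watts=0.3):.1f} kWh/yr")
```

Note how little standby contributes once it is down in the fractional-watt range the Chroma meter was bought to resolve.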
As we mentioned, Energy Star does not report on the effects of power saver or post-calibration picture settings. We chose to include these tests because they give a better overall picture of TV power use, rather than what you get simply with the default picture settings. The post-calibration results in particular are useful for leveling the field of comparison between different TVs. Default picture modes can vary widely in light output, and thus power use, but our calibration specifies a set light output.