I built myself a jig that lets me safely measure the power used by any appliance. It's basically two multimeters: one connected in parallel to measure the voltage, the other in series to measure the current.
I got the following results for my PC:
Antec Sonata ( Included 380 W Power supply )
ASUS A7N8X-E Deluxe
Athlon XP 2500+ ( Barton 166 MHz DDR FSB )
2x Kingston HyperX 256MB KHX2700/256 ( DDR 333MHz (PC2700) )
MSI GeForce3 Ti200 64MB ( home-built cooling fan and bracket )
Creative Labs SoundBlaster Live 5.1
Adaptec SCSI Host Adapter AHA2940
Pioneer DVR-107D ( DVD±RW 8X )
Pioneer DVD-117
Western Digital WD800JB 80GB 8MB Cache
During the time I took all the readings, the voltage stayed pretty stable at 126 V.
Power ( Watts ) = Voltage ( Volts ) * Current ( Amperes )
While idling at my Linux or Windows desktop, I measured 0.9 A, which gives a power of 113.4 W.
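If you don't want to multiply every reading by hand, a shell one-liner does the conversion ( awk is just what I happen to have handy; substitute your own volts and amps ):

awk -v v=126 -v i=0.9 'BEGIN { printf "%.2f W\n", v * i }'

which prints 113.40 W for the idle reading above.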
I tried playing various games ( UT2004 Linux, Return to Castle Wolfenstein Linux, Ghost Recon, NOLF2, Vice City ).
While playing most games, the current would fluctuate slightly between 1.04 A and 1.07 A, giving 131.04 W and 134.82 W respectively.
I was really surprised when I tried Vice City! The current fluctuates a lot depending on what is happening in the game. Just standing still in a corner or driving around "normally", I measured 0.96 A ( 120.96 W ). I then decided to see how much I could load the game, so I went on a rampage and fired at everything until I had the entire police force and army after me. The current then rose to 1.03 A ( 129.78 W ), but it didn't stay that high for too long. As soon as things cooled down in the game, the current dropped back to 0.96 A.
I then decided to try a little test that would load only the processor and not the graphics card, so I booted Linux and used the following command: cat /proc/kcore | bzip2 -9c > /dev/null. For those who are not familiar with Unix, it basically reads the entire system memory, compresses it and discards the result. ( Pretty good for wasting all your CPU cycles. )
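( Note: /proc/kcore is normally readable only by root, so you may need to run this as root. If you'd rather not, feeding the compressor any other fast source should waste cycles just as well, e.g. bzip2 -9c < /dev/urandom > /dev/null )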
While it was running, I measured a current of 1.07 A, which led me to the following conclusion: the thing that influences the power draw the most is CPU usage. That leaves only two possibilities for the video adapter: either it draws a lot whether it's working or idle, or it doesn't draw much either way. Using my fingers as thermal probes, I was able to feel the temperature of the video card rising when it was under load.
Since the amount of heat dissipated by a computer component is pretty much directly related to the power it draws, this suggests that the video adapter's draw does get bigger under load. I have not collected enough data to nail down the impact of the GeForce3 Ti200 on the total system power draw.
When the HD went into sleep mode ( spun down ), the system power dropped by ~10 W.
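If you don't want to wait for the idle timeout to see this, hdparm can force the drive into standby on demand ( assuming your disk is /dev/hda; adjust for your setup ): hdparm -y /dev/hda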
I also tested the load generated by using the network and the optical drives, but I don't remember the figures. However, I do remember that it was nowhere near what the CPU was able to draw.
I then thought about cpuburn... evil! I tried the different versions under Linux and the K7 version was the one that generated the greatest load. A whopping 1.2 A! That's 151.2 watts! I was never able to go over 1.07 A with "regular" system use. It confirms my theory that the CPU is the greatest power sucker.
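For anyone wanting to repeat this, cpuburn ships one small binary per CPU family ( burnP6, burnK7 and so on, if I remember the package right ), so on an Athlon you just run burnK7 from a shell and watch the meter; killall burnK7 stops it.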
Further tests should include testing the system without the display adapter, testing while underclocking the CPU and testing with all the drives unplugged.
I also measured the power used by my monitor, a ViewSonic G773 ( 17" ). The brighter the image, the more power it used. It maxed out around 100 W, which is its maximum rated value.
I hope my results will help you estimate your system's power requirements, and as soon as I have some spare time, I will do further testing.
p.s.: Sorry for all the mistakes, I don't write this much in English very often anymore.