nVidia 6150 vs. AMD 690G
Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee
The most obvious scenario not mentioned is multi-tasking. It's all very well and good if your CPU is able to decode 1080p, but that precludes using it for much of anything else at the same time. Offload video decode onto specialised hardware and you'll still have plenty of CPU cycles available to record something with your HTPC's PVR, download stuff, encode video, etc., whilst still watching your Blu-ray/HD DVD movie.
I was under the impression that having hardware decoding would increase the idle power usage considerably. This should also apply to tuner cards (with hardware decoding).

Mariner wrote: If, on the other hand, NVidia/ATI/Intel support full hardware decode on their graphics chips, this specialised support ought to take up just a relatively small area of the die space and therefore use less power.
By using software decoding you would tax the CPU more but still use less power overall.
Power-wise, hardware decoding is the best way to go. Look at the MyHD HDTV card: it does all the HDTV decoding legwork and has only a tiny little heatsink. On the other hand, most HDTV cards don't use hardware decoding and see CPU usage on the order of 30%-100%. Using a CPU seems to cost about 10x as much power as a dedicated part.

Kaoru wrote: By using software decoding you would tax the CPU more but still use less power overall.
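A rough sanity check on that "~10x" claim, as a sketch. The 65W TDP, the 30% usage figure, and the 2W decoder draw are all illustrative assumptions (the thread quotes a 65W Brisbane and 30%-100% decode usage; the decoder figure is a guess based on the MyHD's tiny heatsink):

```python
# Back-of-the-envelope check: software decode on a CPU vs. a small
# dedicated decoder chip. All figures are illustrative assumptions.

CPU_TDP_W = 65.0           # e.g. a 65 W TDP desktop CPU (Brisbane-class)
DECODE_CPU_USAGE = 0.30    # ~30% CPU usage for software decode (low end of the range)
DEDICATED_DECODER_W = 2.0  # assumed draw of a small hardware decoder

# Very crude linear scaling of power with CPU utilisation
cpu_decode_w = CPU_TDP_W * DECODE_CPU_USAGE
ratio = cpu_decode_w / DEDICATED_DECODER_W

print(f"Software decode: ~{cpu_decode_w:.0f} W")
print(f"Ratio vs. dedicated decoder: ~{ratio:.0f}x")
```

Even at the low end of the quoted CPU-usage range, the ratio lands in the neighbourhood of 10x, which is consistent with the claim above.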
If HD decoding is the only significant demand on your CPU, then by offloading as much of it as possible onto dedicated hardware, you can save power by getting a less powerful CPU (lower speeds, lower voltages).
I think this raises the question, however, especially when it comes to the new 8xxx-series nVidia cards: which is better, using the CPU to decode video, or installing a dedicated video card to do the decoding for you?
If you have just the CPU doing the legwork with onboard video, you can get a very low idle power draw, with decoding pushing CPU use to 60-100%; with, say, a Brisbane with a 65W TDP, that probably means jumping up by 30-50W under load.
Compare that to a dedicated video board (I'm not talking about the dedicated PCI decoders like the MyHD or the Hauppauge varieties). XBit Labs says that the 8600 GTS, as an example, draws 21W idle and 47W at max load (unfortunately that's 3D load, which may not be comparable to video decoding). So your decoding load power would probably be about the same whether you had a dedicated card like this or not, but the idle power use would be 21W higher due to the video card.
For me, I know my HTPC sits idle about 90% of the time... so those extra watts when the machine is sitting there would add up.
Of course, as pipp mentioned, getting a lower power/speed CPU would help this equation out, but the Brisbanes are pretty close to the bottom of the spectrum already, unless you want to spend the money getting a mobile or VIA/high efficiency CPU.
For those thinking of video performance, it looks like AMD have a BIOS update for you.
http://www.xtremesystems.org/forums/sho ... p?t=142804
1080p MPEG2: was 2.8 GHz, now 1.8 GHz
1080p VC-1: was 3.0 GHz, now 2.4 GHz
I just built a system for my family... So I can add my own data point to this discussion.
65nm X2-3600+ on an Asus M2NPV-VM (6150) in an NSK3300.
With 1GB of DDR2-800, a single HD, a single optical drive, and a wifi card, I got it down to the following power consumption (at the wall), undervolted:
S3 Sleep (took some effort to get that working) - 4W
Idle - 1.0 GHz/0.800V - 50W
Full CPU + Integrated Graphics Load - 1.9GHz/0.950V - 75-80W
System is acceptably quiet with the rear TriCool on Low and the Asus Q-Fan system keeping the stock cooling fan at low RPMs.
Incidentally, I just noticed the following report:
http://www.dailytech.com/article.aspx?newsid=7147
It indicates that AMD's UVD won't find its way into an integrated chipset until Q1 2008.
I'd imagine this probably means that NVidia's equivalent won't be around until that time either and who knows about Intel?
Still, we're getting there, slowly but surely.