nVidia 6150 vs. AMD 690G


Mariner
Friend of SPCR
Posts: 260
Joined: Thu Jan 06, 2005 11:25 am

Post by Mariner » Mon Apr 30, 2007 2:55 am

The most obvious scenario not mentioned is multi-tasking. It's all very well if your CPU is able to decode 1080p, but doing so precludes using it for much of anything else at the same time. Offload video decode onto specialised hardware and you'll still have plenty of CPU cycles available to record something using your HTPC PVR/download stuff/encode video/etc. whilst still watching your Blu-ray/HD DVD movie.

Kaoru
Posts: 11
Joined: Thu Apr 05, 2007 3:52 am

Post by Kaoru » Mon Apr 30, 2007 6:50 am

Mariner wrote:If, on the other hand, NVidia/ATI/Intel support full hardware decode on their graphics chips, this specialised support ought to take up just a relatively small area of the die space and therefore use less power.
I was under the impression that having hardware decoding would increase the idle power usage considerably. This should also apply to tuner cards with hardware decoding.
By using software decoding you would tax the CPU more but still use less power overall.

pipperoni
Posts: 218
Joined: Sun Oct 24, 2004 9:10 pm
Location: Toronto

Post by pipperoni » Mon Apr 30, 2007 7:44 am

Kaoru wrote:By using software decoding you would tax the CPU more but still use less power overall.
Power-wise, hardware decoding is the best way to go. Look at the MyHD HDTV card: it does all the HDTV decoding legwork and has only a tiny little heatsink. On the other hand, most HDTV cards don't use hardware decoding and produce CPU usage on the order of 30-100%. Using a CPU seems to cost about 10x as much power as a dedicated part.

If HD decoding is the only significant demand on your CPU and you offload as much of it as possible onto dedicated hardware, then you can save power by getting a less powerful CPU (lower speeds, lower voltages).
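The "10x" figure above roughly checks out on a back-of-the-envelope basis, crudely assuming CPU power scales linearly with utilisation (it doesn't exactly, but it's in the right ballpark). The wattages here are illustrative guesses, not measurements from any specific card or CPU:

```python
# Rough sanity check of the "CPU costs ~10x a dedicated part" claim.
# All numbers are illustrative assumptions, not measured values.

cpu_tdp_w = 65.0          # typical desktop CPU TDP of the era
decode_utilization = 0.3  # low end of the 30-100% usage range quoted above
cpu_decode_w = cpu_tdp_w * decode_utilization  # crude linear scaling

dedicated_decoder_w = 2.0  # guess for a chip that needs only a tiny heatsink

print(f"CPU decode:  ~{cpu_decode_w:.0f} W")
print(f"Dedicated:   ~{dedicated_decoder_w:.0f} W")
print(f"Ratio:       ~{cpu_decode_w / dedicated_decoder_w:.1f}x")
```

Even at the low end of the quoted CPU-usage range, the dedicated part comes out roughly an order of magnitude ahead under load.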

Maelwys
Posts: 85
Joined: Wed Feb 28, 2007 6:27 pm
Location: Washington D.C.

Post by Maelwys » Mon Apr 30, 2007 9:38 am

I think this raises the question, however, especially when it comes to the new 8xxx series nVidia cards: which is better, using the CPU to decode video, or installing a dedicated video card to do the decoding for you?

If you have just the CPU doing the legwork with onboard video, you can get a very low idle power draw; under decoding load the CPU runs at perhaps 60-100% utilisation, which, for say a Brisbane with a 65W TDP, probably means jumping up by 30-50W.

Compare that to a dedicated video board (I'm not talking about the dedicated PCI decoders like the MyHD or the Hauppauge varieties). XBit Labs says that the 8600 GTS, as an example, draws 21W idle and 47W at max load (unfortunately that's 3D load, which may not be comparable to video decoding). So your decoding load power would probably be about the same whether you had a dedicated card like this or not, but the idle power use would be 21W higher due to the video card.

For me, I know my HTPC sits idle about 90% of the time... so those extra watts when the machine is sitting there would add up.
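The duty-cycle argument above can be sketched as a weighted average. The wattages are the estimates quoted in this thread (a ~50W undervolted idle, ~90W under decode load, the 21W idle figure XBit Labs gives for the 8600 GTS, and a 90% idle duty cycle), not measurements of any particular build:

```python
# Duty-cycle-weighted average wall power for the two setups Maelwys
# compares. All wattages are estimates quoted in the thread.

def average_power(idle_w, load_w, idle_fraction):
    """Average draw given the fraction of time the machine sits idle."""
    return idle_w * idle_fraction + load_w * (1 - idle_fraction)

IDLE_FRACTION = 0.9  # "my HTPC sits idle about 90% of the time"

# Scenario A: integrated graphics, CPU does the decoding.
cpu_only = average_power(idle_w=50, load_w=90, idle_fraction=IDLE_FRACTION)

# Scenario B: dedicated card adds ~21W at idle; decode load power is
# assumed roughly the same, per the post above.
with_card = average_power(idle_w=50 + 21, load_w=90, idle_fraction=IDLE_FRACTION)

print(f"CPU-only average: {cpu_only:.1f} W")
print(f"With card:        {with_card:.1f} W")
```

With the machine idle 90% of the time, nearly all of the card's 21W idle overhead shows up in the average, which is exactly why the idle figure dominates the comparison for an HTPC.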

Of course, as pipp mentioned, getting a lower power/speed CPU would help this equation out, but the Brisbanes are pretty close to the bottom of the spectrum already, unless you want to spend the money getting a mobile or VIA/high efficiency CPU.

dragmor
Posts: 301
Joined: Sun Jul 10, 2005 7:54 pm
Location: Oz

Post by dragmor » Wed May 02, 2007 8:24 pm

For those thinking of video performance, it looks like AMD have a BIOS update for you.

http://www.xtremesystems.org/forums/sho ... p?t=142804
1080p MPEG2: was 2.8 GHz, now 1.8 GHz
1080p VC-1: was 3.0 GHz, now 2.4 GHz

Jumper
Posts: 72
Joined: Thu Jan 18, 2007 6:33 pm

Post by Jumper » Wed May 02, 2007 8:32 pm

I just built a system for my family... So I can add my own data point to this discussion.

65nm X2-3600+ on an Asus M2NPV-VM (6150) in an NSK3300.

With 1GB of DDR2-800, a single HD, a single optical drive, and a wifi card, I got it down to the following power consumption (at the wall) undervolted:

S3 Sleep (took some effort to get that working) - 4W
Idle - 1.0 GHz/0.800V - 50W
Full CPU + Integrated Graphics Load - 1.9GHz/0.950V - 75-80W

System is acceptably quiet with the rear TriCool on Low and the Asus Q-Fan system keeping the stock cooling fan at low RPMs.

Mariner
Friend of SPCR
Posts: 260
Joined: Thu Jan 06, 2005 11:25 am

Post by Mariner » Thu May 03, 2007 9:08 am

Incidentally, I just noticed the following report:

http://www.dailytech.com/article.aspx?newsid=7147

It indicates that AMD's UVD won't find its way into an integrated chipset until Q1 2008. :(

I'd imagine this probably means that NVidia's equivalent won't be around until that time either and who knows about Intel?

Still, we're getting there, slowly but surely.
