Dual graphics - to save energy! (Why not?)

Ecological issues around computing. This is an experimental forum.

Moderators: Ralf Hutter, Lawrence Lee

nicke2323
Posts: 38
Joined: Tue May 22, 2007 6:23 am

Dual graphics - to save energy! (Why not?)

Post by nicke2323 » Tue May 29, 2007 2:32 pm

Given the extreme power requirements of current high-performance graphics cards, and the (relatively speaking) negligible power needs of entry-level cards and integrated graphics, I wonder why nobody has developed a card that can be switched off completely. Then we could run on integrated graphics most of the time and reboot when graphics performance is needed for gaming or HD video decoding. The power savings would offset the extra cost of a basic card in no time at all. (The 8800GTX draws 70W at idle and 130W at load. Just the card. See http://www.xbitlabs.com/articles/video/ ... dup_6.html)
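To put a rough number on those savings, here is a back-of-envelope sketch. The 70W idle figure is from the xbitlabs numbers above; the integrated-GPU idle draw, daily usage hours, and electricity price are all assumptions for illustration, not measured values.

```python
# Rough estimate of energy saved by idling on integrated graphics
# instead of a discrete card. Only the 70W figure comes from the
# xbitlabs data above; everything else is an assumed input.

DISCRETE_IDLE_W = 70     # 8800GTX idle draw (xbitlabs)
INTEGRATED_IDLE_W = 5    # assumed idle draw of integrated graphics
HOURS_PER_DAY = 8        # assumed hours/day spent at idle or 2D work
PRICE_PER_KWH = 0.10     # assumed electricity price, USD/kWh

saved_w = DISCRETE_IDLE_W - INTEGRATED_IDLE_W
kwh_per_year = saved_w * HOURS_PER_DAY * 365 / 1000
dollars_per_year = kwh_per_year * PRICE_PER_KWH

print(f"~{kwh_per_year:.0f} kWh/year, ~${dollars_per_year:.0f}/year saved")
```

Under these assumptions the saving is on the order of 190 kWh (roughly $19) per year, so a cheap second card would pay for itself in a few years; heavier daily use or pricier electricity shortens that considerably.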

I can't imagine why this couldn't be easily implemented. In fact, it's already available in Sony VAIO SZ-series laptops. So why not in desktops?

Similarly, I wish Intel and AMD would make it possible to switch off entire cores of their multi-core processors. I read somewhere that asymmetric use hurts the processor, but then wouldn't it be possible to alternate the active core?

Really, why should a desktop computer burn 10X the power of a laptop when the workload is pretty much identical most of the time?

Sorry for ranting.

SebRad
Patron of SPCR
Posts: 1121
Joined: Sun Nov 09, 2003 7:18 am
Location: UK

Post by SebRad » Tue May 29, 2007 3:27 pm

Coming to a laptop near you soon, well, vaguely soon-ish, from AMD/ATI, branded PowerXPress:

"Even more interesting is the platform's support for a feature called PowerXPress. The idea is that all Puma notebooks will have integrated graphics, but they can also have optional discrete graphics for better gaming performance. AMD is confident that its integrated graphics will be lower power than any discrete graphics solution, so when you're running on battery power the external graphics core would be disabled and your display would run off of the integrated GPU. On AC power, in the max performance mode, the internal graphics gets disabled and the external GPU is operational.

According to AMD, this switchover happens seamlessly; there's no reboot required and there should be hardly any interruption in use. We have yet to see it in action and are understandably skeptical of how smooth the transition between GPUs would be, but AMD claims that it works and very well at that. AMD also mentioned that this functionality could be overridden by a control panel if need be."
Not all GPUs are so power-hungry: the now-dated Radeon 9600 series used under 20W at load for the lower models, and will drive Vista's Aero Glass and older games just fine. I'm using an X1950 Pro, which I believe draws ~50W under load and is not exactly a slow card, but the technology above does look promising.

Seb

Bigg
Posts: 154
Joined: Fri Nov 08, 2002 4:05 pm

Post by Bigg » Wed May 30, 2007 7:23 am

I don't think that there is much demand for this sort of thing. 95% of computer users are just fine with integrated graphics. Then, some of the 5% who are 1337 g4m3rz have separate systems with loud fans and power-hungry components for gaming, and a more tame system for regular stuff. Plus, that keeps the games separate from everything else and vice versa.

Rusty075
SPCR Reviewer
Posts: 4000
Joined: Sun Aug 11, 2002 3:26 pm
Location: Phoenix, AZ
Contact:

Post by Rusty075 » Wed May 30, 2007 1:38 pm

Bigg does have a point. I think a lot of us forget that the biggest supplier of GPUs isn't Nvidia or ATI. It's Intel. In fact Intel supplies a higher percentage of the graphics market than Nvidia and ATI/AMD combined. Stand-alone graphics cards account for something like ~8% of the market, and multi-GPU sales were 0.4%, according to sales figures for last year.

So in terms of "green" impact, the few thousand 300W monsters that get sold a year are a drop in the bucket compared to the millions of 15W integrated GPUs.

But, even with that, there's no reason a standalone GPU couldn't idle at as low a wattage as an integrated one does. The process technology and architecture of the standalones are generations ahead of the Intel iGPUs. They just need drivers that let them downclock to a fraction of their maximum speeds, and shut down the superfluous shader units and pipelines that aren't needed for 2D work. Unfortunately, the media and marketing that drive standalone GPU development don't seem to care about wattage at all, so there's no real push for improvements.

Bigg
Posts: 154
Joined: Fri Nov 08, 2002 4:05 pm

Post by Bigg » Wed May 30, 2007 1:48 pm

Good point about idling at a lower power. One problem, however, is that with Vista and the interface that makes people want to puke, you need 3D graphics all of the time. Of course, you can turn that off and go back to the somewhat normal-looking, Windows 2000-ish look. Using 3D graphics on the desktop is a HUGE blunder by Microsoft. Anything beyond the Windows 2000 look with no visual effects is just a waste. It is a waste of power, a waste of time, a waste of computing power, a waste of my eyeballs.

Linus
Posts: 184
Joined: Fri May 21, 2004 12:47 pm

Post by Linus » Mon Jun 04, 2007 12:30 pm

AnandTech reported in "AMD's Next-Generation Mobile Architecture Revealed: Griffin" that AMD intends to add a feature called PowerXPress to their upcoming mobile platform, Griffin. These systems will have both integrated and discrete graphics cores, and will supposedly switch seamlessly between them depending on whether the AC power brick is plugged in or not.

I doubt this will be making it to desktop machines anytime soon, but at least someone's working on the technology.
