
The future of silent computing vs. dual core, SLI, etc.

Posted: Thu Jun 23, 2005 3:36 pm
by kojak71
I don't know exactly where to post this, but I wanted to discuss the future of silent computing, what with the onslaught of ever more powerful CPUs and graphics cards. Whenever a new and (typically) more powerful CPU comes out, production of slower, quieter CPUs is stopped. Gradually the lowest (quietest) common denominator is being raised. Will there be a day, I wonder, when the TDP of a P4 Prescott is considered good by the SPCR community?

Sure, things like SLI and dual core aren't important now. But as the user base grows, more and more applications will be written to take advantage of such technologies, leaving those who cling to older, quieter setups frustrated with the slowdown of their PCs. Parallelism is already affecting the applications we use; e.g. the MCE 2005 workflow is cripplingly slow on a single-threaded CPU when creating a DVD from a recording while recording one TV programme and watching another. High-definition video editing has very large processor requirements.

The evidence is clear: we are going to need more power in the future. OS requirements keep getting larger, the average PSU rating has increased, and people are using their PCs for more and more things.

Posted: Thu Jun 23, 2005 4:57 pm
by Shadowknight
Actually, Intel's gone on record saying that they'll be moving to a more Pentium-M style, thermally efficient chip in 5 yrs.

Now graphics cards WILL be a problem while ATI and Nvidia are in a race to outdo each other instead of concentrating on more efficient designs. It's easier to keep making faster, hotter GPUs than to take the time to re-engineer them.

Posted: Thu Jun 23, 2005 5:17 pm
by teknerd
I agree with Shadowknight. I think the bigger problem is in the graphics arena rather than processors. Especially since laptops are becoming more popular (last month was the first time ever that laptops outsold desktops), AMD and Intel have no choice but to produce more efficient chips. Graphics card companies, however, don't have the same incentive, because the people who care about high performance (mainly gamers) don't care very much about noise. And whereas business people might need powerful CPUs for their work, most outside of the content creation and 3D modelling/development world don't need powerful graphics cards.

Posted: Thu Jun 23, 2005 5:51 pm
by Shining Arcanine
Shadowknight wrote:Actually, Intel's gone on record saying that they'll be moving to a more Pentium-M style, thermally efficient chip in 5 yrs.

Now graphics cards WILL be a problem while ATI and Nvidia are in a race to outdo each other instead of concentrating on more efficient designs. It's easier to keep making faster, hotter GPUs than to take the time to re-engineer them.
I don't think that is entirely true, considering that Nvidia's new GeForce 7800 GTX uses less power than their GeForce 6800 Ultra while eliminating the use of shader replacement and performing up to three times better in non-CPU-bound applications. I think that is an indicator that they're making strides in the right direction.

Posted: Fri Jun 24, 2005 1:26 am
by kojak71
I agree that some inroads have been made into more efficient designs, but I don't think that "quiet" is their primary design brief. Intel's original roadmap for the P4 had clock speeds of up to 10 GHz, but limitations in materials science have seen them abandon the GHz race. Once they find a suitable process, they'll return to it.

As for talk of Pentium M architectures, their design brief is battery life, first and foremost. Although good for general computing, these chips aren't good for, say, video editing.

As for nVidia's lower-wattage graphics cards, most of this is down to the shrink of the GPU die (the move from a 130nm to a 110nm process). Again, it's a happy coincidence that the resultant chips run a little cooler. If they don't have a further die shrink, their next-generation cards will run hotter. Will we eventually see external power bricks for 3D cards, thus showing a little mercy to our PSUs?

Posted: Sat Jun 25, 2005 9:37 pm
by AZBrandon
When I look back at the old PCs I've owned: my first 286, my 386, two different 486s, two Pentiums, and on up to my current PCs, they've each gotten quieter than the previous generation even though they've gotten more powerful. I imagine that's how the trend will continue - they'll find some way to make the systems quieter regardless of their processing power.

Posted: Sat Jun 25, 2005 11:49 pm
by rpsgc
Dual-Core 2.0GHz Dothan? :P

Posted: Sun Jun 26, 2005 2:17 am
by lm
I think there are limits to this. For example, a 1kW machine would heat a room very effectively, and, say, 5kW would make the room a (lousy) sauna.

Posted: Sun Jun 26, 2005 4:53 am
by kojak71
AZBrandon wrote:When I look back at the old PCs I've owned: my first 286, my 386, two different 486s, two Pentiums, and on up to my current PCs, they've each gotten quieter than the previous generation even though they've gotten more powerful. I imagine that's how the trend will continue - they'll find some way to make the systems quieter regardless of their processing power.
In those days, most of the noise was generated by the hard disk and the PSU. At that time, there were no such things as 3D cards, and CPUs didn't have fans on their heatsinks.

When I read stories that 800W PSUs are being developed, even if we assume 80% efficiency, that's potentially 160W of heat having to be dissipated from the PSU alone. Of course that's assuming the PSU is running at 100% load, and of course that sounds ridiculous TODAY, but when you consider how much more your PC is doing, the baseline for minimum wattage has been on the increase. A few years ago, 150W was the average; now it's around the 250W mark. As the average load goes up, so does the amount of heat generated, and we've only talked about the PSU. What about the heat emitted from the CPU, VGA card, mobo and hard drives? (There's even going to be an add-on card which will compute real-world physics; expect this to be the next big thing, just as 3D cards were.)

Perhaps we should be looking to harness this energy to heat our homes :)

Posted: Sun Jun 26, 2005 5:37 am
by rpsgc
kojak71 wrote:Perhaps we should be looking to harness this energy to heat our homes :)
Shhh... or the men in black will silence you! :twisted:

Posted: Sun Jun 26, 2005 6:24 am
by swivelguy2
kojak71 wrote:Perhaps we should be looking to harness this energy to heat our homes :)
Easy: just have a folding farm that you only turn on in the winter. Your house furnace will automatically run that much less.

About the trend of heat vs. time, it's not all bad. The heat output of CPUs does go up gradually as clock speeds are increased and designs are pushed to the limit, but every once in a while, a large jump downward in power occurs.

For example, Athlons and Athlon 64s were steadily climbing up towards the 100 watt mark, until the new Winchester and Venice cores dropped the power back down to the 40s or so. Similarly with Intel: the P4 was bad, Prescott got a bit worse, and the Extreme Edition made it worse still for another marginal gain in performance, but switching to the Pentium M architecture drops power usage from 110+ watts to ~50 watts, which gives room for squeezing some more performance out of the chips until the heat gets to a level where another architecture change is needed, and the cycle repeats.

As technology advances, the ratio of processing power to thermal power can only increase.

Posted: Sun Jun 26, 2005 6:33 am
by Shining Arcanine
kojak71 wrote:As for talk of Pentium M architectures, their design brief is battery life, first and foremost. Although good for general computing, these chips aren't good for, say, video editing.
The Athlon 64 X2 processors changed the Athlon's performance in video encoding relative to the Pentium 4 due to their high efficiency. Given that the Pentium M has even higher efficiency, two of them (Yonah) should be able to roughly match or outperform a high-end Pentium 4 in video encoding.
kojak71 wrote:As for nVidia's lower-wattage graphics cards, most of this is down to the shrink of the GPU die (the move from a 130nm to a 110nm process). Again, it's a happy coincidence that the resultant chips run a little cooler. If they don't have a further die shrink, their next-generation cards will run hotter. Will we eventually see external power bricks for 3D cards, thus showing a little mercy to our PSUs?
Most of it is actually the result of the transistors they used for the 110nm process. I don't know how TSMC and Nvidia collaborate on the designs, but if there are multiple transistor types available (as is true for Intel's and IBM's processor design teams), then they probably selected the ones that provided the best performance while keeping the thermal envelope smaller than the GeForce 6800 Ultra's.
kojak71 wrote:When I read stories that 800W PSUs are being developed, even if we assume 80% efficiency, that's potentially 160W of heat having to be dissipated from the PSU alone. Of course that's assuming the PSU is running at 100% load, and of course that sounds ridiculous TODAY, but when you consider how much more your PC is doing, the baseline for minimum wattage has been on the increase. A few years ago, 150W was the average; now it's around the 250W mark. As the average load goes up, so does the amount of heat generated, and we've only talked about the PSU. What about the heat emitted from the CPU, VGA card, mobo and hard drives? (There's even going to be an add-on card which will compute real-world physics; expect this to be the next big thing, just as 3D cards were.)
It is actually 200 watts of power wasted if you take into account that the 800 watt figure is the maximum DC power provided rather than the AC power used: 800W out at 80% efficiency means 1000W drawn from the wall, so 200W is lost as heat in the PSU.

Also, my tower uses less than 147 watts of AC power (I'm going by my UPS's measurement, which includes other things in the figure), so 250 watts of AC power is far from average, especially if we're talking about DC power rather than AC power.
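
A quick back-of-the-envelope sketch in Python (purely my own illustration, using the 800W rating and 80% efficiency assumed in the posts above) showing where the 200W and 160W figures come from:

# Heat dissipated inside a PSU is the difference between the AC power
# drawn from the wall and the DC power delivered to the components.
def psu_heat_watts(dc_output_watts, efficiency):
    ac_input_watts = dc_output_watts / efficiency
    return ac_input_watts - dc_output_watts

print(psu_heat_watts(800, 0.80))  # 200.0 W lost as heat at full load
# The 160W figure comes from treating 800W as the AC input instead of the DC output:
print(800 * (1 - 0.80))           # 160.0 W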

Posted: Tue Jun 28, 2005 2:45 pm
by pony-tail
AZBrandon wrote:When I look back at the old PCs I've owned: my first 286, my 386, two different 486s, two Pentiums, and on up to my current PCs, they've each gotten quieter than the previous generation even though they've gotten more powerful. I imagine that's how the trend will continue - they'll find some way to make the systems quieter regardless of their processing power.
I started with a Mac Plus 1MB - totally silent, fanless, but monochrome. For silence it is a hard act to follow. But I am hoping!

Posted: Tue Jun 28, 2005 3:43 pm
by StarfishChris
pony-tail wrote:I started with a Mac Plus 1MB - totally silent, fanless, but monochrome. For silence it is a hard act to follow. But I am hoping!
Amiga 1200 (without a hard drive). Upgrade to an '060 and you're set.

Posted: Wed Jun 29, 2005 4:41 am
by mb2
There is a slight advantage to all this 'dual' everything: companies will have to make their chips relatively cooler (so that two of them won't overheat), so those of us who want only one of them will run cooler.

Also, the extra heat means more extreme cooling solutions (not that you can get much more extreme than what we have now, for air at least), which means quieter running for those of us not running infernos..

..in theory