Does the CPU currently matter with high end graphics cards?

Our "pub" where you can post about things completely Off Topic or about non-silent PC issues.

Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee

CA_Steve
Moderator
Posts: 7651
Joined: Thu Oct 06, 2005 4:36 am
Location: St. Louis, MO

Does the CPU currently matter with high end graphics cards?

Post by CA_Steve » Mon Feb 19, 2007 9:25 pm

Interesting article where the author runs tests with the 8800 GTS and GTX at a fixed 1600x1200 resolution/AA/AF and then swaps out every Core 2 Duo available to him.

~El~Jefe~
Friend of SPCR
Posts: 2887
Joined: Mon Feb 28, 2005 4:21 pm
Location: New York City zzzz

Post by ~El~Jefe~ » Mon Feb 19, 2007 9:40 pm

er hm.

this isn't normal for most systems. What the heck is bottlenecking what here?

This is a strange sort of test. I don't think I have ever seen results like this :)

They are what they are, though. Looks like the 8900 and ATI's R600 are going to exceed what people need. Hm. Intriguing.

I wouldn't concern myself with gaming marks on Core 2 Duo. The E4300 that just came out can be pushed past 3 GHz and provide a couple of years' worth of game-pounding fun. Seems like Intel Core 2 Duos just open up the gaming world regardless of which one you get. The Empire Strikes Back.

K10 from AMD... where are you... :(

dragmor
Posts: 301
Joined: Sun Jul 10, 2005 7:54 pm
Location: Oz

Post by dragmor » Mon Feb 19, 2007 9:50 pm

The same thing happened with people pairing ATI 9700s with P4s and Athlon XPs below 2000. Every now and again, one will overtake the other.

jessekopelman
Posts: 1406
Joined: Tue Feb 13, 2007 7:28 pm
Location: USA

Post by jessekopelman » Tue Feb 20, 2007 2:19 am

I don't remember CPU ever being more important than GPU for games and I'm going back to the 1980s on this one. If you read reviews on sites like Anandtech, it becomes clear pretty quickly that the GPU is far more important than the CPU for producing high FPS in games. Having a dual-core CPU doesn't even help for most games. If you are building a game machine, buy an Athlon 64 3800+ and put the extra $100+ that a C2D would have cost toward a better graphics card. If you are a hardcore gamer, you'll probably still be feeling the need to upgrade the graphics card before swapping in an X2.

belkincp
Posts: 96
Joined: Mon Feb 05, 2007 4:22 am

I've been building my own gaming machines since 1990

Post by belkincp » Tue Feb 20, 2007 4:24 am

and I can tell you that in the age we're living in, graphics cards matter more than ANYTHING for fps.
I have an Athlon 64 3700+. If I put my Radeon 9600 in instead of my X800 GTO, Quake 4 goes from 76 fps to 14 fps running at 1600x1200, everything on highest. If this doesn't say something about GPUs, I don't know what does.

CA_Steve
Moderator
Posts: 7651
Joined: Thu Oct 06, 2005 4:36 am
Location: St. Louis, MO

Post by CA_Steve » Tue Feb 20, 2007 7:08 am

Well, it's been known with DX9 that the GPU rules the roost, but also that you need a bit of speed from the CPU due to the inefficient way DX9 works (lots of CPU calls per frame). I was just surprised that the baseline CPU needed is still below the level of a stock-clocked E6300.

With DX10, the CPU will matter even less.
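
To put a number on those CPU calls, here's a toy model (every cost below is an invented figure, purely for illustration, not a measurement): each frame the CPU has to submit all the draw calls, the GPU renders them, and whichever side takes longer sets the frame rate.

Code:

# Toy model of a DX9-style frame: the CPU submits every draw call,
# the GPU does the rendering, and the slower side sets the frame time.
# All costs are made-up numbers, just to show the shape of the curve.
def fps(draw_calls, cpu_us_per_call, gpu_frame_ms):
    cpu_ms = draw_calls * cpu_us_per_call / 1000.0  # CPU submission work
    return 1000.0 / max(cpu_ms, gpu_frame_ms)       # bottleneck side wins

for calls in (1000, 4000, 16000):
    print(calls, "draw calls:", round(fps(calls, 5.0, 12.0), 1), "fps")

With these made-up numbers the first case is GPU-bound (a faster CPU changes nothing) and the later ones are CPU-bound, which is the shape you'd expect from the article's fixed 1600x1200 results.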

Tzupy
*Lifetime Patron*
Posts: 1561
Joined: Wed Jan 12, 2005 10:47 am
Location: Bucharest, Romania

Post by Tzupy » Tue Feb 20, 2007 10:49 am

Actually, in strategy games the CPU might be more important than the GPU, at least when there are many units in the game (not just on screen).
I can still experience a few slowdowns when playing an old game, C&C Generals, when confronting 7 brutal armies.
That's with a C2D E6600 and a 7900 GTX, at 1600x1200. Of course, after nuking the bulk of the enemies, no slowdowns. :lol:

penny
Posts: 20
Joined: Fri Jan 05, 2007 8:09 pm

Post by penny » Wed Feb 21, 2007 2:33 am

Agreed, it totally depends on the game.

Supreme Commander is barely playable without a dual-core system :)

And the thing with graphics is that you can turn them down. If the system just flat-out needs more CPU, there's not much you can do (turning down max units, I suppose, but that's a more radical change than, say, less AA...).
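
Rough sketch of why that is (the per-unit and per-pixel costs are made up, just to show the shape): render work scales with pixels, which you can turn down, while simulation work scales with unit count, which is the gameplay itself.

Code:

# Toy model with invented costs: GPU work scales with pixels (tunable),
# CPU simulation work scales with unit count (not really tunable).
def frame_ms(units, pixels, sim_us_per_unit=20.0, gpu_ns_per_pixel=8.0):
    sim_ms = units * sim_us_per_unit / 1000.0    # AI/pathfinding on the CPU
    render_ms = pixels * gpu_ns_per_pixel / 1e6  # rendering on the GPU
    return max(sim_ms, render_ms)                # the slower side limits fps

# Dropping the resolution stops helping once the simulation is the wall:
for res, px in (("1600x1200", 1600 * 1200), ("1024x768", 1024 * 768)):
    print(res, "with 2000 units:", round(1000.0 / frame_ms(2000, px), 1), "fps")

Both resolutions land on the same framerate once the 2000-unit simulation dominates, which is exactly the Supreme Commander situation.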

s_xero
Posts: 154
Joined: Sun Sep 10, 2006 2:56 pm

Post by s_xero » Wed Feb 21, 2007 4:51 am

Definitely depends on the game...

The GeForce 8800 GTX is bottlenecked by the AMD X2 series (even the FX) compared to a Core 2 Duo platform.

The bottleneck shows up especially in Supreme Commander: with thousands of units in one game, the CPU is heavily loaded by the AI.

So don't say that slow CPUs are enough yet :P

Byakko
Posts: 11
Joined: Thu Nov 16, 2006 11:35 am

Post by Byakko » Wed Feb 21, 2007 10:37 am

Here's another review that says the opposite. From Tom's Hardware:

http://www.tomshardware.com/2006/11/29/ ... index.html

jessekopelman
Posts: 1406
Joined: Tue Feb 13, 2007 7:28 pm
Location: USA

Post by jessekopelman » Thu Feb 22, 2007 1:14 am

Byakko wrote:Here's another review that says the opposite. From Tom's Hardware:

http://www.tomshardware.com/2006/11/29/ ... index.html
OK, this article shows that the CPU matters. It also shows that it matters very little compared to the choice of GPU. Take a look at what happens in the results where you hold the CPU constant and change the graphics card; now hold the graphics card constant and change the CPU. Changing the GPU often made a dramatic difference, while changing the CPU rarely did. The lesson I take from this is that if you are doing graphically intensive gaming, prioritize spending on the GPU over the CPU. If money is no object, or you've hit a performance wall with the best possible GPU, then upgrade the CPU.

the_beast
Posts: 37
Joined: Mon Sep 04, 2006 12:00 pm

Post by the_beast » Mon Feb 26, 2007 11:53 am

jessekopelman wrote:I don't remember CPU ever being more important than GPU for games and I'm going back to the 1980s on this one.
I'm intrigued as to what GPU was important in the 80s. The first recognised graphics card (from IBM) was only released in 1981, and for nearly 10 years graphics cards in PCs did little more than put text on screen. VGA graphics didn't come along until about 1987, and 3D graphics weren't around until '95 or so.

I reckon the first time a GPU became more important than a CPU was when the Voodoo cards were released, and even more so when the SLI Voodoo2 came out.

jessekopelman
Posts: 1406
Joined: Tue Feb 13, 2007 7:28 pm
Location: USA

Post by jessekopelman » Tue Feb 27, 2007 3:24 am

the_beast wrote:
jessekopelman wrote:I don't remember CPU ever being more important than GPU for games and I'm going back to the 1980s on this one.
I'm intrigued as to what GPU was important in the 80s. The first recognised graphics card (from IBM) was only released in 1981, and for nearly 10 years graphics cards in PCs did little more than put text on screen. VGA graphics didn't come along until about 1987, and 3D graphics weren't around until '95 or so.

I reckon the first time a GPU became more important than a CPU was when the Voodoo cards were released, and even more so when the SLI Voodoo2 came out.
So wait, you're saying the ability to put something on the screen isn't the most important thing for computer games? What I remember from playing computer games in the 80s is that there were many games that required a specific screen resolution or number of colors and if your card didn't support those things it didn't matter what CPU you had. I don't remember a single game I tried in the 80s that wouldn't run on my 10 MHz 8088, but plenty didn't like my Hercules GPU. Even if 3D acceleration wasn't common before 1995, there were plenty of games with 3D graphics before then. Having more memory on the video card made a big difference for those games (and for things like merely running Windows).

the_beast
Posts: 37
Joined: Mon Sep 04, 2006 12:00 pm

Post by the_beast » Tue Feb 27, 2007 4:11 am

jessekopelman wrote:So wait, you're saying the ability to put something on the screen isn't the most important thing for computer games?
No - I'm saying that graphics cards were not something you could choose to get better performance from a game.
jessekopelman wrote:What I remember from playing computer games in the 80s is that there were many games that required a specific screen resolution or number of colors and if your card didn't support those things it didn't matter what CPU you had.
Exactly. A game was written to run on specific graphics hardware, which meant that performance was not the reason to choose a different card. If your card was supported by the game it would run; if not, it wouldn't. Games would look the same running on different cards and perform identically. In fact, many games were timed using the speed of the CPU as a limit: if you ran them on a faster computer, they would run faster. This was again because a game was written to run on a single CPU type with some kind of graphics adapter hooked up to it. This is not the same situation we are in today, where a slower processor with a good graphics card can run a game faster and with more detail than a faster processor with a poor one.
jessekopelman wrote:I don't remember a single game I tried in the 80s that wouldn't run on my 10 MHz 8088, but plenty didn't like my Hercules GPU.
But is that because you only tried games suitable for your 8088? I know of many games available for the Amiga that wouldn't run on my PC or my Acorn, but that had nothing to do with the graphics hardware present. Since I knew they wouldn't run cross-platform, I didn't bother trying. I'm guessing you didn't either.

BrianE
Posts: 667
Joined: Tue Mar 29, 2005 7:39 pm
Location: Vancouver, BC, Canada

Post by BrianE » Tue Feb 27, 2007 10:53 am

All I remember was that my first and second video cards didn't really matter as far as 3D went. Sure, I still cared what they were (a Trident for the 386 and an S3 ViRGE for the Pentium 166) because magazines did reviews about image quality and 2D framerate, but they were mostly chosen because they were: A) cheap, B) popular, and C) adequate. As far as being a factor in gameplay performance, I don't remember them being as crucial as the CPU and RAM. I remember the huge boost I got playing Doom when I dropped a 486DX33 into the 386 board and upgraded the RAM to a whopping 24MB. :P

Later on I got the Virge because "I guess I should have some 3D ability", but it was really an afterthought since 3D was an ugly, ugly thing back then. My first "real" performance video card was an Nvidia TNT2 some time later.

These days I notice I am paying far, far more attention to what video card I buy. I imagine most people are, especially since almost every game genre is incorporating 3D functionality now. I also don't remember individual GPU chips coming in so many different "flavours" all at the same time... Sure, there were a few different ice cream shops, but they all sold only 2 or 3 flavours. The market didn't seem so crowded and bewildering back when Quake and Descent were the benchmarks.


More on-topic ;) I have recently read that GPUs are in some ways more complex and perform more operations than CPUs do. Other than going along with the inevitable move to multi-core CPUs, I would just recommend a reasonable balance between CPU and video card, putting more emphasis (and more $$) on the video card depending on how high the 3D gaming demands are. I don't necessarily mean that you should spend the same amount on both, but rather a reasonable balance of their positions in their markets (i.e., don't pair a low-end CPU with an 8800 GTX).

jaganath
Posts: 5085
Joined: Tue Sep 20, 2005 6:55 am
Location: UK

Post by jaganath » Tue Feb 27, 2007 11:45 am

I have recently read that GPUs are in some ways more complex and perform more operations than CPUs do.
I don't know if transistor count is a good proxy for complexity, but the current high-end CPU (the C2E QX6700) has 582 million transistors versus the 8800 GTX's 681 million. Also look at F@H and the massive boost they get from using X1900-series GPUs. Certainly GPUs have become massively more powerful in the last few years.

jessekopelman
Posts: 1406
Joined: Tue Feb 13, 2007 7:28 pm
Location: USA

Post by jessekopelman » Tue Feb 27, 2007 11:01 pm

the_beast wrote: Exactly. A game was written to run on specific graphics hardware, which meant that performance was not the reason to choose a different card.
Actually, if something requires specific graphics hardware, that is pretty much the definition of graphics hardware being important. Yes, one EGA card wasn't going to make the game play better than another, but you still needed that EGA card whether you had an 8088 or a 386 CPU. The reason for needing a GPU has not changed over the years -- the ability to correctly render the game visuals. Now 3D acceleration has been added to the mix, but in the end it still comes down to a certain resolution and number of colors at a certain frame rate, just as it did in the 80s when you had to spring for EGA or even VGA to play certain games.

qviri
Posts: 2465
Joined: Tue May 24, 2005 8:22 pm
Location: Berlin

Post by qviri » Tue Feb 27, 2007 11:12 pm

jaganath wrote:Also look at F@H and the massive boost they get from using X1900-series GPUs.
I believe that has less to do with intrinsic complexity and more to do with graphics cards being optimised for a different kind of processing (floating-point operations, if I remember correctly).

I agree about the type of game BTW.

the_beast
Posts: 37
Joined: Mon Sep 04, 2006 12:00 pm

Post by the_beast » Wed Feb 28, 2007 12:02 am

jessekopelman wrote:
the_beast wrote: Exactly. A game was written to run on specific graphics hardware, which meant that performance was not the reason to choose a different card.
Actually, if something requires specific graphics hardware, that is pretty much the definition of graphics hardware being important. Yes, one EGA card wasn't going to make the game play better than another, but you still needed that EGA card whether you had an 8088 or a 386 CPU.
But compatibility is no more a graphics card issue than it is a CPU issue. You can't argue that possible GPU incompatibility makes a GPU more important than a CPU; if your CPU was not compatible, the game would not run either.

The question is about whether changing the GPU/CPU makes a difference in how games run, not whether changing the GPU/CPU changes whether a game runs at all.
jessekopelman wrote: The reason for needing a GPU has not changed over the years -- the ability to correctly render the game visuals. Now 3D acceleration has been added to the mix, but in the end it still comes down to a certain resolution and number of colors at a certain frame rate, just as it did in the 80s when you had to spring for EGA or even VGA to play certain games.
But this is the point. The reason for needing a GPU hasn't changed, just like the reason for needing a CPU hasn't changed. But the factors used to make the choice have changed. You used to choose a GPU based on compatibility with whatever programs you needed to run. Now you choose a GPU based on how fast it will run your 3D games (in this case, anyway - the original question is about framerates and graphics performance; obviously there are more factors involved than this).

To some extent the problems you mention regarding graphics cards still exist today. Try playing a DX9 game on DX7 hardware - similar problems to your Hercules GPU? Unless the game can be rendered in software, it will not run at all. And if it can run in software, it is the choice of CPU that determines how fast it goes. So again the choice of CPU is more important in this case.

Basically - if your system is compatible with your game, the 3D acceleration provided by the GPU is more important than the choice of CPU when the game is graphically intensive and your CPU is of reasonable speed. However, if the game is CPU-intensive, then the GPU is often of little consequence. The GPU has only really been a factor since they started being able to speed things up (the Voodoo in '95). Before that, framerates were purely CPU-bound.

jessekopelman
Posts: 1406
Joined: Tue Feb 13, 2007 7:28 pm
Location: USA

Post by jessekopelman » Wed Feb 28, 2007 3:51 pm

the_beast wrote: Basically - if your system is compatible with your game, the 3D acceleration provided by the GPU is more important than the choice of CPU when the game is graphically intensive and your CPU is of reasonable speed. However, if the game is CPU-intensive, then the GPU is often of little consequence. The GPU has only really been a factor since they started being able to speed things up (the Voodoo in '95). Before that, framerates were purely CPU-bound.
Whether you want to admit it or not, you are supporting my point. We agree about the GPU and 3D acceleration for modern game framerates. The issue I think you are intentionally disregarding is that framerate was not always what mattered. What used to be important was the number of colors at a given resolution, and that was pure GPU. I tried replaying a lot of my 80s games in the early nineties on a much faster computer. The faster CPU made no difference at all, but the nicer monitor and the ability to play them at EGA or VGA resolution did. You know full well that the original post in this thread was talking about graphically intensive games, so why bring up special cases? (CPU-intensive games have always been a small minority, anyway.)

the_beast
Posts: 37
Joined: Mon Sep 04, 2006 12:00 pm

Post by the_beast » Thu Mar 01, 2007 1:32 am

But you are missing my point. The GPU was no more important in the 80s than the CPU was, so saying that the CPU has never been more important than the GPU since the 80s is not true. The issue back then was not one of 'better card = better game', which is what the thread is asking about. It was a compatibility issue, which was just as true for CPU compatibility as it was for GPU compatibility. You could add RAM to this mix - far more often I found I couldn't run a game because I didn't have enough memory than because any other part of my computer wasn't up to scratch.

In the early nineties this was still true. In fact, I'm sure that Doom would play better on a C2D with a crappy old S3 or similar than it would on a Pentium with a 7800 GT in it, because games were rendered in software then, so a faster processor was all that mattered.

I am not trying to argue just for the sake of it. However, I don't think it's fair to say that CPUs are irrelevant as far as gaming is concerned, or that this has always been the case.

To further illustrate this, I draw you to this article:

http://tomshardware.co.uk/2006/11/29/ge ... index.html

This indicates that the 8800 is so fast in some games that it requires the fastest CPUs to increase framerates. In fact, they state:
tomshardware.co.uk wrote:you should conclude from the results the same thing we did: you need an Extreme CPU to run next generation graphics.
tomshardware.co.uk wrote: Author's Opinion
The long and short of this experiment is that you need a high speed platform to get the most out of the new DX10 hardware. If you were planning on getting a £400 graphics card to replace your 1-year-old graphics card, it would behoove you to rebuild your box. Of course, this means that the whole graphics upgrade will cost you a lot more than just the graphics card.

If you don't do the job properly, the net effect will be like hooking up a pair of garbage speakers to a Bose or Klipsch sound system. The effect would be the same... less than optimal performance, and an experience that is far from ideal given the money you spent.
So if you have a decent 7-series GPU but an average processor, an upgrade to an 8800 may be a waste of money, which kind of further proves my point that the GPU is not all-important. As with any upgrade, it really depends on what you want your computer to do, or which game you want to play. When DX10 games appear, this may shift further towards CPUs being important, or make the GPU even more critical.

Another factor is the monitor. With fixed-resolution TFTs now the norm, it is quite possible that this could limit your gaming experience. If you are limited to playing at 1024x768, a cheaper card may run at full quality at speeds similar to the more expensive cards.

My point is that there is no hard and fast rule. A new GPU does not necessarily mean better, but sometimes a new CPU doesn't mean better either. Sometimes you just need to buy a new computer, sometimes you don't. It almost always depends on what you want to achieve.

Miz
Posts: 8
Joined: Thu Feb 08, 2007 11:47 pm

Post by Miz » Fri Mar 02, 2007 12:31 am

Well, I know this is completely off topic, but has anyone actually calculated how much GPU memory is needed when running specific games at a specific resolution?

I am trying to figure out how much memory is enough to run at the widescreen WUXGA 1920x1200 setting. With the bigger-than-ever memory now on the Nvidia 8800s (768MB!!! for the 8800 GTX), I am wondering: is such an amount of memory just a waste of money? (Thus they scaled the 8800 GTS down to 320MB from 640MB!!! That's HALF of the original!!!!!)

Maybe someone could explain how GPU memory actually helps speed up the game and improve image quality, and when the bottleneck occurs? Or could someone give a detailed calculation of how this memory is used, for example, how many bytes are needed for a certain resolution or something like that...

I am really interested in the technical soundness while everyone is boasting about how big their memory is!

klankymen
Patron of SPCR
Posts: 1069
Joined: Thu Aug 04, 2005 3:31 pm
Location: Munich, Bavaria, Europe

Post by klankymen » Fri Mar 02, 2007 3:58 am

Actually, resolution is not the most important factor in memory used; it's mostly game settings and AA/AF.

I don't know if you can calculate the amount used, but there are certainly tests to be found online. Try Googling a bit; maybe I'll search for some myself later on.

scorp
Posts: 148
Joined: Fri Jun 03, 2005 3:15 am
Location: Romania

Post by scorp » Fri Mar 02, 2007 4:17 am

RivaTuner can show you the amount of memory that a game uses. The idea behind having more memory is that games keep using more and more of it; 256MB is already at the limit. While most current games run just fine with that much, some already show improvements (in some cases huge improvements) from having 512MB or more. Nobody really knows how DX10 games will behave, but if game developers have more memory to work with, it's likely they'll use it.

Tzupy
*Lifetime Patron*
Posts: 1561
Joined: Wed Jan 12, 2005 10:47 am
Location: Bucharest, Romania

Post by Tzupy » Fri Mar 02, 2007 4:21 am

Resolution and AA settings matter on one hand, larger textures on the other. When using a large screen you'll also want larger textures.
The 8800 GTS 320 is very good from 1280x1024 up to 1680x1050; beyond that it loses too much performance due to having half the memory.
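
For the raw buffer side of the question above, here's a back-of-the-envelope sketch. The assumptions are simplified (4 bytes per pixel for color and for depth/stencil, the back and depth buffers multiplied by the MSAA sample count, one resolved front buffer), and textures are left out entirely even though they usually eat the most memory.

Code:

# Rough framebuffer math under the simplified assumptions above.
def framebuffer_mb(width, height, msaa=1):
    pixels = width * height
    back = pixels * 4 * msaa   # multisampled back buffer, 4 B per sample
    depth = pixels * 4 * msaa  # multisampled depth/stencil buffer
    front = pixels * 4         # resolved front buffer
    return (back + depth + front) / 2**20

print(round(framebuffer_mb(1920, 1200, msaa=4), 1), "MB at 1920x1200, 4xAA")

That lands around 79 MB, so even at 1920x1200 with 4xAA the buffers themselves are under 100MB; the rest of a 640MB card goes to textures, geometry, and extra render targets, which is why large textures are what really punish the 320MB card.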

jessekopelman
Posts: 1406
Joined: Tue Feb 13, 2007 7:28 pm
Location: USA

Post by jessekopelman » Mon Mar 05, 2007 4:21 pm

the_beast wrote: So if you have a decent 7-series GPU but an average processor, an upgrade to an 8800 may be a waste of money, which kind of further proves my point that the GPU is not all-important. As with any upgrade, it really depends on what you want your computer to do, or which game you want to play. When DX10 games appear, this may shift further towards CPUs being important, or make the GPU even more critical.

Another factor is the monitor. With fixed-resolution TFTs now the norm, it is quite possible that this could limit your gaming experience. If you are limited to playing at 1024x768, a cheaper card may run at full quality at speeds similar to the more expensive cards.

My point is that there is no hard and fast rule. A new GPU does not necessarily mean better, but sometimes a new CPU doesn't mean better either. Sometimes you just need to buy a new computer, sometimes you don't. It almost always depends on what you want to achieve.
I agree with you here 100%. No one was saying that the GPU was all-important or that the CPU was irrelevant; the issue was which is more important. While I stand by my generality that the GPU is more important than the CPU for graphically intensive games, I can't deny that you are correct in saying there is no hard and fast rule. I would think it goes without saying that if you are limited by your monitor, discussions about the PC itself become rather pointless. As for whether the GPU had any relevance prior to 3D acceleration, I guess we will have to agree to disagree.

borbs
Posts: 1
Joined: Mon Mar 19, 2007 3:50 pm

Post by borbs » Mon Mar 19, 2007 4:08 pm

I don't really think the conclusion from that Tom's Hardware article can be that conclusive... they are running DX10 hardware under XP's DX9, with DX9 software, so how can they conclude you'll need a faster CPU for DX10 hardware? Maybe this applies now, with no DX10 software out there, but the little I know about DX10 is that DX10 cards will take over much of the CPU's work, and I guess we can only draw a conclusion when we try DX10 hardware along with DX10 software!
By that logic the CPU won't matter that much, as long as we don't compare two very distant CPU architectures.

=assassin=
Posts: 243
Joined: Thu Aug 25, 2005 2:46 am
Location: Blackpool, England, UK

Post by =assassin= » Wed Mar 21, 2007 7:15 am

jessekopelman wrote:I don't remember CPU ever being more important than GPU for games and I'm going back to the 1980s on this one. If you read reviews on sites like Anandtech, it becomes clear pretty quickly that the GPU is far more important than the CPU for producing high FPS in games. Having a dual-core CPU doesn't even help for most games. If you are building a game machine, buy an Athlon 64 3800+ and put the extra $100+ that a C2D would have cost toward a better graphics card. If you are a hardcore gamer, you'll probably still be feeling the need to upgrade the graphics card before swapping in an X2.
I did exactly that last year: got an A64 3800+ single-core, and it allowed me to get a 7900 GS on my small budget.
