GeForce FX = Dust Buster: first test



Herb W.
Posts: 113
Joined: Sun Sep 15, 2002 8:35 am
Location: Toronto

GeForce FX = Dust Buster: first test

Post by Herb W. » Mon Jan 27, 2003 1:45 am

Tom's has a review of the first GeForce FX reference card. While it shows a slight-to-moderate performance gain over a 9700 Pro, the noise it makes receives a full page of commentary, plus Tom's offers some attached sound files so you can hear the thing for yourself - to quote Tom's, "it sounds like a vacuum cleaner". Listening to the audio (there is a bootup computer beep in the recording so you can reference the volume level against your own machine), I don't see how anyone with even partial hearing could stand to live with it - it just blows away the comparison track from a noisy 9700 Pro. Nvidia claims they are fixing this, but as Tom points out, it's hard to see how they will do it given the card's cooling needs - it has an 80-or-so-watt draw. Water cooling, anyone?

tyrian
Friend of SPCR
Posts: 8
Joined: Wed Nov 20, 2002 3:28 am
Location: santa barbara, ca

potential downfall of nvidia?

Post by tyrian » Mon Jan 27, 2003 3:12 am

While the subject header may vastly overestimate the issue, as someone on Slashdot was saying a few weeks ago on the same topic, this is the same mistake 3dfx made years ago -- making a card that was `uber' relative to the market (the Voodoo 5/6000), but that in the end had massive power and space requirements and never really got anywhere.

I doubt such a fate will truly befall Nvidia, but they definitely should learn from the mistakes of their past competitors (who, no less, went on to be bought out by Nvidia). An 80-watt heat dissipation is simply unacceptable for a video card. More and more operations are being offloaded onto GPUs, and their complexity is outstripping that of other computing markets; but if they surpass CPUs in heat and cost (which, from the sounds of it, they have), they'll need to make some inroads to keep us hooked on their latest and greatest.

My relatively paltry Geforce3 came with a stock 60mm fan, and that alone was enough to drive me nuts. People won't stand for massive decibel levels from non-essential components -- not even hardcore gamers like myself.

MikeC
Site Admin
Posts: 12285
Joined: Sun Aug 11, 2002 3:26 pm
Location: Vancouver, BC, Canada

Post by MikeC » Mon Jan 27, 2003 8:12 am

That's absolutely absurd. Gamers aren't going to buy it. They're not stupid. See the link to the "Intel's chips are down" article in the front-page news.

TheMuffinMan
Posts: 146
Joined: Sat Jan 04, 2003 3:45 am
Location: New York, NY

Post by TheMuffinMan » Mon Jan 27, 2003 8:38 am

That's friggin insane! The Radeon 9700 Pro fan is clearly audible through a case, even a relatively loud one. If the FX fan is that loud...I can't possibly begin to imagine....

Also, those temps are just plain dangerous. This card would be completely unstable in even a low- to medium-airflow case. I said it once, and I'll say it again: the side-by-side cooling in/out vents are completely ineffective. Why suck air in right next to where you're spitting out 70°C air?

Gekkani
Posts: 116
Joined: Sun Aug 11, 2002 3:26 pm

Post by Gekkani » Mon Jan 27, 2003 9:35 am


Here are some comments from different websites:

- The card becomes quite hot during operation. While testing the card in an open environment (i.e., outside of a PC case), the heatsink on the back of the card reached over 68° Celsius

- The FX consumes up to 75 watts! The Radeon 9700 Pro draws 54 watts.

- nVidia is recommending no less than a 300-watt power supply, and one of nVidia's add-in board partners is actually recommending a 350-watt power supply to ensure that the GeForceFX remains fed and happy.

- The other painfully noticeable factor here is fan noise. The GeForceFX has already been nicknamed in some site forums as The DustBuster.

- A further problem is the noise level. The fan produces an incredible racket on par with a vacuum cleaner - there's simply no other way to describe it.

- You can hear the card even if you're in another room of the house.

- The first time I realized what the fan sounded like, I did a very Keanu-like "Whoa."

- To further extend the comparison, the GeForce FX 5800 weighed in at 77 dBA

***

We did some sound pressure level measurements of the Flow FX system while we were testing. We put a Radio Shack decibel meter on a tripod and positioned the meter's microphone in the same position: approximately where your left ear would be when sitting in front of your monitor.

We used C-weighting for this measurement. Like A-weighting, it covers the range of human hearing (20 Hz – 20 kHz), but C-weighting has an attenuated high end: the C-curve is "flat" but with limited bandwidth, with -3 dB corners at 31.5 Hz and 8 kHz, respectively.

Here's what we found:

Radeon 9700 Pro's baseline sound level was about 54dB SPL (sound pressure level). When we fired up a 3D app, there was no change in the sound level. It remained at 54dB SPL.

The GeForceFX is a different story, however. Its baseline sound level was also around 54dB SPL, but upon starting up a 3D app, the Flow FX fan kicked into high gear, and the sound level rose to around 58dB SPL. Recall that decibels are on a logarithmic scale, so this 4dB increase represents more than a twofold increase in the sound level of the overall fan noise output of the test machine we used.


***

From Tomshardware, Anandtech, HardOCP and Extremetech.
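
A side note on the decibel math in the quotes above: SPL is logarithmic, so small-looking dB differences are large power ratios. A back-of-the-envelope sketch in Python (my own illustration, not from any of the reviews):

Code: Select all

# A 10 dB difference = 10x the acoustic power; ratio = 10^(dB/10).
def power_ratio(db_diff):
    return 10 ** (db_diff / 10.0)

print(power_ratio(4))        # ~2.5 -- the FX's 4 dB jump under 3D load
print(power_ratio(77 - 54))  # ~200 -- 77 dBA vs. a 54 dB baseline (different sites/weightings, so very rough)

So "more than a twofold increase" for a 4 dB rise checks out - it's roughly 2.5x in acoustic power.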

Here is a link so you can hear that DustBuster:

http://images.anandtech.com/reviews/vid ... nvidia.zip

ATI, for comparison:

http://images.anandtech.com/reviews/vid ... ew/ati.zip

Whoa... the GeForce FX noise level is simply amazing... totally insane, IMHO.

GamingGod
Posts: 2057
Joined: Fri Oct 04, 2002 9:52 pm
Location: United States, Mobile, AL

Post by GamingGod » Mon Jan 27, 2003 10:03 am

Man, that sucks; I really like Nvidia. They need to get their act together, 'cause I've always disliked ATI because their drivers have always sucked. But if Nvidia doesn't come out with a $200-$300 card that will run Doom 3 with everything turned all the way up at playable framerates, while being at least on par with other graphics cards' noise levels, then I'm gonna have to get an ATI card.

Gandalf
Posts: 331
Joined: Tue Dec 24, 2002 9:04 am
Location: Belgium

Post by Gandalf » Mon Jan 27, 2003 11:00 am

God, the power consumption of that thing is absolutely disgusting! It consumes (and dissipates as heat) more than a friggin CPU! It's ludicrous.
I think I'll just order a Radeon, or wait for less hot stuff :P.

sgtpokey
*Lifetime Patron*
Posts: 301
Joined: Sat Dec 14, 2002 11:29 pm
Location: Dublin, CA / Liverpool UK

this editorial on overclockers sums it up nicely:

Post by sgtpokey » Mon Jan 27, 2003 11:18 am

This little editorial on overclockers.com sums up the initial reviews and reactions nicely:

http://www.overclockers.com/tips00268/

sgtpokey
*Lifetime Patron*
Posts: 301
Joined: Sat Dec 14, 2002 11:29 pm
Location: Dublin, CA / Liverpool UK

Post by sgtpokey » Mon Jan 27, 2003 11:24 am

Or to post some frank quotes from the overclockers article:

"...For those dubious advantages, you end up with a space-chewing, heat-pumping, noise-making, GPI-throttling video card. "

"...It's very hard to believe nVidia designed a card like this because they thought it was a good idea. It's so environmentally unfriendly at a time when even the performance loons notice the noise. Would you design that air boondoggle that still leaves the heatsinks close to 70°C unless you absolutely had to? "

Dru
Patron of SPCR
Posts: 154
Joined: Sun Aug 11, 2002 3:26 pm
Location: Gilbert, Arizona, USA

Post by Dru » Mon Jan 27, 2003 11:29 am

Sad, sad, sad.

I think nVidia dropped the ball on this product. I wonder if they were racing to get it out fast to topple ATI? To me, it looks like ATI designed a much better product. I bet ATI is happy about how the FX performs; they can keep their 9700 Pro priced up there for quite some time.

GamingGod
Posts: 2057
Joined: Fri Oct 04, 2002 9:52 pm
Location: United States, Mobile, AL

Post by GamingGod » Mon Jan 27, 2003 12:30 pm

Well, it's all good in the long haul, 'cause ATI pretty much has to drop prices on all their cards; that is what competition does, even when there isn't that much competition. So hopefully the 9700 Pros should drop a good $50 soon. And of course then ATI will release the R350, driving prices down even more :D Happy days.

Beyonder
Posts: 757
Joined: Wed Sep 11, 2002 11:56 pm
Location: EARTH.

Post by Beyonder » Mon Jan 27, 2003 3:19 pm

Is anyone else confused about how this chip could draw THAT much power, especially when it's built on a 0.13-micron process? It simply doesn't make sense... that thing is sucking more juice than just about any CPU. Compared to a Celeron 1.1A, it's drawing perhaps three times as much....

It doesn't make sense. I thought power draw was supposed to decrease as you shrank the die....

Herb W.
Posts: 113
Joined: Sun Sep 15, 2002 8:35 am
Location: Toronto

Post by Herb W. » Mon Jan 27, 2003 4:31 pm

Part of the reason for its high power requirement must lie in the fact that it has something like 40% more transistors than a P4...plus you need to factor in the speed at which they are pushing it (CPUs always run hotter when overclocked, for example).
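
As a point of reference, the textbook first-order relation for CMOS dynamic power supports this (general CMOS scaling, nothing NV30-specific):

Code: Select all

P_{dynamic} \approx \alpha \, C \, V^{2} \, f

where \alpha is the switching activity, C the total switched capacitance (roughly proportional to transistor count), V the core voltage, and f the clock. Transistor count and clock scale power linearly, and any voltage bump needed to reach 500 MHz hurts quadratically - which would also explain why the 0.13-micron shrink alone doesn't save the day.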

jhh
Posts: 218
Joined: Sun Jan 19, 2003 4:47 am

Post by jhh » Mon Jan 27, 2003 6:32 pm

I think the problem is that in a modern gaming machine the video card actually does a hell of a lot more work than a CPU, but while the CPU has a big open, airy area to dedicate to large heatsinks, the GPU has to sit on a poxy little card. If a P4 had to fit on an AGP card I'd bet that'd sound pretty hideous too!!

IMO the long-run answer is cases/mobos dedicating more room to the AGP slot - as much room as the CPU gets would be ideal, but to start with, maybe someone should just move it away from the first PCI slot?

I wonder how this card would fare with a Zalman fanless heatsink? Obviously 70W is far too much without a fan at all, but maybe with an 80mm fan on either side?

Gxcad
Patron of SPCR
Posts: 429
Joined: Sun Aug 11, 2002 3:26 pm
Location: San Francisco, CA

Post by Gxcad » Mon Jan 27, 2003 10:16 pm

I think Nvidia screwed up when they made this card with only a 128-bit memory bus, limiting memory bandwidth to 16 GB/sec, when the ATI 9700 Pro can already do 19.8 GB/sec with only 310 MHz DDR BGA chips - much less heat and more bandwidth. I think Nvidia was desperate for the FX to at least topple the 9700 Pro, and also to create hype-able specs like a 500 MHz core and 1 GHz RAM, since they already knew for quite some time that the GeForce FX did not blow away the competition. They probably had to increase the voltage on the core and memory quite a bit to achieve good yields at their claimed specs (which were also necessary to perform well enough to even compete with ATI's 9700 Pro), thus creating the FX's insane cooling requirements.

I'm guessing clock for clock the R300 and the NV30 are about on par, so it all comes down to memory bandwidth and core clock for which is faster, and Nvidia had to make up for its slight deficit in memory bandwidth versus the 9700 Pro with a much faster core; after all, they aren't going to get much more than 1 GHz from their RAM.

It is pretty pathetic on Nvidia's part that even with a more advanced .13-micron core and a full-year product cycle (just like ATI had with the 9700 Pro, and twice Nvidia's usual 6-month cycle), they can't top the 9700 Pro by more than a bit. Historically, memory bandwidth has been extremely important and roughly proportional to the performance of a video card, and the faster core frequency isn't going to make up for the doubled bandwidth of the R300's 256-bit bus (just look at the score increase when a 9500 is hacked into a 9700, effectively doubling the memory bandwidth).

Soon (mid-March) ATI will release the R350 with a ~400 MHz core and ~800 MHz memory (Radeon 9800? 9900?), which gives the FX only about a month with the performance crown, and I fail to see how Nvidia is going to hang onto that crown in March with only a small performance lead as it is - over a 6-month-old product, no less, despite superior .13-micron technology. Nvidia has nothing ATI didn't have 6 months ago apart from a few percent more speed. Competition is fierce :).
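
To make those bandwidth figures concrete: peak memory bandwidth falls straight out of bus width times effective transfer rate. A quick sketch in Python (using the clocks quoted in the reviews):

Code: Select all

# Peak memory bandwidth = (bus width in bytes) * (effective DDR transfer rate)
def bandwidth_gb(bus_bits, ddr_clock_mhz):
    bytes_per_transfer = bus_bits / 8.0
    transfers_per_sec = ddr_clock_mhz * 2 * 1e6   # DDR: two transfers per clock
    return bytes_per_transfer * transfers_per_sec / 1e9

print(bandwidth_gb(128, 500))  # GeForce FX:      128-bit @ 500 MHz DDR = 16.0 GB/s
print(bandwidth_gb(256, 310))  # Radeon 9700 Pro: 256-bit @ 310 MHz DDR = 19.8 GB/s

So ATI gets more bandwidth at barely half the memory clock, purely from the wider bus.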

-Ken
Last edited by Gxcad on Tue Jan 28, 2003 3:53 pm, edited 1 time in total.

juniorrank1
Posts: 4
Joined: Sat Jan 25, 2003 2:30 pm
Location: Madison, WI

Post by juniorrank1 » Mon Jan 27, 2003 11:21 pm

That's crazy.

nVidia decided early in the FX's development to manufacture the GPU as a 0.13-micron part. This would allow higher clock speeds and, one would think, less heat. But ATI went with a 0.15-micron chip, got lucky, and was able to release their board months earlier; they now have a product that has been the best available for months already and still holds advantages over nVidia's new offering, unless nVidia can pull off some amazing tweaks. Hmmmm.

wussboy
Posts: 635
Joined: Wed Oct 23, 2002 12:34 pm
Location: Southampton, UK

Post by wussboy » Tue Jan 28, 2003 8:53 am

I've been very curious to see how the hype would pan out, and I'm not an nVidia lover or hater; I'll buy the best card for the best price regardless of who made it. But it does seem that the FX is a last-gasp kind of card - as if, had they thought about it some more or had more time, they would have done some things differently. Curious. But I've heard MP3s of that heatsink fan, and there's no way in hell that thing would ever go inside my computer. I hope other companies can come up with better solutions.

Dru
Patron of SPCR
Posts: 154
Joined: Sun Aug 11, 2002 3:26 pm
Location: Gilbert, Arizona, USA

Post by Dru » Tue Jan 28, 2003 9:45 am

I have a friend who hates nVidia and is hoping Asus will do a better job at cooling than the stock nVidia FX. There is a box shot at Asus but no card shot yet (that I know of).

For me, I can't really justify spending more than $200 or $250 on a video card ... unless I were rich (which I'm not) :cry: hehe... If I had the money today, I'd probably get that Sapphire 9700 Pro with the Zalman 80A.

Rusty075
SPCR Reviewer
Posts: 4000
Joined: Sun Aug 11, 2002 3:26 pm
Location: Phoenix, AZ

Post by Rusty075 » Tue Jan 28, 2003 10:43 am

Considering that every game available today (even UT2003) plays just fine on a GeForce2 Ti, I think I'll wait until there's a need to upgrade before I do. Clearly the FX is aimed at people who measure their d**k size in frames per second :lol: :lol: :lol:

chiahaochang
Patron of SPCR
Posts: 99
Joined: Sun Aug 11, 2002 3:26 pm
Location: Columbus, OH USA

Post by chiahaochang » Tue Jan 28, 2003 12:53 pm

Gxcad wrote: It is pretty pathetic on Nvidia's part that even with a more advanced .13-micron core and a full-year product cycle (just like ATI had with the 9700 Pro, and twice Nvidia's usual 6-month cycle), they can't top the 9700 Pro by more than a bit.
IIRC, nVidia released the GFFX "late" because TSMC was having problems with their .13u process. The GFFX was supposed to hit the streets before the 9700 Pro, but its release date was pushed back twice. It would've been the difference between nVidia being a leader vs. playing catch-up.
Gxcad also wrote: Historically, memory bandwidth has been extremely important and roughly proportional to the performance of a video card, and the faster frequency isn't going to make up for the doubled bandwidth of the R300's 256-bit bus (just look at the score increase when a 9500 is hacked into a 9700, effectively doubling the memory bandwidth). Soon (mid-March) ATI will release the R350 with a ~400 MHz core and ~800 MHz memory (Radeon 9800? 9900?), which gives the FX only about a month with the performance crown, and I fail to see how Nvidia is going to hang onto the crown in March with only a small performance lead as it is, over a 6-month-old product no less, despite superior .13-micron technology. Nvidia has nothing ATI didn't have 6 months ago apart from a few percent more speed. Competition is fierce :).
I'm sure nVidia is also working on a followup to the GFFX and the R350; that will probably put nVidia back on their 6-month product cycle. For nVidia's sake, hopefully it'll have a 256-bit memory bus, or somehow solve their memory bandwidth "problems". I'm about to build a new PC, and the Radeon 9700 Pro is probably going into the machine, as it seems to have a performance edge at high res with FSAA and AF on. Plus, it doesn't have a leaf blower attached to the card. :roll:

GamingGod
Posts: 2057
Joined: Fri Oct 04, 2002 9:52 pm
Location: United States, Mobile, AL

Post by GamingGod » Tue Jan 28, 2003 1:16 pm

The only reason I would even think of putting a top-of-the-line graphics card in my computer is that I want to be able to play games at 1280 or 1600 at playable rates, and hopefully with lots of antialiasing too. But $300 US is about the most I'll spend. Really, when you think about it, it's not that bad to buy one if you're really gonna use it: if you buy a $150 card now, then wait and buy another $150 card in a year or two, you end up with a year of having a low-budget card, then you upgrade to a card that used to be top of the line but is now considered budget. So basically you might as well have the top-of-the-line card for the whole 2 years, and hopefully you won't have to upgrade so soon. Sorry if that was kinda confusing?

crisspy
Friend of SPCR
Posts: 228
Joined: Sun Sep 29, 2002 9:05 pm
Location: Powell River, BC, Canada

Post by crisspy » Tue Jan 28, 2003 1:19 pm

Sounds to me like it's time for a major rearrangement on those cards. At present the GPU is always put on the 'down' side of the card, along with all the other components. That should be changed to something like this:

Code: Select all

               _____________________
              |                     |
              |      80mm Fan       |
              |_____________________|
   | | | | | | | | | | | | | | | | | | | | | |
   | | | | | | | | | | | | | | | | | | | | | | <- heat sink fins
   | | | | | | | | | | | | | | | | | | | | | |
   ===|===========|===========|===========|=== <- isothermal heat pipe
      | <stands>  |   =GPU=   | <stands>  |    <-GPU on 'up' side
|=====|===========|===¥¥¥¥¥===|===========|=== <- AGP card
| M I S C. P A R T S              R A M
|
| <-bracket
Another way would be to move most of the components to the 'up' side, and put the GPU and heat plane on the 'down' side, with the fins running towards the back of the case. The heatsink would be boxed over, and a fan would draw air from inside the case, to be blown through the length of the heatsink, then exhausted out the back at the slot (or an adjacent slot). With an isothermal heat spreader you get full performance from the entire length of a larger heatsink, even when the air is cooler at one end than the other. That scheme could work on the 'up' side of the card too, venting into the case right at the back, and right next to a case exhaust fan.

Something is going to have to give one way or another. 80 watts is too much heat to dump willy-nilly into a computer case.

Gxcad
Patron of SPCR
Posts: 429
Joined: Sun Aug 11, 2002 3:26 pm
Location: San Francisco, CA

Post by Gxcad » Tue Jan 28, 2003 4:13 pm

Chia, perhaps I was a little too hard on Nvidia when I said it is "pathetic": they still managed to release a decent CORE, even if the card itself runs too hot, and in the end it does perform better than the 9700 Pro, which was and still is an excellent product from ATI. I still think Nvidia will be in a pinch for a while yet, though, with ATI possibly taking the crown back with the R350. The thing Nvidia has going for them is that they have a new core to work with and bring out the best of, while ATI might start running out of juice to squeeze from the R3xx chips and need a new core design pretty soon. Who knows, just a thought.

BTW, what is IIRC? I think I used to know, but I forgot, as I don't see it that often ;).

Gaminggod, there is an error in your calculation. $150 now will be $75 in one year, but $300 will become $150 in one year. Buy a $150 card now, and in a year get another $150 card and SELL your old card, and you've had a current $150 card the whole time for $225 total. Buy a $300 card now and keep it, and in a year's time you have a $150-class card for $300 ($75 more than the upgrade route), not to mention none of the newer technologies that come in the newer yet "budget" chips.

Think of it this way: a GeForce 2 Ultra vs. a GeForce4 MX 440. They perform similarly, with a slight edge to the 440, but the 440 runs a helluva lot cooler (albeit with no newer technologies in this particular example, due to Nvidia using GeForce 2 cores for the GeForce4 MX series). Another example is the GeForce Ti 4600 vs. the Radeon 9500 Pro: the 9500 Pro has DX9 while the Ti 4600 does not, but the Ti 4600 was available a long time before the 9500 Pro.

Something I haven't even touched on yet is the CPU being the bottleneck for high-end cards. This means you will have to invest some dough in a high-end CPU as well to bring out the best from your high-end graphics card, and your high-end graphics card probably won't even be necessary for current games (although I admit you can turn on all the eye candy and still play). I just think, all things considered, the budget card year after year is the clear choice for anyone without an astronomical income or the need for the best/bragging rights.

To sum it up: buying a high-end card early will cost you more than upgrading to a budget card year after year, but you get the high-end performance for that first year. A recent budget card will also likely run cooler and have more features than a similarly performing card from a generation back.
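
For anyone who wants to check the arithmetic, here are the two strategies side by side (a sketch under my own assumptions above: a card halves in value each year, and you sell the old one when you upgrade):

Code: Select all

RESALE_FACTOR = 0.5  # assumption from the post: a card loses half its value per year

# Strategy A: buy a $150 card now, sell it for $75 in a year, buy another $150 card.
cost_a = 150 + (150 - 150 * RESALE_FACTOR)  # $225 over two years, always a current budget card
# Strategy B: buy a $300 card now and keep it for both years.
cost_b = 300                                # $300, i.e. $75 more, but high-end in year one

print(cost_a, cost_b)  # 225.0 300

Whether that $75 premium is worth it depends on how much you value the first year of high-end performance.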

That concludes my theory on upgrading video cards :).

-Ken

P.S. The most I would spend on a gfx card at the moment is $100 :)

crisspy
Friend of SPCR
Posts: 228
Joined: Sun Sep 29, 2002 9:05 pm
Location: Powell River, BC, Canada

Post by crisspy » Tue Jan 28, 2003 5:44 pm

One of my main reasons for choosing GPU X over GPU Y is driver/software maturity. If you can get an adequately performing 'budget' card that is yesteryear's hot stuff and still performs relatively well, then you benefit from years of troubleshooting and driver improvements. That goes for the card's own driver software; OS-level stuff like DX, OpenGL, and graphics system utilities; as well as many apps, like games, that eventually get optimized for the card. I know both ATI and NVidia have been getting more stable through the years, but yesterday's tech often packs a punch with respect to hassle-free operation. And that means more time for me to enjoy.

I stay chronically behind the bleeding edge most of the time, but it has its upsides. And at this point in the history of computers, last year's stuff is still pretty damn amazing.

GamingGod
Posts: 2057
Joined: Fri Oct 04, 2002 9:52 pm
Location: United States, Mobile, AL

Post by GamingGod » Tue Jan 28, 2003 5:46 pm

Well, we disagree, because I would rather have a 9700 Pro now for $300 or less and have it last for 2 years, rather than have a GeForce Ti 4200 for $150 and have to replace it in a year with another card that's gonna cost $150-$200.

DaShiv
Posts: 76
Joined: Tue Jan 14, 2003 12:32 pm
Location: Berkeley, CA, USA

Post by DaShiv » Tue Jan 28, 2003 9:21 pm

In my experience with games, recently it's been the older cards with support/compatibility problems, and never the newest ones--mostly because of so many crazy gamers wanting to be on the massively hyped bleeding edge. For example, some of the recent batch of games have Voodoo 5 compatibility problems. Granted that's not exactly a fair example because it's probably older than what you had in mind and it's from a defunct company, but I've yet to encounter any issues with the Radeon 9700 Pro and gaming, nor do I expect to anytime soon.

I do agree that hardware has gotten much, much more powerful than is "necessary" to run games, but some overly-jaded gamers have borderline-unrealistic expectations about what are "good" graphics and "playable" framerates. Many online first-person shooter players demand that framerates never drop below 60 frames/sec even in the busiest situations, which on older hardware would necessitate running current games in the dreaded "Ugly Mode."

I was running a Geforce2 32MB GTS before I upgraded recently, and before I upgraded I had to bump down Age of Mythology to the low settings to eliminate slowdowns in the sometimes massive battles. The horribly polygonal figures at that detail quality when the game zoomed in for cut-scenes made me laugh out loud. There was a world of difference with the 9700 running everything at highest quality with a bit of AA/AF, especially in games like Unreal Tournament 2003 ("Wow! There's grass!"). It's a compelling reason to upgrade for some people and not for others; YMMV.

In terms of the price/performance ratio the $150-200 segment of video cards is very compelling to stay with (although by "budget" you're probably referring to the sub-$150 level), but IMO how often you upgrade depends entirely on which games you play (i.e. how much of the latest and greatest) and what quality/performance tolerance thresholds you have. Drivers and support don't seem like issues to me, especially with things like nVidia's Detonator drivers working across the board for all GeForce cards, etc. With ATi looking healthier than ever and nVidia's rock-solid driver support, I honestly believe that drivers have become a non-issue. In any case, I certainly would be very surprised if I had to waste any time "troubleshooting" the drivers on my 9700 even though it's currently the latest and greatest--a status it holds until next month's release of the Leafblower FX, anyway.

In fact, the drivers issue might be the opposite way around. Impossible Creatures requires DirectX 9, which my card already supports. (And ATI's new drivers, released on the day DX9 was released, ran Impossible Creatures and everything else without a hitch, as expected.) Granted, I doubt that the game takes much (if any) advantage of it in terms of hardware, but when (if?) games in the future are more DirectX 9 dependent, I won't have to upgrade specifically for those features. Even though the bleeding edge video cards are usually not very good buys for their price (as with any other high-end computer component), sometimes there are benefits for being ahead of the curve in the long run, as GamingGod points out. If you buy a $150 DirectX 8.1 card now, I think it's very possible that you'll need to pick up a $150 DirectX 9 card to run the latest games with full DX9 features a year or so from now. I think which card is the "best" to buy really depends on one's needs and expectations. Some of us happen to have very high ones. :D

DaShiv
Posts: 76
Joined: Tue Jan 14, 2003 12:32 pm
Location: Berkeley, CA, USA

Post by DaShiv » Tue Jan 28, 2003 9:49 pm

I forgot to mention: very compelling arguments about video card upgrading, Gxcad, and I think that in terms of price/performance the budget cards will of course offer the best bang for the buck. In terms of upgrading, however, some of us like doing it all at once, since incremental upgrades over time run into the problem of non-upgraded components becoming system bottlenecks (like you pointed out with CPUs). That's not a problem, though, if you build your system from scratch with a CPU to match the video card, and then don't worry about upgrades for two years. :)

I think a high-end video card definitely has better returns in a system with other high-end parts to match, but it can be beneficial even in many other circumstances. For example, some games like Quake III are notoriously CPU-dependent, but Tom's VGA charts show that games like Aquanox and Unreal Tournament give high returns on better video cards even with slower processors (and higher still with faster processors, of course). And the "eye candy" of antialiasing and anisotropic filtering is almost entirely video-card dependent (please correct me if I'm wrong). So while simply slapping a new video card into your system doesn't get the "most" out of the card if the whole system isn't up to it, I think it's appreciable for those with the disposable income.

For price-to-performance, budget cards are the way to go, but there's no way to go with quality without spending money. You get what you pay for. :)

chiahaochang
Patron of SPCR
Posts: 99
Joined: Sun Aug 11, 2002 3:26 pm
Location: Columbus, OH USA

Post by chiahaochang » Wed Jan 29, 2003 7:01 am

IIRC = If I Recall Correctly.

I think ATI took the safe route by going with the more mature .15u process. It was a risk on nVidia's part to try to go for a .13u process, but you have to take risks to lead the market. I don't think that nVidia's gamble on the .13u process was an unreasonable risk for them to take.

I think their biggest failing on GFFX is the noisy and exotic cooling solution it requires. I don't think the loss of 1 PCI slot is that big of a deal, but the noise of the GFFX will probably turn off all but the most hardcore of gamers (who value performance over anything else). I could be wrong of course. But, I think the average gamer/PC user would find the noise levels of the stock fan on the 9700Pro acceptable, especially considering they probably have a noisy CPU HSF, a noisy PSU, and a noisy HDD. I'm guessing the majority of the people in this forum wouldn't find the noise level of the 9700Pro's fan acceptable, but at least the 9700pro can be cooled quietly.

I'm somewhat behind on the video card curve. I do follow an upgrade plan similar to what GamingGod laid out. I've got an ATI Radeon 7500 right now, which I bought to replace a GeForce (original) that died when its fan seized up. When playing games on the 7500 at 1600x1200 (the native res of my 20.1" LCD), I have to turn the detail options all the way down. I swear I can hear the Radeon 7500 begging for "mercy".

jhh
Posts: 218
Joined: Sun Jan 19, 2003 4:47 am

Post by jhh » Wed Jan 29, 2003 9:36 am

The idea that a Ti 4200 would have to be replaced in a year's time is a bit much.

Even today's budget cards (by budget I mean cards like 4200s) are still waiting for games to catch up and support features like advanced pixel shaders.

All the fancy stuff you can do with DirectX 8 is only just starting to be seen in games NOW, and that's been around for a year! What's the point in buying a top-end video card when, by the time games actually use it, it'll be half the price?

GamingGod
Posts: 2057
Joined: Fri Oct 04, 2002 9:52 pm
Location: United States, Mobile, AL

Post by GamingGod » Wed Jan 29, 2003 10:12 am

Because I want to be able to play games at at least 1280 or 1600 at 60+ frames a second, with antialiasing and everything else turned all the way up. You can't do that with a 4200; hell, you can't turn antialiasing on at all with a 4200 and expect playable rates.
