Power consumption difference between AGP 6200 and MX440?

They make noise, too.

Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee

Mikael
Posts: 206
Joined: Mon Dec 06, 2004 3:12 am
Location: Gothenburg, Sweden

Power consumption difference between AGP 6200 and MX440?

Post by Mikael » Sun Jun 04, 2006 6:33 am

I have this Linux box that is really cool running. I use the following parts:

AthlonXP Thoroughbred B @ 1GHz (10x100, 1.25V)
Abit KX7-333
256MB PC3200 @ 100MHz (2.5V)
GeForce4 MX440 SE (passive)

The box has no case fans. The CPU has a small GlacialTech Igloo heatsink and the fan runs at something like 600-700RPM, keeping the CPU around body temperature even when under load. The PSU is an FSP 350W with a fairly quiet 120mm fan in the bottom.

Now, I've bought a 19" LCD and I'd like to have a graphics card with DVI output. It also certainly wouldn't hurt if it was a little faster... Being a Linux box, I'd really want to go with an Nvidia card too. So, my question is: How does the power consumption of the MX440 and 6200 compare? The 6200 would have to be able to run passive with minimal airflow around it.

What do you think?

autoboy
Posts: 1008
Joined: Fri Dec 10, 2004 8:10 pm
Location: San Jose, California

Post by autoboy » Sun Jun 04, 2006 7:05 am

A 6200 or 5200 would work fine for what you want. Look around and find the biggest passive heatsink card you can find.

Redzo
Posts: 464
Joined: Thu Jan 26, 2006 1:51 am
Location: Sweden, Stockholm

Post by Redzo » Tue Jun 06, 2006 10:42 am

I would go with a 6600 instead. It's much faster than the 6200, and there are quite a few passive ones on the market.

QuietOC
Posts: 1407
Joined: Tue Dec 13, 2005 1:08 pm
Location: Michigan

Post by QuietOC » Tue Jun 06, 2006 11:19 am

autoboy wrote:A 6200 or 5200 would work fine for what you want. Look around and find the biggest passive heatsink card you can find.
Order of heat/power usage from low to high:

6200(NV44/NV44A)<6200(NV43)<6600(NV43)<5200(NV34)

Slower-clocked cards will always run cooler. So get a 6600, not a 6600GT. The early 6200s were 6600s with half their pixel pipelines disabled. The newer 6200A version is a simpler, cooler-running chip.

From X-bit Labs:
[X-bit Labs power consumption charts]

If you consider ATI, the Radeon 9550/Mobility 9600 are the coolest AGP cards. Evidently, even a 9600XT is quite cool running.

The coolest nVidia DX9 AGP cards are the 5700LE, 6600LE, and 6200A.

Edited to remove unnecessary information.
Last edited by QuietOC on Thu Jun 08, 2006 6:12 am, edited 10 times in total.

Le_Gritche
Posts: 140
Joined: Wed Jan 18, 2006 4:57 am
Location: France, Lyon

Re: Power consumption difference between AGP 6200 and MX440?

Post by Le_Gritche » Tue Jun 06, 2006 2:12 pm

Mikael wrote:So, my question is: How does the power consumption of the MX440 and 6200 compare? The 6200 would have to be able to run passive with minimal airflow around it.
I don't know about the 6200, but I can compare the MX440 SE and ATI's 9250. The 9250 runs hotter, and it's only because it sustains a heavy underclock (core at 120 MHz, or 50% of stock speed) that I can have it running as well as the MX440SE at 200 MHz.
So in this respect the MX440SE is quite good, but I can't tell compared to the 6200.
The only thing I can say is that the MX440 has 31 million transistors at 150 nm, I think, while the 6200 has 77 million transistors at 110 nm.

QuietOC
Posts: 1407
Joined: Tue Dec 13, 2005 1:08 pm
Location: Michigan

Post by QuietOC » Tue Jun 06, 2006 3:28 pm

[X-bit Labs power consumption chart]

The GeForce FX 5700 DDR looks like a decent choice for a cool running nVidia AGP video card. Even overclocked it is still running pretty cool, and this is a full 128-bit memory bus card. It is just not as cool as the Radeon 9550/9600s. The slower FX5700LE should run even cooler.

fastturtle
Posts: 198
Joined: Thu May 19, 2005 12:48 pm
Location: Shi-Khan: Vulcan or MosEisley Tattonnie

Post by fastturtle » Wed Jun 07, 2006 2:56 pm

Mikael:

I'm sorry to say you won't be able to get much of a performance boost with an FX6200. The reason is that the card is crippled: it has no hardware acceleration. Check out the Gentoo-XGL wiki and look under the hardware acceleration issue and you'll see what I mean.

Example based on a GLX Gears test of the replacement FX5200 with 128 megs and the original TNT2-M64 with 32 megs:
The TNT2-M64 averaged 1150-1300 fps using the X11 nv driver and 1250-1450 fps with Nvidia's driver.

The FX5200 averaged 1150-1350 fps using the X11 nv driver and 1350-1500 fps with Nvidia's driver.
Note that these tests were run on the same board, right after each other. I simply swapped the video cards to see the differences.
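For anyone repeating this kind of comparison: glxgears prints a "frames in N seconds = X FPS" line every few seconds, and averaging a handful of those samples per card is more reliable than eyeballing one reading. A minimal sketch of the averaging step (the three sample lines below are made-up illustrative numbers, not my measurements; in practice you'd pipe real glxgears output into the awk command):

```shell
# Average the FPS figures from glxgears-style output lines.
# Hypothetical sample lines stand in for real output here so the
# sketch is self-contained; normally you would run something like
#   glxgears | awk '...'   and stop it after a few samples.
printf '%s\n' \
  '5012 frames in 5.0 seconds = 1002.400 FPS' \
  '5090 frames in 5.0 seconds = 1018.000 FPS' \
  '5151 frames in 5.0 seconds = 1030.200 FPS' |
awk '/FPS/ { sum += $(NF-1); n++ }        # FPS value is the next-to-last field
     END  { printf "avg %.1f FPS\n", sum/n }'
# -> avg 1016.9 FPS
```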

Obviously this raised one hell of a question, because I didn't see at least a 2x boost in fps over the TNT2 as I should have, and after some serious research I have an explanation why. It's called hardware acceleration. In other words, the TNT2 was doing all the work it could, removing load from the CPU, while the FX5200 was acting more like an IGP setup, only a little faster due to integrated memory.

The explanation I found is this: the FX series is classified as Mainstream, which means they will most likely be crippled by a lack of hardware acceleration, while the MX series are the Performance models. In other words, the sports version of the line, which means they have much better performance than you'd suspect.

What I'd suggest, if you're looking for a better card, is to consider a 7300GS, as that does have hardware acceleration and should give a big boost. Otherwise save your money and keep that MX, as there are few cards that provide a worthwhile boost until you get into the 6600 and better cards. At that point you get into heaters and PSU demands: the 6600 requires at least a 350W PSU, stepping up to the 7300GS requires at least 380W, and the 7600s all demand at least 400W.

So ask yourself whether it's worth spending the money on a new video card and PSU, or on adding system memory. Personally, I'd go for the increased memory, as Linux is far better able to use it, and as long as you stay below 800 megs you won't even have to rebuild the kernel. :D

Mikael
Posts: 206
Joined: Mon Dec 06, 2004 3:12 am
Location: Gothenburg, Sweden

Post by Mikael » Wed Jun 07, 2006 11:07 pm

I think some of you misunderstood me. I'm not changing to get more performance. I want to use the DVI input on my LCD, that's all. An increase in performance would be nice, but it's not like I'm going to have any real use for it in Linux anyway. At least not for games.

I certainly won't replace the 5-10W GF4MX with a 25-30W 6600. That's a huge difference and totally unmotivated for this computer.

fastturtle: What's this lack of hardware acceleration you're talking about? You mean that it is disabled in the Linux drivers? It doesn't really make any sense to sell a 3D accelerator that doesn't work in Linux, but does work in Windows... If you're talking about the actual hardware, all Nvidia cards have hardware 3D acceleration.
fastturtle wrote:The explanation I found is this, the FX series is classified as the Mainstream, which means they will most likely be crippled due to a lack of hardware accelertaion, while the MX Series are the Performance Models.
The MX series are the budget models of the older product ranges, not performance models.

smilingcrow
*Lifetime Patron*
Posts: 1809
Joined: Sat Apr 24, 2004 1:45 am
Location: At Home

Post by smilingcrow » Thu Jun 08, 2006 4:58 am

You can still buy the FX5200 new, although I don't know how it compares with the 6200 AGP. Note: not all FX5200s come with DVI, but you can get them with DVI, D-Sub & TV-out fairly cheaply.
The lowest-power VGA card that I've come across in the last year was an ATI Radeon 7000 64MB DVI. If you do stray into the ATI camp, keep away from the 9250 128MB; at idle it consumes 5W more than an FX5200 128MB and only 5W less than a 6600 256MB.

QuietOC
Posts: 1407
Joined: Tue Dec 13, 2005 1:08 pm
Location: Michigan

Post by QuietOC » Thu Jun 08, 2006 5:01 am

fastturtle wrote:Obviously this raised one hell of a question because I didn't see at least a 2x boost in fps as I should have from the TNT2 and after some serious research I have an explanation why. It's called hardware accelertaion. In otherwords the TNT2 was doing all the work it could, removing load from the CPU while the FX5200 was acting more like an IGP setup, only a little faster due to integrated memory.
What?!!!

You are just not using a good test. Look at the fps numbers!!! The difference between 1,000fps and 10,000 fps is academic. This "test" is basically showing how fast the driver is, and is probably otherwise CPU/platform limited.

The 6200 is NOT a GeForce FX series chip. It is very closely related to the 7300GS (or vice versa). The 7300GS is not available in AGP form, and even if it were, it would have to rely on a PCIe bridge, which is somewhat power hungry.

I think you are best getting a passive slower clocked 6200A AGP (NV44A). The "A" version is a newer 110nm chip. This is a native AGP chip, and seems to be about as fast as a Radeon 9600 at most things despite having a narrow 64-bit memory bus. $30 at Newegg.

Also, another reason to avoid the FX5200 (other than it running hot): DVI evidently is not well supported on FX5200 cards, because the 5200 chip lacks a built-in TMDS transmitter.

BTW: Memory type also indicates relative video card power use. From low to high:

DDR < DDR2 < DDR3
