G35 let down

All about them.

Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee

Post Reply
Pigbristle
Posts: 12
Joined: Wed Dec 20, 2006 2:56 pm

G35 let down

Post by Pigbristle » Sat Oct 27, 2007 5:45 am

I like my integrated graphics on my motherboards, don't ask me why, I just do!
So with great anticipation, I'm looking forward to the new G35 chipset models.
(By the way, does anybody know when they're due out?)

But little things niggle me, like: why the hell do they still use D-Sub instead of DVI? DVI has been standard on graphics cards for as long as I can remember.

And for god's sake, don't tell me we're still getting a parallel port as well!
Is this really moving forward?

tehfire
Posts: 530
Joined: Mon Jan 01, 2007 9:57 am
Location: US

Post by tehfire » Sat Oct 27, 2007 6:21 am

They use the VGA connection because real DVI support requires an SDVO chip to work - an extra expense that motherboard makers, for some reason, don't want to cover. I have a G33 system and I bought a Silicon Image ADD2-N "X8760" ADD2 card on eBay, and this gives me full DVI support. They're really cheap on eBay, about $10 including shipping, and it works perfectly - just stick it in your x16 slot and plug in the monitor - mine was automatically detected. It adds about 1-2W to total power consumption according to one of the threads I read here.

As to release - all I know is it's supposed to be "very soon".

Pigbristle
Posts: 12
Joined: Wed Dec 20, 2006 2:56 pm

Post by Pigbristle » Sat Oct 27, 2007 8:46 am

tehfire wrote:They use the VGA connection because real DVI support requires an SDVO chip to work - an extra expense that motherboard makers, for some reason, don't want to cover. I have a G33 system and I bought a Silicon Image ADD2-N "X8760" ADD2 card on eBay.
Thanks for the explanation!
I'm sure it wouldn't cost them that much; they are, after all, putting HDMI ports on them now. It would have been nice to have both HDMI & DVI, though.

I'm just ready for my upgrade. I was going to get the G33, but with news that the G35 is out soon, I thought I'd wait for some reviews.

I did know about the ADD2 cards, but I'll be putting an x16 graphics card in after a month or so. I use the integrated graphics as a backup, for when I'm between x16 cards, or for when I come to sell and take the x16 card out first.

Hey, it'll run UT2003 on integrated graphics, what more do they want? ;O)

Wibla
Friend of SPCR
Posts: 779
Joined: Sun Jun 03, 2007 12:03 am
Location: Norway

Post by Wibla » Sat Oct 27, 2007 9:21 am

HDMI ~= DVI - the video signal is the same, so you can get simple adapters.

gentonix
Posts: 47
Joined: Tue Jan 10, 2006 8:20 am

Post by gentonix » Sat Oct 27, 2007 10:23 am

The Asus P5E-VM HDMI has already been released; check the Asus website (which is down at the moment, so I can't give you a direct link). It should be available through retailers shortly.

Don't, however, confuse it with the P5E-VM DO, which is already on sale - that one has the Q35 chipset.

Edit: and here's the direct link:

http://www.asus.com/products.aspx?l1=3& ... odelmenu=1

Pigbristle
Posts: 12
Joined: Wed Dec 20, 2006 2:56 pm

Post by Pigbristle » Sat Oct 27, 2007 12:02 pm

gentonix wrote:The Asus P5E-VM HDMI has already been released; check the Asus website

http://www.asus.com/products.aspx?l1=3& ... odelmenu=1
Nice one! First one I've seen.

Just got to see what Gigabyte have to offer first ;O)

Don't know why; I've been running an Asus for years with no trouble, but for some reason I'm fancying Gigabyte this time round.

Let me know if you see any G35 reviews won't you ;O)

ido
Posts: 11
Joined: Sat Oct 27, 2007 3:33 pm
Location: ~ :)

Post by ido » Sat Oct 27, 2007 3:39 pm

gentonix wrote:The Asus P5E-VM HDMI has already been released; check the Asus website (which is down at the moment, so I can't give you a direct link). It should be available through retailers shortly.

Don't, however, confuse it with the P5E-VM DO, which is already on sale - that one has the Q35 chipset.
No eSATA on the Asus board, too bad. Can it be added?

I'm especially interested to see how much load the G35 takes off the CPU when playing HD video.

All in all it seems like a very nice chipset, I just might buy one for my new setup.

gentonix
Posts: 47
Joined: Tue Jan 10, 2006 8:20 am

Post by gentonix » Sat Oct 27, 2007 11:52 pm

ido wrote: No eSATA on the Asus board, too bad. Can it be added?
Yes, with a PCI-e add-in card for example. Search "pci-e esata" on Google or your favourite retailer.

The G35 has some HD video acceleration features, but they come at a price: the TDP of the chipset is 28W, which is over twice the TDP of the G33. It remains to be seen what the actual CPU utilization and retail prices of these boards will be, because an IP35 board plus a discrete Radeon HD 2400 card might actually be
a) cheaper
and/or
b) able to render HD content with less power draw

PS: here's a gallery of upcoming G35 boards (along with others, so scroll down):

http://www.computerbase.de/bild/news/17135/

smilingcrow
*Lifetime Patron*
Posts: 1809
Joined: Sat Apr 24, 2004 1:45 am
Location: At Home

Post by smilingcrow » Mon Oct 29, 2007 10:20 am

Pigbristle wrote:It would have been nice to have both HDMI & DVI, though.
Gigabyte have a G33 board with DVI, HDMI & D-Sub, so hopefully a G35 board will be released with the same options.
gentonix wrote:
ido wrote:No eSATA on the Asus board, too bad. Can it be added?
Yes, with a PCI-e add-in card for example. Search "pci-e esata" on Google or your favourite retailer.
You can turn any internal SATA port into an eSATA port using a simple and cheap bracket that doesn’t require an add-in card.
gentonix wrote:The G35 has some HD video acceleration features, but they come at a price: the TDP of the chipset is 28W, which is over twice the TDP of the G33. It remains to be seen what the actual CPU utilization and retail prices of these boards will be, because an IP35 board plus a discrete Radeon HD 2400 card might actually be
a) cheaper
and/or
b) able to render HD content with less power draw.
The fact that the TDP is so high is ironically a good sign, as it looks as if Intel may actually have significantly improved their IGP for once.
As for comparing the power efficiency of a G35 versus an HD 2400 for HD video decoding, don't forget that adding an HD 2400 Pro to a G33 board increases idle consumption by roughly 23W AC. It's a guess how much a G35 will consume at idle, but I imagine most of the extra TDP relates to 3D and/or HD decoding duties, so at idle it should be close to a G33. So even if the HD 2400 solution beats the G35 when decoding Blu-ray/HD-DVD discs, at all other times you are hit with a 23W penalty, which in many situations far outweighs the gains.
This is something that is often overlooked, and looking at how power efficient Intel's new 45nm process is, the case for using a discrete GPU for HD video decoding becomes even less attractive from a power-efficiency standpoint.
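As a rough back-of-envelope illustration of why the idle penalty tends to dominate (every number below is a hypothetical assumption, not a measurement; only the ~23W idle figure comes from the reviews discussed in this thread):

# Sketch: net extra energy per day for a discrete HD 2400 vs. sticking with the IGP.
# All values are illustrative assumptions except idle_penalty_w (~23W from reviews).
idle_penalty_w = 23      # extra AC draw at idle with the discrete card
decode_saving_w = 10     # assumed saving vs. the IGP while decoding HD video (hypothetical)
hours_on = 12            # hours per day the machine is powered (hypothetical)
hours_decoding = 2       # hours per day spent playing HD video (hypothetical)

extra_wh = idle_penalty_w * (hours_on - hours_decoding) - decode_saving_w * hours_decoding
print(f"Net extra energy per day with the discrete card: {extra_wh} Wh")
# 23 * 10 - 10 * 2 = 210 Wh/day more; unless the machine spends most of its time
# decoding, the 23W idle penalty swamps any decoding-time savings.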
If the G35 doesn't include Blu-ray and HD-DVD decode support, then AMD's next-generation IGP in conjunction with a Phenom X2 may offer the best power efficiency.

Note: There's even the possibility that the G35 uses a smaller fabrication process, or at least an improved one at the same node, so it could actually consume less at idle than a G33. Once Intel's chipsets migrate to their 45nm process we may see significant reductions in power consumption, provided that the gains for the CPUs on this process can be translated to chipsets as well. AMD's 45nm process is also an unknown at this point.

What does concern me is that Gigabyte, for example, use very small heatsinks on their G33 boards, which is a problem if you want to buy a C2D with an 800MHz FSB and overclock it significantly. When you add in the extra power consumption of the new IGP, I would look carefully at the heatsink on any G35 board before you buy it.

gentonix
Posts: 47
Joined: Tue Jan 10, 2006 8:20 am

Post by gentonix » Mon Oct 29, 2007 12:38 pm

smilingcrow wrote: The fact that the TDP is so high is ironically a good sign as it looks as if Intel may actually have significantly improved their IGP for once.
Well, the TDP of the G965 was high too (in fact the same), but it wasn't a particularly effective chip anyway, at least compared to similar IGPs from AMD (the TDP of the AMD 690G is 9W) and Nvidia.
smilingcrow wrote:As for comparing the power efficiency of a G35 versus an HD 2400 for HD video decoding, don't forget that adding an HD 2400 Pro to a G33 board increases idle consumption by roughly 23W AC.
Is there a reliable source for that figure? The articles here, here and here list the peak power consumption at ~25 watts, which I'm guessing is an official figure from AMD. It would be kind of odd if the idle power draw were only 2 watts less than that, given that the card downclocks itself when idle (I've read that this is broken for some cards with some drivers, though).

smilingcrow
*Lifetime Patron*
Posts: 1809
Joined: Sat Apr 24, 2004 1:45 am
Location: At Home

Post by smilingcrow » Mon Oct 29, 2007 1:06 pm

gentonix wrote:Is there a reliable source for that figure? The articles here, here and here list the peak power consumption at ~25 watts, which I'm guessing is an official figure from AMD. It would be kind of odd if the idle power draw were only 2 watts less than that, given that the card downclocks itself when idle (I've read that this is broken for some cards with some drivers, though).
I've only seen real-world figures, and maybe they were affected by the driver issue that you mentioned. It would be good to see some real-world data using the correct drivers to show these cards in their proper light, assuming what you describe really is an issue. Does anybody have a link to testing with the proper drivers?
The idle power consumption of the entry-level DX10 cards has been disappointing, so it would be good to see this proved a non-issue.

gentonix
Posts: 47
Joined: Tue Jan 10, 2006 8:20 am

Post by gentonix » Mon Oct 29, 2007 11:08 pm

The real entry-level DX10 cards from Nvidia, the GeForce 8300 series, aren't available through retail channels yet, though. The GeForce 8400 GS seems more aimed at beating the Radeon HD 2400 Pro/XT on performance at the same price point.

jojo4u
Posts: 806
Joined: Sat Dec 14, 2002 7:00 am
Location: Germany

Post by jojo4u » Tue Oct 30, 2007 8:16 am

smilingcrow wrote: The idle power consumption of the entry-level DX10 cards has been disappointing, so it would be good to see this proved a non-issue.
What is so disappointing about the HD 2400?

viewtopic.php?p=350685#350685

smilingcrow
*Lifetime Patron*
Posts: 1809
Joined: Sat Apr 24, 2004 1:45 am
Location: At Home

Post by smilingcrow » Tue Oct 30, 2007 9:03 am

jojo4u wrote:What is so disappointing about the HD 2400?
viewtopic.php?p=350685#350685
I looked at the power data in that review and it just confirms my negative feelings towards the entry-level DX10 cards. The HD 2400 and 8400 GS both consume around 23W more at idle than an IGP, which I find disappointing. This thread is looking at integrated chipsets, so in that context I find the extra power consumption of these cards disproportionately high. For low-power systems that idle anywhere between 20 and 50W, an extra 23W is a large increase.
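To put that in proportion, here is the simple arithmetic on the idle figures already quoted in this thread (the baselines are just representative points in the 20-50W range mentioned above):

# Relative impact of a ~23W idle penalty on typical low-power system idle draws.
extra_w = 23
for baseline_w in (20, 35, 50):
    increase = extra_w / baseline_w
    print(f"{baseline_w}W system -> {baseline_w + extra_w}W at idle (+{increase:.0%})")
# A 20W system more than doubles its idle draw; even a 50W system goes up by almost half.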

The cards seem quite limited in terms of features and performance, but in certain cases they do make sense. For example, if your CPU is too weak to decode HD video discs, adding one of these GPUs might be the cheapest way to gain that capability. They're also a cheap way to add HDMI to an existing system. As for gaming performance, I can't comment.

jojo4u
Posts: 806
Joined: Sat Dec 14, 2002 7:00 am
Location: Germany

Post by jojo4u » Tue Oct 30, 2007 9:24 am

Sorry, I didn't read your first post in the thread. Regarding the performance of the HD 2400, we have a saying in Germany: "Among blind people, the one-eyed is king."

jaganath
Posts: 5085
Joined: Tue Sep 20, 2005 6:55 am
Location: UK

Post by jaganath » Tue Oct 30, 2007 9:55 am

we have a saying in Germany: "Among blind people, the one-eyed is king."
hey, cool, we have that saying too! :P

gentonix
Posts: 47
Joined: Tue Jan 10, 2006 8:20 am

Post by gentonix » Thu Nov 01, 2007 9:42 am

In the other forum, this
viewtopic.php?p=375872&highlight=#375872
post seems to indicate that the idle power consumption of Radeon HD 2400 Pro might not be that high after all.

smilingcrow
*Lifetime Patron*
Posts: 1809
Joined: Sat Apr 24, 2004 1:45 am
Location: At Home

Post by smilingcrow » Thu Nov 01, 2007 10:25 am

gentonix wrote:In the other forum, this
viewtopic.php?p=375872&highlight=#375872
post seems to indicate that the idle power consumption of Radeon HD 2400 Pro might not be that high after all.
4W at idle for a DX10 card with HD decoding support is amazing; I assume it's down to the latest driver significantly under-clocking and under-volting the card in 2D mode.
Can anyone else verify this? I ask because it would make these cards the obvious choice for many situations, but I don't want to recommend them without verification, as the only data I've seen in online reviews contradicts it; those reviews used older drivers, which may be the cause of the disparity.
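For a rough sense of why a big under-clock plus an under-volt could plausibly bring a card down to a few watts: dynamic power scales roughly with clock times voltage squared. A minimal sketch follows; the clock and voltage figures are made-up examples, not HD 2400 specifications.

# First-order approximation: dynamic power ~ C * V^2 * f (leakage and memory ignored).
# The figures below are hypothetical, purely to show the scaling.
def dynamic_power_ratio(f_idle, f_full, v_idle, v_full):
    """Idle dynamic power as a fraction of full-clock dynamic power."""
    return (f_idle / f_full) * (v_idle / v_full) ** 2

# Example: a card dropping from 500 MHz / 1.20 V in 3D mode to 110 MHz / 0.95 V in 2D mode.
ratio = dynamic_power_ratio(110, 500, 0.95, 1.20)
print(f"Idle dynamic power is roughly {ratio:.0%} of full-clock dynamic power")
# -> about 14%, which makes a low single-digit wattage at idle believable
#    for a card whose peak is quoted at ~25W.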

gentonix
Posts: 47
Joined: Tue Jan 10, 2006 8:20 am

Post by gentonix » Thu Nov 01, 2007 11:19 pm

Well, I'd say it probably consumes more than 4 watts, because when the IGP is enabled the northbridge consumes more power than when it's disabled (i.e. when a discrete graphics card is attached). In fact, with the memory controller in the CPU, the IGP is the single biggest power consumer in the chipset, so you have to take that into account. The measured difference is only 4 watts, but in reality the idle power draw of the Radeon HD 2400 Pro could be something like 9-11 watts (this is purely a guess based on the reasoning above).

Other components + IGP = total idle consumption with the IGP
Other components (IGP disabled) + Radeon HD 2400 Pro = total idle consumption with the HD 2400 Pro
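Put numerically (a minimal sketch; the 5-7W figure for what the disabled IGP stops drawing is just an assumption chosen to be consistent with the 9-11W guess above):

# The wall-socket comparison only gives the *net* difference between the two setups;
# the card's own idle draw is that difference plus whatever the now-disabled IGP
# was consuming. The IGP saving below is an assumed range, not a measurement.
net_difference_w = 4          # measured change at the wall when swapping IGP for the card
igp_saving_w = (5, 7)         # assumed power freed up by disabling the IGP

low, high = (net_difference_w + s for s in igp_saving_w)
print(f"Implied HD 2400 Pro idle draw: {low}-{high} W")
# -> 9-11 W, in line with the guess above.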

smilingcrow
*Lifetime Patron*
Posts: 1809
Joined: Sat Apr 24, 2004 1:45 am
Location: At Home

Post by smilingcrow » Fri Nov 02, 2007 12:52 am

gentonix wrote:Well, I'd say it probably consumes more than 4 watts, because when the IGP is enabled the northbridge consumes more power than when it's disabled (i.e. when a discrete graphics card is attached),
I think in terms of the net power consumption of adding a VGA card rather than its actual power consumption, as the former is easy to measure accurately whilst the latter is not.
So when I say 4W is low, it's in comparison to the 9 or 10W of an entry-level Nvidia 6 or 7 series card and the 20W+ of the 8 series.

With process shrinks and the like I had hoped that non-gaming VGA cards would start using less power at idle, but it seemed to be going the other way. I just have to get over my aversion to ATI drivers now. :shock:

gentonix
Posts: 47
Joined: Tue Jan 10, 2006 8:20 am

Post by gentonix » Fri Nov 02, 2007 1:02 am

OK, that wasn't apparent from your post, but it makes sense. Driver issue aside, I'd be interested to see a comparison between Windows XP and Vista with the Radeon card: if it uses clock gating in several parts of the 3D engine like the Radeon Xpress 1150 in my laptop, then with Vista and the Aero interface those power-saving methods might not be usable, since the 3D parts of the chip would be needed all the time. This is only a hypothesis though; I have no real experience with Vista or the card yet.

smilingcrow
*Lifetime Patron*
Posts: 1809
Joined: Sat Apr 24, 2004 1:45 am
Location: At Home

Post by smilingcrow » Fri Nov 02, 2007 1:19 am

gentonix wrote:I'd be interested to see a comparison between Windows XP and Vista with the Radeon card: if it uses clock gating in several parts of the 3D engine like the Radeon Xpress 1150 in my laptop, then with Vista and the Aero interface those power-saving methods might not be usable, since the 3D parts of the chip would be needed all the time. This is only a hypothesis though; I have no real experience with Vista or the card yet.
Interesting hypothesis. I have no interest in Vista currently but no doubt someone on the forums will look into this soon enough.
I hope the power-saving mode is purely a driver issue and doesn't require a newer BIOS or board revision. I doubt it's a newer stepping of the GPU, as we would likely have heard about that.

wim
Posts: 777
Joined: Wed Apr 28, 2004 5:16 am
Location: canberra, australia

Re: G35 let down

Post by wim » Tue Nov 06, 2007 7:42 pm

Pigbristle wrote:I like my integrated graphics on my motherboards, don't ask me why, I just do!
So with great anticipation, I'm looking forward to the new G35 chipset models.
(By the way, does anybody know when they're due out?)

But little things niggle me, like: why the hell do they still use D-Sub instead of DVI? DVI has been standard on graphics cards for as long as I can remember.

And for god's sake, don't tell me we're still getting a parallel port as well!
Is this really moving forward?
i was hangin around waiting for dvi on g35 too... parallel? D-Sub?? you just ruined my day :(
:)

smilingcrow
*Lifetime Patron*
Posts: 1809
Joined: Sat Apr 24, 2004 1:45 am
Location: At Home

Re: G35 let down

Post by smilingcrow » Wed Nov 07, 2007 12:39 pm

wim wrote:i was hangin around waiting for dvi on g35 too... parallel? D-Sub?? you just ruined my day
You can get DVI & HDMI with Gigabyte’s G33-S2H so just wait and see what they release for G35.

Pigbristle
Posts: 12
Joined: Wed Dec 20, 2006 2:56 pm

Re: G35 let down

Post by Pigbristle » Thu Nov 08, 2007 3:48 am

smilingcrow wrote:
wim wrote:i was hangin around waiting for dvi on g35 too... parallel? D-Sub?? you just ruined my day
You can get DVI & HDMI with Gigabyte’s G33-S2H so just wait and see what they release for G35.
But its PCI-E x16 slot is apparently only x4, not a full x16!

smilingcrow
*Lifetime Patron*
Posts: 1809
Joined: Sat Apr 24, 2004 1:45 am
Location: At Home

Re: G35 let down

Post by smilingcrow » Thu Nov 08, 2007 4:15 am

Pigbristle wrote:But its PCI-E x16 slot is apparently only x4, not a full x16!
That's worth knowing, but I imagine most people who buy a motherboard with DVI and HDMI aren't concerned about this limitation.

wim
Posts: 777
Joined: Wed Apr 28, 2004 5:16 am
Location: canberra, australia

Re: G35 let down

Post by wim » Thu Nov 08, 2007 6:48 pm

smilingcrow wrote:
Pigbristle wrote:But its PCI-E x16 slot is apparently only x4, not a full x16!
That's worth knowing, but I imagine most people who buy a motherboard with DVI and HDMI aren't concerned about this limitation.
it could be some concern.. i want to play games occasionally, but most of the time the onboard video would be fine, so i don't want to use the extra power (-> cooling -> noise) when it's not needed. i had an idea that if i had onboard video i could simply disable the 8800GT or whatever and fall back to the motherboard's gpu after the game is over, so to speak..

smilingcrow
*Lifetime Patron*
Posts: 1809
Joined: Sat Apr 24, 2004 1:45 am
Location: At Home

Re: G35 let down

Post by smilingcrow » Fri Nov 09, 2007 5:39 am

wim wrote:I had an idea that if i had onboard video i could simply disable the 8800GT or whatever and fall back to the motherboard's gpu after the game is over, so to speak..
I don’t think that this is currently possible; you’ll have to physically remove the card to save the power. I’ll test it later to confirm.

Nvidia and ATI are bringing out motherboards that support this but I’m not sure if the technology will be for laptops only!

Alex
Posts: 185
Joined: Sun Dec 17, 2006 12:49 pm
Location: Stockholm

Post by Alex » Fri Nov 09, 2007 6:39 am

I can tell you that my HD 2600 does seem to be slowed down (automatically) with ATI Catalyst 7.9 and 7.10.
At idle I have the GPU at 110 MHz and the memory at 252 MHz according to ATI Overdrive ("standard" is GPU 800 MHz, memory 700 MHz).

I can't promise that the power consumption is lowered, though (I can't measure it).

I think the new Nvidia and ATI cards will all have this kind of feature.

smilingcrow
*Lifetime Patron*
Posts: 1809
Joined: Sat Apr 24, 2004 1:45 am
Location: At Home

Post by smilingcrow » Fri Nov 09, 2007 10:32 am

Alex wrote:I can tell you that my HD 2600 does seem to be slowed down (automatically) with ATI Catalyst 7.9 and 7.10.
At idle I have the GPU at 110 MHz and the memory at 252 MHz according to ATI Overdrive ("standard" is GPU 800 MHz, memory 700 MHz).
I can't promise that the power consumption is lowered, though (I can't measure it).
Take a look at these two threads for details on this – One and Two.

Post Reply