G35 let down
Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee
-
- Posts: 12
- Joined: Wed Dec 20, 2006 2:56 pm
G35 let down
I like my integrated graphics on my motherboards, don't ask me why, I just do!
So with great anticipation, I'm looking forward to the new G35 chipset models.
(By the way, does anybody know when they're due out?)
But little things niggle me, like: why the hell do they still use D-Sub instead of DVI? DVI has been standard on graphics cards for as long as I can remember.
And for God's sake, don't tell me we're still getting a parallel port as well!
Is this really moving forward?
They use the VGA connection because real DVI support requires an SDVO chip to work - an extra expense that MB companies, for some reason, don't want to foot. I have a G33 system and I bought a Silicon Image ADD2-N "X8760" ADD2 card on ebay, and this gives me full DVI support. They're really cheap on ebay, about $10 including shipping, and it works perfectly - just stick it in your x16 slot and plug in the monitor - mine was automatically detected. It adds about 1-2W to total power consumption according to one of the forums I read here.
As to release - all I know is it's supposed to be "very soon".
-
- Posts: 12
- Joined: Wed Dec 20, 2006 2:56 pm
Thanks for the explanation!tehfire wrote:They use the VGA connection because real DVI support requires an SDVO chip to work - an extra expense that MB companies, for some reason, don't want to foot. I have a G33 system and I bought a Silicon Image ADD2-N "X8760" ADD2 card on ebay.
I'm sure it wouldn't cost them that much? They are, after all, putting HDMI ports on them now. Would have been nice to have both HDMI & DVI though.
I'm just ready for my upgrade; I was going to get the G33, but with news that the G35 is soon out, I thought I'd wait for some reviews.
I did know about the ADD2 cards but I will be putting in a x16 graphics card in after a month or so. I use the integrated as back up, for when I'm in between changing my x16 cards or when I come to sell I take out my x16 card first.
Hey! It'll run UT2003 on integrated graphics, what more do they want? ;O)
Asus P5E-VM HDMI is already released, check the Asus website (which is down at the moment, so I can't give you a direct link). Should be available through retailers shortly.
Don't however confuse it with P5E-VM DO which is already selling - it has the Q35 chipset.
Edit. and here's the direct link:
http://www.asus.com/products.aspx?l1=3& ... odelmenu=1
-
- Posts: 12
- Joined: Wed Dec 20, 2006 2:56 pm
Nice one! First one I've seen.gentonix wrote:Asus P5E-VM HDMI is already released, check the Asus website
http://www.asus.com/products.aspx?l1=3& ... odelmenu=1
Just got to see what Gigabyte have to offer first ;O)
Don't know why; I've been running an Asus for years with no trouble, but for some reason I'm fancying the Gigabyte this time round.
Let me know if you see any G35 reviews won't you ;O)
No eSATA on the Asus board, too bad. Can it be added?gentonix wrote:Asus P5E-VM HDMI is already released, check the Asus website (which is down at the moment, so I can't give you a direct link). Should be available through retailers shortly.
Don't however confuse it with P5E-VM DO which is already selling - it has the Q35 chipset.
I'm especially interested to see how much load the G35 takes off the CPU when playing HD videos.
All in all it seems like a very nice chipset, I just might buy one for my new setup.
Yes, with a PCI-e add-in card, for example. Search "pci-e esata" on Google or your favourite retailer.ido wrote:No eSATA on the Asus board, too bad. Can it be added?
G35 has some HD video acceleration features, but they come at a price: the TDP of the chipset is 28W, over twice the TDP of the G33. It remains to be seen what the actual CPU utilization and the retail price of these boards will be, because an IP35 board and a discrete Radeon HD2400 card might actually be
a) cheaper
and/or
b) able to render HD content with less power draw
PS. here's a gallery of upcoming G35 boards (along with others, so scroll down)
http://www.computerbase.de/bild/news/17135/
-
- *Lifetime Patron*
- Posts: 1809
- Joined: Sat Apr 24, 2004 1:45 am
- Location: At Home
Gigabyte have a G33 board with DVI, HDMI & D-Sub, so hopefully a G35 board will be released with the same options.Pigbristle wrote:Would have been nice to have both HDMI & DVI though.
You can turn any internal SATA port into an eSATA port using a simple and cheap bracket that doesn’t require an add-in card.gentonix wrote:Yes, with a PCI-e add-in card for example. Search "pci-e esata" on Google or your favourite retailer.ido wrote:No eSATA on the Asus board, too bad. Can it be added?
The fact that the TDP is so high is ironically a good sign as it looks as if Intel may actually have significantly improved their IGP for once.gentonix wrote:G35 has some HD video acceleration features, but it comes with a price though, because the TDP of the chipset is 28W, which is over two times the TDP of G33. It remains to be seen what is indeed the CPU utilization and the retail price of these boards, because IP35 board and discrete Radeon HD2400 card might actually be
a) cheaper
and/or
b) able to render HD content with less power draw.
As for comparing the power efficiency of a G35 versus an HD 2400 for HD video decoding, don’t forget that if you add an HD 2400 Pro to a G33 board, it consumes roughly 23W AC more at idle. It’s a guess how much a G35 will consume at idle, but I imagine most of the extra TDP relates to 3D and/or HD decoding duties, so at idle it should be close to a G33. So even if the HD 2400 solution beats the G35 when decoding Blu-ray/HD-DVD discs, at all other times you are hit with a 23W penalty, which in many situations far outweighs the gains.
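That trade-off is easy to sketch with back-of-the-envelope arithmetic. The 23W idle penalty is the figure quoted above; the decode-time saving and the daily usage hours are made-up assumptions purely for illustration:

```python
# Rough daily energy comparison: G33 board + discrete HD 2400 Pro vs. a G35 IGP.
# The 23 W AC idle penalty is the figure quoted above; the decode-time saving
# and the usage hours are assumptions chosen only to illustrate the reasoning.

idle_penalty_w = 23.0    # extra AC draw at idle once an HD 2400 Pro is added
decode_saving_w = 10.0   # assumed: watts the discrete card saves vs. G35 while decoding

hours_idle = 8.0         # assumed hours/day the machine sits idle
hours_decode = 2.0       # assumed hours/day spent decoding HD video

net_extra_wh = idle_penalty_w * hours_idle - decode_saving_w * hours_decode
print(f"Net extra energy per day with the discrete card: {net_extra_wh:.0f} Wh")
# 23*8 - 10*2 = 164 Wh/day: with these numbers the idle penalty clearly dominates.
```

Unless the machine spends most of its day decoding, the idle hours swamp the decode hours, which is the point being made above.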
This is something that is often overlooked and looking at how power efficient Intel’s new 45nm process is, the case for using a discrete GPU for HD video decoding with regard to power efficiency becomes even less attractive.
If the G35 doesn’t include Blu-ray and HD-DVD decode support then AMD’s next generation IGP in conjunction with a Phenom X2 may offer the best power efficiency.
Note: There’s even the possibility that the G35 uses a smaller fabrication process, or at least an improved one at the same node, so it could actually consume less at idle than a G33. Once Intel’s chipsets migrate to their 45nm process we may see significant reductions in power consumption, provided that the gains for the CPUs on this process can be translated to chipsets as well. AMD’s 45nm process is an unknown at this point.
What does concern me is that Gigabyte for example use very small heatsinks with their G33 boards which is a problem if you want to buy a C2D with a 800MHz FSB and over-clock it significantly. When you add in the extra power consumption of the new IGP I would look carefully at the heatsink on any G35 board before you buy it.
Well, the TDP of G965 was high too (in fact the same), but it wasn't a particularly effective chip anyway, at least when compared to similar IGPs from AMD (the TDP for AMD 690G is 9W) and Nvidia.smilingcrow wrote: The fact that the TDP is so high is ironically a good sign as it looks as if Intel may actually have significantly improved their IGP for once.
Is there a reliable source for this figure? The articles here, here and here list the peak power consumption at ~25 watts, which I'm guessing is an official figure from AMD. It would be kind of odd if the idle power draw were only 2 watts less than that, given that the card downclocks itself when idle (I've read that this is broken for some cards in some drivers though).As for comparing the power efficiency of a G35 versus a HD 2400 for HD video decoding don’t forget that if you add a HD 2400 Pro to a G33 board that it consumes roughly 23W AC more at idle.
-
- *Lifetime Patron*
- Posts: 1809
- Joined: Sat Apr 24, 2004 1:45 am
- Location: At Home
I’ve only seen real-world figures, and maybe they were affected by the driver issue that you mentioned. It would be good to see some real-world data using the correct drivers to show these cards in their proper light, assuming what you say is an issue. Does anybody have a link for testing with the proper drivers?gentonix wrote:Is there a reliable source for this figure? The articles here, here and here list the peak power consumption at ~25 watts, which I'm guessing is an official figure from AMD. It would be kind of odd if the idle power draw were only 2 watts less than that, given that the card downclocks itself when idle (I've read that this is broken for some cards in some drivers though).
The idle power consumption of the entry level DX10 cards has been disappointing so it would be good to see this proved a non issue.
What is so disappointing about the HD 2400?smilingcrow wrote: The idle power consumption of the entry level DX10 cards has been disappointing so it would be good to see this proved a non issue.
viewtopic.php?p=350685#350685
-
- *Lifetime Patron*
- Posts: 1809
- Joined: Sat Apr 24, 2004 1:45 am
- Location: At Home
I looked at the power data in that review and it just confirms my negative feelings towards the entry level DX10 cards. The HD 2400 and 8400 GS both consume around 23W more at idle than an IGP which I find disappointing. This thread is looking at integrated chipsets so in that context I find the extra power consumption of these cards disproportionately high. For low power systems that idle anywhere between 20 and 50W an extra 23W is a large increase.jojo4u wrote:What is so disappointing about the HD 2400?
viewtopic.php?p=350685#350685
The cards seem to be quite limited in terms of features and performance but in certain cases they do make sense. E.g. if your CPU is too weak to decode HD video discs the addition of one of these GPUs might be the cheapest way to add this feature. They’re also a cheap way to add HDMI to a current system. As for gaming performance I can't comment.
In the other forum, this
viewtopic.php?p=375872&highlight=#375872
post seems to indicate that the idle power consumption of Radeon HD 2400 Pro might not be that high after all.
-
- *Lifetime Patron*
- Posts: 1809
- Joined: Sat Apr 24, 2004 1:45 am
- Location: At Home
4W at idle for a DX10 card with HD decoding support is amazing; I assume it’s down to the latest driver significantly under-clocking and under-volting the card in 2D mode.gentonix wrote:In the other forum, this
viewtopic.php?p=375872&highlight=#375872
post seems to indicate that the idle power consumption of Radeon HD 2400 Pro might not be that high after all.
Can anyone else verify this? I ask because it makes these cards the obvious choice for many situations but I don’t want to recommend them without this being verified as the only data I’ve seen in online reviews contradicts this; they used older drivers which may be the cause of the disparity.
Well, I'd say that it probably consumes more than 4 watts, because when the IGP is enabled the northbridge consumes more power than when it's disabled (i.e. when there is a discrete graphics card attached). In fact, with the memory controller in the CPU, the IGP is the single biggest power consumer in the chipset, so you have to take that into account. The measured difference is only 4 watts, but in reality the idle power draw of the Radeon 2400 Pro could be something like 9-11 watts (this is purely a guess based on the above reasoning).
Other components + IGP = Total idle consumption with IGP
Other components - IGP + Radeon HD2400 Pro = Total idle consumption with HD2400 Pro
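That decomposition can be put into rough numbers. Everything here is a guess, as the post says: the 4W figure is the measured system-level delta from the linked thread, and the IGP's own idle draw is assumed:

```python
# Estimating the card's true idle draw from the measured system-level delta.
# Measured:  total_with_card - total_with_igp = 4 W  (from the linked post)
# Model:     total_with_igp  = other + igp
#            total_with_card = other + card   (IGP disabled)
# Therefore: card = delta + igp. The IGP draw values below are pure guesses.

measured_delta_w = 4.0
for assumed_igp_w in (5.0, 7.0):
    card_w = measured_delta_w + assumed_igp_w
    print(f"If the idle IGP draws {assumed_igp_w:.0f} W, "
          f"the card itself draws ~{card_w:.0f} W")
# With a guessed 5-7 W IGP this lands in the 9-11 W ballpark suggested above.
```

The system-level delta understates the card's real draw by exactly however much the now-disabled IGP was consuming, which is why the two figures differ.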
-
- *Lifetime Patron*
- Posts: 1809
- Joined: Sat Apr 24, 2004 1:45 am
- Location: At Home
I think in terms of the net power consumption of adding a VGA card rather than its actual power consumption, as the former is easy to measure accurately whilst the latter is not.gentonix wrote:Well, I'd say that it probably consumes more than 4 watts, because when the IGP is enabled the northbridge will consume more power than when it's disabled (when there is a discrete graphics card attached),
So when I say 4W is low it’s in comparison to the 9 or 10W of an entry level Nvidia 6 or 7 series card and the 20W+ of the 8 series.
With process shrinks and the like I had hoped that non-gaming VGA cards would start using less power at idle, but it seemed to be going the other way. I just have to get over my aversion to ATI drivers now.
Ok, that was not apparent from your post, but it makes sense. Other than the driver issue, I'd be interested to see a comparison between Windows XP and Vista with the Radeon card, because if it uses clock gating in several parts of the 3D engine like the Radeon Xpress 1150 in my laptop, then it might be that with Vista and the Aero interface, those power saving methods could not be used, since the 3D parts of the chip would be needed all the time. This is only a hypothesis though, I have no real experience with Vista or the card yet.
-
- *Lifetime Patron*
- Posts: 1809
- Joined: Sat Apr 24, 2004 1:45 am
- Location: At Home
Interesting hypothesis. I have no interest in Vista currently but no doubt someone on the forums will look into this soon enough.gentonix wrote:I'd be interested to see a comparison between Windows XP and Vista with the Radeon card, because if it uses clock gating in several parts of the 3D engine like the Radeon Xpress 1150 in my laptop, then it might be that with Vista and the Aero interface, those power saving methods could not be used, since the 3D parts of the chip would be needed all the time. This is only a hypothesis though, I have no real experience with Vista or the card yet.
I hope the power saving mode is purely a driver issue and doesn’t require a newer BIOS or board revision. I doubt that it’s a newer stepping of the GPU as we would likely have heard about that.
Re: G35 let down
i was hangin around waiting for dvi on g35 too... parallel? D-Sub?? you just ruined my dayPigbristle wrote:I like my integrated graphics on my motherboards, don't ask me why, I just do!
So with great anticipation, I'm looking forward to the new G35 chipset models.
(By the way, does anybody know when they're due out?)
But little things niggle me, like: why the hell do they still use D-Sub instead of DVI? DVI has been standard on graphics cards for as long as I can remember.
And for God's sake, don't tell me we're still getting a parallel port as well!
Is this really moving forward?
-
- *Lifetime Patron*
- Posts: 1809
- Joined: Sat Apr 24, 2004 1:45 am
- Location: At Home
Re: G35 let down
You can get DVI & HDMI with Gigabyte’s G33-S2H so just wait and see what they release for G35.wim wrote:i was hangin around waiting for dvi on g35 too... parallel? D-Sub?? you just ruined my day
-
- Posts: 12
- Joined: Wed Dec 20, 2006 2:56 pm
Re: G35 let down
But its PCI-E x16 slot is apparently only wired for x4, not the full x16!smilingcrow wrote:You can get DVI & HDMI with Gigabyte’s G33-S2H so just wait and see what they release for G35.wim wrote:i was hangin around waiting for dvi on g35 too... parallel? D-Sub?? you just ruined my day
-
- *Lifetime Patron*
- Posts: 1809
- Joined: Sat Apr 24, 2004 1:45 am
- Location: At Home
Re: G35 let down
It’s worth knowing that but I imagine that most people that buy a motherboard with DVI and HDMI aren’t concerned about this limitation.Pigbristle wrote:But it's PCI-E x16 slot is apparently only x4, & not x16 optimized!
Re: G35 let down
it could be some concern.. i want to play games occasionally but most of the time the onboard video would be fine, so i don't want to use the extra power (-> cooling -> noise) when it's not needed. i had an idea that if i had onboard video i could simply disable the 8800GT or whatever and fall back to the motherboard's gpu after the game is over, so to speak..smilingcrow wrote:It’s worth knowing that but I imagine that most people that buy a motherboard with DVI and HDMI aren’t concerned about this limitation.Pigbristle wrote:But it's PCI-E x16 slot is apparently only x4, & not x16 optimized!
-
- *Lifetime Patron*
- Posts: 1809
- Joined: Sat Apr 24, 2004 1:45 am
- Location: At Home
Re: G35 let down
I don’t think that this is currently possible; you’ll have to physically remove the card to save the power. I’ll test it later to confirm.wim wrote:I had an idea that if i had onboard video i could simply disable the 8800GT or whatever and fall back to the motherboard's gpu after the game is over, so to speak..
Nvidia and ATI are bringing out motherboards that support this but I’m not sure if the technology will be for laptops only!
I can tell you it seems like my HD 2600 is slowed down (automatically) with the ATI Catalyst 7.9 and 7.10.
At idle I have GPU 110 MHz, Memory 252 MHz according to ATI Overdrive ("standard" is GPU 800 MHz, Memory 700 MHz).
I can't promise that the power consumption is lowered though (can't measure).
I think the new nVidia and ATI cards will all have this kind of feature.
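A crude way to see how much that downclocking could be worth: dynamic power scales roughly linearly with clock at a fixed voltage (P ∝ f·V², so any 2D-mode undervolting would cut it further). The clocks are the ones reported above; the linear-scaling model is a simplifying assumption that ignores leakage:

```python
# Crude estimate of the dynamic-power reduction implied by the reported clocks,
# assuming power scales linearly with frequency at fixed voltage (P ~ f*V^2).
# This ignores static/leakage power and any undervolting, so it bounds the
# relative dynamic saving rather than measuring the real wattage.

gpu_3d, gpu_2d = 800.0, 110.0   # MHz core clock: "standard" vs. idle (ATI Overdrive)
mem_3d, mem_2d = 700.0, 252.0   # MHz memory clock: "standard" vs. idle

print(f"Idle core dynamic power: ~{gpu_2d / gpu_3d:.0%} of the full-clock level")
print(f"Idle memory dynamic power: ~{mem_2d / mem_3d:.0%} of the full-clock level")
# 110/800 is about 14% and 252/700 is 36%: large savings are plausible from
# the clock reduction alone, even before voltage scaling is considered.
```

So even without a wattmeter, the reported clocks make a meaningful idle-power reduction very plausible.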
-
- *Lifetime Patron*
- Posts: 1809
- Joined: Sat Apr 24, 2004 1:45 am
- Location: At Home
Take a look at these two threads for details on this – One and Two.Alex wrote:I can tell you it seems like my HD 2600 is slowed down (automatically) with the ATI Catalyst 7.9 and 7.10.
At idle I have GPU 110 MHz, Memory 252 MHz according to ATI Overdrive ("standard" is GPU 800 MHz, Memory 700 MHz).
I can't promise that the power consumption is lowered though (can't measure).