GA-G33M-S2H mATX C2D board

All about them.

Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee

mimwdv
Posts: 110
Joined: Mon Dec 27, 2004 3:54 pm
Location: Sydney, Australia

GA-G33M-S2H mATX C2D board

Post by mimwdv » Thu Aug 30, 2007 10:46 pm

Has anyone had any experience with the GA-G33M-S2H board? I need to replace my GA-965P-S3 mobo, which seems to be causing lots of flaky problems in my system. This board seems to have everything I need feature-wise, but I'd be interested in any opinions, esp on undervolting, power consumption, etc.

I'm tempted to throw out my e6400 as well and just go for a new low-power AM2 system, but it seems like a waste since the e6400 is fine under stress testing. Any opinions either way? The system is mainly used for TV recording, transcoding, and file storage.

fresh
Posts: 89
Joined: Sun Dec 24, 2006 2:30 pm
Location: Slovenia

Post by fresh » Fri Aug 31, 2007 4:25 am

I am the owner of the Gigabyte G33M-DS2R motherboard and I can't say a single bad thing about it. With onboard graphics enabled and my E4500 rev. M0, total system power consumption at idle is 57 watts. In the BIOS you can overvolt or undervolt anything from the FSB and Vcore to the memory (1.8V-2.2V), FSB clocks are reported to go over 450MHz, and it supports 45nm Penryn processors.

GnatGoSplat
Posts: 98
Joined: Fri Jan 26, 2007 1:59 pm
Location: Battlefield, MO

Post by GnatGoSplat » Fri Aug 31, 2007 5:00 am

What kind of power consumption are you getting under load and what other peripherals do you have attached to the motherboard and psu?

fresh
Posts: 89
Joined: Sun Dec 24, 2006 2:30 pm
Location: Slovenia

Post by fresh » Fri Aug 31, 2007 5:39 am

I measured power consumption with Voltcraft Energy Check 3000. Connected are the following devices:

Seasonic S12 600W
G33M-DS2R, onboard graphics
E4500 2.2GHz, rev. M0 (supposedly only 8 watts in C1E mode); undervolted to 1.1V
2x1GB Geil Ultra DDR2 800MHz
NEC DVD-ROM
Samsung HD501LJ, 500GB disk
Zalman CNPS 9500 at 3V - silent
Fan regulator
(everything else is passively cooled, including the chipset with a Thermalright HR-05)

IDLE: 56.6W
LOAD: 75.5W (two instances of Prime95 running)
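fresh's wall readings can be turned into rough DC numbers with one line of arithmetic. The ~80% PSU efficiency figure below is an assumption for illustration, not a measured value for this particular S12:

```python
# Estimate the extra DC power drawn under load from fresh's wall readings.
# PSU_EFFICIENCY is an assumed figure, not measured for this S12-600.
PSU_EFFICIENCY = 0.80

def dc_power(ac_watts, efficiency=PSU_EFFICIENCY):
    """Convert AC wall draw to estimated DC draw inside the case."""
    return ac_watts * efficiency

idle_ac, load_ac = 56.6, 75.5
delta_dc = dc_power(load_ac) - dc_power(idle_ac)
print(round(delta_dc, 1))  # ~15.1W of extra DC draw with two Prime95 instances
```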

smilingcrow
*Lifetime Patron*
Posts: 1809
Joined: Sat Apr 24, 2004 1:45 am
Location: At Home

Post by smilingcrow » Sat Sep 01, 2007 9:16 am

The G33M-S2H is a power hog according to this review. I have a G33M-S2 and posted some data on it here. Unfortunately the video quality through the D-SUB is below par at 1680x1050 and it has no DVI, which means I might as well have bought a P35 board. I’m going to see if they sell a DVI ADD2 card for it.

smilingcrow
*Lifetime Patron*
Posts: 1809
Joined: Sat Apr 24, 2004 1:45 am
Location: At Home

Post by smilingcrow » Sun Sep 02, 2007 5:33 am

fresh wrote:I measured power consumption with Voltcraft Energy Check 3000.

IDLE: 56.6W
LOAD: 75.5W (two instances of Prime95 running)
Which setting did you use for Prime95? That's a very low power draw.

mimwdv
Posts: 110
Joined: Mon Dec 27, 2004 3:54 pm
Location: Sydney, Australia

Post by mimwdv » Wed Sep 05, 2007 3:07 pm

If Fresh's results can be applied to the S2H, it seems like it'd be a good solution, although that review is a bit of a worry. I'll get that board ordered and hope for the best. Thanks for the help!

Flandry
Posts: 84
Joined: Wed Sep 21, 2005 8:59 pm
Location: IHTFP, MA

Post by Flandry » Fri Sep 07, 2007 3:09 pm

I am leaning toward the G33M-DS2R for my new system, but I hadn't been able to tell that it had undervolting options in BIOS. Every screenshot i've seen just shows options to increment by different amounts.

Thanks for sharing your experience, fresh. What's the lowest Vcore you can set? My laptop can't go below 0.7V, which is much more than the processor needs at the lowest multiplier.

Also, is it possible to set the multiplier lower than the minimum value that EIST uses?

smilingcrow
*Lifetime Patron*
Posts: 1809
Joined: Sat Apr 24, 2004 1:45 am
Location: At Home

Post by smilingcrow » Sat Sep 08, 2007 12:43 am

Flandry wrote:I am leaning toward the G33M-DS2R for my new system, but I hadn't been able to tell that it had undervolting options in BIOS. Every screenshot i've seen just shows options to increment by different amounts.
That is for the RAM, FSB etc. The CPU gives a ridiculously long list from 0.5V up to some crazy high number, at least in every Gigabyte board that I’ve seen in recent years including my G33M-S2.
Flandry wrote:My laptop can't go below 0.7V, which is much more than the processor needs at the lowest multiplier.
How did you manage to test that the CPU is stable below 0.7V? That’s Looooooooooooooooow.
Flandry wrote:Also, is it possible to set the multiplier lower than the minimum value that EIST uses?
I’ve never seen this as an option, I guess it’s a hardware limitation of the CPU.

Flandry
Posts: 84
Joined: Wed Sep 21, 2005 8:59 pm
Location: IHTFP, MA

Post by Flandry » Sat Sep 08, 2007 8:18 am

smilingcrow wrote:That is for the RAM, FSB etc. The CPU gives a ridiculously long list from 0.5V up to some crazy high number, at least in every Gigabyte board that I’ve seen in recent years including my G33M-S2.
Ah, thanks, that's great to know. That also puts some rumors i had heard, that the mobo Vcore limits are CPU-dependent, to rest.
How did you manage to test that the CPU is stable below 0.7V? That’s Looooooooooooooooow.
Ah :oops:, well i don't know it, but it runs stable at both 6x and 8x at 0.7V, so i assume it could handle <0.7V at 6x. I imagine a good MoDT board or whatever the acronym is could do <0.7V in BIOS. 0.7 is probably the lowest VID on the P-M or something. I really can't complain -- it's incredible how much power and heat reduction i get even with the voltage drop i can access.
I’ve never seen this as an option, I guess it’s a hardware limitation of the CPU.
Ah rats. I think perhaps AMD CPUs are different in that regard. It makes EIST almost worthless with these new 1+GHz FSB CPUs -- they have multipliers not much more than 6. :roll:

Thanks for the info!

Another question occurred to me, as i've noticed that the -DS2R is not "HDCP certified" according to Giga's page, while the S2H is. From a query i've made on HOCP (about whether mobos have to be "HDCP" compliant), it sounds like that's purely because of the digital video output on the S2H versus the VGA d-sub on the DS2R, and that HDCP means nothing for a mobo per se, just for the video card/output. I'm not sure where that leaves audio, though. Anyone have insights on this whole Hollywood mess?

smilingcrow
*Lifetime Patron*
Posts: 1809
Joined: Sat Apr 24, 2004 1:45 am
Location: At Home

Post by smilingcrow » Sat Sep 08, 2007 10:27 am

Flandry wrote:Ah :oops:, well i don't know it, but it runs stable at both 6x and 8x at 0.7V, so i assume it could handle <0.7V at 6x. I imagine a good MoDT board or whatever the acronym is could do <0.7V in BIOS. 0.7 is probably the lowest VID on the P-M or something. I really can't complain -- it's incredible how much power and heat reduction i get even with the voltage drop i can access.
What CPU does the laptop have that runs at 0.7V and how did you set it at that value?
Ah rats. I think perhaps AMD CPUs are different in that regard. It makes EIST almost worthless with these new 1+GHz FSB CPUs -- they have multipliers not much more than 6. :roll:
That’s something that Intel needs to address. I’m happy with the 1 and 2MB cache chips with a FSB of 200 and high multipliers. People who want quad cores are stuck with 266, and soon 333 when Penryn is released.

Flandry
Posts: 84
Joined: Wed Sep 21, 2005 8:59 pm
Location: IHTFP, MA

Post by Flandry » Sat Sep 08, 2007 11:10 am

smilingcrow wrote:What CPU does the laptop have that runs at 0.7V and how did you set it at that value?
It's a Pentium-M Dothan 1.7GHz chip. I use Notebook Hardware Control to set custom voltages for each multiplier, and also restrict the top multiplier to 12x on battery or 14x plugged in. The CPU rarely gets over 50°C like that (even when gaming), which means the fan doesn't ramp up much. I expect NHC is functionally the same as CrystalCPUID, except limited to laptops and much more user-friendly. It also lets me clock the Radeon down when i don't need to toast bread on the lappy.
That’s something that Intel needs to address. I’m happy with the 1 and 2MB cache chips with a FSB of 200 and high multipliers. People who want quad cores are stuck with 266, and soon 333 when Penryn is released.
Putting people in the situation where they are trading off memory bandwidth for low power idling is a sad state. I really don't know how Intel will deal with it. Hopefully it's simply a matter of CPU microcode and BIOS updates for mobos, but if that's the case i don't know why they haven't already done that for the high FSB chips. It shows a rather cavalier attitude toward energy savings, imo.

I do wonder though -- haven't the newer (higher FSB) C2D chips been released with lower thermal power or at least with claimed lower power consumption? If refinements to the fab are reducing waste heat with the higher FSB processors, it may be both faster and more energy efficient to go that way. Can someone confirm or dispel those rumors?

I was wondering if the FSB could be cranked down low to make an equivalent to a low multiplier. Does this mobo let you set the FSB to lower values than the CPU is designed for, or only >=? How low does it go?

Someone recommended the Abit AN-M2 mobo to me. I couldn't find much about it but what i did see really didn't grab me. I don't know why I am resisting AMD so much. :/

smilingcrow
*Lifetime Patron*
Posts: 1809
Joined: Sat Apr 24, 2004 1:45 am
Location: At Home

Post by smilingcrow » Sat Sep 08, 2007 1:46 pm

Flandry wrote:It's a Pentium-M Dothan 1.7GHz chip. I use Notebook Hardware Control to set custom voltages for each multiplier, and also restrict the top multiplier to 12x on battery or 14x plugged in. The CPU rarely gets over 50°C like that (even when gaming), which means the fan doesn't ramp up much. I expect NHC is functionally the same as CrystalCPUID, except limited to laptops and much more user-friendly. It also lets me clock the Radeon down when i don't need to toast bread on the lappy.
I’m suspicious about utilities that offer to lower VCore to such low levels as from my experience they don’t actually do what they say they are doing; CrystalCPUID is guilty of this.
Although I can’t remember what Dothan’s voltage settings are so maybe what you are seeing is accurate? I prefer to test the power consumption whilst changing the VCore to confirm the settings are real.
Flandry wrote:Putting people in the situation where they are trading off memory bandwidth for low power idling is a sad state.
It’s not so bad in practice. I have an E2160 and the difference in power draw at 1.8GHz between a FSB of 200 and 300 is 3W at both idle and load. That’s not a lot when you consider that it’s a 50% increase in bus speed and RAM speed; DDR2-667 versus DDR2-1000.

Intel gets it right in the current mobile platform, as it halves the FSB speed at idle as well as lowering the multi to 6.
Flandry wrote:I really don't know how Intel will deal with it. Hopefully it's simply a matter of CPU microcode and BIOS updates for mobos, but if that's the case i don't know why they haven't already done that for the high FSB chips. It shows a rather cavalier attitude toward energy savings, imo.
It’s possible that they may have issues with circuit switching speeds, so they can’t jump from a multi of say 4 to 12 quickly enough; that’s one theory I’ve come across.
If it was simple to rectify I figure they would have by now or at least with Penryn which seems to have the same minimum multi of 6.
Flandry wrote:I do wonder though -- haven't the newer (higher FSB) C2D chips been released with lower thermal power or at least with claimed lower power consumption? If refinements to the fab are reducing waste heat with the higher FSB processors, it may be both faster and more energy efficient to go that way. Can someone confirm or dispel those rumors?
The newer revisions of both the 200 and 333MHz FSB chips are more efficient at load and also idle I believe. The Q6600 (G0 revision) seems to under-volt very well.
Flandry wrote:I was wondering if the FSB could be cranked down low to make an equivalent to a low multiplier. Does this mobo let you set the FSB to lower values than the CPU is designed for, or only >=? How low does it go?
I don’t think that’s the best solution as memory bandwidth will get even more constrained. I have lowered the FSB of a 266MHz C2D to well below stock but haven’t tried with a 200MHz FSB chip; I’ll take a look to see how low it goes. The older chipsets that officially support Pentium 4s with a FSB of 133MHz (533) might offer the best options here. My G33 and also the P35 only officially support 200MHz and higher. The P965 would be a better option.

Flandry
Posts: 84
Joined: Wed Sep 21, 2005 8:59 pm
Location: IHTFP, MA

Post by Flandry » Sat Sep 08, 2007 4:55 pm

smilingcrow wrote:
Flandry wrote:It's a Pentium-M Dothan 1.7GHz chip. I use Notebook Hardware Control to set custom voltages for each multiplier, and also restrict the top multiplier to 12x on battery or 14x plugged in. The CPU rarely gets over 50°C like that (even when gaming), which means the fan doesn't ramp up much. I expect NHC is functionally the same as CrystalCPUID, except limited to laptops and much more user-friendly. It also lets me clock the Radeon down when i don't need to toast bread on the lappy.
I’m suspicious about utilities that offer to lower VCore to such low levels as from my experience they don’t actually do what they say they are doing; CrystalCPUID is guilty of this.
Although I can’t remember what Dothan’s voltage settings are so maybe what you are seeing is accurate? I prefer to test the power consumption whilst changing the VCore to confirm the settings are real.
I was really questioning myself when i read this, because i've definitely seen that on my P3 board (i can set the FSB to anything i want with ClockGen or whatever the program that supports that PLL is called -- i can't remember -- and it actually does nothing). I tried setting the laptop Vcore to 0.8V and watched the battery drain numbers to see if there was any perceptible change at 6x. Well, too many things were going on to get a stable drain reading, but then i remembered that when i tested the different multipliers way back when, 10x worked at 0.748V -- but not at 0.732V. Likewise 12x worked at 0.828V, but not stably below that. So i have good evidence that those are real voltages. However, the original Pentium-M was so good that it became the basis for all modern Intel desktop processors, so i don't think 0.7V is too unbelievable -- although it is incredible considering how high a voltage CPUs still require.
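For what it's worth, Flandry's two data points (10x stable at 0.748V, 12x at 0.828V) can be extrapolated down to the 6x multiplier. This is a naive linear fit for illustration only; minimum-voltage curves aren't really linear, so treat the result as a guess, not a safe setting:

```python
# Linearly extrapolate Flandry's stable-Vcore data points down to 6x.
# Naive illustration only: real Vmin-vs-frequency curves are not linear.
points = {10: 0.748, 12: 0.828}  # multiplier -> lowest stable Vcore (V)
(m1, v1), (m2, v2) = sorted(points.items())
slope = (v2 - v1) / (m2 - m1)  # volts per multiplier step

def vmin_estimate(multiplier):
    """Estimated minimum stable Vcore for a given multiplier."""
    return v1 + slope * (multiplier - m1)

print(round(vmin_estimate(6), 3))  # ~0.588V, consistent with "below 0.7V at 6x"
```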

BTW, according the battery drain circuitry, the power consumption was around 12W when idling, with the LCD backlight on high. :shock:
Intel gets it right in the current mobile platform, as it halves the FSB speed at idle as well as lowering the multi to 6.
Well yeah, it's obviously most useful in that context.

The newer revisions of both the 200 and 333MHz FSB chips are more efficient at load and also idle I believe. The Q6600 (G0 revision) seems to under-volt very well.
My French is soo bad. :oops: Props to those mad frenchies for a very thorough review, though! oO The problem with that table for me is that i have no intuition what any of those chips have as FSB by looking at them, so it's just so many numbers. I gather you are saying that the new 200 and 333 (that's 800/1333 FSB effective, right?) chips are the standouts (leaving the ~1000 FSB ones inferior??), though. Right? The 6750 looks tasty...
I don’t think that’s the best solution as memory bandwidth will get even more constrained. I have lowered the FSB of a 266MHz C2D to well below stock but haven’t tried with a 200MHZ FSB chip; I’ll take a look to see how low it goes. The older chipsets that officially support Pentium 4’s with a FSB of 133MHz (533) might offer the best options here. My G33 and also the P35 only officially support 200MHz and higher. The P965 would be a better option.
Well, the real problem is that i want too much range of performance from this system. I want to be able to game on it sometimes, and when not, scale it right down as far as possible. As long as it's possible to set really low FSBs/Vcores, etc, things can be tweaked until i find something i like or hit a chip wall. If it's not possible, i'll get annoyed. :lol:

Thanks for that info. Anyway, as long as 200MHz is supported, i could get a 333MHz chip and have plenty of room on either end to clock according to whim. ;)

smilingcrow
*Lifetime Patron*
Posts: 1809
Joined: Sat Apr 24, 2004 1:45 am
Location: At Home

Post by smilingcrow » Sun Sep 09, 2007 7:40 am

Flandry wrote:So i have good evidence that those are real voltages. However, the original Pentium-M was so good that it became the basis for all modern Intel desktop processors, so i don't think 0.7V is too unbelievable -- although it is incredible considering how high a voltage CPUs still require.
I hear you. I didn’t realise that Dothan allowed such low voltages to be set from software, the limit with Core (2) Duo seems to be about 0.95V for mobile parts. I installed Notebook Hardware Control on a MODT system with a Core Duo and it went as low as 0.95V the same as RMClock. I wonder how much there is to be gained from going below that level as according to tests they don’t consume much power at idle anyway.
Flandry wrote:I gather you are saying that the new 200 and 333 (that's 800/1333 FSB effective, right?) chips are the standouts (leaving the ~1000 FSB ones inferior??), though. Right? The 6750 looks tasty...
The new revisions are for the 200 and 333MHz FSB chips only it seems.
Flandry wrote:Well, the real problem is that i want too much range of performance from this system. I want to be able to game on it sometimes, and when not, scale it right down as far as possible. As long as it's possible to set really low FSBs/Vcores, etc, things can be tweaked until i find something i ilike or hit a chip wall. If it's not possible, i'll get annoyed. :lol:

Thanks for that info. Anyway, as long as 200MHz is supported, i could get a 333MHz chip and have plenty of room on either end to clock according to whim. ;)
I tested my G33 board with an E2160 (L2) and it was stable (tested for a short period with Prime95) at 150MHz but wouldn’t post at 100, 120 or 125 so I left it at that. Here’s some basic data at idle:

FSB (MHz) / CPU (MHz) / Vcore (V) / RAM (MHz) / Watts

150 / 900 / 0.950 / 500 / 49
200 / 1200 / 1.025 / 667 / 51.5
300 / 1800 / 1.187 / 1000 / 57

The two lower speeds had the Vcore set in the BIOS as the system was stable at a voltage lower than that which Speedstep could provide; in this case using Speedstep to control voltage would have increased power consumption.
The 300MHz FSB test was done with VCore on standard setting in the BIOS using the lowest Speedstep setting with RMClock. This allowed the system to run stably at 2.7GHz using the maximum Speedstep voltage of 1.325V but required that it idled at 1.8GHz at 1.162V (the minimum Speedstep setting) when it would have been stable at 1.025V if it had been possible to use that. You can’t have it all when it comes to Speedstep and manual voltage control unfortunately.

For me the data points to the E4600 (not released yet) as being the ideal Core 2 Duo, as it has a FSB of 200, a multi of 12, 2MB cache, and will use the power-efficient M0 revision. Even overclocked to 3GHz (a 250MHz FSB) it will idle at 250 * 6 = 1.5GHz, which might be a very decent trade-off between low power at idle and high performance.
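The clock arithmetic behind that conclusion is simply FSB times multiplier, which a couple of lines make explicit (the 250MHz figure is the hypothetical overclock above, not a stock E4600 setting):

```python
# Core 2 clock speed is FSB x multiplier; Speedstep drops the
# multiplier to its minimum of 6 at idle but leaves the FSB alone.
def core_clock_mhz(fsb_mhz, multiplier):
    """Resulting core clock in MHz for a given FSB and multiplier."""
    return fsb_mhz * multiplier

print(core_clock_mhz(250, 12))  # 3000 MHz loaded on the hypothetical 250MHz FSB
print(core_clock_mhz(250, 6))   # 1500 MHz at the Speedstep minimum multiplier
```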

Lensman
Posts: 147
Joined: Thu Jul 26, 2007 1:15 am

Post by Lensman » Sun Sep 09, 2007 8:11 am

mimwdv wrote:If Fresh's results can be applied to the S2H it seems like it'd be a good solution, although that review is a bit of a worry. I'll get that board ordered and hope for the best. thanks for the help!
You didn't mention whether you wanted or needed the DVI/HDMI output of the S2H.

A low-to-mid range video card for DVI out will add 10-30W at idle, negating any power advantage the S2H's competitors have.

smilingcrow
*Lifetime Patron*
Posts: 1809
Joined: Sat Apr 24, 2004 1:45 am
Location: At Home

Post by smilingcrow » Sun Sep 09, 2007 9:14 am

You can use an ADD2 card with the G33 chipset for DVI or HDMI if you can find one that’s compatible! I’m looking into this subject in this thread.

mimwdv
Posts: 110
Joined: Mon Dec 27, 2004 3:54 pm
Location: Sydney, Australia

Post by mimwdv » Sun Sep 09, 2007 6:28 pm

Lensman wrote:
mimwdv wrote:If Fresh's results can be applied to the S2H it seems like it'd be a good solution, although that review is a bit of a worry. I'll get that board ordered and hope for the best. thanks for the help!
You didn't mention whether you wanted or needed the DVI/HDMI output of the S2H.

A low-to-mid range video card for DVI out will add 10-30W at idle, negating any power advantage the S2H's competitors have.
Yeah, I'm looking for the HDMI/HDCP out. I'm planning to make this an HD 1080p capable HTPC eventually, provided I can get it to run stable with the new mobo and the onboard graphics handles the HD OK. My current HTPC will then get retired, and I'll buy a super low power consumption setup for the server. If the C2D system doesn't cut it for HD or proves to still be unstable, I'll look at replacing both systems in ~18 months.

jebusau
Posts: 10
Joined: Sun Apr 30, 2006 9:24 pm

Post by jebusau » Wed Sep 12, 2007 4:49 am

i think the extra power that this board uses is due to the SiI1392 chip (page 8 of the manual), which is used for the HDMI/DVI output.

smilingcrow
*Lifetime Patron*
Posts: 1809
Joined: Sat Apr 24, 2004 1:45 am
Location: At Home

Post by smilingcrow » Thu Sep 13, 2007 6:39 am

jebusau wrote:i think the extra power that this board uses is due to the SiI1392 chip (page 8 of the manual), which is used for the HDMI/DVI output.
The difference between the Gigabyte and the Elitegroup board is 12W at idle and 9W at load.
Could an HDMI/DVI transmitter account for anywhere near 12W? A 7600GS 256MB only adds about 10W at idle over an IGP, and that's a full GPU with RAM. Unless the HDCP support is a power hog!

Some motherboards do over-volt by default which would distort the values but I’ve looked at a lot of Gigabyte boards in the last 12 months and none of them seemed to do this. If the System Voltage Control in the BIOS was set to Auto that would explain the extra power consumption at idle, but it would typically mean that the difference at load would be even greater which isn’t the case here.

jebusau
Posts: 10
Joined: Sun Apr 30, 2006 9:24 pm

Post by jebusau » Thu Sep 13, 2007 6:15 pm

smilingcrow wrote:
jebusau wrote:i think the extra power that this board uses is due to the SiI1392 chip (page 8 of the manual), which is used for the HDMI/DVI output.
The difference between the Gigabyte and the Elitegroup board is 12W at idle and 9W at load.
Could an HDMI/DVI transmitter account for anywhere near 12W? A 7600GS 256MB only adds about 10W at idle over an IGP, and that's a full GPU with RAM. Unless the HDCP support is a power hog!

Some motherboards do over-volt by default which would distort the values but I’ve looked at a lot of Gigabyte boards in the last 12 months and none of them seemed to do this. If the System Voltage Control in the BIOS was set to Auto that would explain the extra power consumption at idle, but it would typically mean that the difference at load would be even greater which isn’t the case here.
yer, i thought the same thing... i was hoping someone here could shed some light on whether or not an HDMI processor could account for that difference.

i googled the chip model and the only results weren't in English

smilingcrow
*Lifetime Patron*
Posts: 1809
Joined: Sat Apr 24, 2004 1:45 am
Location: At Home

Post by smilingcrow » Fri Sep 14, 2007 12:29 am

jebusau wrote:i googled the chip model and the only results weren't in English
Try here.

Power Management
• Low-power 1.8V core operation
• Low-power standby mode
• Flexible power-down modes

Flandry
Posts: 84
Joined: Wed Sep 21, 2005 8:59 pm
Location: IHTFP, MA

Post by Flandry » Fri Sep 14, 2007 5:54 am

Hmm, they don't even give power consumption in the two page specs PDF.

I really think Gigabyte has some flawed design tendencies or something. Their Geforce 8600 cards also came out with much higher power consumption than comparable cards in every comparative review i could find.

It irks me enough that i've put off upgrading. I was all set to get one of these for the many tweaking options, but seeing how power hungry the board is has put me back in the ambivalent category.

Lensman
Posts: 147
Joined: Thu Jul 26, 2007 1:15 am

Post by Lensman » Fri Sep 14, 2007 9:11 am

I can't see that chip using 10W at idle, but given how much Silicon Image's marketing materials emphasize power conservation, it may be that HDMI chips are power hungry.

smilingcrow
*Lifetime Patron*
Posts: 1809
Joined: Sat Apr 24, 2004 1:45 am
Location: At Home

Post by smilingcrow » Fri Sep 14, 2007 9:48 am

Lensman wrote:I can't see that chip using 10W at idle, but with all the power conservation in the marketing materials for Silicon Image, it may be that HDMI chips are power hungry.
Doesn’t HDMI = DVI + Audio + HDCP?
A DVI ADD2 card is low power and a HDMI transmitter is unlikely to be fabricated on an older process than a DVI transmitter due to it being a newer product.
Mixing digital audio can’t take much power either, which leaves HDCP as the possible culprit. But isn’t HDCP encryption only applied to protected content, meaning that when connecting a PC to a TFT for other Windows activities no HDCP encryption is applied? I don’t think the SiI1392 HDMI transmitter is the power hog here; I think it may be the four-phase voltage regulator.

BTW, Scan computers in the UK have the GA-G33M-S2H on offer at £62.86 plus shipping over the weekend.

Mikael
Posts: 206
Joined: Mon Dec 06, 2004 3:12 am
Location: Gothenburg, Sweden

Post by Mikael » Fri Sep 21, 2007 3:37 am

I am now the owner of a Gigabyte GA-G33M-S2H. Here's the rest of the config:

Corsair HX520W (230VAC mains)
E6600 @ 3.00GHz (333*9, 1.17V)
2GB DDR2 @ 833MHz 5-5-5-15 (1.9V)
Scythe Ninja rev. B with Scythe S-Flex on 950RPM
Samsung T166 320GB (HD321KJ)
D-Link DWL-G520 WLAN PCI card

Idle: 79W
Load: 124W

Using SPCR data for the HX520, it can be estimated that the system needs 55-60W during idle. A rough estimate for the board itself would be ~25W. Not great, but hardly a catastrophe.
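Mikael's estimate is just wall draw scaled by PSU efficiency. The 74% figure below is an assumed mid-load efficiency chosen for illustration, not a number taken from SPCR's HX520 review:

```python
# Back-of-envelope DC draw: AC wall reading times PSU efficiency.
# EFFICIENCY is an assumed value for illustration, not SPCR's measurement.
EFFICIENCY = 0.74

def dc_draw(ac_watts, efficiency=EFFICIENCY):
    """Estimated DC power delivered by the PSU for a given wall reading."""
    return ac_watts * efficiency

print(round(dc_draw(79)))  # ~58W at idle, inside the quoted 55-60W range
```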

smilingcrow
*Lifetime Patron*
Posts: 1809
Joined: Sat Apr 24, 2004 1:45 am
Location: At Home

Post by smilingcrow » Tue Sep 25, 2007 11:11 am

I just tested the G33M-S2H on exactly the same system as I’m using for my G33M-S2 and the power consumption is 0.5W more at idle and the same at load. So ignore the Tom's Hardware review, as it's completely misleading.
I’ll post more power data later but the board can idle in the 50 – 56W range depending on the CPU and how far you undervolt it. At last we seem to have a good G33 board with DVI/HDMI. :D

smilingcrow
*Lifetime Patron*
Posts: 1809
Joined: Sat Apr 24, 2004 1:45 am
Location: At Home

Post by smilingcrow » Tue Sep 25, 2007 8:55 pm

I can’t get SpeedFan to work on the G33M-S2H; I’ve tried 4.32 and 4.33 with no joy. Has anybody managed to get it working?
I disabled the fan management in the BIOS, which is what I needed to do on my G33M-S2 to get it working with SpeedFan; I’ll try other options.

The DVI connection doesn’t see my monitor either. :(
The F1 BIOS has serious issues with PS2 keyboards and the F2 doesn’t seem 100% either.
My G33M-S2 hasn’t given me problems so this is disappointing considering how similar they are.

Added. The DVI connection works with another monitor but just not my Viewsonic. :(

mimwdv
Posts: 110
Joined: Mon Dec 27, 2004 3:54 pm
Location: Sydney, Australia

Post by mimwdv » Wed Nov 28, 2007 4:40 pm

After much deliberation and tinkering I've finally got my new system configured and stable:
E6400 (stock speeds @ 1.0V)
7700-AlCu with Nexus fan mod
G33M-S2H
2x 2.5" HDs
3x digital tuners
PW-200-V, 80W brick
Antec NSK1300 with 120mm Yate Loon fan
With this configuration it idles at 1.6GHz at ~62W; under Orthos load at 2.13GHz it draws ~84W; with Orthos and all 3 tuners recording, ~88W; and if I defrag a drive as well, ~90W.

I've got SpeedFan 4.33 controlling the CPU fan -- smilingcrow, did you ever get SpeedFan working with your S2H?

gb115b
*Lifetime Patron*
Posts: 289
Joined: Tue Jun 13, 2006 12:47 am
Location: London

Post by gb115b » Thu Nov 29, 2007 6:58 am

what settings did you use?

my MSI G33 has trouble with SpeedFan at the mo too

Post Reply