Thermaltake Duorb VGA Cooler: Are Two Orbs Better Than One?



Our test procedure is an in-system test, designed to determine whether the cooler is adequate for use in a low-noise system. By adequately cooled, we mean cooled well enough that no misbehavior related to thermal overload is exhibited. Thermal misbehavior in a graphics card can show up in a variety of ways, including:

  • Sudden system shutdown or reboot without warning.
  • Jaggies and other visual artifacts on the screen.
  • Motion slowing and/or screen freezing.

Any of these misbehaviors is annoying at best and dangerous at worst — dangerous to the health and lifespan of the graphics card, and sometimes to the system OS.

Test Platform

Measurement and Analysis Tools

  • ATI Tool version 0.26, used to stress the GPU
  • CPUBurn P6 processor stress software
  • SpeedFan version 4.33, used to monitor CPU and GPU temperatures
  • Seasonic Power Angel AC power meter, used to monitor the power consumption of the system
  • A custom-built internal variable fan speed controller to power the system fan
  • Calibrated strobe light to measure fan RPM
  • A custom-built external variable fan speed controller to power the VGA heatsink fan (if applicable)
  • Bruel & Kjaer (B&K) model 2203 Sound Level Meter, used to accurately measure SPL (sound pressure level) down to 20 dBA and below

A summary of how our video card/cooler test platform is put together can be found here.

Our main test consists of ATITool's artifact scanner running in conjunction with CPUBurn to stress the graphics card and processor simultaneously. It is a realistic test that mimics the load a modern video game places on the CPU and GPU, only more consistently. The software is left running until the GPU temperature has been stable for at least 10 minutes, at which point both the CPU and GPU temperatures are recorded. We also measure the system's overall noise level and power consumption, using the B&K sound level meter and the Seasonic Power Angel respectively. If the heatsink has a fan, the procedure is repeated at various fan speeds while the system fan is left at its lowest setting of 7V. If the cooler is passive, the system fan speed is varied instead, to study the effect of system airflow on the heatsink's performance. If artifacts are detected in ATITool or other instability is noted, the heatsink is deemed inadequate to cool the video card in our test system.
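The stabilization criterion above — recording temperatures only after the GPU has held steady for at least 10 minutes — can be sketched as a simple check over a log of per-minute readings. This is a hypothetical illustration, not our actual tooling (temperatures are read from SpeedFan); the function name and the 1°C tolerance are assumptions.

```python
def is_stabilized(readings, window=10, tolerance=1.0):
    """Return True if the last `window` readings (one per minute)
    vary by no more than `tolerance` degrees C."""
    if len(readings) < window:
        return False
    recent = readings[-window:]
    return max(recent) - min(recent) <= tolerance

# Example: temperature climbs under load, then plateaus.
temps = [55, 62, 67, 70, 72, 73, 73, 74, 73, 74, 73, 73, 74, 73, 74, 73]
print(is_stabilized(temps))  # True: the last 10 readings span only 1 degree
```

In practice, a wide window like this guards against recording a temperature that is still slowly creeping upward, which would understate the cooler's steady-state load temperature.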

Preliminary testing is also done at idle, and with only CPUBurn running, for comparison. For idle results, the system is left idle for 10 minutes before ATITool is loaded, and the first temperature it reports is used. We do this because, on our test platform, loading ATITool puts some kind of stress on the GPU, causing the temperature to climb immediately (even if the system has been idle for hours beforehand) and power consumption to increase by approximately 10W. We theorize that the card initially sits in 2D mode, underclocked or undervolted (or possibly both), and that ATITool automatically puts it in 3D mode, which would account for the rise in temperature and power draw. ATITool is left running in the background for the remainder of testing, which is why the GPU temperature during CPUBurn appears higher than at idle. Consider this the difference between 2D idle and 3D idle.

The ambient conditions were 25°C and 14 dBA at the time of testing.

