Hardware.info 120mm roundup
Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee
- *Lifetime Patron*
- Posts: 1288
- Joined: Sat Oct 25, 2003 3:21 pm
- Location: 15143, USA
- Site Admin
- Posts: 12285
- Joined: Sun Aug 11, 2002 3:26 pm
- Location: Vancouver, BC, Canada
Re: Hardware.info 120mm roundup
It's a herculean effort, and despite issues with testing methods, it could be useful.
Re: Hardware.info 120mm roundup
Their testing methods weren't that bad. It's not like they tested a silent server and used Photoshop for benchmarks instead of file transfers, DLNA, HTTP, or virtual machines; that would be truly useless.
Re: Hardware.info 120mm roundup
MikeC wrote: It's a herculean effort, and despite issues with testing methods, it could be useful.
You could use this info to cherry-pick new candidates for the next SPCR fan roundup, like the best performers in low-noise airflow: the Noiseblocker M12 and eLoop, or the BeQuiet PW2.
- Posts: 41
- Joined: Wed Jul 13, 2011 4:11 am
Re: Hardware.info 120mm roundup
Unfortunately this is a criticism I've always had, and it is still hauntingly relevant to this day: Hardware.info always favours sheer quantity over quality. Moreover, they rarely check their individual results. It seems as if they have a conveyor belt and an intern set up to run the tests: no re-tests, no averaging of results, often not even the slightest sanity check.
As a Dutch person, I only have two reasonably large native-language tech sites: hardware.info and tweakers.net. So I visit hardware.info quite regularly, and I'm consistently disappointed.
To give some idea of how bad their tests are, here's a correlation graph I made from their monitor database about a year and a half ago:
This graph compares two sites: Prad.de (an extremely well-regarded display-testing review site) and Hardware.info. I took every screen that both sites have tested from their databases and plotted the difference in their maximum-luminosity and maximum-power-consumption figures. Both sites list these parameters explicitly.
On the horizontal axis is (power consumption prad.de / power consumption hardware.info) - 1; on the vertical axis, (max luminosity prad.de / max luminosity hardware.info) - 1. That is, a figure of 0 on both axes means the sites agree completely, 0.1 means they differ by 10%, and 0.5 means they differ by 50%.
What we see in this graph is that only 39% of the data points (14 out of 36) agree within 10%, which I'd say is already a fairly lax margin of error. The large majority of displays fall outside this tolerance band, which is unlikely to be just measurement-equipment tolerance, especially because the result is essentially uncorrelated with, e.g., display luminosity uniformity.
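For concreteness, here's a minimal sketch (in Python) of how the two axes and the 10%-agreement count are computed; the display names and measurement values below are made up for illustration, not taken from either site's database:

```python
# Sketch of the comparison metric described above. All values are
# invented for illustration; they are not real database entries.

# (prad.de value, hardware.info value) per display, for two parameters:
# maximum power consumption (W) and maximum luminosity (cd/m2).
displays = {
    "Display A": {"power": (45.0, 44.2), "lum": (310.0, 305.0)},  # agrees
    "Display B": {"power": (52.0, 40.1), "lum": (350.0, 262.0)},  # diagonal outlier
    "Display C": {"power": (38.0, 37.5), "lum": (280.0, 215.0)},  # on-axis outlier
}

def rel_diff(prad, hwi):
    """Relative difference: 0 = perfect agreement, 0.1 = 10% apart."""
    return prad / hwi - 1

TOLERANCE = 0.10  # the 10% band discussed above
agree = 0
for name, params in displays.items():
    x = rel_diff(*params["power"])  # horizontal axis of the plot
    y = rel_diff(*params["lum"])    # vertical axis of the plot
    within = abs(x) <= TOLERANCE and abs(y) <= TOLERANCE
    agree += within
    print(f"{name}: x={x:+.2f}, y={y:+.2f}, within 10%: {within}")

print(f"{agree}/{len(displays)} displays agree within the {TOLERANCE:.0%} band")
```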
The big issue here is that hardware.info has been extremely lazy and didn't check whether the display was at full brightness before testing. Most results lie along a diagonal toward the top-right, meaning prad.de measured both higher maximum luminance and higher power consumption than hardware.info for their 'maximum' data. In other words, prad.de properly set the display to maximum brightness, while hardware.info just plugged the display in, forgot to have a brain, and noted the out-of-factory results.
Worse still are the couple of results that lie on the vertical or horizontal axis. Here they did remember to set the display to maximum brightness, but they forgot to measure either power consumption or luminosity at that point.
The problem with Hardware.info is that for any 'mass test' they do, I can plot a graph like this, compare it to a reputable secondary source, and show that *consistently* more than half of their test results are wrong. They've done a test of AC power meters (my field of interest): almost entirely wrong. They've done tests of SSD power consumption: more than 90% wrong.
In summary: just never use hardware.info test results. They're beyond broken.
- Site Admin
- Posts: 12285
- Joined: Sun Aug 11, 2002 3:26 pm
- Location: Vancouver, BC, Canada
Re: Hardware.info 120mm roundup
multiplexer wrote: In summary: just never use hardware.info test results. They're beyond broken.
Thanks for the thorough warning.
Re: Hardware.info 120mm roundup
Have you seen the hardware.fr dossiers (in French)? I think they're pretty good; I like their interactive graphics.
40 120mm PWM fans (June 2012)
63 120mm DC fans (September 2012)
40 140mm fans (February 2013)
18 heatsinks under 40€ (September 2014)
- *Lifetime Patron*
- Posts: 1288
- Joined: Sat Oct 25, 2003 3:21 pm
- Location: 15143, USA
Re: Hardware.info 120mm roundup
multiplexer wrote: Prad.de (an extremely well-regarded display testing review website)
No fooling. I recall being blown away by Prad's LCD reviews when I discovered them a few years back.
Thanks for the Hardware.info info.