3. REDEFINING PERFORMANCE METRICS
Just as it emphasized the preeminence of clock speed, Intel has always supported numeric benchmark testing as a means of assessing processor performance. So it is truly fascinating that Intel now questions the relevance of such benchmarks. In a press-only session entitled A New Approach to Platform Evaluation, Intel's Performance, Benchmarking and Analysis Group stated,
"Platform performance tests where speed is the only coin of the realm is where we are today. This approach has served the industry well, but it needs to evolve to address changing usage models. Where we want to go is to move beyond speed obsessed metrics, and toward an approach where user experience drives the entire process."
Let's pause here and examine this quote in detail. "Move beyond speed obsessed metrics" toward a focus on user experience? Wow! These lines echo something I wrote to describe SPCR's focus many years ago: "The essence of our interest is the enhancement of the computing experience... You could call it ergonomics in the broadest sense." (This text comes from the About Us link at the top of any page in SPCR.)
I've said for years that CPU clock speed increases have been largely meaningless since the 1GHz barrier was breached; on their own, they have not significantly improved most PC users' computing experience. The practice of undervolting and underclocking the processor for cooler running had its genesis at SPCR over the past four years; the latter, especially, rests on the assumption that default processor speed is already more than adequate for a good computing experience. It's ironic that the company whose CPU development policies have been most antithetical to SPCR's aims is now using such similar language and disparaging an obsession with speed.
Here's another aside: Even the titles of some of the key individuals involved in these new platform evaluation projects are revealing.
Worldwide Client Capability Evangelist
Staff Human Factors Engineer
Subjective/Objective Media Expert
The first portion of this presentation on A New Approach to Platform Evaluation took aim at the holiest of all benchmarks, the Timedemo Gaming Benchmark, which is a foundation of almost every overclocking, performance, and gaming hardware review website in the world. It asked what relevance the results have to the actual gaming experience, as the slide below shows. As a large portion of the audience represented such websites, this attack on the long-standing tradition of timedemo gaming benchmarks raised hackles and prompted a slew of combative questions.
The presentation went on to demonstrate that time-demos work well for testing isolated 3D graphics performance, but they are less effective for platform testing.
Having torn down the old temple, Intel's Performance, Benchmarking and Analysis Group set out to build a new one, based on controlled scientific polling of users actually playing games on real PC systems.
The end results of this research are quite interesting and deserve to be discussed in detail elsewhere; they will be, in an article that is being prepared. But from the broad perspective, the most important outcome of this project is that the 175 users' assessments of the game-playing experience have been encoded into a new Gaming Capabilities Assessment Tool. In other words, the users' experience is at the very heart of the new assessment software.
A beta release of the new Gaming Capabilities Assessment Tool was provided to selected journalists at IDF Fall 2005.