Nasty CPU-Burn Bug in Microsoft Word!

Cooling Processors quietly

Moderators: NeilBlanchard, Ralf Hutter, sthayashi, Lawrence Lee

Post Reply
MikeC
Site Admin
Posts: 12285
Joined: Sun Aug 11, 2002 3:26 pm
Location: Vancouver, BC, Canada
Contact:

Nasty CPU-Burn Bug in Microsoft Word!

Post by MikeC » Fri Oct 18, 2002 10:16 am

Reader Matt Richards wrote in this morning about a nasty anomaly in Microsoft Word that pushes CPU usage to 100 percent "if the background spell checking option in the Works 2000 word processor is selected." Posted in news: http://www.silentpcreview.com/modules.p ... =0&thold=0
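
(If you want to check this on your own machine -- a quick sketch, not part of Matt's report, assuming the third-party psutil package; the Word process is usually named WINWORD.EXE:)

    # watch_word.py -- see whether Word is pegging the CPU.
    # Requires psutil (pip install psutil), a third-party package.
    import time
    import psutil

    for p in psutil.process_iter(['name']):
        name = p.info['name'] or ''
        if 'winword' in name.lower():
            p.cpu_percent(None)      # first call primes the counter
            time.sleep(1.0)          # measure over a one-second window
            print(name, p.cpu_percent(None), '% CPU')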

Red Dawn
Posts: 169
Joined: Fri Oct 04, 2002 11:46 am
Location: Stockholm

Post by Red Dawn » Fri Oct 18, 2002 10:56 am

And here I've just installed Word... :roll:

Is there a real fix on the way?

WillyTrombone
Posts: 16
Joined: Thu Oct 17, 2002 7:25 pm
Location: OC, California. Just OC. No preceding articles.

Post by WillyTrombone » Fri Oct 18, 2002 6:02 pm

yeah, openoffice.org :shock:

:D

Red Dawn
Posts: 169
Joined: Fri Oct 04, 2002 11:46 am
Location: Stockholm

Post by Red Dawn » Sat Oct 19, 2002 4:40 am

Doh!

Maybe I'll start writing stuff in edit.com... the processor usage should be minimal, to say the least... ;-)

LeoV
Patron of SPCR
Posts: 47
Joined: Sun Aug 11, 2002 3:26 pm

Post by LeoV » Sun Oct 20, 2002 11:08 am

This is certainly bad, but an even worse problem is that the person who reported it didn't "stress test" his system, as I bet many other quiet-PC enthusiasts don't. If a system cannot survive 100% CPU usage for long, that's a hardware issue, not a software one! Not to mention, thermal throttling kicks in around 80C, so it was probably not even running at 2GHz anymore...

I have always been a proponent of worst-case testing using programs from the CPUburn suite: BURNP6.EXE for the Pentium 4 and BURNK7.EXE for the Athlon. These programs push CPU power consumption beyond even the manufacturer's "expected max" values, producing higher temperatures than any "normal" software can attain. In my experience, BURNP6 out-heats Prime95, Hot CPU Tester, and other programs by a long margin!
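
(For the curious, the idea behind all of these burners is simply a tight loop of arithmetic pinned to every CPU. The sketch below is just that concept in Python -- not how CPUburn actually works internally, which is hand-tuned assembly chosen to draw maximum power:)

    # burn_sketch.py -- toy CPU burner: one busy process per logical CPU.
    # Concept only; CPUburn's hand-tuned assembly draws far more power
    # than a loop like this ever will. Stop it with Ctrl+C or kill.
    import multiprocessing

    def burn():
        x = 0.0
        while True:                          # endless floating-point work
            x = (x + 1.0000001) % 1000000.0

    if __name__ == "__main__":
        for _ in range(multiprocessing.cpu_count()):
            multiprocessing.Process(target=burn).start()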

My P4/2.26B @ 2.5GHz quiet system (system pic; more pics throughout this thread) attains a max of 66.0C CPU, with no errors and no thermal throttling, after running BURNP6.EXE for a long time. At 2.55GHz it would actually encounter errors past 65C -- but every other test would pass with flying colors, even MemTest86! Now that I have a known rock-stable config, I don't even get to 60C with most normal 100%-CPU apps -- the worst was 64C after a long UT2003 game with 20+ windows in the background. Idle temp is 37C.

My point is, why run at X GHz if you cannot actually use 100% of it, for good or evil? Rigorous stability testing in worst-case scenarios is mandatory for ultra-quiet system builders--it's already mandatory for overclockers, anyway.
Last edited by LeoV on Sun Oct 20, 2002 11:19 am, edited 1 time in total.

WillyTrombone
Posts: 16
Joined: Thu Oct 17, 2002 7:25 pm
Location: OC, California. Just OC. No preceding articles.

Post by WillyTrombone » Sun Oct 20, 2002 11:18 am

LeoV wrote: My point is, why run at X GHz if you cannot safely use 100% of it?
Marketing. It's the same reason they sell hard drives based on a gigabyte being 1000*1000*1000 bytes instead of the true 1024*1024*1024 (effectively telling you the hard drive is about 7.4 times bigger than it is), and why some vendors sell overclocked chips. The answer is (1) because they can, (2) because people like buying things with bigger numbers on them, and (3) because most consumers don't know they're getting something that doesn't perform as well as the ads imply.

LeoV
Patron of SPCR
Posts: 47
Joined: Sun Aug 11, 2002 3:26 pm

Post by LeoV » Sun Oct 20, 2002 11:23 am

Willy,
Two points. First, you meant 7.4% larger.
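
A quick check of the arithmetic (spelled out in Python, just for the record):

    >>> 1024.0**3 / 1000.0**3   # binary gigabyte vs. marketing gigabyte
    1.073741824                 # about 7.4% larger -- not 7.4 times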

Second, I wasn't talking about vendors' chips. I was talking about people building their own quiet systems but not testing them properly. Any OEM computer you buy will easily survive even a run of BURN?.EXE, because those machines are designed quite carefully. However, many people who build their systems to push the edge -- this includes overclockers as well as quiet-PC enthusiasts -- aren't aware of the need for very serious testing. As a result, the system may work perfectly 99% of the time, but it's actually a stability time bomb.

Red Dawn
Posts: 169
Joined: Fri Oct 04, 2002 11:46 am
Location: Stockholm

Post by Red Dawn » Sun Oct 20, 2002 11:42 am

Personally, I've gone from moderate overclocking to quieting down my computer a great deal, and during that journey I've come to learn that (at least in overclocking circles) stability testing is a must. As I've ventured into this relatively new area of computing (for me), I have always stability tested things (as far as I could with the equipment I've got), but some people take shortcuts. The guy who reported this bug may or may not have known about the risk. You are, however, absolutely correct when you say that you should test things out in a worst-case scenario before putting them into everyday use.

It's better to know potential problems beforehand, rather than finding out afterwards, when it may be too late.

MikeC
Site Admin
Posts: 12285
Joined: Sun Aug 11, 2002 3:26 pm
Location: Vancouver, BC, Canada
Contact:

Post by MikeC » Sun Oct 20, 2002 12:15 pm

This brings up a basic issue: whether every home PC builder needs to be concerned with stability under long-term 100% CPU usage when, 99.9% of the time, the longest it will stay at 100% is less than 5 minutes. The idea of stress testing for 24/7 operation (or whatever) is exactly what Intel recommends and what most engineering companies strive for: performance under extreme loads. But is this really relevant for many, many home users? It probably is not for Matt Richards.

I can tell you right now that my quietest PC, which is truly virtually inaudible, will not survive a 100% CPU test for more than maybe half an hour. Do I care? Absolutely not. I know what it can do, and I know exactly how I use it (system mirroring, and as a second backup machine for occasional use); it is perfect for that. Why should I be worried about what it does at 100%, 24/7? Let the server makers worry about that.

Now that's a personal POV about that specific PC of mine, not a statement about all PCs.

What I'd suggest generally, though, is that the home PC builder/modder does NOT have to saddle him/herself with this 24/7 mentality -- rather, the machine should be made stable and useful for the way it will actually be used. If you go in for heavy-duty 3D games in 6-hour stretches, obviously you need a rather different machine than someone who web surfs, emails, & does office work. Let the system integrators worry about making general-purpose machines that can survive any application; we're making ours for ourselves.

I would not sacrifice 3 or 5 or 8 dBA for more airflow to make a system 24/7-stable when it's never turned on for longer than a couple of hours, for example.

LeoV
Patron of SPCR
Posts: 47
Joined: Sun Aug 11, 2002 3:26 pm

Post by LeoV » Sun Oct 20, 2002 12:48 pm

Mike, I don't suggest that people should sacrifice quietness for stability. However, if you don't intend to use your CPU at 100%, then why not clock it down? When you overclock a 1.6A Northwood to 2GHz, the implication is that you actually need the extra speed, which means that at some point you expect to run the CPU at 100%. If that's not the case, then *IMHO* you'd have greater peace of mind knowing it's running safely at 1.6GHz than with potential errors and nasty surprises at 2GHz. It's pretty clear that Matt Richards experienced a very unpleasant surprise, as I would have in his place.

I'm not criticizing you, but I think we have differing philosophies on this subject. I have found Murphy's law to hold strong for PCs: anything that could possibly go wrong eventually does. I've had enough nasty surprises in my past PC experiences for a lifetime's worth. Because of this, I'm willing to give up a few MHz for rock stable, error-free, abuse-proof operation.

I may torture my PC much more than the average Joe, but many people (or perhaps their friends/children) may suddenly find the need to run a CPU-intensive program which is sensitive to errors. IMHO they shouldn't have to think about whether a program is "safe" to run on their machine! This is especially true for programs dealing with sensitive information.

Once you know what the best testers are (CPUburn, MemTest86, and 3DMark2K are amongst the most bloodthirsty), it's not hard to test -- and I bet many people would trade a few MHz in return for no nasty surprises later.
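
(The principle behind these testers is simple enough to sketch -- a hypothetical toy, not how any of those programs actually work: repeat a computation with a known answer and flag the moment the hardware disagrees with itself. Real testers like Prime95 do this far more aggressively:)

    # stability_sketch.py -- toy consistency check using only the
    # standard library: redo the same deterministic workload forever
    # and compare each result against the first one. A flaky CPU or
    # flaky RAM eventually produces a different answer (or crashes).
    import hashlib

    def workload():
        h = hashlib.sha256()
        for i in range(200000):          # a few MB of deterministic data
            h.update(str(i * i).encode())
        return h.hexdigest()

    reference = workload()
    passes = 0
    while True:
        if workload() != reference:
            print("MISMATCH after %d passes -- system unstable!" % passes)
            break
        passes += 1
        print("pass %d OK" % passes)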

Red Dawn
Posts: 169
Joined: Fri Oct 04, 2002 11:46 am
Location: Stockholm

Post by Red Dawn » Sun Oct 20, 2002 1:06 pm

It definitely is an interesting topic, Mike, and there's more than one way of looking at it.

Like you say, it's all about personal needs, and my needs include keeping a safety margin in place in case something like this should occur, plus a margin for flexibility.
By flexibility I mean multi-purpose.
I'll take my current computer as an example here: I use it (nowadays) mainly for web browsing, file sharing, listening to music, maybe writing the occasional document in Word, etc. But from time to time I get the urge to play computer games, a lot. It can range from a measly 30 minutes to a hefty five- or six-hour session. Though I've put most of my gaming aside for now, I used to be quite a Counter-Strike enthusiast -- spending ridiculous amounts of time honing my skills -- and from time to time I revisit my former 'stomping grounds', both in Counter-Strike and in the games that came before it.

As you may see, my computer needs present a possible double-edged sword: on the one hand I like having headroom to play around with, and on the other I increasingly want extremely low sound levels while I 'play around'.

It'll be interesting to see what the end result will be. :)
For now though, I'd like to keep that safety margin in place.

MikeC
Site Admin
Posts: 12285
Joined: Sun Aug 11, 2002 3:26 pm
Location: Vancouver, BC, Canada
Contact:

Post by MikeC » Sun Oct 20, 2002 1:16 pm

LeoV wrote: When you overclock a 1.6A Northwood to 2GHz, the implication is that you actually need the extra speed
Did Matt do that? I don't think he mentioned that; just that he has a P4-2G.

I did, certainly, in that article, and discussed stability issues at higher than 2G -- I think I got it up to 2.3 at one point, but had to raise the Vcore for Acrobat Distiller (an amazingly useful stability gauge for me) to work error-free. That system is not the very quiet one I was referring to; the very quiet one is a P3.

It will be interesting to ask Matt whether his system is oc'd.

Regarding criticism -- are you kidding? No one is above that -- criticize away! :)

I'm not sure we're in disagreement. I want my systems to be totally stable, too, and when I build systems for friends, I make sure they'll be stable for them.

What I have done in some cases is to build in a 12V/7V or 12V/5V switch for the CPU fan (usually a Panaflo). Generally, with a good HS & good case airflow design, the 12V Panaflo provides enough cooling for 100% load, 24/7. In one case, I did a switch that toggles both a case fan and the CPU fan. There's your split personality for you, Red. ;)

Mournegrym
Posts: 13
Joined: Thu Apr 03, 2003 5:22 am
Location: Sweden

Re: Nasty CPU-Burn Bug in Microsoft Word!

Post by Mournegrym » Fri Apr 04, 2003 8:56 am

MikeC wrote:Reader Matt Richards wrote in this morning about a nasty anomaly in Microsoft Word that pushes CPU usage to 100 percent "if the background spell checking option in the Works 2000 word processor is selected." Posted in news: http://www.silentpcreview.com/modules.p ... =0&thold=0
Dug out an ancient thread here, hehe...

I can confirm that this bug is still present in Word 2002 SP-2 if background spell checking is on. Seems M$ doesn't care...

GamingGod
Posts: 2057
Joined: Fri Oct 04, 2002 9:52 pm
Location: United States, Mobile, AL

Post by GamingGod » Fri Apr 04, 2003 9:47 am

Microsoft is an evil monopoly company hell-bent on ruling the world. (This is my opinion.)

Gandalf
Posts: 331
Joined: Tue Dec 24, 2002 9:04 am
Location: Belgium

Post by Gandalf » Fri Apr 04, 2003 9:50 am

Actually, openoffice sucks big donkey dick. Especially when it comes to copy/pasting things from other applications. (Go ahead, try it, try copying stuff from several sites)

Post Reply