I don't follow what you mean about conversion of heat to electricity. I would have guessed that a 40 watt PC left on all the time would be similar to a 40 watt electric heater, or an old 40 watt light bulb, in a room without windows.
As you say, almost all the energy used by a computer winds up as heat, and heat is heat; it doesn't matter whether it comes from a light bulb, an electric heater, or a person.
My figures come from the following, based on very superficial research (i.e., what comes up quickly in Google and Wikipedia; if anybody knows anything about this or has a good reference, please leap in and correct or confirm it):
Wikipedia entry on air conditioners:
"As an example presume that inside the closed system a 100 watt light bulb is activated, and the air conditioner has an efficiency of 200%. The air conditioner's energy consumption will increase by 50 watts to compensate for this, thus making the 100 W light bulb utilise a total of 150 W of energy.
Note that it is typical for air conditioners to operate at 'efficiencies' of significantly greater than 100%, see Coefficient of performance."
"However, modern [portable AC] units run on approximately 1 to 3 ratio i.e., to produce 3 kW of cooling this will use 1 kW of electricity."
From the figures below, this appears to describe the more efficient current units;
older units were generally less efficient, and even typical current units may be less efficient. Doing a little extra looking turned up:
"Today's best air conditioners use 30% to 50% less energy to produce the same amount of cooling as air conditioners made in the mid 1970s. Even if your air conditioner is only 10 years old, you may save 20% to 40% of your cooling energy costs by replacing it with a newer, more efficient model."
"Room Air Conditioners—EER
Room air conditioners generally range from 5,500 Btu per hour to 14,000 Btu per hour. National appliance standards require room air conditioners built after January 1, 1990, to have an EER of 8.0 or greater. Select a room air conditioner with an EER of at least 9.0 if you live in a mild climate. If you live in a hot climate, select one with an EER over 10."
"Energy-Efficient Air Conditioning", US Dept of Energy
http://www.pueblo.gsa.gov/cic_text/hous ... ircond.htm
So, if I read this right, and assuming that the 1 W to cool 3 W figure describes a fairly efficient current model, then an older A/C might take 2 W to cool 3 W;
or about 27 watts (40 × 2/3) to cool our 40 watt load.
"Energy Efficiency Ratio (EER) which is the ratio of cooling capacity in Btu/Hr and the input power in watts W at a given operating point."
http://en.wikipedia.org/wiki/Seasonal_e ... ency_ratio
So EER is in units of (Btu/hr) per watt.
1 Btu/hr = 0.2929 watts (from a unit conversion website); equivalently, 1 Btu = 0.2929 watt-hours.
So to convert EER into watts of heat removed per watt powering the AC, multiply by 0.2929 watt-hours/Btu.
Thus a window AC unit made after 1990 (EER >= 8) should remove 2.34 watts per watt of input.
If you live in a warm climate, you should select one (EER >= 10) that can remove more than 2.929 watts
(i.e., just about 3 watts removed per watt of input), which is the same ratio as the Wikipedia example above. (It is nice when things cross-check.)
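A quick sanity check of that conversion in Python (a minimal sketch; the only input is the 0.2929 W per Btu/hr factor quoted above):

BTU_PER_HR_IN_WATTS = 0.2929

def eer_to_cop(eer):
    # Convert EER (Btu/hr of cooling per watt of input) into a plain
    # watts-removed-per-watt-consumed ratio (the COP).
    return eer * BTU_PER_HR_IN_WATTS

print(eer_to_cop(8))    # post-1990 minimum: ~2.34
print(eer_to_cop(10))   # hot-climate pick:  ~2.93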
So, for an older AC unit, the 40 watt load might consume about 66 watts of total power (load + AC) during the cooling season. A poorly maintained or less efficient A/C might make it worse; a well maintained, more efficient, or more modern A/C might bring that down to about 53 watts or less.
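Putting both cases in one place (a sketch; the COPs are the old-unit 3/2 ratio and the EER-10 figure worked out above):

def total_watts(load_w, cop):
    # The A/C must remove load_w of heat, drawing load_w / cop to do it,
    # so the total draw is the load plus the A/C's share.
    return load_w + load_w / cop

print(total_watts(40, 1.5))    # older unit (2 W in per 3 W removed): ~66.7 W
print(total_watts(40, 2.929))  # modern EER-10 unit: ~53.7 W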
Also, as the computer will raise indoor temperatures, it will lengthen the cooling season (at least a little).
All told, even assuming it replaces some electric heat in winter, it is still going to cost an extra $35-45/year to run the thing 24/7,
and it is still responsible for 600+ extra pounds of CO2 (etc.) in the air.
Far better to turn it off when not in use.
[Final paragraph based on:
- cooling/heating season estimated by eyeballing climate data for Waco, TX (an arbitrarily chosen point in Texas)
- electricity cost of $0.11/kWh (approx. Texas average for 2006)
- greenhouse gas emissions from http://www.epa.gov/cleanenergy/energy-r ... lator.html
]
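And the back-of-envelope arithmetic behind the cost and CO2 figures (a sketch; the 1.6 lb CO2/kWh factor is my assumed mid-2000s grid average, not a number taken from the EPA calculator itself):

LOAD_W = 40
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.11       # approx. Texas average, 2006 (from above)
LB_CO2_PER_KWH = 1.6       # assumed grid emission factor; the EPA
                           # calculator linked above is the real source

kwh_per_year = LOAD_W * HOURS_PER_YEAR / 1000.0   # ~350 kWh
print(kwh_per_year * PRICE_PER_KWH)     # ~$38.5/yr, before A/C overhead
print(kwh_per_year * LB_CO2_PER_KWH)    # ~560 lb CO2; A/C overhead during
                                        # the cooling season pushes both up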