I hope the heat has now dissipated, so I can touch this subject again :)
First of all, thanks for all the replies.
I think people bring forward many good points, from various points of view.
Just to make one point clear:
I don't advocate, nor do I believe in, externally imposed draconian limitations (especially ones imposed by me, as Tibor suggested in jest).
I'm not an eco-fascist.
I'm just trying to gauge an alternative between the two extremes:
1) Continue the business-as-usual scenario: use exponentially more energy, power and materials (most of us are in this camp)
2) Become an eco-hippie and shun technology altogether (most of us don't want to be in this camp)
It seems that most of us are doing number 1 (not blaming anybody, I'm in this camp myself, but trying to transition).
I think 2 is out of the question for many of us, and I don't personally see it as feasible for me unless it's imposed on me.
However, this binary opposition raises the question: what lies in between?
What are the personal trade-offs, and what limits is each of us willing to live with?
Using less, but not giving up completely. Trying intentionally to limit one's electricity/energy intake, CO2 emissions and raw material consumption to something that at least doesn't grow, but hopefully even diminishes for a while before leveling off.
A recent exercise gave me more fuel to try something in this direction:
My current computer + display + UPS would use 6.5 kWh/day of electricity if it were on 24 h/day (it's not). That's roughly 3 kWh if it's on 10-12 h and actively used for 6-8 of those hours (it is). This includes very little CPU/GPU-intensive usage. Most of my hard drives are idling, and certainly almost all of my optical drives are (I test media/drives professionally, but certainly not all the time).
3 kWh is a lot. It's almost three times what my fridge uses, and even that is not a particularly green model of fridge.
I know I could seriously cut this down by going the laptop route (I am doing that) or by building a small "good enough" Core ULV box with enough resources to cover 90% of my activities. The rest of the number crunching and game playing I could do on the heavy machine, and I could even optimize that a lot for its energy use.
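For anyone who wants to run the same numbers on their own setup, here is a minimal back-of-the-envelope sketch. The ~271 W average draw is back-derived from the 6.5 kWh/day figure above, and the 1 kWh/day fridge figure is an assumption implied by the "almost three times" comparison, not a measured value:

```python
# Rough daily-energy estimate, assuming a constant average power draw.
AVG_DRAW_W = 6500 / 24  # ~271 W for PC + display + UPS (from 6.5 kWh/day)

def daily_kwh(avg_watts: float, hours_on: float) -> float:
    """kWh per day for a device drawing avg_watts for hours_on hours."""
    return avg_watts * hours_on / 1000

always_on = daily_kwh(AVG_DRAW_W, 24)  # the hypothetical 24 h/day case
real_use = daily_kwh(AVG_DRAW_W, 11)   # ~11 h/day of actual on-time

FRIDGE_KWH_PER_DAY = 1.0  # assumed, implied by the fridge comparison
print(f"{real_use:.1f} kWh/day, "
      f"about {real_use / FRIDGE_KWH_PER_DAY:.1f}x the fridge")
```

Swap in your own measured wattage (e.g. from a plug-in power meter) and hours to see where your machine lands.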
Combine this fact with the following:
- Oil production is nearing its maximum (even the most optimistic scenario, from CERA in 2006, says the peak will happen between 2030 and 2050; some say it's already happening between 2005 and 2015)
- natural gas is not too far behind in its production peak
- current fuel alternatives do not scale to fill this gap (at least not yet; fusion remains the big IF, as does massive solar, which is still a few technological generations away)
- if we have to turn to coal for energy, it's going to increase CO2 emissions even further (the CCS capacity for "clean coal" plants is not online now, nor will it be in the next 10 years)
- even current CO2 emissions have been and are rising alarmingly (trust the peer-reviewed IPCC, not oil-company-backed lobby groups)
- the climate is warming (at least partly due to this), and that poses serious challenges to the environment (incl. livelihoods, not just the length of the warm season or the number of hurricanes)
So, instead of just waiting for politicians or the industry to come up with the solutions, I can also contribute a little bit of my own. Maybe use a little less.
Even in the (imho unlikely) case that most of the above scientists' opinions are proven wrong in hindsight 50 years from now, at least I've learned how to optimize my own usage better.
Maybe it's a worthwhile effort in its own right, just like silent computing.
Food for thought.
friendly regards,
Halcyon
PS If you aren't putting your idle cycles to use, consider the following:
http://www.climateprediction.net/
Or just turn it off ;)