
Green Computing is More Than Sleep Mode - page 2

Your Real Ecological Footprint

  • October 12, 2009
  • By Juliet Kemp
If all you're considering is power, we can plug in some numbers. With these power usage figures, an average desktop in use draws around 120W; a laptop draws around 35W. Assuming the same usage of both machines, and the same power-saving setup (though note that desktops are also a bit less efficient when sleeping), your laptop can have a lifespan up to 3.4 times shorter and still come out ahead. So a laptop lasting 3 years would beat a desktop lasting just over 10 years, on power usage alone. However, a typical LCD monitor uses another 35W, so if you're using your laptop with an external screen all the time, that ratio drops to 1.7 (a laptop lasting upwards of 3 years would then only beat a desktop lasting less than about 5 years).
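The break-even arithmetic above can be sketched in a few lines, using the article's own figures (a rough sketch; the wattages are the averages quoted above, not measurements):

```python
# Rough break-even sketch using the figures quoted above:
# desktop ~120W in use, laptop ~35W, typical LCD monitor ~35W.

DESKTOP_W = 120  # average desktop power draw in use
LAPTOP_W = 35    # average laptop power draw in use
MONITOR_W = 35   # typical external LCD monitor

# Ratio by which a laptop's lifespan can be shorter than a
# desktop's while the laptop still draws less power overall.
ratio_laptop_only = DESKTOP_W / LAPTOP_W                  # about 3.4
ratio_with_monitor = DESKTOP_W / (LAPTOP_W + MONITOR_W)   # about 1.7

# So a laptop lasting 3 years beats a desktop lasting up to:
print(round(3 * ratio_laptop_only, 1), "years")   # laptop alone
print(round(3 * ratio_with_monitor, 1), "years")  # laptop + monitor
```

Running this reproduces the numbers in the paragraph: roughly 10 years for the laptop-only case and roughly 5 years once an external monitor is in the picture.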

But power usage isn't all there is: there are also those production and disposal costs, for which lifespan really does matter. For example, here's an analysis of the environmental production costs for a Macbook. With a laptop, you also have the cost and lifespan of the battery to consider: how many recharge cycles are you going to get per battery? Laptops also tend to be less repairable than desktops if something goes wrong (further shortening their lifespan). Unfortunately, it's hard to get solid figures on much of this, and in practice, of course, other issues (such as portability) may be of more immediate importance when deciding what type of machine to get.
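To see why lifespan matters once production and disposal are counted, it helps to amortise everything per year of service. The embodied-energy figures below are hypothetical placeholders (as noted above, solid numbers are hard to come by); only the power draws come from the article:

```python
# Illustrative only: the embodied-energy figures below are
# hypothetical assumptions, NOT measured data. Power draws
# (120W desktop, 35W laptop) are the article's figures.

HOURS_PER_YEAR = 8 * 250  # assume 8 hours/day, 250 days/year

def lifetime_energy_kwh(power_w, embodied_kwh, lifespan_years):
    """Total energy over a machine's life: use phase plus
    embodied (production + disposal) energy, in kWh."""
    use_kwh = power_w * HOURS_PER_YEAR * lifespan_years / 1000
    return use_kwh + embodied_kwh

# Hypothetical embodied-energy figures, for illustration:
desktop = lifetime_energy_kwh(120, embodied_kwh=1500, lifespan_years=8)
laptop = lifetime_energy_kwh(35, embodied_kwh=1000, lifespan_years=3)

# Compare per year of service, the fair basis for comparison:
print("desktop kWh/year:", desktop / 8)
print("laptop kWh/year:", laptop / 3)
```

The point of the sketch is the shape of the calculation, not the numbers: the shorter a machine's lifespan, the more heavily its fixed production and disposal costs weigh on every year of use.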

Software issues

When you consider start-of-life and end-of-life costs, it becomes clear that it's better to squeeze as much as you can out of the hardware for as long as possible than it is to replace early to take advantage of energy usage improvements.

But how long is "as long as possible"? When does "end of life" actually apply: when your machine stops working altogether, or when it merely becomes irritating to use?

One driver of more frequent hardware upgrades is that coders tend to write on the assumption that they can use as much CPU as they like, and to target the often high-powered machines that they themselves may be using. Until recently (when the motherboard went bang) I was still regularly using an 8-year-old PC, but running a modern Java app on it was an exercise in frustration. Isn't it time to focus more on the efficiency of code? My laptop is a Mac, and I was ecstatic to find out that the new Snow Leopard release apparently focuses on getting more speed, and apparent speed, out of existing hardware.

If it becomes more common for users and companies to make the effort to keep their existing kit in operation, can we push coders into allowing for this, and thus into putting their energies into writing more efficient code? You can argue that netbooks are a useful part of this drive: they operate on the assumption that most of us don't really need massive computing power. As more people settle at this hardware point, we could (I hope) see less of the assumption that all users have high-end machines available. That would increase effective hardware lifespans – and might even cut power bills by requiring less heavy CPU work. It's something to start working towards as we try to tread more lightly on the planet.
