The Cloud Begins with Coal
August 17, 2013 6:07 AM

"The information and technology ecosystem now represents around 10 per cent of the world's electricity generation, and it's hungry for filthy coal. In a report likely to inspire depression among environmentalists, and fluffy statements from tech companies, analyst firm Digital Power Group has synthesized numerous reports and crunched data on the real electricity consumption of our digital world." - IT now 10 percent of world's electricity consumption, report finds
posted by jammy (34 comments total) 12 users marked this as a favorite
 
Yes but people commute and travel less because, among other things, their devices make working from home and keeping in touch with distant relatives easier. That's huge. The average number of miles Americans drove last year dropped for the 8th year in a row.
posted by shivohum at 6:16 AM on August 17, 2013 [18 favorites]


This is meaningless without a comparison against what it is replacing: how much energy does it take to store a megabyte of data on paper? How much to mail a letter versus an email? How much to have a video teleconference versus flying someone across the country for a meeting? Business did not start with computers; it is evolving to use them. How do the computer-based solutions compare to older solutions? Of course, there are things that simply couldn't exist without databases: long-term storage of every transaction a store chain has, for example. These aren't necessarily taking a ton of power, though, because they can be written once and kept in nonvolatile storage for years.
posted by sonic meat machine at 6:18 AM on August 17, 2013 [15 favorites]


I suppose if I watch TV on the computer that counts as "IT" instead of something else. If I get an internet-enabled refrigerator I think 100% of my electricity consumption could be "IT".
posted by Phssthpok at 6:20 AM on August 17, 2013


This clearly means we need to shut down more nuclear power plants and replace them with good intentions.
posted by cthuljew at 6:25 AM on August 17, 2013 [26 favorites]


If I get an internet-enabled refrigerator I think 100% of my electricity consumption could be "IT".

If you get an internet-enabled refrigerator your threat to the environment is likely far greater than mere electricity generation because you have probably purchased the entire SkyMall catalogue.
posted by srboisvert at 6:27 AM on August 17, 2013 [42 favorites]


In 1999, a similar study purported to show internet use was consuming 13% of US electrical supplies. Berkeley Lab found the study to be overstated by at least a factor of 8. Computers in general have become more energy efficient since 2001, with the shift away from CRTs to LCDs, etc. Even if there are many more devices, they tend to be smaller (tablet use is growing dramatically, and tablets are far lighter in terms of energy use). I'm afraid I'm not buying this one, especially in view of its provenance.
posted by jenkinsEar at 6:44 AM on August 17, 2013 [3 favorites]


Link to the report.
posted by peeedro at 6:45 AM on August 17, 2013 [1 favorite]


I'm afraid I'm not buying this one, especially in view of its provenance.

One of the many interesting things about this report's conclusions is how well they harmonize with those of Greenpeace's 2010 report on the same topic.

Another interesting observation from the report is the impact of the growth in ICT electricity consumption on the distribution of electricity demand throughout the day. Many ICT devices run all day and all night, smoothing out the daily afternoon/evening 'peak'.

This new reality of increasing night-time demand suggests that if we wish to move the cloud away from coal we'll need to embrace low-carbon baseload (i.e. nuclear or fossils w/ CCS) or find far better and cheaper ways of storing solar power. Ideally both.
posted by narcotizingdysfunction at 7:04 AM on August 17, 2013 [1 favorite]


So I had to do some work in a data center for a day a while back. The room I was working in had on the order of 3000 servers, plus several air conditioners each the size of a car; it's so loud you have to wear earplugs due to OSHA rules.

On my way out of the building, my escort said "oh shoot" and made us run back through a maze of passages to turn the lights out in each one. His boss yells at him if he leaves the lights on, you see.
posted by miyabo at 7:17 AM on August 17, 2013 [1 favorite]


On my way out of the building, my escort said "oh shoot" and made us run back through a maze of passages to turn the lights out in each one. His boss yells at him if he leaves the lights on, you see.

That's bizarre. Most data centers I've seen have the lights on 24/7, because they tend to be staffed 24/7. They are also dimly/efficiently lit, because lights generate heat that would be useless and thus not worth the energy to cool.
posted by sonic meat machine at 7:28 AM on August 17, 2013


The funniest thing is that the biggest datacentres in the world are currently shifting, or have already shifted, to renewable energy. Google at 34%, Facebook at 23%, Apple at 100%. These datacentres are also some of the most efficient in the world because, even if you're using dirty coal, energy is still fucking expensive.
posted by Talez at 7:31 AM on August 17, 2013 [2 favorites]


It's mostly hydro though, right? For the most part they are just sucking up a fixed-quantity resource. If that's in Iceland, where there is no other demand, that's fine. But when it's Hydro Quebec, BPA, or TVA power, it's a little hand-wavy, no?
posted by JPD at 7:49 AM on August 17, 2013 [1 favorite]


My refrigerator runs on a "system board" (what they charged me for replacing when it failed). I suspect my refrigerator is more energy efficient because of it.
posted by Obscure Reference at 7:57 AM on August 17, 2013


Sponsored by: National Mining Association, American Coalition for Clean Coal Electricity
I'm sure that's an important factor.
"Although charging up a single tablet or smart phone requires a negligible amount of electricity, using either to watch an hour of video weekly consumes annually more electricity in the remote networks than two new refrigerators use in a year," they argue.

This example uses publicly available data on the average power utilization of a telco network, the cost of wireless network infrastructure, and the energy that goes into making a tablet, although it ignores the data centers the video is served out of, and tablet charging.
A modern refrigerator uses around US$40-$60 worth of juice per year, so they're ultimately claiming that the delivery of 52 hours of internet video causes the network (not the viewing device) to use $80-$120 worth of electricity, somewhere around $2/hour. That seems quite improbable.
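
Spelling that out as a quick Python sketch, using only the dollar figures above and the report's "two refrigerators" claim:

    # "Two refrigerators" worth of annual electricity cost, spread over
    # 52 hours of video per year. Figures are the ones in this comment.
    fridge_cost_low, fridge_cost_high = 40.0, 60.0   # US$/year of electricity
    hours_of_video = 52                              # one hour per week

    low = 2 * fridge_cost_low / hours_of_video
    high = 2 * fridge_cost_high / hours_of_video
    print(f"Implied network cost per hour of video: ${low:.2f}-${high:.2f}")
    # -> Implied network cost per hour of video: $1.54-$2.31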
posted by Western Infidels at 8:18 AM on August 17, 2013 [13 favorites]


It's hilarious that this report was written by a pro-coal organization. If the people who build and run datacenters care at all about where their power comes from, I am certain they're not excited about coal.

Google and Facebook both put a lot of effort into using less electricity. And they're leading the way in commodity PC server farms. They are interested both in saving money directly on electricity, and saving indirectly by having less waste heat (and subsequent cooling and server density). Here's some info Google publishes on managing datacenter efficiency. One of the simplest and smartest innovations is to move a DC power source into the rack; it's just stupid that every single computer has its own AC power supply. Also they did the legwork to figure out how to safely run their datacenters at a higher temperature. No reason to chill the machine room to 65F if you can run at 80F.
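
For context, datacenter efficiency is usually scored as PUE (Power Usage Effectiveness), the metric Google publishes for its facilities. A minimal sketch, with invented sample numbers:

    def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
        """PUE = total facility power / IT equipment power; 1.0 is the ideal."""
        return total_facility_kw / it_equipment_kw

    # Invented example: 1200 kW of servers plus 150 kW of cooling and
    # power-conversion overhead.
    print(pue(total_facility_kw=1350.0, it_equipment_kw=1200.0))  # -> 1.125
    # A traditional machine room chilled to 65F might score closer to 2.0,
    # i.e. a watt of overhead for every watt of actual compute.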

The other big story coming is the rise of the ARM architecture, the CPU design that's currently powering pretty much every modern battery-powered computer (like your iPhone). It's been fascinating seeing ARM take over market share from Intel, both because of their flexible licensable designs and their intense power efficiency. As the Reg article notes, there's not much ARM in datacenters yet, but it seems inevitable.
posted by Nelson at 8:22 AM on August 17, 2013 [5 favorites]


In terms of data centers and device manufacturers I am not so worried about energy consumption: the owners have to pay for the energy they use and are strongly motivated to reduce their consumption. There is also the benefit of moving data processing onto a smaller number of servers - and having those run in the most efficient way. Scale helps here.

Likewise within the home. A tablet uses only about 1/20th as much energy per year as a desktop PC (which is, in itself, about a quarter of the cost of a refrigerator).
posted by rongorongo at 8:29 AM on August 17, 2013


iSmog
posted by Kirth Gerson at 8:47 AM on August 17, 2013


The one data center I was in -- in what looks like an abandoned building near downtown St. Louis -- was lit up as bright as noon, had ridiculously high ceilings, and was cold enough to store meat in. Theoretically there were people working there 24/7, but most of the time that consisted of 2-3 people on a different floor.
posted by Foosnark at 8:53 AM on August 17, 2013 [1 favorite]


Did anyone else notice the footnote to the study that mentions in passing that the word "gullible" has been removed from the dictionary?
posted by double block and bleed at 8:55 AM on August 17, 2013


I think Swanson's Law is one of the great hopes of humanity. Solar is almost competitive with gas now, and solar will almost certainly keep decreasing in price, whereas I can't see that happening for oil over any sustained duration. Because of this exponential improvement, solar is something that won't matter much until, all at once, it matters a lot. Perhaps on the scale of personal computing in how it affects society.
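
A minimal sketch of what Swanson's Law predicts - roughly a 20% price drop per doubling of cumulative shipped volume. The starting price and volumes here are purely illustrative, not market data:

    from math import log2

    def swanson_price(p0, cumulative, baseline, learning_rate=0.20):
        """Module price after a ~20% drop per doubling of shipped volume."""
        return p0 * (1 - learning_rate) ** log2(cumulative / baseline)

    # Illustrative only: $0.75/W modules at 100 GW cumulative -> price at 400 GW
    print(round(swanson_price(0.75, 400, 100), 2))  # -> 0.48 (two doublings: 0.75 * 0.8^2)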
posted by curuinor at 9:13 AM on August 17, 2013 [4 favorites]


“Always On” is a difficulty for both computer and power networks. For large database-driven web technologies, you always have to have a master, somewhere, that always needs to be available. You can have all the fancy failover and replication that you want, but it's got to come from that one. It needn't always be the same one, but you probably want it to be in a place with good links to your clients, so it usually is.

Power networks again have to balance variable supply→demand flows. Some sources provide pure capacity (nuclear), but cause network issues if demand is lower than inflexible supply. Others supply near pure energy (solar), but good luck trying to dispatch that for an evening peak. Something, somewhere always has to be on, if only to maintain grid sync and avoid the dreaded blackstart. Blackstart is one of the few times when you absolutely positively need just one central authority, sending commands to particular generators to ready, spin up, synch. You can't just have one little facility coming on at its own bidding, and expecting it not to shut down in a puff of smoke, arc-flash and SF6.

So, Always On: no individual needs it, or wants its consequences, but everybody expects it.

Though the source of the article is dubious, it underlines the culture clash between IT and power. When Google sailed in with their Power Metering initiatives a few years back, they underestimated the inertia of the industry (A 25 year old technology? Suspiciously new. Con Edison only stopped DC power service in NYC in 2007 …), and the lack of energy literacy in the populace. Yes, that last link is a spoof, but isn't far off the mark about how hard it is for people to care where power comes from. Google's smarts were appreciated, but their concepts of ‘long term’ and ‘reliability’ weren't even on the same planet. Then again, we could likely have learned a whole bunch about ‘efficiency’ from them.
posted by scruss at 9:58 AM on August 17, 2013 [2 favorites]


Re: North American hydropower, they actually have a good bit of overcapacity at the moment, thanks to reduced industrial demand and cheap (shale) natural gas (Hydro-Québec can't export as much as it planned to because of that).

Getting people excited about energy-conserving home renovations again might help a good bit.
posted by Monday, stony Monday at 10:15 AM on August 17, 2013


A modern refrigerator uses around US$40-$60 worth of juice per year, so they're ultimately claiming that the delivery of 52 hours of internet video causes the network (not the viewing device) to use $80-$120 worth of electricity, somewhere around $2/hour.

I think they assume you're getting those bits over a wireless telco network. They didn't make this very clear in the report, but that's what the footnote seems to indicate.

Let's say your phone's radio uses 1.5 watts, and the tower uses the same (for the sake of argument) so 3 watts total. 3 W * 3600 * 52 / 1000 = 561 kWh/year. I left out the other power costs of running the telco network, so this estimate is probably low. Still, it's probably between 1 and 2 fridges/year (assuming about 400 kWh/year for a EnergyStar fridge).

(Of course, who has precious plan GBs to spare on watching Netflix?!)
posted by RobotVoodooPower at 10:27 AM on August 17, 2013


(And I'm a complete idiot, and it's actually 3 W * 52 / 1000 = 0.16 kWh/year, a comically low number. I have no idea where they're getting their figures)
posted by RobotVoodooPower at 10:41 AM on August 17, 2013 [1 favorite]


The total energy use associated with computer-bound knowledge workers (whether it comes from coal, oil, nuclear, or whatever) would go way down if they left their cars in the driveway and telecommuted at least one day a week. In addition, taking all those telecommuters off the road, if you spaced them out over the week, would reduce traffic jams and infrastructure expansion needs enough that state and federal governments should be happy to compensate telecommuters for it.
posted by pracowity at 10:56 AM on August 17, 2013


"Although charging up a single tablet or smart phone requires a negligible amount of electricity, using either to watch an hour of video weekly consumes annually more electricity in the remote networks than two new refrigerators use in a year," they argue.

Uuuuh.

This doesn't pass the sniff test for me. I might be overly skeptical, but I'd need to dig into how they're calculating that. Are they assuming that video is being compressed from HD source and streamed as a one-off event, for example? Are they just using a naive calculation like [ total energy load / hours of video streamed * percentage of traffic from video ]?

This reminds me of the stuff that flew around ~3 years ago, suggesting that every single Google search consumed enough power to brew a cup of tea.
posted by verb at 11:24 AM on August 17, 2013


RobotVoodooPower: And I'm a complete idiot, and it's actually 3 W * 52 / 1000 = 0.16 kWh/year, a comically low number...
I don't think anyone should ever be embarrassed for having a go at figuring out something for themselves. Everyone makes silly math errors. Having the courage to risk the math errors puts you ahead of an awful lot of journalists, unfortunately.
posted by Western Infidels at 11:45 AM on August 17, 2013 [6 favorites]


Datacenters should be using the same motion-sensor lighting my Target uses in its cold cases.
posted by Lyn Never at 11:53 AM on August 17, 2013


Let's say your phone's radio uses 1.5 watts, and the tower uses the same (for the sake of argument) so 3 watts total.... 561 kWh/year

Oops. 3 watts for an hour is 3 watt-hours. There are ~8766 hours in a year. 3*8766 = 26,298 Wh or 26.3 kWh. (The tower probably uses more, but that's likely the right order of magnitude per customer.)

Phones are certainly greener than desktops. According to recent DOE numbers, PC+monitor usage is ~270 watts. If used 4 hours per day, that's 394 kWh/year.
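
The two annual figures, spelled out in a quick Python sketch:

    HOURS_PER_YEAR = 8766   # average hours in a year, leap years included

    phone_radio_kwh = 3 * HOURS_PER_YEAR / 1000   # 3 W running nonstop
    desktop_kwh = 270 * 4 * 365.25 / 1000         # 270 W for 4 hours a day
    print(f"phone radio, always on: {phone_radio_kwh:.1f} kWh/yr")  # -> 26.3
    print(f"PC + monitor, 4 h/day:  {desktop_kwh:.0f} kWh/yr")      # -> 394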
posted by Twang at 12:00 PM on August 17, 2013


I think I see what they're doing. They're assuming somewhere around 4 kWh/GB (for wireless links?), whereas terrestrial links are typically under 1 kWh/GB.

So I guess if you've got ~150 GB of video data transferred per year over cellular data, and factor in the manufacturing energy for a relatively short-lived tablet, you could get up to a fridge or two.
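
The same guess in code form; the 4 kWh/GB and 1 kWh/GB figures are the estimates above, and the 400 kWh/yr fridge is the EnergyStar number from earlier in the thread:

    GB_PER_YEAR = 150   # video transferred per year, per the guess above
    FRIDGE_KWH = 400    # EnergyStar fridge, from earlier in the thread

    for label, kwh_per_gb in [("wireless", 4.0), ("terrestrial", 1.0)]:
        kwh = GB_PER_YEAR * kwh_per_gb
        print(f"{label}: {kwh:.0f} kWh/yr = {kwh / FRIDGE_KWH:.1f} fridges")
    # -> wireless: 600 kWh/yr = 1.5 fridges (before tablet manufacturing energy)
    # -> terrestrial: 150 kWh/yr = 0.4 fridges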
posted by RobotVoodooPower at 12:24 PM on August 17, 2013


It is certainly a 'coal centric' viewpoint.

Coal is 40% of global electricity and is forecast, by the IEA, to supply 50% of the growth in world demand over the next 20 years.

It doesn't have to be that way, and isn't everywhere; coal's share is dropping in many places. In California, even wind provides more power than coal.
posted by eye of newt at 12:26 PM on August 17, 2013 [1 favorite]


scruss: "“Always On” is a difficulty for both computer and power networks. For large database-driven web technologies, you always have to have a master, somewhere, that always needs to be available. You can have all the fancy failover and replication that you want, but it's got to come from that one. It needn't always be the same one, but you probably want it to be in a place with good links to your clients, so it usually is."

It's getting better, thanks to virtualization and the move to 2.5" spinny disks and SSDs. Rather than having a minimum of 3 or 4 physical machines running all the time for every website that gets even moderate usage, you can now use those 3 or 4 physical machines to run 100 different high volume websites and spin up more nodes as necessary during peak periods. The entire farm needn't run 24x7, although they often still do at places that are not Facebook or Google scale. As more people move to managed cloud solutions like AWS, though, they get that sort of efficiency gain for free.
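
A toy sketch of the consolidation arithmetic behind that claim, with all numbers invented for illustration:

    sites = 100
    dedicated_hosts_per_site = 3   # old model: a few always-on boxes per site
    shared_pool_hosts = 4          # new model: one virtualized pool for all
    watts_per_host = 250           # assumed average server draw

    old_kw = sites * dedicated_hosts_per_site * watts_per_host / 1000
    new_kw = shared_pool_hosts * watts_per_host / 1000
    print(f"dedicated: {old_kw:.0f} kW, shared pool: {new_kw:.0f} kW")
    # -> dedicated: 75 kW, shared pool: 1 kW (before any peak-time scale-up)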

RobotVoodooPower: "Let's say your phone's radio uses 1.5 watts, and the tower uses the same (for the sake of argument) so 3 watts total. 3 W * 3600 * 52 / 1000 = 561 kWh/year. I left out the other power costs of running the telco network, so this estimate is probably low. Still, it's probably between 1 and 2 fridges/year (assuming about 400 kWh/year for a EnergyStar fridge)."

The tower is going to use a lot more (hundreds of watts minimum between the radios, the comms gear and the AC), but spread over many users. And your phone's radio will use a lot less: no more than 250 mW average, otherwise it can't be FCC certified to be anywhere near your face when it is transmitting.
posted by wierdo at 2:36 PM on August 17, 2013


This clearly means we need to shut down more nuclear power plants and replace them with good intentions.

Not sure if this was a direct reference to Germany's energy transition or not, but anyway here is Amory Lovins dismantling the most prominent myths about Germany's progress toward a low-carbon grid, which is well worth reading regardless.
posted by gompa at 2:39 PM on August 17, 2013 [1 favorite]


So, some guy who heads a "consulting firm" (basically a PR outfit) that gives reach-arounds to the extraction industry writes an article saying OMG WITHOUT COAL YOU CAN'T HAS YOUR SOCIAL MEDIA LOL. And we're supposed to be impressed by this?

It's not surprising that the internet takes a huge amount of energy, and any serious ecologically-oriented study of our energy system needs to take a hard look at this. But that's a system design issue. As someone pointed out up-thread, we need to better understand the trade-offs between increased internet use and (potentially) reduced transportation--and likewise with energy consumption (my internet-enabled thermostat lowers my personal carbon footprint, but what about globally?).

Now, as to whether or not it's worth spewing more greenhouse gas (or building more nukes) just so that everyone can "be free" to obsess on the Kimye family... that's another, more intractable issue.
posted by mondo dentro at 9:57 AM on August 18, 2013




This thread has been archived and is closed to new comments