Distributed computing: something for everyone, a call to arms.
May 28, 2008 8:38 AM   Subscribe

CPU Filter: You know what they say about idle hands... What about idle FLOPs? Distributed computing (a.k.a. grid computing / a.k.a. cloud computing) has come a long way in the past few years, and most people probably don't know about the vast number of projects they can put their idle CPUs to work on - it's not just aliens and genomes anymore. There are more than one hundred projects, ranging from 3D rendering to climate prediction to saving the world with nutritious rice to neurons and nanobots. Why not lend an idle hand?
posted by tybeet (39 comments total) 4 users marked this as a favorite
 
My idle hand is too busy playing with the Devil's tool.
posted by Floydd at 8:59 AM on May 28, 2008 [2 favorites]


Don't forget about earthquake detection.
posted by nitsuj at 9:00 AM on May 28, 2008 [2 favorites]


So I download a client for every core on my CPU?
posted by The Power Nap at 9:01 AM on May 28, 2008


Many of the clients (BOINC projects and Folding@Home are two I know of) have 64-bit (aka multi-core) clients, so you shouldn't have to, but you certainly can run multiple clients. I personally run BOINC on 1 core, and Folding@Home on another.
posted by tybeet at 9:02 AM on May 28, 2008


@nitsuj: Thanks, that's a really interesting article, and timely too.
posted by tybeet at 9:04 AM on May 28, 2008


Call me when I can sell my spare cycles (to someone who will actually pay).
posted by DU at 9:08 AM on May 28, 2008 [1 favorite]


I used to enjoy running Folding@Home until they came out with the PS3 client. It just kicks the ever-living hell out of the PC client. 50,000 PS3s generate 1500 TFLOPs while 200,000 PCs generate 188 TFLOPs.* I don't see the need to peg out my CPU and heat up my case any further when I'm accomplishing less than a hundredth of the work of some idle game console.
posted by BeerFilter at 9:19 AM on May 28, 2008 [1 favorite]
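A quick back-of-the-envelope check of those aggregate figures (the TFLOPs totals and client counts are taken from the comment above, not independently measured) suggests the per-machine gap works out to roughly 30x rather than 100x:

```python
# Per-machine throughput implied by the quoted aggregate numbers.
# Figures come from the comment above; they are not measurements.
ps3_tflops, ps3_count = 1500, 50_000
pc_tflops, pc_count = 188, 200_000

ps3_each = ps3_tflops * 1000 / ps3_count  # GFLOPS per PS3 -> 30.0
pc_each = pc_tflops * 1000 / pc_count     # GFLOPS per PC  -> 0.94

print(f"PS3: {ps3_each:.1f} GFLOPS each, PC: {pc_each:.2f} GFLOPS each")
print(f"implied ratio: {ps3_each / pc_each:.0f}x")
```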


Beerfilter: note that the GPU client kicks the PS3's butt per-PC. That's XP with an ATI card.
posted by a robot made out of meat at 9:26 AM on May 28, 2008


You can also donate bandwidth by running a Tor Relay.
posted by aerotive at 9:27 AM on May 28, 2008


You can also donate bandwidth by running a Tor Relay.

Sure, if you're okay with kiddie-porn-a-plenty passing over your wires. Don't get me wrong, Tor is a great idea in theory and I've used it myself just to check it out, but just like Freenet before it it's become mostly a facilitator of jerk material for diaper snipers. Almost half the services listed on core.onion are pedo-oriented, and any non-twisted individual would be disgusted with some of the "can someone repost that video of the little girl pissing" threads on core.onion.
posted by DecemberBoy at 9:40 AM on May 28, 2008 [1 favorite]


Call me when I can sell my spare cycles

I'm looking for an early '70s Honda 750, anyone got an extra one of those by chance?
posted by quin at 9:55 AM on May 28, 2008


Didn't the Wii come with a distributed computing client of some kind? I thought that when it was idle, if you had the settings turned on, it would use its spare clock-cycles for something interesting.
posted by quin at 9:57 AM on May 28, 2008


Thanks, that's an impressive list of clients. I have no conception of how useful these things actually are in advancing their area of science - do they make useful incremental advances (or even breakthroughs)?
posted by patricio at 9:57 AM on May 28, 2008


I joined these distributed computing programs years and years ago when they first appeared thinking they were a wonderful and creative solution to massive computing needs. I quickly changed my mind.

My first real reaction after downloading some of the software was that it made my own use of my own computer much harder. Back then the programs probably weren't as well written and computers (as well as Internet connections) were much slower. The end result was that after about a week, I took the program off my computer as it turned my then state-of-the-art desktop into a dog.

My next reaction, over the following several years, was that not much in the way of real results was coming from any of the projects. I still don't know that I've ever heard of more than one or two real successes. Maybe someone has a list of what distributed computing has actually done for humanity. Or maybe the idea of distributed computing sounds better than it is. Maybe the promise of nearly unlimited and free computing power allows researchers to be lazy rather than targeted in their research.

My current reaction is that without real success stories, all these programs tend to do is burn through a lot of power (on the order of $100 a year for you personally in added electricity bills, if the math I've seen holds true) and wear out your computer earlier than would otherwise happen. These distributed computing platforms therefore raise environmental concerns.

The ultimate question is this - is it still best for everyone just to turn your computer off when you aren't using it? Maybe.
posted by Muddler at 10:00 AM on May 28, 2008


DecemberBoy: "You can also donate bandwidth by running a Tor Relay.

Sure, if you're okay with kiddie-porn-a-plenty passing over your wires. Don't get me wrong, Tor is a great idea in theory and I've used it myself just to check it out, but just like Freenet before it it's become mostly a facilitator of jerk material for diaper snipers. Almost half the services listed on core.onion are pedo-oriented, and any non-twisted individual would be disgusted with some of the "can someone repost that video of the little girl pissing" threads on core.onion.
"

Well, I'm not OK with kiddie porn passing over anybody's wires, but I still think it's reasonable to run a relay. It helps many journalists, human-rights workers, and aid workers, and has plenty of non-nefarious uses.
posted by aerotive at 10:12 AM on May 28, 2008 [1 favorite]


Metafilter: facilitator of jerk material for diaper snipers
posted by DU at 10:23 AM on May 28, 2008


Why not lend an idle hand?

Because modern systems have energy saving features that get completely screwed up by using 'spare' cycles. I want my computer to wind down and go to sleep when I'm not using it, rather than chugging through calculations with dubious efficiency and benefits.
posted by malevolent at 10:26 AM on May 28, 2008


Save the electricity instead.
posted by MillMan at 10:36 AM on May 28, 2008


64-bit (aka multi-core)
Argh no. 64-bit and multi-core are completely unrelated. Core Duos are multi-core 32-bit, and Athlon 64s are single-core 64-bit.
posted by Skorgu at 10:41 AM on May 28, 2008 [1 favorite]


Because modern systems have energy saving features that get completely screwed up by using 'spare' cycles. I want my computer to wind down and go to sleep when I'm not using it, rather than chugging through calculations with dubious efficiency and benefits.

I agree with malevolent. I'd much rather save the wear and tear on my computing components as well as save electricity by having my machine shut down or sleep.
posted by gyc at 10:56 AM on May 28, 2008


Didn't the Wii come with a distributed computing client of some kind? I thought that when it was idle, if you had the settings turned on, it would use its spare clock-cycles for something interesting.

The PS3 does something like that, but the Wii uses its standby mode (in which the power light is yellow) to periodically download Wii Connect 24 messages. Wii Connect 24 is basically just a message-passing service; it provides persistent server storage to facilitate information passing between Wiis.

Some things that a Wii will do while on standby:
- Miis will travel between the systems of all the people you've exchanged system friend codes with.
- Messageboard messages will be received from Wii Connect 24 from other Wiis and Nintendo. Nintendo mostly sends update notifications and the occasional game ad (if you've opted in). Mario Kart Wii uses this to send tournament notices.
- A few games with level-sharing functions use this to implement them. Elebits may be the only current example of this.
- Some downloaded channels use Wii Connect 24 to keep themselves updated, and to update their channel banners. The Shop, Weather, News, Check Mii Out, Everybody Votes and Nintendo Channels do this. If a system hasn't been connected to the internet for a while, opening one of those channels will start a lengthy download process, but if the system has a stable connection it will download this stuff while in standby mode.

Notice that none of these things is really distributed computing. It is not known on this side of the NDA wall whether a Wii could do distributed computing or whether system routines handle all the Wii Connect message passing.
posted by JHarris at 11:05 AM on May 28, 2008


I'm looking for an early '70s Honda 750, anyone got an extra one of those by chance?

Sorry. Had a '72 750 KO series with the sandcast block. Never should have let it go....
posted by sourwookie at 11:19 AM on May 28, 2008


Does anyone have more info about the AI project (the 'neurons' link above)?

In their FAQ they state as established fact a number of issues that, as far as I know, are still very much open questions, and occasionally devolve into pure sci-fi fantasy. And there's all this weirdly vague harping on security questions, and they're not very forthcoming about what they'll actually be running on your machine, other than that it's for profit and may change from time to time. And their project leader has no visible experience in research of any kind; instead he seems to have spent the last decade or so as an IT consultant, mostly in the banking industry.

I guess you could say the project is raising some red flags for me.
posted by ook at 11:22 AM on May 28, 2008


@Skorgu:
Argh no. 64-bit and multi-core machines are completely unrelated. Core Duo are multi core 32 bit and Athlon 64s are single-core 64-bit.

There's no need for trolling. Technically it is an emulated 64-bit system; it runs under a 64-bit operating system, which means it uses 64-bit software, and so for the vast... vast majority of people it is indistinguishable from a CPU that physically has two cores.

@Muddler:
I think you're too quick to jump to conclusions. Your experience was with distributed computing in its infancy. There's no evidence of processor bottlenecking now, and bandwidth is not a problem assuming you have a high-speed connection; alternatively, you can schedule the work units to be sent when you're not using the computer (say, at 4 AM).

As for the science part of it: there is a LOT being done with what's calculated. Take a look at the number of white papers generated from Folding@Home alone. Take a look at the Genome@Home project and its success in mapping protein sequences from the Human Genome Project's map.

I think it's unfounded to claim there is no success being made with these projects, let alone that they make people lazy. It's not as though these projects are doing the interpretive work for them: all they can do right now is provide computational resources for work that would otherwise take hundreds, thousands, or millions of years, even on most supercomputers.

True, none of these programs may shift the scientific paradigm, but they are doing incredible things for the scientists who back them up.

And if you don't trust the abstract, grandiose nature of the questions some of these projects ask, you certainly don't have to participate in them. There are very tangible things being done, such as the Climate Prediction project, which, when it comes to an end, will provide the most illustrative prediction yet of the future of our global climate, and of when we may meet with the worst effects of global warming.

@ook: Thank you for bringing those links to my attention; I was unaware of any shifty business with the AI program, or of its for-profit nature. This just goes to show that with the rapid growth of distributed computing, and how much easier it is to bootstrap your own project onto it, the door may be opening to irresponsible use of others' clock cycles.
posted by tybeet at 12:50 PM on May 28, 2008 [1 favorite]


This only seems like freely lending idle CPU time at first glance. The electricity cost of participating in a distributed computing project is non-trivial. Take using a PlayStation 3 to do Folding@Home, for example, which is one of the most efficient ways to contribute.

An original PS3 uses about 200W to do Folding@Home. Run continuously, that's 1752 kWh per year. At the average cost of 10.31 cents per kWh, that will run you about $180/year. A newer-model 40GB PS3 will cost about $103 per year to do Folding@Home. By contrast, a PS3 uses only about 2.3W in standby mode, or about $2 worth of electricity in a year.

The take home point is that many of these are noble projects, but you should definitely weigh the very real cost of contributing to them. I would be a lot more inclined to give cycles to something like Folding@Home if Stanford gave me a tax deduction for the value of my donation. Using the above numbers, even if you assume the PS3s only do Folding roughly half the time and that they're all newer PS3s, that's still some $2.5 million per year donated to Folding.
posted by jedicus at 12:56 PM on May 28, 2008
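The arithmetic in that comment is just watts × hours × rate. A minimal sketch for plugging in your own hardware's draw, using the 10.31 cents/kWh average rate quoted above (the wattages are the comment's figures, not measurements):

```python
# Annual electricity cost of running a client around the clock,
# at the average US rate quoted in the comment above.
RATE_USD_PER_KWH = 0.1031

def annual_cost(watts: float, hours_per_day: float = 24.0) -> float:
    """Dollars per year to run a device at the given draw."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * RATE_USD_PER_KWH

print(f"original PS3 (200 W): ${annual_cost(200):.2f}/yr")  # ~$180
print(f"PS3 on standby (2.3 W): ${annual_cost(2.3):.2f}/yr")  # ~$2
```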


@Skorgu: My apologies, you're absolutely correct (I was tangling things up in my head when I wrote that last response).
posted by tybeet at 1:02 PM on May 28, 2008


It's probably easier on the environment for scientists to use spare distributed cycles from pre-existing computers (efficient use of resources) than to build their own cluster and do it in-house, with the environmental impact of building new machines, using them inefficiently over their potential lifespan, and having them end up in a Chinese peasant's cabbage patch. Plus, the programs can be configured to run only after X amount of inactive time, so they don't interfere with your workload. And I'd be curious to know more about the "wear and tear" on a solid-state component. The only decent arguments for not doing it are:

1. I don't care
2. It's a PITA and not worth my time. What does it get me?
3. It costs electricity

The rest of it is just rationalization.
posted by stbalbach at 1:07 PM on May 28, 2008


@jedicus: You're right that the PS3 has substantial energy use when folding, but that's because of its inefficient design. You can't generalize this to all distributed computing.

Processors are becoming more efficient; AMD's Athlon X2 processors use only 45 watts, which is about as much as your average incandescent bulb. If your concern is cost: people often run under their monthly allotted kWh for their home - I know I do - so it's not a big deal. If your concern is for the environment or conservation, then that's a matter for another thread, but perhaps you might consider that the benefits to science (particularly considering there are programs looking at our climate) may outweigh the cost of the energy consumption, so that's really up in the air ;).
posted by tybeet at 1:11 PM on May 28, 2008


I run Folding@Home on my backup server. It's idle most of the time anyway.
posted by DJWeezy at 1:13 PM on May 28, 2008


Actually tybeet, as BeerFilter pointed out, the PS3 is pretty efficient at doing Folding@Home calculations. A new PS3 gets about 0.26 GFLOPS/W. A modern dual-core processor, by contrast, gets about 0.5 GFLOPS/W, which is better, but not by an order of magnitude. Adding a GPU client increases the efficiency to about 0.9 GFLOPS/W (but increases the total power consumption to about the same as the PS3).

My concern isn't so much the cost itself as the fact that the cost is hidden. Distributed computing projects need to do more to make it clear to people that your electricity bill will go up as a result of participating. And because many of these projects are tax-exempt, I would like to see a structure in place for getting a tax deduction from the donated cycles, at least to cover the cost of the electricity used.
posted by jedicus at 1:36 PM on May 28, 2008


tybeet: I've looked at it some more and have come to the conclusion that Intelligence Realm is really just one guy who's read a little too much Kurzweil: lots of vague hand-wavy dreams for the future but not much actual expertise or knowledge. He claims that right now all they're doing is "capacity testing" a neural network; minor details such as implementing machine learning will come later.

Unless he's doing some shady computation during that 'capacity testing' which he's not telling anybody about, the project at this point is just wasteful, not evil. Even so: I'd be wary about running any sort of distributed app without a good amount of trust in whoever's backing it; anybody can write a BOINC app, but that's no guarantee that it does anything useful.

(Then again, my personal favorite distributed-processing application arguably doesn't do anything useful either.)
posted by ook at 2:25 PM on May 28, 2008


stbalbach, regarding wear and tear on solid state components, I can only offer an anecdote from my own experience:

My first F@H client was a home-built Athlon XP 2500+. I kept F@H on it so that it never went idle. I was either gaming, using it, or it was running at 100% load from F@H at all times. It ran hot at stock speeds anyway, but I had overclocked it from 1.8 GHz to 2.2 GHz. It had a cooler and fan on it louder than any I've ever heard outside of a server room. During the summer (no AC) I had to take the side panel off and have a box fan blowing into it to keep it from crashing.

It's also the only machine I've ever had where the processor failed. Once it died I think I got a bottom-end Athlon 64, and was simply amazed at how much cooler it ran. I stopped overclocking CPUs. I was still running F@H until I got my current Athlon X2, but I stopped after a few months when they were coming out with the GPU clients that wouldn't run on my video card, and the numbers just didn't add up anymore.

I'm no electrical engineer, and I used to think there was no such thing as "wear and tear" on a CPU, but I've come to think that there can be an effect on overall life of a CPU when comparing a proc that sat idle most of the time at 35°C versus a proc that sat at full load most of the time at 70°C.
posted by BeerFilter at 4:43 PM on May 28, 2008


Processors do wear out, but if you run them at fairly normal temperatures, they'll last 10+ years. If you stress them the way you did, by inadequately cooling them, then yes, they can fail sooner. Having to blow a fan into the case to keep it from crashing means it's absolutely certain that you weren't cooling the chip adequately. That's why it failed early.

You might have made the easy and simple mistake of using too much heatsink compound. People think it's supposed to go on like peanut butter, but it's supposed to be a microscopic film. If you get the heatsink wrong, or if you don't get the fan on tight enough, then the system fans can whine like crazy without actually doing very much cooling. If the heat isn't properly wicked out to the heatsink, it won't be removed properly from the system.

Further, the 32-bit Athlons had no onboard temperature sensor, and would cheerfully slag themselves almost instantly if you removed the heatsink. Intel CPUs have never done that, and modern Athlons don't either; they all have onboard temperature sensors and will throttle themselves if they're running too hot.

As long as you're cooling a chip adequately, it doesn't particularly matter whether you use it or not. Modern CPUs are very good at saving power when not being used, so running a distributed client can have a noticeable impact on your electricity bill. Overall reliability, however, should be entirely unaffected.
posted by Malor at 6:47 PM on May 28, 2008


tybeet: Skorgu is not trolling, you are just very confused about how processors work, and he was attempting to set you straight. He is right: having multiple CPUs on one socket (multi-core) is totally unrelated to the property of having a 64-bit instruction set and address space. They have nothing to do with one another, whatsoever.

Also, while the Athlon64/Opteron processors were much faster per watt when they were competing against Pentium4-based Netburst processors, that is not the case at all anymore. Comparing an AMD Phenom against an Intel Core 2 Quad, the Phenom uses 1.5x as much power and is less than 2/3 as fast.
posted by blasdelf at 8:17 PM on May 28, 2008


Thanks tybeet for this post. I've been running distributed computing clients for over a decade, starting with the encryption-cracking clients back in the mid-90s (the point there was to show how breakable the encryption was).

I obsessively checked the SETI@Home site every day in '98-'99 and joined the day they went public in May 1999. Still running it!

The link you had under "projects" in your original post deserves to be mentioned again:

Distributed Computing projects that benefit humanity


The best analysis and overview of available DC clients I've seen yet!
posted by intermod at 8:22 PM on May 28, 2008


I've come to think that there can be an effect on overall life of a CPU when comparing a proc that sat idle most of the time at 35°C versus a proc that sat at full load most of the time at 70°C.

Boosting the voltage isn't good either (a common overclocking trick). None of that overclocking stuff is terribly clever. Colorful, yes; clever, no.
posted by ryanrs at 5:35 AM on May 29, 2008


@blasdelf: I think you missed my second comment.
posted by tybeet at 9:20 AM on May 29, 2008


This may be naive of me, but I've often wondered if something like Amazon's Mechanical Turk could be created to encourage participation in remote distributed computing.

Here's the model:

As a research group, you can rent a vast network of machines.
As the "turk-ish" business, you pay a percentage of the rental to your nodes
As a node, you get a cut proportional to the work your CPU achieves.

I'm sure many people have junk computers just taking up space somewhere. Why not plug those things in and make some additional cash?
posted by TimeTravelSpeed at 9:25 AM on May 29, 2008
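The three-step model above can be sketched as a simple proportional split: the "turk-ish" broker takes a fee off the rental, and each node gets a share of the remainder proportional to the work it contributed. All of the names, the fee, and the work-unit counts below are invented for illustration:

```python
# Hypothetical payout split for the "Mechanical Turk for CPU cycles" idea.
# broker_fee is the fraction the middleman keeps; work_units maps each
# node to the number of work units it completed. All values are made up.
def payouts(rental_usd: float, broker_fee: float,
            work_units: dict[str, int]) -> dict[str, float]:
    pool = rental_usd * (1 - broker_fee)      # what's left for the nodes
    total = sum(work_units.values())
    return {node: pool * units / total for node, units in work_units.items()}

split = payouts(1000.0, 0.20, {"node_a": 600, "node_b": 300, "node_c": 100})
print(split)  # node_a: 480.0, node_b: 240.0, node_c: 80.0
```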


@TimeTravelSpeed: this goes back to DU's somewhat cynical post. I agree with you in thinking there is potential for this to become financially driven, especially considering that not everyone who needs supercomputing power needs to own a supercomputer - that's a huge investment, and you may only need it for the weekend.

My personal opinion is that as this moves further away from science, and more towards things like animation, it may become economical for some businesses to design a client with a very short deadline. If that happens, just consider the ramifications it would have for the scientific community - certainly you wouldn't want to give clocks away for free when someone else will pay for them.

However, is it better how it is now? jedicus makes a good point about hidden costs - we're basically donating anywhere from $15 to $200 a year to science. Whether this kind of donation is better because it's tangibly being used for computation is beside the point: it's hidden. If we had instead written a check for that amount to the Stanford department of biology/chemistry that heads up Folding@Home (or, more to the point, given money to a walk for breast cancer rather than giving clocks to cancer computation), we could have taken a tax deduction for a charitable contribution.

It's interesting to think about.
posted by tybeet at 10:48 AM on May 29, 2008




This thread has been archived and is closed to new comments