Does the world want BOINC?
August 9, 2023 12:48 PM

David P. Anderson, founder of volunteer scientific computing software BOINC, looks back at its successes and failures. BOINC is still available and providing researchers with PetaFLOPS of computing power, but maybe hasn't achieved what Anderson wanted: "I thought that volunteer computing would become an important paradigm for scientific computing, and that it would last forever. But somehow - in spite of the efforts of me and lots of others - it doesn't look like this will happen."
posted by Tehhund (28 comments total) 13 users marked this as a favorite
 
- Your computer runs hot, and the fan may run at a higher speed.
- Your electric bill may increase noticeably.


These points seem significant. Computers have generally gotten better at not doing stuff when they are not doing stuff.
posted by atoxyl at 1:02 PM on August 9, 2023 [4 favorites]


WOW.

This is really (really) long but looks SO incredibly interesting on first read! I've known about SETI@home and Folding@home (pretty sure I participated in SETI@home for a little while), but I had no idea of the breadth and longevity of the project (or even that so many projects were related).

I am really, really looking forward to reading this whole thing. I am very glad to know it exists. I may come back later and comment more when I've had a chance to read more, but for the moment: thank you so much for posting this, Tehhund. I am grateful to have learned about BOINC today, and about David P. Anderson and all the many fine people he names in his big long post.
posted by kristi at 1:07 PM on August 9, 2023 [1 favorite]


I tried BOINC, but the settings to control how much of my CPU would be used never seemed to work properly. For Folding@home (which I guess is similar in concept?) there was never a native Apple Silicon client, and so I stopped that too. I am open to the idea of donating unused computing power, but the implementations I have experienced so far have not been particularly user friendly.
posted by modernnomad at 1:09 PM on August 9, 2023 [3 favorites]


I was originally enthusiastic about volunteer computing, but the fact of the matter is that electricity costs are high enough that you have to seriously consider just how much you're donating to the project, both economically and environmentally, versus just turning off your computer and then waiting for it to boot again.

Boot times have been remarkably reduced since the aughts: Apple Silicon Macs are now effectively ready the second you lift the lid or touch the keyboard, and even mighty Threadripper desktops can be cold booted in 5-10 seconds, just enough time to take a sip of your tea and crack your knuckles before you enter your password. In the aughts, you got a personal benefit out of leaving your computer on, in the form of zero boot time; today, not so much.

That makes "just leave it on, and we'll use it to do something nice" a difficult value proposition with today's electricity prices. At, say, the kind of $.44kWh charges you might see in California, you might be talking about an extra $100+/year of electricity. Considering the environmental effects of electricity use in areas still using fossil fuels (almost everywhere), many of us believe there are likely to be repercussions beyond the mere cost. So you have to ask yourself something like "how much good am I doing versus the harm of all this extra electricity use?" It's never been entirely clear, and energy use is fraught enough this summer to make almost anyone think twice about letting their computing hardware crank 24/7 without clear evidence of blockbuster results.
posted by I EAT TAPAS at 1:12 PM on August 9, 2023 [5 favorites]


I had a startup in 2000 in this market in the same era when David was briefly at United Devices. We went broke, mostly because of the tech downturn and the difficulty of raising more capital. I'm very grateful we shut down cleanly and quickly and could get on with our lives. Some of the other startups had raised more money and stumbled along for years. United Devices for instance raised $45M by 2007, then got bought by Univa, now apparently owned by Altair. None of them have had much impact or success. It's still an area people dabble in; the latest variant is scavenging time on GPUs for AI training.

Volunteer computing may not have succeeded in the last 25 years but cloud computing sure has, as has the coordination required for running jobs across a lot of unreliable machines. Back in 2000 our biggest problem was explaining to companies the idea of running their code on computers they didn't own. Now that's the way everything works, albeit centralized in a few cloud computing providers like Amazon, Google, Microsoft, Baidu, etc.

And yeah, the improvement in low power modes in consumer computers definitely makes the whole volunteer thing less interesting. In the year 2000 your only realistic power saving mode was turning the computer entirely off, something a lot of people didn't do because it'd take 4 minutes to turn it on again. Now computers regularly cut their power consumption to just a few watts unless being actively used, so running SETI@Home or whatever has a significant marginal cost.

I appreciate David's reflections on BOINC. There's a lot of detail in there, particularly about what I'd call the politics of community projects. I'm glad it's all written down.
posted by Nelson at 1:29 PM on August 9, 2023 [7 favorites]


scientific progress goes BOINC
posted by avocet at 1:31 PM on August 9, 2023 [27 favorites]


came here to say what avocet said, am pleased that someone said it before me. sense of faith in the correctness of the universe: slightly restored!
posted by bombastic lowercase pronouncements at 1:38 PM on August 9, 2023 [5 favorites]


Regarding power consumption: David laments that there's no way to tell the OS or hardware "run this task in your lowest power mode" because that would probably provide more FLOPS/watt than ramping the CPU and GPU to maximum speed:
It would be great if BOINC could, by default, compute in a low-power state. It would get a bit less computing done, but there would be significantly less energy usage, heat, and fan activity. The problem was that, as far as anyone could tell, there was no way for BOINC to put the CPU in a low-power state, or even to learn what the current power state was. We would have needed operating system support for this.
I don't know if that's still a limitation or if OSs can do this now.
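
For what it's worth, Linux at least exposes this kind of thing to user space now via the cpufreq sysfs interface. A minimal sketch, assuming a machine with a cpufreq driver loaded (illustrative only, not anything BOINC itself does):

    # Read (and, with root, request) a CPU power policy on Linux via cpufreq sysfs.
    # Paths are standard on modern kernels, but which governors exist depends on
    # the driver (e.g. intel_pstate usually offers "performance" and "powersave").
    from pathlib import Path

    CPUFREQ = Path("/sys/devices/system/cpu/cpu0/cpufreq")

    def current_governor() -> str:
        return (CPUFREQ / "scaling_governor").read_text().strip()

    def available_governors() -> list[str]:
        return (CPUFREQ / "scaling_available_governors").read_text().split()

    def request_powersave() -> None:
        # Requires root; that requirement is itself part of the problem here.
        (CPUFREQ / "scaling_governor").write_text("powersave\n")

    if __name__ == "__main__":
        print("governor:", current_governor())
        print("available:", available_governors())

Writing the governor needs elevated privileges, though, which is probably part of why a user-space client can't just quietly do this on its own.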
posted by Tehhund at 2:14 PM on August 9, 2023 [2 favorites]


Obviously the other time people might want to do this is when electricity prices go negative. I suspect that the overlap between BOINC people and people with smart electricity tariffs is pretty large, so I can imagine that working out pretty nicely.
posted by ambrosen at 2:22 PM on August 9, 2023 [5 favorites]


David laments that there's no way to tell the OS or hardware "run this task in your lowest power mode"

Intel's 12th-generation Alder Lake CPUs have what are called "efficiency cores", which are specifically designed for low-priority computation. I know this because VMware Workstation insists on using those cores instead of the performance cores, making it super slow. So: there is definitely an option these days, on at least some machines.
posted by grumpybear69 at 2:27 PM on August 9, 2023 [2 favorites]


These days every major OS can manually set or throttle the clock speed of the CPU cores, pin jobs to specific cores, or turn off cores entirely. This capability is available to user code, at least in Windows and Linux, so I think you could write a BOINC executor that used it. Not sure it's particularly useful though; watts used is more or less directly proportional to CPU cycles. Sure you could use fewer watts but then you're getting a lot less work done. I think "it would get a bit less computing done" is wildly optimistic.
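
For example, here's a sketch of what such an executor could do on Linux with nothing but the standard library: pin the worker to a couple of cores and drop its scheduling priority before launching it. The worker command is a placeholder, and none of this is BOINC's actual code:

    # A "polite" launcher: restrict the child process to a couple of cores and
    # give it the lowest conventional priority. Linux-only (sched_setaffinity
    # and preexec_fn are not available on Windows).
    import os
    import subprocess

    def run_politely(cmd, cores={0, 1}):
        def demote():
            os.sched_setaffinity(0, cores)   # confine the child to these cores
            os.nice(19)                      # lowest conventional priority
        proc = subprocess.Popen(cmd, preexec_fn=demote)
        return proc.wait()

    if __name__ == "__main__":
        # Placeholder workload: burn a little CPU on the pinned cores.
        run_politely(["python3", "-c", "sum(i*i for i in range(10**7))"])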

I think the GPU version of this idea can still be interesting, mostly because there's such a shortage of GPUs. There are still a lot of gamer enthusiasts who own them, and maybe it'd be worth the trouble to be able to borrow them. It might cost $1 to $5 a day in electricity to run something full speed, but companies are willing to pay that. It'd still be easier to run a job on a special GPU compute cluster, but it's getting hard to lease time on those.

(It's worth noting there are enormous costs to running jobs on scavenged CPU/GPU time. It's got to be a lot better than a leased cloud facility to be worth the bother.)

See also The rise and fall of the PlayStation supercomputers, a brief fad using the PS3's unique parallel execution design.
posted by Nelson at 2:45 PM on August 9, 2023 [1 favorite]


I listened to a fascinating episode of Ezra Klein's podcast (which is often great) on new uses of AI for scientific computing. Google's DeepMind team went to work on the protein folding problem and won an international competition in ~2018, solving 100 protein structures accurate to within the width of an atom.

The program took just a few seconds to generate a structure from a protein's amino acid sequence. They realized "We could solve all proteins in about 2 years". So they did that. They found the structure of [if I heard right] 280 million proteins from 20 important animal and plant species.
posted by neuron at 2:50 PM on August 9, 2023 [8 favorites]


Protein folding is super interesting. Folding@Home is one of the longest-running CPU-scavenging systems out there; it dates back to the year 2000. There was a brief effort in 2006 to run FAH on BOINC, but it didn't take off. Too bad; the neat thing about BOINC was that it was a general-purpose platform and could run lots of types of jobs (although David does a good job explaining the drawbacks of that).

FAH is still running, but I can't comment on the science or whether its approach is still relevant. Google's result with folding was based on very different algorithms; that was the real breakthrough.
posted by Nelson at 3:06 PM on August 9, 2023 [2 favorites]


I'm pretty sure I was running some kind of distributed scientific computing before 2000; I remember running it on a few office computers in a job I had back in DC in the mid-90s. Not sure which one it was.
posted by tavella at 3:19 PM on August 9, 2023 [1 favorite]


I started with the original SETI@Home back in the 90s. I was highly involved in BOINC back in the 2000s. I helped migrate stuff from whatever bug tracker they had to Trac (I think). I volunteered as a forum moderator at a couple of different places, including Einstein@Home. For a while, they had a volunteer group that offered assistance to people who were trying to get BOINC to work and couldn't. I was an alpha tester for the core software itself and for quite a few projects. I got to know a bunch of cool people and learned a lot about computers. It was a good decade-plus.

There has been a lot of science done with BOINC: gravitational waves, getting the LHC up and running, protein folding at Rosetta@Home...


eta: I even got a mention in the article. ;)
posted by kathrynm at 3:42 PM on August 9, 2023 [9 favorites]


Back then, screensavers mattered. In fact, some people called SETI@home "screensaver computing".
I think that's a big one. It's one thing to swap screensavers when you're running one anyway, it's another to install one when you're not. The cool visuals were a big part of the appeal, even if they weren't really necessary.
posted by ChurchHatesTucker at 4:28 PM on August 9, 2023 [3 favorites]


Archiveteam's warrior is an interesting parallel to BOINC. The "ArchiveTeam's choice" option is the equivalent of BOINC's "Science United".

(We of Archiveteam also dabbled in distributed volunteer storage with the IA.BAK project, but it never got off the ground, much like his own forays into it didn't.)
posted by joeyh at 4:35 PM on August 9, 2023 [2 favorites]


Honest question, since it's been so many years...is this at all similar to using your computer to help SETI, which I remember being a 90s, maybe 2000s thing?
posted by tiny frying pan at 4:36 PM on August 9, 2023


FFS if you read the article or any of the comments here you'll find out if it's similar to using your computer to help SETI.
posted by Nelson at 5:10 PM on August 9, 2023 [4 favorites]


It's a crazy-long article. Anyway, BOINC actually grew out of the project you're thinking of (SETI@home) - initially SETI@home was its own program, but they made BOINC to be a more general program that things like SETI@home could use.
posted by Tehhund at 5:25 PM on August 9, 2023 [3 favorites]


(Thanks, Tehhund!)
posted by tiny frying pan at 6:12 PM on August 9, 2023 [1 favorite]


Mod note: Reminder to be kind and considerate when commenting. Let's keep these threads productive and an open space for conversation/curiosity.
posted by travelingthyme (staff) at 6:35 PM on August 9, 2023 [6 favorites]


He mentions this briefly, but I think Bitcoin et al. have given many likely volunteers for BOINC very similar, though worthless, projects to work on.

Crypto ate up all the attention that the "PR", as he says, would have garnered.
posted by eustatic at 8:11 PM on August 9, 2023 [4 favorites]


I was in grad school in the mid-2000's, and it was an era of dreaming about volunteer peer-to-peer systems that spanned the internet. Distributed systems papers envisioned a world of millions of unreliable nodes with dubious internet connections, and worked out lots of clever ways to deal with that. Then AWS picked up steam, and cloud data centers were more reliable and had better networks than random volunteer nodes, which justified the cost. If we're looking for reasons BOINC hasn't become more mainstream, I'd guess that it's because it's not too expensive to use AWS for the same kinds of high-throughput scientific computing workloads. That and the power consumption cost for volunteers, as others have mentioned.
posted by qxntpqbbbqxl at 8:48 PM on August 9, 2023 [4 favorites]


I appreciate David's reflections on BOINC. There's a lot of detail in there, particularly about what I'd call the politics of community projects. I'm glad it's all written down.

Yes, this is really great, and I appreciate the self-reflection aspect of it.

Being an old PHP coder, I had to take a look under the hood. And yeah, that seems about right for 15+ years of PHP code. (By which I mean it’s not an example of how you should do anything, but I’m not surprised it has been the foundation for good work for a long time.)
posted by jimw at 8:49 PM on August 9, 2023 [1 favorite]


I started running SETI@Home on literally Day One, May 17th, 1999. It had been in closed beta for a while and I was in the habit of checking for it at lunchtime. I nearly spat out my meal when I saw that the day had arrived. From then followed years of installing it on more and more powerful computers, and plenty of those were work computers that it shouldn't have been on.

I moved to BOINC when S@H did, and am running it right now -- I just checked and it's doing two Einstein@Home tasks. Seriously, it's just a dumb little thing that runs in the background (and yes, of course they implemented low-priority tasks, duh). It looks like I gave it 50% of my CPU, and if it's consuming power or making fan noise I haven't noticed. I do have a low-power Shuttle computer (one is seen in a photo in David's article!) which is designed to be quiet and efficient. I'm not a gamer :)

My computers also worked on distributed.net tasks, hunting for prime numbers, and I remember the EFF-sponsored races to crack the RC5 encryption codes. I just love the concept of distributed computing!

SETI@Home itself wound down the handout of new tasks (chunks of data to crunch on) a couple years ago.

With the insane amount of CPU power that we have now, it saddens me that BOINC hasn't done more. On the other hand, cloud- and super-computing has also become insanely powerful. Just harder for little projects to get to, I guess, and certainly not free.
posted by intermod at 8:34 PM on August 10, 2023 [1 favorite]


I too ran SETI@home when it was the thing, and one of the first things I did with my PS3 was set up the built in Folding@home engine.

I quit running SETI@home because the Athlon Thunderbird I had at the time would just cook itself under load. No matter how good the cooler, no matter what thermal paste, it just ran super hot. Eventually it started crashing randomly under load and I knew that if I kept running it full tilt it was only going to get worse, so once it got that bad I quit.

I also ran F@H on my Core2Duo laptop for a while, but then the fan died after sitting at 100% for months on end, so I stopped doing that after replacing the fan. Kept the PS3 going for some years, burning around 140 watts 24/7. That didn't seem like a lot back in the days of incandescent light bulbs.
posted by wierdo at 9:37 PM on August 10, 2023 [2 favorites]


I should have mentioned that I have two desktop computers that I tinker with, and they are always running BOINC. I haven't upgraded the CPUs in years but I have a friend who buys a new GPU every year or two and sells me his old one for cheap, so those upgrades allow me to accumulate imaginary BOINC points faster and faster.
posted by Tehhund at 2:55 AM on August 11, 2023 [1 favorite]




This thread has been archived and is closed to new comments