

Eureqa!
January 15, 2012 8:21 PM   Subscribe

Wired called it 'A Robot Scientist.' H+ Magazine asked, 'Signs Of The Singularity?' Even the more pedestrian Science News titled their article 'Software Scientist.' So what is Eureqa?

Eureqa is a software package that takes complex data and produces equations that describe the data by evolving solutions from a population of random programs. Based on genetic programming, it combines an elegant interface with a simple way to elicit expert knowledge from the user. But most importantly, it provides a testable way to differentiate general principles from trivial models.

With this approach Eureqa has accurately replicated some fundamental laws of physics and has discovered new models describing biological systems.
posted by BillW (24 comments total) 45 users marked this as a favorite
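The post's description of evolving equations from a population of random programs is, at its core, tree-based genetic programming. Here is a minimal sketch of that idea, not Eureqa's actual implementation; every function name and parameter below is invented for illustration:

```python
import random, operator

# Expression trees: leaves are 'x' or constants; internal nodes are (op, left, right).
OPS = {'+': operator.add, '-': operator.sub, '*': operator.mul}

def random_tree(depth=3):
    """Build a random expression tree up to the given depth."""
    if depth == 0 or random.random() < 0.3:
        return 'x' if random.random() < 0.5 else random.uniform(-2, 2)
    op = random.choice(list(OPS))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    """Evaluate an expression tree at a given x."""
    if tree == 'x':
        return x
    if isinstance(tree, (int, float)):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def fitness(tree, data):
    # Mean squared error against the observed (x, y) pairs; lower is better.
    return sum((evaluate(tree, x) - y) ** 2 for x, y in data) / len(data)

def mutate(tree):
    # Replace a random subtree with a fresh random one.
    if random.random() < 0.3 or not isinstance(tree, tuple):
        return random_tree(2)
    op, left, right = tree
    if random.random() < 0.5:
        return (op, mutate(left), right)
    return (op, left, mutate(right))

def evolve(data, pop_size=60, generations=40):
    """Truncation selection: keep the best quarter, refill with mutants."""
    pop = [random_tree() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda t: fitness(t, data))
        survivors = pop[:pop_size // 4]
        pop = survivors + [mutate(random.choice(survivors))
                           for _ in range(pop_size - len(survivors))]
    return min(pop, key=lambda t: fitness(t, data))

# Target: y = x**2 + x, sampled at a few points.
random.seed(1)
data = [(x / 2, (x / 2) ** 2 + x / 2) for x in range(-6, 7)]
best = evolve(data)
print(fitness(best, data))
```

Real symbolic regression systems add crossover (swapping subtrees between parents), protected division, and pressure toward shorter equations; Eureqa in particular reports models along a Pareto front of accuracy versus complexity, which is what the post means by differentiating general principles from trivial models.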

 
Another note on Eureqa: is it a sign that science is changing into something dealing with data sets so vast that human scientists may no longer be able to comprehend it all?
posted by Apocryphon at 8:29 PM on January 15, 2012 [2 favorites]


Thanks for the links. It seems like a promising augmentation to the current tool-set available to scientists. I was also reminded of this article, which talks about how, as more and more skilled human work becomes something machines can do, we don't have much in the way of alternative human work to replace it. Only so many people can make and maintain the robots.
posted by codacorolla at 8:32 PM on January 15, 2012


Thank you for this FPP on a potentially interesting and important subject. But without expert input, it's too susceptible to armchair philosophizing. I'm interested to see input from MeFi users who are working scientists. I suspect their comments would be rather nuanced and contain lots of caveats.
posted by Nomyte at 8:36 PM on January 15, 2012


pedestrian? I guess news has to be strapped to a rocket filled with cocaine nowadays to be legitimate.
posted by edgeways at 8:37 PM on January 15, 2012 [1 favorite]


Okay two points:

1) This doesn't sound like some breakthrough; most likely they're packaging a bunch of known algorithms together into a usable product for non-computer-scientists. Stuff like Weka has been around forever. Deriving something simple like Newton's laws might not be that complicated. Unfortunately, people who write science articles have a tendency to claim everything new is super-revolutionary.

2) They're using something called symbolic regression. It's been around for a while. The bibliography on that link is all books from the late 90s.

Maybe this will be useful software for scientists. But these articles are hype overload.
posted by delmoi at 9:02 PM on January 15, 2012 [1 favorite]


pedestrian? I guess news has to be strapped to a rocket filled with cocaine nowadays to be legitimate.

Well, that's actually what you don't want. Rockets filled with cocaine may be exciting but they will probably not give you an accurate picture of what's actually happening.
posted by delmoi at 9:05 PM on January 15, 2012


To be fair, H+ kind of thinks *everything* is a sign of the singularity. It's kind of the point of H+.
posted by vertigo25 at 9:17 PM on January 15, 2012 [1 favorite]


I'm interested to see input from MeFi users who are working scientists.

Well, I'm an engineer and I spend more time using MeFi than working, but I do produce models of complex data for my job. Furthermore, I've written genetic algorithms to produce models in an automatic fashion before, which is exactly what Eureqa does.

Unfortunately, I deal with discrete choice data, which is something Eureqa doesn't do. I actually tried loading in a small problem I've been meaning to spend a day working on: determining what factors influence whether workers work at home. Eureqa helpfully suggested I remove the outliers, i.e. all the records of people who do work at home. I may try and see if I can't find a continuous data problem to hand it, to see if it's actually useful.

The interface is pretty decent, though, and it has an option to use Amazon's cloud computing resources built-in; I didn't try that, but in the right problem domain, it could be good.

My guess, though, is that a lot of real-world problems need expert knowledge to produce a useful result, and there is no way to put that in Eureqa. Cute toy, though, if you happen to have some pendulum data.
posted by Homeboy Trouble at 9:49 PM on January 15, 2012 [3 favorites]


Yeah, the biggest problem with AI, really, is getting data and massaging it into a format where you can even use these great algorithms.
posted by delmoi at 11:57 PM on January 15, 2012 [1 favorite]


The term "curve fitting" is often used in a derogatory manner for researchers who are simply applying (overly) complex models to data without adding theoretical insight. This is a "curve fitter". It may be useful in some applications, but to call this "science" is really stretching it. Without theory, functions that describe curves are mere trivia.
posted by Philosopher Dirtbike at 12:31 AM on January 16, 2012 [1 favorite]


I'm pretty convinced that we've already had five or six singularities. The most recent was the cell phone boom, and sound recording before that. These things changed the world in ways no one guessed at beforehand; the only thing keeping them from being recognized as singularities is that they happened too slowly to satisfy the singularitarians.
posted by LogicalDash at 12:38 AM on January 16, 2012


I'm pretty convinced that we've already had five or six singularities. The most recent was the cell phone boom, and sound recording before that. These things changed the world in ways no one guessed at beforehand; the only thing keeping them from being recognized as singularities is that they happened too slowly to satisfy the singularitarians.
The cellphone boom? Like, before I had to be at home to get a call, now I can get a call anywhere? That's your idea of a singularity? It's something that had much less of an impact than the car, the train, the personal computer, or the steamship. Or the printing press, for that matter. I mean, yeah, our lives are slightly different than before. Recorded sound is important, but it's one of many major changes that took place in the last 250-300 years, each of which changed the world in an enormous way.

I think the technological singularity is pretty unlikely. But the idea is not just "our lives will be slightly different" but rather that the pace of change will keep accelerating toward infinity.

Of course, if you compare the pace of change over the last 10 years to the 1990s, that seems really ridiculous. People went from a 25MHz 486 to a 3.2GHz Pentium 4 in the 90s. This decade we've gone from the iPaq to the iPhone. CPU speeds are still in the 3GHz range on the desktop.
posted by delmoi at 1:13 AM on January 16, 2012 [1 favorite]


A pet peeve of mine, LogicalDash, but that's not what a technological singularity is, at least not as Vinge originally talked about it. Vinge observed that the kinds of events you describe— they're not really events, but they're spans of time such that someone living before one would have a hard time understanding the world after one— exist through history and seem to be coming more frequently over time. The singularity is when (if) this comprehensibility horizon drops to zero.

The kinds of things you refer to are called singularities by some writers (I think of them as "Strossian singularities", as opposed to "Vingean" singularities). They're easier to write about than a real singularity (which is by definition hard-to-impossible to write about from this side; hence the tendency to sidestep it with the now-clichéd "rapture of the nerds" setting) but they're also not as intrinsically interesting.

There are plenty of flaws to pick at in the Vingean singularity notion, of course, but the existence of earlier horizons is not one of them.

Back on topic:

The term "curve fitting" is often used in a derogatory manner for researchers who are simply applying (overly) complex models to data without adding theoretical insight. This is a "curve fitter".

Or as Hamming famously observed, The purpose of computing is insight, not numbers. Eureqa strikes me as similar to, say, data-visualization software. It can point out relationships in the data, and that can lead to theoretical insight. (Historically there can be a long delay there— many physical theories are curve-fit before anyone figures out why that curve applied to that system. Like, IIRC, Rydberg's formulas for hydrogen spectra, which only later were explained by quantum mechanics. But Rydberg's observations still helped other theorists like Bohr.)
posted by hattifattener at 1:25 AM on January 16, 2012 [2 favorites]
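The Rydberg example above is easy to make concrete: given Balmer-series wavelengths, the Rydberg constant falls out of a one-parameter fit, with no quantum mechanics required. A toy sketch in plain Python; note the "measurements" here are synthesized from the formula itself, purely for illustration:

```python
# Balmer series: 1/lambda = R_H * (1/2**2 - 1/n**2) for n = 3, 4, 5, ...
R_H = 1.0968e7  # Rydberg constant for hydrogen, in 1/m

ns = [3, 4, 5, 6]
wavelengths = [1 / (R_H * (0.25 - 1 / n**2)) for n in ns]  # synthetic "data"

# Curve fitting: treat x = (1/4 - 1/n^2) and y = 1/lambda, and fit y = R*x
# by least squares through the origin: R = sum(x*y) / sum(x*x).
xs = [0.25 - 1 / n**2 for n in ns]
ys = [1 / lam for lam in wavelengths]
R_fit = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

print(R_fit)  # recovers R_H, with zero insight into *why* the formula holds
```

Rydberg had essentially this: a formula that nailed the data decades before Bohr explained where it came from. That's the sense in which a "curve fitter" can still feed theory.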


delmoi, maybe your life is slightly different, but I was thinking of all those Indian farmers who can communicate all of a sudden. Also I was using that as my particular example of mass connectivity. Also smartphones have more weirdness going on than you would believe.
posted by LogicalDash at 3:37 AM on January 16, 2012


So does this mean scientists have time to do philosophy again, finally?
posted by edguardo at 4:07 AM on January 16, 2012


The term "curve fitting" is often used in a derogatory manner for researchers who are simply applying (overly) complex models to data without adding theoretical insight. This is a "curve fitter". It may be useful in some applications, but to call this "science" is really stretching it. Without theory, functions that describe curves are mere trivia.

So what we need is a robot that can tell us what stuff means, right?
posted by edguardo at 4:47 AM on January 16, 2012


I actually work using genetic programming on "real world" problems and things like Eureqa are a definite improvement for certain kinds of problems. However, I must take exception to the offhand way the interface is dismissed. After all, Apple's products "just" have a very nice interface - but that makes the difference between interesting and effective. Putting a good interface on GP is no mean feat and one that the community has been struggling with for some time.

That said, it surely doesn't make scientists obsolete, and I believe almost everything to do with The Singularity falls under Sturgeon's Law. But what GP does give you is insight into things that would otherwise be at least very obscure. I work for a company developing molecular diagnostics for cancer, and the combinations of genes that we find are often unique and unexpected - things that would have been very difficult to find a priori. Some friends of mine at Dow Chemical use GP to accelerate first-principles modeling by getting an initial estimate that they can hand to a chemist as a starting point. This typically reduces the development time from ~10 months to 1.
posted by BillW at 5:15 AM on January 16, 2012 [1 favorite]


delmoi, CPU speeds may have flatlined but that's because chip makers have been going the multicore route. MIPS has still been rising, so you can't really insinuate that we've come to a standstill in Moore's law.
posted by daHIFI at 8:42 AM on January 16, 2012


I wouldn't bet against Moore's Law! In more than 30 years of professional programming, I have heard that it was no longer true no fewer than four times. GP also has the nice property of being very parallel in nature, so we get a lot of bang for our buck, which is a good thing as we surely need it!

As an aside, most GP folks think of machine learning as different from AI as we are in no way trying to imitate human intelligence or the mechanisms of the brain. Instead we learn by example, usually in fairly mechanistic ways. In the case of GP, by borrowing the rough principles of evolution.
posted by BillW at 8:53 AM on January 16, 2012


Moore's Law has never been about clock speeds. If you want evidence of the continuing exponential growth in transistor density, look at flash storage.
posted by ryanrs at 11:24 AM on January 16, 2012


A practicing econometrician weighs in. Even though I agree that this is really cool, I'm also seconding the argument that this seems to be a very advanced form of curve-fitting. (If possible, I'm going to check it out on some data when I go into work tomorrow.) That's no small accomplishment, of course, but without an underlying explanation that you can feel in your kishkas, it's not science. But it might make some kinds of science easier.
posted by 314/ at 8:23 PM on January 16, 2012


delmoi, maybe your life is slightly different, but I was thinking of all those Indian farmers who can communicate all of a sudden. Also I was using that as my particular example of mass connectivity. Also smartphones have more weirdness going on than you would believe.
Which would have happened with traditional phone service, had it reached them. The problem was the reach of the technology, not the type. And I'm not saying that it hasn't changed society, but it hasn't made life 'incomprehensible'. A housewife from the 1950s would have no trouble using an iPhone. The concepts of dialing a phone, sending a telegram, and taking a photo were all around her already. The only difference is in convenience.
delmoi, CPU speeds may have flatlined but that's because chip makers have been going the multicore route. MIPS has still been rising, so you can't really insinuate that we've come to a standstill in Moore's law.
I think you're inverting causality here. Two things:

1) If you can make a 4-core 1GHz chip or a 1-core 4GHz chip, the 1-core 4GHz chip will be faster. It will be a true 4x, while the 4-core chip could be anywhere from 1x to 3.9x as fast, depending on the task. And it's much more difficult to program well for multi-core.

2) By the same logic, a 4-core 4GHz chip would be anywhere from 4x to 16x as fast.

What's interesting is that the latest chips from Intel and AMD can actually run at 8GHz or so, as long as you keep them in contact with liquid helium. The problem mostly has to do with the amount of heat generated, rather than any hard physical limitation.
posted by delmoi at 9:56 PM on January 16, 2012
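The 1x-to-4x range for a quad-core in the comment above is essentially Amdahl's law. A quick sketch of the standard formula, not tied to any particular chip:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Speedup of a task where `parallel_fraction` of the work scales
    across `cores` and the remainder stays serial."""
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

# A fully serial task gains nothing from 4 cores; a fully parallel one gets 4x.
print(amdahl_speedup(0.0, 4))  # 1.0
print(amdahl_speedup(1.0, 4))  # 4.0
print(amdahl_speedup(0.9, 4))  # ~3.08: even 90%-parallel code falls short of 4x
```

A 4GHz quad-core compounds the two effects: the higher clock speeds up everything uniformly, while the extra cores multiply only the parallel portion, which is why the combined range is 4x to 16x rather than a flat 16x.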


A practicing econometrician weighs in. Even though I agree that this is really cool, I'm also seconding the argument that this seems to be a very advanced form of curve-fitting. (If possible, I'm going to check it out on some data when I go into work tomorrow.)

I'm actually kind of skeptical about their use of genetic algorithms here. One thing that makes life so diverse on Earth is that there's genetic antagonism as well as just optimization. You don't get the 'best' solution but rather a solution that can survive against other things that evolved to eat it. Would humans ever have gotten so smart if we weren't both hunters and hunted? Probably not.

So when you actually use GAs you can get really efficient outcomes, but they don't always show a lot of creativity, as things end up in local maxima. I doubt they have any kind of competition, and even if they did, that wouldn't make the solutions better, just more interesting.

Looking through their user docs, it's kind of lame. It seems like they just have a bunch of algebraic primitives, which means whole classes of interesting problems won't get solved - anything based on something more computationally complex than an algebraic equation. It sounds like over-fitting could be a huge problem as well (meaning you tweak your rules until you find something that matches the data by coincidence, not because of the underlying pattern).

But the big advance Newton made is that he came up with new mathematics, namely calculus, rather than just using "off the shelf" mathematical primitives.

Their hype said they derived the "laws of motion" from input data... but obviously you could get that if you included integrals and stuff to begin with! Had their system derived calculus from basic rules that would have been more impressive :P

That doesn't mean it's not a useful tool. It looks like it has a nice UI, despite the obvious .NET crap. (The new Office UI was innovative and cool design, IMO; using an "Office UI widget" and adding a bunch of ugly buttons you made in Paint is not.)
posted by delmoi at 11:16 PM on January 16, 2012
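The over-fitting worry in the comment above is easy to demonstrate: a model with enough free parameters can match the training data exactly by coincidence and still miss the underlying pattern. A toy pure-Python sketch, using exact polynomial (Lagrange) interpolation as the "too flexible" model:

```python
import random

def lagrange_predict(points, x):
    """Evaluate the unique polynomial passing exactly through `points` at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        term = yi
        for j, (xj, _) in enumerate(points):
            if i != j:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

random.seed(0)
true_f = lambda x: 2 * x + 1                       # the real underlying pattern
train = [(x, true_f(x) + random.gauss(0, 0.5)) for x in range(6)]   # noisy samples
test = [(x + 0.5, true_f(x + 0.5)) for x in range(6)]               # held-out points

# The degree-5 interpolant hits every training point exactly...
train_err = sum((lagrange_predict(train, x) - y) ** 2 for x, y in train)
# ...but wiggles between them, so it misses the held-out points badly.
test_err = sum((lagrange_predict(train, x) - y) ** 2 for x, y in test)

print(train_err, test_err)
```

This is exactly the failure mode that held-out validation catches, and it's why symbolic-regression tools tend to favor the simplest equation that fits rather than the best-fitting one.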


sweet! altho i'd caution :P
posted by kliuless at 8:22 AM on January 17, 2012




This thread has been archived and is closed to new comments