Ants with dead-end vision, backtracking capabilities
January 7, 2014 1:26 AM

I’m trying to build a jigsaw puzzle. I wish I could show you what it will be, but the picture isn’t on the box. But I can show you some of the pieces that snapped into place this year, and try to share a context for why they mattered so much to me.
Bret Victor discusses scientific thinking and computing from a deeply humane perspective through the eyes of Douglas Engelbart, Alan Kay and other great thinkers of our time.
posted by Foci for Analysis (30 comments total) 40 users marked this as a favorite
 
Interesting snippets. Not sure his pieces are all from the same puzzle though. And he's a bit late catching up with Julian Jaynes.
posted by Segundus at 3:32 AM on January 7, 2014


Gosh, this is good stuff. Thanks for posting.
posted by EinAtlanta at 5:24 AM on January 7, 2014


Tried to read it, but didn't get far.
Can someone talk me through it? These people are suggesting that a different way of thinking has arisen in the modern era? And they are trying to say why?
posted by YAMWAK at 5:33 AM on January 7, 2014


always funny to see david hestenes pop up, patron saint of every undergraduate looking for the reason they don't understand vector calculus... until they discover that clifford algebras are worse.
posted by ennui.bz at 5:56 AM on January 7, 2014


i like what's going on here. does modern information overload necessitate the distillation of communication to digestible blurbs, or is it the natural outcome of an inherent intellectual laziness in people? check out all my big dumb words. preponderance. ruminate. zeitgeist. let's all read more.
posted by Quart at 6:18 AM on January 7, 2014 [2 favorites]


I think those are excerpts of linked articles, although the article titles look like plain text.
posted by thelonius at 6:32 AM on January 7, 2014


Can someone talk me through it?

As I understand it these are just quotes from books Bret Victor read this year and was struck by, not all of them new, together with his brief comments.

He thinks they might all fit together in some way, but that's just his impression; he's not telling us how because he doesn't exactly know yet (the jigsaw box doesn't have a picture on it).

Basically, some guy's personal 2013 list of great reads.
posted by Segundus at 6:35 AM on January 7, 2014


Thanks. I clicked through to some of the texts, but I really didn't get the feeling of a cohesive whole.

I don't think his 'jigsaw puzzle' analogy is quite right - the individual components would need some processing before they could fit together into a single document. Not a jigsaw puzzle but rather an unfinished meal. The ingredients need cooking and serving.
posted by YAMWAK at 6:46 AM on January 7, 2014 [1 favorite]


The "reading tips" alone are some of the most powerful ideas I've encountered in a while... let alone the materials he introduces.

I keep finding clues that quaternion mathematics is an extremely powerful tool for understanding the physical universe, and the gateway through Clifford algebra might be enough to get me there.

I strongly believe that there is a lot of physics that we just can't do because we've lost some part of the understanding in a very deep way in the past 150 years... but I don't believe the conspiracy theory that it was JP Morgan who had it done deliberately.

I MUST understand this stuff... put it up there near #1 on my bucket list.

Thank you, thank you, thank you for giving me this pointer.
posted by MikeWarot at 7:03 AM on January 7, 2014 [2 favorites]


Bruno Latour is one of my fave shit-disturbers from way back, and his reading tips are thoughtful and useful, to me.

It's nice, too, that there's (apparently) a lady on his list! I do think it would be good if he found some stuff on cognition by people who don't present as white. Like Charles Mills, for a very readable start. Maria Lugones.

But I guess I should be making these suggestions to *him*.
posted by allthinky at 7:46 AM on January 7, 2014


The ingredients need cooking and serving.

Well thank god we were all born with our own minds capable of processing and digesting complex, orthogonal ideas.
posted by crayz at 8:21 AM on January 7, 2014 [1 favorite]


but I don't believe the conspiracy theory that it was JP Morgan who had it done deliberately.

Wait...what?
posted by yoink at 8:51 AM on January 7, 2014


I've been reading Surfaces and Essences, the book by Hofstadter. I'm a bit annoyed at it, to be honest; it feels like he restates the same idea too many times.
posted by sonic meat machine at 9:04 AM on January 7, 2014


I love the jigsaw puzzle metaphor. What fun!
posted by mondo dentro at 9:19 AM on January 7, 2014


I strongly believe that there is a lot of physics that we just can't do because we've lost some part of the understanding in a very deep way in the past 150 years

Could you expand on this? The past 150 years--hell the past five--have been an almost uninterrupted period of pushing the boundaries of our understanding of the physical universe.
posted by feckless fecal fear mongering at 9:37 AM on January 7, 2014


it feels like he restates the same idea too many times.

That's because a Hofstadter entity is something that restates Hofstadter entity ideas.
posted by weston at 9:51 AM on January 7, 2014 [5 favorites]


<subscribing to discussion here>
posted by benito.strauss at 10:02 AM on January 7, 2014


Could you expand on this? The past 150 years--hell the past five--have been an almost uninterrupted period of pushing the boundaries of our understanding of the physical universe.

Yeah, I'd hardly posit 1860 as a high water mark of physics...
posted by sonic meat machine at 12:16 PM on January 7, 2014


Print was directly responsible for the emergence of a literate and educated society, which (for example) made possible the idea of societal self-governance.

Has anyone told the Athenians?
posted by GeorgeBickham at 2:12 PM on January 7, 2014


As you read and watch Alan Kay, try not to think about computational technology, but about a society that is fluent in thinking and debating in the dimensions opened up by the computational medium.

reading about what chris crawford's been up to -- "Kay pushed Crawford to do better in everything he made. He would tell his friend: 'If you don't fail at least 90 per cent of the time, you're not aiming high enough.' " -- reminds me of his 'history of thinking' :P

re: quaternions, wait til you get to the octonions! (i also vaguely feel that p-adic probabilities might have to do with something ;)

oh and speaking of ants (and the oneness of everything...)

cheers!
posted by kliuless at 4:56 PM on January 7, 2014 [1 favorite]


You think I'm foolish for saying physics could have been faster.... well here is my world view, which you may consider, and are welcome to disagree with, but it's consistent for me:

The past 150 years of physics have done quite an impressive number of things, true... but I think there are a lot of things we're just getting around to, that could have been done long ago, if we didn't get our mathematics wrong. The Aharonov-Bohm effect is just starting to make inroads in physics, for example... but if you look at the original equations for electrodynamics, you'll find that the magnetic field B is the curl (a derivative) of A, the vector potential. The quaternion versions of the equations explain a lot of things, or so it appears.
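(For reference, the standard relations I'm leaning on here: B = ∇ × A and E = −∇φ − ∂A/∂t, and the Aharonov-Bohm phase a charge q picks up around a closed loop is Δφ = (q/ħ) ∮ A · dl, which is why the potential A starts to look more fundamental than the fields it generates.)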

By analogy: Computing has done a massive amount of things in the past 50 years.. but if Babbage had actually shipped hardware instead of trying to leapfrog himself, we could have had that same level of progress 100 years earlier.

Royal Rife invented an optical microscope that works well past the commonly accepted diffraction limits, and patented it in 1929... people are just getting around to figuring out that it actually does work.

Tesla had figured out how to pump the ionosphere and extract power from it, like a laser... 100 years ago... and we're not even trying to do it now.

There has been amazing progress in the past 150 years... but there could have been more, much much more.

Oh.. and we could have computers that are actually secure, but everyone copied Unix, and forgot about real security because they thought it was good enough.
posted by MikeWarot at 5:37 PM on January 7, 2014


Thanks for the post. This is a lot more digestible than what I was trying to do earlier this week.

A few days ago I was wondering what makes computation different and why it has happened now; so on a whim I spent more than a few hours of unguided exploration on the history of mathematics via Wikipedia.

Now I'd like to thank kliuless for reminding me to watch the rest of an Alan Kay lecture that was paused on one of my browser tabs. Mathematics is still not a subject I know a lot about and so an ant metaphor/model seems apt to describe my meandering path.

While writing this post I decided that the ant model is just as aptly applied to the history of mathematical exploration and discoveries in general. In mathematics the most blind ants have been pushing paths into the most obscure and distant corners. Some of those ants have had more social capital and better timing than other ants (intellectual pheromones perhaps) and those paths are now well trodden.

On the subject of computation: it feels like a brand new modern ant superhighway to a faraway place, but I guess, as kliuless mentions, that would be missing the fact that we have learned to make ant highways that allow higher volumes of ants to travel at higher speeds to a whole range of destinations (if we build in those directions).

In high school in the late 80's I remember chafing at the method of mathematics I was being taught. I didn't have a computer but I knew enough about them to feel very strongly that my teachers should be showing me how to model all this stuff computationally. It was a naive view but I still have that view and many others like it.
posted by vicx at 6:24 PM on January 7, 2014


Computing has done a massive amount of things in the past 50 years.. but if Babbage had actually shipped hardware instead of trying to leapfrog himself, we could have had that same level of progress 100 years earlier.

Well, no. Computers, like all human technology, rely on a synergy of an enormous network of other things to even exist. The existence of a computer on your desk right now represents an enormous interdependent confluence of technologies and social norms that simply could not have existed in the world in the Victorian era, unless you say "oh and if everything else was totally different too."

The past 150 years of physics have done quite an impressive number of things, true... but I think there are a lot of things we're just getting around to, that could have been done long ago, if we didn't get our mathematics wrong.

'We could have done X, if only Y were different' is kind of a silly mental exercise don't you think? I mean, we could have had widespread use of penicillin during plagues, if only we hadn't been getting our biology wrong.
posted by feckless fecal fear mongering at 10:25 PM on January 7, 2014


Modern computers, as they are currently built, do depend on an enormous network of parts, true enough. But this is one solution in a vast problem space; there are many other ways that haven't been explored yet, or likely won't ever be explored. It would take an extreme amount of hubris to declare that we have the lowest-cost solution in the space constrained by available technology.

Babbage pushed his problem space quite hard: we have standardized screw thread sizes, and quite a few mechanical standards are still influenced by the funds he put through his machinist of choice.

I came up with the idea of a simple mechanical programmable logic array that Babbage's machinist could have turned out in mass quantities... 3 inputs, 1 output, with a disc with 8 positions for pins to contain the program state. You could make a honeycomb out of them, and be doing boolean logic in no time. It's all feasible with what was available back then.
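To make that concrete, here's a rough sketch in modern terms (Python purely as illustration; the PinDisc name and the example programs are mine, nothing Babbage-era about them): each disc is just an 8-entry truth table indexed by its three inputs, and discs can be chained so one's output becomes another's input.

# One pin disc: 3 inputs, 1 output. The 8 pin positions are the 8 rows
# of a truth table; a pin present at a position means "output 1".
class PinDisc:
    def __init__(self, pins):
        assert len(pins) == 8          # one position per input combination
        self.pins = pins

    def __call__(self, a, b, c):
        # Read the three inputs as a 3-bit index into the disc.
        return self.pins[(a << 2) | (b << 1) | c]

# Two example "programs": three-input AND, and majority vote.
and3 = PinDisc([0, 0, 0, 0, 0, 0, 0, 1])
majority = PinDisc([0, 0, 0, 1, 0, 1, 1, 1])

# Chaining discs: majority(a, b, carry_in) is the carry-out of a full adder;
# feed one disc's output into the next and you can build any Boolean circuit.
print(and3(1, 1, 1), majority(1, 0, 1))   # -> 1 1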

Solid-state electronics is only one of many possible applications of the lithography we've developed to support the manufacture of ICs... there are many others, some of which may even be superior to anything we can imagine as a group... but individuals within that group could push very hard in one direction and get something considered impossible.

I see how you think it takes a huge supply chain, but it's not really that awful if you only want to make the first few of something. The supply chain is to drive the price down, not to make something possible.
posted by MikeWarot at 10:53 PM on January 7, 2014


The supply chain is to drive the price down, not to make something possible.

Um, no. The supply chain is exactly what makes it possible. You could not be typing on that computer without someone yanking oil out of the ground and turning it into plastic. And silicon and rare metals and and and.

I'm sorry but you just don't know what you're talking about.
posted by feckless fecal fear mongering at 5:13 AM on January 8, 2014


The supply chain makes it cheap, and not a laboratory one-off masterpiece. The Antikythera mechanism didn't have a modern supply chain to turn it out in abundance, but it was made in the 1st century BC. It had technology not seen again until the 14th century.

The first transistors didn't have a supply chain.. they were one-offs. This computer I'm using IS the result of a long supply chain, but it's possible to replace it without that chain. The replacement won't be as good, nor interchangeable, but it can be replaced, as long as the knowledge of how it functions, and how those components work, can be maintained.

It's a trade-off between mass production, with its efficiency of scale (along with defect management), and a hand-built one-off.

Again, I didn't say it would be easy to replace, just that it was possible.

---

I believe that it's possible to build a fully functional computing engine out of Babbage-era precision machined parts. It wouldn't be von Neumann architecture, but it would be Turing complete.
posted by MikeWarot at 7:14 AM on January 8, 2014


we could have computers that are actually secure, but everyone copied Unix, and forgot about real security because they thought it was good enough.

It seems to me that the von Neumann architecture, in which program instructions and data can be treated as the same thing, could never be truly secure.
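A toy illustration of that conflation one level up the stack (Python at the interpreter level rather than the hardware case, and the names are made up): the same bytes are inert data right up until something decides to execute them.

# A string read in from outside is "just data"...
payload = input("enter an expression: ")
# ...until we hand it to the machinery that runs instructions.
result = eval(payload)   # type 2+2 and it's arithmetic; type something nastier and it isn't
print(result)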
posted by thelonius at 7:16 AM on January 8, 2014


Sorry dude, all I can do here is repeat myself:

'We could have done X, if only Y were different' is kind of a silly mental exercise don't you think? I mean, we could have had widespread use of penicillin during plagues, if only we hadn't been getting our biology wrong.
posted by feckless fecal fear mongering at 8:15 AM on January 8, 2014


I'm not disputing the past, but I'm saying the resources were available to make computing work; it was bad organization that prevented it from happening. Babbage almost single-handedly held computing back 100 years.
posted by MikeWarot at 10:17 AM on January 8, 2014


'We could have done X, if only Y were different' is kind of a silly mental exercise don't you think?

judea pearl might beg to differ! :P
posted by kliuless at 3:36 PM on January 8, 2014




This thread has been archived and is closed to new comments