Homebrewed CPU
May 28, 2009 5:51 AM Subscribe
Intel’s fabrication plants can churn out hundreds of thousands of processor chips a day. But what does it take to handcraft a single 8-bit CPU and a computer? Give or take 18 months, about $1,000 and 1,253 pieces of wire.
“Computers can seem like complete black boxes. We understand what they do, but not how they do it, really,” says Chamberlin. “When I was finally able to mentally connect the dots all the way from the physics of a transistor up to a functioning computer, it was an incredible thrill.”
Anyway, that's pretty cool. The thing is, though, that even without buying a "real" CPU you can buy chips that have things like logic gate packages, ALUs, flip-flops and other components you would use to make a CPU. What you would end up with would look more like an old circuit board from back when Ataris were popular, rather than a huge mess of wires.
And of course you can also implement your design in an FPGA, which would take just a few seconds.
posted by delmoi at 6:00 AM on May 28, 2009
Another way to do this is to get a Computer Engineering degree. You don't bother with the actual hardware construction though.
posted by smackfu at 6:02 AM on May 28, 2009 [2 favorites]
I think This is the link that should be the second link. Check out this Board layout. Actually it looks like he did what I was talking about: take a bunch of 'component' chips to build a CPU.
posted by delmoi at 6:03 AM on May 28, 2009
When I worked for Sperry back in the day, the CPU of the computers was one or more circuit boards with discrete elements. It looked about the same. To reinvent the wheel like this is really cool.
posted by RussHy at 6:12 AM on May 28, 2009
Another way to do this is to get a Computer Engineering degree.
I've only taken one "true" CE course, but it taught me enough to design and implement a simple CPU, which could have been loaded onto an FPGA. That was the only course I took that dealt with actual circuits and logic gates; everything else was about higher-level programming languages. We learned how to make all of the components that go into a CPU: flip-flops, multiplexers, registers, adders, multipliers, ALUs. I think if I sat down with a book I could probably do it.
It was actually really cool; at that point I understood how a computer operates all the way down to the transistor level. Before that, even though I was a pretty good programmer, it was always kind of mysterious exactly how CPU instructions were actually run.
The point is, you could probably learn to do this in six months in your spare time. I think This is the textbook we used (it has a new cover, though), and it looks like the book comes with some software on CD you can use.
posted by delmoi at 6:15 AM on May 28, 2009 [1 favorite]
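For anyone who never took that course, the adders-from-gates step mentioned above is small enough to sketch. Here is a rough Python stand-in for the hardware (an illustration only, not anything from the linked textbook): a one-bit full adder built from XOR/AND/OR, chained into an 8-bit ripple-carry adder.

```python
# Minimal sketch: an adder built out of logic gate operations, the way an
# intro computer-engineering lab has you do it (Python standing in for hardware).

def full_adder(a, b, carry_in):
    """One-bit full adder: two XORs for the sum, two ANDs and an OR for the carry."""
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def ripple_carry_add(x, y, width=8):
    """Add two unsigned integers by chaining `width` full adders together."""
    carry, result = 0, 0
    for i in range(width):
        bit_sum, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit_sum << i
    return result, carry          # final carry = overflow out of the top bit

print(ripple_carry_add(200, 100))  # (44, 1): 300 wraps to 44 in 8 bits, carry set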
How about a functioning CPU from bare components done in a weekend? Meet joeprocessor. The open directory has general notes about the implementation in a text file; the rest are scans of the design notes and photos of the CPU itself, built on a breadboard instead of wire-wrap.
Designed by Joe Holt, who was also a major contributor to Adobe Acrobat and Adobe Illustrator and is currently a professor at Bennington College.
posted by plinth at 6:48 AM on May 28, 2009
The thing is, though, that even without buying a "real" CPU you can buy chips that have things like logic gate packages, ALUs, flip-flops and other components you would use to make a CPU. What you would end up with would look more like an old circuit board from back when Ataris were popular, rather than a huge mess of wires.
That's what we did in my computer engineering class. But I always felt like the biggest "hack" in the whole thing was that they hand you an ALU, which you wire up to all the other components. Once you have a black box that performs basic arithmetic operations, the rest is easy.
I think I was one of the last classes to actually do that by hand with physical components. Now it's all done in simulation with a graphical layout tool, but you get to make 16 and/or 32 bit computers instead of the simple 8-bit ones.
posted by deanc at 6:51 AM on May 28, 2009
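For readers wondering what being handed an ALU actually buys you: it is essentially a function-select box. A rough behavioural sketch in Python, loosely in the spirit of a 74181-style TTL part but with made-up opcodes, not the pinout or function table of any real chip:

```python
# Rough behavioural model of the kind of 4-function, 8-bit ALU "black box"
# a class project hands you.  Op codes are invented for illustration.

def alu(op, a, b):
    a &= 0xFF
    b &= 0xFF
    if op == 0b00:          # ADD
        result = a + b
    elif op == 0b01:        # SUB via two's complement
        result = a + ((~b + 1) & 0xFF)
    elif op == 0b10:        # AND
        result = a & b
    else:                   # OR
        result = a | b
    carry = (result >> 8) & 1
    result &= 0xFF
    zero = int(result == 0)
    return result, carry, zero   # result bus plus two status flags

# The rest of the CPU just routes register contents into a/b, picks `op`
# from the decoded instruction, and latches the result back into a register.
print(alu(0b01, 5, 5))   # (0, 1, 1): 5 - 5 = 0, carry set meaning "no borrow"
```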
Almost all the components come from the 1970s- and 1980s-era technology.
Yeah, that's pretty cool, I guess. But I want to build this.
posted by DU at 6:58 AM on May 28, 2009 [2 favorites]
Yeah, that's pretty cool, I guess. But I want to build this.
Whoa - it's set up in someone's bedroom! I suppose the soft clicking of relays might be soothing at night.
posted by exogenous at 7:06 AM on May 28, 2009
Computer Organization and Design was the book we used for our "this is how processors are built from the ground up" class. One of my favorite textbooks, and the used copies are very affordable. Don't worry about them being out-of-date... an intro CE class only covers 20-year-old technology anyways.
posted by smackfu at 7:07 AM on May 28, 2009
I think I was one of the last classes to actually do that by hand with physical components. Now it's all done in simulation with a graphical layout tool, but you get to make 16 and/or 32 bit computers instead of the simple 8-bit ones.
We had that, but the first labs involved us actually pushing our designs to an FPGA so we got to push buttons and make things light up. After a while though we kept everything on the PC.
posted by delmoi at 7:19 AM on May 28, 2009
That modem blew me away, but not for any of the reasons he highlighted. He talks a bit about how hard it was to connect to a computer, but surely that's just a matter of matching the DB9 specs? He even specifically mentions voltage levels. Serial comm hasn't changed that much.
Whereas he completely breezes past the far harder connection: The modem on the other end. The 300 baud code on the server end can't have been tested all that thoroughly. Or is it really just a matter of speed, and all the code is completely general? Even so, cranking the speed of any program down by a factor of 20 is likely to expose bugs, but he had zero problems.
Kudos to the server end's terminal software.
posted by DU at 7:23 AM on May 28, 2009
Another way to do this is to get a Computer Engineering degree. You don't bother with the actual hardware construction though.
I submit that you do not really know how something is built until you bother with the actual hardware construction.
posted by DU at 7:28 AM on May 28, 2009
I submit that you do not really know how something is built until you bother with the actual hardware construction.
Eh. My memory of electronics labs was that you spent 5 minutes thinking about what you were going to do, 10 minutes wiring it up, and an hour figuring out which wire wasn't connected right, or which resistor was red-black-red instead of red-blue-red. The only thing I learned doing that was that I should be a software guy.
Whereas he completely breezes past the far harder connection: The modem on the other end.
I guess that the other modem must just default to 300 baud when it gets no response to the handshake, which is convenient.
posted by smackfu at 7:37 AM on May 28, 2009
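For the software folks, the red-black-red vs. red-blue-red mixup above is a 2 kΩ resistor swapped for a 2.6 kΩ one. A quick sketch of the standard three-band color code, for anyone who wants to check the arithmetic:

```python
# Decode the first three resistor color bands (digit, digit, multiplier).
DIGITS = {"black": 0, "brown": 1, "red": 2, "orange": 3, "yellow": 4,
          "green": 5, "blue": 6, "violet": 7, "grey": 8, "white": 9}

def resistor_ohms(band1, band2, multiplier):
    return (DIGITS[band1] * 10 + DIGITS[band2]) * 10 ** DIGITS[multiplier]

print(resistor_ohms("red", "black", "red"))  # 2000 ohms (2k)
print(resistor_ohms("red", "blue", "red"))   # 2600 ohms (2.6k)
```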
Actually, after watching the video I was thinking he should have visited the mobile version of Wikipedia, which would have looked better in Lynx. Then I ended up reading the entire article on modems. It's really interesting to see how that technology evolved as well.
posted by delmoi at 7:44 AM on May 28, 2009
@smackfu: Eh. My memory of electronics labs was that you spent 5 minutes thinking about what you were going to do, 10 minutes wiring it up, and an hour figuring out which wire wasn't connected right, or which resistor was red-black-red instead of red-blue-red. The only thing I learned doing that was that I should be a software guy.
Actually, that is exactly what you're supposed to learn from doing it in hardware. Engineers who have never been made to do that are the reason so much engineering turns out crap.
posted by localroger at 7:51 AM on May 28, 2009 [2 favorites]
My memory of electronics labs was that you spent 5 minutes thinking about what you were going to do, 10 minutes wiring it up, and an hour figuring out which wire wasn't connected right, or which resistor was red-black-red instead of red-blue-red. The only thing I learned doing that was that I should be a software guy.
The same ratio applies to software, IME. And sometimes that hour is wasted on finding a stupid problem, but often that hour is spent in learning that something you thought was true or obvious wasn't really. And that's when you really learn what's going on.
posted by DU at 7:53 AM on May 28, 2009
Well, I've known architects who really could have benefited from hammering a couple of goddamn boards together once in a while so they'd understand why their crazy-ass "warped wood focus wall / interstice" was a stupid idea...
posted by rokusan at 8:10 AM on May 28, 2009 [1 favorite]
No wireless. Less space than a nomad. Lame.
posted by mazola at 8:38 AM on May 28, 2009 [4 favorites]
I found this extremely cool, although not quite as cool when I realized it was largely wiring together existing chips. Still, kudos to him for learning how these things work from the high/low voltage and "1/0" level up. I've been wanting to do that for a while now, ever since the 1632 book series inspired in me this "Connections"-like desire to really understand how things work, and to understand how, if we needed to, we could pull ourselves back up to some semblance of the technology we're all so used to today. It's been a kind of dream of mine to see if I can't eventually homebuild a microprocessor sufficient to run a functional OS.
However, that homebuilt relay CPU is pretty awesome for that reason- it's not just skipping over sections of ALUs, but implementing everything you'd find on an 8-bit chip, in big clacky structures no less. :)
posted by hincandenza at 8:45 AM on May 28, 2009
Once you have a black box that performs basic arithmetic operations, the rest is easy.
what? that might be true for a class project but in the 'real world' the ALU is about the simplest part of the whole design. the complexity of the instruction fetch / instruction decoder in a modern x86 cpu is staggering. even if you were designing a risc processor, getting the pipeline right and handling all of the hazards is a heck of a lot more challenging than designing an ALU.
posted by joeblough at 9:05 AM on May 28, 2009 [2 favorites]
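To make the hazard point concrete, here is a toy illustration (not any real ISA or pipeline) of the read-after-write dependency a pipelined design has to stall or forward around, and which a single-cycle homebrew design never has to think about:

```python
# Toy read-after-write (RAW) hazard check for a classic 5-stage pipeline.
# Instructions are (dest, src1, src2) register names; no real ISA implied.
program = [
    ("r1", "r2", "r3"),   # add r1, r2, r3
    ("r4", "r1", "r5"),   # sub r4, r1, r5  <- needs r1 before it's written back
    ("r6", "r7", "r8"),   # independent, no hazard
]

def raw_hazards(instrs, distance=2):
    """Flag instructions that read a register written within `distance` slots.
    Without forwarding, each of these would cost stall cycles."""
    hazards = []
    for i, (_, s1, s2) in enumerate(instrs):
        for j in range(max(0, i - distance), i):
            dest = instrs[j][0]
            if dest in (s1, s2):
                hazards.append((j, i, dest))
    return hazards

print(raw_hazards(program))   # [(0, 1, 'r1')]
```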
what? that might be true for a class project but in the 'real world' the ALU is about the simplest part of the whole design. even if you were designing a risc processor, getting the pipeline right and handling all of the hazards is a heck of a lot more challenging than designing an ALU.
I'm not sure why this comment of yours was necessary-- I think it was pretty clear from the context of my comment that I was referring to the process of making a simple 8-bit, non-pipelined microprocessor from component parts. When one of the components is an ALU, meaning that you don't have to wire together chips that will do the binary arithmetic yourself, the whole project suddenly seems a lot easier.
posted by deanc at 9:17 AM on May 28, 2009
Yeah, but how long / much does the second one take / cost?
(this is cool)
posted by Reverend John at 9:45 AM on May 28, 2009
I have coded a simple (no interrupts, no periphs), 8-bit CPU on an FPGA in 24 hours. I planned out the instruction formats ahead of time and went in with a clear plan. It didn't work straight off, and by then I was really tired, so I didn't get it debugged until the next day. So a day of planning, a day of coding, and a day of debugging. Simple CPUs just aren't that complicated.
posted by ryanrs at 9:49 AM on May 28, 2009
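The "clear plan" is mostly deciding the instruction format up front. A toy example of what that planning produces, with an entirely made-up 8-bit encoding (2-bit opcode, two 3-bit register fields), not ryanrs's actual design, plus the fetch/decode/execute loop around it:

```python
# Toy 8-bit instruction word: 2-bit opcode | 3-bit register A | 3-bit register B.
# Invented for illustration; LOAD_IMM takes its value from the following byte.
LOAD_IMM, ADD, SUB, HALT = 0, 1, 2, 3

def run(program):
    regs = [0] * 8
    pc = 0
    while pc < len(program):
        instr = program[pc]
        op = (instr >> 6) & 0b11          # decode
        ra = (instr >> 3) & 0b111
        rb = instr & 0b111
        pc += 1
        if op == LOAD_IMM:                 # execute
            regs[ra] = program[pc]; pc += 1    # immediate lives in the next byte
        elif op == ADD:
            regs[ra] = (regs[ra] + regs[rb]) & 0xFF
        elif op == SUB:
            regs[ra] = (regs[ra] - regs[rb]) & 0xFF
        else:                              # HALT
            break
    return regs

# r0 = 7; r1 = 5; r0 = r0 + r1
code = [(LOAD_IMM << 6) | (0 << 3), 7,
        (LOAD_IMM << 6) | (1 << 3), 5,
        (ADD << 6) | (0 << 3) | 1,
        (HALT << 6)]
print(run(code)[0])   # 12
```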
Soul of a New Machine (featuring Metafilter's own Jessamyn's dad) has a lot about how they used to wire up the prototype boards for new computers by hand. It sounded awful.
posted by smackfu at 9:55 AM on May 28, 2009
Kids these days still do this in Comp Eng. I saw a 4th-year design project fair a few years back, and there's always one team that implements MIX on an FPGA. Doing it in TTL chips is obviously a lot harder, but only in terms of effort. I don't think it's conceptually any more difficult.
To make an analogy, most people buy cakes. Comp Eng students bake cakes given flour. This guy baked a cake given several bushels of wheat kernels. In any case, none of them are farmers out there laying out IC masks and debugging them with electron microscopes and such.
But the measure of difficulty aside, it's a cool project and it's awesome the same way all home-built stuff is awesome. In that awesome way.
posted by GuyZero at 10:06 AM on May 28, 2009
One of my friends went to Stanford, and they were required to build an 8-bit processor from scratch and use it to solve problems in a matter of just a few weeks. And this was back in the 80s, so there wouldn't have been FPGAs or any of the modern ways to cheat -- as far as I know, he designed and built it from scratch using single discrete transistors and wire.
He said it was one of the hardest things he'd ever done.
posted by Malor at 11:38 AM on May 28, 2009
You know, in thinking about it, that processor project might have been 4-bit, instead of 8-bit. Regardless, it was apparently a supremely difficult thing to do while trying to carry your other courses as well.
posted by Malor at 11:40 AM on May 28, 2009
Back in my day, if you wanted to know how computers worked you used marbles.
posted by digsrus at 12:02 PM on May 28, 2009
DU:
About the serial stuff: the RS-232 spec allows for a very wide range of voltages. Cheap USB-serial converters probably use as low a voltage level as possible. The old modem probably couldn't use it, either due to aging electronics or the less mature state of the RS-232 standard in 1964.
About the modem on the other end: The Livermore modem is emulating a 1962-era Bell 103, which any modem (hard or soft) made up through today can still speak. If you call a modem and let it scream at you long enough, you'll eventually hear the Bell 103 carrier tone.
posted by zsazsa at 12:09 PM on May 28, 2009 [1 favorite]
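For the curious, the Bell 103 scheme is simple enough to sketch: 300 bit/s frequency-shift keying, with the originating side sending roughly 1070 Hz for a space and 1270 Hz for a mark (2025/2225 Hz on the answering side), and each character sent asynchronously with start and stop bits. A rough sketch, assuming 8N1 framing:

```python
# Sketch of Bell 103-style 300 baud FSK: which tone is on the line for each
# bit of one asynchronous character (8 data bits, no parity, 1 stop bit).
ORIGINATE = {0: 1070, 1: 1270}   # Hz: space / mark, originating modem
ANSWER    = {0: 2025, 1: 2225}   # Hz: space / mark, answering modem

def frame_byte(byte):
    """Async framing: start bit (0), data bits LSB first, stop bit (1)."""
    return [0] + [(byte >> i) & 1 for i in range(8)] + [1]

def tones(byte, tone_table=ORIGINATE, baud=300):
    bit_time = 1.0 / baud                     # ~3.33 ms per bit
    return [(tone_table[bit], bit_time) for bit in frame_byte(byte)]

for freq, secs in tones(ord("A")):            # 'A' = 0x41
    print(f"{freq} Hz for {secs * 1000:.2f} ms")
```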
This is cool. I wish I had the money and the time, I would spend all my time building this kind of stuff.
My get-off-my-lawn story? In my CE class we first built AND and NAND gates using transistors, then we were allowed to buy a chip with NANDs on it to build an XOR gate with. We continued this way up to a full computer, building every component from more basic parts before we were allowed to buy it as a nice tiny black box. We ended up with a 4-bit processor with a very, very limited instruction set and less memory than displaying this word requires, which you could program either by burning something into an EEPROM with the expensive machines in the labs, or by moving jumpers and using a momentary pushbutton as a clock signal.
It took us 12 weeks, and it was just one of 7 or 9 classes we had to take.
It feels great to demystify a computer this way, anyone of normal intelligence could learn it in a few months, no need for any hard mathematics or physics. If you want to think of yourself as computer literate, you should give it a try, or at least read the Computer Organization and Design book someone recommended here, it is excellent.
At the end of the class, I had a box full of 'Matryoshka' breadboards. Anytime someone asked me to fix their computer, I would say 'Sure, but first let me explain to you how your computer works. See this ALU? It has a few hundred of these (pull breadboard out of box) inside it, and see this one chip here? It has (pull dustier breadboard out of box) dozens of these inside. And see that one chip in that corner? It has ......... and finally, do you see this? This is sand, made of silicon. Shall we talk about the periodic table and valence electrons?' No one asked twice.
posted by dirty lies at 12:35 PM on May 28, 2009 [2 favorites]
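The XOR-from-NANDs step mentioned above is the classic four-gate construction; here is a quick truth-table check of it (obviously not the breadboard):

```python
# The standard 4-NAND XOR: verify the construction against a truth table.
def nand(a, b):
    return 1 - (a & b)

def xor_from_nands(a, b):
    n1 = nand(a, b)
    return nand(nand(a, n1), nand(b, n1))

for a in (0, 1):
    for b in (0, 1):
        assert xor_from_nands(a, b) == a ^ b
print("4-NAND XOR matches the truth table")
```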
If this were my project, as soon as it started working I'd replace the crystal with a function generator and see how much faster than 8MHz it would go before amusing errors started pouring out. It's TTL, so it's not like it'll run any hotter.
Of course, my friends and I also joke about implementing something like a 486 core in InP and trying to run it at like 30GHz.
posted by 7segment at 12:38 PM on May 28, 2009
One of my Projects For When I'm Very Rich is a pneumatic computer. Basically, figure out how to make a NAND out of pistons and valves, then expand from there. Steam powered.
posted by qvantamon at 1:24 PM on May 28, 2009
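The "expand from there" part works because NAND is functionally complete: once you have a working NAND, whether in silicon or in pistons and valves, NOT, AND and OR are just plumbing. A quick check of the standard constructions:

```python
# NAND is functionally complete: NOT, AND and OR built from it alone.
def nand(a, b):
    return 1 - (a & b)

def not_(a):      return nand(a, a)
def and_(a, b):   return not_(nand(a, b))
def or_(a, b):    return nand(not_(a), not_(b))

for a in (0, 1):
    for b in (0, 1):
        assert and_(a, b) == (a & b)
        assert or_(a, b) == (a | b)
    assert not_(a) == 1 - a
print("NOT, AND, OR all reduce to NAND")
```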
Stop by the Computer History Museum and see the Babbage Machine while it's still there. Hand-cranked computing power. With carry bits.
posted by GuyZero at 1:30 PM on May 28, 2009
If you want to make a CPU, here's the kit to do it. A million gates, some flash, some dram, crude VGA, and a keyboard port. Plenty of resources to make a 32-bit processor. Costs $200 and programming tools are free from Xilinx.
You should be able to make something capable of running Linux, should you choose to take it that far.
posted by ryanrs at 2:42 PM on May 28, 2009
About the serial stuff: the RS-232 spec allows for a very wide range of voltages. Cheap USB-serial converters probably use as low a voltage level as possible.
Or maybe just a USB port limitation. USB has a single 5V power line, which is on the low end for RS-232.
posted by smackfu at 2:50 PM on May 28, 2009
A MAX232 or the like.
posted by blenderfish at 3:29 PM on May 28, 2009
It's worth noting that the Commodore 64 had exclusively TTL-level RS-232. (RS-232C, they called it, IIRC) So, this has been a problem for a long, long time. :)
posted by blenderfish at 3:32 PM on May 28, 2009
Danny Hillis (yes, that Danny Hillis) built a "computer" that plays Tic-Tac-Toe. Out of Tinkertoys. (I met him, too, but all he wanted to talk about was video CODECs.) It gets a brief mention in his book The Pattern in the Stone.
Also - first computer EVAR? Hoax? Dunno. Interesting, though, anyway, for the drawings if nothing else.
One of the 'advanced' exercises (which older EEs would consider somewhere between 'basic' and 'insultingly easy') they require you to do when you get to a certain level in Crestron programming is to build a toggle (aka a T flip flop) out of a series of AND (or OR) and NOT symbols. Most of the software guys throw their hands up in despair after a few minutes.
Silly software rabbits. It's all just gates. (or should that be - it's all just Gates?)
posted by ostranenie at 4:55 PM on May 28, 2009
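For the software guys throwing their hands up: the toggle exercise is tricky because the output has to feed back into the inputs without racing, which is why the usual answer is a master-slave arrangement. A behavioural sketch of that structure (latches modelled directly rather than drawn gate-by-gate with AND/OR/NOT symbols):

```python
# Behavioural model of a T flip-flop built master-slave style from two gated
# D latches, with the feedback D = T XOR Q.  The gate-level exercise is this
# same structure, just drawn out with AND/OR/NOT symbols.
class DLatch:
    def __init__(self):
        self.q = 0
    def step(self, d, enable):
        if enable:                 # transparent while enabled, holds otherwise
            self.q = d
        return self.q

class TFlipFlop:
    def __init__(self):
        self.master = DLatch()
        self.slave = DLatch()
    def clock(self, t):
        """One full clock cycle: the master samples while the slave holds,
        then the slave copies the master -- so Q changes only once per cycle."""
        d = t ^ self.slave.q       # toggle feedback
        self.master.step(d, enable=1)
        self.slave.step(self.master.q, enable=1)
        return self.slave.q

ff = TFlipFlop()
print([ff.clock(t=1) for _ in range(4)])   # [1, 0, 1, 0] -- it toggles
```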
Also, they had readymade adapters to hook the 'user port' of the C64 to a serial modem. (I know this because I am old, and I had one.)
posted by ostranenie at 4:56 PM on May 28, 2009
How about building your own Apple II in '79? Granted, he had inside information, but...
posted by ostranenie at 4:59 PM on May 28, 2009 [1 favorite]
the Commodore 64 had exclusively TTL-level RS-232. (RS-232C, they called it, IIRC)
No, RS-232C is simply the third (fourth?) revision of the standard. As far as I know, TTL-level async serial is not part of any of the "232" standards, though it's pretty common. (There's also the issue that TTL-level serial is usually inverted w.r.t. RS-232, with +5V being MARK and 0V being SPACE, instead of ~ -9/+9 volts respectively.)
posted by hattifattener at 5:17 PM on May 28, 2009
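To put numbers on that inversion, here is the same bit stream at nominal TTL and RS-232 line levels (nominal figures only; real drivers vary):

```python
# Same async bit stream, two physical encodings (nominal voltages only).
# Logical 1 = MARK, logical 0 = SPACE.
def ttl_level(bit):
    return 5.0 if bit == 1 else 0.0        # TTL-style: mark high, space low

def rs232_level(bit):
    return -9.0 if bit == 1 else +9.0      # RS-232: mark negative, space positive

bits = [0, 1, 1, 0, 1]                     # arbitrary sample of line bits
print([ttl_level(b) for b in bits])        # [0.0, 5.0, 5.0, 0.0, 5.0]
print([rs232_level(b) for b in bits])      # [9.0, -9.0, -9.0, 9.0, -9.0]
# Note the polarity flip: a MAX232-type driver both level-shifts and inverts.
```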
But I always felt like the biggest "hack" in the whole thing was that they hand you an ALU which you wire up to all the other components. Once you have a black box that performs basic arithmetic operations, the rest is easy.
Aw, the ALU is the fun part. None of that pipelining bullshit, nosirree.
posted by spaceman_spiff at 8:29 PM on May 28, 2009
Yay! He is at Maker Faire this weekend. Nice guy; took the time to answer some questions. I felt like I could have easily bugged him with questions for a couple of hours; unfortunately, I felt that way at almost every exhibit. He did mention that one of the things he'd learned was that there is a reason why processors have an output pin telling whether or not they are currently accessing memory. Also, I gather VGA was a bitch, and he had to 'cheat' on the serial port by getting an adapter that emulates RS-232 over a USB connection (not sure if it uses software on the other end or what).
posted by BrotherCaine at 1:32 AM on May 31, 2009
@ostranenie: The A.K. Dewdney article was an April Fools joke.
posted by Hello Dad, I'm in Jail at 2:50 AM on May 31, 2009
This thread has been archived and is closed to new comments