8-bit computer from Scratch
June 1, 2017 4:40 AM   Subscribe

Build an 8-Bit Computer from Scratch : "I built a programmable 8-bit computer from scratch on breadboards using only simple logic gates. I documented the whole project in a series of YouTube videos and on this web site." via
posted by Gyan (20 comments total) 44 users marked this as a favorite
 
10111011
posted by leotrotsky at 5:31 AM on June 1, 2017


Many Bens died to bring us this information.
posted by leotrotsky at 5:33 AM on June 1, 2017


I hope I don't hijack this thread but I've always been meaning to find some good literature/instructions on how exactly computers work. (Or also, unrelated, large networks like the Internet.) I have a rough understanding about these things (I can tell apart different components in a PC and know what they are needed for) but on a detailed level beyond the nuts and bolts of it? No idea.

As such I have come across this Youtube series before, but episode 2 was already a bit too byzantine for me. :( (Maybe I was just watching them out of order.)
posted by bigendian at 5:51 AM on June 1, 2017 [1 favorite]


I hope I don't hijack this thread but I've always been meaning to find some good literature/instructions on how exactly computers work.

The most helpful class I took for understanding computers was a course on logical systems.* If you understand logic gates (easy), and how logic gates can be grouped to produce mathematical operations (also pretty easy), you've got a pretty good grounding in the theory behind how computers work. Maybe then read a little about the Von Neumann architecture.

Electronics, of course, is a whole other kettle of fish, but there's plenty of books on that, and electronics is just the medium. You could learn a bit about Turing machines and computability, but I'm not sure how helpful that'd be.

*I wasn't a CS or Engineering major, so I didn't have an extensive course load.
posted by leotrotsky at 6:10 AM on June 1, 2017 [5 favorites]


From the comments on Ben's site: someone went ahead and drew up schematics.
posted by Aya Hirano on the Astral Plane at 6:11 AM on June 1, 2017 [1 favorite]


I'd love if there were a course walking through a series of increasingly complicated computing devices (starting with a simple logic gate) that I could build with my daughter, so that she could also appreciate the underlying structure of computers.
posted by leotrotsky at 6:16 AM on June 1, 2017 [2 favorites]


You know those electronic sets people give kids, where it's a general purpose circuit board, and you have to manually insert the wires under the springs to hook up different circuits and make the light turn on or make a simple radio? That's what a computer is, only it has a CPU that takes instructions about what circuits to make, in the form of code that runs on the machine. The code is the instructions, the CPU is what makes the connections that form the different circuits. That's it. That's all a computer really is.
posted by saulgoodman at 6:17 AM on June 1, 2017


It's out of print, but Digital Computer Electronics by Albert Malvino is a good explanation of going from logic gates to computers. The basic architecture of Ben's computer is very similar to what is shown in the book.

From NAND to Tetris is also a good introduction on how to go from simple logic gates to a (basic) microprocessor.

Plenty more computers built from scratch can be found in the Homebrew Computer Webring if you want to see how deep the rabbit hole goes.
posted by Pong74LS at 6:21 AM on June 1, 2017 [8 favorites]


Yah, in college (in the Physics department, of all places) the course sequence was analog (1st semester) and digital (2nd semester) electronics. It was a lab class with a 1-hour lab-lecture.

For the first semester, you worked with 'scopes, function generators, resistors, capacitors, diodes, etc. to learn the basics, then got into higher-level things (building logic gates from transistors). The lecture covered NPN/PNP transistors and all the math behind how they work (j-omega-C is about all I remember).

The second semester was entirely "build an 8-bit machine w/ a Motorola 68k CPU, with some RAM, a timer chip, a hex keypad and 4-digit hex display." Troubleshooting a bad wire connection when you had 400+ connections was a pain.

I kinda wish it was required for CS or engineer types, in the sense that it gets you to really know what's happening in a machine and how the parts all inter-operate. Especially the magic smoke...
posted by k5.user at 6:36 AM on June 1, 2017 [4 favorites]


Electronics, of course, is a whole other kettle of fish, but there's plenty of books on that, and electronics is just the medium.

To a rather large extent, you can treat digital electronics as being composed only of switches and wires. The switches are actually transistors, but they really operate only in saturation and cutoff modes in a digital circuit, so you can abstract away their transistor-ness.

The transistor really is an amazing device.
posted by Slothrup at 6:40 AM on June 1, 2017 [1 favorite]


For a decent intro into bridging the gap between basic electronics and building a working computer, Arduino is a great place to start. The Sparkfun Inventor's Kit comes with everything you need to get started, plus a book with about 16 projects in it.

You're still wiring things up on a breadboard, so it reinforces the relationship of the components, but you're also writing code that changes how the Arduino behaves for each project, so in a sense, you are making a whole new spring-terminal electronics kit for each project by changing the code.

I've been Doing Science with my friend's kid for a few years now; we started with one of those Elenco Snap Circuits kits, then a 300-in-1 spring terminal kit, then went on to discrete components on breadboards. When we started messing with Arduino, he really took off. Just this weekend we finished soldering up and 3D printing a wearable enclosure for a multi-function smart watch (with apps!) that we designed, and wrote the firmware and software for. He's already gone on to making an Arduino-based device with Wi-Fi to control the lights in his fish tank, and interfacing it with Alexa so he can turn the lights on and off by voice command. Not bad for a freshman!

Arduino is a great platform, and your project can range from making an LED flash at different speeds, to a "Simon Says" pushbutton game, to something much more complex.
posted by xedrik at 6:52 AM on June 1, 2017 [2 favorites]


From NAND to Tetris is also a good introduction on how to go from simple logic gates to a (basic) microprocessor.

That looks really awesome.
posted by leotrotsky at 6:58 AM on June 1, 2017


Tanenbaum's "Structured Computer Organization" provides an extensive description of the different layers that make up a computer, from the digital logic all the way up to the OS. You can easily find PDFs of the book floating around, to give you an impression. Tanenbaum's style is not for everyone.
posted by dmh at 7:14 AM on June 1, 2017 [1 favorite]


Code: The Hidden Language of Computer Hardware and Software is a wonderful introduction to how computers work. It starts at Morse code and walks you through the concepts of switches, circuits, RAM, basically all the way to a desktop.
posted by oulipian at 8:45 AM on June 1, 2017 [6 favorites]


Transistors are the basic element. They're inherently linear devices, but in digital applications they work essentially as switches. All inputs and outputs can be considered low (close to 0 volts) or high (historically 5 volts, though most discrete logic today operates at 3.3 or 1.8 volts; modern high-speed on-chip microprocessor logic often runs at 1 volt or below to reduce power draw and heat, though there are practical limits).

Simple arrangements of multiple transistors can perform the basic logic functions of AND, OR, and XOR. In their simplest form these logic GATES have two inputs and one output defined by those inputs:

AND: Output is logic high only when both inputs are logic high
OR: Output is high when at least one input is high
XOR: Output is high when one and only one input is high

(There are also "inverted" versions that work the same as above but replacing 'logic high' with 'logic low')

Gates can be arranged to do amazing things. Some arrangements can perform simple arithmetic functions like addition and subtraction.
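To make that concrete, here's a hypothetical Python sketch of the simplest arithmetic arrangement of gates, a half adder: an XOR produces the sum bit and an AND produces the carry. (The function names are just for illustration.)

```python
# Hypothetical sketch: a half adder built from two of the gates above.
def AND(a, b):
    return a & b

def XOR(a, b):
    return a ^ b

def half_adder(a, b):
    """Add two 1-bit values; returns (sum_bit, carry_bit)."""
    return XOR(a, b), AND(a, b)

# 1 + 1 = binary 10: sum bit 0, carry bit 1
```

Chaining these (with an OR gate for the carries) gives a full adder, and eight full adders in a row add two 8-bit numbers.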

Another useful arrangement of gates is the FLIP-FLOP. A flip-flop is simply a logic device that maintains its output value even when some of its inputs change. There are several flavors of flip-flop, but the basic computing element is the "D-type" flip-flop. It has a single data input (D), a clock control input (CLK), and a single output (Q). Whenever the CLK input transitions from low to high, the value of the Q output gets set to whatever the D input is at that instant. That Q output doesn't change even if the D input changes, until the next low-to-high CLK transition.

Eight D flip-flops with a common CLK makes an 8-bit register. Every CLK rising edge transfers an 8-bit input value to the 8-bit output, and maintains it until the next CLK.
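A hypothetical Python model of that edge-triggered behavior (the class names are invented; real flip-flops are arrangements of gates, not objects):

```python
# Hypothetical sketch: an edge-triggered D flip-flop and an 8-bit register.
class DFlipFlop:
    def __init__(self):
        self.q = 0           # output holds its value between clock edges
        self._last_clk = 0

    def tick(self, d, clk):
        # Q takes the value of D only on a low-to-high CLK transition
        if clk == 1 and self._last_clk == 0:
            self.q = d
        self._last_clk = clk
        return self.q

class Register8:
    """Eight D flip-flops sharing a common clock."""
    def __init__(self):
        self.bits = [DFlipFlop() for _ in range(8)]

    def tick(self, value, clk):
        for i, ff in enumerate(self.bits):
            ff.tick((value >> i) & 1, clk)
        return sum(ff.q << i for i, ff in enumerate(self.bits))
```

Note that holding CLK high and changing the input does nothing; only the rising edge matters, exactly as described above.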

The next-level element is called a STATE MACHINE. It's just an arrangement of gates and flip-flops that sequentially changes from a known starting state, through a defined sequence of intermediate states, to a final resting state. This is the heart of a computer. Each basic computer instruction is a state machine. An example would be a state machine that captures an 8-bit value with one register, captures a second value with another register, adds the two values together with an arithmetic ADDER logic circuit, then latches the result to an output register.
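Here's a hypothetical Python sketch of that add-two-values state machine; each call to clock() is one clock edge, and the state names are invented for illustration:

```python
# Hypothetical sketch of the add-two-values state machine: from a known
# start state, each clock advances it through a fixed sequence of states.
class AddMachine:
    def __init__(self):
        self.state = "CAPTURE_A"
        self.a = self.b = self.result = 0

    def clock(self, bus):
        """Advance one state per clock; `bus` is the 8-bit input value."""
        if self.state == "CAPTURE_A":
            self.a, self.state = bus, "CAPTURE_B"   # first register latches
        elif self.state == "CAPTURE_B":
            self.b, self.state = bus, "ADD"         # second register latches
        elif self.state == "ADD":
            self.result = (self.a + self.b) & 0xFF  # 8-bit adder, wraps
            self.state = "DONE"                     # output register latched

m = AddMachine()
m.clock(200); m.clock(100); m.clock(0)
# m.result is (200 + 100) & 0xFF == 44
```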

If you create multiple state machine circuits for a collection of useful functions (logic, arithmetic, etc) and decide which one to activate based on another 8-bit logic value (called the INSTRUCTION REGISTER) we call that a central processing unit (CPU).

Early 8-bit computers were just that. Registers and state machines were defined to perform functions like memory addressing, reading, or writing, or setting LED indicators, or reading user input switch values.

They've since become massively more complex, but those were the early basics, and likely what the device in the linked article does.
posted by rocket88 at 8:47 AM on June 1, 2017 [5 favorites]


This is a wonderful post, and all of the comments are great. I have a friend that's been interested in learning more about how computers work; I've got my copy of Code in my backpack right now to hand over later today, and I'll send them a link to the post and this discussion as well. Thanks, Metafilter!
posted by curious nu at 9:11 AM on June 1, 2017 [1 favorite]


Thank you all for the helpful comments and recommendations!!
posted by bigendian at 10:22 AM on June 1, 2017


I've always been meaning to find some good literature/instructions on how exactly computers work.

I see several people have taken stabs at this but I think they've all used phrases that might sound hand-wavey or like "so that's where the magic I don't understand happens." I'm going to try to do it in a more specific way, to see if I can get around this. There are many ways to make a CPU, and this is only one and not at all the best. But you can make it work.

I am going to assume you find the following functions (mostly MSI parts) non-mysterious:
  • Tri-state buffer. A group of gates, usually 8 in implementation, with an enable input. When enabled the outputs mirror the inputs, when enable is off the outputs "float." Multiple tri-state outputs can be connected together as long as only one is enabled at a time.
  • Tri-state latch. Like the buffer, but it also has a latch input; when latch is disabled the outputs follow the inputs, but when it is enabled the outputs remember and hold the last state of the inputs. In a CPU, a latch can be a register or memory element. The latching and output functions are independent, e.g. it can latch while the outputs aren't enabled.
  • Counter. Has an input which causes the output value to increment. May also be tri-state, have a clear input that sets it to zero, or have inputs for each bit and a preset input to load a preset value.
  • Adder. Has two sets of inputs and a set of outputs which, after a bit of delay for the carries to propagate, assumes the sum of the inputs.
  • Decoder. This takes inputs forming a binary number, and has an output for each possible number, thus 3-to-8 or 4-to-16. Wider decoders can be made by cascading them.
  • Memory. This is usually LSI but not so mysterious, as it's just a bunch of latches which are enabled through a really wide decoder. We will use both read-write general purpose RAM, and read-only ROM in our design.
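The tri-state rule in particular is worth internalizing, so here's a hypothetical Python sketch of a shared bus; the function and its contention check are invented for illustration:

```python
# Hypothetical sketch: several tri-state outputs wired to one shared bus.
def bus_value(drivers):
    """drivers: list of (enabled, value) pairs, one per tri-state output.

    At most one driver may be enabled at a time; with none enabled the
    bus "floats" (modeled here as None).
    """
    enabled = [value for enable, value in drivers if enable]
    if len(enabled) > 1:
        raise RuntimeError("bus contention: two tri-state outputs enabled")
    return enabled[0] if enabled else None

# Only one driver enabled, so its value appears on the bus:
# bus_value([(True, 0x80), (False, 0x42), (False, 0xFF)]) -> 0x80
```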
OK, so let's see how we can hook these things up to make a CPU!

We need a latch to be the program counter which points at the instruction we're executing. In some designs the PC is a counter, in others the adder in the ALU is used to progress it to the next instruction. We will also need a hidden counter which has a reset input and is directly triggered to count by the CPU clock. Often this microinstruction counter increments on the rising edge of CLK, and other action is taken on the falling edge once the new count has settled.

In the beginning we need a RESET signal which zeroes out (or presets) the PC and microinstruction counter, and another latch we will call opcode. Then the clock falls, and... we do something!

What we do is use the microinstruction counter to enable the output of a microcode ROM which is being indexed by the microinstruction and opcode registers. There are other ways to do this; most early designs like the 6502 used a bunch of gates efficiently wired to generate all the needed outputs. (You are in a maze of twisty NAND gates, all alike.) But for simplicity we're going to assume ROM is cheap and you are writing microcode to implement CPU functionality. When the clock falls, the microcode ROM output is gated to the select, enable, latch, and count inputs of all the chips making up the CPU. This might cause values to be asserted by some chips, latched or passed on by others.
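As a hypothetical sketch of that arrangement in Python: the microcode ROM is just a table indexed by (opcode, microstep), and its output is the set of control lines to assert on this cycle. The opcode value and control-line names here are invented, not from any real CPU:

```python
# Hypothetical sketch: a microcode "ROM" indexed by (opcode, microstep)
# whose output is the set of control lines to assert this cycle.
# Opcode 0x01 is an invented LOAD-register instruction for illustration.
MICROCODE_ROM = {
    (0x01, 0): {"LATCH_OPCODE"},              # remember the instruction
    (0x01, 1): {"INC_PC"},                    # advance past the opcode
    (0x01, 2): {"MEM_OUT_EN", "ADDR_LATCH"},  # capture the target address
    (0x01, 3): {"ADDR_OUT_EN", "REG_LATCH"},  # latch the value into a register
    (0x01, 4): {"RESET_USTEP"},               # done: restart the sequence
}

def control_lines(opcode, microstep):
    return MICROCODE_ROM.get((opcode, microstep), set())
```

Each clock, the microinstruction counter increments and a new row of the table drives the chips; that's all "executing an instruction" amounts to at this level.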

What is the difference between microcode and just plain code? For one, microcode is really wide; even with tricks to reduce the number of bits it might be 100 bits wide in a typical real computer. For another, it's usually not Turing complete. And with rare exceptions, it's fixed in place at the point of manufacture. (Although some computers, like the HP2100 series, had dynamically changeable RAM microcode, which was really cool.)

Anyway, so what does microcode do for you? Well, in any CPU you typically have a few hidden buses which are groups of inputs and outputs all connected together. So the first word of microcode might enable the latch input of the opcode register to remember the instruction we're currently executing, which is also being presented to the input of the microcode ROM. We'll need to remember it later as we proceed to work on other locations in RAM under its direction.

The next microinstruction might go ahead and increment the PC, or it might go ahead and present the operating instruction to a set of high bits of the microcode ROM. This would constitute a microcode jump to microcode specific to the opcode we're executing. Everything past that depends on the instruction.

So let's say our instruction is to move data from memory to a register. We need a hidden register to receive the memory address; our next few microcode instructions advance the PC, then enable memory to present the value stored there to the data bus while enabling the hidden register to latch that value for the target address. This might take multiple steps if the address bus is wider than the data bus. Then, with the target address built up, a microinstruction disables the PC output and enables the hidden register to output to the outside RAM address bus. This will cause RAM to make the value we're reading available on the outside data bus, but usually not immediately.

On the next microinstruction we enable the target register to latch the value present on the RAM data bus. Voila, we've moved a byte from RAM to a register, so...

On the next instruction we assert bits which reset the hidden opcode register and microinstruction counter. The PC is now presumably pointing at the next instruction (we might have needed some more microcode to make that happen), and we proceed to decode it. Note this is slightly different from the master RESET because it doesn't default the PC register.

Modern computers have an Arithmetic-Logic Unit or ALU which is a general purpose math box. It usually has a function select input, conveniently driven by a few bits of microcode, which selects which function will be performed on the inputs to produce the output. More microcode bits enable different sources to the input and different receivers to latch the output. There are generally at least four hidden buses in a practical CPU to provide for moving instructions, addresses, and data around as they are used.

Now, this is just one way to build a CPU, but it has all the essential elements, and the differences from other designs generally involve reduced complexity or increased efficiency. There are a lot of tricks for reducing the amount of microcode needed; decoders can be used on sets of bits, since generally only one output can be enabled on a given bus at a time. But even this isn't always true; the venerable 6502 uses a "might makes right" approach where in several cases two outputs are enabled at the same time, but the one that needs to win has been made with bigger transistors. There is really nothing about computer design that is completely sacred.

You might find you need the bottom bits of the microinstruction counter sent to a decoder to guide a more complex process of advancing: enabling microcode outputs, enabling latches to read, and disabling everything in a sensible order. A lot of this is guided by things like the propagation and settling times of the components, and a lot of it was harder in primitive computers, where components were both slower and more expensive, so complexity carried a real cost.

But the fundamental essence is that the clock advances a state machine (non-magic phrase now, right? It can just be a counter driving a ROM) which progressively asserts different control signals, depending on the opcode of the current instruction, to move data around between registers, memory, and other functional blocks like an adder or full ALU. It is your selection of opcodes, and how each one cycles those control outputs, that makes your computer run a program written by a regular programmer in those opcodes.
posted by Bringer Tom at 3:34 PM on June 1, 2017 [3 favorites]


The PDP-8 is probably more minimal than any of the common 8-bit architectures, as it was designed in an era when even a single gate took up an entire plug-in card; yet, unlike most of its antecedents, it was a successful mass-produced product, with thousands of systems produced and probably dozens still running. The PDP-8/S, which minimized circuit complexity by using a serial adder, required only 519 gates to implement its CPU, which I would think makes it a feasible, if complex, do-it-yourself project to construct out of 7400-series integrated circuits.
posted by mr vino at 4:09 PM on June 1, 2017 [1 favorite]


The book oulipian mentioned is what I was going to suggest. It starts out very gently and builds upon itself to show how the internals of a computer work.
posted by mmascolino at 8:23 PM on June 1, 2017




This thread has been archived and is closed to new comments