100 x 75 resolution!
July 14, 2019 5:15 AM

Did you ever think to yourself, "Hey, I probably could make a video card if I really tried?" No? Well, maybe you could! Here's Part 2.
posted by JHarris (17 comments total) 23 users marked this as a favorite
 
Wow, this is so far beyond the "everyone needs a hobby" joke, I can't even make that joke. Amazing project.
posted by sammyo at 6:30 AM on July 14, 2019


I caught part one of this last week and learned so much. The whole channel is great... really breaking down the concepts into comprehensible parts.
posted by adamt at 6:34 AM on July 14, 2019


I was only mildly impressed when I saw the first video, but when the second part popped up I had to drop the cynicism. Not only did he end up clearly explaining and demonstrating the basics of how VGA works, but also how seemingly minor aspects of the chips you choose can make or break a project.
posted by wierdo at 6:35 AM on July 14, 2019 [1 favorite]


I got flashbacks to creating modelines for XF86Config files and had to close the video. Maybe I'll watch more later when the muscles in my neck relax a bit.
posted by clawsoon at 7:29 AM on July 14, 2019 [10 favorites]


The nice thing about being young when you learn about things like X modelines for the first time is that it all seems normal. That background helped a great deal in understanding what he was talking about regarding the pixel clock, refresh rates, and blanking intervals. Conversely, the videos made what had been mostly abstract concepts far more concrete in my head.
posted by wierdo at 8:10 AM on July 14, 2019 [4 favorites]
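
A modeline packs exactly those quantities into one line: the pixel clock, then the horizontal visible/sync/total boundaries, then the vertical ones, and the refresh rate falls out by division. A rough Python sketch using the standard VESA 640x480 numbers (the variable names are made up):

# The classic XFree86 modeline for 640x480 at 60 Hz reads:
#   Modeline "640x480" 25.175 640 656 752 800 480 490 492 525 -hsync -vsync
PIXEL_CLOCK_HZ = 25_175_000  # 25.175 MHz dot clock
H_TOTAL = 800  # 640 visible + front porch + hsync pulse + back porch
V_TOTAL = 525  # 480 visible + front porch + vsync pulse + back porch

line_rate_hz = PIXEL_CLOCK_HZ / H_TOTAL            # ~31.47 kHz per scanline
refresh_hz = PIXEL_CLOCK_HZ / (H_TOTAL * V_TOTAL)  # ~59.94 Hz per frame
print(f"{line_rate_hz / 1000:.2f} kHz lines, {refresh_hz:.2f} Hz refresh")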


A few of my favourites from his channel:

Lighting an LED

...without burning it out

Building a 4-bit adder out of logic gates

And of course the whole series of videos where he builds an 8-bit breadboard computer. The part where he explains instruction fetch and decode is where I finally wrapped my head around what "microcode" is.
posted by swr at 8:36 AM on July 14, 2019 [6 favorites]
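
For anyone who'd rather poke at the adder idea in software before breadboarding it, the gate-level construction translates nearly line for line. A rough Python sketch building everything up from a single NAND primitive (helper names are made up; the hardware version uses actual gates, but the logic is the same):

def nand(a, b):
    return 1 - (a & b)

# every other gate can be derived from NAND
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))

def xor(a, b):
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

def full_adder(a, b, carry_in):
    partial = xor(a, b)
    total = xor(partial, carry_in)
    carry_out = or_(and_(a, b), and_(partial, carry_in))
    return total, carry_out

def add4(a, b):
    # ripple-carry add of two 4-bit numbers, one full adder per bit
    result, carry = 0, 0
    for i in range(4):
        bit, carry = full_adder((a >> i) & 1, (b >> i) & 1, carry)
        result |= bit << i
    return result, carry  # carry is the overflow bit

print(add4(9, 5))  # -> (14, 0)
print(add4(9, 9))  # -> (2, 1), i.e. 18 overflows 4 bits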


I loved how he explained the NAND gate and how he got away with eight bits. Do video cards still work this way, or can they send a signal for a particular pixel, now that we all mostly have flat panels?
posted by They sucked his brains out! at 8:59 AM on July 14, 2019


> "Do video cards still work this way?"

HDMI sends all of the pixels for a frame in a serial stream, rather like VGA, though the pixel encoding is much more complicated.

Monitors don't have memory, so sending a signal for a particular pixel would not be useful. (In the stone age of computers, monitors often did have "memory", so addressing individual pixels was very useful.)
posted by monotreme at 10:18 AM on July 14, 2019 [2 favorites]


A wire is an inherently one-dimensional means of data transfer. Any two-dimensional image you send through it has to have some mapping applied to it to fit it through the pipe. The VGA scanline method is the same mechanism that TVs have used since nearly the beginning, long before computers were household items, so it makes sense that a relatively simple system like this one could drive it.
posted by JHarris at 10:30 AM on July 14, 2019 [1 favorite]
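
That mapping is plain raster order: left to right across a scanline, top to bottom down the frame, with blanking gaps in between. A minimal sketch of the 2D-to-1D flattening (hypothetical names; sync polarity and porch timings ignored):

def scanout(framebuffer, width, height):
    # yield pixels in the 1D order a VGA-style signal sends them
    for y in range(height):
        for x in range(width):
            yield framebuffer[y * width + x]  # pixel (x, y)
        # a real signal inserts hsync plus horizontal blanking here
    # ...and vsync plus vertical blanking here, before the next frame

# 100x75, matching the resolution in the post title
frame = bytearray(100 * 75)
assert len(list(scanout(frame, 100, 75))) == 7500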


This was really good. What's funny is that it was very easy for me to understand, except for the part about using software. I understand what he is doing there, but there is something about the arbitrariness of high-level language vocabulary and syntax that brings my brain to a screeching halt. I think it's that you just have to have it memorized, whereas the other, more physical(?) stuff is like a narrative where you can construct any part of it from earlier, less complex parts. It has always been a stumbling block in my halfhearted attempts at learning to program. To this day I have not been able to memorize the multiplication tables.
posted by Pembquist at 10:53 AM on July 14, 2019 [1 favorite]


This was surprisingly enjoyable, and I pretty much failed out of EE when I was in college. I like the bit at the end where he’s swapping pre-programmed EPROMs in and out of the breadboard to show a succession of different images. It’s interesting to see complex things like video cards broken down into steps so that the “like this, but faster” jump is easier to make.
posted by migurski at 11:57 AM on July 14, 2019 [2 favorites]


This is a great, smooth production of something that (IME) is a really herky-jerky process the first time through. (Or the second, or the tenth!) I'm worried the result might be less useful to a newbie for that reason. If you copy exactly what he does, and you do it exactly correctly, you too may be able to build a breadboard VGA card.

But what if it doesn't work when you turn it on? How do you figure out what's wrong? That's what a newbie needs to see -- the blooper reel from this presentation, so to speak.

I guess an alternative would be to have a hackathon, where you start with a pile of parts, and the goal is for everybody to be able to put an image up on a VGA monitor. I led a hackathon for a Chiptune Music Box where ~10 people ended up with noisemakers at the end of the day. That's probably more efficient than trying to make a video where you show all the things that could possibly go wrong, how to figure out which one(s) of them did go wrong, and how to fix it.
posted by spacewrench at 2:14 PM on July 14, 2019 [2 favorites]


Fun summer project that overlaps so well with this: https://www.nand2tetris.org/.
posted by tayknight at 6:35 PM on July 14, 2019 [3 favorites]


> "The VGA scanline method is the same mechanism that TVs used since nearly the beginning …"
Not 'nearly'; since the very beginning. The basic mechanism was essentially the same & remained unchanged from the days of Nipkow's disk, Bidwell's phototelegraph, & Baird's TV.
posted by Pinback at 11:25 PM on July 14, 2019 [2 favorites]


"...So if you manage to break your flat panel monitor, I don't want to hear about it, and don't blame me. ...Actually, I do want to hear about it, but don't blame me."
posted by Rock Steady at 12:13 PM on July 15, 2019 [2 favorites]


In fall of 1997, a friend and I were taking "Introductory Digital Systems Laboratory" in college and this was our final project. We used a slightly different approach (I think we used programmable array logic to turn the counter output into the timings, rather than NOTs and NANDs and latches), and produced 320x240x256 colors on an actual CRT. We'd originally intended to use a super-fast DAC to get the color gradations, but discovered that a simple resistor ladder (like the one described in the video, but with 8-bit input, not 2) settled much faster than the DAC.

Phase 2 was rasterizing triangles, like a proper 3D card of the day. I think we got up to single-digit triangles per second. Not quite enough to play Quake...

My lab partner went on to become an LED artist. R.I.P.
posted by aneel at 9:12 PM on July 15, 2019 [2 favorites]
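
The resistor ladder trick is just a weighted sum: each bit's resistor contributes half as much as the next more significant bit, so the output voltage is proportional to the binary code. An idealized Python model (names are made up; 0.7 V is the nominal VGA full-scale level, and real parts add the settling-time behavior described above):

def ladder_out(code, bits=8, v_high=0.7):
    # ideal binary-weighted resistor DAC: bit i contributes
    # v_high * 2**i / 2**bits, giving 2**bits evenly spaced levels
    assert 0 <= code < 2 ** bits
    weighted = sum(((code >> i) & 1) * 2 ** i for i in range(bits))
    return v_high * weighted / 2 ** bits

# a 2-bit ladder like the video's gives four levels per color channel:
print([round(ladder_out(c, bits=2), 3) for c in range(4)])
# -> [0.0, 0.175, 0.35, 0.525]

# an 8-bit ladder like the class project's gives 256:
print(round(ladder_out(255), 4))  # -> 0.6973, just shy of full scale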


> "Not 'nearly'; since the very beginning. The basic mechanism was essentially the same & remained unchanged from the days of Nipkow's disk, Bidwell's phototelegraph, & Baird's TV."

I was thinking along the lines of left-to-right, top-to-bottom scanning, since I seem to remember hearing about methods that scanned differently. I could, of course, be wrong. It is obvious, as I mentioned above, that *some* mapping was required to convert the one-dimensional video signal into a two-dimensional field, but beyond that I can't really speak much on the history of CRTs, so I will bow to superior knowledge, especially if that superior knowledge chooses to expound on those systems you mention, as they sound fascinating.
posted by JHarris at 6:36 AM on July 18, 2019




This thread has been archived and is closed to new comments