MacPaint and QuickDraw Source Code
July 20, 2010 12:41 AM

MacPaint and QuickDraw Source Code - The Computer History Museum and Apple, Inc. release the source code to one of the first major drawing tools and graphical libraries for personal computers, one that managed to fit inside a paltry 128 KB of memory.

"One of the earliest bits of software that made the original Macintosh computer so interesting to use and unusual for its time was a drawing program called MacPaint.

"Released in 1984 with the Mac, it is fondly remembered not only by those who used it, but also by computer scientists for numerous first-of-a-kind innovations. Those who spend a lot of time using Adobe Photoshop constantly use such features as the lasso tool for selecting non-rectangular shapes, and the paint bucket for filling closed areas with a pattern, and later, color. Both first appeared in MacPaint. The program was unique at the time for its ability to create graphics that could then be used in other applications."


Its memory constraints were so tight that its designers uncovered bugs in the underlying platform:

"At the worst case, there was only about 100 bytes free in MacPaint's heap. Most of the bugs we encountered when running MacPaint turned out to be bugs in the underlying system, which were exposed by running so close to the edge of available memory.

"It's interesting to note that MacPaint was a rather small program by today's standards, but I guess that it had to be to run in the Mac's one eighth of a megabyte of memory. The finished MacPaint consisted of 5,804 lines of Pascal code, augmented by another 2,738 lines of assembly language, which compiled into less than .05 megabytes of executable code."
posted by Blazecock Pileon (64 comments total) 52 users marked this as a favorite
 
… I … was going to go to bed, dammit.
posted by hattifattener at 12:45 AM on July 20, 2010 [4 favorites]


OMG. Yeah, ditto, HattiFattener.

These bring back so many memories... I now need to find how to compile this to run on my machine.
posted by strixus at 12:49 AM on July 20, 2010


Software from the days when things like optimization meant something, rather than just slapping a higher set of system requirements on the box.
posted by mrbill at 12:51 AM on July 20, 2010 [6 favorites]


For the love of Jobs, I have vector graphics files bigger than 128k. Heck, I've probably written comments bigger than that.
posted by loquacious at 12:59 AM on July 20, 2010 [4 favorites]


What heaven Atkinson must have been in ... to be allowed to be a single programmer building something like that!
posted by woodblock100 at 1:07 AM on July 20, 2010


In writing MacPaint, Bill was as concerned with whether human readers would understand the code as he was with what the computer would do with it. He later said about software in general, "It's an art form, like any other art form... I would spend time rewriting whole sections of code to make them more cleanly organized, more clear. I'm a firm believer that the best way to prevent bugs is to make it so that you can read through the code and understand exactly what it's doing… And maybe that was a little bit counter to what I ran into when I first came to Apple... If you want to get it smooth, you've got to rewrite it from scratch at least five times."

When the Lisa team was pushing to finalize their software in 1982, project managers started requiring programmers to submit weekly forms reporting on the number of lines of code they had written. Bill Atkinson thought that was silly. For the week in which he had rewritten QuickDraw’s region calculation routines to be six times faster and 2000 lines shorter, he put "-2000" on the form.

This is great stuff.

Taking a look at the Pascal code now... don't think I've really looked at anything written in that since 1995...
posted by weston at 1:11 AM on July 20, 2010 [5 favorites]


.SEG ' '
.FUNC Monkey
;---------------------------------------------------------------------
;
; FUNCTION Monkey: BOOLEAN;
;
TST MonkeyLives ;IS THE MONKEY ACTIVE ?
SGE 4(SP) ;YES IF >= ZERO
NEG.B 4(SP) ;CONVERT TO PASCAL BOOLEAN
RTS
posted by _Lasar at 1:13 AM on July 20, 2010


Please give me a compiler that makes me put the most important code on bottom. And if you can't make me do that, please make me at least hide the important code pages down past a big block of forward declarations that no one gives a shit about. Thanks.
posted by fleacircus at 1:16 AM on July 20, 2010


Do you top-post, too?
posted by ryanrs at 2:03 AM on July 20, 2010


… I … was going to go to bed, dammit.

Seconded. Or thirded...er, I guess in the spirit of things that would be 11'ed. Or probably 10, since the first comment would be 0. Must sleep, but anything that mentions the Computer History Museum in Mountain View, Calif. makes me feel all tingly. Now there's a museum I would sleep in.

Oh right. Sleep.
posted by foonly at 2:25 AM on July 20, 2010


Software from the days when things like optimization meant something, rather than just slapping a higher set of system requirements on the box.

My first computer had less than half the memory of the Mac. I had the "big" model that had triple the RAM of its little brother. MacPaint was impressive, but even the Mac would have been considered obscenely over-appointed by the 30- and 40-year-old coders of the day.
posted by rodgerd at 2:51 AM on July 20, 2010


Paint programs are totally trivial now. With the CANVAS tag these aren't worth the time it takes me to say they aren't worth anything. Microsoft Paint or Apple Paint; both useless.
posted by twoleftfeet at 3:05 AM on July 20, 2010


Canvas! Agh. My tract.
posted by hal9k at 3:28 AM on July 20, 2010


I've often thought that it's a damn shame that Apple didn't own the Amiga hardware. Combining the ur-hackery of the original Amiga team with this kind of end-user software wizardry would have created an unstoppable force. There's no freaking way the PC would have survived against that kind of aggregated talent.
posted by Malor at 3:53 AM on July 20, 2010


Malor: talent doesn't mean fuck-all in the marketplace; it is not a meritocracy.

On topic, this -- optimized, thoughtful, elegant code -- is why I love programming for low spec devices like the iPhone. It means something to the user to go through that effort of optimization.

Anyway, if you like doing this stuff in general, you want to be a systems programmer, not an app programmer.
posted by seanmpuckett at 4:21 AM on July 20, 2010


Wow, that MacPaint code is impressive. I especially like Undo.
posted by Sukiari at 4:26 AM on July 20, 2010


On the other hand I am having difficulty penetrating QuickDraw, mainly because it's all(?) in 68000 assembly language, which I don't really know. All I know is the relatively demented 80C16 assembly language, and that's enough for me, thanks.
posted by Sukiari at 4:29 AM on July 20, 2010


Very cool. And yea programming was a different skill back then; you had to account for every byte of memory and storage and every cycle of CPU time.
posted by octothorpe at 5:06 AM on July 20, 2010


I remember MacPaint from a class I took years ago in a classroom that had Apples. I LOVED the mirrored paintbrushes. When you turned on all four mirrors you could make impressively elaborate mandala-like pictures, which was really fun for someone with no artistic talent.
posted by marsha56 at 5:07 AM on July 20, 2010 [1 favorite]


This actually reminds me of an idea I read somewhere not too long ago: port MacWrite to the iPad. Of course, porting MacPaint would be cool, too!
posted by grubi at 5:22 AM on July 20, 2010 [2 favorites]


Do you top-post, too?

Note that top posting is: "A follows from B, but I'm gonna put B first, out of context of how it is used in A. Have fun figuring this out!" What's your point?
posted by fleacircus at 5:23 AM on July 20, 2010


Really a shame Pascal didn't become the standard for app development. From writing Delphi and Modula-2 code as well as a lot of C code over the years, I know I was a lot more confident in the quality and reliability of the first two than the last.
posted by Space Coyote at 5:26 AM on July 20, 2010


Really a shame Pascal didn't become the standard for app development.
Brian Kernighan would disagree (as do I).
posted by plinth at 5:35 AM on July 20, 2010


low spec devices like the iPhone

Ha!
posted by smackfu at 5:39 AM on July 20, 2010 [1 favorite]


MacPaint was magic. I actually remember that seeing, for the first time, images on a screen that weren't green text on black made me think "Computer, huh... I need one."

Remember "Zen and the Art of Macintosh?" An entire book on MacPaint art. It was amazing.
posted by cccorlew at 6:09 AM on July 20, 2010 [1 favorite]


On topic, this -- optimized, thoughtful, elegant code -- is why I love programming for low spec devices like the iPhone.

Tog on designing within the limitations of the original Mac vs. the iPad.
posted by Evilspork at 6:10 AM on July 20, 2010


Man, as a long time Mac user and an old 68000 guy this is like finding a Gutenberg Bible. I used to do some amazing things linking assembler to Pascal and C (or so I thought) but I was clearly a rank amateur in comparison.
posted by tommasz at 6:41 AM on July 20, 2010


The MC68000 instruction set is actually pretty nice. If all you know is Intel's crap ISAs, then you could do worse than taking a look at something better. (Also, I like the way the 68010 would barf internal microcode machine state onto the stack when certain exceptions occurred. Oh well, nothing's perfect.)
posted by Crabby Appleton at 6:45 AM on July 20, 2010 [1 favorite]


Neat. Though for nostalgia reasons I'm always going to be more of a fan of 1985's Deluxe Paint... Amazingly, only a year later. I never have found anything else quite so good for working at the pixel level.
posted by Artw at 6:49 AM on July 20, 2010


_Lasar, here's an explanation for that snippet of code:
The Monkey was a small desk accessory that used the journaling hooks to feed random events to the current application, so the Macintosh seemed to be operated by an incredibly fast, somewhat angry monkey […] Bill Atkinson came up with the idea of defining a system flag called "MonkeyLives" (pronounced with a short "i" but often mispronounced with a long one), that indicated when the Monkey was running. The flag allowed MacPaint and other applications to test for the presence of the Monkey and disable the quit command while it was running, as well as other areas they wanted the Monkey to avoid. This allowed the Monkey to run all night, or even longer, driving the application through every possible situation.
- folklore.org

So, essentially, "is the fuzz tester running? Ignore it trying to quit me."
posted by egypturnash at 7:17 AM on July 20, 2010 [2 favorites]


For the love of Jobs, I have vector graphics files bigger than 128k. Heck, I've probably written comments bigger than that.
posted by loquacious at 12:59 AM on July 20


Eponysterical. (Favorited!)
posted by DigDoug at 7:20 AM on July 20, 2010 [1 favorite]


The NuPrometheus League strikes again!
posted by ecurtz at 7:26 AM on July 20, 2010


The MC68000 instruction set is actually pretty nice.

It really is, but the variable instruction length was a pain to deal with when you were patching code in memory. Replacing a MOVE instruction always required a bunch of NOPs. Guess the number wrong and kablooie!
posted by tommasz at 7:39 AM on July 20, 2010


It's fascinating to me they used Pascal. Did it compile to native 68000 code or did it run in a p-code interpreter (what the kids these days call a "virtual machine")? Were other early Mac applications in Pascal?

My memory is QuickDraw was pretty great. Is it the first common graphics rendering library?
posted by Nelson at 8:07 AM on July 20, 2010


Pascal on the Macintosh was compiled to 68000 - no P-CODE. In actuality, Pascal was compiled on a Lisa then run on a Macintosh for quite some time until the Mac could host the tools.
posted by plinth at 8:16 AM on July 20, 2010


I just cannot cope with how people are using "nanotechnology." The marketdroids are busy nano-izing anything that can conceivably have that juice painted onto it, and precious little money is being spent on what I will now call TiFuR technology. (TiFuR stands for "Tiny Fucking Robots.") Where is the itty-bitty robot arm that has three degrees of freedom, is a micrometer in length, and is capable of surviving in a broad range of pHs?

Is it using "tape," which, at a low enough level, might look like a simple spool of carbons laddered together in alternating double-single bonds, with one left over for some kind of data? And where is the processor for this? That processor will not have a lot of RAM available to it.

That's why these coders are transnational treasures. Someone who made functional games using 128 bytes of RAM, teams who built graphics libraries in a thousand (okay, 1,024) times as much memory, they're insanely valuable, because, once someone gets serious about TiFuR, they'll start looking around for programmers. Programmers who are now turning out these enormous packages to do the simplest things. I'm embarrassed to admit that the compiled GUI client for this Python deal I built was several megabytes. Just having that much space would have been inconceivable to me as a youth, and the sheer amounts of stuff I don't know about in those megabytes is vaguely shameful.

That's what we'll have to return to, if we want itty machines rooting through our garbage, picking up droplets of mercury, or squirming through our arteries, carefully scraping away plaque and tamping down anything that remains. We need people who can do things in a few kilobytes of memory, because that's what we'll have to work with.
posted by adipocere at 8:25 AM on July 20, 2010 [5 favorites]


I've often thought that it's a damn shame that Apple didn't own the Amiga hardware. Combining the ur-hackery of the original Amiga team with this kind of end-user software wizardry would have created an unstoppable force.

Apple is notorious for trying to make one right way to use their products, including the original Macintosh. I don't think they would've gotten along well with the more hacker-friendly Amiga.
posted by OwenMarshall at 8:43 AM on July 20, 2010


Were other early Mac applications in Pascal?
Nelson

Yeah, Pascal development was still common in the late 80s and early 90s when I first learned Mac programming. It was pretty much killed off during the transition from 68k to PowerPC during the mid 90s.

Paint programs are totally trivial now. With the CANVAS tag these aren't worth the time it takes me to say they aren't worth anything.
twoleftfeet

nanos gigantium humeris insidentes ("dwarfs standing on the shoulders of giants")
posted by ecurtz at 8:53 AM on July 20, 2010


adipocere, I sometimes imagine that Forth or something much like it is going to become a highly valuable (and perhaps even popular) skill as we go down the road you've described. The blend of high and low levels of abstraction seems to make it really well suited for doing interesting things with highly constrained resources.

But, then again, assembly beat it out in the time of 16-64kb home computers, so maybe it's just a personal fascination.
posted by weston at 9:20 AM on July 20, 2010 [2 favorites]


So, for what it's worth, one particularly functionality-heavy page of the website I am currently working on has 500k of JavaScript. That's not counting CSS and HTML and images and all that.

Paint programs are totally trivial now. With the CANVAS tag these aren't worth the time it takes me to say they aren't worth anything.

I've never been much of a CANVAS guy, but I can definitely see the appeal of all that messing about at the pixel level in an old-school style.
posted by Artw at 9:26 AM on July 20, 2010


Paint programs are totally trivial now.

I took a computer graphics class in college where we learned from basic principles all the scan-conversion algorithms for all the basic primitives, like lines and circles and such. A lot harder than you would think.
posted by smackfu at 9:30 AM on July 20, 2010


I took a computer graphics class in college where we learned from basic principles all the scan-conversion algorithms for all the basic primitives, like lines and circles and such. A lot harder than you would think.

Agreed. I had no idea drawing a line could be so difficult.

It was a little embarrassing when, after several late nights in the lab, one of our non-computer friends asked what we were working on and we sheepishly admitted, "drawing a line."
posted by howling fantods at 9:45 AM on July 20, 2010


And yea programming was a different skill back then; you had to account for every byte of memory and storage and every cycle of CPU time.

I've recently switched careers and gotten into firmware programming precisely because I missed the opportunity to use those skills. It's still just like the old days down in the embedded microcontroller world: 16 MHz and maybe 2K of RAM, plus 16K of Flash, would be a common system configuration. But it's awesome fun because you own the entire machine - if you want an operating system, you have to write it yourself.
posted by Mars Saxman at 9:56 AM on July 20, 2010 [2 favorites]


I had no idea drawing a line could be so difficult.
Drawing a line is actually pretty easy. Bresenham's algorithm is straightforward to implement - the example in Wikipedia is only 19 lines of code, but the plot(x,y) routine will drag it into the mud in terms of performance.

On 1-bit frame buffers, there are some optimizations for near-horizontal lines that make it run like a bat out of hell. These are the parts that require a lot of attention to detail, as you want to make special-case routines for drawing horizontal segments and vertical segments, and, as was the case in QuickDraw, you had to deal with a variable-sized pen, a pen pattern and clipping. Then it gets harder. Oh so much harder.
posted by plinth at 11:10 AM on July 20, 2010 [1 favorite]
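
For anyone who wants to follow along, the Wikipedia version plinth mentions boils down to something like this minimal C sketch; plot() here is just a stand-in per-pixel routine (it prints coordinates so the example actually runs), and it is exactly the call he warns will drag performance into the mud:

#include <stdio.h>
#include <stdlib.h>

/* Stand-in per-pixel routine; a real rasterizer writes into a frame buffer,
   and the fast paths (horizontal runs, pens, patterns, clipping) live here. */
static void plot(int x, int y)
{
    printf("(%d,%d)\n", x, y);
}

/* Integer-only Bresenham line, handling all octants. */
static void draw_line(int x0, int y0, int x1, int y1)
{
    int dx = abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
    int dy = -abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
    int err = dx + dy;                  /* error term: how far we've drifted from the ideal line */

    for (;;) {
        plot(x0, y0);
        if (x0 == x1 && y0 == y1)
            break;
        int e2 = 2 * err;
        if (e2 >= dy) { err += dy; x0 += sx; }   /* step in x */
        if (e2 <= dx) { err += dx; y0 += sy; }   /* step in y */
    }
}

int main(void)
{
    draw_line(0, 0, 7, 3);              /* a shallow line: mostly x-steps, occasional y-steps */
    return 0;
}

The special-case routines plinth describes amount to replacing that plot() call with something that fills a whole horizontal run of pixels at once, which is where all the real engineering went.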


Well that saves me looking up a pew pew laser beams algorithm for the space station based roguelike I always kind of sort of planned on building some day...
posted by Artw at 11:16 AM on July 20, 2010


Remember when you actually tried to code stuff to use only integers? Now we just use doubles for everything because "why not?"
posted by smackfu at 11:25 AM on July 20, 2010


8,542 lines of code. Wow. zlib (a common small compression library written in C) is about 30,000 lines of code.

Crazy.
posted by bottlebrushtree at 11:27 AM on July 20, 2010


because "why not?"

It's fine until you need to port it to an embedded processor that doesn't have a floating point unit and the software floating point library is too slow. Such a situation gave me an opportunity to write a small fixed-point arithmetic package in C, once upon a time.
posted by Crabby Appleton at 11:31 AM on July 20, 2010
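
Crabby Appleton's package isn't shown, but the core of a fixed-point library really is tiny; here's a hypothetical 16.16 sketch in C (the format and the names are mine, not from any particular package), leaning on a 64-bit intermediate for the multiply and divide:

#include <stdint.h>

typedef int32_t fix16;                    /* 16.16 fixed point: 16 integer bits, 16 fraction bits */
#define FIX_ONE 65536                     /* the value 1.0 in 16.16 */

static inline fix16 fix_from_int(int n)       { return (fix16)(n * FIX_ONE); }
static inline int   fix_to_int(fix16 x)       { return x / FIX_ONE; }   /* truncates toward zero */
static inline fix16 fix_add(fix16 a, fix16 b) { return a + b; }         /* plain integer add */

/* Multiply into 64 bits, then scale back down (assumes arithmetic right shift). */
static inline fix16 fix_mul(fix16 a, fix16 b)
{
    return (fix16)(((int64_t)a * b) >> 16);
}

/* Pre-scale the dividend so the quotient keeps its fraction bits (b must be nonzero). */
static inline fix16 fix_div(fix16 a, fix16 b)
{
    return (fix16)(((int64_t)a * FIX_ONE) / b);
}

On a part with no 64-bit arithmetic you'd build the multiply out of partial products instead, which is where a package like the one he describes earns its keep.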


Also, floating point numbers are not real numbers (in the mathematical sense) and can behave in counter-intuitive ways, if you don't know how to avoid the conditions under which they do so.
posted by Crabby Appleton at 11:33 AM on July 20, 2010


Remember when you actually tried to code stuff to use only integers? Now we just use doubles for everything because "why not?"

Ew ew ew ew ew. People who use wanton doubles should be shot.

It was a little embarrassing when, after several late nights in the lab, one of our non-computer friends asked what we were working on and we sheepishly admitted, "drawing a line."

Hey, if you study combinatorics your high level mathematical specialty is basically "counting".
posted by kmz at 11:36 AM on July 20, 2010


Fixed point and table look-ups for sin() and cos(), those were the days. Of course, we were just writing awesome shooting games in QuickBasic to play during lunch, but still.
posted by smackfu at 11:37 AM on July 20, 2010
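
The table trick smackfu mentions is just paying the floating-point cost once, at startup (or offline), and doing integer lookups ever after. A rough C sketch, with 256 steps per circle and 8.8 scaling picked purely for illustration:

#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define ANGLES 256                 /* 256 "binary angles" per full circle */
#define SCALE  256                 /* store sin values as 8.8 fixed point */

static int sin_table[ANGLES];

/* Build the table once; after this there's no floating point at run time. */
void init_sin_table(void)
{
    for (int a = 0; a < ANGLES; a++)
        sin_table[a] = (int)lround(sin(a * 2.0 * M_PI / ANGLES) * SCALE);
}

/* The AND masks the angle back into range, like a cheap mod 360. */
int isin(int angle) { return sin_table[angle & (ANGLES - 1)]; }
int icos(int angle) { return sin_table[(angle + ANGLES / 4) & (ANGLES - 1)]; }   /* cos(x) = sin(x + 90 degrees) */

Power-of-two table sizes make the wraparound a single AND, which is a big part of why "binary degrees" were so popular in old game code.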


I find floating point rounding errors more and more frequently as kids resort to "throwing floating point at it" but nobody told them that the double precision routines don't round automatically -- and in binary, the fraction 1/10 is an infinitely repeating binal, which when you multiply it by 10 returns the binary equivalent of .9999999...
posted by localroger at 2:23 PM on July 20, 2010
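
A two-minute way to see what localroger is describing, for anyone who hasn't hit it yet - note that a single 0.1 * 10 happens to round back to exactly 1.0 in IEEE doubles, so the accumulation version is the clearer demo:

#include <stdio.h>

int main(void)
{
    double sum = 0.0;

    /* 1/10 has no finite binary representation, so each 0.1 is already slightly off,
       and the errors pile up as you add. */
    for (int i = 0; i < 10; i++)
        sum += 0.1;

    printf("%.17g\n", sum);        /* prints 0.99999999999999989 on a typical IEEE-754 double */
    printf("%d\n", sum == 1.0);    /* prints 0: the sum is not 1.0 */
    return 0;
}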


adipocere, I think that at first, we will be combining little tiny modems and radios with our tiny fucking robots.

OwenMarshall: "Apple is notorious for trying to make one right way to use their products, including the original Macintosh. I don't think they would've gotten along well with the more hacker-friendly Amiga."

What's the one right way to use my Mac? OS X, Windows, Linux, FreeBSD, OpenSolaris?

Apple's never advocated any one right way to use their computers. Phones, yeah, a different story. So jailbreak it.
posted by Sukiari at 2:56 PM on July 20, 2010


Apple's never advocated any one right way to use their computers.
Eh? The original Mac didn't even have arrow keys, to force you to use the mouse. They refused to provide PPC specs to make it as hard as possible to get Linux to run on the New World Macs, so even the strange restriction of "way" to mean "operating system" doesn't really work.

That said, I think it's a strength of the Mac that it was opinionated and had "a right way" -- a right way that Apple hired evangelists like Guy Kawasaki to promote. It rubbed off on the users, so that they demanded their software be "Mac-like", forcing developers to make software that fitted in with that ethos. It meant the creation of some household names: Word, Excel, PowerPoint, Photoshop, Quark, PageMaker, Flash ... these all began life as Mac apps.

There's a clear cultural line between that and what happened on the Amiga, but I actually think it could have worked well; Apple is good at assimilating hardware beards and keeping them away from the users. There'd have been no guru meditations, but there'd still have been bomb boxes.
posted by bonaldi at 3:39 PM on July 20, 2010


Curse _Lasar for beating me to posting the monkey code. But I can always write custom pseudocode. (If only I knew Pascal or 68000 assembly...)

IF NOT MonkeyLives THEN GOTO AntsToScrn

In other words, if there's a dead monkey, cover it in ants.
posted by spamguy at 3:44 PM on July 20, 2010


MacPaint and QuickDraw source? That's like finding the Epic of Gilgamesh tablets...

Something that's bothered me for many years is how to exhibit code in a design museum. Software engineering (let's not have that discussion right now) is at the heart of so much of modern life, but it's largely invisible and incomprehensible to non-programmers.

A bridge, a car, even computer hardware, you can plonk in a museum and with annotations and animations and what have you explain the design principles behind it to an interested layman. You can show what makes good design (and the rest), demonstrate the evolution of the art, and generally make a decent fist of leading people through the theory, process and outcome of making it happen.

Not so with software. Why does elegance matter? What does it even look like? What's the difference between a hacked-together bodge and a beautiful piece of work, when they appear to have the same result and, to quite a fine degree, make the same demands of hardware? You can write an essay on this, but how to show it?

It strikes me that MacPaint and QuickDraw, which are canonical examples of a certain kind of programming at the nexus of many important trends, would be good source material for experiments in answering those questions. Take a large exhibition space and some talented people and give them a large budget for stuff - what would you build to provoke that 'Aha!' moment?

Bonus points for explaining why software is quite so weird: codified thought with no intrinsic existence, no single form, that can affect real things, and what a complete mindfuck a Turing Machine actually is when you get to know it.

As I said, been bothering me for years, this, and I have no good answers. But by Jiminy, I'd like to make it happen (and possibly could, with the right ideas. Would go to bat with the people with money, at the drop of a stack pointer).
posted by Devonian at 4:40 PM on July 20, 2010 [3 favorites]


Isn't that basically the same issue with displaying a book in a museum? It takes an awful lot of effort to make it work and at the end of the day you still say "well, you really need to read it to appreciate it."
posted by smackfu at 5:09 PM on July 20, 2010


Not really, smackfu. Software runs the world, and to most people it's invisible. I think it's worth a go.

(as for really appreciating it, two of my very favourite lines of code, which came from a bit of software designed to run on a Hitachi variant of the Z80, go:

MLT SP
RET

Things to note. The original Z80 didn't have a MLT instruction - no multiply instructions at all, discounting shifts - but on this chip, MLT took a register pair, unsigned-multiplied the high byte by the low byte and stuffed the result back as a 16 bit word into the original register pair. Not that useful, but they figured out how to do it in the microcode so why not. (This wasn't a fast instruction. You could make a cup of tea, even at 10 MHz, while it executed, and of course it's atomic so you'd better not need to do anything with interrupts in a hurry.)

Normally, on a Z80, you only treat the BC, DE, and HL register sets (and their alternates) as usefully paired - although the A register could be paired with the flags register for stack pushes and pops. There were also IX and IY sixteen bit registers, but they couldn't be split into 8-bit pairs and were never treated as such, there was the rather arcane IR interrupt/refresh register, and finally you have the 16-bit only stack pointer itself, SP.

But the Hitachi chip could do its MLT thang to not only BC, DE and HL but also SP, which, when you're about to RET from a subroutine, better darn point to the return address. MLTing the high byte by the low byte results in the most obfuscatory, difficult and downright point(er)less vector jump you ever did see.

And someone did it. Just because they could.

This is not the level of appreciation which I seek. That would be silly.)
posted by Devonian at 5:58 PM on July 20, 2010 [2 favorites]


I think you could display this in a museum by letting a person use the actual program on an old Mac, set up on a pedestal, while at the same time on the wall behind the computer, you projected a large graphic display of the program running in response to what they do, sort of like when you are stepping through code in a visual debugger except prettier. Maybe have a dial on the computer that lets them set the speed of the emulation.
posted by fleacircus at 6:24 PM on July 20, 2010


Alls I can say is: neat.
posted by wierdo at 9:26 PM on July 20, 2010


Now we just use doubles for everything because "why not?"

Ew. Do people actually DO that?
posted by antifuse at 8:02 AM on July 21, 2010


Well, no, I still have never seen a standard for loop that uses doubles. But people definitely use doubles where they should be using fixed-point.
posted by smackfu at 8:08 AM on July 21, 2010


Everyone who uses JavaScript and PHP is just throwing doubles at everything...
posted by weston at 9:10 AM on July 21, 2010


Discontinuous regions are just plain amazing. That you could free-draw a selection and then cut & paste with it was astounding, from a programming standpoint anyway.
posted by wkearney99 at 11:00 AM on July 21, 2010




This thread has been archived and is closed to new comments