"To me, a personal computer should be small, reliable, convenient to use and in expensive."
May 18, 2012 10:35 PM

 
This is great, because my Apple I is getting a little old.
posted by twoleftfeet at 10:50 PM on May 18, 2012 [6 favorites]


The Apple II was released June 5, 1977 and was of course one of the most successful early home computers. These days Wozniak is working with Aaron Sorkin on a movie of some kind.
posted by Artw at 11:24 PM on May 18, 2012


Heh:
The Apple-II cassette interface is simple, fast, and I think most reliable. The data transfer rate averages over 180 bytes per second, and the recording scheme is compatible with the interface used with the Apple-I.
I'm amused, because that's so insanely slow compared to the machine I'm sitting at. But still, 180 characters per second is roughly comparable in speed to an 1800 baud modem, if such a thing had ever existed. That's really impressive for the era, when you consider that, eleven years later, I paid about $200 for a 2400 baud modem. Admittedly, sending data to a tape is much easier than sending it over the phone system, but that's still pretty gorram impressive, especially when you consider that the feature would have added so little cost to the system.
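
(Back of the envelope, that comparison works out like this -- assuming roughly ten bits on the wire per byte, i.e. 8-N-1 style serial framing, which is my assumption rather than anything in the article:)

    # Converting the cassette interface's throughput into modem terms.
    # The 10-bits-per-byte figure assumes 8-N-1 serial framing (8 data
    # bits plus start and stop bits) -- an assumption about modem links,
    # not a description of the tape format itself.
    bytes_per_second = 180
    bits_per_byte_on_wire = 10
    effective_bps = bytes_per_second * bits_per_byte_on_wire
    print(effective_bps)  # 1800 -- hence "roughly an 1800 baud modem"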

What you're seeing, when you read that article, is a guy who's quite justifiably proud of having packed a ton of super-advanced stuff into a single system. The Apple I and II predate me a little (I didn't really start to get into computers until around 1980), so I'm not able to appreciate it as much as I might, but as far as I can see, that was a dizzying array of stuff at a still fairly reasonable price.

Most of the computer people who were reading this were used to thinking about these machines as having big toggle switch panels, huge reel-to-reel tapes, and lots and lots of blinkenlights. The mainframers loved to scorn the microcomputers, but micros in general, and this machine in particular, were seismic shifts in the world. It was the liberation of the masses from the techno-cabal, the high priests that controlled access to The Computers. Suddenly, you could have your own. You didn't have to share it with anyone, and the only rules for using it were imposed by hardware restrictions, not tetchy geeks with a strong dislike of people.

And then the dude went ahead and programmed a decent BASIC interpreter in 5K of code, after designing a motherboard so clever that people still sing its praises, all these years later.

I'm not sure that anyone really deserves billions of dollars, but surely, if anyone does, Woz would be on the short list.
posted by Malor at 12:24 AM on May 19, 2012 [15 favorites]


In the space of two weeks, Star Wars came out, Space Mountain opened at Disneyland, and the Apple II went on sale. Only a few short months later, Steely Dan's Aja dropped. All this while disco and punk were at peak vitality.

Was 1977 just one long stretch of smiley, glass eyed people stumbling around going "fuck yeah!" all the time? Cuz it seems like it had to have been.
posted by Senor Cardgage at 12:26 AM on May 19, 2012 [11 favorites]


You know, over on another board, we were talking about the 16-bit virtual computer in Notch's next game, 0x10c. We were talking about how early Macs had a basic networking stack in 7K, and how much losing 7K out of 64K would hurt. Then it occurred to me just how little RAM was actually in a Mac 128K. So I went and looked to see how many were ever made... I couldn't find anything other than "hundreds of thousands."

If we assume that 500,000 Mac 128Ks were shipped, that means the total RAM in ALL of them, together, was about 64 gigs. You could buy that much RAM for about $400 today, and hold the resulting 16 chips easily in one hand. All the Mac 128Ks ever, in one hand.
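
(A quick sketch of that arithmetic; the 500,000-unit figure is the assumption above, not a confirmed sales number:)

    # Total RAM across every Mac 128K, under the 500,000-unit assumption.
    units = 500_000
    kb_per_unit = 128
    total_kb = units * kb_per_unit   # 64,000,000 KB
    total_gb = total_kb / 1_000_000  # ~64 GB in round decimal gigs
    parts_needed = total_gb / 4      # sixteen 4-gig parts
    print(total_gb, parts_needed)    # 64.0 16.0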

Was 1977 just one long stretch of smiley, glass eyed people stumbling around going "fuck yeah!" all the time? Cuz it seems like it had to have been.

Well, I was pretty young at the time, but as far as I can tell, yes, that was a pretty accurate description of leisure time in the 1970s. But I think it was the drugs doing that more than Space Mountain. :-)
posted by Malor at 12:31 AM on May 19, 2012 [10 favorites]


Was 1977 just one long stretch of smiley, glass eyed people stumbling around going "fuck yeah!" all the time? Cuz it seems like it had to have been.

i was only 8 and i didn't know any better. i guess it might have been some kind of golden age.

can you believe that the manuals that came with the apple II actually had the full motherboard schematics in the back? i used to read those over and over again trying to understand it all.

man. the apple II. those were some seriously good times, going to warez parties and copying disk after disk. had a binder full of 5 1/4" floppies with the ]CATALOG outputs printed, on green and white striped paper no less, carefully cut out and stuffed into the vinyl pages with each disk. i can still smell that plastic-y smell.

at any rate the apple II made me what i am today, a verilog hacker. thanks woz.
posted by joeblough at 12:33 AM on May 19, 2012 [3 favorites]


Back in the day... the wonderful day... I recall needing to upgrade my Apple IIe and decided to take the plunge and purchase a meg of memory. The cost? $100.

I am of the belief that if I pull the memory out of the iMac I'm sitting at, fire up the DeLorean and go back to the "day", I could sell that memory for about half a million.

All that aside, my thought in reading that article is that you had to have pretty big nerd cojones to even consider using a computer back then.
posted by HuronBob at 3:18 AM on May 19, 2012


HuronBob, I don't remember offhand whether it's in Neuromancer or Count Zero, but one of William Gibson's characters has to make the hard choice to pawn the memory from his cyberspace deck. All 4 MB of it, or something like that. Today, that and the videocard attached to it would net you what, five bucks?
posted by kandinski at 3:43 AM on May 19, 2012 [1 favorite]


Another way of putting that memory comparison: according to Wikipedia, between five and six million Apple ][s were sold, over a period of about fifteen years. If you assume 6 million machines, at 64K each, that's a total of 384 gigs of RAM. If we use the cheapo 4-gig memory chips, it would take 96 to duplicate them.

Looks like 4 giggers are about $20 right now, so it would cost you $1,920 to buy as much RAM as was in every Apple ][ ever made, and you could fit the memory in a lunchbox.
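
(The same exercise, spelled out with those round numbers:)

    # Total RAM across every Apple ][ ever sold, per the figures above.
    machines = 6_000_000
    kb_each = 64
    total_gb = machines * kb_each / 1_000_000  # 384 GB, decimal gigs
    modules = total_gb / 4                     # 96 four-gig modules
    cost = modules * 20                        # $1,920 at $20 apiece
    print(total_gb, modules, cost)             # 384.0 96.0 1920.0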

HuronBob: did you really put a meg on a ][e? I'm sure it would be possible, but it would be all bank-switched and weird.
posted by Malor at 3:50 AM on May 19, 2012 [3 favorites]


I could sell that memory for about half a million.

You know, I thought about that in a different context, that of what my younger self would have done to get ahold of a Raspberry Pi board. But then I realized that, even if I brought the unit back to, say, 1985, I wouldn't be able to interface it with anything back then. It would just be a hunk of plastic with some glowy lights, because I'd have no way to talk to it. No USB, no Ethernet, no DVI/HDMI.

Likewise, if you took modern RAM back with you, it would be worth untold millions if it could be used, maybe billions, but nobody could make electronics fast enough to talk to it.

If you brought back the whole computer, you'd likely disappear into some deep, dark hole once the government figured out what you had.
posted by Malor at 3:56 AM on May 19, 2012 [1 favorite]


Followup: I went and looked it up, and yes, it was quite possible to put a meg on an Apple IIe -- in fact, apparently Apple themselves had a dedicated card for a later version of a IIc. One use mentioned was for a disk cache in ProDOS.
posted by Malor at 4:23 AM on May 19, 2012


nerd cojones

I foolishly imagined this would be the first use of these two words together, but alas, no.
posted by fairmettle at 4:24 AM on May 19, 2012


No USB, no Ethernet, no DVI/HDMI.

You'd have Ethernet, though you might have to stop by the local university.

The easiest thing to do would be to pry off the cover and connect to the UART (assuming the bootloader sets it up).
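
(For what it's worth, the UART route really is a few lines today; here's a minimal sketch using the pyserial library, where the device path and the 115200 8-N-1 console settings are assumptions about a typical setup:)

    # Poke at a Raspberry Pi's serial console through a USB-serial
    # adapter. Assumes the adapter shows up as /dev/ttyUSB0 and the
    # Pi's console runs at the common default of 115200 baud, 8-N-1.
    import serial

    port = serial.Serial("/dev/ttyUSB0", baudrate=115200, timeout=1)
    port.write(b"\n")  # nudge the login prompt
    print(port.read(256).decode(errors="replace"))
    port.close()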

With a good logic analyzer they'd probably be able to decode the higher-speed signals like USB and HDMI. The first commercial FPGAs were just becoming available in 1985 so they might be able to interface, but I don't know if clock speeds were up to snuff.

Anyway this would make a good sci-fi short story :)
posted by RobotVoodooPower at 4:55 AM on May 19, 2012


It was a great day in our household when my Mom brought home an Apple II. Although my parents have gone through many computers since, they've never been able to part with the Apple II. It's still in their basement with all of the manuals, the cassettes, the floppies, the monitor, and, I'm pretty sure, the hand-drawn maps and notes my brother and I made for each room of Temple of Apshai.
posted by Slack-a-gogo at 5:06 AM on May 19, 2012 [3 favorites]


Back in the day... the wonderful day... I recall needing to upgrade my Apple IIe and decided to take the plunge and purchase a meg of memory. The cost? $100.

I remember my uncle loading up his ][e with a meg of RAM, and it was indeed much more expensive than that. It was a massive and beautiful array of chips. I was supremely jealous.

Was jealous? I still am!
posted by mazola at 6:11 AM on May 19, 2012 [1 favorite]


HuronBob, I don't remember offhand whether it's in Neuromancer or Count Zero, but one of William Gibson's characters has to make the hard choice to pawn the memory from his cyberspace deck. All 4 MB of it, or something like that. Today, that and the videocard attached to it would net you what, five bucks?

Keanu Reeves in Johnny Mnemonic had a memory capacity of 160GB, so I guess by 1995's future things had moved on.
posted by Artw at 6:42 AM on May 19, 2012


If you assume 6 million machines, at 64K each, that's a total of 384 gigs of RAM. If we use the cheapo 4-gig memory chips, it would take 96 to duplicate them.

Nitpick. You are mixing bits and bytes here. You need to multiply your chip count by eight.
posted by JackFlash at 9:09 AM on May 19, 2012


I don't think so. 64K bytes times 6,000,000 machines is 384,000,000K bytes. A single four gigabyte memory chip holds 4,194,304K bytes. So we're working in the same units here, K bytes in both cases.

384,000,000 / 4,194,304 is 91.55 chips. There's a slop of a few chips because I mixed up my decimal and binary gigs there somewhere, but as far as I can see, it's off by only about 5 chips, not 700ish. And I erred high; you don't need quite as many chips as I first thought.

Unless I'm being extra-stupid here somewhere, what I said remains true -- you could indeed hold the memory of every Apple II computer in 96 4-gig memory chips. But you don't actually need the last five.
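
(Where the slop comes from, spelled out -- the same total divided by a decimal gig versus a binary gig, in the thread's own mixed units:)

    # Decimal vs binary "gigs" on the module size.
    total_kb = 6_000_000 * 64            # 384,000,000 K of Apple ][ RAM
    decimal_module_kb = 4 * 1_000_000    # "4 GB" if a gig is 10^9 bytes
    binary_module_kb = 4 * 1024 * 1024   # 4 GiB = 4,194,304 KB
    print(total_kb / decimal_module_kb)  # 96.0
    print(total_kb / binary_module_kb)   # ~91.55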
posted by Malor at 9:28 AM on May 19, 2012


I should clarify and say MAIN memory, I wasn't aware that Apples had such large expansion boards. (again, Apples predate me a little -- I used them in school, but never owned one, and never knew them very well.) Each 4 gig chip you buy now is only 4,000 times bigger than the one-meg expansions, so duplicating all those could get fairly expensive. Dunno how many there were.

Still, buying the equivalent of four thousand Apple II RAM expansions for $20 is nothing to sneeze at.
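
(The ratio behind that "4,000 times" figure:)

    # One modern 4 GB module versus a one-meg Apple II expansion card.
    module_mb = 4 * 1024        # 4 GB expressed in (binary) megabytes
    card_mb = 1
    print(module_mb / card_mb)  # 4096.0 -- "only 4,000 times bigger"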
posted by Malor at 9:39 AM on May 19, 2012


A 4Gb memory chip contains 4G bits. You need eight of them for 4 GBytes. The largest DRAM chips in production today are 8 Gbits, which is 1 GByte. Memory chip sizes are almost always given in bits, not bytes. This is because they can have various data widths -- 1 bit, 4 bits, 8 bits, 16 bits or 32 bits. In each case the common factor is the total number of bits contained, not the byte width.

Perhaps you are confusing this with memory modules, those DIMMs that you plug into your computer motherboard. These have multiple memory chips on them and their capacity is given in GBytes.
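
(The conversion, for anyone keeping score:)

    # Chip capacities are quoted in bits; module capacities in bytes.
    chip_gbits = 4                # a "4Gb" DRAM chip holds 4 gigabits
    chip_gbytes = chip_gbits / 8  # = 0.5 GByte per chip
    chips_per_4gb_module = 4 / chip_gbytes
    print(chip_gbytes, chips_per_4gb_module)  # 0.5 8.0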
posted by JackFlash at 9:47 AM on May 19, 2012 [1 favorite]


> Perhaps you are confusing this with memory modules, those DIMMs that you plug into your computer motherboard. These have multiple memory chips on them and their capacity is given in GBytes.

He was never talking about individual RAM chips, but about the easily purchased desktop RAM that you can get from places like here. So it was a given he meant gigabytes, not bits, since no one actually buys RAM (or any storage, really) by the bit anymore.
posted by mrzarquon at 10:13 AM on May 19, 2012


I'm guessing that he probably meant to say memory module or maybe even memory stick, but he said "a single four gigabyte memory chip holds 4,194,304K bytes," which is incorrect. Like I originally said, it is a nitpick, but it is best to correct imprecise language to prevent confusion when talking about technology.
posted by JackFlash at 10:52 AM on May 19, 2012


Wozniak's relationship with Jobs seems like a metaphor for the way Jobs became successful.

Jobs bamboozled people into working with him and executing his vision. He got financing this way, he got technical people this way, and he got marketing this way.

Jobs was a genius, but not in the respectable and noble way that most people think. He was a genius at manipulating others.
posted by quanti at 3:51 PM on May 19, 2012 [2 favorites]


A 4Gb memory chip contains 4G bits. You need eight of them for 4 GBytes.

Oh, I see, I got a little sloppy on terminology. I'm talking about DIMMs, not individual RAM chips. 91.55 four-gigabyte DIMMs, which would easily fit in a lunchbox.

Can consumers even use individual RAM chips anymore? The ones that come on DIMMs pretty much require a wave solder machine to attach to usable traces.
posted by Malor at 4:59 PM on May 19, 2012


Sorry about the derail. I just jumped in to clarify a common misunderstanding and it kind of led down a rabbit hole.
posted by JackFlash at 5:46 PM on May 19, 2012


While we are being pedantic, I'm pretty sure that wave solder machines are useless for the SMT chips in modern memory modules. Wave soldering is strictly a through-hole technology.

Well, at risk of being pedantic once again, wave soldering is quite commonly used for surface mount components, although less so than oven reflow. The components are attached to the board with adhesive before running through the solder wave.

See, for example, here and here (PDFs).
posted by JackFlash at 6:43 PM on May 19, 2012



Was 1977 just one long stretch of smiley, glass eyed people stumbling around going "fuck yeah!" all the time?


That was also the year I moved to Chicago and also hit puberty, so...
posted by Halloween Jack at 1:43 PM on May 20, 2012 [1 favorite]


Jobs was a genius, but not in the respectable and noble way that most people think.

I'm not sure what "most people" think about Jobs--aside from being a now-deceased part of the Apple iconography in general--but Jeffrey Young's book about Steve Jobs, The Journey Is the Reward, was published in 1988, which makes it older than a lot of Apple's customer base, and it's about as unflattering a portrait of him as I've seen.

And yet, for all of that (and it really is worth looking the book up, even though it ends just as he's starting up NeXT), for all of his manipulation and reality distortion fields and temper tantrums and egotism, I think it's quite likely that if Jobs hadn't teamed up with Woz, there wouldn't be an FPP about this computer--or if there were, it would be in the context of a long-lost prototype of what might have made for a bitchin' personal computer if its inventor had found someone who could have sold it to the masses. Jobs might have gone on to be a marketer/promoter for Commodore and made the Amiga a hit; Woz has always been a gentle goofball type who might have trouble giving away ice water during a heat wave. For another perspective on the early days of computers (including Jobs and Woz), read Steven Levy's Hackers; there are a lot of very smart people with very good ideas in it, but they're not necessarily the ones who became rich and famous, and Woz fared far better than most.
posted by Halloween Jack at 9:37 PM on May 20, 2012 [1 favorite]



