
Pixar, 1972
September 2, 2011 1:43 PM   Subscribe

40 Year Old 3D Computer Graphics, created by Edwin Catmull and Fred Parke (with some help from Bob Ingebretsen) in... wait for it... 1972!
posted by cthuljew (53 comments total) 36 users marked this as a favorite

 
turn up your surface subdivisions bro
posted by nathancaswell at 1:47 PM on September 2, 2011 [1 favorite]


I wonder how many frames per hour the system they had could render.

Also, this is awesome.
posted by GuyZero at 1:50 PM on September 2, 2011 [1 favorite]


turn up your surface subdivisions bro

If you look at the hand they have 4- and 5-sided polygons in there.

YOU CRAZY GUYS - ALL YOU NEED ARE TRIANGLES!
posted by GuyZero at 1:50 PM on September 2, 2011 [7 favorites]


Wait, was this prerendered or is it real time?
posted by memebake at 1:51 PM on September 2, 2011


turn up your surface subdivisions bro

It took six years, but Catmull got around to it eventually.
posted by jedicus at 1:52 PM on September 2, 2011 [4 favorites]


And 28 years later, they apparently applied the exact same level of technology to the polygon rendering in the original Deus Ex.
posted by cortex at 1:55 PM on September 2, 2011 [8 favorites]


Vol Libre, from 1980, the first film to use fractals to generate graphics.
posted by Ian A.T. at 1:56 PM on September 2, 2011 [3 favorites]


Hamburger Helper, anyone?
posted by Sys Rq at 2:04 PM on September 2, 2011 [1 favorite]


What sort of machine would they have used to get these onto film?

I'm imagining something out of Terry Gilliam's Brazil, with multiple film reels and DC motors and halogen bulbs and a bunch of Fresnel lenses one on top of the other.

Now that I think about it, they probably could have just pointed the camera at the display in a dark room, but that seems pretty pedestrian and disappointing.
posted by vogon_poet at 2:06 PM on September 2, 2011


University of Utah graphics department, home of the teapot.
posted by rh at 2:08 PM on September 2, 2011 [3 favorites]


Never met Ed Catmull, but I was close with a family member of his. He's a class act. He took his sister to the Oscars when he was being honoured.
posted by blue_beetle at 2:09 PM on September 2, 2011


I'm surprised the quality of this is as good as it is. I was expecting just wireframes of a static model, but there's some relatively fancy shading and geometry transformation stuff going on there.

Wait, was this prerendered or is it real time?

That's a good question. The article mentions rendering directly to film, which I imagine is done a frame at a time, and it takes as long as it needs for each frame. Then you play them back at the normal number of frames per second. In which case I'd call that essentially "pre-rendered" but to analog storage rather than digital.

It would be interesting to know what sort of equipment was used for this, though.
posted by FishBike at 2:10 PM on September 2, 2011


If you really like that old stuff, see Portfolio of Shaded Computer Images or THE SPACE CURVE AS A FOLDED EDGE

And from a few years later, the CGL clips here.

Good times then. mermayd (Mrs Hexatron) had her hand kissed in the continental manner (that is, preceded by heels clicked and a bow) by Bezier. Yes, that Bezier.
posted by hexatron at 2:12 PM on September 2, 2011 [2 favorites]


I'm really surprised at how good it looks. Cool find.
posted by curious nu at 2:35 PM on September 2, 2011


And here's some technical stuff.
The images were rendered a frame at a time on a crt with a 16-mm camera pointed at it.
The department had two PDP10's. One was timeshared to terminals (many with some of the first mice). The other was single-user, and one signed up for an hour or so of time on it.

The single-user PDP10 made the pictures and movies. They were rendered onto a CRT and photographed with a 4x5 view camera (usually using Polaroid instant film) or with a 16mm film camera for animation (the computer could advance the film between frames).

The computer could not hold an entire image in memory--I think there was only about 256k bytes in the whole computer. Images were computed a line (or a scan-line) at a time, and displayed on the CRT. Dedicated display memory was a dream--a few years later, the first display cards (then called frame buffers) were made, and sold for about $100,000 for a 512x512x8-bit memory and a controller to send the image to a screen.

Mike Milochek was the photography god who made all the grad students' work look good. This was when photography was smelly chemicals and exacting craft.

The line-drawing stuff rendered quickly (under a minute per frame as I remember) but surfaces were an hour or two per frame (where they still are). Ed's first simulated Z-buffer image (a whole bunch of chess pawns--can't find it now) took several hours to compute.

I see Catmull being credited with inventing the Z-buffer; it was one item on an old list of 'wouldn't it be nice if computers were powerful enough to do this' ideas. Ray tracing was on that list too. Ed was the first to implement a Z-buffer.

Fred Parke made some 3D movies then too. (I picked up a 3D Bolex kit in New York & brought it back to Utah. It had a prism 'taking' lens that put 2 images on a single frame (not needed for generated images), a polarized two-lens projection lens that replaced the projector's normal lens, and a few pairs of polarized glasses to look at the results.)
posted by hexatron at 2:39 PM on September 2, 2011 [41 favorites]
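
The Z-buffer idea hexatron describes is simple enough to sketch: for each pixel, keep the depth of the nearest surface drawn so far, and only overwrite the pixel when a closer fragment arrives. Here's a minimal, hypothetical Python illustration; the tiny "screen," the fragment list, and all names are invented for the example, not taken from anything in the thread:

```python
# Minimal Z-buffer sketch: per pixel, remember the smallest depth (z)
# seen so far; a fragment only lands if it is closer than what's there.
W, H = 4, 3  # a tiny "screen" for illustration

def render(fragments, w=W, h=H, background=0):
    """fragments: iterable of (x, y, z, color), in any order."""
    zbuf = [[float("inf")] * w for _ in range(h)]
    image = [[background] * w for _ in range(h)]
    for x, y, z, color in fragments:
        if z < zbuf[y][x]:       # closer than anything drawn here yet
            zbuf[y][x] = z
            image[y][x] = color
    return image

# Two overlapping "polygons": color 1 at depth 1 occludes color 2 at depth 5.
frags = [(1, 1, 5, 2), (1, 1, 1, 1), (2, 1, 5, 2)]
print(render(frags)[1])  # -> [0, 1, 2, 0]: the nearer fragment wins at (1, 1)
```

The appeal, and the historical cost, is that correctness doesn't depend on draw order at all; you just pay for a full screen's worth of depth memory, which is exactly what a 1972 machine with ~256k bytes total didn't have.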


That was awesome. I thought the two little fingers were crooked until I saw the film of the plaster cast of the real hand. The computer model was accurate!

I remember seeing stills of this in the mid 70s when it was new, and thinking that the University of Utah was the nirvana of computer graphics. I wanted to go to there and study. And then when it came time to apply to college, I realized it was in Utah. When I got to college in 1975 this was the best I could do, but at least it was on hardware I built myself and owned. It was low rez, but hey, real bezier curves!
posted by charlie don't surf at 2:40 PM on September 2, 2011


Hexatron: The computer could not hold an entire image in memory--I think there was only about 256k bytes in the whole computer. Images were computed a line (or a scan-line) at a time, and displayed on the CRT.

So if the computer could not hold the whole image in memory, how did the CRT hold it?
posted by memebake at 2:46 PM on September 2, 2011 [1 favorite]


memebake: The CRT didn't hold an image. The film shutter was open for a long time, and the film recorded whatever showed up on the CRT during that time
posted by hexatron at 2:52 PM on September 2, 2011 [6 favorites]


...and I mis-stated about line-drawing display. Most line drawing could display in real time; some could display in real time on the time-shared terminals that had green-on-black line displays (there were several of these). So you could 'drag a vertex with the mouse' etc.
posted by hexatron at 2:59 PM on September 2, 2011


Circa 1990, I bought a Targa card for, I think, $2300 so that I could have a full video framebuffer, display 24 bits per pixel at roughly 640x485 (if I remember correctly). That was kick-ass stuff, the first ray-tracer I wrote didn't have shading, everything was procedurally textured as one of 4 colors, and if the ray hit it that color got displayed (with a CGA card, color video on a 4.77 MHz IBM PC compatible, for which I'd sprung for the extra 8087 floating point coprocessor). Sure, the VGA card could do 320x240 in 256 colors, but I had to run my full-color images through an additional quantization step. It wasn't 'til I got the Targa card that I realized that filtering textures was a good idea...

Circa 1972, I believe that Hexatron's memory is correct. The "line drawing in real-time" would have been a vector display, often an oscilloscope with X/Y inputs. So you had to stay ahead of that beam and redraw frequently enough that persistence of vision and phosphor made it look like you had a real display. (I didn't start coding until 10 years later, when I thought that 1k of memory and a 6 digit LED display was a bit limiting...).

And, yeah, Ed's a stand-up guy.
posted by straw at 3:07 PM on September 2, 2011 [2 favorites]


Most line drawing could display in real time, some could display in real time on the time-shared terminals that had green-on-black line displays (there were several of these) So you could 'drag a vertex with the mouse' etc.

I remember those, I used a Tek 4014 Storage Tube display back in the day. Oh that brings back memories of that flashing screen. In fact, just the memory of it is giving me a headache.
posted by charlie don't surf at 3:10 PM on September 2, 2011


I wonder how many frames per hour the system they had could render.

Your metric is too low. Think more in terms of frames per day or week.

Not that long ago - ok, it's nearly been twenty years - I enrolled in a community college "digital animation" class. My first day in class I get access to some old Macintosh, I think it might have been a then-newfangled IIcx or something. The Video Toaster was the hot piece of hardware and software at the time. Avid's non-linear editors were just starting to be truly useful.

So in the space of a two-hour class I sit down with StrataVision, a then-higher-end 3D animation program. I figure out key frames, how to animate objects, and I go for my own version of a "Utah Teapot" screen test. Three objects, one light, one background. A sphere, a cone and a cube. Granted, those all had high-definition textures and one of the objects was a transparent textured glass, but we're talking dead-simple stuff, really.

My teacher is impressed. It's ambitious for a first-timer. So he lets me render it. Projected rendering time for 15-30 seconds of video? 4+ weeks! So we turn down the resolution a little and turn on the "use other networked computers to render" option, and set it to render over a weekend when no one was in the lab. It still took 4+ days on a dozen computers.

It's really easy to forget how much computers have advanced and evolved in such a short period of time. You could take a small box of smartphones and netbooks from today back in time twenty years and you'd have more processing power per second than many professional rendering farms had at the time. The concept of a 1,000 MHz processor was science fiction, much less one with multiple cores you could carry in your pocket that ran on a battery for hours and hours. Not to mention a touch screen display, a camera or two, a GPS, an accelerometer chip and a multi-band digital radio.
posted by loquacious at 3:11 PM on September 2, 2011 [6 favorites]


In fairness to old computers, a dual-core 1GHz smartphone processor only runs for 5 seconds every day and has the good sense to turn itself off the rest of the time.
posted by GuyZero at 3:26 PM on September 2, 2011 [1 favorite]


In fairness to old computers..

Let's do the math! I remember going over to see John Whitney Jr's Cray X-MP sometime around 1985. He was complaining that the new animation he was doing for artist Matt Mullican was using too many effects and was taking 40 minutes per frame to render. And that was just basic Phong shaders. That Cray X-MP model had a theoretical max of 400 Mflops. It had no GPU, just a frame buffer. Price for a midrange Cray: ~$10 million, not including software and peripherals.

In comparison, the A5 processor in the iPad 2 has been rated at up to 665 Mflops. But the GPU alone (if I read this right) looks like it runs about 4.8 Gflops! It can render in realtime, at speeds over 30fps, scenes that are far more complex than what the Cray took 40 minutes per frame to render.
posted by charlie don't surf at 3:43 PM on September 2, 2011 [4 favorites]
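
The arithmetic behind that comparison is easy to check, using the figures as quoted in the comment (not independently verified), and with the caveat that raw flops say nothing about the GPU's fixed-function pipeline, which is where most of the realtime speedup actually comes from:

```python
# Back-of-the-envelope check of the Cray-vs-iPad comparison above.
# All figures are the ones quoted in the comment, not sourced elsewhere.
cray_flops = 400e6       # Cray X-MP theoretical peak, ~1985
ipad_gpu_flops = 4.8e9   # iPad 2 GPU, as quoted

speedup = ipad_gpu_flops / cray_flops
print(round(speedup))    # -> 12

# A 40-minute-per-frame Cray job, scaled by raw flops alone:
seconds_per_frame = 40 * 60 / speedup
print(round(seconds_per_frame))  # -> 200
```

So on flops alone the iPad GPU only closes a 40-minute frame to ~200 seconds; getting from there to 30fps is the dedicated rasterization hardware doing work the Cray had to do in general-purpose code.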


But come on... the Cray X-MP looks perfect as a wonderful seating unit in any modernist home.
posted by GuyZero at 3:49 PM on September 2, 2011 [2 favorites]


GuyZero, people are pitching the iPad as a design element too... See, for example, this comment from today's Batmobile thread...
posted by straw at 3:55 PM on September 2, 2011


Hexatron wrote:(Mrs Hexatron) had her hand kissed in the continental manner (that is, preceded by heels clicked and a bow) by Bezier. Yes, that Bezier.

Yes indeed! He was a charming gentleman, very continental. He showed us a huge plotter of some kind. Utah was so beautiful, but full of Mormons so we had to move back east because we had little kids.
posted by mermayd at 4:06 PM on September 2, 2011 [2 favorites]


LOL GuyZero, the Cray looks a lot crappier when it's sitting in an unfinished room with exposed fiberglass insulation and aluminum foil over all the windows. And sit on it? If you spend $10M, you don't let anyone near it.
posted by charlie don't surf at 4:29 PM on September 2, 2011 [2 favorites]


Hey, I remember this! It crushed the field and took 1st place in the democompo at Assembly '73.
posted by suckerpunch at 4:34 PM on September 2, 2011 [3 favorites]


I also think that this is the only digital copy of it.

If it's digital and it's on the internet, there are now potentially millions of copies of it. It was the only digital copy before it was uploaded, assuming that it wasn't copied from one folder to another, or from one computer to another.

This statement is just strange to me.
posted by Malice at 4:59 PM on September 2, 2011


I like how you can tell which side of 40 years old the commenters in this thread are.
posted by benito.strauss at 5:18 PM on September 2, 2011 [4 favorites]


Wow, that's pretty amazing.
posted by delmoi at 6:03 PM on September 2, 2011


That really is beautiful.

Being so close to the beginning of the semester, I've got a lecture to deliver that includes "smooth" shading in my 3D graphics course on Tuesday. I will show this. I hope my students appreciate the point when I tell them it's older than me. (They won't.)

Also, the "two faces linearly interpolated" look strikingly like Alyx Vance.
posted by rlk at 6:10 PM on September 2, 2011


40? 1972 was 40 years ago.
posted by maxwelton at 6:25 PM on September 2, 2011 [2 favorites]


Is it next year already?
posted by Sys Rq at 6:27 PM on September 2, 2011


The film shutter was open for a long time, and the film recorded whatever showed up on the CRT during that time

Wow. It's incredible to think about doing any graphics production work on a computer with less memory than a piece of Polaroid instant film.
posted by aaronetc at 6:31 PM on September 2, 2011


I would dispute the film's claim to represent the "world's first 3-D rendered movie", whatever that means.

As early as 1968 at MIT, Nick Negroponte, later founder of the MIT Media Lab, then a graduate student in the architecture department, was working on 3D movie renditions of architectural designs — ways to take an eye-level tour through an architectural design. These were black and white, line-drawing views of relatively simple structures, but by 1974 at Cornell I saw color 3D movies of architectural designs.

Here is Negroponte in 1975 discussing "Computer-aided participatory design."
posted by beagle at 6:43 PM on September 2, 2011


beagle, no one said "world's first 3-D rendered movie"--it was just '3D rendered'. No world's records were endangered by my statement.

There were lots of 3D computer films by that time. Fred Parke's may have been the first rendered 3D computer-generated film--I don't know, but there were already several 3D line-drawing films that I do know about; from UI and Ohio State at least.
Some films were made by drawing the frames with a plotter and then photographing the paper frames like drawn animation, but dammit I can't remember who did it.
posted by hexatron at 7:12 PM on September 2, 2011


GuyZero: But come on... the Cray X-MP looks perfect as a wonderful seating unit in any modernist home.

A while back, a friend, my girlfriend, and I had the opportunity to visit the National Cryptologic Museum (and the NSA gift shop). The museum has an X-MP set up, and you can walk right up to it and everything. Some of the LED panels are wired up, though I think they were rigged to a battery or something. Upon seeing the X-MP, my friend and I immediately rushed over and sat down on the bench. Soon one of the curators noticed us, and walked over.

"Do you know what you're sitting on?", he asked in a stern voice.

"Yeah! It's a Cray X-MP/Model 24! With a bench and everything!"

He looked at us for a few moments while trying to think of what to say, and eventually came up with:

"Well, as long as it's sitting in admiration."
posted by atbash at 7:49 PM on September 2, 2011 [17 favorites]


This thread has already become one of my favorites here.
posted by spiderskull at 12:06 AM on September 3, 2011


That style of cool jazz used for the soundtrack was already dated by 1972, but man, makes me long for those days when every film looked grainy in an earthy way and had a cool jazz soundtrack. You'd be right at home sitting in your den watching this on a boxy television, shag carpet on the floor and a rolling tray on the coffee table, and scattered across it some Mexican weed, along with a healthy pile of stems and seeds next to the Zig Zag papers. Everything's groovy in the early '70s ...

The animation is surprisingly good, as other people have mentioned.
posted by krinklyfig at 1:20 AM on September 3, 2011


So, some people mention that 3D imaging was already happening by 1972, but it hadn't even entered my imagination by the late '70s that such a thing was possible.

When I was a kid in the '70s, I remember going to the main computer room for the local school system with my soon to be step-dad. In those days they had a giant computer in a separate room, which was used for student and staff records. It was a punchcard system - I believe both the programs and data were stored on those cards, which had to be in just the right order in the stack to work properly. I was frankly blown away that I could play a game of computer golf, with each shot printed out on a dot matrix printer, as there was no console. You clicked one key for direction and one for velocity, and the printer would churn out a primitive rendering (using ASCII letters and symbols) of the hole you were on, along with the ball, which was placed according to the shot you played. I think I only got to play one or two holes, the printer chattering away furiously and showing me ever so slowly how the game played out, to my amazement. A whole round would have used up far too much paper and likely would have taken most of the day to complete ...

To me, it was magic, even though punchcard technology dated back to the 19th century. Not sure what I would have thought if someone showed me this film. My head would have asploded. Pretty sure of it.
posted by krinklyfig at 1:32 AM on September 3, 2011


40 Year Old 3D Computer Graphics, created [...] in... wait for it... 1972!

HAPPY NEW YEAR!
posted by sodium lights the horizon at 1:41 AM on September 3, 2011


I like how you can tell which side of 40 years old the commenters in this thread are.

Probably closer to 60. I'm north of 40, and this stuff predates me by quite a long while.
posted by Malor at 3:56 AM on September 3, 2011


hexatron:
In the opening frames of the movie the claim is made that it shows the "world's first 3D rendered movie." So there is in fact a "first" claim being made, which is disputable.
posted by beagle at 5:24 AM on September 3, 2011


beagle--'rendered' in the movie means 'displayed as shaded surfaces' vs 'line drawing'. All the earlier films were made of line drawings, typically by photographing a line-drawing display.

cf. 3D rendering:
3D rendering is the 3D computer graphics process of automatically converting 3D wire frame models into 2D images with 3D photorealistic effects on a computer.
posted by hexatron at 6:42 AM on September 3, 2011


Hey, once you young whippersnappers get off my lawn, you can learn a lot from people here who were in on the start of computer graphics, like Hexatron :-) Yes, more like over 60 than over 40 for some of us. We have a picture of one of our sons, now over 40, at about age 6 drawing with an old paint program at NYIT. He now works in the movie industry on graphics special effects, in which he got an extra-early start.
posted by mermayd at 7:01 AM on September 3, 2011


Sitting north of 40 myself (I would have been one of the kids who visited the local college to see the Computer Science people doing stuff like this), what I was noticing were comments from the younger side, like these
Wait, was this prerendered or is it real time?
and
I also think that this is the only digital copy of it.
This statement is just strange to me.
They really show how the changes in computer performance, while objectively just differences in quantity, have become differences in quality.

"Real-time"? If you'd ever sat waiting for a 1970's computer to compute the next 100 values in your pretty basic algorithm, you'd immediately know it wasn't real-time.

"Only digital copy"? Let's say the movie was 15MB worth of data. Was it worth spending $500 to buy an extra hard drive just to keep as a back-up copy? (At first I thought I might be a bit high on that estimated price, but after a little research I'm thinking it's too low.) Shoot, I forgot about tape. I have no idea about the capacity or cost of tape. But I do remember when you just didn't have enough storage space to keep a copy of anything you might ever want.

I've got no desire to offend either the older or younger half. I just find it fascinating how in a particular time and place there are incidental facts of material culture that are just obvious to those who experienced it, but unknown and unsuspected by those who didn't.

The 1970s is the earliest decade I remember, and it's now a historical period. (Yeah, I know. But that 30-year-old you're talking to? Born in the '80s. I think the whole last decade just kind of snuck by many of us, just because we never came up with a good name for it.) And when I see movies or TV shows set in the '70s, sometimes there are things that are glaringly out of place. That has made me think about the image I have of the 1950s or the 1930s. It's mostly what's been given to me by popular media. I wonder what elements of people's lives have been completely left out.
posted by benito.strauss at 8:06 AM on September 3, 2011 [1 favorite]


When I was a kid in the '70s, I remember going to the main computer room for the local school system with my soon to be step-dad. In those days they had a giant computer in a separate room, which was used for student and staff records. It was a punchcard system.

Why you kids and your fancy punched cards, we didn't have no punched cards when I was in junior high school. We used computer cards that we filled out with a #2 pencil. And we liked it!

No I'm not kidding. I had access to the local University IBM/360 through our Jr. High math club (yeah I was a geek). We wrote FORTRAN programs on paper, and then transferred them to "mark sense" cards with a #2 pencil.
posted by charlie don't surf at 3:42 PM on September 3, 2011


less memory than a piece of Polaroid instant film.

Hmmm, a piece of 4x5 film has a great deal of memory compared to a sensor (100 megs tops at the time of writing): about 1 gig, I think, though I think that's in terms of scannable info.

Amazing post.
posted by sgt.serenity at 5:11 PM on September 3, 2011


This is really really cool.

I thought the uncanny valley was worse when things approached verisimilitude... surprising how utterly creeped out those simplified faces left me feeling.
posted by kinnakeet at 6:24 PM on September 3, 2011


Why you kids and your fancy punched cards, we didn't have no punched cards when I was in junior high school. We used computer cards that we filled out with a #2 pencil. And we liked it!

For class registration, we did that too. But the more permanent records were kept on punched cards.

Actually, we did like it. Other than the archaic computer system (mid 1980s by the time I was in high school), we had a lot of autonomy in picking not only our class but also our instructor, at least for the classes which had more than one section and teacher available. The choice was dependent on when your name came up for registration. When they finally upgraded the computer system and did away with the cards is when that type of selection also went away. You could still pick your classes but not your teacher anymore. That happened after my freshman year. It had to be done, but before then it was almost like college registration.
posted by krinklyfig at 10:54 PM on September 3, 2011


Let's say the movie was 15MB worth of data. Was it worth it spending $500 to buy an extra hard-drive just to keep as a back-up copy? (At first I thought I might be a bit high on that estimated price, but after a little research I'm thinking it's too low.)

You're lowballing by a factor of about 200-ish.

A 15MB hard drive in 1972 would have been well over $100,000. (Winchester drives didn't hit the market until a year later, but even they would have cost as much as a mansion in those days.)
posted by ShutterBun at 6:34 AM on September 6, 2011
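
The "factor of about 200-ish" works out if early-1970s disk ran in the neighborhood of several thousand dollars per megabyte. The $/MB figure below is a loose assumption picked to match ShutterBun's "well over $100,000," purely for illustration, not a sourced price:

```python
# Rough check of the "factor of about 200" claim: storing a 15MB
# backup at early-1970s disk prices. cost_per_mb is an assumed
# illustrative figure, not a sourced number.
movie_mb = 15
original_guess = 500     # benito.strauss's $500 estimate
cost_per_mb = 7000       # assumed ~1972 disk cost, dollars per MB

actual = movie_mb * cost_per_mb
print(actual)                          # -> 105000
print(round(actual / original_guess))  # -> 210, i.e. ~200x too low
```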

