Which is Better for TV Graphics, Mac or Windows?
May 31, 2017 1:01 PM
Neither. What you need is the Cromemco System 400 with ArtStar! (slyt)
huh, that was pretty impressive. didn't know 1987 had that tech available.
posted by phooky at 1:10 PM on May 31, 2017 [4 favorites]
back in 1987, i was using HCHAR and VCHAR on my TI-99
posted by bitteroldman at 1:11 PM on May 31, 2017 [1 favorite]
Funny. I worked in television production in the early 80s. At that time, we had a $500k standalone computer graphics system (Thomson-CSF IIRC) that was insanely complex, took programming skills, and would often take days to render simple animated graphics.
I remember spending weeks to get a title text to colorize, roll forward, and then sparkle. Nowadays, every youtube vid has some kind of intro that prolly took 30s to compose.
posted by CrowGoat at 1:28 PM on May 31, 2017 [4 favorites]
Ah, you're lucky! We had a four million dollar standalone computer graphics system made of rusty tin cans! It was so complicated only four people in the world could use it, and they all hated us! You had to program it in ancient Sumerian cuneiform, and rendering a simple wipe transition took eight years! I spent a month once just trying to turn it on without getting tetanus!
Oh, we used to DREAM of rendering animated graphics in days.
posted by Naberius at 1:50 PM on May 31, 2017 [14 favorites]
I'm shocked that it's being operated by people wearing suits. That doesn't square with my time in the computer animation industry at all. (There was one guy who'd always wear a tie. Everyone called him Tie. It was two years before I learned that that wasn't his real name.)
posted by clawsoon at 1:55 PM on May 31, 2017 [4 favorites]
Cromemco also made the first digital camera.
posted by General Malaise at 2:27 PM on May 31, 2017 [2 favorites]
I worked in the signal processing lab at CWRU for a while back in the early 80s. They'd bring visitors in to marvel at the expensive raster displays (don't call them monitors) that could display ... 64 simultaneous colors.
posted by lagomorphius at 3:36 PM on May 31, 2017
By 1987 we had a Quantel Harry at Post Perfect in NYC, which was basically the Quantel Paintbox coupled to a disk based recorder that held 50 seconds of 4:2:2 525 line component video. I don't recall there being any Cromemco systems in post production facilities.
posted by Dean358 at 3:47 PM on May 31, 2017
At uni in the mid-90s, there was still one of these beauties hooked up in the studio.
It was last seen around 2000. In a skip.
posted by prismatic7 at 3:56 PM on May 31, 2017
I love their sample promotions. House fire! Bomb threat!
huh, that was pretty impressive. didn't know 1987 had that tech available.
1987 is when the VGA card came out. Which does support a flexible 8-bit graphics system with 256 simultaneously displayed colors from a quarter million color palette. No idea what the story is with the 24-bit system though (or how they're integrated). I wonder if the strange specification of "more than 367,000 colors from a palette of more than 16 million colors" just means they have regular 24-bit color, but the number of simultaneous colors is limited by the resolution (640x576?).
Also, there's a glitch in the upper left corner of the 3d.
posted by aubilenon at 4:03 PM on May 31, 2017
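A rough sanity check of those colour counts, sketched in C (the 640x576 frame size is only the guess from the comment above, not a documented ArtStar mode):

```c
/* Rough check of the colour counts discussed above.
   The 640x576 frame size is only a guess from the comment. */
#include <stdio.h>

int main(void)
{
    long vga_palette  = 1L << (3 * 6);   /* VGA DAC: 6 bits per channel -> 262,144 */
    int  vga_onscreen = 1 << 8;          /* 8-bit mode: 256 colours at once        */
    long truecolor    = 1L << (3 * 8);   /* 24-bit: 16,777,216 colours             */
    long frame_pixels = 640L * 576L;     /* 368,640 pixels in the guessed frame    */

    printf("VGA palette size:    %ld\n", vga_palette);
    printf("VGA simultaneous:    %d\n",  vga_onscreen);
    printf("24-bit palette size: %ld\n", truecolor);
    printf("640x576 pixel count: %ld\n", frame_pixels);
    return 0;
}
```

368,640 unique-pixel colours does sit just above "more than 367,000", which supports the reading that the simultaneous-colour figure is simply capped by the frame size.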
This is awesome. Is there an in-browser emulator for it somewhere?
posted by paper chromatographologist at 4:56 PM on May 31, 2017 [2 favorites]
Wait wait wait, no mention of Amiga and Video Toaster!? [wiki page] [rad celebrity endorsement vid]
posted by destructive cactus at 5:13 PM on May 31, 2017 [2 favorites]
The part where they just sorta ... pile ... bikini models into an image was weird and mildly creepy.
posted by aramaic at 5:13 PM on May 31, 2017 [3 favorites]
Yeah, the Video Toaster was announced in 1987 - and the Amiga had come out two years earlier, with only 12-bit colour but a really strong indication that consumer-grade silicon was getting interesting for professional purposes. But dedicated workstations for different tasks still had a while to go... and then the murders began.
VGA was kind-of impressive at the time (especially compared to the clumsy-arse EGA) but it was still a very simplistic architecture that knew about putting a chunk of linear memory onto a screen and mapping some colours in, but nothing else. Because why would you need a business computer to do anything more? If you wanted more, you were in a well-heeled profession which could afford custom, small-run, complex designs (with all peripherals priced to match, natch). SGI's glory days were between 1985 and 1995, for example, but then the combo of 3D mass-market PC adaptors and the ability to create ad-hoc processing clusters out of commodity boxes moved the momentum elsewhere and cratered the performance-at-any-price ploy. For a while, the increased opportunity for consumer-type graphics content that could best be fed by boffo CAD/video/dev workstations kept the boat afloat, as did the big sciencey stuff, but good enough kept eating away at the best.
Markets coalesced. Having 50 percent of a market is great, but not if that whole market then becomes 10 percent of another market and you're suddenly in competition with people who are making and selling 95 chips to every five you box up at five times the price. Games could soak up a lot of graphics grunt, and that period was when Moore's Law was in a sweet spot of not only matching but driving consumer hunger for more of everything, please.
On the workstation planet, nobody got out alive.
posted by Devonian at 5:49 PM on May 31, 2017 [8 favorites]
I had no idea that combining an 8-bit and a 24-bit graphics system yields a 32-bit image, but the math does seem to work out.
posted by maryr at 7:13 PM on May 31, 2017 [2 favorites]
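The arithmetic does mirror how a 24-bit RGB image plus an 8-bit key/overlay channel is commonly packed into 32 bits per pixel. A hypothetical sketch of just that bit-packing (not necessarily how the Cromemco hardware combines its two systems):

```c
/* Hypothetical illustration of the 8 + 24 = 32 arithmetic: a 24-bit RGB
   pixel plus an 8-bit key/overlay channel packed into one 32-bit word. */
#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>

static uint32_t pack_krgb(uint8_t key, uint8_t r, uint8_t g, uint8_t b)
{
    /* key in the top byte, then red, green, blue */
    return ((uint32_t)key << 24) | ((uint32_t)r << 16) |
           ((uint32_t)g   <<  8) |  (uint32_t)b;
}

int main(void)
{
    uint32_t px = pack_krgb(0xFF, 0x20, 0x80, 0xC0);   /* fully keyed pixel */
    printf("packed: 0x%08" PRIX32 ", %d bits total\n", px, (int)(8 * sizeof px));
    return 0;
}
```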
I would murder someone to get one of those old Fairlight cases and the skill to mod it and the keyboard and such for my PC.
posted by Samizdata at 7:22 PM on May 31, 2017
This reminds me of my adolescent lust for the Video Toaster, a Commodore-Amiga-based video production system. I sent away for a copy of this promotional video on VHS, and wished I could afford the four-digit price tag so I could make shitty Lawnmower-Man-style 3D scenes.
(on non-preview: jinx, destructive cactus!)
posted by escape from the potato planet at 7:25 PM on May 31, 2017 [2 favorites]
It was so complicated only four people in the world could use it, and they all hated us! You had to program it in ancient Sumerian cuneiform
[two lines of Sumerian cuneiform characters]
posted by XMLicious at 8:31 PM on May 31, 2017
Naberius: "Oh, we used to DREAM of rendering animated graphics in days."
Was this at a TV station in Yorkshire?
posted by Chrysostom at 8:40 PM on May 31, 2017 [2 favorites]
I always thought TRON should end with Jeff Bridges getting back out of the computer but it's hundreds of millions of years after he went in because the computer he was in had like a 0.73MHz CPU.
posted by under_petticoat_rule at 9:14 PM on May 31, 2017 [6 favorites]
Or he finds that it's actually an emulation written in Javascript and running on a cruddy Android tablet. It's the Tron-Matrix mash-up the world needs!
(other alternative ending - screen fades to black with red lettering saying 'Opps! Your files have been encrypted. Send 3 bitcoin within 24 hours.')
posted by Devonian at 4:44 AM on June 1, 2017 [2 favorites]
The MSX2 had a 256-color mode using the Yamaha V9938 VDP chip, which I believe was introduced around 1985.
posted by RobotVoodooPower at 6:57 AM on June 1, 2017
I remember trying to paint on my Amiga in HAM display mode, which was a very strange and non-intuitive experience. I got my Amiga 500 used in 1993. I'm not sure when it was manufactured, but the article I linked above says HAM was introduced in 1985.
posted by under_petticoat_rule at 7:37 AM on June 1, 2017
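For anyone who never fought with it, Hold-And-Modify (HAM6) works roughly like this: each 6-bit pixel either picks one of 16 base-palette colours or copies the pixel to its left with a single colour channel replaced. A simplified decoder sketch, assuming 4-bit colour components:

```c
/* Simplified sketch of Amiga HAM6 decoding, assuming 4-bit R/G/B components.
   Each 6-bit pixel: top two bits pick the operation, bottom four carry data. */
#include <stdint.h>

typedef struct { uint8_t r, g, b; } Rgb4;   /* 4 significant bits per channel */

/* base_palette: the 16 ordinary colour registers; prev: the colour of the
   pixel to the left (the border colour at the start of a scanline). */
Rgb4 ham6_decode(uint8_t pixel6, const Rgb4 base_palette[16], Rgb4 prev)
{
    uint8_t ctrl = (pixel6 >> 4) & 0x3;
    uint8_t data =  pixel6       & 0xF;
    Rgb4 out = prev;                         /* "hold" the previous colour... */

    switch (ctrl) {
    case 0: out = base_palette[data]; break; /* ...or take a palette entry    */
    case 1: out.b = data; break;             /* "modify" only the blue value  */
    case 2: out.r = data; break;             /* "modify" only the red value   */
    case 3: out.g = data; break;             /* "modify" only the green value */
    }
    return out;
}
```

That left-to-right dependence is what made painting so odd: touching one pixel can smear colour fringes across everything to the right of it on the same scanline.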
I think one of, if not the, first graphics processor chips was the NEC 7220 from (checks Wikipedia) 1982. It had an internal processor that could do graphics primitives, had multiple mixed-mode windows and other useful stuff, could address up to 512K of video memory and produce 1024x1024 with four colour planes (the palette management was external, so depended on implementation). Drawing speed was 500ns/pixel (ouch... but this was 1982).
Not so much advanced for its time as unique, and quite widely used - although never by a major PC or home computer maker. I first encountered it in an HH Tiger (it's very obscure, you won't have heard of it, can I borrow some beard product?) but it also ended up in DEC Rainbows and stuff like the Vectrix terminal, about which I can find little except worrying notes like "Singer/composer Todd Rundgren wrote a paint program using the Vectrix terminal and an Apple II with a graphics tablet." There's an impressive promo video, or you can just relax to the fine Moogular experience of the short animation, Chips In Space. (Rundgren features in neither.)
posted by Devonian at 8:32 AM on June 1, 2017
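The quoted figures hang together: four planes at 1024x1024 is exactly 512 KB, and at 500 ns per pixel touching every pixel once takes about half a second. A quick check in C:

```c
/* Quick consistency check of the uPD7220 figures quoted above. */
#include <stdio.h>

int main(void)
{
    long pixels = 1024L * 1024L;
    long bytes  = pixels * 4 / 8;    /* four bit-planes, eight bits per byte */
    double fill = pixels * 500e-9;   /* 500 ns per drawn pixel               */

    printf("frame buffer: %ld bytes (%ld KB)\n", bytes, bytes / 1024);  /* 512 KB  */
    printf("full-screen draw: ~%.2f seconds\n", fill);                  /* ~0.52 s */
    return 0;
}
```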
VGA ... was still a very simplistic architecture that knew about putting a chunk of linear memory onto a screen and mapping some colours in, but nothing else. Because why would you need a business computer to do anything more? If you wanted more, you were in a well-heeled profession which could afford custom, small-run, complex designs (with all peripherals priced to match, natch). SGI's ...
I can find a list of components featured in early SGI workstations ("Frame Buffer", "Update Controller", "Display Controller", "Bit plane"), but I can't find any interesting details - even CGA had a frame buffer, and "display controller" could mean nearly anything. Can you say a bit about the cool stuff they did that VGA didn't?
posted by aubilenon at 10:56 AM on June 1, 2017
A frame buffer is just the area of memory that contains the data to be displayed, and they exist in every graphics system that uses frames (which is most of them; vector graphics systems have a list of points to connect up instead). In general, there's a series of bytes in memory which correspond in some way to a pattern of pixels on the screen. At its simplest, you have a block of memory where each bit in a byte is a pixel displayed black if it's a zero, white if it's a one; each successive byte is each successive row of eight pixels from the top left of the screen to the bottom right. So just about every computer you'll come across, from a ZX81 to a PS4, has frame buffers of some sort or another. (Early video game consoles had wonderfully bizarre systems, which is a different story altogether...)
With VGA, you don't really have much more. The CPU works out all the contents of the image and puts it in the frame buffer, and the VGA just translates that into video. I forget all the details (I did do low-level VGA programming a loooong time ago), but you could have multiple frame buffers so you could quickly switch from one to the other for rapid updates without having to wait for the CPU to redraw a screen while you watched, and of course you had different modes that had different resolutions (and thus different sizes of frame buffers), and ISTR there were a few tricks you could play with mode-switching at different times, but that was about it. Colours (or intensities) could be changed through palette mapping, which meant the main display system put out, say, colours one through 64 and a separate system assigned each one of those numbers to a real colour out of a much larger range.
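A minimal sketch of those two ideas - a 1-bit-per-pixel frame buffer laid out row by row, and a palette table that turns small stored values into real colours on the way out. The sizes and names here are illustrative, not any particular adapter's layout:

```c
/* Minimal sketch of a 1-bit-per-pixel frame buffer (each byte holds eight
   pixels, rows stored top to bottom) plus a palette lookup of the kind a
   VGA-era DAC applied in hardware. Sizes and names are illustrative only. */
#include <stdint.h>
#include <stdio.h>
#include <string.h>

enum { WIDTH = 640, HEIGHT = 480, STRIDE = WIDTH / 8 };

static uint8_t framebuffer[STRIDE * HEIGHT];        /* 38,400 bytes */

/* Turn one pixel on or off: find its byte, then its bit within that byte. */
static void set_pixel(int x, int y, int on)
{
    uint8_t *byte = &framebuffer[y * STRIDE + x / 8];
    uint8_t  mask = (uint8_t)(0x80 >> (x % 8));     /* leftmost pixel = high bit */
    if (on) *byte |=  mask;
    else    *byte &= (uint8_t)~mask;
}

/* Palette mapping: the buffer stores a small value per pixel, and a separate
   table says what real colour that value means. Here it is just on/off; an
   8-bit indexed mode would use a 256-entry table of the same shape. */
static const uint32_t palette[2] = { 0x000000, 0xFFFFFF };   /* 24-bit RGB */

int main(void)
{
    memset(framebuffer, 0, sizeof framebuffer);
    set_pixel(10, 20, 1);                           /* one lit pixel */

    int bit = (framebuffer[20 * STRIDE + 10 / 8] >> (7 - 10 % 8)) & 1;
    printf("pixel (10,20) displays as #%06X\n", (unsigned)palette[bit]);
    return 0;
}
```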
SGI had a lot of different architectures, but in general what makes a graphics processor unit different from a VGA-like display adaptor is that the GPU has a lot of capabilities within the chip to create and manipulate objects and areas according to commands sent to it - it doesn't rely on the main CPU to do all the maths and create the contents of the display. So it could have circuitry that will automatically take a pattern and apply it to defined areas of the screen (texture mapping), or draw lines or triangles, or manage shapes by manipulating their corners, and so on. Because this is all on-chip and tightly coupled to the chip's own memory interface - which is often if not always faster than the CPU's memory interface - it can in any case manipulate display memory much faster than a dumb display adaptor. GPUs can also have multiple computational cores to do parallel processing on many different areas of a frame buffer (usually, many independent buffers that the chip stitches together for the final image), a memory architecture that's optimised for such shenanigans, and so on.
There's nothing you can do on a GPU that you can't do with a CPU and a dumb adaptor, and nothing you can do in hardware that you can't do in software, but how fast you can do it and how much it costs depend hugely on which choices you make - and can make, given the technology available to you at the time.
As graphics circuitry developed over time, some of these functions moved back and forth between the GPU and the CPU, depending on how performance and cost issues changed - SGI had 3D processing in hardware, then because it needed to be more price-competitive moved it back to the CPU for its lower-cost models, then as GPU design got better back into the GPU again. Things like the speed and cost of memory also played a big role in how such things were partitioned.
Because we can now build chips with thousands of processing cores, and graphics is uniquely suited to massively parallel processing, that's how we do GPUs these days - the gaming market is strong enough to pay for the R&D and big production runs that keep the cost down. GPUs are also the de facto computational units for a lot of other massively-parallel supercomputing tasks; the gamers have paid for that, so when you play Call of Duty you're supporting Science, so yay. On the other hand, a lot of the rendering of animated, very high resolution frames is so complex, diverse and big that CPUs are still doing it, especially now that even general purpose CPUs have instructions explicitly for handling graphical data efficiently.
To do the history of desktop computer graphics justice would take a book, though.
posted by Devonian at 11:47 AM on June 1, 2017 [4 favorites]
Can you say a bit about the cool stuff they did that VGA didn't?
One important thing: SGI created OpenGL and the hardware to go with it. Dedicated hardware for 3D with an open API for developers.
posted by clawsoon at 11:53 AM on June 1, 2017
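For flavour, this is roughly what that open API looked like in its early immediate-mode (OpenGL 1.x) form - the application hands the hardware primitives rather than finished pixels, which is Devonian's GPU-versus-dumb-adaptor point in miniature. A sketch assuming GLUT is available to supply the window and GL context:

```c
/* A taste of the early OpenGL 1.x immediate-mode API.
   Sketch only; assumes GLUT is installed to provide the window and context. */
#include <GL/glut.h>

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_TRIANGLES);                 /* hand the hardware a primitive... */
    glColor3f(1.0f, 0.0f, 0.0f); glVertex2f(-0.6f, -0.5f);
    glColor3f(0.0f, 1.0f, 0.0f); glVertex2f( 0.6f, -0.5f);
    glColor3f(0.0f, 0.0f, 1.0f); glVertex2f( 0.0f,  0.6f);
    glEnd();                               /* ...and let it do the rasterising */
    glFlush();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
    glutCreateWindow("one triangle, SGI style");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```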
Hah, that's a really great explanation, but I feel guilty for making you type it! I even wrote "I know what a framebuffer is" in my initial draft of that comment, but I guess I lost it while editing. I am quite familiar with old PC graphics tech, but I never worked with much else, so I'm super interested to hear the nitty gritty details about things other platforms did. But from your description it doesn't sound like they did anything particularly weird, and most of it eventually came to commodity PC hardware, if way way way later (e.g., hardware geometry acceleration didn't show up until Nvidia's GeForce in 1999).
posted by aubilenon at 1:15 PM on June 1, 2017 [1 favorite]
Nah, I like doing it!
The history of computer graphics is really interesting, involved and as old as computers themselves. I only know those little bits I happened to intersect as part of either my engineering or journalism background, so others will know far more about the nitty gritties. Or you could dig around yourself; even finding promotional material online from an interesting system will give you some terms or clues to feed back into Google. For example, the only SGI machine I ever encountered properly was an O2 (I reviewed it for a magazine, as SGI had belatedly realised that talking to journalists might be useful... it didn't help them). Wikipedia sez about the O2's graphics -
The O2 used the CRM chipset that was specifically developed by SGI for the O2. It was developed to be a low-cost implementation of the OpenGL 1.1 architecture with ARB image extensions in both software and hardware. The chipset consists of the microprocessor, and the ICE, MRE and Display ASICs. All display list and vertex processing, as well as the control of the MRE ASIC is performed by the microprocessor. The ICE ASIC performs the packaging and unpacking of pixels as well as operations on pixel data. The MRE ASIC performs rasterization and texture mapping. Due to the unified memory architecture, the texture and framebuffer memory comes from main memory, resulting in a system that has a variable amount of each memory. The Display Engine generates analog video signals from framebuffer data fetched from the memory for display.
Which says basically nothing, but follow the clues there and within a few hops you'll be finding tons of technical stuff about the architecture of not only that but the preceding generations, like this architecture, which led to this monster of a chip.
posted by Devonian at 2:03 PM on June 1, 2017 [1 favorite]
I can find little except worrying notes like "Singer/composer Todd Rundgren wrote a paint program using the Vectrix terminal and an Apple II with a graphics tablet.
Todd Rundgren, you say? I noticed his name as an owner of one of the "instruments" in the page prismatic7 linked to above.
posted by under_petticoat_rule at 3:28 PM on June 1, 2017
aubilenon: But from your description it doesn't sound like they did anything particularly weird, and most of it eventually came to commodity PC hardware, if way way way later (e.g., hardware geometry acceleration didn't show up until nVidia's GeForce in 1999).
The story I heard coming into the industry (in 2000, just when the transition from SGI to PC was starting) was that Nvidia was founded by a breakaway group of SGI engineers, though I've heard that story disputed since then. I've never bothered to confirm it one way or another. I saw lots of O2s, some Octanes for high-end work, and Onyx or (in SGI's last gasp) Tezro in server roles. An Indy or two around as museum pieces.
You're right that pretty much everything eventually came to PC hardware, but think of that as a tribute to their technology: The fact that little of what they did seems weird is because they got a lot right when they created it in the first place. There aren't many standards which have maintained significance since 1991 and are still driving cutting-edge hardware technology.
If you want a more forceful answer, you could probably ask your question somewhere over on Nekochan.net, where I believe a few SGI diehards still share hardware and software info. Their wiki may have some useful hardware info, too.
posted by clawsoon at 6:55 PM on June 1, 2017 [1 favorite]
Painting with Light (about the new-fangled computer art)
posted by cosmic.osmo at 9:00 PM on June 1, 2017 [1 favorite]
A friend has just restored a Cromemco Dazzler system from the mid-70s. It has all of 2K of video memory, but it had genlock capability and so owned the early digital weather maps on television.
posted by scruss at 10:10 AM on June 3, 2017
I didn't know about the Dazzler - apparently, it started off as a DIY project in Popular Electronics. The sample animation in this short video looks very much like Viewdata/Teletext, which was developed contemporaneously.
posted by Devonian at 1:17 PM on June 3, 2017