What's Inside your [Desktop] computer Graphics Card? SLYT - 6min 28sec
February 12, 2019 4:23 AM

Animated explanation of your desktop graphics card's components and what each part is designed to do. Use a computer? Game on a PC? Ever wonder how those graphics get so pretty? Let's go inside your high-end graphics card with this animation.

This short Lifewire article (updated in 2018) is a good intro to the question "What Is a Video Card?"

If you'd rather read about how we got the GPU technology we have today, this 2013 TechSpot article looks at the history of graphics cards.

This PCWorld 11-image slideshow (also from 2013) purports to list the 10 Most Important Graphics Cards in PC history.

Know of a more recent (or more comprehensive) edutainment article / video about graphics cards? Please share!
posted by Faintdreams (15 comments total) 30 users marked this as a favorite
 
How apropos, I'm in a computer graphics class this semester. I'm sending that first video to my prof, thanks for posting.
posted by scruffy-looking nerfherder at 6:06 AM on February 12, 2019


10 most important graphics cards... No mention of SGI. Sad.
posted by GuyZero at 7:36 AM on February 12, 2019 [2 favorites]


I am currently troubleshooting a dying graphics card so this is also apropos for me. Thank you for sharing.
posted by Hermione Granger at 8:31 AM on February 12, 2019


*lovingly pets my GTX 950*
posted by Fizz at 8:48 AM on February 12, 2019


The primary video link boils down to describing four components:
  • Cooling system
  • GPU
  • VRAM
  • Voltage regulator
It's a pretty high level treatment.

I'd love to read an intro to programming a modern GPU. The equivalent of an assembly language tutorial, but for a GPU instead of a CPU. Any suggestions?
posted by Nelson at 9:19 AM on February 12, 2019 [1 favorite]


Yeah, the main thing the video says is that a video card is a little computer inside your computer, with its own processor and memory, and that it differs from the main computer mainly by having thousands of cores optimized for doing one kind of calculation over and over, really fast.
posted by straight at 9:40 AM on February 12, 2019


What I think is interesting, and wish I knew more about, is the way GPUs are one area where the parallel processing that the computer industry has promised for so long has really taken off. I understand that there are graphics cards that don't actually do any graphics - you use them to offload parallel calculations.
posted by Multicellular Exothermic at 9:47 AM on February 12, 2019


NVidiocy.
posted by Jessica Savitch's Coke Spoon at 12:09 PM on February 12, 2019


The equivalent of an assembly language tutorial, but for a GPU instead of a CPU. Any suggestions?

I'm vaguely aware that there is an OpenGL assembly language (ARB), but personally I'd start with the higher-level and more often used GLSL; there are a bunch of other shader languages, but it's widespread, capable, and fairly simple.
posted by sfenders at 12:36 PM on February 12, 2019 [3 favorites]


there are graphics cards that don't actually do any graphics - you use them to offload parallel calculations.

You can do that with just about any graphics card. The Nvidia way, for example, is called CUDA. Very useful if you're trying to train a deep neural network on your home PC.
posted by sfenders at 12:40 PM on February 12, 2019 [1 favorite]


Very useful if you're trying to train a deep neural network on your home PC.

Isn't everyone?!
posted by Fizz at 1:33 PM on February 12, 2019 [2 favorites]


This set of lecture slides has a pretty good description of how a GPU works, although it somewhat assumes you know how 3D graphics work in general. I'm sure it would be even better with an actual lecture.
posted by GuyZero at 2:17 PM on February 12, 2019


Isn't everyone?!

You joke, but in Silicon Valley, yes, yes they are.
posted by GuyZero at 2:17 PM on February 12, 2019 [1 favorite]


Your graphics card has a clot in its brain, and the only way to get it out is to miniaturize you and a ship and inject you into it and have you adventure through the card to the right location and burn the clot out manually while tiny!

Oh, wait, um... this isn't that? Dammit!
posted by hippybear at 1:23 AM on February 13, 2019 [1 favorite]


GPU Performance for Game Artists is an article that made the rounds in gamedev circles a couple years ago that happens to give a really good overview.

For learning GPU programming I cannot recommend Shadertoy (WebGL req) enough for its whole “see something cool, edit it and run your modified version instantly” nature. The author (Inigo Quilez aka IQ) has also published exhaustive sets of optimized code for basic raytracing operations, particularly intersection testing. Alternately, if you’d prefer something both more user-friendly and in line with actual game development praxis, Unreal 4 is a free download - just open up any base material and try playing around with the Material Editor, which is basically streamlined visual programming for GPUs in abstract.
posted by Ryvar at 10:01 AM on February 13, 2019 [2 favorites]




This thread has been archived and is closed to new comments