"the naive approach is often to use a Gaussian blur"
September 10, 2016 5:40 PM   Subscribe

Have you ever wondered what your graphics card is doing every time it displays one frame of a game? Turns out quite a lot.

In case you started reading this and blacked out, only to wake up in a tub full of math, here are some videos from the "As Fast as Possible" series to bring you up to speed:

Ambient Occlusion
Texture Filtering
Anti Aliasing
More Anti Aliasing
DX12 and Vulkan (and APIs)
Refresh Rates
Resolution
posted by selfnoise (32 comments total) 47 users marked this as a favorite
 
The same author did a similar analysis of GTA V last year.

It's the kind of thing that makes you fascinated, then envious, then frustrated, then sleepy.
posted by RobotVoodooPower at 5:59 PM on September 10, 2016 [4 favorites]


Metafilter: makes you fascinated, then envious, then frustrated, then sleepy.
posted by lalochezia at 7:13 PM on September 10, 2016 [18 favorites]


And just to be pedantic, they are all hacks. Every frame could be rendered exactly by ray-traced representations of the physical laws, but that would take seconds (or hours, depending on level of detail). Thus the use of incredibly sophisticated, highly efficient optimizations on state-of-the-art GPUs (graphics processing units), using matrix algorithms honed by the finest mathematical minds of our generation, all for a game hack!
posted by sammyo at 7:53 PM on September 10, 2016 [5 favorites]


The fad for mucking up nicely pristine video game frames with "aesthetic" depth of field blurs aka "bokeh" is an irritating skeuomorphic regression. One day I hope it will look as dated as magnetic ink fonts or airbrushing everything.
posted by meehawl at 8:28 PM on September 10, 2016 [6 favorites]


They should just pre-render every possible scene, and ship the game on 1000 hard drives.
posted by blue_beetle at 8:44 PM on September 10, 2016 [6 favorites]


The fad for mucking up nicely pristine video game frames with "aesthetic" depth of field blurs aka "bokeh" is an irritating skeuomorphic regression.

Eh, it depends on the game. I would say DoF is rarely used effectively in gameplay (outside of maybe iron sights blur) but certainly belongs in cutscenes. Most post-processing effects of that nature are best used only in the specific contexts where they are helpful, but end up getting larded onto every frame because... who knows why.

Now, motion blur? Motion blur should be deactivated unless you have a 144 Hz monitor. At 60 Hz with only a normal GTG speed I'm getting enough blur without you adding more, you silly bastards.
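Since the post title name-checks "the naive approach is often to use a Gaussian blur," here's a toy sketch of what that kind of blur pass does, written in NumPy rather than a shader. The function and parameter names are invented for the example; no engine's post-process actually looks like this.

```python
import numpy as np

def gaussian_kernel(radius, sigma):
    """1-D Gaussian kernel, normalized so the weights sum to 1."""
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-x**2 / (2.0 * sigma**2))
    return k / k.sum()

def gaussian_blur(image, radius=2, sigma=1.0):
    """Separable Gaussian blur on a single-channel 2-D float image:
    one horizontal pass, then one vertical pass. Two 1-D passes cost
    O(r) per pixel instead of O(r^2) for a full 2-D kernel, which is
    why real blur shaders are written this way too."""
    k = gaussian_kernel(radius, sigma)
    blur_1d = lambda v: np.convolve(v, k, mode='same')
    out = np.apply_along_axis(blur_1d, 1, image)   # blur each row
    out = np.apply_along_axis(blur_1d, 0, out)     # then each column
    return out
```

A DoF effect then mixes this blurred frame with the sharp one per pixel, weighted by distance from the focal plane.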

They should just pre-render every possible scene, and ship the game on 1000 hard drives.

Man, there's some kind of dark alternate future where Trilobyte is shipping "The 91st Guest" on a 4 petabyte crystal.
posted by selfnoise at 8:49 PM on September 10, 2016 [10 favorites]


Myst and two of its sequels did that; Riven had like 5 CDs you needed to swap between when you went to a new area.

Wasn't there a game being publicised that used a 360 camera moved around a real-life "set" to set up environments you could move around?
posted by BungaDunga at 8:51 PM on September 10, 2016


They should just pre-render every possible scene, and ship the game on 1000 hard drives.
you work at google maps, right?
posted by j_curiouser at 9:07 PM on September 10, 2016 [7 favorites]


makes you fascinated, then envious, then frustrated, then sleepy

...and it completes this cycle every 16ms.
posted by lore at 9:23 PM on September 10, 2016 [10 favorites]


And just to be pedantic, they are all hacks. Every frame could be rendered exactly by ray-traced representations of the physical laws, but that would take seconds (or hours, depending on level of detail). Thus the use of incredibly sophisticated, highly efficient optimizations on state-of-the-art GPUs (graphics processing units), using matrix algorithms honed by the finest mathematical minds of our generation, all for a game hack!

The same convolution kernel operation (though with fundamentally different kernels, and a different attitude toward the kernels) got used in both the forward and backward passes of the convnet AlphaGo used for the match against Lee Sedol. Not the same hardware, though: they were using TPUs, which cleverly give up precision for speed.
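A minimal illustration of that "same operator, different kernels" point: the loop below is the core operation both a GPU blur pass and a convnet layer reduce to, and only the kernel values differ (hand-tuned for graphics, learned for the net). Function and variable names are invented for the example.

```python
import numpy as np

def conv2d(image, kernel):
    """Direct 2-D convolution over the 'valid' region (no padding).
    The kernel is flipped, per the mathematical definition; for the
    symmetric kernels below that makes no difference."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    flipped = kernel[::-1, ::-1]
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(image[y:y+kh, x:x+kw] * flipped)
    return out

box_blur = np.full((3, 3), 1.0 / 9.0)       # a graphics-style smoothing kernel
edge = np.array([[ 0, -1,  0],
                 [-1,  4, -1],
                 [ 0, -1,  0]], dtype=float)  # a detector-style kernel
```

Same `conv2d`, completely different results: `box_blur` averages a neighborhood, `edge` responds only where intensity changes.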
posted by hleehowon at 10:03 PM on September 10, 2016


"aesthetic" depth of field blurs aka "bokeh" is an irritating skeuomorphic regression.

yeah, well...problem is the thing being skeuomorphically referenced is...your eyeball (at least for circular bokeh). You just don't notice it because it only happens to things you aren't focusing on directly...which is why it's so useful as far as 'this is what you should be focusing on' or 'your eyes aren't focusing properly because you've been drugged and strapped to a table' (like the example they show). Personally, I like a certain amount of immersion. Fun trick: make your built-in bokeh into a super-cheezy star filter...squint! (because eyelashes)
posted by sexyrobot at 10:36 PM on September 10, 2016 [12 favorites]


Can you please explain this to my expensive ass graphics card from NVIDIA which keeps insisting that I *have* to optimize DAI by reducing all the graphics qualities even though it is 100% capable of doing everything on Ultra as evidenced by the fact that I have been running everything on Ultra for all 410 hours I've logged for the game because I think it is having some kind of identity crisis right now.

Also screw GeForce Experience for suddenly forcing me to log in in order to take advantage of all these nice features and updates and stuff for the aforementioned expensive graphics card. I paid for you to do cool stuff without all that nonsense -- why are you pulling a "girl you THOUGHT" on me after all this time???? we both know it isn't necessary so why you lyin :(
posted by Hermione Granger at 11:10 PM on September 10, 2016


It's interesting how these things go in cycles. In the old days, everyone was using forward renderers with a bunch of hacks to avoid applying every light to every triangle. Then in the late 2000s/early 2010s, everyone switched to deferred rendering. Now the hot thing is once again forward rendering with a bunch of (compute shader powered!) hacks to avoid applying every light to every triangle.
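A toy sketch of the light-binning idea behind those compute-shader hacks (tiled light culling): split the screen into tiles and give each tile a short list of the lights that can reach it, so shading loops over that list instead of over every light. Names and the screen-space simplification are invented here; real engines do per-tile frustum tests on the GPU.

```python
import math

def cull_lights(lights, screen_w, screen_h, tile=16):
    """Bin each light into the screen tiles its radius of influence
    touches. Each light is (x, y, radius) in screen space. Returns the
    per-tile light-index lists plus the tile-grid width."""
    tiles_x = math.ceil(screen_w / tile)
    tiles_y = math.ceil(screen_h / tile)
    bins = [[] for _ in range(tiles_x * tiles_y)]
    for i, (x, y, r) in enumerate(lights):
        # Clamp the light's bounding box to the tile grid.
        x0 = max(0, int((x - r) // tile))
        x1 = min(tiles_x - 1, int((x + r) // tile))
        y0 = max(0, int((y - r) // tile))
        y1 = min(tiles_y - 1, int((y + r) // tile))
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                bins[ty * tiles_x + tx].append(i)
    return bins, tiles_x
```

With hundreds of lights, most tiles end up with a handful of entries, which is the whole win over "every light times every pixel."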

I like motion blur and dof well enough, but SSAO (and the alphabet soup of variations on the theme) is a hack which I've grown to dislike-- it has a very distinct and very artificial look. If you've ever noticed mysterious dark halos around objects in a recent game, that's SSAO.

Voxel cone tracing is a newer approach that can produce both correct-ish ambient occlusion and reflections (no screen space hacks), and is on the edge of being usable today.
posted by Pyry at 12:27 AM on September 11, 2016 [5 favorites]


Voxels are the "Linux on the desktop" of graphics programming.
posted by blue_beetle at 8:04 AM on September 11, 2016 [3 favorites]


Does anyone else remember Comanche: Maximum Overkill? Voxels in 1992!
posted by shponglespore at 8:46 AM on September 11, 2016 [2 favorites]


Wow, the stuff they're doing now is just orders of magnitude more complex than when I last paid attention to rendering. It's incredible and impressive, but at the same time it all kind of leaves me cold. In fast-paced games, you're just not going to notice these subtleties, and the technology is skewed toward making More! Impressive! Rockets! With Heat Shimmer!, so if you're building something slower paced where the minor effects are more noticeable-- like, the dappled shadows under a tree on a summer day with a light breeze-- you're SOL with regards to the technology.

Summer's almost over! Someone build me a breezy summer day under a tree simulator, stat!
</oldmanrant>
posted by phooky at 9:11 AM on September 11, 2016 [3 favorites]


Modern games sometimes use voxelization as a component of an overall process. For example, I believe Minecraft uses it to store terrain data, although the actual terrain is polygonal.

The voxel cone tracing is specifically creating a simplistic voxelization internally as part of a lighting strategy.

Games like Comanche and Outcast, on the other hand, use voxel renderers specifically for final output (mostly for terrain). Although Outcast is a bit of a mess, as a lot of the game is actually rendered in polygons. You can read more here.
posted by selfnoise at 9:16 AM on September 11, 2016


like, the dappled shadows under a tree on a summer day with a light breeze

DOOM is perhaps a bad example of that. Try this game (on my mind since I'm playing it at the moment). It's relatively slow-paced and frequently looks absolutely bananas. (Pedantic note: I actually don't think the TXAA implementation in the game is superior to the MSAA; it's extremely soft.)

Hmm, I would post some more but now I'm wondering if I should make a post about "slow" graphics in games or something.
posted by selfnoise at 9:25 AM on September 11, 2016 [2 favorites]


Just load up a let's play and waggle around the controller so you think you're playing the game.

It saved you money at the arcade. It let your big brother make you feel included when he fought the computer.

It works great even with a cheapo integrated Intel graphics card.
posted by mccarty.tim at 10:54 AM on September 11, 2016 [3 favorites]


Anyway, in this thread, we discuss Space Ace as the future of gaming.
posted by mccarty.tim at 10:56 AM on September 11, 2016 [1 favorite]


Anyway, in this thread, we discuss Space Ace as the future of gaming.

Cuphead is bringing cel animation back to run-and-gun games (though I doubt it'll start a revival)
posted by RobotVoodooPower at 11:12 AM on September 11, 2016 [2 favorites]


That was a really fun read! To me it was the same kind of enjoyment as when a fantasy doorstopper spends fifteen pages describing how a certain spell works. Quite enjoyable!
posted by rebent at 3:00 PM on September 11, 2016 [2 favorites]


Does anyone else remember Comanche: Maximum Overkill? Voxels in 1992!

And Delta Force, which just lost its charm once they started using 3D cards...

Which reminds me of Rescue on Fractalus, kind of a spiritual (and seminal) ancestor of No Man's Sky.
posted by RobotVoodooPower at 7:03 PM on September 11, 2016


you just don't notice it because it only happens to things you aren't focusing on directly

I don't notice it because my brain has special hardware and my mind special functions to integrate low-quality peripheral data and high-quality input from my fovea appropriately: to blank between saccadic jumps and join together smooth pursuits into an effortless tapestry of vision. I have a great illusion of pristine focus wherever I direct my gaze, thanks very much. The last thing I want to have happen with a screen is to direct my gaze at a specific point and find that the view there is blurry. That takes me out of the simulation of reality and reminds me that there is this flat wall of pixels being interposed between my sense of the scene and the scene itself. It's a discordant note.
posted by meehawl at 9:20 PM on September 11, 2016


I know it's a trope at this point to talk about having Pong when I was a kid (though it is true that my brother bought Pong in 1976 or so). The difference between that and our Xbox One today is stunning. According to overheard conversations among my younger coworkers, the Xbox One is supposedly like a little kid with a crayon compared to a GTX 1080. I have a difficult time imagining any difference that could justify paying $700 for a graphics card.
posted by double block and bleed at 3:57 AM on September 12, 2016


I like motion blur and dof well enough, but SSAO (and the alphabet soup of variations on the theme) is a hack which I've grown to dislike-- it has a very distinct and very artificial look. If you've ever noticed mysterious dark halos around objects in a recent game, that's SSAO.

Yeah. Corners Don't Look Like That. And it's a very GPU-intensive effect! And in shooters, you don't want darker corners, you want to see enemies clearly everywhere. I hope developers just stop including it at all.
posted by floatboth at 4:14 AM on September 12, 2016 [1 favorite]


I have a difficult time imagining any difference that could justify paying $700 for a graphics card.

There's a big difference between what's in current gen consoles (weak APUs that can't even handle 60 fps at 1080p in most games) and mainstream $170-$240 graphics cards (like Radeon RX 480 — above 60 fps @ 1440p in most games).

After that, you're getting into diminishing returns territory. Sure, a GTX 1080 will let you play with high settings at 4K, but do you really need high settings? Modern games look great on medium, sometimes even on low settings. In many recent games, you can also lower the rendering resolution while drawing UI/HUD at native resolution (DOOM, Battlefield 4, Battlefield 1, Overwatch, etc.) and it will look excellent on a 28" 4K monitor.
posted by floatboth at 4:28 AM on September 12, 2016


And just to be pedantic, they are all hacks. Every frame could be rendered exactly by ray-traced representations of the physical laws, but that would take seconds (or hours, depending on level of detail). Thus the use of incredibly sophisticated, highly efficient optimizations on state-of-the-art GPUs (graphics processing units), using matrix algorithms honed by the finest mathematical minds of our generation, all for a game hack!

Even if you could raytrace all the lighting in real time, the real hack is the whole practice of building worlds and objects and characters out of hollow origami triangles and then painting them with several layers of textures that try to make them look like they're made of real materials and are much more detailed and rounded than they actually are (which gets especially dodgy with stuff that's transparent/translucent like human skin). I'm not even sure what the more-realistic alternative to that would be. Voxels?
posted by straight at 3:42 PM on September 12, 2016


Oh, and speaking of polygons, those zany Euclideon folks in Australia are back with their Unlimited Detail Engine except now the vaporware(?) is Holodecks instead of magic voxel game engines.
posted by straight at 4:37 PM on September 13, 2016


Possibly related: "Computer Science research from TU Darmstadt and Intel Labs uses footage from open world games such as Grand Theft Auto 5 to produce large scale machine learning data in a speedier time."

They're using the various layering techniques and decomposition discussed here to streamline machine visual learning.
posted by Eideteker at 7:04 AM on September 14, 2016


Real time rendering is so genius and hacky, graphics programmers are like real world MacGyver heroes to me.

I'm a big fan of SSAO when it's "done right", but I thought the halos around objects not in contact with the background were an implementation error rather than an inherent feature? Sure, corners in real life don't always look like that, but they often do even when not technically due to ambient occlusion, and it signals "corner" visually pretty well.
posted by lucidium at 5:53 PM on September 14, 2016


The underlying problem is that a depth buffer doesn't have enough information to actually determine whether a protrusion is connected or disconnected from the background, so it has to use various hacks/heuristics to try to guess, usually in the form of a depth discontinuity threshold. Another problem is that SSAO tends to only look in a small neighborhood around a point, leading to erroneous dramatic corner darkening. Finally, the assumption that the ambient light is uniform in every direction rarely holds (e.g., in a closed room the SSAO corner darkening comes from the incorrect assumption that there is light coming from impossible directions).

That is, if you had a small open corner (imagine three faces of a cube) out in a field during an overcast day, you would see something like SSAO corner darkening, but a corner inside a closed room doesn't get that effect since a hypothetical uniform external ambient light field is 100% occluded to every surface, not just corners.
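For the curious, here is a toy version of the depth-discontinuity heuristic described above, operating on a plain 2-D depth buffer. All parameter names are illustrative, not taken from any real SSAO implementation.

```python
import numpy as np

def ssao_toy(depth, radius=2, bias=0.02, range_cutoff=0.5):
    """Toy screen-space AO: darken a pixel by the fraction of nearby
    samples that sit in front of it (potential occluders), but reject
    samples whose depth gap exceeds 'range_cutoff'. That rejection is
    the guess-at-connectivity heuristic; when it guesses wrong you get
    the familiar halos around foreground objects. Returns an ambient
    term in [0, 1], where 1.0 means fully unoccluded."""
    h, w = depth.shape
    ao = np.ones_like(depth)
    for y in range(h):
        for x in range(w):
            occluded = samples = 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    if dx == 0 and dy == 0:
                        continue
                    ny, nx = y + dy, x + dx
                    if not (0 <= ny < h and 0 <= nx < w):
                        continue
                    samples += 1
                    gap = depth[y, x] - depth[ny, nx]  # > 0: neighbor is in front
                    if bias < gap < range_cutoff:
                        occluded += 1
            if samples:
                ao[y, x] = 1.0 - occluded / samples
    return ao
```

Note the small-neighborhood limitation in action: only pixels within `radius` of a depth step get darkened, which is exactly why SSAO over-darkens tight corners and misses large-scale occlusion.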
posted by Pyry at 3:52 PM on September 18, 2016




This thread has been archived and is closed to new comments