Nvidia releases new GPUs
September 20, 2018 6:16 AM

Nvidia dominates the market for graphics accelerators. Today their latest GPU microarchitecture, Turing, reaches the public in the RTX series of cards built on the new chip. Benchmarks show modest gains over their predecessors, while new hardware support for ray tracing and deep learning promises to change how real-time computer graphics are made.
posted by adept256 (26 comments total) 6 users marked this as a favorite
 
To editorialize, the price of these cards is a huge barrier. My first car cost half as much.
posted by adept256 at 6:30 AM on September 20, 2018 [3 favorites]


Basically.
posted by Fizz at 6:31 AM on September 20, 2018 [2 favorites]


My GTX 950 is still going fairly strong at about three years old. I can still play some top-tier AAA games at max settings, but a fair number of newer releases are starting to strain the GPU and have to be run at lower graphics settings.

That being said, even if someone gifted me one of these cards, I'd likely have to upgrade my motherboard and fan/cooling system, and at that point you're basically rebuilding the whole machine. That's not an investment I can afford every time a new card comes out; it's something I'd do once every five years at most (assuming you have the money and the desire to stay on the latest generation of gaming hardware).
posted by Fizz at 6:35 AM on September 20, 2018


And yet most of the games that would use these cards to their fullest are still thoughtlessly ultra-violent, often misogynistic power fantasies with cliché-heavy dialogue and plots straight out of pulp. When can we get an upgrade of the content, rather than just the visuals?

I apologize, a little; I may be kind of sad that there's literally nothing I care to play right now that I haven't played several times before. I look at the PSN store and fucking weep.
posted by seanmpuckett at 7:05 AM on September 20, 2018 [5 favorites]


The short answer is no. You can buy a PS4 Pro or an Xbox One S with a stack of games for the price of one of these. This is for enthusiasts, crazies, people with too much money.
posted by adept256 at 7:06 AM on September 20, 2018 [1 favorite]




My problem is that there are so few actually compelling AAA games these days. Most of what I play is indie stuff on the Switch. I have a 1060 but I think the last big game I played through on the PC was Prey? That was last year, right?

I can't justify upgrading a computer that is vastly overqualified for every other task just to play a few same-y games. Maybe I'll upgrade when Cyberpunk or Starfield hits.

I particularly love immersive sims but that genre seems totally dead, so...
posted by selfnoise at 7:13 AM on September 20, 2018 [3 favorites]


I went looking for video samples and found some Guru3D videos on YT. Look for the ones posted Sept 19. Happy to see any others people can root out.

I may be down on game content, but the technology is still pretty cool.
posted by seanmpuckett at 7:15 AM on September 20, 2018


Well, and how much of that is the realization that the new cards probably aren't going to reopen the crypto rush, so this generation they're selling to gamers and scientists rather than bitbugs?

I mean, the stats on the 2080 Ti make it sound like 4K/60 is finally, truly achievable, but it's not a huge improvement at lower resolutions, where you're CPU-bound anyway. And I think the jury is still out on 4K over Quad HD (1440p) for gaming.
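To make the CPU-bound point concrete, here's a toy Python model; every number in it is invented purely for illustration:

    # Toy model: frame time is the slower of the CPU's per-frame work
    # (resolution-independent) and the GPU's work (scales with pixel count).
    # All timings are made up for illustration.
    RESOLUTIONS = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}

    def fps(cpu_ms_per_frame, gpu_ns_per_pixel, pixels):
        gpu_ms = gpu_ns_per_pixel * pixels / 1e6   # ns per pixel -> ms per frame
        return 1000.0 / max(cpu_ms_per_frame, gpu_ms)

    for name, pixels in RESOLUTIONS.items():
        old, new = fps(8.0, 4.0, pixels), fps(8.0, 2.0, pixels)
        print(f"{name}: old card {old:.0f} fps, new card {new:.0f} fps")

With those made-up numbers the hypothetical new card doubles the frame rate at 4K but barely moves 1080p, which matches the shape of the reviews.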

It does get you a ticket to RT, but it's still too early to tell whether RT is going to be a game changer or if it's just going to be a fuzzy machine-learning solution to a problem most gamers don't have. But yeah, it's sorta telling that the AAA titles the reviewers are benchmarking with are all getting pretty long in the tooth at this point.

That said, some of the RT demos that are out there are pretty amazing and I'm looking forward to how it all shakes out in the next generation.
posted by Kyol at 7:17 AM on September 20, 2018 [1 favorite]


My problem is that there are so few actually compelling AAA games these days. Most of what I play is indie stuff on the Switch. I have a 1060 but I think the last big game I played through on the PC was Prey? That was last year, right?

Indeed. Most of the PC games I want to play and own aren't that heavy on my GPU. It's why I could probably go another 2 or 3 years before my card is truly considered obsolete. And even then, I'll have a backlog of games that I can continue playing for years to come.
posted by Fizz at 7:23 AM on September 20, 2018 [1 favorite]


My problem is that there are so few actually compelling AAA games these days. Most of what I play is indie stuff on the Switch.

Same here. I built up a collection of close to 100 games (almost entirely indies) on Steam, but for the whole last year I've barely touched them, because all of the same games are available on the Switch, and are more fun to play on a console (that doubles as a handheld!) to boot. In a sense, Nintendo came along and built the ultimate Steam box when nobody was looking.
posted by Strange Interlude at 7:25 AM on September 20, 2018 [3 favorites]


The funny part is that had they actually looked at the market, they would have realized that a solid midrange card (say, one in the $300-$350 range) would fly off the shelves, because there is a glut of PC gamers looking for an upgrade, thanks to the crypto bubble. But instead we get a high end enthusiast card that is built around a technology that isn't even usable (the needed updates for Windows won't be out for another month) and that devs won't target because of the small install base.
posted by NoxAeternum at 7:34 AM on September 20, 2018 [5 favorites]


AMD actually released their mainstream card before the high-end part last generation, but we'll never know whether that approach works better, because it was the height of the crypto-mining craze and you couldn't buy them for love or money anyway.
posted by selfnoise at 7:50 AM on September 20, 2018 [1 favorite]



Nvidia shares fall after Morgan Stanley says the performance of its new gaming card is disappointing

This said, their ticker is amazing: NVDA


Nvidia is also working on chips for driverless cars, and got highlighted by Motley Fool a couple of years ago, just before the price went from $35 to just north of $260 today. Crazy indeed.

Also, I've been playing Obduction lately (on the PC) - beautiful graphics, and it's by Rand Miller and the team at Cyan (the Myst games). Highly recommended if you don't like all that violence.
posted by emmet at 7:57 AM on September 20, 2018 [2 favorites]


So at what point do the 1080s come down in price enough so I can pick one up? I mean, No Man's Sky is plenty pretty right now but I wouldn't mind it looking a little better...
posted by backseatpilot at 7:59 AM on September 20, 2018


I could imagine wanting one of these, since it'd be nice to try some of the fancy visuals in Lone Echo and my laptop's 1070 struggles pretty hard even with everything set to Low. But, like Prey mentioned upthread, that came out last year and there doesn't seem to be much that's that good releasing any time soon.
posted by rhamphorhynchus at 8:02 AM on September 20, 2018


Nintendo came along and built the ultimate Steam box when nobody was looking.

It's worth noting that the Switch has an Nvidia chip in it.
posted by adept256 at 8:07 AM on September 20, 2018 [1 favorite]


Thoroughly excited about these, but not for gaming. I use a bunch of NVIDIA cards for scientific computing that relies heavily on their ray-tracing API, so each one of these will replace many (possibly tens) of their current high-end cards. Which ultimately means I can run my current problems at much lower cost, or ramp the problem sizes up even further for the same cost.
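For a flavor of the kind of work being accelerated, here's a toy NumPy sketch of batched ray-sphere intersection, the test a ray tracer runs millions or billions of times. This is not NVIDIA's actual API, just the underlying arithmetic:

    import numpy as np

    # Batched ray-sphere intersection: solve |o + t*d - c|^2 = r^2 for t.
    # origins and dirs are (N, 3) arrays; dirs are assumed unit-length.
    def hit_sphere(origins, dirs, center, radius):
        oc = origins - center
        b = np.einsum("ij,ij->i", dirs, oc)            # d . (o - c)
        c = np.einsum("ij,ij->i", oc, oc) - radius**2  # |o - c|^2 - r^2
        disc = b * b - c                               # quadratic discriminant
        t = -b - np.sqrt(np.maximum(disc, 0.0))        # nearer of the two roots
        return (disc >= 0.0) & (t > 0.0), t

    rng = np.random.default_rng(0)
    dirs = rng.normal(size=(100_000, 3))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    hits, _ = hit_sphere(np.zeros((100_000, 3)), dirs,
                         np.array([0.0, 0.0, 5.0]), 1.0)
    print(f"{hits.mean():.2%} of random rays hit the sphere")

Every one of those tests is independent of every other, which is why the work spreads so well across thousands of GPU cores and, now, dedicated RT hardware.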
posted by BlueDuke at 9:15 AM on September 20, 2018 [2 favorites]


Games are fun, but I really need a pair or three of previous-gen cards to boost rendering speeds using Octane, a GPU-based renderer. I got locked out of building a robust work machine when prices were driven up by miners.

I'm drooling over the prospect of finally scoring a set of 980s or 1070s or, with luck, 1080s.

Basically, though, what NoxAeternum said. Big mistake not building a lighter, cheaper $300 version of last gen's 1080. I would buy four of them without a second thought. There's no world where a 20xx looks appealing.
posted by Wetterschneider at 9:26 AM on September 20, 2018


"I apologize, a little, I may be kind of sad that there's literally nothing I care to play right now that I haven't played several times before. I look at the PSN store and fucking weep."

I don't know PlayStation specifically, but right now is the best time in gaming history to be playing games that don't fit that description, if you don't mind looking for them. Yeah, the best-selling crap is going to be that sort of shit, formulaic and boring to boot -- but that's kind of the crux of best-selling anything: something really amazing can't be a best seller, because it'll rub tons of people the wrong way or ask too much of them.

For me it seems graphics kind of plateaued a long time ago. These days it's less about power and more about artists and art direction. Even when going for whatever "realism" means in a video game, it's more a matter of clever art than raw power. MGSV looked better than any of its peers and continues to look better than anything I've seen announced or previewed since. Not that it's perfect by any means; solid artistic choices just make it hold together well.
posted by GoblinHoney at 2:36 PM on September 20, 2018


The funny part is that had they actually looked at the market, they would have realized that a solid midrange card (say, one in the $300-$350 range) would fly off the shelves, because there is a glut of PC gamers looking for an upgrade, thanks to the crypto bubble.

The development cycle on hardware is so long that this card was probably in development before the crypto boom was even a thing. There won't be hardware that shows any kind of lesson learned from crypto for another few years. Hardware roadmaps are just insanely long and notoriously (unforgivingly) inflexible.
posted by fremen at 3:59 PM on September 20, 2018 [1 favorite]


I don't think I've played anything truly compelling and fresh since the above-mentioned Prey which was, yeah, last year, and which I played on my Xbox One S. Battlefields and Assassins just don't do anything for me any more, if they ever did. Multiplayer games in general have been hot garbage for a decade now.

I had fun tootling around in No Man's Sky for a bit, but it's just an infinite copy-paste of about three hours' worth of ideas and gameplay. Subnautica is coming out on the PS4 soonish and looks like a better implementation of the same conceit, but we'll see. Divinity 2 has been pretty fun, but at 60 hours I'm just about finished. On the one hand I'm glad, because I've been bored and just going through the motions for the last 20 hours; on the other hand I'm kind of peeved, because there's all these people going "oh, there's a hundred hours of gameplay there!" when really there isn't.

I got Spider-Man out of rote obligation and haven't touched it yet; I don't believe it will really do anything that Arkham Asylum hasn't already. RDR2 will probably be OK, but it seems like it's just RDR where you have to cook dinner for everybody. I dunno, gaming has been shit for a while now. I know there's good stuff on the PC, but it's stuff that doesn't need a thousand-dollar graphics card.
posted by turbid dahlia at 4:36 PM on September 20, 2018


With ray tracing maybe kinda coming in, I'd be interested in seeing a game whose graphics target was 480p60 but that actually approached photorealism.
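The arithmetic makes that an appealing trade. Assuming one primary ray per pixel, the ray budget scales directly with resolution:

    # Rays per second at 60 fps, one primary ray per pixel (16:9 resolutions).
    for name, w, h in [("480p", 854, 480), ("1080p", 1920, 1080), ("4K", 3840, 2160)]:
        print(f"{name}: {w * h * 60 / 1e6:.0f} million rays/sec")

A 480p target leaves roughly twenty times the per-pixel ray budget of 4K, which is the kind of headroom photorealistic shading would need.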
posted by GCU Sweet and Full of Grace at 4:57 PM on September 20, 2018 [1 favorite]


Some newer AAA games are said to have tourist or photo modes, which basically let you walk around and take screenshots without being bothered by violent encounters. One hopes some other forms of interaction are left in too, though I don't know much about the details. I think some modders are also working on similar modes for popular recent games; this is what I've gathered from some articles in the gaming press. People who are interested might keep an eye on this news. If you don't care for the full games, maybe some of them are worth getting in a sale a few years down the road, just to visit the detailed environments and travel around.

Obviously, it needs to be done well to be worth your time, not just put in as an afterthought. If anyone knows more about these developments or has recommendations for actual good implementations of the concept, please inform us.

On using ray tracing/path tracing in real-time graphics, this is how Quake 2, a 90s shooter, looks with a proof-of-concept implementation.

Quake 2 Realtime GPU Pathtracing: August 2017

It's low-poly because the underlying game is from the 90s. The graininess can be fixed by more sampling, which needs better performance. As someone says in the comments, it reminds you a little of claymation (except with sharp edges all around).
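The "more sampling" trade-off is plain Monte Carlo statistics: the grain shrinks as one over the square root of the sample count, so halving the noise costs four times the rays. A toy Python illustration with a made-up pixel, not a real renderer:

    import random, statistics

    # A path tracer estimates a pixel by averaging random light samples.
    # Made-up pixel: each sample is 1 (the path reached a light) with
    # probability 0.3, else 0, so the true brightness is 0.3.
    def pixel_estimate(n_samples):
        return statistics.fmean(
            1.0 if random.random() < 0.3 else 0.0 for _ in range(n_samples)
        )

    random.seed(42)
    for n in (4, 16, 64, 256):
        noise = statistics.stdev(pixel_estimate(n) for _ in range(1000))
        print(f"{n:4d} samples/pixel -> grain ~ {noise:.3f}")

Quadrupling the samples halves the grain, which is why real-time ray tracing leans on denoising rather than brute-forcing the sample count.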
posted by primal at 12:27 AM on September 21, 2018


Video game tourism is a real thing, and if you don't have the hardware, try the links in this Previously from Fizz. They're YouTube links to hand-picked vistas from the latest games with all the settings turned way up.

This video from Nvidia demonstrates the ray-tracing technology.
posted by adept256 at 1:45 AM on September 21, 2018


The graininess can be fixed by more sampling, which needs better performance.

And that's something I sorta feel about the shadow performance on the new RTX cards. On the one hand, yay, proper frickin' soft shadows with penumbras and umbras! On the other hand, it looks like a jittery, noisy, grainy solution due to the limited number of rays they're casting and how much of the scene is reconstructed from blah blah deep blah learning whatever-chitecture.
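For the curious: a penumbra is just visibility averaged over the light's area, and with only a few shadow rays per pixel that average is noisy, which is exactly the grain the denoiser has to paper over. A toy 2D sketch with an invented scene, not any real renderer:

    import random

    # 2D scene: a strip light spanning x = -1..1 at height 10, an opaque
    # slab spanning x = -0.5..0.5 at height 5, shading points on the ground.
    # Brightness = fraction of random shadow rays that reach the light.
    def brightness(px, shadow_rays):
        unblocked = 0
        for _ in range(shadow_rays):
            lx = random.uniform(-1.0, 1.0)    # random point on the light
            mid_x = (px + lx) / 2.0           # where the ray crosses y = 5
            if not (-0.5 <= mid_x <= 0.5):    # ray misses the slab
                unblocked += 1
        return unblocked / shadow_rays

    random.seed(1)
    for rays in (2, 256):
        row = [brightness(-2.0 + 0.25 * i, rays) for i in range(17)]
        print(f"{rays:3d} rays: " + " ".join(f"{b:.2f}" for b in row))

Two shadow rays per point gives a jumpy mess of 0.00/0.50/1.00 values across the penumbra; 256 gives a smooth gradient. The RTX bet is to cast something closer to the former and have a trained denoiser reconstruct the latter.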
posted by Kyol at 6:28 AM on September 21, 2018

