The Better-verse
January 6, 2022 2:01 PM

Linux in a Pixel Shader - A RISC-V Emulator for VRChat. 'kind of wild that when you join a vrchat world someone can run a linux kernel on your gpu by packing it into a shader on their avatar' - suzuha (Twitter)

Community member _pi_ creates a tribute to 'the amazing creative-community surrounding that “game”/social platform.'
'What really sparked my imagination however, was the discovery that you can embed your own, custom shaders within such a world. You can even put them on your Avatars! With shaders, the sky is the limit - and if you take even just a cursory look at what the community has done in VRChat, you realize that even that is only a limit meant to be broken.'

VRChat is free (no VR headset required!).

Video of the big reveal of the 'secret project' at a community meet up.

Mentioned in the blog post, Treehouse in the Shade by 1001 and SCRN, two 'shader wizards'.

VRChat seems like an interesting place, other than the 'crashers', who intentionally load up their avatar with resource consuming shaders to crash other people's clients.
posted by asok (22 comments total) 16 users marked this as a favorite
 
I'm sorry, but while this is fascinating on a technical level, my first thought at the idea of someone being able to arbitrarily run an OS core on your machine just through a shader was "holy security vulnerabilities, Batman!"
posted by NoxAeternum at 2:15 PM on January 6, 2022 [10 favorites]


If it's possible to expose security vulnerabilities by running a Linux kernel on an emulated CPU in a shader on your GPU, then there are much easier-to-exploit vulnerabilities there.
posted by Joakim Ziegler at 2:27 PM on January 6, 2022 [11 favorites]


Well, given that NSO Group pwned dissidents' iPhones using a virtual CPU made of digital logic implemented in boolean pixel operations in an image-compression format, an adversary being able to run arbitrary code in the sandbox of an emulated CPU in a shader doesn't exactly put my mind at ease.
posted by acb at 2:40 PM on January 6, 2022 [18 favorites]


Countdown to this technique being repurposed for mining bitcoin on other people's computers in 3…2…1.
posted by adamrice at 3:34 PM on January 6, 2022 [5 favorites]


In the event that this technique could be used to mine cryptocurrency, there is still the challenge of getting anything off the GPU. Having said that, as mentioned in the blog post, 'VRC programmers have a way of looking at [a] wall of restrictions, then find a way of switching their existence to a zero dimensional object for a moment, then they appear on the other side of that wall.'
suzuha makes the comparison with the ability to insert Javascript on other people's MySpace profiles, 'which is awesome that era of the web was charming and bohemian and fun'.
posted by asok at 3:50 PM on January 6, 2022


Countdown to this technique being repurposed for mining bitcoin on other people's computers in 3…2…1.

Like Norton does now?
posted by clawsoon at 3:51 PM on January 6, 2022 [8 favorites]


In the event that this technique could be used to mine cryptocurrency, there is still the challenge of getting anything off the GPU. Having said that, as mentioned in the blog post, 'VRC programmers have a way of looking at [a] wall of restrictions, then find a way of switching their existence to a zero dimensional object for a moment, then they appear on the other side of that wall.'
suzuha makes the comparison with the ability to insert Javascript on other people's MySpace profiles, 'which is awesome that era of the web was charming and bohemian and fun'.


This is the Ian Malcolm Problem in action. As was pointed out above, we have already seen active exploits in the wild using virtual CPUs to run arbitrary code as an attack, so this is not an idle concern here. Furthermore, things like Javascript injection are how malware is commonly spread, which is why the "bohemian" argument doesn't play for me.
posted by NoxAeternum at 4:35 PM on January 6, 2022 [1 favorite]


I'd like to hang out in a virtual world styled and spatially mapped to the memory fragmentation of the system running it. Super colorful, super volatile.
posted by angelplasma at 5:11 PM on January 6, 2022 [2 favorites]


There's a pretty wide gap between "thing ends up accidentally being Turing complete" and "same thing also has unrestricted access to filesystem data and networking capabilities".

I'd suspect the number of things that allow one to run arbitrary and untrusted OpenGL shaders is pretty low, and almost certainly sandboxed?

That's not to say that there's zero chance of exploitation, but just because a recent high-profile exploit happened to involve an ad-hoc virtual machine doesn't really mean that this demonstration indicates the existence or even possibility of any particular vulnerability.
posted by schmod at 7:18 PM on January 6, 2022 [5 favorites]


VRChat does at least have safety levels for avatars, so unless you muck with them, VRChat won't run shaders attached to an avatar unless you friend the person using the avatar. So this would most likely be running as a shader in a community-created world. VRChat does warn that community-created worlds aren't vetted and that they could have poor performance, but so much of the content on VRChat is from third-party creators that I suspect most people ignore that sort of warning.

The question of whether this could become exploitable is an interesting one; as a DoS, it's already the case that a bad shader can lag out VRChat to the point of unusability, so that's known. Breaking out of the sandbox would definitely be the interesting part, and it doesn't look like that's happened yet. In general it's a lot easier to get input into a GPU than it is to get output from it, and unless the GPU is set up explicitly to do compute as mentioned in the article, it's just going to render out a texture. That said, based on what I'm seeing here, a crypto-mining shader feels like it might be within the realm of possibility. I don't know enough about the capabilities of Udon (VRC's scripting language) to know whether it can read texture data, which I believe would be the necessary missing link that would allow it to be used for that purpose, as shaders can't communicate with the rest of the PC, let alone the network. I suspect that it doesn't provide that, for performance reasons if nothing else. So you could make a shader do the work, but you couldn't get the output from it necessary to make it useful. This work doesn't appear to bring anyone any closer to breaking out of the sandbox though, which would be the real concern.
posted by Aleyn at 9:01 PM on January 6, 2022 [1 favorite]
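The readback gap Aleyn describes can be sketched on the CPU in plain Python: a "shader" can compute anything per pixel, but the result only exists as a texture until the host explicitly reads it back. All names below are invented for illustration; this is not VRChat's or any GPU API's actual interface.

```python
# Minimal model of the compute-vs-readback gap: the "GPU" fills a
# texture by running a per-pixel function, and the host only learns
# the results if it has a way to read that texture back.

WIDTH, HEIGHT = 4, 4

def fragment_shader(x, y):
    """Stand-in for arbitrary per-pixel work (e.g. one hash step)."""
    return (x * 31 + y * 17) % 256

# "Rendering": run the shader once per pixel to fill a texture.
texture = [[fragment_shader(x, y) for x in range(WIDTH)] for y in range(HEIGHT)]

# The missing link: without a readback capability (what Aleyn is
# unsure Udon exposes), the host never sees these values.
def read_texture(tex):
    return [value for row in tex for value in row]

pixels = read_texture(texture)
print(pixels[:4])  # first row of computed values: [0, 31, 62, 93]
```

The point of the sketch is that `read_texture` is a separate, host-side capability: remove it and the shader still computes, but the work is useless to an attacker (or a miner).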


Recent Reddit thread on crashers. Unclear if they are doing it for love or money.
posted by credulous at 10:17 PM on January 6, 2022


Like Norton does now?

Norton is just following in the footsteps of Kodak, Long Island Iced Tea, AMC, Overstock, and... RadioShack.
posted by They sucked his brains out! at 1:05 AM on January 7, 2022


In the community meetup video, we get to see the thing running, but we also get to see how unfortunately (but also unsurprisingly) slow it is.

As far as I could tell it doesn't actually complete the Linux boot process in the entire ~10 mins or so of demonstration and discussion, though it does get to a visible part of the process; we do also get to see a bit of some other less complex code running on another instance of the emulated CPU.

It's a work of pure joy and brilliance, and I'd love to see a video of it running that didn't have ~80 avatars running around and slowing it down (?), but from available evidence the number of immediate practical applications, malicious or otherwise, seems fairly limited, unless or until either it or GPUs get a couple of orders of magnitude faster.
posted by motty at 2:12 AM on January 7, 2022 [1 favorite]


People who are not involved in programming GPUs may not be aware that shader programs run amok are essentially the entire method of running code on GPU accelerators; Ethereum mining on GPUs, for instance. Originally they were very small programs, maybe a couple of matrix operations on textures. Now you can run huge programs, as long as they use the subset of C and C++ operations that are supported by the special, limited compiler required. But you cannot, in fact, execute the system calls you would need to access the file system. Basically you can allocate memory and move things to and from the CPU and GPU. Not that it isn't still quite an attack surface.
posted by wnissen at 9:59 AM on January 7, 2022 [2 favorites]


Yeah, I think people often get the impression that shaders are more limited than they really are, but on modern hardware shaders can do all the things a 'regular' program can do, like conditionals, loops, and random memory access. So it's no more surprising that you can write a RISC-V emulator in a shader than that you can in a C program running on a CPU; the novelty is that someone bothered to do it. But importantly, a shader running an interpreter doesn't gain any special powers: shaders were already intentionally Turing complete, so using a Turing-complete language to write a slow interpreter for another language is like using a chainsaw to manufacture a smaller, weaker chainsaw. A malicious actor would just use the more powerful chainsaw you handed them.
posted by Pyry at 10:45 AM on January 7, 2022 [3 favorites]
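Pyry's "smaller, weaker chainsaw" point can be made concrete with a toy interpreter. The three-opcode machine below is invented for illustration (it is not RISC-V): the guest program can only ever do what the interpreter's opcodes expose, so hosting it grants no new powers.

```python
# A toy bytecode VM: an interpreter written in a general-purpose
# language confers nothing the host language lacks. The guest's
# "sandbox" is exactly the opcode set — no opcode touches files or
# the network, so no guest program can either.

def run(program, x=0):
    """Execute a list of (op, arg) pairs against a single register."""
    pc = 0
    while pc < len(program):
        op, arg = program[pc]
        if op == "ADD":
            x += arg
        elif op == "MUL":
            x *= arg
        elif op == "JNZ":          # jump to instruction `arg` if x != 0
            if x != 0:
                pc = arg
                continue
        pc += 1
    return x

# Guest program: compute (0 + 5) * 3.
print(run([("ADD", 5), ("MUL", 3)]))  # 15
```

Loops via `JNZ` make this Turing complete in the same trivial sense shaders already are, which is exactly why running Linux inside one is a delight rather than an escalation.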


This makes me think that it may be time to bring back Core War ...
posted by graphweaver at 8:58 PM on January 7, 2022 [1 favorite]


AI impostors given away by subtle wrongness. Self-driving cars trapped by runic diagrams. An endless jukebox full of wordless nothings from another dimension. And now VR basilisks. Any sufficiently advanced technology really is indistinguishable from magic.
posted by Rhaomi at 10:38 PM on January 7, 2022 [1 favorite]


as shaders can't communicate with the rest of the PC let alone the network. I suspect that it doesn't provide that, for performance reasons if nothing else.

Can it do in-world stenography? Some exhibit/attraction that runs shaders on you and harvests data from the 'visual' output?
posted by snuffleupagus at 10:28 AM on January 8, 2022


Steganography, that is. Heh.
posted by snuffleupagus at 11:36 AM on January 8, 2022


Hey, so I crashed your view? Send me a screenshot. Do the extraction from that. You could probably factor primes that way. All piecemeal.

Oh yeah, what does it say on my name tag? QWMCD you say? Hrm, should be BOB. Social engineering, you just have to get them to send/tell you the result.
posted by zengargoyle at 12:12 PM on January 8, 2022


Streaming has people displaying their game output live. There are sometimes promotional schemes around publishing screencaps.

If a shader program has access to everything in the game's memory (login credentials? client IP?) and can somehow exfiltrate that info through the game display there might be some theoretical risk?
posted by snuffleupagus at 2:09 PM on January 8, 2022
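The exfiltration channel being speculated about in the last few comments can be sketched without any GPU at all: a shader tints its visual output so the data rides along in the pixels, and whoever obtains a screenshot decodes it. The functions below are a pure-Python stand-in; no real shader, game, or capture API is involved.

```python
# Least-significant-bit steganography: hide bytes in pixel values, then
# recover them from a "screenshot" of the rendered frame.

def embed(pixels, secret):
    """Hide each bit of `secret` in one pixel's lowest bit."""
    bits = [(byte >> i) & 1 for byte in secret for i in range(8)]
    return [(p & ~1) | b for p, b in zip(pixels, bits)] + pixels[len(bits):]

def extract(pixels, n_bytes):
    """Recover n_bytes from the lowest bits of a captured image."""
    bits = [p & 1 for p in pixels[:n_bytes * 8]]
    return bytes(
        sum(bits[i * 8 + j] << j for j in range(8)) for i in range(n_bytes)
    )

frame = [200] * 64                # 8x8 grayscale "render output"
leaked = embed(frame, b"BOB")     # shader-side: perturb the pixels
print(extract(leaked, 3))         # screenshot-side: b'BOB'
```

Note the sketch assumes the hidden data was on the GPU to begin with; as Pyry points out just below, a shader can only leak what the application actually uploaded to it.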


When a shader is invoked, it's bound to a set of GPU resources, and barring driver/hardware bugs it shouldn't be able to go outside of that set. So the potential danger depends on whether the application is uploading sensitive data to the GPU and how permissive it's being when running user-provided shaders.
posted by Pyry at 7:39 AM on January 9, 2022 [2 favorites]




This thread has been archived and is closed to new comments