Hallucination attack
March 30, 2024 2:52 PM

"During our research [we] encountered an interesting python hallucinated package called “huggingface-cli”... In three months the fake and empty package got more than 30k authentic downloads! (and still counting)... Our findings revealed that several large companies either use or recommend this package in their repositories..." [via The Register]
posted by clawsoon (15 comments total) 14 users marked this as a favorite
 
This is not the cyber-dystopia we were promised, but I'll take it, for now.
posted by signal at 3:10 PM on March 30 [1 favorite]


Of course, this wouldn’t detect hallucinated packages someone else already noticed and squatted!

I usually mentally model LLMs as an eager but sloppy intern whose work is only useful if it’s faster to check than redo, but I hadn’t really thought of people adversarially trying to figure out what wrong answers the AI is likely to give other people. So it’s more like a sloppy intern who talks about his work in a bar full of competitors and criminals every night.
posted by smelendez at 4:04 PM on March 30 [5 favorites]


And my work just released an internal AI tool based on ChatGPT 4. Terrifying.
posted by lock robster at 4:21 PM on March 30 [1 favorite]


And my work just released an internal AI tool based on ChatGPT 4. Terrifying.

Might wanna send them the article...
posted by clawsoon at 4:29 PM on March 30


Much like the wild west of crypto scams, wallet draining malware, rugpulls, and speedrunning the entire menu of financial and securities crimes, LLMs are creating a whole landscape of new threats.

It's an interesting parallel to consider how there are two distinct but inseparable issues - how useful and effective are these technologies at their intended purposes, and how damaging are all of the new ways for bad actors to exploit them?
posted by allegedly at 4:46 PM on March 30


I keep saying that the current round of "AI" is a bunch of bullshit just like crypto, the .com bubble, and every other tech wet dream that would make billions without an obvious problem the tech was capable of solving.

Two to five years from now, the grift will have moved on to something else, and we'll all mostly forget about the AI hype.
posted by Ickster at 5:06 PM on March 30 [2 favorites]


Two to five years from now, the grift will have moved on to something else, and we'll all mostly forget about the AI hype.

Except the aforementioned fads were all more-or-less inside-baseball kind of tech stuff. AI, on the other hand, is being wedged into all manner of everyday stuff that has direct effects on regular folk, who are going to be harmed by it. It's a whole different level of terrible.
posted by Thorzdad at 5:17 PM on March 30 [8 favorites]


The real package for the huggingface-cli tool is huggingface-hub, so it's an easy mistake to make. If the LLM is ingesting wrong package references from existing code, this is GIGO (garbage in, garbage out).
posted by muddgirl at 5:18 PM on March 30
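[For reference, the mix-up muddgirl describes looks like this at the command line. A sketch, using the real names from the article: the PyPI package is huggingface_hub, which provides the `huggingface-cli` command; "huggingface-cli" as a package name was the hallucination, which the researchers then registered as an empty proof-of-concept.]

```shell
# Correct: the PyPI package is huggingface_hub; installing it
# provides the `huggingface-cli` command-line tool.
pip install huggingface_hub
huggingface-cli --help

# Hallucinated: there was originally no such package on PyPI.
# An LLM plausibly conflates the tool's command name with a
# package name -- exactly the squattable mistake from the article.
# pip install huggingface-cli
```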


This is one reason I hate the term "hallucinations."
posted by muddgirl at 5:19 PM on March 30 [6 favorites]


Except the aforementioned fads were all more-or-less inside-baseball kind of tech stuff. AI, on the other hand, is being wedged into all manner of everyday stuff that has direct effects on regular folk, who are going to be harmed by it. It's a whole different level of terrible.

Yeah, I know that. I was just preemptively arguing with anyone who might jump in to claim we just don't understand how revolutionary AI will be and that shit like TFA is just nitpicking.
posted by Ickster at 7:56 PM on March 30 [1 favorite]


The correct term should be "confabulations". I will die on this hill.
posted by biogeo at 7:56 PM on March 30 [10 favorites]


If you have a boss, in any field, your boss will encourage you to use AI. Then, said boss will encourage their boss to fire you in favor of AI, by way of "they've been using AI, so we should do that as well".

What will always be hilarious is that they don't understand LLMs and think "OH SHIT, ARTIFICIAL INTELLIGENCE WILL SOLVE ALL MY PROBLEMS!", so they fire the folks who not only did the work that trained these models but also did it better than any "AI" ever could, and they absolutely do not know why.

Business-minded people are not creative, so when they see an opportunity to cut out the creative artists around them to achieve what they THINK are the same results, of course they will take it, and that's why this "AI" (that isn't in any way AI) is so popular with them (and the stock market).
posted by revmitcz at 1:24 AM on March 31 [7 favorites]


revmitcz: Business-minded people are not creative, so when they see an opportunity to cut out the creative artists around them to achieve what they THINK are the same results, of course they will take it, and that's why this "AI" (that isn't in any way AI) is so popular with them (and the stock market).

One wrench has been thrown into this for creative businesses: Current case law in the US says that AI output can't be copyrighted. As a result, the animation company I work for has forbidden the use of generative AI in final output. Use as reference is okay, but whatever goes out the door has to be created by artists.

So it's a weird situation where the artists' jobs are protected because the capitalists aren't legally allowed to appropriate the surplus value of the AI's labour.
posted by clawsoon at 5:22 AM on March 31 [8 favorites]


Everything you use these days is an amalgamation of dozens (hundreds?) of packages downloaded from some repository. The days of developers knowing everything that was in their application are long over. Every time you install some new app you're rolling the dice. You probably won't get snake eyes, right?
posted by tommasz at 5:34 AM on March 31
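[tommasz's point is easy to demonstrate: even a modest Python environment carries a stack of third-party distributions most developers never audit. A minimal sketch using only the standard library to list what is actually installed:]

```python
from importlib import metadata

# Enumerate every distribution installed in the current environment.
# (Any one of these could have been a typosquatted or hallucinated name.)
names = sorted({
    dist.metadata["Name"]
    for dist in metadata.distributions()
    if dist.metadata["Name"]  # skip malformed metadata
})

print(f"{len(names)} installed distributions")
for name in names[:10]:  # show a small sample
    print(" -", name)
```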


Muddgirl is right; I ran into the same thing writing some Nix that needs to run huggingface-cli but needed to install the huggingface-hub Python package to do it.

The model is mixing up the command name and the package name. Whether it saw people making that mistake or made it itself is fun to wonder about.
posted by ikea_femme at 8:07 AM on March 31

