“I actually think that AI fundamentally makes us more human.” (BOOOO)
March 22, 2024 9:05 AM   Subscribe

Ted Gioia: "Tech leaders gathered in Austin for the South-by-Southwest conference a few days ago. There they showed a video boasting about the wonders of new AI technology. And the audience started booing." [Xitter link] Gioia argues that users are becoming much more wary, not only about "AI," but about tech in general.

Early paragraphs:
The booing started in response to the comment that “AI is a culture.” And the audience booed louder when the word disrupted was used as a term of praise (as is often the case in the tech world nowadays). [archive]

Ah, but the audience booed the loudest at this statement:

“I actually think that AI fundamentally makes us more human.”

The event was a debacle—the exact opposite of what the promoters anticipated.

And it should be a reality check for the technocracy.
posted by JHarris (111 comments total) 36 users marked this as a favorite
 
"We have arrived at the scary moment when our prevailing attitude to innovation has shifted from love to fear."

Am I fearful? No. So far with AI, I'm seeing only useless bulk that makes my life harder. I am not fearful of garbage, I am annoyed with garbage.
posted by Capt. Renault at 9:29 AM on March 22 [86 favorites]


If humans were meant to fly, they would have been born with wings.
posted by fairmettle at 9:33 AM on March 22 [1 favorite]


"Maybe that’s how it feels in Palo Alto or Cupertino. But everywhere else, the vibe is less groovy. The public no longer admires their tech titans, but fears them. Users no longer welcome their innovations, but mock them. This antipathy will quickly escalate into hostility, maybe even rage."

I'd be okay with outright hostility toward tech platforms. I never signed up for Facebook because I saw what it would become, but Twitter lured me in and gave me a ton of friends that I'm not in touch with anymore. So major life loss there, and after the ICQ/AIM/Y!M debacles I'm beginning to not trust any tech platform to be a good conveyer of community. If I add in the evacuation of people from LiveJournal, I've probably lost 100 people I regarded as good friends due to platform dependency in our relationships.
posted by hippybear at 9:36 AM on March 22 [27 favorites]


This year feels like a turning point for SXSW (and I say this as someone who hasn't been in over a decade). There has always been a contingent of people talking about how it used to be cool, but this year it feels like people are actively tired of the naked corpo-backing that it gets. What used to be something of a playground for weirdos (Twitter launched there, for goodness' sake) is now pretty obviously just a corporate techno-weapon expo.
posted by nushustu at 9:39 AM on March 22 [11 favorites]


The New Yorker had an interesting piece last week pitting AI doomsayers against AI boosters. The latter group was spouting exactly the same hopelessly utopian promises we heard in the early days of the internet and was equally deaf to any suggestion that things could go horribly wrong. It did not bode well.
posted by Paul Slade at 9:40 AM on March 22 [8 favorites]


"I'm sure our wise tech leaders will read the room and listen to sense as they always do. Problem solved," said not even the lowliest of AIs.
posted by onebuttonmonkey at 9:41 AM on March 22 [2 favorites]


If humans were meant to fly, they would have been born with wings.

Convince me that the invention of heavier-than-air flight has been a net benefit to humanity as a whole. Go ahead.
posted by Faint of Butt at 9:43 AM on March 22 [6 favorites]


Medevac aircraft are pretty dang great, not something one can really do with a blimp.
posted by seanmpuckett at 9:52 AM on March 22 [26 favorites]


I think it comes down to "hype" and people have soured mightily on it. Tech "thought leaders" can't even tell that they're spouting "hype" any more—possibly because they're cynical to their core.
posted by bbrown at 9:53 AM on March 22 [7 favorites]


Into the bin with crypto, useless planet burning fucks.
posted by Artw at 9:54 AM on March 22 [25 favorites]


If humans were meant to fly, they would have been born with wings.

Seems disingenuous; the general-intelligence LLMs are replicating something humans can already do and using large amounts of electricity to do it. And the slimy way the LLMs were trained, without the consent of the people whose words/art/music/code became part of the training data, does not inspire confidence that the corporations have our best interests at heart.
posted by subdee at 9:58 AM on March 22 [42 favorites]


If humans were meant to fly, they would have been born with wings.

You realize that as an analogue to AI this means "If humans were meant to think, speak, and write, they would have been...oh wait."
posted by We put our faith in Blast Hardcheese at 9:59 AM on March 22 [49 favorites]


So many services have dropped their analog equivalents, forcing us to use an app or the internet to do things like sign up, pay bills, organize actions/events, etc. This makes us dependent on them in a simple way, which means they're free to do as they please instead of doing what consumers want.

Investors throw their money at AI shit, make social media and every other online service crappy, just a big daisy chain of cash being thrown around there and everyone hates it... in contrast, sites like Cohost and Pillowfort have banned AI and cater to what people want out of social media, but struggle to make ends meet or get their apps in one of the duopoly stores.

Users no longer welcome their innovations, but mock them. This antipathy will quickly escalate into hostility, maybe even rage.

Good. Luddite movement 2.0, let's go!!!!! They can't execute us all!
posted by picklenickle at 10:02 AM on March 22 [21 favorites]


Yeah. As one of the people most vocal in MeFi “AI” threads in terms of “yes, this is an interesting technology that has legitimate use and the fears are often overblown,” I absolutely agree with every single one of the nine “You must be suspicious of tech leaders when…” points in the article.

I do think it’s interesting tech. I think it’ll change jobs and many of us are going to find ourselves in more editorial/director-lite roles. I think it’s going to be an interesting few years getting there but things will sort out relatively quick.

I do not think the major players are the people that should be running it. I think they come from the Silicon Valley VC perspective that views the rest of us as biomass and markets - Neal Stephenson had them solidly pegged decades ago in Snow Crash. That’s why I’m always pushing awareness of open source ML, loudly criticizing OpenAI’s utter lack of transparency, and red-flagging the potential environmental impact of their upcoming Q* research.

I have loved the idea of neural networks since I first learned about them in the late 90s as a teenager. I basically ragequit an at-the-time-major cognitive science department because I was going to be forced to work in top-down strong AI by logicians who just wanted to write Lisp all day. As a standard model Robot-American kind of nerd I am loath to admit emotion but I am emotionally invested. I am not neutral and I want to see this shit succeed.

Our society is not structured in a way that will distribute the benefits of any such success to the people who need it most. It is even less prepared for handling breakout success if someone actually achieves genuine AGI anytime soon (I still believe it is decades out). We aren’t even ready for OpenAI’s goalpost-shifting redefinition of AGI, which is tellingly “systems that can replace most human workers.”

None of this is okay, and I think the audience is right to boo. Because ultimately it’s not the technology that people are boo-ing, it’s the attitudes and practices of the sociopaths currently driving it.
posted by Ryvar at 10:04 AM on March 22 [72 favorites]


Have we already talked about the anti-SXSW demonstrations because of their weapon manufacturer sponsorships?
posted by infini at 10:04 AM on March 22 [9 favorites]


Capt. Renault: "Am I fearful? No. So far with AI, I'm seeing only useless bulk that makes my life harder. I am not fearful of garbage, I am annoyed with garbage."

I don't believe that LLMs, etc., in their current form (or something we can reasonably extrapolate from their current form) can legitimately replace the work of humans. I do believe, however, that there are business executives who do believe they can. So they can be a threat to our livelihoods in that respect.

I've got a friend who was recently laid off, and we were brainstorming what she could do next. This was exactly the line I took: not that LLMs could do her job, but whatever work she pursued had to be something that some boss wouldn't think an LLM could do.
posted by adamrice at 10:08 AM on March 22 [28 favorites]


If humans were meant to fly, they would have been born with wings.

People aren't arguing these algorithms aren't possible. They are arguing that this technology is going to make human lives worse in the hands of capitalists, and has little use other than making profits.
posted by The Manwich Horror at 10:10 AM on March 22 [28 favorites]


I'm not sure what jobs exist that bosses don't think an LLM can do.
posted by hippybear at 10:10 AM on March 22 [15 favorites]


Not long ago we looked to Silicon Valley as the place where dreams came from, but now it feels more like ground zero for the next dystopian nightmare.

I think we've finally reached the point where tech companies aren't even pretending to improve users' lives. "Disruption" used to be, at least on the surface, about empowering individuals against systems that moved too slowly or had been captured by monopolies. Underneath, of course, it was largely about externalizing costs and shifting control to new monopolies. But nobody even seems to be pretending anymore. Everything is bad and just keeps getting worse.
posted by uncleozzy at 10:11 AM on March 22 [34 favorites]


Last night I got a text while I was driving. AI offered to "summarize" the text for me instead of JUST READING THE TEXT ALOUD. I declined. Who the fuck needs that? JUST READ THE TEXT.
posted by jenfullmoon at 10:15 AM on March 22 [39 favorites]


I don't think it's "innovation" that people dislike, as much as it is "innovation with gigantic and immediately obvious downsides."
posted by surlyben at 10:17 AM on March 22 [21 favorites]


Wait what?

How is there AI standing between you and a text you got while driving?

I never get texts so haven't gotten one while driving, but I think that Siri would just read it to me?
posted by hippybear at 10:17 AM on March 22 [1 favorite]


I'm not sure what jobs exist that bosses don't think an LLM can do.

Their own. Let's prove them wrong.
posted by Faint of Butt at 10:19 AM on March 22 [21 favorites]


I'm not sure what jobs exist that bosses don't think an LLM can do.

Their own. Let's prove them wrong.


The plot of a Twilight Zone episode in 1964.
posted by Melismata at 10:23 AM on March 22 [1 favorite]


I think it’ll change jobs and many of us are going to find ourselves in more editorial/director-lite roles. I think it’s going to be an interesting few years getting there but things will sort out relatively quick.

Dude, no offense but this is such an irritating attitude. In the 20s people said this about automation, that our jobs will become easier and more pleasant and we'll get more time off. But it didn't happen, it just generated more output and contributed to sweatshop labor, office jobs that still cause burnout and overwork, etc. Every advancement in the arts is also like this where it moves into a quantity>quality environment and product.

There are already art students who have committed suicide when AI was introduced to their courses, feeling there wasn't anywhere to put their skills to use. The idea of going from an artist to a button-clicker (using work from aforementioned hand-made labor...) is very depressing and not suited to those who cannot perform that style of job. There are already so many contract artists who would normally do book illustrations, album covers, etc., who are out of work, in an industry where being an independent artist is already low-paid and full of abusive contracts, no healthcare, etc. So sure, it will "sort itself out" because they'll be dead and we won't even notice or care, the same way we don't think about textile weavers going from nice at-home jobs that they like to do on their own terms to horrible dangerous sweatshop labor outsourced overseas to people getting paid $1/hour. I hate people being dismissive about this issue like "it'll all be over soon!" "can't put the genie back in the bottle!" Man, yes I can.

If AI starts slipping into all my movies I'm making a straight beeline to my local stage theaters.

It's probably something I should have done long ago.

More live actors, more live artists, more live musicians! It would create jobs--FUN jobs that are enriching and support the well-being of the creators.
posted by picklenickle at 10:24 AM on March 22 [69 favorites]


I think it’s going to be an interesting few years getting there but things will sort out relatively quick.

That is a fun little way to gloss over bankruptcy, housing insecurity, loss of healthcare coverage, the torpedoing of personal savings/retirement preparation and all the other things that come with mass layoffs in multiple industries at once. "Interesting" yes it's terrifically "interesting" how my partner, a journalist, has literally no idea when he'll ever work again. It must be fucking fascinating because we literally do spend all night every night thinking about it! Instead of sleeping.

INTERESTING.
posted by We put our faith in Blast Hardcheese at 10:28 AM on March 22 [89 favorites]


Have we already talked about the anti-SXSW demonstrations because of their weapon manufacturer sponsorships?

I don't think too many people know that Google (or an equivalent subsidiary under a different Alphabet-given name) still helps the DoD make drones.

The problem kind of escalates when AIs go from eliminating jobs to actually eliminating people.
posted by They sucked his brains out! at 10:28 AM on March 22 [13 favorites]


The previous hype cycles—specifically blockchain, cryptocurrency and NFTs—were annoying but ultimately fizzled without doing too much damage to the economy at large because there weren't actually that many applications for them and capital investment in them was primarily oriented towards startups rather than existing companies.*

What worries me about LLM / generative AI is that it is so close to a thing that people have wanted for decades, and that it almost seems to deliver on those promises. The big tech companies are going all-in on it, and ordinary companies are buying into the hype (especially around what Copilot and similar dev tools can do) in ways that they didn't before. When the bottom falls out of this—when people adjust their Turing sensors to quickly recognize LLM content and dismiss it, or there's a massive security breach that came from using Copilot-derived code—the collateral damage is going to be huge.

*Please do not read this as being dismissive of the retail investors who lost more than they could afford as a result of the promises made by some of those players.
posted by thecaddy at 10:28 AM on March 22 [3 favorites]


we don't think about textile weavers going from nice at-home jobs that they like to do on their own terms

This is at least partly because it was never actually like this. I get the point you're making and agree that the Industrial Revolution brought forms of horribleness we sometimes ignore, but let's not over-romanticize what life was like beforehand either.
posted by nickmark at 10:31 AM on March 22 [12 favorites]


Capitalist techno-solutionist: "I was saying boo-urns."
posted by audi alteram partem at 10:31 AM on March 22 [5 favorites]


"Interesting" yes it's terrifically "interesting" how my partner, a journalist, has literally no idea when he'll ever work again.

Again, not a new story. My father was a major newspaper journalist until he died in 1990; he would have gotten a golden handshake from his paper in 2001 or so. See the evolution of Rick in Doonesbury, who wrote for the Washington Post, was laid off, and then briefly worked at the Huffington Post, and then has done basically nothing since; I think Trudeau ran out of ideas on what to do with someone whose job had been replaced by tech.
posted by Melismata at 10:37 AM on March 22 [6 favorites]


They're trying to sell AI as liberating, but people know when something is being forced down their throats. You don't see people at SXSW talking about the liberatory potential of using AI bots to replace all the CEOs and return the savings to the workers. You don't see people in TED talks showing how AI can lead us to fully automated gay space luxury communism. There is definitely an artificial narrowing of the discussion that is being done by people with all the money and the power who want to limit what AI can do to what can fatten their wallets.
posted by jonp72 at 10:38 AM on March 22 [26 favorites]


Again, not a new story.

Nobody said it was new, I just said that calling it "interesting" and other corporate-booster euphemisms is a lousy thing to do. And yes, it's what people who aren't directly impacted by it say every fuckin' time, and it always sucks.
posted by We put our faith in Blast Hardcheese at 10:40 AM on March 22 [15 favorites]


There are already art students who have committed suicide when AI was introduced to their courses, feeling there wasn't anywhere to put their skills to use.

Really? I missed this news and I can't seem to find anything.
posted by ElKevbo at 10:41 AM on March 22 [2 favorites]


Dude, no offense but this is such an irritating attitude. In the 20s people said this about automation, that our jobs will become easier and more pleasant and we'll get more time off. But it didn't happen, it just generated more output and contributed to sweatshop labor, office jobs that still cause burnout and overwork, etc

None taken, but you are misunderstanding what I wrote. I didn’t say it would be easier work, or less work. Just differently focused. Speculation, but: what is most likely to happen in most media-adjacent industries is that the MBAs and accountants will hear or read something that makes them believe ML-based systems can replace most of their workers. The next quarter that doesn’t show growth in profit they will execute massive layoffs. They will then faceplant into the brick wall of the limitations of such systems and the need for editorial review. The need for careful prompt engineering to generate results of sufficient quality such that their customers continue buying.

They will then rehire a few people or go under and be replaced by the competition.

Once they’ve stabilized and attempt to grow, or been replaced and their competition is ready to grow, the bottleneck of human editorial review and directing the ML systems will come up again. Most people will be rehired, though usually not by the same company. After several contraction-expansion cycles a new stasis will emerge.

Like I said: it’s going to be an interesting few years.

Nowhere, in any of this, am I stating or even implying the work will grow less or more pleasant. It’s still going to be work. The total output of the company will be far greater both overall and per-worker, because these are enabling technologies when utilized correctly, but under capitalism labor will reap no benefits from that. There is also no job security under capitalism because the sword hanging over all of our heads is starvation, and the people who do reap all the benefit are well aware that is the one societal change they can never safely jeopardize.

If you’re reading dismissiveness or inappropriate hope into my words, I’m not sure what to tell you. In a few years the dust will settle but that “interesting” is - to be extremely clear - of the very Bostonian “yeah shit’s gonna suck” variety.
posted by Ryvar at 10:43 AM on March 22 [8 favorites]


I work in a loosely ML-adjacent field (computing infrastructure for scientific researchers), and I'm seeing some really interesting work on real applications for LLMs and similar models. There's promising work in using them for weather and climate prediction, materials science, and drug discovery. A bunch of researchers are discovering that using glorified pattern-matching software can achieve faster and just-as-good results as detailed simulation, which I think is a fascinating result all on its own.

(Importantly, all of those areas involve checking the ML algorithm's results using older simulation-based methods or straight-up experimentation. So I feel a lot more comfortable trusting those results than I otherwise would.)

So in certain respects you could call me an "AI optimist", insofar as I really do think there are wonderful things we can and are doing with the tech. However, as fascinating as I find all that work and as excited as I am to see what happens next... those applications aren't what are driving billions of dollars of investment. Most of that money is being spent to try and replace creative work by people, especially people at the lower rungs of the economic ladder, so that other already-rich people can make more money. Which is frankly unconscionable, and why the backlash against the companies involved is deserved.
posted by learning from frequent failure at 10:45 AM on March 22 [18 favorites]


Really? I missed this news and I can't seem to find anything

Because it was in Japan:

https://www.fukuishimbun.co.jp/articles/-/1827461
posted by picklenickle at 11:02 AM on March 22 [8 favorites]


Nobody said it was new, I just said that calling it "interesting" and other corporate-booster euphemisms is a lousy thing to do.

Definitely not intended as corporate-boosting. That is not my jam. I’m thinking we have a cultural disconnect: when I say interesting it is the falsely-brave stoicism of people who would probably rather die than acknowledge difficulty and might actually let themselves starve before asking for help. New Englanders, basically.

Approaching 44, I have finally saved enough to potentially buy a place to live and break the rental trap. And I’m not going to: overhiring due to Covid means that my own industry has laid off more people in January 2024 than all of 2023 combined, and 2023 was already a very bad year for layoffs. My employer and - as far as I know - my job are both as secure as it gets, but I do believe I understand the general outline of the next few years and I’m not going to empty those savings this year or next.

I am genuinely sorry if I upset you or hurt your feelings, that was not my intent.
posted by Ryvar at 11:06 AM on March 22 [4 favorites]


I've drawn my whole life and painted pretty much my entire adult life and have watched in growing alarm as my Facebook feed (which I have sold a fair bit of art through) is slowly giving ground to AI generated imagery. Fantastical this and fantastical that and it all looks like shit because the 1000s of tiny decisions made while creating a work of art, decisions predicated on literally years of process, cannot be replaced, yet, by feeding prompts into AI programs.
There is a blightening dreariness to the AI generated imagery I see, and I worry about its impact on the collective visuality of our cultures, the numbing sameness of the images squeezing out the individuality of the human enterprise.
And, Big Tech, yeah, I'm sure it will be bay unicorns, rainbows, and Skittles all around.
posted by Phlegmco(tm) at 11:07 AM on March 22 [30 favorites]


I don't get the love of algorithmic creations

They have some fun things. Can be neat to experiment with. But they don't really. Work right half the time. Very uncanny valley
posted by AngelWuff at 11:30 AM on March 22 [8 favorites]


someone whose [reporting] job had been replaced by tech.

An excellently disquieting example, because reporting wasn’t replaced by “tech”. We just don’t have reporting that we used to have.
posted by clew at 11:40 AM on March 22 [39 favorites]


So in certain respects you could call me an "AI optimist", insofar as I really do think there are wonderful things we can and are doing with the tech. However, as fascinating as I find all that work and as excited as I am to see what happens next... those applications aren't what are driving billions of dollars of investment.

It feels like a continuation of the “vibe” of the last decade of “tech,” that vast resources and powerful technologies are brought to bear on low-hanging fruit and short-term cash-ins and things nobody asked for. I’m also sort of “AI-optimistic” by MeFi standards, in the sense that I think that there are likely powerful applications in the broad sphere of ML and that even some of the currently hyped stuff is already more useful than people want to give it credit for being. But it’s also not remotely mysterious why people are sick of it all.
posted by atoxyl at 12:12 PM on March 22 [5 favorites]


>>we don't think about textile weavers going from nice at-home jobs that they like to do on their own terms
This is at least partly because it was never actually like this. I get the point you're making and agree that the Industrial Revolution brought forms of horribleness we sometimes ignore, but let's not over-romanticize what life was like beforehand either.


It was also, however, not as bad as you say, according to Brian Merchant on Adam Conover's podcast Factually in this episode. There was a whole cottage industry of weavers who had made quite comfortable lives for themselves before machine weaving took over; it's where the term "cottage industry" comes from in fact.
posted by JHarris at 12:15 PM on March 22 [7 favorites]


Hard times are when a man has worked at a job for 30 years, 30 years, and then they give him a watch, kick him in the butt, and say, "Hey, a computer took your place, daddy!" That's hard times!
Dusty Rhodes, 198-frickin-5
posted by uncleozzy at 12:16 PM on March 22 [5 favorites]


Because ultimately it’s not the technology that people are boo-ing, it’s the attitudes and practices of the sociopaths currently driving it.

I regret I have but one favorite to give.
posted by AdamCSnider at 12:17 PM on March 22 [8 favorites]


An excellently disquieting example, because reporting wasn’t replaced by “tech”.

THANK YOU

I'm not in an industry where I need to worry about someone thinking they can replace me with an algorithm, but I like art. I like music. I like writing--fiction, non-fiction. I like video games. I like a lot of different things that people make with their human creativity and human values, but instead I am buffeted by wave after wave of AI-generated garbage.

And then we see what happens when AI takes over other human jobs: AI moderation sometimes works, but other times it makes bizarre decisions that you have no way to appeal. The person you used to call at the customer service line has been replaced by a chatbot that just feeds you FAQ answers.

AI is the current vanguard of the enshittification of our world. You can say that the problem is the people who are pushing it, not the technology - but the technology is always going to be used to their advantage. It does not exist in a neutral context. Those people are too fucking entrenched to separate them from the technology. I'm glad that they are at least getting their stupid faces booed.

The other day, I was browsing Twitter and someone was showing off the quality of some AI-generated video. I watched it; it was okay. It was slick. If someone only had $100 for special effects for their indie movie, it would be far better than anything else they could afford. But it wasn't good. It was derivative, dull, with that kind of queasy blockbuster sheen that AI-generated images tend to have, weird perspective issues, a background that didn't make sense when you looked more closely. And my immediate thought was: Fucking hell, this is just good enough that a CEO with one million dollars to spend on special effects would think they could get away with spending $100. And firing all those VFX artists.

We're not going to get a new Wes Anderson or Guillermo del Toro. We're going to get a director who says "tell [image generator] to give me a street scene in the style of Wes Anderson." That's the director the fucking suits will hire.

And people will watch it because, for the most part, people have less control over this type of thing. My industry is construction, where enshittification has definitely hit hard in other ways - poorer materials, poorer construction, for more money. Sure, there have been advances, and ways in which things have improved, but consumers don't have as much choice as free-market politicians like to pretend. When every single company is lowering quality, what are you going to do? Just not build the house?

When every children's book is illustrated by Midjourney, what are you going to do? Refuse to read to your children?

I don't know what my point is. Maybe I'm just expressing how unsatisfying I find the promises of new technology when we would need to live in a different world in order for those promises to come true.
posted by Kutsuwamushi at 12:18 PM on March 22 [40 favorites]


(apologies for time delayed post)

You put the hospital IN the blimp.
posted by Acari at 12:42 PM on March 22 [3 favorites]


While they were extolling the humanity of AI, they were showing clips of people using last-generation Quest 2 VR headsets. I would have booed too, that's dumb C-suite agitprop.
posted by credulous at 12:44 PM on March 22


What will happen when available inputs for AIs have significant AI-generated content?

Hard for me to believe this won’t result in significant degradation in the LLMs, at least.
posted by jamjam at 12:54 PM on March 22 [3 favorites]


Look as someone who doesn't apparently speak "Boston" all's I'm saying is: you know who else talks about how "interesting" the next few years are going to be? The C-suite at my company, when they talk about how many of us they're going to replace with LLMs as soon as they can get them off the ground. If you go around sounding like the people who are gleefully rubbing their hands together at the prospect of firing people, you're going to get some flak. This is the internet, we can't hear how you say park the car.
posted by We put our faith in Blast Hardcheese at 1:00 PM on March 22 [13 favorites]


Kutsuwamushi, I’d love a post of anything on new construction materials, standards, styles and where you think there are real savings vs. scrim.

Is ML coming for architects?
posted by clew at 1:06 PM on March 22 [1 favorite]


They can't execute us all!

Check out Sarah Connor, here.
posted by The Bellman at 1:16 PM on March 22 [1 favorite]


2024 AI Legislation continues to lack a "massive fuckery" category.
posted by lock robster at 1:20 PM on March 22 [1 favorite]


It was also, however, not as bad as you say

I am fairly familiar with the early Industrial Revolution and the Luddites specifically, having studied the time period as part of my major, though I acknowledge it’s been a while since college. More to the point, though, I didn’t say anything in particular about how bad it was (or wasn’t).
posted by nickmark at 1:22 PM on March 22 [3 favorites]


This is the internet
... and some of us aren't US Americans, and we may not even speak English as a first language.
posted by Too-Ticky at 1:27 PM on March 22 [1 favorite]


Medevac aircraft are pretty dang great, not something one can really do with a blimp.

Sidetrack, I know, but for a while I was FB friends with a local medevac pilot (I suspect he's just dropped off FB), and the bro who posted something thanking him for the second wilderness injury extraction kinda soured me. Like, okay, one is bad judgment, I get that, I've done some stupid shit in my life and limped out with a makeshift crutch, but two? Maybe having that fallback isn't doing humanity that great a service, and is inducing a bunch of stupidity.
posted by straw at 1:27 PM on March 22 [4 favorites]


Feels like a wilful misreading to compare Ryvar to C-suite executives who actually have the power to fire people from jobs just because they called the burgeoning AI movement "interesting" lol.

Ryvar, if you could do us all a favour and stop calling it interesting, I'm sure that will slow the tidal wave of obsolescences, deepfakes, and general shittiness that LLMs are going to bring crashing down on us, thanks.
posted by Cpt. The Mango at 1:32 PM on March 22 [14 favorites]


This is the internet, we can't hear how you say park the car.

I say it like someone who grew up in upstate New York and then managed to spend 17 years in Boston without picking up even a lick of accent, somehow. The attitude rubbed off, though by local definitions I’m not even a “real” New Englander.

But yeah, I’m sorry I cheesed you off, Hardcheese. And to hear about your partner’s experiences. “Interesting” as in a few painful and turbulent years, but …ultimately survivable for most of us. Just particularly shitty and not fun the way 2008 wasn’t. Maybe somewhat worse.

When I say that I think the fears are overblown: capitalism follows its own version of Avogadro’s law, as in: the total number of jobs will always expand to fill the labor pool with just enough unemployed to keep those who push back or don’t fit in as very front and center visible examples of the starvation that awaits anyone else stepping out of line. No expert system or “AI” will change that core dynamic. Fixing it’s on us, and people who stepped on others to reach the top of the ladder are not going to respond in good faith to a language other than violence.

That has nothing to do with the next few years, though, just something that needs to happen before actual AGI is available for worker supervision.
posted by Ryvar at 1:35 PM on March 22 [3 favorites]


My decision to invest in the new guillotine startup, The Final Cut Co. ("The Ultimate Closure Experience") is looking more prescient by the minute.
posted by signal at 1:41 PM on March 22 [5 favorites]


Ryvar, if you could do us all a favour and stop calling it interesting, I'm sure that will slow the tidal wave of obsolescences, deepfakes, and general shittiness that LLMs are going to bring crashing down on us, thanks.

Yes, because that's what I thought would happen, and definitely why I brought it up. You know, around other topics (e.g., abortion) it's considered completely fine and even useful to point out to folks that what they might be talking about as a fun intellectual exercise is actually another person's immediate real life, and to take care in their language around it lest they come off like a jerk. But I guess not this one!

(Yes, I am now aware, excruciatingly, into my very cells, that Ryvar was not in fact looking at things as a fun intellectual exercise, just using identical words to someone who was in a context that is thin on tone.)
posted by We put our faith in Blast Hardcheese at 1:44 PM on March 22 [12 favorites]


Capital is running on fumes.
posted by symbioid at 1:47 PM on March 22 [3 favorites]


A lifetime's worth of failed neoliberal shibboleths re: job retraining for displaced workers (and the disastrous social and political consequences of those failures) have permanently raised my personal hackles on the "interesting times" POV.

I'd like to think that the booing is about something broader than the acceleration of gig economy manufactured precarity impacting people who can (or feel they must) devote resources to attending SXSW. I'd *like* to think that.
posted by structuregeek at 2:02 PM on March 22 [3 favorites]


isn't Shibboleth a spider in LOTR
posted by elkevelvet at 2:16 PM on March 22 [6 favorites]


I am fairly familiar with the early Industrial Revolution and the Luddites specifically, having studied the time period as part of my major, though I acknowledge it’s been a while since college. More to the point, though, I didn’t say anything in particular about how bad it was (or wasn’t).

Yeah, that was a misphrasing on my part. Still, though, your implication seemed to be that there weren't a lot of people whose livelihoods were pretty good and whose niche was upended by machine weaving, which is directly counter to what Brian Merchant said on that podcast.
posted by JHarris at 2:16 PM on March 22


Really? I missed this news and I can't seem to find anything
Because it was in Japan:

https://www.fukuishimbun.co.jp/articles/-/1827461
Another reason you might not have heard of it is, as I commented in response to picklenickle citing this same article last month, if you actually read the article, neither of the students it discusses actually killed themselves. (As I said there, I work as a translator, so I understand despair at what LLMs mean for my livelihood.)
posted by Strutter Cane - United Planets Stilt Patrol at 2:21 PM on March 22 [5 favorites]


And people will watch it because, for the most part, people have less control over this type of thing. My industry is construction, where enshittification has definitely hit hard in other ways - poorer materials, poorer construction, for more money. Sure, there have been advances, and ways in which things have improved, but consumers don't have as much choice as free-market politicians like to pretend. When every single company is lowering quality, what are you going to do? Just not build the house?


Big leap from video to construction. People have incredible choice in what they watch and read. This attitude about how the poor plebs will consume the AI that's shoved down their throat is borderline-insulting to the people you seem to care about.
posted by Wood at 2:32 PM on March 22


If humans were meant to fly, they would have been born with wings.

It appears, then, that AI has concluded humans are meant to flip.


Capital is running on fumes.

I guess we picked the wrong week to stop zero interest.
posted by snuffleupagus at 2:45 PM on March 22


Still, though, your implication seemed to be that there weren't a lot of people whose livelihoods were pretty good and whose niche was upended by machine weaving, which is directly counter to what Brian Merchant said on that podcast.

I haven't listened to the podcast so I don't know if Merchant does this, but I was mostly reacting to picklenickle's phrasing that seemed to me to basically be saying "everything was fantastic before these factories came along and wrecked it." It's certainly true that yarn and textiles tended to be produced in cottage-type settings prior to the late eighteenth/early nineteenth centuries. It's also probably true that a fair number of the people doing that work were able to make a pretty decent living at it by contemporary rural English standards.

But it's also true that contemporary rural English standards weren't exactly opulent. And even before industrialization, English weavers were facing stiff competition from Asian producers of things like cotton cloth, and English society generally was facing a lot of stressors owing at least in part to the massive population growth after about 1750. The phrasing of "nice at-home jobs that they like to do on their own terms" puts an awfully sunny gloss on the economic and social upheaval that was affecting everyone (including but not limited to weavers).

I recognize that it echoes, in some ways, what the weavers themselves were saying, and suggests implicitly (as I believe some Luddites said explicitly) that these were folks who were coming from generations of weavers and it was a time-honored traditional livelihood that was being ripped away. And it probably felt like that to many of them! But there's another view, which points out that spinning and weaving were often the occupations taken up by folks who had previously farmed the common lands of their villages. With the wave of enclosures in the 18th century eliminating common land, they lost their incomes and in many cases it was a choice of becoming textile workers in their homes or moving to cities. So maybe by the 18-teens you were a weaver like your father and his father before, but there's a decent chance he only became a weaver out of necessity.

And while I don't know enough to say for sure, it wouldn't surprise me a bit if spinning and weaving turned out to be one of those "cottage industries" that - like brewing beer - was previously considered "women's work" and seen as a demeaning step down for an 18th-century man who used to farm.

Again, I'm not saying the weavers had awful lives before the mills came along. Did they have legitimate gripes about losing a way of life? Absolutely. But "nice at-home jobs that they like to do on their own terms" is stretching it in my view, and is probably something you could really only say about the wealthiest members of society at that point in history.
posted by nickmark at 3:56 PM on March 22 [7 favorites]


Phlegmco(tm), your comment reminded me of this fascinatingly weird piece from 404 about Facebook and Shrimp Jesus
posted by Suedeltica at 3:57 PM on March 22 [2 favorites]


But "nice at-home jobs that they like to do on their own terms" is stretching it in my view, and is probably something you could really only say about the wealthiest members of society at that point in history.

Consigning oneself to creating the same product over and over in order to earn a living is generally a pretty horrible lifestyle.

I've done a lot of production line stuff of various sorts ranging from diamond rings to COVID tests and it's always a chore and never is fun.

I can only imagine the Etsy crochet artists working to put out more sweaters for the dollars are also feeling despair.
posted by hippybear at 4:05 PM on March 22 [3 favorites]


LOL, nobody is going to pay for a handmade yarn sweater (I know knitting can be machine-made, but not crochet so far, more or less). If you charged for your labor/time, it'd be $300 and nobody would buy it.
posted by jenfullmoon at 4:09 PM on March 22 [2 favorites]


My Point Exactly!
posted by hippybear at 4:11 PM on March 22


This attitude about how the poor plebs will consume the AI that's shoved down their throat is borderline-insulting to the people you seem to care about.

At least the ability to make entertainment decisions implies that the person suggesting it assumes that we will continue to have widespread control over and access to disposable income and utilities somehow?
posted by Selena777 at 4:20 PM on March 22 [2 favorites]


People really want to believe that LLMs will solve famine and climate change and create a utopian society.

It's an extraordinary claim, but for some reason the burden of proof is placed on those who question it rather than those making the claim.
posted by splitpeasoup at 4:22 PM on March 22 [17 favorites]


If LLMs are somehow the accumulation of all of human knowledge, as seems to be claimed, then how is anyone imagining that LLMs can solve any problem that human knowledge hasn't already solved?

Fancy auto-correct. That's how I'm defining them. I invite you to join me.
posted by hippybear at 4:30 PM on March 22 [10 favorites]


But "nice at-home jobs that they like to do on their own terms" is stretching it in my view

Yeah, listen to the podcast, and respond to that? My comment was intended to point it out. I have literally nothing else to add.
posted by JHarris at 4:36 PM on March 22


wood, that's such an obviously bad faith interpretation of my comment that i'm trying to remember whether we have a history i'm unaware of.
posted by Kutsuwamushi at 4:37 PM on March 22 [4 favorites]


So can we sum up a lot of the back and forth over the Luddites/weavers/unions as "the distribution of productivity gains between labor and capital is a policy choice"?

And that historically that policy choice has led to very little left over for labor, and since the laws are generally in service of capital owners, the motivating factor for altering that policy choice is the capital owning class's fear of lawlessness?
posted by straw at 4:56 PM on March 22 [6 favorites]


It's an extraordinary claim, but for some reason the burden of proof is placed on those who question it rather than those making the claim.
posted by splitpeasoup


crap-p evidence...
posted by snuffleupagus at 4:58 PM on March 22


AI is the current vanguard of the enshittification of our world.

A couple of days ago, a friend was on the phone with the power company. She was forced through a phone tree, made a wrong turn in it and had to hang up and start over, then once through it still had to wait for 20 minutes to speak to a human being to get her flat rate plan changed, and once put in touch with that person had to get through 15 additional minutes of an attempt to convince her not to change it after all.

Phone trees do absolutely nothing to improve customer service. They exist completely and utterly to cut costs at the expense of customer goodwill, but they're still used because what are you going to do, go to someone else? Even if you can, they probably have a very similar system set up. We just have to put up with it, all of us, because there are no alternatives. It helps no one but the companies.

I can't help thinking, this is exactly how ChatGPT and similar systems are going to be used. We're going to have to put up with these awful systems, pretend that these horrible machines are human beings, because we'll have no alternatives. ICK.
posted by JHarris at 5:39 PM on March 22 [21 favorites]


I think Ryvar was saying that we are going to be living in interesting times, like the curse "May you live in interesting times."
posted by Jane the Brown at 5:39 PM on March 22 [5 favorites]


To me, it seems like this generation of machine learning might have some really valuable applications, like figuring out proteins, and then maybe a lot of uses that actually have negative impacts, like plausible but nonsense text generation. It seems more useful than the recent bubbles of crypto and stuff, maybe more akin to a regular internet bubble where there is some value, but also a lot of inflated bs. I certainly share the impulse to have disdain towards the cheerleaders and snake oil salespersons, though the technology itself certainly seems to have applications.
posted by snofoam at 5:40 PM on March 22 [4 favorites]


I DON'T WANT A COMPUTER TO TALK TO ME.
posted by jenfullmoon at 5:42 PM on March 22 [11 favorites]


If LLMs are somehow the accumulation of all of human knowledge, as seems to be claimed, then how is anyone imagining that LLMs can solve any problem that human knowledge hasn't already solved?

Linking a comment I wrote a month ago to avoid clutter: LLMs are like a software prism.

Separate from that, though, the serious answer to your question is Eureka. The reason why I relentlessly beat that Q* ecological impact drum in AI threads is that Q* is about the most ugly, brute force approach to chain of thought “reasoning” I can imagine, and unfortunately Eureka suggests it might actually work. If it does, that’s not the “Let’s speedrun the dot com implosion and 2008 simultaneously!” variety of Interesting (nor, on preview, the interesting times variety); that’s just straight up catastrophic. No hyperbole catastrophic. Not because Skynet, but because it will be a legitimate killer app with no ceiling on the carbon footprint and I don’t know if the planet can survive that.

That’s actually my biggest fear in all of this: trillion dollar success via the most irresponsible method conceivable. Sam Altman said in an interview last week that he expects someone will try and shoot him eventually, and my first thought was “well chief, that’s Interesting.” But that time I meant it the way Blast Hardcheese thought at the top of this thread.
posted by Ryvar at 5:52 PM on March 22 [5 favorites]


if you actually read the article, neither of the students it discusses actually killed themselves. (As I said there, I work as a translator, so I understand despair at what LLMs mean for my livelihood.)

Thanks, I will amend to say, they were suicidal. And it isn't really a unique feeling if you actually talk to people impacted by this.

Also, y'all nitpicking over how perfect/imperfect weavers' lives were. They had a fucking movement against their downgrade in QOL. Luddites didn't just wake up and go "hmm time to rebel against something!" Same with artists. There are so, so many things artists, musicians, writers, actors face when trying to make it in the industry today. Poverty wages, getting forced out of royalties, little-to-no health insurance, no retirement plan, overwork. Despite that, artists put up with this shit because they like making art.

Now, does AI solve any of that? Does it increase wages, give healthcare and retirement, get them off of food stamps, etc? No. Even if it sped things up, publishers/studios would just adjust and go "okay, 300 comic book pages in a week, please" with an equivalent lower sum of unlivable money.

Artists are not collectively getting angry and trying to sue AI companies for no reason. No matter how they try to backpedal and spin it as an artist's "tool," that is not how it was made. Artists were not consulted on how to make their lives easier. It was made as a replacement, regardless of how shitty the product is. At that, a product that was literally made off of aforementioned hard work with no form of tracing the sources. You guys want to be all "I don't hate the technology, just the capitalism!" NO. IT'S ALSO SHITTY TECHNOLOGY TOO. IT LITERALLY TAKES SO MUCH HUMAN WORK TO MAKE IT HAPPEN. So much tagging, so much unpaid/underpaid labor needed to feed into it and then refine it, so much corporate hoop-jumping to avoid copyright infringement. It makes the engineer in me hurl. It can go in the trash bucket along with self-driving cars in the "trains already exist" department. Inelegantly designed piece of trash. Go 2 Juicero hell.
posted by picklenickle at 6:19 PM on March 22 [18 favorites]


No history at all. I think I read "and people will watch it" pretty much how you said it and meant it.
posted by Wood at 9:43 PM on March 22


Do you yourself fear a future of consuming AI-produced dreck or is that something you fear on behalf of others?
posted by Wood at 9:49 PM on March 22


Suedeltica, thank you for that, and what's worse is that actual people I know are creating crappy AI art in my feed.
Sigh........
posted by Phlegmco(tm) at 11:30 PM on March 22


On "Overheard In NYC"
Store clerk: We do not accept Apple Pay or credit cards, only cash
Customer: How does that work?
Store clerk: "You hand it to me."
posted by DJZouke at 5:11 AM on March 23 [7 favorites]


I'm not sure what jobs exist that bosses don't think an LLM can do.

Their own. Let's prove them wrong.
The plot of a Twilight Zone episode in 1964.


Mel Cooley FAFO.
posted by non canadian guy at 9:15 AM on March 23


What will happen when available inputs for AIs have significant AI-generated content?

There is research on this topic. Diversity decreases over successive iterations on synthetic text.
posted by StarkRoads at 9:25 AM on March 23 [4 favorites]


Faint of Butt> Convince me that the invention of heavier-than-air flight has been a net benefit to humanity as a whole. Go ahead.

I've no twitter access here, and Musk fucked up twitter's bookmarking role, but..

Tim Garrett once cited some report that argues that empire size depends upon travel times. Air travel makes global exploitation possible!
posted by jeffburdges at 11:23 AM on March 23 [2 favorites]


Maybe having that fallback isn't doing humanity that great a service, and is inducing a bunch of stupidity.
posted by straw at 3:27 PM on March 22


My wife has been alive for the last six+ years because a helicopter got her to a hospital in time to stop a "Widowmaker" heart attack.

I invite you to ponder my reaction to hearing that maybe she should have died because you disapprove of the actions of some random hiker.
posted by Vigilant at 12:55 PM on March 23 [6 favorites]


What will happen when available inputs for AIs have significant AI-generated content?

It’s called model collapse, it’s a downward spiral into oblivion, and anyone in or adjacent to the field has known about it for years. The people making these systems are not idiots - they are among the brightest and in my very limited experience definitely the weirdest (no “among the” qualifier) - and they’re going to pursue known-good sources of training data anywhere they can. Nobody smart enough to build anything remotely this complex is stupid enough to miss tha-

Okay, yes, their managers are, but eventually a team that isn’t shackled to idiots pushes through.

Point is: if you’re hoping for model collapse to put the genie back in the bottle, don’t bother. We learn to live with this, to use it for good, and we overhaul our political and economic systems to prioritize the exclusion of sociopaths before AGI lands. The alternative is dystopia on an unimaginable scale.
posted by Ryvar at 3:13 PM on March 23 [1 favorite]


We learn to live with this, to use it for good, and we overhaul our political and economic systems to prioritize the exclusion of sociopaths before AGI lands.

Ha ha! We are currently about to hold an election to see if the US citizenry will walk off the same cliff they did eight years ago. We've already seen the answer to the question of "surely we'll redesign society so it doesn't walk to the terrible beat of extremely rich people?" To do this requires the majority of people to care that the world is getting worse and worse. They don't, not when there are so many comforting voices telling them they don't have to. It's why we still have cigarettes, climate change, and Republicans.
posted by JHarris at 3:35 PM on March 23 [6 favorites]


Some things I've read about AI are that their basic models will be forever stuck in early 2023 or thereabouts, because that's when AI input into the system starts to become large enough that further inputs from later dates could pollute the models to the point of unusability.

Not sure how conversing with fancy autocorrect in 2026 will feel if it's frozen in 2023, but we'll see, I guess.
posted by hippybear at 3:40 PM on March 23


hippybear: "Some things I've read about AI are that their basic models will be forever stuck in early 2023 or thereabouts because that's when the AI input into the system starts to become large enough that that further inputs from later dates could pollute the models to the point of unuseability.
"

This is called model collapse, and what it means is that as AI starts taking its output as input—eating its own shit—further generations will become more and more corrupted, and more blatantly useless. I'm a bit hopeful that this will make the current AI fad self-limiting.
posted by adamrice at 5:01 PM on March 23
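For anyone who wants to see the mechanics of that feedback loop, here is a minimal sketch in plain Python (standard library only): a toy Gaussian stands in for a generative model, and each generation is fitted purely to samples drawn from the previous generation's fit. This is only an illustration under that toy assumption, not how any real training pipeline works, but it shows the loss-of-diversity effect StarkRoads and adamrice describe above: the estimated spread drifts toward zero over enough generations.

# Toy illustration of model collapse: each "generation" is a Gaussian fitted
# only to samples drawn from the previous generation's fitted Gaussian.
# (Hypothetical toy model, standing in for a real generative model.)
import random
import statistics

def fit_gaussian(samples):
    # "Training": estimate mean and spread from the available data.
    return statistics.mean(samples), statistics.stdev(samples)

def generate(mean, stdev, n):
    # "Synthetic data": sample n points from the fitted model.
    return [random.gauss(mean, stdev) for _ in range(n)]

random.seed(0)
n = 20                        # a small "training set" per generation exaggerates the effect
data = generate(0.0, 1.0, n)  # generation 0 trains on real data from N(0, 1)

for gen in range(2001):
    mean, stdev = fit_gaussian(data)
    if gen % 250 == 0:
        print(f"gen {gen:4d}: mean={mean:+.3f} stdev={stdev:.3f}")
    data = generate(mean, stdev, n)  # later generations train only on synthetic output

The spread does not shrink monotonically, and mixing fresh human-made data back in each generation slows or prevents the collapse, which is presumably why, as Ryvar notes above, the labs will pursue known-good sources of training data wherever they can.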


I found Tim Garrett's tweet, Faint of Butt: Quantitative Dynamics of Human Empires by Cesare Marchetti and Jesse H. Ausubel.

It's weak in some places, likely the hormone focus; maybe trust & loyalty suffice here.

Airplanes are a mechanism for exerting power over other humans, one which requires so much energy that real travel & solidarity among the lower classes becomes almost impossible, so in conjunction with trade, and control over production capacities, our aircraft make extreme exploitation levels possible.
posted by jeffburdges at 5:36 PM on March 23 [1 favorite]


Ryvar> we overhaul our political and economic systems to prioritize the exclusion of sociopaths

We've no reason to think that's possible. Also, much awful leadership is only horrific retrospectively, so you cannot really call them sociopaths.

American leadership prioritized giving Americans enormous purchasing power, which by Jevons they wielded, and placing every American behind the wheel of a car, which gave individuals one form of freedom unheard of in human history. "What's good for the country is good for General Motors, and vice versa." American leadership engineered an unprecedented level of global trade, in part because this makes many military conflicts untenable, but also to siphon off the resources to feed western consumption.

These were not sociopaths, but their outcomes are worse than what the worst sociopaths do, maybe risking human extinction, but definitely making a devastating collapse inevitable.
posted by jeffburdges at 5:53 PM on March 23


I am willing to bet the Venn diagram of the 'learn to live with it' crowd and the 'lick the boot' crowd is a circle.
posted by jordantwodelta at 6:09 PM on March 23 [1 favorite]


It’s called model collapse...they’re going to pursue known-good sources of training data anywhere they can. Nobody smart enough to build anything remotely this complex is stupid enough to miss tha-
How can you tell good training data and bad training data apart? The marketing hype for generative AI tools is about using them everywhere, to "speed up" writing, coding, making art, etc. If everyone is putting AI-generated content into their articles, comments, GitHub source code, StackOverflow answers, etc., which is what companies like Microsoft and OpenAI apparently want us to do, then there won't be any "known-good sources" left.
posted by april of time at 6:36 PM on March 23


"We have arrived at the scary moment when our prevailing attitude to innovation has shifted from love to fear."

Am I fearful? No. So far with AI, I'm seeing only useless bulk that makes my life harder. I am not fearful of garbage, I am annoyed with garbage.


It's not garbage. It's pollution, and they are deliberately adding it to everything. It's already inescapable on the internet. Soon it will be everywhere else too, because it's getting into your phone, computer and television. Even books are now becoming AI-generated gobbledygook. Knowledge is melting into useless sludge before our very eyes. I'd say trust nothing from after 2023, but I know they will just put fake dates on things and fake the search results if you try to look it up.

It's going to kill people. Sometimes, hopefully mostly, just accidentally by giving bad advice, but also probably sometimes deliberately, by someone evil who uses AI to do things like generate plausible-sounding recipes that produce toxic results.
posted by srboisvert at 6:54 PM on March 23 [6 favorites]


How can you tell good training data and bad training data apart?

It doesn't take long before people get pretty good at spotting machine generated content, so to build a new model, what you do is, you turn sorting AI stuff from the real into a CAPTCHA. And then everyone solves the problem for you every time they log in somewhere. And if that doesn't work you hire a company overseas somewhere, and they set up a sweatshop...

Circle of life.
posted by surlyben at 8:51 PM on March 23 [4 favorites]
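
For what it's worth, surlyben's CAPTCHA idea maps onto the usual known-plus-unknown trick. A hypothetical sketch (mine, not a description of any real service): pair an item whose human/machine label is already known with one that isn't; passing requires getting the control item right, and the answer on the unknown item is banked as a crowd-sourced training label.

```python
# Hypothetical sketch of a "sort the AI from the real" CAPTCHA: each challenge
# pairs an item with a known label against one we want labeled. The control
# item gates access; the answer on the unknown item becomes a crowd vote.
import random
from collections import defaultdict

# votes[item_id] = [times flagged "human-made", times flagged "machine-made"]
votes = defaultdict(lambda: [0, 0])

def make_challenge(known_text, known_is_machine, unknown_id, unknown_text):
    """Return the two texts in random order, plus what's needed for grading."""
    slots = [("known", known_text), ("unknown", unknown_text)]
    random.shuffle(slots)
    return {"slots": slots,
            "known_is_machine": known_is_machine,
            "unknown_id": unknown_id}

def grade(challenge, answers):
    """answers maps "known"/"unknown" -> True if the solver said machine-made."""
    if answers["known"] != challenge["known_is_machine"]:
        return False                                   # failed the control item
    votes[challenge["unknown_id"]][1 if answers["unknown"] else 0] += 1
    return True                                        # passed; vote is banked
```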


Point is: if you’re hoping for model collapse to put the genie back in the bottle, don’t bother. We learn to live with this,

Ryvar, I usually find myself strongly disagreeing with your position in these AI threads, but it is usually a bracing, point-of-view expanding disagreement, and I really appreciate you taking the time to buck what seems to be the trend here. You are not wrong that this genie isn't going back in the bottle, as bitter of a pill as that is to swallow. I don't think you are wrong to try to get us all talking about corporate AI vs the kind anyone can run on a reasonably well kitted-out personal computer, either.

Best case, maybe it proves difficult to actually make money with AI, and using it comes to be seen as kind of tacky, like 1970s polyester. Eventually, the rights issues get worked out, people get good at using it (like synthetic fibers today), and we all wonder what we ever worried about (pay no attention to the people worried about plastic in the ecosystem, nothing to see there.)
posted by surlyben at 9:23 PM on March 23 [3 favorites]


people get good at using it (like synthetic fibers today


Say


what

now

Nothing to see, yeah.
posted by clew at 9:40 PM on March 23 [5 favorites]


Thanks, surlyben, seriously. Really needed to hear that. I love Metafilter, I grew up in evangelical fundamentalism right as it woke up to its demographic collapse, and I really don’t want to watch the cycle of people facing trendlines too painful to acknowledge play out again in this place. We’ve all seen how that ends.

I fucking hate what the major players in the space have done, I still find the field fascinating, and I hope there’s a good future in store.

I typed out and deleted eleven comments before posting this one, so I need to leave the thread but before I go:

I am willing to bet the Venn diagram of the 'learn to live with it' crowd and the 'lick the boot' crowd is a circle.

I am the only person I see saying learn to live with it, and in that exact same comment I advocated removing all of our political leaders and restructuring our entire society to keep them from power. In a comment immediately prior I directly stated violence would be necessary.

This is as close as it is possible to come to openly calling for violent proletarian revolution on Metafilter without getting your comments deleted.

I’m not sure how you got from there to “lick the boot,” jordantwodelta, but… hey, let me know and I’ll try to communicate better next thread.
posted by Ryvar at 10:02 PM on March 23 [6 favorites]


I’m not sure how you got from there to “lick the boot,” jordantwodelta, but… hey

The number of seemingly deliberate choices to deny fellow MeFites any measure of grace or goodwill and to leap toward the most extreme, negative interpretation of what anyone might share here is one of the more wearisome features of participating in this community.

That it's gotten worse over the past 4-5 years means that, despite MetaFilter's insistence that it hasn't been polluted by Trumpism and MAGA, it has fallen victim to the same negative impulses that are magnified in our current political climate; the MetaFilter prism just means they manifest in weird, perverted ways that aren't easily spotted as the same energy through a different lens.

But getting from a call for proletarian uprising to an accusation of wanting to lick the boot is exactly a symptom of the illness we're suffering from here. And I wish we'd work more actively toward a cure.
posted by hippybear at 3:22 PM on March 24 [8 favorites]



I'm always reminded of fantabulous timewaster's comment: Perhaps the Butlerian Jihad was a labor action.

posted by lalochezia at 7:25 AM on March 25 [3 favorites]


For some of us, machine learning's impact on our livelihood and careers isn't theoretical, but something we've been dealing with for quite some time, as have many of our loved ones and colleagues. When I hear someone say "we'll learn to live with it" I hear "you'll learn to live with it, because it doesn't affect me in any fundamental way."

It is extremely dismissive, and it represents buy-in to the attitude that any technology must be embraced because we have no hope of control over the products that are foisted on us by company X. To me, this is no different from saying "if you aren't a criminal, you have nothing to worry about," which is a fundamental tenet of "lick the boot."

You talk about "removing all of our political leaders and restructuring our entire society to keep them from power"; well, forgive me if your flights of fantasy fail to give me any kind of solace when I'm faced with my actual reality. I'm living in the world as it is now, not your extrapolation of what it would have to be in order for this technology to be palatable to you.

As for the separate comment from another poster about negative impulses and MAGA/Trumpism: I'm not American, nor do I live in America, so you're definitely projecting. And pardon me for not giving someone "grace and goodwill" as they advocate for the widespread adoption of a plagiarism machine that is destroying writers and artists, and is therefore a direct threat to everything I've built in my life and to my ability to tell stories to a wide audience.
posted by jordantwodelta at 1:38 PM on March 25 [3 favorites]


I am also somewhat skeptical of LLMs, especially general-purpose LLMs (I think domain-specific LLMs for stuff like programming have the potential to be quite useful, and programmers I know who are much smarter and more experienced than I am seem to agree).

As for generative AI in general, I think there are some interesting uses, but again, general-purpose models are nice for showing off what the technology can do, while domain-specific ones are where the real usefulness probably lies.

I work in film postproduction, and there are a bunch of new tools coming out now that look like they're going to reduce workload and increase productivity quite a bit (they automate the kind of boring drudgery no one enjoys doing), and in some cases do things we simply couldn't do before. A few examples:

* MTI DRS Nova, one of several software packages used for film restoration, just announced an AI-based frame regeneration tool in their upcoming version. We'll see how well it works, but regenerating whole missing frames or severely damaged single frames is the kind of thing that's been extremely hard to do previously. Basically, we already have tools that regenerate missing frames or parts of frames based on motion vectors, blending the frames before and after; that works well on simpler interpolations, but AI should be able to do a much better job, and since this is one tool in the toolbox, restoration artists will be able to work on the results with all the other tools available, fixing whatever artifacts exist, matching grain, etc. (A rough sketch of that motion-vector baseline follows this comment.)

* DaVinci Resolve, a widely used color correction and online finishing package, has had "AI retiming" for a few years now. Basically, changing the speed of a video clip often requires generating new frames interpolated from existing ones, and while we have, again, motion-vector-based tools that do a good job, those tools often create characteristic artifacts in fast motion and when objects occlude. The AI-based mode in Resolve is, as I understand it, trained to recognize those artifacts and replace them with a much more pleasing motion-blur-like area. This is essentially what you'd do manually before, but it works in 99% of cases, and is a huge time saver.

* There are a bunch of machine-learning tools, both already available and upcoming, for things like rotoscoping/matting/separating elements from backgrounds, which will take a lot of the drudgery out of VFX work. Again, this is stuff that no one really enjoys doing (although rotoscoping can be kind of relaxing and zen-like), and it has potential to be even better than manual work. Indeed, at least one large VFX package now has tools to both train and infer from neural networks inside the software.

Then again, James Cameron just worked on and approved a 4k version of Aliens that was an AI upscale from a 2k master with way too much denoise and it kind of looks like shit, so this stuff can go both ways. But my point is, there are definitely useful tools coming out of this stuff. As an extra, this is not stuff that runs on huge server farms and uses MWs of energy, it's stuff that runs at reasonable speeds on one or two fairly powerful GPUs.
posted by Joakim Ziegler at 7:10 PM on March 25 [6 favorites]
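
To illustrate the motion-vector baseline described above, here is a rough Python/OpenCV sketch of conventional flow-based frame interpolation. This is my own crude approximation, not MTI's or Resolve's actual algorithm: estimate dense optical flow between two frames, warp each frame halfway toward the other, and blend. The smearing around occlusions and fast motion that this approach leaves behind is exactly what the AI-based retiming modes are trained to clean up.

```python
# Crude motion-vector frame interpolation (an approximation, not any vendor's
# actual algorithm): estimate dense optical flow from frame A to frame B, warp
# each frame halfway toward the other, and blend. Occlusions and fast motion
# produce the smearing artifacts that AI-based retiming modes target.
import cv2
import numpy as np

def interpolate_midframe(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    gray_a = cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY)

    # Dense flow A -> B; flow[y, x] = (dx, dy) in pixels.
    flow = cv2.calcOpticalFlowFarneback(gray_a, gray_b, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)

    h, w = gray_a.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))

    # Pull frame A half a step forward and frame B half a step back, then blend.
    map_ax = (grid_x - 0.5 * flow[..., 0]).astype(np.float32)
    map_ay = (grid_y - 0.5 * flow[..., 1]).astype(np.float32)
    half_a = cv2.remap(frame_a, map_ax, map_ay, cv2.INTER_LINEAR)

    map_bx = (grid_x + 0.5 * flow[..., 0]).astype(np.float32)
    map_by = (grid_y + 0.5 * flow[..., 1]).astype(np.float32)
    half_b = cv2.remap(frame_b, map_bx, map_by, cv2.INTER_LINEAR)

    return cv2.addWeighted(half_a, 0.5, half_b, 0.5, 0)
```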


Cory Doctorow: Here's a fun AI story: a security researcher noticed that large companies' AI-authored source code repeatedly referenced a nonexistent library (an AI "hallucination"), so he created a (defanged) malicious library with that name and uploaded it, and thousands of developers automatically downloaded and incorporated it as they compiled the code:

AI hallucinates software packages and devs download them – even if potentially poisoned with malware
posted by jeffburdges at 10:39 AM on April 2 [3 favorites]
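
A small defensive sketch for the failure mode Doctorow describes (my own illustration, not from the linked article): scan a requirements.txt and flag any dependency name that PyPI returns a 404 for, since that usually means a hallucinated package. It's only a tripwire, not a cure: it cannot catch a name an attacker has already squatted and uploaded.

```python
# Tripwire for hallucinated dependencies: flag any requirements.txt entry that
# PyPI returns a 404 for. This only catches names nobody has registered yet;
# it cannot catch a name an attacker has already squatted with real malware.
import re
import sys
import urllib.error
import urllib.request

def exists_on_pypi(name: str) -> bool:
    url = f"https://pypi.org/pypi/{name}/json"
    try:
        with urllib.request.urlopen(url, timeout=10):
            return True
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return False
        raise

def check_requirements(path: str) -> None:
    with open(path) as fh:
        for line in fh:
            line = line.split("#", 1)[0].strip()       # drop comments/blanks
            if not line or line.startswith("-"):       # skip pip options
                continue
            # Crude parse: package name is whatever precedes extras/specifiers.
            name = re.split(r"[\[<>=!~;\s]", line, maxsplit=1)[0]
            if name and not exists_on_pypi(name):
                print(f"WARNING: '{name}' not found on PyPI -- hallucinated?")

if __name__ == "__main__":
    check_requirements(sys.argv[1] if len(sys.argv) > 1 else "requirements.txt")
```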




This thread has been archived and is closed to new comments