Asking Chatbots what they're for
October 11, 2023 9:53 PM

Making money for the businesses that developed and own them? Maybe not. Some Google Insiders also appear not to know: “The biggest challenge I’m still thinking of: what are LLMs truly useful for, in terms of helpfulness?” said Googler Cathy Pearl, a user experience lead for Bard, in August. “Like really making a difference. TBD!”
posted by airing nerdy laundry (24 comments total) 11 users marked this as a favorite
 
Yup they're very expensive to build and run and it seems like nobody in big tech has any idea how to make money off them. They're all just trying to get in position where if someone else figures it out, they will have a bigger and better LLM and can fast follow.

So, in that sense, we have already created human-like intelligence...just a big ol expensive brain that wants to sit around and have conversations, and nobody's sure what it's for other than "well, probably not [thing it's currently being used for] but we're quite sure we'll figure something out". I too am great at having confusing conversations where I make up a bunch of questionable facts, and writing code that uses APIs that turn out not to exist. Didn't have to pay your army of PhDs to make someone like that. I was right here the whole time.

Welcome to capitalism, friend, I don't know what I'm for either but it's probably not what I get paid to do at my day job. Sad that humans make an AI and we immediately just put it to work doing a job it's bad at and clearly hates, just like the rest of us.
posted by potrzebie at 10:33 PM on October 11, 2023 [50 favorites]


They’re imperfect samples of a certain biased segment of the zeitgeist, like old encyclopedias, a stack of back issues of Life or Mad, or a Simpsons boxed set. More comprehensive, more idiosyncratic. A library assembled from a hoarder’s attic and only accessible indirectly, by suggesting scenes to an improv troupe with exclusive access to the stacks.

Ask one to draw you some scene in cyberpunk style or ‘60s style, or a teenager’s bedroom, or write you a rap about something. Have one imitate the style of an artist or writer you know, or draw and describe the same scene in different styles (e.g., Hemingway, Plath, Guthrie lyrics, film noir, Max Beckmann, Norman Rockwell, as a Sopranos scene, as a Titian painting).

See what they put in the scene and what (currently thought of as exemplary or stereotypical) elements of the style they emphasize. Think about how the answer might have seemed strange 20 years ago or be strange in 20 years. When did cyberpunk get so *pink*? When did teens stop having alarm clocks and CD players? Whose notion of rap or the Renaissance is it imitating? Did those characters really talk like that, or was that just the style of an exaggerated meme?
posted by smelendez at 11:31 PM on October 11, 2023 [16 favorites]


Guys, steal this think piece title: "The enshittification of AI"
posted by Alex404 at 11:44 PM on October 11, 2023 [8 favorites]


The problem is that users pay a $10-a-month subscription fee for Copilot but, according to a source interviewed by the Journal, Microsoft lost an average of $20 per user during the first few months of this year.

In some ways, this has been Microsoft's business model since the early 1990s: look the other way at lost sales of Windows licenses from piracy to get people locked into a sticky platform, which is eventually expensive to leave, because everyone else is using it. In the meantime, work out the kinks and edge cases at a smaller usage scale, and then you have captured the market and can work out a more profitable revenue arrangement with your customers. In the worst case, you have a technology stack you can leverage to sell other non-Github products and services.
posted by They sucked his brains out! at 11:44 PM on October 11, 2023 [2 favorites]


eh the issue with AI is that it makes work that was never paid or rewarded in any real way easier. commenting code and creating documentation? LLMs take a lot of the starting effort out. need to create a presentation for some such product launch? emotional labor minimized, all you have to do is generate it and do some editing. need to communicate about anything really well? that first step of figuring out how to even start is just eliminated

It's in that editing aspect where the actual skill and knowledge comes in, but no industry has ever paid its QA people well, nor do they care to reward emotional labor in any way at all, even though it's critical for organizational functioning to such an extent that they even developed a weird lil cult-esque process called Agile to implement it. even tho they don't really pay or train people in it, resulting in it rarely working well, but hey, at least we do stand up every day

these corps have created a reality where exploiting people is the norm - they never paid for this kind of labor anyway so the end goal, at some point, is to simply automate all of these jobs away in such a way that some other company contracts with you instead of hiring employees or other contract employees

which... good luck to them once they realize how much actual critical thinking, and acting on that analysis, is involved in doing certain roles well. like crypto, some fucks are going to make a bunch of $$ randomly, a lot of people will lose out, and corps keep on making shareholders who are barely tech literate happy
posted by paimapi at 11:50 PM on October 11, 2023 [16 favorites]


Yup they're very expensive to build and run and it seems like nobody in big tech has any idea how to make money off them.

hic twitter vivit
posted by snuffleupagus at 12:10 AM on October 12, 2023 [1 favorite]


Related: Welcome to State of AI Report 2023
posted by chavenet at 2:47 AM on October 12, 2023 [2 favorites]


I'm not advocating that anyone steal John Ratzenberger's voice (as deplorable as he might be), but a chatbot which reads all of its output as Cliff Clavin would be both hilarious and a nice satire of how problematic AI is.
posted by RonButNotStupid at 5:49 AM on October 12, 2023 [1 favorite]


This has to be one of the worst tech industry articles I have ever read.

AI or (as the case may be) AI + robotics is about replacing labor in production, or making feasible production cases for which labor is currently too expensive relative to demand. When mature, not only will a large chunk of payroll for a large number of current jobs belong to a small number of AI companies, but so will a large chunk of the profit margin of a huge number of "jobs" that don't even exist today.

The champion AI companies will be worth tens of trillions if they succeed in even a moderate number of use cases. The non-champion tech companies will be a shadow of themselves if not out of business altogether. Not investing heavily in AI and robotics now is slow motion suicide.
posted by MattD at 6:57 AM on October 12, 2023 [4 favorites]


There are lots of companies effectively using AI to do stuff; you just don't hear about them because they're actually doing useful stuff instead of sitting around trying to figure out how to make trillions off dumb chatbots.

And that's the giant money pit the article is talking about. People want specific AI solutions that solve a problem in their domain, not the general, faux-sci-fi, plastic-friend-who's-fun-to-be-with AI à la ChatGPT and Bard. People want the AI equivalent of an industrial controller, not a do-everything desktop PC.
posted by RonButNotStupid at 8:39 AM on October 12, 2023 [2 favorites]


The stuff where I've seen AI work (other than providing new talking points for executives to be excited about and sound technically hip while siphoning money from a balance sheet) is rock-solid boring.

- Replacing clip art / stock photography in free-to-read material.
- Collating knowledge from a document bank and providing summaries to care agents and customers looking at a support website.

The rest of it feels like a massive version of ELIZA enrobed with a cheap Web3 VC Magic Shell.

The "writing" I've read - or prompted LLMs to "write" - is surface level at best if not wrong when trained on a generalist set.
posted by drewbage1847 at 9:12 AM on October 12, 2023


Rushing a consumer product to market and hand-waving away profitability = $XXX billion
Someone good with the economy help me, my company is dying

OpenAI didn't have to open up ChatGPT to the world, but they wanted the first-mover advantage. Github Copilot is extremely wasteful; it didn't have to be designed to autocomplete by default, everywhere, all the time. There is progress being made on model efficiency; they didn't wait for it. Stop hitting yourself, companies.

My toolset for coding includes docs, search, and now LLMs, and I don't see that going away. Other applications are surely coming and will be too tempting to resist, and more profitable than chatbots and autocomplete.
posted by credulous at 9:47 AM on October 12, 2023 [1 favorite]


OpenAI’s Revenue Crossed $1.3 Billion Annualized Rate, CEO Tells Staff:
ChatGPT maker OpenAI is generating revenue at a pace of $1.3 billion a year, CEO Sam Altman told staff this week, according to several people with knowledge of the matter. Altman’s remark implies the company is generating more than $100 million per month, up 30% from this summer, when the Microsoft-backed startup generated revenue at a $1 billion-a-year pace.
posted by gwint at 10:44 AM on October 12, 2023 [1 favorite]


Yeah, but if that $1.3b top line revenue is all going to compute power they won’t necessarily have a sustainable bottom line.
posted by m@f at 11:39 AM on October 12, 2023


The champion AI companies will be worth tens of trillions if they succeed in even a moderate number of use cases. The non-champion tech companies will be a shadow of themselves if not out of business altogether. Not investing heavily in AI and robotics now is slow motion suicide.

I'm not saying you're wrong, but it is worth noting that this reasoning covers at least two legs of the fraud triangle.
posted by Not A Thing at 12:51 PM on October 12, 2023 [6 favorites]


I think the cost issue is a red herring. For one thing, startups expect to lose money. It took five years for Facebook to make a profit. More importantly, they're running on early technology. Once people start spending time on the question, you'll see them start to optimize, at both the hardware and software levels, which will probably bring costs down by an order of magnitude.
posted by CheeseDigestsAll at 12:54 PM on October 12, 2023 [1 favorite]


My headdesk of the day was learning that Google [and others] will indemnify its AI users against copyright claims.

Getty's lawyers must be salivating. All they gotta do is what they've already been doing -- catch the data theft, demo it showing up in actual use, CASH IN. $150K statutory damages per infringement!

This is either absolutely necessary to get enterprises to sign on, a monumental source of liability for generative-AI purveyors... or quite likely both.
posted by humbug at 1:12 PM on October 12, 2023 [1 favorite]


The champion AI companies will be worth tens of trillions

That would be 10x where FAANGs are now, and seems slightly unrealistic -- one good way to think about a company's valuation is the present value of its earnings over the next 30 years. Even with a 5% discount rate, half that value is coming from years 10-30. So not only do you need 100 dollars in profit from every person on the planet, every year, you also need every would-be competitor to look at the situation and decide a slice of $10 trillion isn't worth it.
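
(A back-of-the-envelope Python sketch of that discounting arithmetic. The flat-earnings assumption and the $10 trillion / 8-billion-people figures are mine, purely to check the scale, not anyone's actual model:)

    # Present value of 30 years of flat annual earnings at a 5% discount rate.
    r = 0.05
    factors = [1 / (1 + r) ** t for t in range(1, 31)]
    pv_multiple = sum(factors)                    # ~15.4x one year's earnings
    late_share = sum(factors[9:]) / pv_multiple   # factors[9:] covers years 10-30
    print(f"share of value from years 10-30: {late_share:.0%}")  # ~54%

    # Flat earnings needed to justify a $10T valuation (low end of "tens of trillions").
    valuation = 10e12
    world_pop = 8e9
    annual_profit = valuation / pv_multiple       # ~$650B per year
    print(f"~${annual_profit / world_pop:.0f} of profit per person on Earth, every year")  # ~$81

Even at the low end of "tens of trillions," that works out to roughly $80 of profit per human per year, sustained for decades, and proportionally more above that.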

Don't get me wrong, LLMs have a chance to make good on a lot of promises already made by tech. And unit economics are somewhat difficult to discern from the outside when the chief cost is the fixed cost of training foundation models. But to the extent that humans can comprehend large numbers this one looks to be on the wrong side of the line dividing optimism and hyperbole.

IMO, the winners won't be the company with the best model -- their margins will eventually be at the mercy of the second-best model. The winner is going to be the company that sells HW to train these models. There's a reason NVIDIA is trading at 113.06x the past 12 months' earnings.
posted by pwnguin at 1:54 PM on October 12, 2023 [4 favorites]


$150K statutory damages per infringement!

They will carve out intentional infringement.
posted by snuffleupagus at 2:22 PM on October 12, 2023


The champion AI companies will be worth tens of trillions if they succeed in even a moderate number of use cases.

The total US equities market is under $50 trillion. AI has the potential to make some of the companies and businesses contributing to that total more profitable.

But the idea that "moderately" successful AI companies will basically capture all existing value is a stretch, to say the least. Anyone pitching that valuation is imagining a 100% transformation of the world economy to be AI-centric. Given we still need real-world goods and services and construction, I'm not even sure how to imagine that level of change, at least outside an SF novel.
posted by mark k at 3:01 PM on October 12, 2023 [5 favorites]


There's a reason NVIDIA is trading at 113.06x the past 12 month's earnings.

>be me
>realize that CUDA powers everything in ML
>don't buy any stock
posted by credulous at 4:52 PM on October 12, 2023 [6 favorites]


What are you for?

You, the reader.
posted by Flunkie at 5:45 PM on October 13, 2023


To the folks saying "AI is definitely the future, stop resisting" - how's your cryptocurrency portfolio doing?
posted by signsofrain at 7:35 AM on October 14, 2023 [2 favorites]


OpenAI in Talks for Deal That Would Value Company at $80 Billion (at its "peak" FTX was said to be worth $32B)
posted by gwint at 12:16 PM on October 20, 2023

