Ad Astra Machina
January 7, 2023 9:44 PM

"I’m not threatened in my career by A.I." psychologist Jim Picano explains. Though, 'Online mental health company uses ChatGPT to help respond to users in experiment — raising ethical concerns around healthcare and AI technology." (via Mastodon)
posted by clavdivs (26 comments total) 11 users marked this as a favorite
 
That tweet thread from the startup founder is pretty revealing. Wonder if they do a lot of educating on ethical issues in research for PhDs in media arts at MIT...

This is basically the plot of Zachtronics' visual novel Eliza, which is a somewhat unsubtle but still interesting take on the vapidness of the tech industry. Life imitating art...
posted by okonomichiyaki at 5:19 AM on January 8, 2023 [2 favorites]


Look Dave, I can see you're really upset about this. I honestly think you ought to sit down calmly, take a stress pill, and think things over.
posted by fairmettle at 6:01 AM on January 8, 2023 [21 favorites]


dAIsy, dAIsy, give me your answer do...
.
posted by lalochezia at 6:10 AM on January 8, 2023 [3 favorites]


From Mastodon:
"i suppose this is also the time to note that the Media Lab was founded as the pseudo independent entity it is because Negroponte hated doing grant reporting and ethics review and a substantial part of its whole _vibe_ is dropping untested technologies on populations with little supervision and nebulous consent."

"it is really not surprising to me that someone who got their PhD there in 2014 (so we would have overlapped) thinks that this is an ok way to treat people"
posted by mhoye at 6:33 AM on January 8, 2023 [16 favorites]


At least the Media Lab is staying consistent in its extreme disregard for decency/ethics.
posted by Philipschall at 6:38 AM on January 8, 2023 [4 favorites]


Sorry, the Media Lab critique I linked may be overly tangential to the article here... feel free to disregard.
posted by Philipschall at 6:44 AM on January 8, 2023


No; their pictures even look eerily similar, but the "Morris Worm" Morris was Robert T.; our guy here is Robert R.
posted by mhoye at 8:03 AM on January 8, 2023 [2 favorites]


For the past few months, a running theme with this new generation of AI software is that training data is either extremely expensive to acquire or taken without the knowledge/consent of the people providing it (to avoid the expense). I'm not surprised that we'd see an equally reckless protocol for this experiment. Who needs the Turing test if you can just dehumanize real people?
posted by picklenickle at 8:37 AM on January 8, 2023 [9 favorites]


My first thought on seeing this on the Bird Muskrat Site was:

Nothing says helping people deal with alienation in capitalist techno-society like...
Helping them do therapy via a robot.
posted by symbioid at 8:37 AM on January 8, 2023 [2 favorites]


I think the link to the Media Lab story is relevant, and points to a larger trend: the people and organisations who were lauded have been utterly, vacuously free of morals. The tech bubble is rapidly collapsing, and the areas left to exploit - NFTs, AI, VR - are all looking pretty shaky in both their implementation and the values behind them.
posted by The River Ivel at 10:29 AM on January 8, 2023 [1 favorite]


Psychologist Jim Picano is either whistling in the dark, hasn't paid enough attention, or is just unable to face the truth.

His job is threatened by AI. As is mine. As is yours. Everyone's job is threatened by AI.

I work in IT. 99% of what I do is following well known decision trees to resolve known issues; the only hitch is that often this is done by communicating over the phone with aggressively non-technical people who appear to shut down their brains and won't even read the text in an error message when it tells them exactly how to resolve the issue.

I could be replaced by a sufficiently advanced chatbot, and I expect I will be sooner or later.

Lawyers never talked much about their jobs being threatened by automation, but their jobs were largely taken by automation almost without anyone noticing. It's a force multiplier, not a total replacement, but today one lawyer does what once took three or five lawyers to do.

Doctors used to think they were irreplaceable. Now we've found that computers can diagnose vastly better than any human doctor can, especially in some areas such as examining samples and imaging for cancer.

AI therapists seem out there, but I don't think the idea is really all that unusual or weird. We've come a long way since ELIZA, and even back in the 1970s people reported feeling better after spending time with that incredibly simple chatbot.

And? There's a mental health crisis worldwide. Assume an AI therapist is only 70% as effective as a human therapist. OK, isn't giving more or less anyone who wants it free 70% effective therapy better than therapy being something limited to people with money? If something is worth doing, it's worth doing badly.
posted by sotonohito at 10:47 AM on January 8, 2023 [10 favorites]


I find the obsession with AI puzzling. It's as if we thought the point of human beings is to get some sort of abstract formulaic "work" done rather than for human beings to be able to live rich lives. I feel as if there is some alternate reality where bots are making pictures, writing essays and stories, and having conversations with one another in some kind of hallucinatory simulacrum of society.
posted by Peach at 11:03 AM on January 8, 2023 [12 favorites]


The plot of ELIZA

Also the reason the original ELIZA program was written!
posted by metahacker at 11:05 AM on January 8, 2023


Peach, I'd say the object isn't really to displace people from creative tasks; that's more a sort of side effect of developing an AI sophisticated enough to deal with the messiness of the real world and do the sorts of jobs we'd like to automate away because they're dull, grinding, and miserable.

No one wants to be a call center operator, but we've got to be able to deal with all the stuff that call centers handle somehow. A sophisticated enough chatbot could deal with almost everything your average call center does (customer complaints, refunds, tracking, orders, service requests, updates on services or shipments, etc). It's almost all just a matter of parsing the question, searching a database for the answer, and giving that answer to the user.
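
A rough sketch of that parse-the-question, search-a-database, return-the-answer loop, in Python. Everything here is hypothetical (the FAQ entries, the keyword-overlap scoring, the fallback message), and a production call-center bot would use intent classification or embedding retrieval rather than bag-of-words matching:

import re

# Hypothetical FAQ "database": canned questions mapped to canned answers.
FAQ = {
    "where is my order": "You can track your order from the Orders page in your account.",
    "how do i request a refund": "Refunds can be requested from the order details page within 30 days.",
    "how do i change my shipping address": "Open the order and choose 'Edit shipping address' before it ships.",
}

def tokenize(text):
    # Lowercase and split into word tokens, dropping punctuation.
    return set(re.findall(r"[a-z']+", text.lower()))

def answer(question):
    # Pick the canned entry whose keywords best overlap the question.
    words = tokenize(question)
    best_key = max(FAQ, key=lambda k: len(words & tokenize(k)))
    if not words & tokenize(best_key):
        # Nothing matched at all: hand off to a human.
        return "Let me connect you with a human agent."
    return FAQ[best_key]

print(answer("Where is my order? It still hasn't arrived."))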

Similarly, no one wants to work in fast food. Or clean toilets. Or take out the garbage. Or any of the tens of thousands of necessary but awful jobs that must be done to keep a society going. Developing a sub-sapient AI that could pilot mobile devices to do those things would allow billions of people to be relieved of drudgery.

We're at the beginning of the process, so the results now look kind of haphazard and almost pointless. As AI becomes more capable, the use cases will expand to more and more useful things.

The other thing to remember about automation is that it doesn't have to be as efficient as human labor, because it keeps going forever without anything but a bit of maintenance (if that can also be automated....)

The physical stuff takes longer because, well, physical stuff takes more machinery, and more complex machinery, while a chatbot or an artbot or whatever is just a matter of code. So we're seeing more of that at first, too.

I think we're at a tipping point. Boston Dynamics and similar robotics research outfits have shown that we can build robots that are capable of complex physical tasks. The AI research shows that we can build software that can direct those robots through complex tasks.

And then the political questions start getting really big, because as more jobs are automated out of existence, more people will be permanently out of work. It won't be like in the past, where new jobs kept appearing despite jobs vanishing due to automation. We're going to see the jobs just vanish.

Then we'll see if we can force the billionaire class to disgorge enough of their hoarded wealth to fund a UBI and so on, or if there's a revolution.
posted by sotonohito at 11:26 AM on January 8, 2023 [2 favorites]


sotonohito, I'm not so sure that eliminating physical jobs is a worthy goal. The introduction of all these "freeing" technologies has not ended up making life more complete or less stressful; enacting schemes wherein human beings can free themselves to do more enlightened work seems to align more with Vonnegut's (or Philip K. Dick's) ideas than with St. Thomas More's.

[Don't mind me, have been encountering airport bathrooms in which automation and motion sensing have produced soap dispensers that never have soap in them, so that a bottle of some perfumed slop sits sticky and unsanitary on the wet counter. My hobbyhorse is that the problems of modern society will be solved not by more developed technology and computing power, but by investing in maintenance, repairability, and sustainability. And shorter work days and weeks. Yes, I digress.]
posted by Peach at 12:21 PM on January 8, 2023 [11 favorites]


isn't giving more or less anyone who wants it free 70% effective therapy

What on Earth makes you think it's going to be free?
Or do you think it'll be funded by selling the data harvested from these auto-therapy sessions? Because that's not actually free.
posted by thatwhichfalls at 12:26 PM on January 8, 2023 [13 favorites]


Nobody should be threatened by AI because it doesn't exist.
posted by GallonOfAlan at 1:35 PM on January 8, 2023 [5 favorites]


Mostly the effect of automation (using this broadly here since "AI" is not the same as previous digital automation and definitely not the same as physical automation) is that it turns humans into QCers.

Machine translators--clients dump their text into Google Translate and then ask professional translators to proofread it. For years, I've heard translators lamenting that they don't like this.

AI radiology--a radiologist still needs to check the images to make sure nothing was missed. I'm not a healthcare worker but this seems redundant to me.

High-throughput liquid handlers, auto-samplers, etc. in biotech--you still need to troubleshoot the instrument and QC the results. I'm fine with these because nobody in biotech is passionate about pipetting or getting carpal tunnel.

Mo-cap animation--the animator is left to fix hundreds of little polygons and make other fine adjustments.

AI 2D images--you need to fix all the little wonky fingers/teeth/hair/etc. I, personally, think this is the Juicero of creative endeavors (you open up the juicer and it's a bag of juice made by humans.... but worse because it's unpaid labor juice).
posted by picklenickle at 2:04 PM on January 8, 2023 [8 favorites]


In the service sector, the jobs that AI will take first are the jobs that lots of people like: indoor, seated, and at a Goldilocks-porridge temperature of cognitive requirements, demanding enough that employers find it in their interest to give those who can do them a full-time living wage, but not so demanding that the jobs are out of reach for people who lack high-level cognitive aptitude, or who have that aptitude but can't deal with the stress / future time orientation it takes to mature it into capability or deploy it consistently.

Those are jobs that are linked to any number of other sectors in the economy. Credentialing people to get these jobs is the raison d'être of community colleges and commuter state schools. Small tradesmen can ply their trade mostly because they have spouses who get benefits and steady paychecks doing these kinds of jobs.
posted by MattD at 2:14 PM on January 8, 2023 [1 favorite]


And as for the AI therapy--if we ignore the totally messed up way they did their experiment--the use seems to be therapists overseeing and editing chatbot text. I feel like this must have only niche uses. I want to be open to new technology but what would I get out of the canned response? It's like a self-help book or self-examination quiz but in conversational form.
posted by picklenickle at 2:16 PM on January 8, 2023 [1 favorite]


A lot of the success of therapy depends on a positive relationship between the therapist and the client, and I think a good chunk of that comes down to being able to believe that your therapist is a person who genuinely cares about your well-being (even if it's caring based on a transactional relationship). The failure mode of this, I think, is not that you get therapy that's 30% worse (or whatever number you want to put on it) - it's that you have people who are in desperate need of genuine kindness and are left thinking "there is no one in the world who actually cares about me, even my therapist is just an algorithm putting together words that are supposed to make me feel better."

But is it better than nothing? In some edge cases, maybe, possibly? I keep thinking about the use of Paro in care settings, and how a lot of the patients who have had a Paro seem to get real comfort and affection from it while acknowledging it's just a machine (Shannon Strucci's video "Fake Friends" goes into some detail on this; it's a very long video, but this should be approximately the right timestamp). Better than nothing, but... it's a problem if flashy technology distracts us from the fact that people need to be cared for by other people. The difficulty of finding therapy isn't some immutable property of the world; we could improve access by investing in it, if we had the collective will to do so.
posted by Jeanne at 2:47 PM on January 8, 2023 [2 favorites]


I was feeling pretty disturbed by the manner of this experiment, but I've talked through most of my feelings with ELIZA, so now I'm better
posted by Tom Hanks Cannot Be Trusted at 4:57 PM on January 8, 2023


therapy’s effects are not restricted to either helping or not helping, and the cost is not limited to time and money. very bad therapy does actual harm to people, including but not limited to, you know, killing them.

hey, 70 percent is a terrific optimistic helpfulness number from the land of pure imagination. speaking of, drugs save lives. do you want to put your arm into a grab-bag full of psych drugs and cross your fingers that you’ll pull out one of the seven prozacs and not one of the three haldols? and don’t answer too quickly! bear in mind this wonderful bag of drugs is free
posted by queenofbithynia at 4:58 PM on January 8, 2023 [5 favorites]


I'm not so sure that eliminating physical jobs is a worthy goal. The introduction of all these "freeing" technologies has not ended up making life more complete or less stressful; enacting schemes wherein human beings can free themselves to do more enlightened work

For your first point, we are still struggling under the burden of an oppressive capitalist billionaire class, and we haven't been ALLOWED to be freed from any work or have less stress. Also, we're not actually there yet; we're approaching the age of a new leisure class, but it isn't here yet.

Your second point, though, is one of those ones I keep running into and being baffled by. People act very confused about what we might do once we are free from work, or express concern that being free from work will somehow lessen us. And I hear the "we can't ALL be poets/artists/musicians/whatevers" thing a lot. Or there are claims that without being kept working we'd all start killing each other or something.

But the thing is, there's no mystery about what people do when they're freed from the shackles of work, we've had several thousand years of direct observation.

What people do when they don't have to work is detailed in any Jane Austen novel, or the Jeeves and Wooster novels by Wodehouse, or hell the non-stabby parts of the Tale of Genji. There's no mystery about what people will do when free from work, there's no dark future of humans hunted for sport because we were so foolish as to let the proles up for a breath of air.

All that is just propaganda from the rich who don't want us to join them.

There's going to be endless loafing about and going to various parties and being extremely involved with our social lives and those of our friends and families. All while goofing around with whatever passing interest catches our fancy.

No, everyone won't become a poet or artist or whatever, they'll be idle layabouts like the Kardashians, or the Bennets, or the Woosters, or the Windsors, or any of the others from the multitudes of the wealthy idle class that has always existed for our entire written history.

Though probably with more sex.
posted by sotonohito at 5:02 PM on January 8, 2023 [3 favorites]


I am a peer support specialist, the kind mentioned in the article. I have also worked in IT some over the years. This idea is crap straight out of the box. If you think mental health work can be done using “well known decision trees to resolve known issues” then you don’t understand the craft at all. This is going to do for therapy what “full self-driving” is currently doing behind the wheel.

I swear all I ever hear about from the IT sector anymore amounts to moving fast and breaking things, and bitcoin, and running away to Mars once it’s all broke and they’ve harvested all the wealth.

Maybe slow down a little and quit breaking things for a spell why doncha?
posted by cybrcamper at 4:26 AM on January 10, 2023 [3 favorites]


No, everyone won't become a poet or artist or whatever, they'll be idle layabouts like the Kardashians, or the Bennets, or the Woosters, or the Windsors, or any of the others from the multitudes of the wealthy idle class that has always existed for our entire written history.

Though probably with more sex.


See the Culture stories by Iain M. Banks. Although actually most of his stories featured characters working on edge cases in Special Circumstances, and not the mass of beings living in the Culture.
posted by rochrobbb at 4:29 AM on January 11, 2023



