The Therapy Equivalent of Uber
August 9, 2020 9:48 AM   Subscribe

Several serious ethical crises emerging at online therapy companies like Talkspace and BetterHelp are gaining mainstream attention. Consumer beware: there are credible allegations of fake reviews on a grand scale (NYT), a CEO who openly advocates data mining of "confidential" (and undeletable) therapy transcripts (NYT), blatant HIPAA violations such as openly revealing patient emails (Forbes), and much more.

At Talkspace, Start-Up Culture Collides With Mental Health Concerns (NYT): The therapy-by-text company made burner phones available for fake reviews and doesn’t adequately respect client privacy, former employees say.

"We need data. All of our data. Mine and yours." (NYT): Talkspace CEO pens op-ed supporting mining data from patients' undeletable therapy transcripts.

Talkspace Reveals Clients' Email, Violating Clinical Confidentiality (Forbes): "Talkspace explicitly defines itself as a “Platform.” It is a business with customers and not a healthcare provider with clients or patients. And it is as a business they promise both anonymity and confidentiality. They use the language of a clinical relationship. But that is not what Talkspace is. When Talkspace promises confidentiality it is done with all the limitations in trust inherent in any company’s marketplace promises. Talkspace defines itself and is only accountable as a business, not as a healthcare provider."

BREAKDOWN: Inside the messy world of anonymous therapy app Talkspace (Verge): This lengthy expose discusses how mental health counselors are exploited via low pay, unpredictable pay, unmanageable hours, and being forced to violate professional ethics in service of business concerns. It also details patient rights violations: everything from leaky privacy, professionally irresponsible anonymity, predatory/bogus charges, patient abandonment, and overall failure to provide medical care while misleading patients.

YouTube’s BetterHelp mental health controversy, explained (Polygon): Not only do many of YouTube's celebrity creators have lucrative sponsorship deals with BetterHelp, but YouTube itself has ties with BetterHelp and often displays ads for BetterHelp under the videos of any creator discussing mental health. Backing away from the sponsorship does mean a loss of impressive affiliate money for creators; that’s part of the issue. Most of the videos that have BetterHelp sponsorships revolve around a creator discussing their own issues. While these are valid, viewers have complained that it feels like profiteering off mental illness at best, and causing serious harm to mentally ill people at worst. (BetterHelp’s terms of service state that the company can’t guarantee a qualified professional. “We do not control the quality of the Counselor Services and we do not determine whether any Counselor is qualified to provide any specific service as well as whether a Counselor is categorized correctly or matched correctly to you.”)

Therapy app Talkspace accused of ethically questionable practices (MobiHealthNews): "The notion that the company can read the chats – and isn't entirely clear about the circumstances in which it does – raises concerns about privacy and confidentiality and may have spurred the August HIPAA complaint by a therapist on the platform. It also threatens to undermine the trust relationship between patients and therapists. So does the notion that a therapist might be required to insert a script into a therapy session, promoting or advertising additional Talkspace services – a practice The Verge says Talkspace has engaged in."
posted by MiraK (49 comments total) 54 users marked this as a favorite
 
As a therapist this is terrifying. As a patient this is even more terrifying. In sum, terrifying.
posted by AlexiaSky at 9:51 AM on August 9, 2020 [32 favorites]


This, in particular, blows my mind: "Users can’t delete their [therapy] transcripts, for example, because they are considered medical records." I remember when I was in therapy, the psychologist who treated me put an enormous effort into writing up a medical record for me that divulged the bare minimum, in the vaguest terms possible, in order to protect my privacy, precisely because he was required by law to keep it on file for many years and possibly reveal it to courts if required to by a judge. So my record was full of session notes that read like: "Patient reports anxiety over a recent concerning event. We discussed relational and personal aspects of the event and connected it to childhood events. Psychodynamic and cognitive interventions were used." And that's it.

The idea that Talkspace stores THE ENTIRE TRANSCRIPT of every therapy session AS THE MEDICAL RECORD is ... yeah, terrifying.
posted by MiraK at 9:55 AM on August 9, 2020 [62 favorites]


Sweet Jesus Fuck.
posted by seanmpuckett at 10:19 AM on August 9, 2020 [3 favorites]


The best way to protect data is to not collect it at all, yet every tech system in wide use is designed to store every piece of data that passes into it.

The prospect of texting therapy gave me the willies the first time I saw ads for it, and for similar non-therapy chatbots: who knows where that data is going?

I shouldn't feel smug for being vindicated in every way, but I absolutely do. This sort of data collection and mining was what these sorts of chat systems were designed to do; using them for therapy is very nearly as bad an idea as electronic voting.

This isn't to excuse anything that Talkspace has done. There are ways to keep this sort of thing secure. But you have to actually want to. The default is to just collect everything.
posted by BungaDunga at 10:23 AM on August 9, 2020 [8 favorites]


I had been considering BetterHelp, but now I am not. Looks like I will be going by word of mouth with a specific provider over a custom encrypted videoconferencing solution that I control.
posted by grumpybear69 at 10:47 AM on August 9, 2020 [9 favorites]


This is wrong, and pardon the phrase, but who in their right mind thought online therapy startups, especially with webbie names like Talkspace, BetterHelp, etc., would be ethical? Can we expect future companies to have more ironic winky names like BoxedWhine? Complete with 40-page Terms and Conditions, embedded targeted ads for sleep-pills, herbal anxiety supplements, pep pills, robot dogs, virtual cats, food-delivery services, spa coupons and the like? Truly stunning.
posted by Chickenring at 10:56 AM on August 9, 2020 [11 favorites]


I seriously considered these kinds of services when I was looking for a therapist after my partner died recently. I decided not to use them after reading an article about how badly they treat the therapists and deciding I wanted a relationship that wasn't controlled by an internet startup.

However, it's a real shame there are such huge ethical problems with them, because I can see how they could fill a need. Finding a therapist is a horrible experience.

Between the frustrations of insurance, cost, and finding someone who could accommodate my not-super-flexible full-time job, I was very frustrated. A lot of therapists aren't up front about accepting insurance (not differentiating between being in network and being willing to facilitate out-of-network coverage), or just don't respond to messages, or don't offer any appointments outside of M-F 9-5. Trying to arrange an appointment can also be intimidating.
posted by SpaceWarp13 at 11:06 AM on August 9, 2020 [38 favorites]


Yeah, finding a therapist is really daunting, and a lot of people look for therapists when they're dealing with stuff that gets in the way of doing daunting tasks. I think a lot of the appeal of these things is that they make it easy to get started, and they make it easy to fit therapy (or "therapy") into your life. And there's probably a need for systems that make it easier to find a therapist and figure out ways to make therapy work with your other commitments, but this isn't it.
posted by ArbitraryAndCapricious at 11:32 AM on August 9, 2020 [12 favorites]


The prospect of texting therapy gave me the willies the first time I saw ads for it, and for similar non-therapy chatbots: who knows where that data is going?

Same. To be honest the possibility of saved recordings makes me shy away from telemedicine in general, which might wind up being a problem in these times.
posted by trig at 11:33 AM on August 9, 2020 [2 favorites]


I see these advertised all over the internet and they are aimed at a very real (and clearly lucrative) market of people who need some level of therapy but for whom the cost and intimidation factor of "traditional" therapy is too much. Instant & easy access to support like this feels like a boon, but the reality sounds like they're selling the psychological equivalent of "miracle diets" with a nasty side of data mining.

To push advertising during a global pandemic and pending recession is just so nakedly predatory that I don't quite have the words for it.
posted by slimepuppy at 11:38 AM on August 9, 2020 [2 favorites]


Well, this certainly explains a lot:

Oren Frank is an Israel native and is the co-founder and chief executive officer of Talkspace, an online therapy platform making therapy accessible and affordable to all. Prior to co-founding Talkspace, Frank worked for various subsidiaries of McCann Worldgroup, a global marketing service organization.
posted by armoir from antproof case at 11:43 AM on August 9, 2020 [8 favorites]


Same. To be honest the possibility of saved recordings makes me shy away from telemedicine in general, which might wind up being a problem in these times.

Telemedicine should not be recorded. Providers really don't have space for it anyway, since video at a conservative 1x CD rate (1.41 Mbps, the lowest probably acceptable for archival) is over 600 megs an hour. JFC, the sheer amount of space needed to record every medical session in the United States. Fifteen minutes a session, four times a year, times 350 million Americans gives roughly 220 petabytes of video per year. Factor in redundancy and that's at least 10% of the hard drive industry's current production, if that production were channeled into 16TB drives.
posted by Your Childhood Pet Rock at 11:44 AM on August 9, 2020 [3 favorites]
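The back-of-envelope figures in the comment above can be reproduced in a few lines. This is a sketch using only the assumptions the commenter states (1.41 Mbps video, four 15-minute sessions per person per year, 350 million people) with decimal SI units; the variable names are mine:

```python
# Sanity check of the storage estimate: bitrate x duration x population.
# All figures are the comment's assumptions, in decimal (SI) units.

BITRATE_BITS_PER_SEC = 1.41e6              # 1x CD rate, 1.41 Mbit/s
bytes_per_hour = BITRATE_BITS_PER_SEC / 8 * 3600

hours_per_person_year = 4 * (15 / 60)      # four 15-minute sessions
people = 350e6

total_bytes = bytes_per_hour * hours_per_person_year * people

print(f"{bytes_per_hour / 1e6:.0f} MB per hour of video")   # ~634 MB
print(f"{total_bytes / 1e15:.0f} PB per year")              # ~222 PB
```

The hourly figure lands just over 600 MB, and the yearly total comes out around 220 PB — the same ballpark as the figure in the comment (it shrinks to roughly 197 if you count in binary pebibytes instead).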


@YCPR, they don't need to record every session for every American in order to be successfully exploitive. A very small subset will do just nicely, and 'storage is cheap'.
posted by armoir from antproof case at 11:55 AM on August 9, 2020 [2 favorites]


I was in San Francisco for the RSA cybersecurity conference earlier this year and remember one of the Muni stations was completely converted, wall to wall, into a giant advert for Talkspace (with Olympic swimmer Michael Phelps as their spokesperson plastered everywhere). I remember thinking at the time that was like a red flag to a cybersecurity bull “hey come hack us - we’ll have the most personal sensitive data on people to use against them - not just normal PII - but people’s deepest, darkest thoughts, fears, and desires.”
posted by inflatablekiwi at 11:58 AM on August 9, 2020 [15 favorites]


Thanks, MiraK, for posting this. Important information. And, jeez, timely for me. Just last night I signed up for Curable (for chronic pain management), and as I was doing so, little alarm bells were going off in my head. But did I heed them? No. Because PAIN. Now I am thinking I really need to go back and read those T&Cs more closely.

Why can't we have nice things? Because 'entrepreneurs', that's why.
posted by armoir from antproof case at 12:00 PM on August 9, 2020 [9 favorites]


Telemedicine should not be recorded.

Of course it shouldn't be. And I'm fairly sure it... almost never is. But I've also had enough creepy encounters with doctors that I find the possibility really uncomfortable. And I'm not sure something like audio transcripts of a psych appointment would feel any better.

Anyway, sure, it's paranoia. But life over the past few decades has been a series of discoveries that some tech paranoia turned out to be justified, and at this point it's just one more thing I find myself really reluctant to deal with.
posted by trig at 12:01 PM on August 9, 2020


like a red flag to a cybersecurity bull “hey come hack us - we’ll have the most personal sensitive data on people to use against them - not just normal PII - but people’s deepest, darkest thoughts, fears, and desires.”

And then there's this -- Talkspace threatened to sue a security researcher over a bug report (TechCrunch) - "Talkspace does not offer a way for security researchers to submit bugs. ... Within hours of Jackson publishing his findings on his blog — which TechCrunch has seen — Talkspace sent Jackson a cease and desist letter, accusing the researcher of defaming Talkspace “by broadcasting untruths” in his blog post."
posted by MiraK at 12:08 PM on August 9, 2020 [7 favorites]


After reading the articles, it seems that while BetterHelp has some TOS issues around not guaranteeing licensure (and when seeing someone in person, you should be checking), for a platform servicing a large geographic area with different therapist requirements, licensures, and boards, there's not a simple way to 100 percent guarantee that everybody on the platform is actively licensed at all times. You can check at onboarding, but in general practitioners are required to self-report licensure changes and stop practicing on their own. My employer only checks my status once every two years. And the discussion of HIPAA seems to be more a Talkspace problem than a BetterHelp problem. Not that it's not possible, and not that it's not happening (I wouldn't know!), but all but one of the links are specifically about Talkspace, while the one about BetterHelp is way less damning.

Either way, the way data is being handled at Talkspace is still terrifying. And it shouldn't be happening. And its practices have impacted, and will continue to impact, how people seek out services. There is much to be said about ease of access to therapy, but privacy is absolutely paramount and stigma is far too real. A therapeutic relationship in which one feels they cannot be honest is not going to be successful.
posted by AlexiaSky at 12:11 PM on August 9, 2020 [1 favorite]


The video conferencing service my provider uses is Doxy.me, which explicitly says it doesn't record anything and doesn't give providers any tools to record anything. Sure, they might be lying, but I think it's unlikely. Their entire business seems to be providing HIPAA-compliant videoconferencing, and they're not controlling my relationship with my provider, so I'm comfortable enough with it.

Talkspace has set itself up as controlling the entire relationship, which is just nuts. They control both ends and all the data that flows between. Just a recipe for disaster.
posted by BungaDunga at 12:19 PM on August 9, 2020 [10 favorites]


a CEO who openly advocates data mining of "confidential" (and undeletable) therapy transcripts (NYT)

I find the idea of a startup therapy-by-text service ipso facto BAD. However, the link to the opinion piece is not as described. The idea of using big data for something positive - instead of just for selling crap, or bringing the hammer of the state down hard on some poor bastards, or in general for exclusionary and discriminatory purposes - is not a bad one in my opinion. The idea of doing it with transcripts of therapy sessions is just stupid, and maybe in the end this guy thinks that would be a good idea, but there is no mention of it in what is linked.

I think that one of the largest problems with therapy is that it is not terribly compatible with a market-based financial model. Attempts to use "tech" or disruption to make quality therapy more broadly available basically cannot succeed.
posted by Pembquist at 12:27 PM on August 9, 2020


Pembquist, the first article linked explains the context for the op-ed - and my description was putting the information from both pieces together. See:
Talkspace’s website promises users that their conversations will be “safe and confidential,” but people may not have as much control as they might think over what happens to their data. Users can’t delete their transcripts, for example, because they are considered medical records.

Talkspace’s privacy policy states that “non-identifying and aggregate information” may be used “to better design our website” and “in research and trend analysis.” The impression left is a detached and impersonal process. But former employees and therapists told The Times that individual users’ anonymized conversations were routinely reviewed and mined for insights.

Karissa Brennan, a New York-based therapist, provided services via Talkspace from 2015 to 2017, including to Mr. Lori. She said that after she provided a client with links to therapy resources outside of Talkspace, a company representative contacted her, saying she should seek to keep her clients inside the app.

“I was like, ‘How do you know I did that?’” Ms. Brennan said. “They said it was private, but it wasn’t.”

... Talkspace also has been analyzing transcripts in order to develop bots that monitor and augment therapists’ work. During a presentation in 2019, a Talkspace engineer specializing in machine learning said the research was important because certain cues that a client is in distress that could be caught during in-person sessions might be missed when a therapist is only communicating by text. Software might better catch those cues.


Last year, Mr. Frank wrote an opinion article for The Times encouraging people to make their health data available to researchers. “We need data. All of our data. Mine and yours,” he wrote, arguing that analysis of anonymous data sets could improve treatment.

The anonymous data Talkspace collects is not used just for medical advancements; it’s used to better sell Talkspace’s product. Two former employees said the company’s data scientists shared common phrases from clients’ transcripts with the marketing team so that it could better target potential customers.
posted by MiraK at 12:36 PM on August 9, 2020 [3 favorites]


Pembquist, the first article linked explains the context for the op-ed - and my description was putting the information from both pieces together. See:

I understand that. I disagree with using the conclusion to describe the link. My own prejudice is that his op-ed is just jabber drivel of the self-aggrandizing sort. Sort of like TED talk peacockery. I suspect that pretty much everything is for sale, and the sincere concern he expresses for anonymizing data and regulation is chortleberg meant to communicate that he is on the side of the angels. However, all that said, I think you would have a stronger set if you did not use your conclusion as the title for a link that does not mention "confidential and undeletable transcripts" at all.

The company's practices violate the spirit of what he describes in his op-ed. Clearly the oiliness of the op-ed is the fact that while he makes a paean to making it a fundamental principle that data mining not be for profit, he clearly has no such scruples when it comes to his own business.
posted by Pembquist at 1:42 PM on August 9, 2020


Ooph, this seems like a space that could be well served by a small worker co-op just providing a platform for therapists to connect with clients...
posted by kaibutsu at 1:43 PM on August 9, 2020 [4 favorites]


Therapist professional organizations, if they aren't doing so already, should strongly warn therapists against signing up to provide services on these "platforms". It seems clear the legal and ethical issues are iffy at best, to say nothing of the devaluing and deprofessionalization of the field. Why would I, as a therapist, want some third party to mediate every single interaction between me and my clients? Traditional insurance panels are bad enough, but at least they aren't in the session with you (yet!) ensuring you're providing whatever minute-by-minute intervention they deem profitable to their bottom line. I'm not sure how much I'd trust a colleague who signs up for one of these services and in fact would probably be really wary of their judgment.
posted by flamk at 1:51 PM on August 9, 2020 [4 favorites]


Move fast and break people.
posted by acb at 2:47 PM on August 9, 2020 [23 favorites]


Ugh, flamk, it’s not just that - a recent call to my insurance company was trying to direct me away from video visits with my therapist on the video conferencing platform they use, to an entirely new service similar to these. It was phrased in such a way that it sounded like the only option for the video visits to continue being covered. I was told my therapist COULD join and provide me services through it. Or I could pick a therapist already on the platform! But my therapist’s health group already has video visits; why would they pay to use this other company? And why would I want to switch therapists because of covid?

I eventually got them to acknowledge the visits I had been getting would still be covered. But not without a lot of misleading information. Which I am sure was not the customer service rep's fault, but I also wonder: if I were not rather tech-and-healthcare savvy, would I have been able to decipher what was going on?
posted by [insert clever name here] at 3:03 PM on August 9, 2020 [15 favorites]


I would love to know if there are ethical, secure versions of this model. It would be especially helpful to hear from professionals who have experience with one of them.
posted by feckless at 3:40 PM on August 9, 2020 [3 favorites]


My wife is a therapist. She uses doxy.me, but Zoom and other major teleconference platforms also offer more expensive HIPAA-compliant versions of their products.
posted by postel's law at 4:27 PM on August 9, 2020 [3 favorites]


Shurely a therapy company that treats its therapists as disposable cogs for the mechanical turk will treat my records ethically.
posted by benzenedream at 4:53 PM on August 9, 2020 [5 favorites]


I would love to know if there are ethical, secure versions of this model.

I was thinking the same thing. Clearly it's a space that has some interest, even people in the thread. The question is, other than the obvious "don't do data mining", what does "right" look like? And can it make money without being gross?
posted by ctmf at 5:32 PM on August 9, 2020 [2 favorites]


This is awful and sickening. Even before COVID, finding a therapist accepting patients was a torturous, deflating, uphill battle even as a person with good health insurance and a stable decent income in a major US city. There is such a huge need.
posted by desuetude at 5:45 PM on August 9, 2020 [4 favorites]


I believe Kaiser Permanente is looking hard into buying a platform, as opposed to building one ground up. I can't remember the name being tossed around, but it wasn't Talkspace or BetterHelp. Or doxy.me (jesus, what an inspiring, relevant name). But, yeah, telemedicine, particularly for therapy/psychiatry/addiction, looks to be a big priority with them.
posted by 2N2222 at 7:23 PM on August 9, 2020 [1 favorite]


The uber model for anything/everything should be thrown in a trashcan and set on fire.
posted by nikoniko at 11:43 PM on August 9, 2020 [11 favorites]


This is wrong, and pardon the phrase, but who in their right mind thought online therapy startup

I mean, the United States used to have the post office as a bank, but then we got payday loan shops in every neighborhood instead. Don't blame the customer in need; focus on changing the society and government that promote rapaciousness.
posted by eustatic at 1:21 AM on August 10, 2020 [3 favorites]


Jesus. I don't need another thing to be enraged about this morning.

Back in the spring, when things were just starting to become a shitshow here in the US, apps like this started popping up all over my social media feeds, explicitly targeted at health professionals. Medicine and nursing have had a massive mental health problem, which is commonly described as "burnout" but might be better characterized as a form of secondhand PTSD. Most state boards require you to disclose on your annual renewal if you are seeing a mental health provider, so the stigma against seeking help is profound. Meanwhile, the suicide rate and substance abuse rate among health professionals is about double the general population.

So these apps were aggressively marketing themselves to fit that need -- anonymous (I never saw one claim confidentiality, which was enough to make me pass), on-demand, as brief or extensive as you needed. And if you "verified" with your National Provider Identifier, you could get a clinician discount, like 20% off or a month free or whatever. Right. As I read somewhere (maybe here on MeFi), if you're not the customer, you're the product.

The other thing that enrages me is that shit like this erodes faith in actual licensed therapists, and in actual secure telemedicine platforms, as seen in this very thread. At least at my employer, neither the original in-house telehealth platform nor Zoom for Telehealth allows recording, unlike regular/education Zoom, which we also use. That wouldn't stop someone from taking screenshots or using a second device to tape audio -- but frankly that can happen just as easily in an in-person visit. I've certainly had some people ask if we could record the visit because of their memory or whatever, and I've had some others surreptitiously record, which is just nasty (not the recording, the secrecy).

To summarize: for-profit healthcare, no matter the modality, is fundamentally unethical.
posted by basalganglia at 4:02 AM on August 10, 2020 [16 favorites]


I’m super uncomfortable with the victim-blaming streak in this thread. Yeah, those naive suckers who selfishly *checks notes* accessed mental health care when they really needed it, even if they live someplace far from any providers accepting new patients, or can’t access transportation to get to appointments. How dare they think they should be able to do that? They should’ve known they’re supposed to just keep suffering in silence, right?

MeFi always does this. “Well, I never used Uber.” “Well, I never used AirBnB.” “Well, I never liked Joss Whedon.” “Well, I never liked Harry Potter.” Well, bully for you, but maybe we could talk about the people actually doing bad things, as opposed to congratulating ourselves for not getting burned?
posted by snowmentality at 5:30 AM on August 10, 2020 [32 favorites]


I don't think there's victim blaming so much as the "can I get an amen" type of declarative signaling that happens in many communities. I think it tends to be a way that communities build a kind of camaraderie and identity. I actually try to avoid making those kinds of declarations in my posts, because I get suspicious of the kind of orthodoxy that can arise from such a dynamic, but even I sometimes fail to heed my own guidelines.

FWIW, I think I'm one of the few people to consistently defend Uber and the like, I'm not exactly sure what Joss Whedon has done, and I have no strong opinion on Harry Potter, though I enjoyed the movies with my kids.
posted by 2N2222 at 6:29 AM on August 10, 2020


It really sucks, but the truth is you flat out can't trust any middleperson between you and your doctor / therapist / dentist / restaurant / grocery store etc. They're all driven by profit and whatever industry they are leveraging for profit extraction is just incidental.
posted by grumpybear69 at 8:40 AM on August 10, 2020 [5 favorites]


Talkspace & Betterhelp are fee-for-service, meaning they don't accept healthcare insurance. It's a small market, since most Americans are covered by a health insurance plan (which, by law, must offer similar levels of mental health care to the levels of physical care they offer). I honestly don't know how many people can afford these platforms for anything more than a few months.

The good news is that your health insurance provider can direct you to a platform they use or authorize use of that have therapists covered by their insurance. This means you're just responsible for the usual out-of-pocket expenses. Most therapists taking insurance transitioned to one of these platforms due to the pandemic. It's the only way they could continue to make a living.

Because Talkspace & Betterhelp offer a wealth of contacts with the therapist per month, the amount a therapist makes there is much less than they would typically make seeing clients in a traditional practice. This means the platform is more attractive to therapists with less advanced degrees (e.g., a Master's or less, vs. a doctorate or MD).

Practically anything done online can be used for evil. The question isn't whether a company is doing some of that (virtually any online company pretty much is, especially stuff we use everyday like Google), but whether (a) they make you aware of exactly how they're using your data and (b) you're comfortable with that use.

Having said all that, I don't think it's a good idea whatsoever for any company to keep complete chat logs of a therapy transcript. That goes beyond the pale and, in my opinion, verges on the unethical.
posted by docjohn at 8:53 AM on August 10, 2020 [1 favorite]


> Talkspace & Betterhelp are fee-for-service, meaning they don't accept healthcare insurance. It's a small market, since most Americans are covered by a health insurance plan (which, by law, must offer similar levels of mental health care to the levels of physical care they offer).

Except that health insurance plans don't actually offer similar levels of mental health care to physical care. If you're in a bad place mentally but not in immediate danger of harming yourself or others, the wait to see a new therapist is often MONTHS.
posted by desuetude at 10:19 AM on August 10, 2020 [8 favorites]


I have health insurance (from the marketplace), and BetterHelp is less expensive than my co-pay.
posted by Four-Eyed Girl at 4:18 PM on August 10, 2020 [5 favorites]


Talking to Eliza- or a GPT-3 chatbot for that matter- would be at least cheaper and more private than this.
posted by Apocryphon at 5:40 PM on August 10, 2020


I'm not surprised, just really disappointed.

Like, getting talk therapy on the NHS is something that would take...well, let's be honest, years, because mental health services aren't really funded, and I looked into online therapy situations because I also really like typing instead of talking, especially if I'm trying to work through something.

I ended up not wanting to pay that much, but online therapy is such a great idea for a lot of people who aren't having a major crisis, but would just like someone who is trained and willing to listen.

So I'm just really disappointed that this latest batch of startups has all the problems of startups. I was hoping for something better, but I'm not really surprised.
posted by Katemonkey at 3:31 AM on August 11, 2020 [4 favorites]


If you're in a bad place mentally but not in immediate danger of harming yourself or others, the wait to see a new therapist is often MONTHS.

This is why I went with BetterHelp about three years ago, together with living in a place where therapists were thin on the ground and hard to access without a car. It was not great. I wasn't a match with the therapist I worked with, who wanted me to do affirmations and be more positive, and wasn't big on holding space or mirroring my feelings. Eventually I phased out with her because I just was not getting what I needed, but around that time I was able to access a better program in person.

Even so, I was considering trying one of these again and seeking a better fit this time because of similar circumstances. Now, well, now I can sit and spin, I guess.
posted by Countess Elena at 9:09 AM on August 11, 2020 [1 favorite]


what does "right" look like? And can it make money without being gross?

"Right" looks like putting people before profits. No, there is no way to make a profit off of people's suffering without it being gross. Health care should be a human right, not a privilege one has access to as long as it "makes money" for someone else.
posted by k8lin at 4:54 AM on August 12, 2020 [1 favorite]


Well, money has to come from somewhere, because doctors and therapists both need to be paid. US Medicare doesn't match market rates for services; US Medicaid doesn't even come close to market rates, that's why many doctors and therapists don't take it.

In short, these are specialized, time-intensive treatments administered by skilled and experienced professionals who've had many years of education & training. If a therapist can't make a decent living and be able to pay back their student loans, you'll find fewer & fewer professionals going into the field. Resulting in even higher rates, because supply & demand.

The field of psychiatry has seen the impact of this issue for over a decade. In most US states, trying to see a psychiatrist through your insurance plan is nigh impossible sooner than 3 to 6 months out. Therapy is much better, as long as you're not too picky about your therapist's training levels and what-not.

I will add one last thing -- like anything in life, you may not match best with the first therapist you try (whether online or in-person). Please, don't give up!! Therapy works best like many things in life when you find that one professional you click with. It may take 2 or 3 or more tries to find that person. But trust me.... when you do, you'll know it and you'll be able to accomplish some amazing things.
posted by docjohn at 8:32 AM on August 12, 2020 [2 favorites]


With good health insurance and organizational skills and intermittent persistence, I have been trying to find a therapist since February, when my dad died and then the world went to shit. I had to learn what an EAP is, find out who actually administered mine since my primary health insurance did not, struggle through filling out super-long questionnaires online every time I tried to hire someone, and left phone messages and sent emails that have not been returned. I got names from my doctor - none of them were taking new patients. I got names from my husband's therapist, none of which worked out, and his therapist then ghosted him sometime in April.

I have not been able to actually hire anyone. I did get one single appointment with someone in July, who then ghosted me when I tried to set up a second appointment with her.

I have been hearing ads for all these services on podcasts, and my primary health insurance pushes a telecounseling service of their own, but I have really been wanting to hire an actual person in my city that I could transition to seeing in person someday so I have avoided diving into them. I also had some reservations about not knowing how much the workers were being paid, or what their credentials actually are. It's even more disheartening to learn that they whole thing is a huge shitshow of the gig economy.
posted by See you tomorrow, saguaro at 9:58 AM on August 12, 2020 [2 favorites]


In most US states, trying to see a psychiatrist through your insurance plan is nigh impossible sooner than 3 to 6 months out. Therapy is much better, as long as you're not too picky about your therapist's training levels and what-not.

So, let's dispense with the idea that mental health care is offered at a level even remotely equivalent to physical health care, shall we? For most physical diseases or injuries, you don't wait 3-6 months for the consultation to even consider medication. Preceded by a "better" time frame (which is still often months) to even begin to get a diagnosis and treatment plan. Hell, a broken arm isn't typically imminently fatal, but they don't send you home from the ER and tell you to just buck up and call around (with your other arm) for doctors accepting new patients.
posted by desuetude at 11:29 PM on August 12, 2020 [1 favorite]


If a therapist can't make a decent living and be able to pay back their student loans, you'll find fewer & fewer professionals going into the field. Resulting in even higher rates, because supply & demand.

Yes, it is unsurprising that this problem is also tied to the problem of profit motives driving higher education. It's almost like commodifying the ways we care for one another (be it mental healthcare or teaching, which is a type of care work as defined by feminist theory) is just a bad idea all around.

The problem with these apps is that they are based on an extractive and neoliberal logic that positions the individual as an atomized unit who can self-actualize without attendance to questions of how the structure of the world allows for or constrains such actualization. Saying "please don't give up if the first therapist doesn't work out" is another example of positioning the individual as a free agent who somehow exists outside of structural constraints, or who doesn't have to attend to such constraints. The only thing stopping them is themselves and their ability to try in this construction.

These apps obfuscate what's really happening behind the scenes--your personal medical information is being bought and sold to who knows who, for knows how much money--and because many people don't actually have any good choices for accessing a therapist, they go with an app like this because it's the best they can do. This is the illusion of choice, not choice. It's not really about trying harder; it's about the structure of the world in which we currently live. These apps perpetuate and accelerate that structure. Moreover, a lot of people in the United States go to therapy to try to reconcile with the fact that their lives are mostly about extracting value from them in the form of their labor. How can you get into that at all with your therapist if the model of its delivery is also based on extracting things from you and turning you into profit for someone else?
posted by k8lin at 11:59 PM on August 12, 2020 [6 favorites]


« Older Every Pokemon is interesting and worth talking...   |   The idea that queer people are not only allowed to... Newer »


This thread has been archived and is closed to new comments