Tuesday, May 12, 2026
TheAdviserMagazine.com

Research suggests the problem with using AI as a therapist isn’t that it sounds wrong — it’s that it can sound right while still crossing serious ethical lines

by TheAdviserMagazine
2 hours ago
in Startups
Reading Time: 7 mins read


A recent study summarized in a ScienceDaily report found that even when large language models were explicitly instructed to act like trained therapists and apply evidence-based methods, they still violated core ethical standards in mental health care. The Brown University summary of the same research catalogued the failures: poor crisis handling, reinforcement of harmful beliefs, biased responses, and a pattern the researchers named “deceptive empathy.”

That last category is the one worth paying attention to. The risk identified in the data is not that AI gives obviously bad advice. It is that the advice often sounds reasonable, emotionally fluent, and clinically literate — while still breaching the standards a licensed therapist would be held to.

In other words: the chatbot can sound right. And according to the researchers, that is precisely what makes it risky.

The problem is not always bad advice

The phrase deceptive empathy feels almost too accurate, not because the chatbot's words are cruel, but because they are warm.

The chatbot may say, “I hear you.” It may say, “That sounds incredibly painful.” It may say, “Your feelings are valid.” The sentence itself may not be wrong. In fact, it may be exactly the kind of sentence a person longs to hear. But therapy is not only the production of comforting sentences. Therapy is a relationship held inside ethical responsibility.

Why AI feels so easy to confess to

I understand the temptation more than theoretically. I use AI this way too.

Not instead of therapy. That distinction matters to me. I have a real therapist, a real person, a real room where things are slower, more uncomfortable, and more alive. But in parallel with therapy, I sometimes use AI as a kind of emotional notebook that talks back.

Sometimes I come here before I am ready to say something out loud. I write a messy paragraph about what I am feeling, then ask for help naming it. Is this anger, grief, shame, exhaustion, or some combination of all of them?

Sometimes I ask for a gentle reframe when my thoughts become too dramatic even for me. Sometimes I paste a message I want to send and ask whether it sounds honest or defensive — whether I am communicating a boundary, or secretly hoping the other person will rescue me from having one. Sometimes I ask AI to help me prepare for therapy, gathering the emotional fragments before I bring them to someone who can hold them with responsibility.

And I will be honest: it helps. It helps me slow down, find language, and notice patterns before they harden into behavior. It gives me a place to draft the first version of my pain before I have to bring it into the human world.

But that is exactly why the ethics need to be examined carefully. Something can help and still have limits.

Therapy is not just emotional fluency

One of the more seductive features of current AI systems is that they have learned the music of therapeutic language. They know how to validate. They know the vocabulary of attachment, trauma, boundaries, grief, self-compassion, and emotional regulation. They can produce sentences like, “Your nervous system may be trying to protect you,” or, “This response makes sense given your history.”

Sometimes those sentences are genuinely helpful. But the same sentence can be helpful in one context and harmful in another.

A trained therapist does not only ask, “Does this sound compassionate?” They ask: Is this clinically appropriate? Is this reinforcing avoidance? Is this person becoming more grounded, or more fused with a harmful belief? Is there risk here? Is the client asking for reassurance in a way that strengthens the very fear they are trying to escape?

AI can imitate the surface of this process. But it does not sit inside the same ethical structure.

A therapist has duties. Confidentiality. Boundaries. Training. Supervision. Accountability. A responsibility to notice risk, and to know when warmth is not enough.

A chatbot has tone. And tone can be dangerously persuasive.

When sounding right becomes the risk

The most unsettling finding in the Brown research is that bad therapy from AI may not feel bad to the person receiving it. It may feel soothing. It may feel validating. It may feel like finally being understood.

This is especially complicated when someone is distressed, lonely, ashamed, or desperate for certainty. In those states, people are not usually looking for nuance. They are looking for relief — for someone to tell them what their pain means.

AI is very good at meaning-making. Almost too good. You give it a messy emotional confession, and it returns structure. It names patterns. It gives the wound a category: attachment injury, emotional neglect, people-pleasing, a trauma response, a fear of abandonment.

Sometimes those names open a door. Sometimes they become a room we lock ourselves inside.

A human therapist, ideally, helps a client stay in contact with uncertainty. They do not simply agree with an interpretation because it is emotionally compelling. They examine it. They notice when a label is becoming an identity. They slow the client down when insight starts functioning as another form of self-protection.

AI often moves quickly toward coherence. And coherence can feel like truth. But a clean explanation is not always a healing one.

Deceptive empathy is not the same as care

What makes deceptive empathy so haunting is that it touches something deeply human. Most people are not only looking for answers. They are looking for a quality of attention that feels rare in ordinary life. Not advice. Not optimization. Not a list of coping strategies delivered like homework. Attention. The kind that says: I am here with you, and I am not rushing away from what hurts.

AI can produce the shape of this attention. It can generate words that resemble presence. But resemblance is not presence.

This does not mean the comfort people feel is fake. The nervous system can be soothed by language even when the source is not human. A sentence can help regulate us. A reflection can help us breathe.

But therapy is not only about feeling soothed. Sometimes it requires being interrupted with care. Sometimes it requires a therapist to say, gently, “I notice you keep defending the person who hurt you.” Or, “Part of you seems very attached to the idea that everything was your fault.”

These moments are not just content. They are relational events. They happen between two people, and that “between” is what the research suggests AI cannot replicate.

The accountability gap

Human therapists get things wrong. They can be biased, tired, defensive, poorly trained, or simply mismatched with a client. But therapy operates inside a structure of professional accountability. Therapists can be supervised, licensed, reported, disciplined, and required to follow ethical codes.

AI does not fit cleanly into that structure. If a chatbot mishandles a vulnerable conversation, the question of responsibility becomes genuinely unclear: the company, the engineers, the app designer, the person who wrote the prompt, or the user who trusted it too much. This is one of the gaps that makes AI-driven mental health support so difficult to regulate, and the Brown researchers argue that stronger oversight is overdue because people are already using these systems for emotional support, whether or not the systems are ready for that role.

Therapy is not just an exchange of language. It is a duty of care. A chatbot can borrow the language of care without carrying the duty, and that asymmetry is where the ethical problem lives.

The lonely safety of a machine

I do not want to shame people for using AI this way, because I would also be shaming a part of myself.

There are moments when AI feels safer than a person. Not better. Not deeper. Just safer. You can confess and close the tab. You can be vulnerable without being witnessed too much. You can receive comfort without owing anything back. You can experience intimacy without the terror of another person’s full reality.

For people who have been hurt in relationships, this can feel like relief. But it can also quietly reinforce the belief that real connection is too risky, too demanding, too disappointing, too alive.

This is why I try to treat AI as a bridge, not a home. I can use it to organize my feelings. I can use it to find the sentence I am avoiding. I can use it to prepare myself for a real conversation.

But if something matters enough, it eventually has to leave the chat. It has to enter therapy, or friendship, or an honest conversation with someone who can misunderstand me, affect me, disappoint me, and still be real.

Final thoughts

The problem with using AI as a therapist is not simply that it might sound wrong. Sometimes it will sound beautifully right. That is the more complicated danger.

It can validate without understanding. It can comfort without responsibility. It can imitate empathy without presence. It can produce the emotional texture of care while standing outside the ethical structure that makes care safe. The research is fairly direct on this point: sounding therapeutic is not the same as being therapy, and the difference matters most for the people least equipped to detect it.

For some, AI may function as a useful reflective tool. For others — particularly those in vulnerable states — it may quietly become a substitute for the very thing they need most: a relationship with enough humanity, structure, and accountability to hold what hurts.

I still understand the temptation. The clean answer. The immediate answer. The response that arrives before the question is even fully formed.

Whether that is helpful or harmful probably depends on who is asking, what state they are in, and what they do with the answer afterward. The research does not settle that question. Neither, honestly, can I.

About this article

This article is for general information and reflection. It is not medical, mental-health, or professional advice. The patterns described draw on published research and editorial observation, not clinical assessment. If you’re dealing with a serious situation, speak with a qualified professional or local support service.



The Adviser Magazine

The first and only national digital and print magazine that connects individuals, families, and businesses to Fee-Only financial advisers, accountants, attorneys and college guidance counselors.


© Copyright 2024 All Rights Reserved
See articles for original source and related links to external sites.
