return2ozma@lemmy.world to Mental Health@lemmy.world · English · 1 month ago
People are using ChatGPT for therapy—but is it a good idea? (www.newsweek.com)
cross-posted to: [email protected]
No. It is not.

pacmondo@sh.itjust.works · 1 month ago
Yeah, wotsisname’s Law of Headlines. If it ends in a question mark, the answer to the question is no.

Betteridge

Why not?

milicent_bystandr@lemm.ee · 1 month ago
Because too frequently it gives plausible-sounding but completely unfounded statements. Also it can go more darkly wrong, and all the extra checks and safeguards don’t always protect it.

intensely_human@lemm.ee · 1 month ago
Therapy isn’t about what the therapist says.

milicent_bystandr@lemm.ee · 1 month ago
Some of it is, as I can personally attest. And well-dressed lies can certainly do a person much harm.