A profound relational revolution is underway, not orchestrated by tech developers but driven by users themselves. Many of the 400 million weekly users of ChatGPT are seeking more than just assistance with emails or information on food safety; they are looking for emotional support.

“Therapy and companionship” have emerged as two of the most frequent applications for generative AI globally, according to the Harvard Business Review. This trend marks a significant, unplanned pivot in how people interact with technology.

  • dsilverz@friendica.world · 12 hours ago

    @gerryflap @bytesonbike

    many men are reluctant to make that step

    Sometimes it’s not the patient who is to blame. I made that step, countless times since my childhood… I sought help… The result? Several divergent diagnoses and several medications that didn’t work, until the most recent psychiatrist and psychologist some months ago: the psychiatrist said I had “nothing” (even though I had a fresh cut on my wrist), and the psychologist “struggled to find any complaints from me”. So I simply gave up on seeking medical care (and “care” in general, human or otherwise). I don’t use AI for therapy because, as a former programmer, I’m deeply aware of the underlying Markov-chain and neural-network algorithms, but sometimes their probabilistic outputs lead me to insights I couldn’t get from any living Homo sapiens (such as the possibility that I have Geschwind syndrome, a condition that will probably remain undiagnosed).
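The “probabilistic outputs” the commenter mentions are literal: a language model picks each token by sampling from a probability distribution over its vocabulary, which is why the same prompt can produce different replies on different runs. A minimal sketch of temperature-scaled sampling, with made-up logit values and no particular model’s API assumed:

```python
import math
import random

def sample_token(logits, temperature=1.0, rng=random):
    """Sample an index from a softmax over temperature-scaled logits.

    Higher temperature flattens the distribution (more varied picks);
    temperature near zero makes the argmax almost certain.
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Inverse-CDF sampling: walk the cumulative distribution.
    r = rng.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r < cum:
            return i
    return len(probs) - 1

# Illustrative logits for a 3-token vocabulary; repeated calls at
# temperature 1.0 need not return the same index.
logits = [2.0, 1.5, 0.3]
print(sample_token(logits, temperature=1.0))
```

Repeated calls with the same input can differ, which is the source of both the variability and the occasional unexpected, insight-triggering output the commenter describes.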