• henfredemars@infosec.pub

    “Meant,” in this context, refers to the conditions that humans have faced over a long period of time and may be better suited to coping with from a survival point of view. I’m an atheist, so I find it strange that you chose to read my comment as implying intentional design. Certainly, AI has existed for a much shorter time than the phenomenon of a human encountering the death of a loved one. Indeed, death has been a common theme throughout history, and the tools and support available for coping with it and relating it to other human experiences far exceed those for coping with the potential issues that come with AI.

    I think one can absolutely speak of needs and adaptation for something as common a human experience as death. If you find that opinion belittling, I’m not sure how to address you further. I may simply have to be wrong.

    • frog 🐸@beehaw.org

      Just gonna say that I agree with you on this. Humans have evolved over millions of years to respond emotionally to their environment. There’s certainly evidence that many of the mental health problems we see today, particularly at the scale we see them, are in part due to the fact that we evolved to live very differently from our present lifestyles. And that’s not about living in cities rather than caves, but more to do with the amount of work we do each day, the availability and accessibility of essential resources, the sense of community and connectedness within small social groups, and so on.

      We know that death has been a constant for as long as life has existed, so it logically follows that dealing with death and grief is something we’ve evolved to do. Namely, we evolved to grieve for a member of our “tribe” and then move on. We can’t let go immediately, because we need to be able to maintain relationships across brief separations, but holding on forever to a relationship that can never be continued would leave any creature unable to focus on the needs of the present and future.

      AI simulacra of the deceased give the illusion of maintaining the relationship with the deceased. It is entirely possible that this will artificially prolong the grieving process, when the natural cycle of grieving is to eventually reach a point of acceptance. I don’t know for sure that’s what would happen… but I would want to be absolutely sure it’s not going to cause harm before unleashing this AI on the general public, particularly on vulnerable people (which grieving people are).

      Then again, I say that about all AI, so maybe I’m biased by the ridiculous ideology that new technologies should be tested and regulated before vulnerable people are experimented on.