• kibiz0r@midwest.social · 1 day ago

    Could just say:

    If you accept either privacy of consciousness or phenomenal transparency, then philosophical zombies must be conceivable, and therefore physicalism is wrong and you can’t engineer consciousness by mimicking brain states.

    Edit:

    I guess I should’ve expected this, but I’m glad to see multiple people wanted to dive deep into this!

    I don’t have the expertise or time to truly do it justice myself, so if you want to go deep on this topic I’m going to recommend my favorite communicator on non-materialist (but also non-religious) takes on consciousness, Emerson Green:

    But I’m trying to give a tl;dl to the first few replies at least.

    • tabular@lemmy.world · 1 day ago

      Why does privacy of consciousness mean one can’t engineer consciousness by mimicking states of the organ that probably has something to do with it?

      What does phenomenal transparency mean?

        • tabular@lemmy.world · 15 hours ago

          We may be able to tell with great confidence what you’re thinking or feeling, but not how it feels to you. There’s a subjective, first-person experience: something it’s like to be you, which is different from what it’s like to be me. I can’t tell what it’s like to be someone else, or to be another animal, or whether it means anything to be a rock.

        • kibiz0r@midwest.social · 1 day ago

          Privacy doesn’t mean that nobody can tell what you’re thinking. It means that you will always be more justified in believing yourself to be conscious than in believing others are conscious. There will always be an asymmetry there.

          Replaying neural activity is impressive, but it doesn’t prove the original recorded subject was conscious quite as robustly as my daily subjective experience proves my own consciousness to myself. For example, you could conceivably fabricate an entirely original neural recording of a person who never existed at all.

      • kibiz0r@midwest.social · 1 day ago

        I added some episodes of Walden Pod to my comment, so check those out if you wanna go deeper, but I’ll still give a tl;dl here.

        Privacy of consciousness is simply that there’s a permanent asymmetry between how well you can know your own mind and how well you can know the minds of others, no matter how sophisticated you get with physical tools. You will always have a different level of doubt about the sentience of others compared to your own sentience.

        Phenomenal transparency is the idea that your internal experiences (like what pain feels like) are “transparent”, where transparency means you can fully understand something’s nature through cognition alone, without needing to measure anything in the physical world to complete your understanding. For example, the concept of a triangle and the fact that 2+2=4 are transparent. Water is opaque, because you have to inspect it with material tools to understand the nature of what you’re referring to.

        You probably immediately have some questions or objections, and that’s where I’ll encourage you to check out those episodes. There’s a good reason they’re longer than 5 sentences.

        • tabular@lemmy.world · 20 hours ago

          I thought that’s what was meant by privacy of consciousness, and I agree that’s how it is.

          However, being unable to inspect whether something has consciousness doesn’t mean we can’t create a being which does. We would simply be unaware if we actually succeeded, or if it even happened unintentionally while we were pursuing some other goal.

          • kibiz0r@midwest.social · 18 hours ago

            Gotcha. Yeah, I can endorse that viewpoint.

            To me, “engineer” implies confidence in the specific result of what you’re making.

            So like, you can produce an ambiguous image like The Dress by accident, but that’s not engineering it.

            The researchers who made the Socks and Crocs images did engineer them.

            • tabular@lemmy.world · 16 hours ago

              I see what you mean. By that definition of engineering, I would agree.

              We could perhaps engineer androids that mimic us so well that to damage them would feel to us like hurting a human. I would feel compelled to take the risk of caring for an unfeeling simulation just in case they were actually able to suffer or flourish.

    • 10001110101@lemm.ee · 1 day ago

      Lol. This comment sent me down a rabbit hole. I still don’t know if it’s logically correct from a non-physicalist POV, but I did come to the conclusion that I lean toward eliminative materialism and illusionism. Now I don’t have to think about consciousness anymore because it’s just a trick our brains play on us (consciousness always seemed poorly defined to me anyways).

      I guess when AI appears to be sufficiently human or animal-like in its cognitive abilities and emotions, I’ll start worrying about its suffering.

      • kibiz0r@midwest.social · 16 hours ago

        If you wanna continue down the rabbit hole, I added some good stuff to my original comment. But if you’re leaning towards epiphenomenalism, might I recommend this one: https://audioboom.com/posts/8389860-71-against-epiphenomenalism

        Edit: I thought of another couple of things for this comment.

        You mentioned consciousness not being well-defined. It actually is, and the go-to definition comes from 1974: Nagel’s “What Is It Like to Be a Bat?”

        It’s a pretty easy read, as are all of the essays in his book Mortal Questions, so if you have a mild interest in this stuff you might enjoy that book.

        Very Bad Wizards has at least one episode on it, too. (Link tbd)

        Speaking of Very Bad Wizards, they have an episode about sex robots (link tbd) where (IIRC) they talk about the moral problems with having a convincing human replica that can’t actually consent, and that doesn’t even require bringing consciousness into the argument.