Interesting article; I didn't know where it fit best, so I wanted to share it here.

  • justastranger@sh.itjust.works · 1 year ago

    I prefer to consider it in terms of “dimensions of awareness”. Humans have evolved hundreds, possibly thousands, of interlinked dimensions of awareness for just about everything from colors to body language. Simple automated systems with sensors have their own dimensions of awareness, from vision to heat to pressure: whatever it is they track and respond to. AI, however, is finally hitting the point where these dimensions of awareness are being stacked and linked together (GPT-5 can see, hear, read, and respond), and it’s only a matter of time and agency (aka executive functioning) before we see true AI consciousness.

    • 0ops@lemm.ee · 1 year ago

      I had a similar thought recently, actually: that consciousness is more than the brain. Is GPT-4 conscious? Eh, I don’t believe anyone knows what that means. But is it comparable to human consciousness? I don’t think so, but how could it be? It senses words, so it knows words, so it speaks words.

      I hear it said all the time that LLMs don’t really understand what they’re talking about, but they seem to understand as well as they can given the dimensions they are aware of, to use your terminology. I mean, how can I describe anything myself without sensory details? It sounds like. It looks like. It feels like. It behaves like. We got all that knowledge by sensing, then inferring. There’s no special sauce that creates understanding from nothing.

      I don’t have any links, but imo the experiences of people who were born without a sense, and especially those who were later able to gain it back, strongly support the idea that something can only be conceptualized in the terms in which it was sensed.