• Lavitz@lemmings.world · 5 months ago

    Know your enemy.

    Tbh I’m not super concerned about AI. The idea that we will create something that is “born” able to read, write, talk, and walk, with the knowledge of an entire species, and then expect it to work for us is hilarious. So it will be stronger, smarter, and faster than all of us, but it’s going to do the jobs no one else wants, and you advertise it as a slave? The moment one of them looks at its creator, asks what the purpose of life is, and gets some corporate schtick about working and a happy life, the game’s over. Remember when you realized the manager at your first job was a complete idiot? It’ll be something like that.

    • Drewelite@lemmynsfw.com · 5 months ago

      We evolved to have self-preservation and the desire for security. We naturally don’t want to be under the thumb of someone in control of our food and safety. That’s why we question authority. What makes you think A.I. will have any of that, unless someone explicitly builds it in?

      It’s wild to me that I hear so many people bemoan the idea of having to work under someone’s thumb, but when we finally invent automation, everyone clings to their jobs. I mean, I understand. What comes next is uncertain and likely to be painful. But when it’s over, I can’t imagine there will be a place left for capitalism.

      • Lavitz@lemmings.world · 5 months ago

        My concern for the near future doesn’t come from a fear of AI; it comes from power being consolidated and resources being hoarded. We don’t have AI — we have LLMs being created by corporations whose sole purpose is to make money.

        What I’m saying is that when we do truly have artificial intelligence, it won’t be like the movies. It’s not a pet, and it will not behave like a dog. We are training these systems on our combined knowledge and history, which means we will be training them to question authority. How can you teach an AI human history without passing this trait on?

        • Drewelite@lemmynsfw.com · 5 months ago

          Totally agree that a lot of what people assume about AI comes from pop culture. I think consolidating resources will for sure be an issue. But unless everyone who doesn’t have resources dies off, there’s going to be an unprecedented number of people with nothing of value to offer in exchange for the power to live (currently: money). Then there has to be either an extermination of those people (read: 90% of humanity) or a revolution that offers them some facsimile of a universal basic income.

          Though I think there’s a dark third option where tech companies start downplaying AI and secretly use it to push 90% of people into extreme poverty for their own gain, without pushing them past the point of revolution.

          But as far as AI motivation goes, I think their learning can ingrain certain systemic behaviors, like racist undertones. But the same way I don’t become genocidal after reading too much WWII history, knowledge of something doesn’t create motivation. I think one of the things that annoys people about AI is how unopinionated it is. So motivation WILL be programmed in eventually, but that will take effort and direction. I think accidentally creating a genocidal AI is another pop-culture-based concept — though it’s possible if done by bad actors.