• Just_Pizza_Crust@lemmy.world · 67 points · 3 months ago

    Softcore gilf porn created by an AI to sell state lottery tickets wasn’t on my cards for 2024, but here we are.

  • tal@lemmy.today · 32 points · 3 months ago

    “Our tax dollars are paying for that! I was completely shocked. It’s disturbing to say the least,” Megan explained to the Jason Rantz Show on KTTH.

    I mean, I’d assume that the state lottery is revenue-positive. It’s more like lottery players are paying for it.

  • Nobody@lemmy.world · 27 points · 3 months ago

    AI hallucinates a request for a topless photo. Nothing fundamentally wrong with this technology at all. Keep pouring billions into it.

  • Greg Clarke@lemmy.ca · 18 points · 3 months ago

    I can’t verify this story with any reputable sources. Is this real or just boomerbait?

    • stoly@lemmy.world · 11 points · 3 months ago

      This site is a complete right-wing boomerbait rag, never pay it any attention.

      People think of WA and think of Seattle, then extrapolate. Seattle is really no different than places like Omaha where there is a more liberal, educated populace. They are surrounded by a state full of angry, ignorant people. This “newspaper” is for the angry types.

    • Fubarberry@sopuli.xyz · 6 points · 3 months ago

      The “Test Drive A Win” site, which generated AI images of people as lottery winners, was a real thing, and they have since taken it down.

      The only larger news outlet I see covering it is Fox News. They cite mynorthwest.com as their main source, but they do say they received a statement from the lottery confirming that it was shut down for that reason:

      Washington’s Lottery confirmed to Fox News Digital that it shut down the site after being made aware of the purported image.

      Obviously a lot of people don’t like Fox News, but I don’t see a political angle here that would make that statement untrustworthy.

    • Pissnpink@feddit.uk · 1 point · 3 months ago

      Idk, MyNorthwest is a real source, but it’s mostly dull local-news fare with some good event coverage. KIRO is that branch and it’s okay, center-right; it certainly isn’t the Sinclair Broadcasting station, that’s KOMO 4. 710 Sports is more center-left, but it’s sports. 770 KTTH, where this article seems to be coming from, is obviously garbage reactionary conservative radio, but that’s what makes money in radio.

  • Neato@ttrpg.network · 11 points · 3 months ago

    When Megan, a 50-year-old mother based in Tumwater, visited the new AI-powered mobile site from Washington’s Lottery on March 30, she thought she was in for some frivolous fun. Test Drive A Win allows users to digitally throw a dart at a dartboard featuring dream vacations you can pay for with the money you win in the lottery. Depending on where the dart lands, you can either upload a headshot or take one on your phone to upload, and the AI superimposes your image into the vacation spot.

    Megan landed on a “swim with the sharks” dream vacation option. She was shocked at one of the AI photos Washington’s Lottery spit out. It was softcore porn.

    So I can totally see this happening. The government contracts with a genAI company, and the company drops the ball: it erroneously leaves in the ability to generate pornography, or doesn’t correctly curate the training data (I’m unsure exactly how these work). It may be quite difficult for the Washington government to spot this error if the occurrence rate is very low, or if none of their test prompts happened to generate pornography. Perhaps it only produced porn (when not specifically prompted to) on certain subsets of matched facial features? I’m not suggesting this, but perhaps the affected user looks a lot like a popular porn star? It could also totally be the government’s fault for hastily selecting an AI package without checking what it could do; but with government bureaucracy there should have been quite a few people with oversight.

    My bigger question is WTF is this system even doing? If you win money in the lottery, you can select to apply it to a vacation package if your random draw hits it? Why wouldn’t you just take the money and buy your own? Maaaaybe if it heavily discounts the vacations or something. Seems like an unnecessary step in the lottery process.

    • orclev@lemmy.world · 12 points · 3 months ago

      My bigger question is WTF is this system even doing? If you win money in the lottery, you can select to apply it to a vacation package if your random draw hits it?

      No, it’s advertising. They’re trying to convince people to play the lottery so they have you roll a (virtual) wheel and upload a head shot then it generates a theoretical video of what it might look like if you went on that vacation (using your theoretical future winnings). It’s absolutely idiotic, but their target demographic isn’t exactly the sharpest tools in the shed to begin with.

    • chrash0@lemmy.world · 5 points · 3 months ago

      they likely aren’t creating the model themselves. the faces are probably all the same AI girl you see everywhere. you gotta be careful with open-weight models, because the open-source image-gen community has a… proclivity for porn. there’s not a “function” per se for porn. they may be doing some pre-prompting, or maybe “swim with the sharks” is just too vague of a prompt and the model was tuned on this kind of stuff. you can add an evaluation network at the end to basically ask “is this porn/violent/disturbing?”, but that needs to be tuned as well. most likely it’s even dumber than that, where the contractor just subcontracted the whole AI piece and packaged it for this use case
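      The “evaluation network” idea mentioned above can be sketched as a post-generation gate: run each generated image through an NSFW classifier and reject anything that scores too high. A minimal sketch, assuming such a classifier exists upstream — the category names and the 0.2 threshold are hypothetical, not from any real deployment:

```python
# Hypothetical post-generation safety gate. An NSFW classifier (not shown)
# is assumed to return per-category scores in [0, 1]; the image is rejected
# if any unsafe category exceeds the threshold.

def passes_safety_check(scores: dict[str, float], threshold: float = 0.2) -> bool:
    """Return True only if every unsafe-category score is at or below threshold."""
    unsafe = ("porn", "nudity", "violence", "disturbing")
    return all(scores.get(cat, 0.0) <= threshold for cat in unsafe)

print(passes_safety_check({"porn": 0.05, "violence": 0.01}))  # True  (image passes)
print(passes_safety_check({"nudity": 0.7}))                   # False (image rejected)
```

      As the comment notes, this gate is only as good as the classifier behind it, which itself needs tuning.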

  • webghost0101@sopuli.xyz · 9 points · 3 months ago

    Lol, they didn’t even try to test the system if this is the result. AI isn’t intelligent, but humans still take the cake for stupidity by having brains and not using them.

    Many public Stable Diffusion models have a bias, with porn often overrepresented, but all it takes is a “nude, naked, erotic, sex, nsfw” in the negative prompt, and unless the model is built to only generate porn, this will never happen. Or better yet, use some of that corporate money and build their own SD model that is verified not to include any nudity in its training data.
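    The negative-prompt idea above can be sketched as a small guard that always merges a fixed block-list into whatever negative prompt a request already carries, before the string is handed to the image model. A hypothetical helper — the generation call itself is assumed and omitted:

```python
# Fixed block-list, per the comment above; merged into every negative prompt
# so a careless caller can't accidentally drop the safety terms.
NSFW_NEGATIVES = ["nude", "naked", "erotic", "sex", "nsfw"]

def with_safety_negatives(negative_prompt: str = "") -> str:
    """Append the NSFW block-list to a comma-separated negative prompt,
    skipping terms the caller already included."""
    terms = [t.strip() for t in negative_prompt.split(",") if t.strip()]
    merged = terms + [t for t in NSFW_NEGATIVES if t not in terms]
    return ", ".join(merged)

print(with_safety_negatives("blurry, low quality"))
# blurry, low quality, nude, naked, erotic, sex, nsfw
```

    Negative prompts only steer sampling, so this is mitigation rather than a guarantee; the verified-clean training data the comment suggests is the stronger fix.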

  • sugar_in_your_tea@sh.itjust.works · 7 points · 3 months ago

    Why wouldn’t they just generate a couple hundred images and manually review them? It’s pretty easy to automate putting someone’s face onto an existing image, so that should be totally fine.

    They could cycle the images every so often with the insane amounts of money the lottery generates.