An Asian MIT student asked AI to turn an image of her into a professional headshot. It made her white with lighter skin and blue eyes.

Rona Wang, a 24-year-old MIT student, was experimenting with the AI image creator Playground AI to create a professional LinkedIn photo.

  • postmateDumbass@lemmy.world · 1 year ago

    Humans will identify stereotypes in AI-generated materials that match the dataset.

    Assume the dataset will grow and eventually mimic reality.

    How will the law handle discrimination based on data-supported stereotypes?

    • Pipoca@lemmy.world · 1 year ago

      > Assume the dataset will grow and eventually mimic reality.

      How would that happen, exactly?

      Stereotypes and historical bias shape the data itself, and an AI trained on that data will simply learn the same biases.

      For example, in surveys, white people and black people self-report similar levels of drug use. However, for a number of reasons, poor black drug users are caught at a much higher rate than rich white drug users. If you train a model on arrest data, it’ll learn that rich white people don’t use drugs much but poor black people do tons of drugs. But that simply isn’t true.
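
      A toy simulation makes this concrete. The sketch below (Python; all numbers, the two-group setup, and the 5x enforcement gap are invented purely for illustration) gives both groups the same true drug-use rate, but catches one group's users five times as often, then trains a classifier on the arrest records:

      ```python
      # Hypothetical simulation: equal true drug use, unequal enforcement.
      # A model trained on arrest labels learns the enforcement bias,
      # not the underlying behavior.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      n = 100_000

      group = rng.integers(0, 2, n)      # two demographic groups, 0 and 1
      uses_drugs = rng.random(n) < 0.15  # same true usage rate for both

      # Biased policing: group-1 users are caught 5x as often as group-0 users
      catch_rate = np.where(group == 1, 0.25, 0.05)
      arrested = uses_drugs & (rng.random(n) < catch_rate)

      # Train on arrest labels, with group membership as the only feature
      model = LogisticRegression().fit(group.reshape(-1, 1), arrested)

      p0, p1 = model.predict_proba([[0], [1]])[:, 1]
      print("True usage rate, both groups: 0.15")
      print(f"Predicted 'drug user' probability, group 0: {p0:.4f}")  # ~0.0075
      print(f"Predicted 'drug user' probability, group 1: {p1:.4f}")  # ~0.0375
      ```

      The model ends up predicting a fivefold difference in "drug use" between groups that use drugs at identical rates: it is really answering "who gets arrested", and any system that treats its output as "who uses drugs" inherits the enforcement bias.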