• Amaltheamannen@lemmy.ml
    5 months ago

    And how do we know you didn’t crop out an instruction asking for diversity?

    Either that or a side effect of trying to have less training data bias.

    • Skull giver@popplesburger.hilciferous.nl
      5 months ago

      Google is forcibly injecting diversity into prompts like these for some reason. It’ll happily generate images of specific skin tones and demographics until you start asking for white people.

      I think they’re trying to prevent the “every scientist we generate is white, every criminal we generate is black” problem that comes with using popular media as training data, but this is something that actually happens with their new tools.

      The BBC and other news outlets have reported on this already.