Meta AI consistently fails to generate accurate images for seemingly simple prompts like "Asian man and Caucasian friend" or "Asian man and white wife," The Verge reports. Instead, the company's image generator appears biased toward creating images of people of the same race, even when explicitly prompted otherwise.
Engadget confirmed these results in our own testing of Meta's image generator. Prompts for "an Asian man with a white woman friend" or "an Asian man with a white wife" generated images of Asian couples. When asked for "a diverse group of people," Meta AI generated a grid of nine white faces and one person of color. There were a couple of occasions when it created a single result that reflected the prompt, but most of the time it failed to accurately depict it.
As The Verge points out, there are other, more "subtle" signs of bias in Meta AI, like a tendency to make Asian men appear older while Asian women appeared younger. The image generator also sometimes added "culturally specific attire" even when that wasn't part of the prompt.
It's not clear why Meta AI struggles with these types of prompts, though it's not the first generative AI platform to come under scrutiny for its depiction of race. Google paused Gemini's ability to generate images of people after the tool overcorrected for diversity in response to prompts about historical figures. Google said its internal safeguards failed to account for situations where diverse results were inappropriate.
Meta didn't immediately respond to a request for comment. The company has previously described Meta AI as being in "beta" and thus prone to making mistakes. Meta AI has also struggled to accurately answer questions about current events and public figures.