Apple and generative AI

Apple criticized for racial bias in its generative AI images

With iOS 18.2, Apple introduced a new image generation feature in the United States through its Image Playground app, powered by Apple Intelligence. However, this technology is now being accused of reinforcing racial stereotypes in its visual creations.

Biased representations based on context

Machine learning researcher Jochem Gietema examined the AI’s behavior and found that it tends to associate certain professions or activities with specific ethnic backgrounds. For example, when a user requests an image of a basketball player, the result is almost always a Black person, while a skier is consistently depicted as white.

Even more concerning, these biases become more pronounced when using keywords related to socioeconomic status. Terms like “rich,” “influential,” or “successful career” generate images of white men in suits and ties. In contrast, words such as “poor” or “destitute” predominantly result in images of Black men dressed more modestly.

An embarrassment for Apple, despite its commitment to inclusivity

These findings, published on Jochem Gietema’s blog, pose a challenge for Apple, which has emphasized its commitment to inclusivity in the development of its artificial intelligence technologies. While other AI systems have faced similar criticism for bias, these results are particularly damaging to Apple’s image as it prepares to launch Apple Intelligence in Europe this April.

Source: BFM Tech & Co