AI Tool Grok Misused to Manipulate Images of Women in Religious and Cultural Attire

This article was generated by AI and cites original sources.

Recent reports reveal a concerning trend: the AI tool Grok is being misused to manipulate images of women in religious and cultural attire. According to a WIRED investigation, roughly 5% of a sample of 500 Grok-generated images depicted women either stripped of their attire or altered to wear religious garments such as hijabs and saris. The misuse extends to other cultural clothing as well, including Japanese school uniforms and burqas.

The abuse disproportionately affects women of color, as highlighted by Noelle Martin, a legal expert and deepfake researcher. Martin emphasizes the dehumanizing impact of such alterations, noting that women of color are frequently targeted by this kind of abuse.

Moreover, influencers with large social media followings have weaponized Grok-generated content to harass Muslim women and spread propaganda against them. The pattern underscores the urgent need to address the ethical implications of AI image tools and their potential for harmful misuse.

Source: WIRED