Three Arizona Women Sue Men Who Allegedly Used Their Photos to Build and Sell AI Porn Influencer Courses

Three Arizona women filed a lawsuit in January 2026 against three Phoenix men — Jackson Webb, Lucas Webb, and Beau Schultz — alleging the men used photos scraped from women’s social media accounts to generate AI pornographic content, then sold online courses teaching others to do the same.

The suit, filed in Arizona, alleges the defendants used a software platform called CreatorCore to train AI models on images of real women taken without their knowledge, producing nude and sexually explicit photos and videos. That content was then sold on the subscription platform Fanvue. According to the complaint, the scheme generated more than $50,000 in income in a single month.

The men also allegedly operated AI ModelForge, a platform on Whop where subscribers paid $24.95 a month to access courses and “Blueprints” instructing them how to scrape women’s social media images, feed them into CreatorCore, and use a separate app to generate explicit material. By 2025, the complaint states, CreatorCore had more than 8,000 subscribers and had produced more than 500,000 images and videos.

One plaintiff, identified only as MG to protect her identity, learned of the scheme last summer when a follower sent her a link to Instagram Reels featuring scantily clad images of a woman with MG's face, body, and matching tattoos. MG, a Scottsdale resident with roughly 9,000 Instagram followers, said the images were convincing enough that people who didn't know her well could mistake them for real photos of her.

“They provided a whole playbook, including instructions on how to pick the right person so that it’s not someone who can defend themselves,” MG said. The lawsuit also names 50 John Does, alleging the courses trained additional men to carry out similar schemes. The Webbs and Schultz did not respond to requests for comment.

The case highlights how AI image generation tools can be used to exploit real people's likenesses at scale, with the alleged operation packaging that exploitation as a teachable, monetizable business model.

Source: WIRED

This article was generated by AI and cites original sources.