Women are suing men who used their Instagram feeds to create AI porn



More than a year ago, MG was living a normal twenty-something life in Scottsdale, Arizona. She worked at a charity and supplemented her income by waiting tables on weekends. Like many women her age, she had an Instagram account, where she sometimes posted pictures of herself drinking matcha, hanging out by the pool with her friends, or going to Pilates.

“I didn’t care about being famous or becoming a social media star,” says MG (who is referred to only as MG in court filings to protect her identity). “I just used it the way a lot of people did when it first came out, sharing my life with the people around me.” She has over 9,000 followers – a solid following, but nowhere near a huge platform.

Last summer, she received a DM from one of her followers. Did she know, the man asked her, that pictures and videos of a woman who looked exactly like MG were circulating on Instagram? MG clicked on the link and saw several Reels of her face superimposed on a body that looked like her own. The woman in the images was scantily dressed, with tattoos in the same places as MG’s.

MG was shocked. “If you didn’t know me well, you’d think they were pictures of me,” she said. “It felt like I had no control over my own image.”

The nude images weren’t just published on the internet, she recently explained – they were also used to advertise AI ModelForge, a platform that allegedly teaches men how to build AI models of women. In a series of online classes and courses, the men allegedly taught subscribers how to use CreatorCore software to train AI models on photos of unsuspecting women, then post the resulting content to Instagram and TikTok.
