
Stolen faces, stolen lives: an unsettling trend driven by AI

Sooner or later, most social media users will come across an influencer who looks a little… off.

Maybe their facial features are a little too symmetrical, or their poses a little too stiff. Chances are you're not looking at a human at all, but at an AI-generated fake.

In some cases, these AI influencers are relatively benign – openly digital counterparts of their human peers, making no real attempt to deceive or manipulate.

However, that is not always the case. Disturbingly, a network of Instagram accounts is using artificial intelligence to create fake influencers with Down syndrome.

These bad actors steal content from real creators, then use AI to swap in computer-generated faces of people with Down syndrome. The goal? Exploiting a vulnerable community for likes, shares and, ultimately, cash.

And the deception doesn't end there. Many of these accounts link out to seedy adult websites where the AI-generated content is monetized.

Sadly, this is just the latest evolution of the "AI pimping" trend, in which unethical operators use machine learning to create fake influencers for financial gain. It isn't limited to Down syndrome, either: there are also AI-generated models depicting amputees, burn survivors and other exploitative forms of AI-generated pornography.

AI image and video models are now approaching a level of realism that makes them viable stand-ins for real humans. The fashion industry is already feeling the effects – real models are facing replacement by AI clones.

Even household names like H&M are dabbling in these murky waters. The fast-fashion giant announced a campaign featuring AI-generated "digital twins" of real models. And back in 2023, a company called Lalaland.ai began offering AI-generated models for a subscription fee.

While H&M insists that the models retain control over their digital likenesses, many in the industry are skeptical. After all, in an era of cost-cutting and consolidation, why hire talent when you can deploy cheap, endlessly replicable digital avatars?

The latest, most sinister twist strikes at the basic dignity and humanity of marginalized communities.

People with Down syndrome – or any disability – are not props to be exploited for profit.

More broadly, the spread of AI-generated content threatens to erode public trust in media altogether. If we can't trust the images we see online, the foundations of digital discourse begin to crumble.

So the next time you're scrolling your feed and an influencer seems too good to be true, trust your gut.


