A New Type of Sextortion Is Emerging — Powered by AI and Targeting Saudi International Students

Online sextortion has long been a weapon of manipulation and fear. Traditionally, scammers would lure victims into compromising situations via video chats, then record or fake explicit materials to use as blackmail. But a more insidious method is now on the rise — one that no longer requires any direct contact with the victim at all.

With the advancement of AI technologies, scam cartels — many operating overseas — have evolved their tactics. They no longer need to convince a victim to undress or engage in explicit conversations. Instead, they leverage artificial intelligence to create fake, hyper-realistic images or videos of a person, often depicting them performing sexual acts during a simulated video call.

Who Are They Targeting?

These scammers are no longer casting a wide net at random. They now strategically research their victims using publicly available data from social media, business listings, directories, and local news.

They’re targeting a very specific profile:

  • People with an active social media presence who are not high-profile

  • Individuals from big cities who are less likely to have advanced knowledge of digital security

  • Young adults in their late 20s or younger, often naive, with limited cybersecurity awareness

  • International students — particularly those from Saudi Arabia, whose financial support from home is precisely what the perpetrators seek to exploit

We’re seeing a disproportionate number of cases reported from the north of England, especially in Leeds, Manchester, and Sheffield. The pattern is striking, and deeply concerning.

A Real Case We Recently Handled

Just a few weeks ago, a young man contacted us in a panic. He is an international student from Saudi Arabia, currently studying in Manchester with the goal of eventually working for Saudi Aramco, the national oil company.

Note: Saudi Aramco, one of the largest and most valuable companies in the world, sponsors thousands of Saudi students annually to pursue higher education abroad, including in the United Kingdom. These students are often enrolled in STEM fields — engineering, geology, computer science — with the aim of returning home to contribute to Aramco’s operations.

This student had received a threatening email that included AI-edited images and a fabricated video made to look like he was performing explicit acts during a video call. The scammers demanded $1,500 in Bitcoin, giving him a 48-hour deadline and threatening to send the footage to his family, university staff, and close friends.

They weren’t bluffing. They had already conducted a background check and shared accurate personal information with him — including the names of distant relatives.

Terrified, he paid the $1,500. But it didn’t stop there.

Days later, they demanded an additional $5,000. When the victim refused — simply because he couldn’t afford it — the perpetrators began impersonating him on Instagram and Facebook, posting the AI-generated images and sending friend requests to his contacts.

Interestingly, a few months before the extortion began, he had received a FaceTime call from what appeared to be a deceased relative — at least, that’s what the caller ID said. Confused and emotional, he answered. He said “hello” several times but got no response. The call lasted about a minute before he hung up. At the time, he dismissed it as a strange glitch. “I saw my face in the little window during the call, yes — but I had no idea I was being recorded,” he recalled. The call felt strange but not threatening at the time. “It just happened — I didn’t think much of it,” he added.

Now, it’s clear: that was likely a setup. The perpetrators probably recorded the call and later used AI tools to fabricate realistic footage for the scam.

This case is just one of many — and the threat is growing.

AI media generation tools like Google Veo, which are capable of producing hyper-realistic video content from text or image prompts, are increasingly being misused by blackmailers and scam networks around the world. While many of these criminals are only just beginning to understand the full potential of these tools, it's only a matter of time before they master them.

Once that happens, we expect the number of AI-driven sextortion cases to skyrocket. The ease with which fake but believable explicit content can now be created — without ever contacting the victim — marks a terrifying shift in how these crimes are committed.

The combination of accessible AI and personal data scraped from social media is a perfect storm, and victims may soon find themselves targeted even if they’ve never shared compromising material or engaged in risky behavior online.

And that’s exactly where we step in. In this case, we were able to assess the threat, block further damage, and shut down the blackmail operation.

What You Can Do

Please be aware: your biometric data (your face) can be captured simply by answering a video call. Do not answer video calls from unknown numbers; if you must answer, at least keep your face off camera.

If you're a student and receive suspicious messages or AI-generated images:

  • Do not respond to the blackmailer

  • Do not pay — this will only encourage further demands

  • Document everything: take screenshots and save emails, usernames, and URLs

  • Contact a professional immediately

We specialize in helping victims of online sextortion and impersonation. Our team works quickly and discreetly to assess your situation, stop the damage, and protect your reputation.

Need Help Now?

Contact us 24/7.
The consultation is free.
The solution is real.

— Benjamin J., July 23, 2025