Virtual kidnapping: How to see through this terrifying scam

Phone fraud takes a frightening twist as fraudsters tap into AI to inflict serious emotional and financial damage on their victims

It’s every parent’s worst nightmare. You get a call from an unknown number and hear your child on the other end of the line, crying out for help. Then their ‘kidnapper’ comes on the line, demanding a ransom or you will never see your son or daughter again. Unfortunately, this is not an imagined scenario from a Hollywood film.

Instead, it’s a terrifying example of the lengths scammers can now go to in order to extort money from their victims, co-opting new technology for nefarious purposes. It also shows that the quality of AI voice cloning technology is now convincing enough to trick even close family members. Fortunately, the more people know about these schemes and what to look out for, the less likely phone-based fraudsters are to make any money.

How virtual kidnapping works

There are several key stages to a typical virtual kidnapping scam. Broadly speaking they are as follows:

  • The scammers research potential victims they can call up and try to extort money from. This stage could also be optimized with the use of AI tools (more on this later).
  • The scammers identify a ‘kidnapping’ victim – most likely the child of the person identified in stage 1. They could do this by trawling through social media profiles or other publicly available information.
  • The group then scripts an imagined scenario, making it as harrowing as possible for the person they’re about to call. The more scared you are, the less likely you are to make rational decisions. Like any good social engineering attempt, this one is designed to rush the victim’s decision-making.
  • The fraudsters might then perform some more open-source research to work out the best time to call, scouring social media or other sources for clues. The idea is to contact you when your loved one is elsewhere, ideally on holiday – like the daughter of Jennifer DeStefano, an Arizona mother whose case made headlines in 2023.
  • The scammers then create the audio deepfakes and place the call. Using readily available software, they create audio of the victim’s ‘voice’ and use it to try to convince you that they have kidnapped a relative. They may use other information gleaned from social media to make the scam sound more convincing, for example by mentioning details about the ‘kidnappee’ that a stranger might not know.

If you fall for the scam, you will most likely be asked to pay in a non-traceable way, such as cryptocurrency.

Supercharging virtual kidnapping

There are variations on this theme. Most concerning is the potential for ChatGPT and other AI tools to supercharge virtual kidnapping by making it easier for fraudsters to find the ideal victims. Advertisers and marketers have for years been using “propensity modelling” techniques to get the right messages to the right people at the right time.

Generative AI (GenAI) could help scammers do the same, by searching for the individuals most likely to pay up if exposed to a virtual kidnapping scam. They could also hunt for people within a specific geographic area who have public social media profiles and a particular socio-economic background.
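For readers curious what “propensity modelling” actually looks like, here is a minimal, hypothetical sketch of the general marketing technique in Python. Everything below – the features, the data and the labels – is synthetic and invented purely for illustration; it is not real scammer tooling, just a toy showing how a model ranks individuals by their likelihood of responding to a message.

```python
# Hypothetical illustration of propensity modelling – synthetic data only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Invented example features for 500 people, each scaled to 0..1:
# [has a public profile, posting frequency, estimated income bracket]
X = rng.random((500, 3))

# Synthetic stand-in labels for "responded to past outreach".
y = ((X @ np.array([1.5, 0.8, 1.2]) + rng.normal(0, 0.3, 500)) > 1.8).astype(int)

# Fit a simple propensity model on the historical (here: fake) data.
model = LogisticRegression().fit(X, y)

# Score three new (also fake) individuals: each gets a probability of
# "responding" – the propensity score used to prioritize who to contact.
candidates = rng.random((3, 3))
print(model.predict_proba(candidates)[:, 1])
```

The unsettling point is that the same commodity tooling marketers use to rank likely buyers could, in principle, be turned to ranking likely extortion targets.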

A second option would be to use a SIM swapping attack on the ‘kidnappee’ to hijack their phone number ahead of the scam, adding an unnerving legitimacy to the kidnapping call. Whereas DeStefano was eventually able to ascertain that her daughter was safe and well, and therefore hang up on her extortionists, doing so would be much harder if the victim’s relative were uncontactable.

What the future holds for voice cloning

Unfortunately, voice cloning technology is already worryingly convincing, as our own recent experiment proves. And it is increasingly accessible to scammers. An intelligence report from May warned of legitimate text-to-speech tools that could be abused, and of growing interest on the cybercrime underground in voice cloning-as-a-service (VCaaS). If the latter takes off, it could democratize the ability to launch such attacks across the cybercrime economy, especially if used in combination with GenAI tools.

In fact, besides disinformation, deepfake technology is also being used for business email compromise (as tested by our own Jake Moore) and sextortion. We are only at the start of a long journey.

How to stay safe

The good news is that a little knowledge can go a long way toward defusing the threat of deepfakes in general and virtual kidnapping in particular. There are things you can do today to minimize the chances of being selected as a victim, and of falling for a scam call if one does occur.

Consider these high-level tips:

  • Don’t overshare personal information on social media. This is absolutely critical. Avoid posting details such as addresses and phone numbers. If possible, don’t even share photos or video/audio recordings of your family, and certainly not details of loved ones’ holiday plans.
  • Keep your social media profiles private in order to minimize the chances of threat actors finding you online.
  • Be on the lookout for phishing messages that could be designed to trick you into handing over sensitive personal information, or logins to social media accounts.
  • Get children and close family to download geolocation trackers such as Find My iPhone.
  • If you receive a call, keep the ‘kidnappers’ talking. At the same time, try to call the alleged kidnappee from another line, or get someone nearby to do so.
  • Stay calm and don’t share any personal information. If possible, get the callers to answer a question only the kidnappee would know, and request to speak to them.
  • Notify the local police as soon as possible.

Virtual kidnapping is just the start. But stay up to date with the latest scams and you stand a good chance of nipping attacks in the bud before they cause serious emotional distress.
