Can AI help someone stage a fake kidnapping scam against you or your family?

You may feel confident in your ability to avoid becoming a victim of cyber scams. You know what to look for, and you won’t let someone fool you.

Then you receive a phone call from your son, which is unusual because he rarely calls. You hear a shout and sounds resembling a scuffle, making you take immediate notice. Suddenly, you hear a voice that you are absolutely certain is your son, screaming for help. When the alleged kidnappers come on the line and demand money to keep your son safe, you are sure that everything is real because you heard his voice.

Unfortunately, scammers are using artificial intelligence (AI) to mimic people's voices, deploying these cloned voices in schemes such as fake kidnapping scams. This particular scam is still rare, but it's happening.

CLICK TO GET KURT’S FREE CYBERGUY NEWSLETTER WITH SECURITY ALERTS, QUICK VIDEO TIPS, TECH REVIEWS AND EASY HOW-TO’S TO MAKE YOU SMARTER

Such fake emergency scams occur frequently enough that the Federal Trade Commission (FTC) provided warnings and examples for consumers. Hard numbers that indicate the frequency of these calls aren’t readily available, though, especially for calls known to make use of AI.

Such scams are certainly possible with current AI technology. Fake video and audio of politicians and other famous people are appearing with regularity. Aided by AI, these clips are frighteningly believable.

You may recall the incident in late 2023 involving a fake dental plan advertisement featuring Tom Hanks. AI technology created the video, and Hanks had to post on social media to call out the fake advertisement.

MORE: THE ‘UNSUBSCRIBE’ EMAIL SCAM IS TARGETING AMERICANS

The AI technology creates a fake by analyzing a sample audio clip of the person it wants to mimic. By processing enormous amounts of data, it learns the distinctive characteristics of that person's voice, such as pitch, cadence, and accent, allowing it to produce a highly realistic imitation.

Once the AI can reproduce the voice, the scammers simply tell it what to say, creating a personalized message designed to sell dental plans or to convince you that your loved one is being held by kidnappers.

Some AI developers who use voice cloning for helpful purposes, such as allowing people with medical conditions like ALS to regain their "speech," claim they can mimic a voice with as little as a few minutes of audio. However, the more audio that's available, the more realistic the mimicked voice should sound. Twenty minutes of audio is far better than three, for example.

As AI’s capabilities continue to expand at breakneck speed, you can expect the time requirements to shrink in future years.

WHAT IS ARTIFICIAL INTELLIGENCE (AI)?

MORE: HOW TO GUARD AGAINST BRUSHING SCAMS

Realistically, the vast majority of people don’t have to worry about a fake kidnapping scheme that originates from AI-generated audio. If your loved one has a lot of video and audio on social media, though, the scammers may be able to find enough source audio to create a realistic fake.

Even though AI makes this type of scam easier to pull off, the setup still takes more time than most scammers are willing to invest. After all, scammers running this scheme are counting on your mounting fear to make you miss the obvious clues that the call is fake.

The scammers may simply have a random child scream and sob uncontrollably, letting you jump to the conclusion that it's your child. That is far easier than using AI to source and generate audio ... at least for now.

MORE: HOW SCAMMERS USE AI TOOLS TO FILE PERFECT-LOOKING TAX RETURNS IN YOUR NAME

Even though the scammers try to gain the upper hand by catching you off guard with a sudden fake kidnapping call, there are steps you can take, both before and after such a call, to prepare and protect yourself.

1. Ask your loved ones to keep you informed about trips: Fake kidnappers may try to convince you that the abduction is taking place outside your city. If you know your loved one did not leave town, you have good reason to suspect the call is fake.

2. Set up a safe word or phrase: Agree on a safe word that your loved ones will use if they ever call you from a dangerous situation or under duress. A scammer will not know this safe word, so if you don't hear it, you know the call is probably fake.

3. Use privacy settings on social media: Ask your family members to limit who can see their social media posts. This would make it harder for a scammer to obtain source audio that’s usable in a fake kidnapping audio call. For more information on maintaining and protecting your online privacy, click here.

4. Try to text your loved one: Either during or immediately after the call, send a text message to your loved one without telling the caller. Ask your loved one to text you back immediately, so you can converse without tipping off the scammers. If your loved one texts back that they are safe, you know the call is fake. Consider creating a code word to use with the entire family; when you send this code word in a text, everyone knows it's a serious situation that requires an immediate response.

5. Stay calm and think things through: Finally, although it is incredibly difficult to stay calm when you receive this kind of call, it’s important to keep thinking clearly. Do not panic. Regardless of whether it’s a real call or a scam call, panicking is never going to help. Listen for clues that make it obvious the call is a scam. Try to gather some information that can help you make a clear-headed judgment about the legitimacy of the call.

As AI continues to become more readily available and gains sophistication, scammers will be ready to take advantage of it. Perhaps by then, AI will even the playing field by coming up with ways to help us protect ourselves. Until then, taking steps to protect your family, such as by setting up a safe word, can give you some peace of mind.

Are you concerned about how scammers may take advantage of AI to create new scams? Let us know by writing us at Cyberguy.com/Contact

For more of my tech tips & security alerts, subscribe to my free CyberGuy Report Newsletter by heading to Cyberguy.com/Newsletter

Ask Kurt a question or let us know what stories you’d like us to cover.

Copyright 2024 CyberGuy.com. All rights reserved.

Original article source: Can AI help someone stage a fake kidnapping scam against you or your family?
