How to Protect Yourself (and Your Loved Ones) from Fraudulent AI Calls

You answer a random phone call from a family member, and they breathlessly explain that they've been in a terrible car accident. They need money immediately or they'll end up in prison. You can hear the desperation in their voice as they plead for a transfer. Even though it certainly sounds like them, and the call came from their number, you get the feeling that something is wrong. So you hang up and call them right back. When your family member answers, they say there was no car accident and they have no idea what you're talking about.

Congratulations, you have just successfully avoided an artificial intelligence scam call.

As generative AI tools become more powerful, it will become easier and cheaper for fraudsters to create fake – but convincing – audio recordings of people’s voices. These AI voice clones are trained on existing audio clips of human speech and can be customized to imitate almost anyone. The latest models can even speak in numerous languages. OpenAI, the maker of ChatGPT, recently announced a new text-to-speech model that could further improve voice cloning and make it more widely accessible.

Criminals are already using these AI cloning tools to trick victims into thinking they're on the phone with a loved one when they're actually talking to a computer. While the threat of AI-powered fraud can be frightening, you can stay safe by keeping these expert tips in mind the next time you receive an urgent, unexpected call.

Remember that AI audio is difficult to detect

It’s not just OpenAI; many technology startups are working on near-perfect replicas of human speech, and recent progress has been rapid. “If it were a few months ago, we would have given you tips on what to listen for, like pregnant pauses or signs of latency,” says Ben Colman, co-founder and CEO of Reality Defender. Like many aspects of generative AI over the past year, AI audio has become a far more convincing imitation of reality. Any safety strategy that relies on you audibly detecting strange quirks over the phone is outdated.

Hang up and call back

Security experts warn that scammers can easily make a call appear to come from a legitimate phone number. “Scammers often spoof the number they’re calling you from to make it look like the call is coming from that government agency or bank,” says Michael Jabbara, global head of fraud at Visa. “You have to be proactive.” Whether it’s supposedly from your bank or from a loved one, whenever you get a call asking for money or personal information, ask to call them back. Look up the number online or in your contacts and initiate a follow-up conversation. You can also try messaging them through a separate, verified communication channel, such as video chat or email.

Create a secret safe word

A popular safety tip suggested by multiple sources is to come up with a safe word that only you and your loved ones know and that you can ask for over the phone. “You can even pre-negotiate with your loved ones a word or phrase they could use to prove who they really are if they’re in a duress situation,” says Steve Grobman, chief technology officer at McAfee. Although it’s best to call back or verify through another means of communication, a safe word can be especially helpful for young or elderly relatives who may otherwise be hard to reach.

Or just ask what they had for dinner

What if you don’t have a safe word ready and want to find out whether a distressing call is real? Pause for a moment and ask a personal question. “It could even be as simple as asking a question that only a loved one knows the answer to,” Grobman says. “It could be, ‘Hey, I want to make sure this is really you. Can you remind me what we had for dinner last night?’” Make sure the question is specific enough that a scammer couldn’t answer it correctly with an educated guess.

Understand that anyone’s voice can be imitated

Deepfake audio clones aren’t reserved for celebrities and politicians, like the robocalls in New Hampshire that used AI tools to sound like Joe Biden and discourage people from voting. “One misconception is, ‘It can’t happen to me. No one can clone my voice,’” says Rahul Sood, chief product officer at Pindrop, a security company that identified the likely origins of the AI Biden audio. “What people don’t realize is that with just five to 10 seconds of your voice, from a TikTok you may have created or a YouTube video from your professional life, that content can easily be used to create your clone.” With today’s AI tools, even the outgoing voicemail greeting on your smartphone may be enough to recreate your voice.

Don’t give in to emotional appeals

Whether it’s a pig butchering scam or an AI phone call, skilled scammers are able to earn your trust, create a sense of urgency, and find your weak points. “Be wary of any engagement where you’re experiencing a heightened sense of emotion, because the best scammers aren’t necessarily the most sophisticated technical hackers,” says Jabbara. “But they have a really good understanding of human behavior.” If you take a moment to reflect on a situation rather than acting on impulse, that could be the moment you avoid getting scammed.
