The Specter of AI Scam Calls

As with every new technology, AI is full of wondrous possibilities, along with a host of new opportunities for scammers. And the more sophisticated AI's ability to mimic human beings becomes, the harder the scams will be to detect.

Anyone who saw the Obi-Wan Kenobi Star Wars series (and probably a host of other big-budget blockbusters) knows about the more benign AI applications. In the case of Star Wars, Disney used AI to recreate Darth Vader's voice, since James Earl Jones is getting long in the tooth.

But there’s an insidious side to this anthro-mimicry.

Wired lays out an especially frightening scenario…

You receive a frantic call from a friend or family member…it’s from their actual number, so you don’t give it a second thought. On the other end, your friend or loved one says they’ve been in a horrible accident (or any number of emergency scenarios) and they need an immediate cash transfer.

Perhaps you acquiesce and send the money, or maybe the scenario raises a bunch of red flags. You hang up and call the person back, and they have no clue what you’re talking about. If you got that far, then you’ve defeated the AI scam call.

More than likely, the voice on the other end sounded a bit stilted, or the mere prospect of your loved one suddenly hitting you up for cash seemed fishy. That's the good news.

The bad news is the tech is getting increasingly sophisticated, and before long, malicious actors will be able to duplicate the sound, emotion, and pitch of someone’s voice nearly perfectly.

OpenAI (of ChatGPT fame) has already announced a new text-to-speech model, and once that makes its way into the wild, it could be a huge boon for scammers.

Wired recommends some tried-and-true analog defenses, like calling the person back, agreeing on an emergency safe word, or asking a question only your real loved one could answer.

Make sure you do something to stay ahead of the rapidly evolving malevolent capabilities of AI.