Be Wary of the Unexpected Call: Protecting Yourself from AI Voice Scams

Imagine it’s late one night and you’re in bed when your phone buzzes. It’s your mom, who never calls this late, so you answer. All you can hear is what sounds like your mom’s voice, pleading not to do something. You start to panic; you’re confused and scared. Then you hear a man’s voice telling you to do exactly what he says. Is this really happening, or is it AI replicating a loved one’s voice?


Artificial intelligence (AI) can be a great aid for many tasks, like forecasting the weather, searching the internet, writing e-mails, and even helping write this newsletter. But how do we protect ourselves from AI scams, and how do we tell the difference? AI-generated voices can sound eerily convincing, making it harder to discern truth from trickery. They can speak multiple languages, make realistic breathing sounds, mimic a loved one, and may even reference personal information.


Here’s how to stay vigilant

Don’t Panic. Verify! AI scammers rely on urgency and emotional manipulation, and they love to create a sense of panic. If you receive a call or message from someone claiming to be in trouble (a loved one needing bail money, a government official warning of legal action, even your mother being held hostage), take a deep breath.


Do not act on impulse. Instead, hang up immediately and verify the information by contacting your friend or family member directly at a number, or on a device, you know to be real. Remember, too, that some scammers can make it look as though they are calling from your loved one’s phone number using a technique known as “spoofing,” which is why it is important to hang up and initiate your own call. Avoid using any contact details provided in the message itself, and never click on links that are sent or otherwise provided to you.


Listen for the Unnatural: AI-generated voices, while becoming more sophisticated, can still sound slightly off. Pay attention to unnatural pauses, odd inflections, or robotic-sounding speech.


What to do if you think you’ve received an AI voice scam call

If you’re at work, contact your IT Help Desk. If you’re at home and receive a scam call, report it to the Federal Trade Commission (FTC) at ReportFraud.ftc.gov. Reports help the FTC track scam tactics and develop new ways to protect consumers.


In short, knowledge is power. Educate yourself and your loved ones about AI voice scams. Discuss red flags like unexpected calls, requests for immediate action, or demands for money transfers. Encourage everyone to be cautious about sharing personal information online, since that information can be used to create more convincing voice forgeries. Enable two-factor authentication on all accounts that offer it; adding this extra layer of security is among the most important steps you can take to protect yourself.


Learn more about AI voice scams (also known as “vishing”), including examples, guidance, and advice, in the articles below. By staying alert, remaining calm during suspicious calls, and verifying information independently, you can shield yourself from the manipulative tactics of AI voice scams.


Additional Resources

https://www.newyorker.com/science/annals-of-artificial-intelligence/the-terrifying-ai-scam-that-uses-your-loved-ones-voice 

https://www.npr.org/2024/01/25/1198909897/1a-draft-01-25-2024 

https://www.theguardian.com/technology/article/2024/may/10/ceo-wpp-deepfake-scam 

Learn more about how to avoid falling for scams at https://tech.rochester.edu/security/scams/