AI Voice Scams
As if we didn’t have enough scams to worry about, fast-growing AI technology has given us even more. New AI platforms have made voice cloning so simple that criminals can easily use it for online and phone scams.
A recent McAfee survey (Download the report here) of over 7,000 people found that 25% had either experienced an AI voice scam personally or knew someone who had. Many lost money as a result.
Scammers extract voice data from online activity such as social media posts. Using AI technology, they clone the voice and then create a fake voicemail or place a distressing call to the victim’s contacts. The cloned voices are almost indistinguishable from the real thing.
The technology is so new that many people don’t even know it exists, so a distress message that sounds like a loved one is easy to fall for. The cost is significant, too: more than 30% of victims lost over $1,000, and 7% lost between $5,000 and $15,000.
Here are a few ways to avoid falling for an AI voice scam directly:
- Don’t let your emotions take over. Hearing a distress call from a loved one can send you into a tailspin. Take a moment to verify what the caller is telling you, or hang up and call the person directly.
- Ask the caller a specific question, such as “Can you confirm my son’s name?” or “What is your dad’s birthday?” This can catch a scammer off guard: generating a new AI response takes time, and a delayed answer is grounds for suspicion.
- Use a family code-word. These days, every security measure helps. Agree on a code-word with family members and close, trusted friends, and make sure everyone knows to ask for it whenever someone calls asking for help.
For more information, download the McAfee report PDF here: Link to report