Don’t fall for voice clone scams
Fraudsters are using AI (artificial intelligence) to clone voices and carry out phone scams. They need only a few words or sentences to mimic the speech patterns of people their victims know, such as family members or co-workers. Cloned voices are then used to scam vulnerable family members and other victims out of money. A common technique is asking targets to quickly send money or use a credit card to help a family member or friend in an emergency.
Action Steps
- PAUSE before you respond to callers, even if they sound familiar. If a caller sounds upset and is asking for urgent help or money, treat that as a red flag.
- VERIFY who they are using information only you would know, and ask questions to confirm you are dealing with a family member or friend, not a fraudster. Set up safe words with family and friends ahead of time, and agree on the process you will use if they ever need money in an urgent situation.
- REPORT to law enforcement or other authorities if you determine that a fraudster contacted you using a cloned voice.