AI Scam Calls Imitating Familiar Voices Are a Growing Problem – Here’s How They Work

Anyone in possession of audio recordings of your voice could use deepfake algorithms to make "you" say whatever they want.

By Press Trust of India | Updated: 18 July 2023 11:08 IST
Highlights
  • The technology to create an audio deepfake is becoming common
  • This capability risks increasing the prevalence of audio misinformation
  • It can be used to try to influence public opinion

Deepfakes have gained notoriety over the last few years (Photo Credit: Pixabay/Geralt)

Scam calls using AI to mimic the voices of people you might know are being used to exploit unsuspecting members of the public. These calls use what's known as generative AI, which refers to systems capable of creating text, images, or any other media such as video, based on prompts from a user.

Deepfakes have gained notoriety over the last few years with a number of high-profile incidents, such as actress Emma Watson's likeness being used in a series of suggestive adverts that appeared on Facebook and Instagram.

There was also the widely shared – and debunked – video from 2022 in which Ukrainian president Volodymyr Zelensky appeared to tell Ukrainians to “lay down arms”.


Now, the technology to create an audio deepfake, a realistic copy of a person's voice, is becoming increasingly common. To create a realistic copy of someone's voice you need data to train the algorithm. This means having lots of audio recordings of your intended target's voice. The more examples of the person's voice that you can feed into the algorithms, the better and more convincing the eventual copy will be.

Many of us already share details of our daily lives on the internet. This means the audio data required to create a realistic copy of a voice could be readily available on social media. But what happens once a copy is out there?

What is the worst that can happen?

A deepfake algorithm could enable anyone in possession of the data to make “you” say whatever they want. In practice, this can be as simple as writing out some text and getting the computer to say it out loud in what sounds like your voice.
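
To illustrate how simple that step can be, here is a minimal text-to-speech sketch in Python. It uses the off-the-shelf pyttsx3 library and a stock system voice rather than a cloned one; a voice deepfake works on the same principle, but swaps the generic synthetic voice for a model trained on recordings of the target. The spoken line is an invented example.

    # A minimal text-to-speech sketch using pyttsx3 and a stock system voice.
    # A voice deepfake works on the same principle, but replaces the generic
    # synthetic voice with a model trained on recordings of the target.
    import pyttsx3

    engine = pyttsx3.init()          # uses the operating system's built-in TTS engine
    engine.setProperty("rate", 160)  # speaking speed in words per minute
    engine.say("Hi Mum, it's me. I need you to send money right away.")
    engine.runAndWait()              # blocks until the audio has finished playing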


Major challenges

This capability risks increasing the prevalence of audio misinformation and disinformation. It can be used to try to influence international or national public opinion, as seen with the “videos” of Zelensky.

But the ubiquity and availability of these technologies pose significant challenges at a local level too – particularly in the growing trend of “AI scam calls”. Many people will have received a scam or phishing call telling them, for example, that their computer has been compromised and they must immediately log in, potentially giving the caller access to their data.


It is often very easy to spot that this is a hoax, especially when the caller is making requests that someone from a legitimate organisation would not. However, now imagine that the voice on the other end of the phone is not just a stranger, but sounds exactly like a friend or loved one. This injects a whole new level of complexity, and panic, for the unlucky recipient.

A recent story reported by CNN highlights an incident where a mother received a call from an unknown number. When she answered the phone, it was her daughter. The daughter had allegedly been kidnapped and was phoning her mother to pass on a ransom demand.


In fact, the girl was safe and sound; the scammers had made a deepfake of her voice. This is not an isolated incident. Variations of the scam include a supposed car accident, in which the victim calls their family for money to help them out after a crash.

Old trick using new tech

This is not a new scam in itself; the term “virtual kidnapping scam” has been around for several years. It can take many forms, but a common approach is to trick victims into paying a ransom to free a loved one they believe is being threatened.

The scammer tries to establish unquestioning compliance, in order to get the victim to pay a quick ransom before the deception is discovered. However, the dawn of powerful and available AI technologies has upped the ante significantly – and made things more personal. It is one thing to hang up on an anonymous caller, but it takes real confidence in your judgment to hang up on a call from someone sounding just like your child or partner.

There is software that can be used to identify deepfakes; it creates a visual representation of the audio called a spectrogram. When you are listening to the call it might seem impossible to tell the voice apart from the real person, but voices can be distinguished when their spectrograms are analysed side by side. At least one group has offered detection software for download, though such solutions may still require some technical knowledge to use.
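
For the technically inclined, here is a minimal sketch of that side-by-side comparison in Python, using the scipy and matplotlib libraries. The two WAV file names are placeholders: substitute a known-genuine recording of the speaker and the suspect audio.

    # A minimal sketch of the side-by-side spectrogram comparison described
    # above. The two WAV file names are placeholders.
    import numpy as np
    import matplotlib.pyplot as plt
    from scipy.io import wavfile
    from scipy.signal import spectrogram

    def plot_spectrogram(ax, path, title):
        rate, samples = wavfile.read(path)    # sample rate (Hz) and PCM samples
        if samples.ndim > 1:                  # mix stereo down to mono
            samples = samples.mean(axis=1)
        freqs, times, power = spectrogram(samples, fs=rate)
        ax.pcolormesh(times, freqs, 10 * np.log10(power + 1e-12))  # power in dB
        ax.set(title=title, xlabel="Time (s)", ylabel="Frequency (Hz)")

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
    plot_spectrogram(ax1, "genuine_voice.wav", "Known genuine recording")
    plot_spectrogram(ax2, "suspect_voice.wav", "Suspect recording")
    plt.tight_layout()
    plt.show()

Note that this only visualises the two recordings; actually spotting the differences still takes a practised eye, which is why such tools demand some technical knowledge.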

Most people will not be able to generate spectrograms, so what can you do when you are not certain that what you are hearing is the real thing? As with any other form of media you might come across: be skeptical.

If you receive a call from a loved one out of the blue and they ask you for money or make requests that seem out of character, call them back or send them a text to confirm you really are talking to them.

As the capabilities of AI expand, the lines between reality and fiction will increasingly blur. And it is not likely that we will be able to put the technology back in the box. This means that people will need to become more cautious. 

