AI Scam Calls Imitating Familiar Voices Are a Growing Problem – Here’s How They Work

Anyone in possession of audio recordings of your voice could use deepfake algorithms to make "you" say whatever they want.

By Press Trust of India | Updated: 18 July 2023 11:08 IST
Highlights
  • The technology to create an audio deepfake is becoming common
  • This capability risks increasing the prevalence of audio misinformation
  • It can be used to try to influence public opinion
[Image: Deepfakes have gained notoriety over the last few years. Photo Credit: Pixabay/Geralt]

Scam calls using AI to mimic the voices of people you might know are being used to exploit unsuspecting members of the public. These calls use what's known as generative AI, which refers to systems capable of creating text, images, or any other media such as video, based on prompts from a user.

Deepfakes have gained notoriety over the last few years with a number of high-profile incidents, such as actress Emma Watson's likeness being used in a series of suggestive adverts that appeared on Facebook and Instagram.

There was also the widely shared – and debunked – video from 2022 in which Ukrainian President Volodymyr Zelensky appeared to tell Ukrainians to "lay down arms".

Now, the technology to create an audio deepfake, a realistic copy of a person's voice, is becoming increasingly common. To create a realistic copy of someone's voice you need data to train the algorithm. This means having lots of audio recordings of your intended target's voice. The more examples of the person's voice that you can feed into the algorithms, the better and more convincing the eventual copy will be.


Many of us already share details of our daily lives on the internet. This means the audio data required to create a realistic copy of a voice could be readily available on social media. But what happens once a copy is out there?

What is the worst that can happen?

A deepfake algorithm could enable anyone in possession of the data to make “you” say whatever they want. In practice, this can be as simple as writing out some text and getting the computer to say it out loud in what sounds like your voice.


Major challenges

This capability risks increasing the prevalence of audio misinformation and disinformation. It can be used to try to influence international or national public opinion, as seen with the “videos” of Zelensky.

But the ubiquity and availability of these technologies pose significant challenges at a local level too – particularly in the growing trend of "AI scam calls". Many people will have received a scam or phishing call telling them, for example, that their computer has been compromised and they must immediately log in, potentially giving the caller access to their data.


It is often very easy to spot that this is a hoax, especially when the caller is making requests that someone from a legitimate organisation would not. However, now imagine that the voice on the other end of the phone is not just a stranger, but sounds exactly like a friend or loved one. This injects a whole new level of complexity, and panic, for the unlucky recipient.

A recent story reported by CNN highlights an incident where a mother received a call from an unknown number. When she answered the phone, it was her daughter. The daughter had allegedly been kidnapped and was phoning her mother to pass on a ransom demand.

In fact, the girl was safe and sound. The scammers had made a deepfake of her voice. This is not an isolated incident, with variations of the scam including a supposed car accident, where the victim calls their family for money to help them out after a crash.

Old trick using new tech

This is not a new scam in itself; the term "virtual kidnapping scam" has been around for several years. It can take many forms, but a common approach is to trick victims into paying a ransom to free a loved one they believe is being threatened.

The scammer tries to establish unquestioning compliance, in order to get the victim to pay a quick ransom before the deception is discovered. However, the dawn of powerful and available AI technologies has upped the ante significantly – and made things more personal. It is one thing to hang up on an anonymous caller, but it takes real confidence in your judgment to hang up on a call from someone sounding just like your child or partner.

There is software that can be used to identify deepfakes; it creates a visual representation of the audio called a spectrogram. When you are listening to the call it might seem impossible to tell it apart from the real person, but voices can be distinguished when spectrograms are analysed side-by-side. At least one group has offered detection software for download, though such solutions may still require some technical knowledge to use.
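To illustrate the idea behind spectrogram analysis, here is a minimal sketch in Python using scipy. It is not a deepfake detector – the two synthetic tones simply stand in for two recordings whose dominant frequencies differ, which is the kind of difference a side-by-side spectrogram comparison can reveal even when two voices sound alike to the ear. The signals, sample rate, and window length are all illustrative assumptions.

```python
import numpy as np
from scipy.signal import spectrogram

fs = 16_000                      # assumed sample rate in Hz
t = np.arange(fs) / fs           # one second of audio

# Two synthetic "recordings": pure tones at slightly different pitches,
# standing in for a genuine voice and a cloned copy of it.
genuine = np.sin(2 * np.pi * 220 * t)
cloned = np.sin(2 * np.pi * 235 * t)

def peak_frequency(signal, fs):
    """Return the frequency (Hz) carrying the most energy in a spectrogram."""
    # nperseg=2048 gives ~7.8 Hz frequency resolution at fs=16 kHz.
    freqs, times, Sxx = spectrogram(signal, fs=fs, nperseg=2048)
    # Average energy per frequency bin across time, then locate the peak.
    return freqs[np.argmax(Sxx.mean(axis=1))]

print(peak_frequency(genuine, fs))  # near 220 Hz
print(peak_frequency(cloned, fs))   # near 235 Hz
```

Real detection tools compare far subtler spectral features than a single peak, but the principle is the same: differences invisible to the ear can show up clearly once the audio is laid out as a spectrogram.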

Most people will not be able to generate spectrograms, so what can you do when you are not certain that what you are hearing is the real thing? As with any other form of media you might come across: be skeptical.

If you receive a call from a loved one out of the blue and they ask you for money or make requests that seem out of character, call them back or send them a text to confirm you really are talking to them.

As the capabilities of AI expand, the lines between reality and fiction will increasingly blur. And it is not likely that we will be able to put the technology back in the box. This means that people will need to become more cautious. 

