New Delhi | Thursday, May 9, 2024

Deep fakes: Beware of fake audio calls that can mimic voices of your loved ones


Even as the deepfake videos of film actress Rashmika Mandanna and of Prime Minister Narendra Modi dancing shocked the country, cybersecurity experts warn of a much deeper crisis ahead.


This time, hackers could use artificial intelligence to create deepfake voices, which are even more difficult to tell apart from the real thing, cybersecurity experts at Kaspersky said.

They cited the recent example of a new Beatles song created using artificial intelligence by combining parts of an old recording. While this positive use of AI brought cheers to Beatles lovers, it is time we pay attention to the darker side of the technology: its ability to create deepfake voices.

Voice deepfakes

What are voice deepfakes capable of? Imagine you get a phone call in a voice akin to your parent's, your brother's or your best friend's. Or imagine someone records a fake message using a celebrity's voice. It can create havoc, as it is very difficult for ordinary people to distinguish a fake voice from the original.

“OpenAI recently demonstrated an Audio API model that can generate human speech from input text. So far, only this OpenAI software comes close to real human speech. In the future, such models could also become a new tool in the hands of attackers,” an expert from Kaspersky said.

The Audio API can read out a specified text, and users can choose which of the suggested voice options the text will be pronounced with. The OpenAI model, in its existing form, cannot be used to create deepfake voices, but it is indicative of the rapid development of voice-generation technologies.
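To illustrate the workflow the experts describe, here is a minimal sketch, using only Python's standard library, of the kind of text-to-speech request OpenAI's Audio API accepts: the caller supplies the text and selects one of the preset voices. The endpoint and field names follow OpenAI's public documentation; actually sending the request would require a real API key, so this sketch only prepares it.

```python
import json
import urllib.request

# Preset voice options offered by the Audio API at the time of writing.
VOICES = ("alloy", "echo", "fable", "onyx", "nova", "shimmer")

def build_speech_request(text: str, voice: str, api_key: str) -> urllib.request.Request:
    """Prepare (but do not send) a text-to-speech request to the Audio API."""
    if voice not in VOICES:
        raise ValueError(f"unknown voice: {voice!r}")
    payload = {"model": "tts-1", "voice": voice, "input": text}
    return urllib.request.Request(
        "https://api.openai.com/v1/audio/speech",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Sending the prepared request with urllib.request.urlopen(...) would return
# the synthesized speech as audio bytes (MP3 by default).
```

The point of the sketch is how little the caller controls: a short text, a voice label, a key. The service picks a stock voice rather than cloning a specific person, which is why, as the article notes, this particular model cannot yet produce deepfake voices.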

“In the last few months, more and more tools for generating a human voice have been released. Previously, users needed basic programming skills, but now it is becoming easier to work with these tools. In the near future, we can expect models that combine simplicity of use with quality of results,” he said.

How to protect yourself?

“For now, the best way to protect yourself is to listen carefully to what your caller says to you on the telephone. If the recording is of poor quality, has noises, and the voice sounds robotic, this is enough not to trust the information you hear,” Dmitry Anikin, Senior Data Scientist at Kaspersky, said.

“Nevertheless, you need to be aware of possible threats and be prepared for advanced deepfake fraud becoming a new reality in the near future,” he said.

Another good way to test the veracity of a caller is to ask unexpected personal questions, such as which books they read or which colours they like.
