NEW DELHI: India’s internet users, a population that surpassed 720 million in December 2022 according to Nielsen’s India Internet Report 2023, may be vulnerable to a new type of voice-based cyber scam, in which scammers use artificial intelligence to replicate users’ voices and exploit them in attacks on unsuspecting individuals, a new report has found.
Cybersecurity firm McAfee revealed in a 1 May report that 47% of Indian users had either experienced an AI voice-cloning scam themselves or knew someone who fell victim to one between January and March.
The surge in AI voice-cloning scams coincides with growing interest in generative AI, in which algorithms process user inputs as text, images or voice and produce results based on the query and the specific platform.
On 9 January, for instance, Microsoft introduced Vall-E, a generative AI-based voice simulator that can replicate a user’s voice and generate responses in the user’s unique tonality from just a three-second audio sample.
Several similar tools, such as Sensory and Resemble AI, also exist. Scammers are now leveraging these tools to dupe users, with Indians topping the list of victims globally.
McAfee’s data showed that while up to 70% of Indian users are likely to respond to a voice message from friends or family asking for financial help, citing theft, an accident or another emergency, the figure is as low as 33% among users in Japan and France, 35% in Germany, and 37% in Australia.
Indian users also topped the list of those who regularly share some form of their voice online, whether in short videos on social media or in voice notes on messaging groups. Scammers are exploiting this by scraping users’ voice data, feeding it to AI algorithms, and generating cloned voices to carry out financial scams.
Steve Grobman, chief technology officer of McAfee, said in a statement that while targeted scams are not new, “the availability and access to advanced artificial intelligence tools is, and that’s changing the game for cybercriminals.”
“Instead of just making phone calls or sending emails or text messages, a cybercriminal can now impersonate someone using AI voice-cloning technology with very little effort. This plays on your emotional connection and a sense of urgency, to increase the likelihood of you falling for the scam,” he said.
The report added that 77% of AI voice scams led to some form of success for the scammers. More than a third of victims lost over $1,000 (around ₹80,000) in the first three months of this year, while 7% lost as much as $15,000 (around ₹1.2 million).
To be sure, security experts have warned that the advent of generative AI will give rise to new forms of security threats. On 16 March, Mark Thurmond, global chief operating officer of US-based cybersecurity firm Tenable, told Mint that generative AI will “open the door for potentially more risk, as it lowers the bar in regard to cyber criminals.” He added that AI threats such as voice cloning in phishing attacks will expand the “attack surface”, leading to “a large number of cyber attacks that leverage AI being created.”
In cybersecurity parlance, the attack surface refers to the set of points and vectors through which an attacker can target potential victims. An expanding attack surface creates greater security complications, since attacks become harder to track and trace, and more sophisticated, as in the use of AI to clone voices.
Sandip Panda, founder and chief executive of Delhi-based cybersecurity firm Instasafe, said that generative AI is helping create “increasingly sophisticated social engineering attacks, especially targeting users in tier-II cities and beyond.”
“A much larger number of users who may not have been fluent at drafting realistic phishing and spam messages can simply use one of the many generative AI tools to create social engineering drafts, such as impersonating an employee or a company, to target new users,” he added.