
Phone scams: don't trust that voice. It's artificial intelligence

On the phone it sounds like a relative, a friend, or a famous person, but it isn't. Artificial intelligence can faithfully reproduce the characteristics of a voice, and the danger of scams is more concrete than ever.

Reproducing the human voice with incredible fidelity is the latest frontier of telephone scams. Or rather, their evolution. Remember when the Russian opposition figure Alexei Navalny tricked the Russian secret services by imitating the voice of their superior, and got an agent to tell him who had poisoned him and how? The case caused serious embarrassment for the Kremlin, and more than one head rolled for the humiliation. Officials of the FSB, heir to the notorious KGB, justified themselves by saying that the voice was exactly like their boss's. The dissident was indeed a good impersonator, but today his talent would not be needed: with the new tools of artificial intelligence, appropriating someone else's identity and vocal timbre is child's play. Massimo Moratti knows something about it: last February the former president of Inter transferred 890,000 euros to the account of a gang of scammers who had convinced him with a phone call from a fake Defense Minister Guido Crosetto, who asked the entrepreneur for the money as a supposed ransom for the release of non-existent journalists kidnapped in the Middle East. The Guardia di Finanza later recovered the loot, but in most cases such scams succeed.

Online there are now numerous platforms, available to anyone, that thanks to simple voice-cloning software or advanced text-to-speech systems make it possible to digitally reproduce a voice almost identical to the original, complete with inflections, accents and everything else needed to deceive the unsuspecting. To clone a voice, short authentic audio clips of just a few seconds are enough: a voice message sent via WhatsApp or, in the case of public figures, a recording of a speech at an event. These tools, available in both free and paid versions, are all too easy to use and can generate artificial spoken phrases within moments and at negligible cost.

An example is Speechify Voice Cloning, which uses sophisticated deep-learning algorithms to produce extremely realistic audio that can be used to dupe the designated victims. The system needs to analyze a recording of just 30 seconds to obtain a synthetic and credible copy of the voice, with endless extra options: changing its intonation and rhythm, adding or removing pauses, even simulating distress or emotion, so as to customize each message and induce the listener to believe the "synthetic" words. The same goes for Veed.io, designed specifically to capture a voice: just enter a text and you get a voice-over that faithfully reproduces the recorded timbre. Vidnoz does likewise with its Voice Changer, which also offers a vast library of over 100 preset voices and supports more than 140 different languages.
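How low is the barrier in practice? A minimal sketch using the open-source Coqui TTS library and its XTTS v2 voice-cloning model gives an idea. This is purely an illustration, not one of the commercial tools named above, and the file names are hypothetical:

# Sketch: voice cloning with the open-source Coqui TTS library (XTTS v2).
# "reference_clip.wav" is a hypothetical short sample of the target voice,
# e.g. a few seconds extracted from a forwarded voice message.
from TTS.api import TTS

# Load a pretrained multilingual voice-cloning model
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# A single short reference clip is all the model needs to speak
# arbitrary text in the cloned voice, in the language of choice
tts.tts_to_file(
    text="This is a demonstration of synthetic speech.",
    speaker_wav="reference_clip.wav",
    language="en",
    file_path="cloned_voice.wav",
)

A handful of lines like these, run on an ordinary laptop, produce an audio file in the target voice within moments, which is exactly why the short authentic clips mentioned above are so valuable to scammers.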

Obviously, these technological solutions were not conceived to defraud, but improper use is spreading. The days of innocent prank calls to skip school or work are long gone, like those immortalized in Paolo Villaggio's films: "Fantozzi, do the Swedish accent!". Today AI can perfectly imitate the voice of Tom Cruise or Giorgia Meloni and translate it into every language in the world, for the price of a metro pass and in a matter of moments. In jargon, it's called a "voice scam".

In Italy, the voice scams currently most exploited are the "emotional extortions", which specifically target the victims' families. The classic scam of this type consists of calling a close relative, or sending them a pre-recorded voice message, simulating an emergency: "Mom, I've been robbed, please send money to this account, I'll explain later...". In other cases the emotional lever is a fake work emergency aimed at employees who handle administration: "Hi, it's me (the boss, ed.), I need you to make a transfer to these coordinates right away... it's an important client and we can't afford to lose them." Or a request for bank account details over a fake road accident: "I need the credit card codes to pay the tow truck, please hurry, I'm on the highway...".

Since the voice is genuinely identical to the real one, the credibility and success of the scam depend mainly on the use of the right words, the speed of the message and the element of surprise: by exploiting time pressure and the suggestibility of a family member who receives bad news and wants to help, it is easy to hit the mark. This applies above all to the most vulnerable and least tech-savvy people, such as the elderly. Not surprisingly, in 2023 the over-65s were victims of financial scams for an estimated total of 559.4 million euros, a figure calculated by FABI, the banking union. According to its study, online scams netted digital criminals 114 million euros in 2022, rising to 181 million in 2024 (+58 percent). A trend that is unfortunately destined to continue, given the alarming potential of AI.

So how can you protect yourself? First of all, by recognizing the warning signs: urgent requests for money, demands for absolute secrecy (that is, not informing other relatives) and unusual payment methods (bank transfer, gift vouchers or cryptocurrencies) are almost always a sign of fraud. If in doubt, it also helps to hang up and call the relative back on their usual number (an unknown number is often synonymous with a scam). Finally, it is useful to agree on a predetermined code word to verify the authenticity of the call and its content.