10/22/2023

Headline, October 23 2023/ ''' THE CYBERCROOKS TAP '''



ARTIFICIAL INTELLIGENCE COPIES YOUR VOICE - then calls your bank. It is becoming ever more certain that cybercrooks have a potent new weapon for getting at your money.

This spring, Clive Kabatznik, an investor in Florida, called his local Bank of America representative to discuss a big money transfer he was planning to make. Then he called again.

Except the second phone call wasn't from Mr. Kabatznik. Rather, a software program had artificially generated his voice and tried to trick the banker into moving the money elsewhere.

Mr. Kabatznik and his banker were the targets of a cutting-edge fraud attempt that has grabbed the attention of cybersecurity experts: the use of artificial intelligence to generate voice deepfakes, or vocal renditions that mimic real people's voices.

The problem is still new enough that there is no comprehensive accounting of how often it happens.

But one expert, whose security company, Pindrop, monitors the audio traffic for many of the largest U.S. banks, said he had seen a jump in the prevalence of such attempts this year - and in the sophistication of the scammers' voice fraud.

Another large voice authentication vendor, Nuance, reported the first successful deepfake attack on a financial services client late last year.

In Mr. Kabatznik's case, the fraud was detectable. But the speed of technological development, the falling cost of generative artificial intelligence programs and the wide availability of recordings of people's voices on the internet have created the perfect conditions for voice-related A.I. crimes.

Customer data, including bank account details that have been stolen by hackers - and are widely available on underground markets - help criminals pull off these attacks. The attacks become even easier against wealthy clients, whose public appearances, including speeches, are often widely available on the internet.

Finding audio samples for everyday customers can also be as easy as conducting an online search - say, on social media apps such as TikTok and Instagram - for the name of someone whose bank account information the cyber-criminals already have.

'' There's a lot of audio content out there,'' said Vijay Balasubramaniyan, the chief executive and founder of Pindrop, which reviews automatic voice-verification systems for eight of the 11 largest lenders in the United States.
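Conceptually, an automated voice-verification system of the kind Pindrop reviews compares the caller's voice to a voiceprint enrolled on file. The sketch below only illustrates that idea and is not any vendor's actual method: the embed() stub stands in for a trained speaker-embedding model, and the 0.85 threshold is an arbitrary assumption.

```python
# A minimal sketch of threshold-based voice verification: turn audio
# into a fixed-length "voiceprint" and compare it to the enrolled one
# with cosine similarity. embed() is a crude stand-in for a trained
# speaker-embedding model; real systems are far more sophisticated.
import numpy as np

def embed(audio: np.ndarray, dims: int = 128) -> np.ndarray:
    # Stand-in voiceprint: unit-normalized magnitudes of the first
    # `dims` FFT bins of the signal.
    spectrum = np.abs(np.fft.rfft(audio))[:dims]
    return spectrum / (np.linalg.norm(spectrum) + 1e-9)

def same_speaker(enrolled: np.ndarray, caller: np.ndarray,
                 threshold: float = 0.85) -> bool:
    # Cosine similarity of unit vectors reduces to a dot product.
    return float(np.dot(embed(enrolled), embed(caller))) >= threshold

# Usage: a caller whose audio is close to the enrolled sample passes.
rng = np.random.default_rng(0)
enrolled_audio = rng.standard_normal(16_000)       # ~1 second at 16 kHz
caller_audio = enrolled_audio + 0.05 * rng.standard_normal(16_000)
print(same_speaker(enrolled_audio, caller_audio))  # True
```

The point of the deepfake threat is precisely that a good synthetic rendition can push that similarity score above whatever threshold a system uses.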

Over the past decade, Pindrop has reviewed recordings of more than five billion calls coming into call centers run by financial companies it serves. The centers handle products such as bank accounts, credit cards and other services offered by big retail banks.

All of the call centers receive calls from fraudsters, typically ranging from 1,000 to 10,000 a year. It's common for 20 calls to come in from fraudsters each week, Mr. Balasubramaniyan said.

So far, fake voices created by computer programs account for only '' a handful '' of these calls, he said - and they have begun to happen only within the past year.

Most of the fake voice attacks that Pindrop has spotted have come into credit card service call centers, where human representatives deal with customers needing help with their cards.

Mr. Balasubramaniyan played a reporter an anonymized recording of one such call that took place in March. Although the voice in this case sounds robotic, more like an e-reader than a person, the call illustrates ways that crimes could occur as A.I. makes it easier to imitate human voices.

A banker greets the customer. Then the voice, similar to an automated one, says, '' My card was declined.''

'' May I ask whom I have the pleasure of speaking with?'' the banker replies.

 '' My card was declined,'' the voice says again. The banker asks for the customer's name again.

A silence ensues, during which the faint sound of keystrokes can be heard. According to Mr. Balasubramaniyan, the number of keystrokes corresponds to the number of letters in the customer's name. The fraudster is typing words into a program that then reads them aloud.
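As an illustration of how low the technical barrier is, the loop below uses the off-the-shelf pyttsx3 Python library to read typed text aloud with a built-in system voice. This is only a plausible sketch of the type-and-speak pattern the recording suggests - it is not the attacker's actual tooling, and mimicking a specific person's voice requires considerably more than this.

```python
# A type-and-speak loop: whatever the operator types is synthesized
# and played with a built-in system voice (robotic, like an e-reader).
# Uses pyttsx3, an off-the-shelf offline text-to-speech library.
import pyttsx3

engine = pyttsx3.init()           # pick up the system's default voice
engine.setProperty("rate", 150)   # speaking rate in words per minute

while True:
    phrase = input("> ")          # e.g. "My card was declined."
    if not phrase:
        break                     # empty line ends the session
    engine.say(phrase)            # queue the typed text
    engine.runAndWait()           # synthesize and play it aloud
```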

In this instance, the caller's synthetic speech led the employee to transfer the call to a different department and flag it as potentially fraudulent, Mr. Balasubramaniyan said.

The Honour and Serving of the Latest Global Operational Research on Cybercrooks and Technology continues. The World Students Society thanks authors Emily Flitter and Stacy Cowley.

With most respectful dedication to the Global Founder Framers of !WOW!, and then Students, Professors and Teachers of the world. See Ya all prepare for Great Global Elections on !WOW! : wssciw.blogspot.com and Twitter X !E-WOW! - The Ecosystem 2011 :

Good Night and God Bless

SAM Daily Times - the Voice of the Voiceless
