The next wave of cyber-crime scams will be devastating


Mark Gregory
Contributor

Australia is not prepared for the next wave of cyber-crime. The government has yet to implement regulations that would reduce the impact of what is coming. And Australian businesses and industry continue to lag behind international best practice.

Imagine what would happen if artificial intelligence and automated lifelike voice generators were weaponised by cyber-criminals and rogue nations. It would be devastating for many Australians, and the already too-high annual cost of cyber-crime would rise alarmingly.

In 2022, the cost of cyber-crime climbed to a record $3.1 billion, according to the Australian Competition and Consumer Commission.

The Australian Cyber Security Centre reported a rise in the average cost of cyber-crime to businesses in 2022: more than $39,000 for small businesses, $88,000 for medium-sized businesses and more than $62,000 for large businesses.

But, as with the ongoing data breaches at our major companies and organisations, the true cost may not be known for some years.

The elderly, the disadvantaged and those who have failed to heed government cyber security warnings will be the first targets of the next wave of cyber-crime and scams.

And in some of these scenarios, the victims won’t even be personally involved.

We already know that cyber-criminals will exploit any weakness, including such brazen acts as selling people’s homes whilst they’re away – or right out from under them – because of weaknesses in the way that real estate can be sold in this country.

What do you think is going to happen when cyber-criminals create artificial intelligence-driven bots that have been trained to use your voice?

With the information that has been gleaned about Australians from the many hundreds of data breaches, cyber-criminals can deploy cyber-crime bots against the companies you regularly deal with in order to commit fraud.

Too many Australian companies rely on security questions such as date of birth and address, and utilise outdated security mechanisms such as six-digit codes sent to mobiles and voice recordings.

Unfortunately, many Australians talk to scammers on the phone, even when they realise that the person they’re talking to is a scammer. They fail to realise that the conversations are being recorded and that the voice recordings will be used to train an artificial intelligence-driven bot equipped with a voice engine.

Articles in the media about the devastating impact of cyber-crime have become commonplace, and the messages they carry are important, but is this nation ready for what is about to happen?

Recently, I learned that an elderly person had been contacted by a scammer claiming to be from their internet service provider and to need access to their computer to fix a problem. The elderly person provided access because the hacker was convincing and named the correct internet service provider.

A family member called the elderly person’s mobile whilst the hacker was still connected through a landline, and was able to tell the elderly person to unplug the power to the computer and hang up.

That family member then spent several hours checking the computer to ensure that the hacker had not left malware.

The number of phone calls from scammers is increasing, and the amount of information they hold on their intended victims is growing.

The government is working with telecommunication companies to reduce the number of scam-related calls.

I wanted to learn more about the elderly person’s use of technology and asked if they used the computer for any financial transactions. While they viewed a share portfolio online, they did not carry out online share transactions, always went to a bank branch to transfer funds, and always called the same broker, who acted on their instructions.

At this point, you might imagine that this elderly person was relatively safe from scammers targeting computers and phones. But you would be wrong.

By gaining access to the mobile phone, a scammer can download the contact list and find the broker’s number.

This is where the artificial intelligence bot that has been trained using the elderly person’s voice becomes a weapon. Who is responsible if the broker receives a call from the bot, carries out transactions and sends funds to an alternative bank account at the bot’s instruction?

Who is responsible if the bot calls the elderly person’s mobile phone provider, correctly answers the security questions by providing the date of birth and address, and then orders phones and other devices to be shipped to another address as a present?

The potential for weaponising artificial intelligence bots that are voice trained is enormous in a country that is ill prepared.

Two-factor authentication using codes sent to a mobile is not acceptable if the mobile has been compromised. Lax security questions are ineffectual. Voice recordings of phone conversations may not be sufficient evidence.

It is time for government to hold an inquiry into the use of artificial intelligence and the potential for new forms of cyber-crime. It is vital that this nation be prepared for what is to come.

Mark Gregory is an Associate Professor in the School of Engineering at RMIT University
