
How cyber criminals use ChatGPT to make better scams

John Davidson, Columnist

Cyber criminals are already using generative artificial intelligence to craft more persuasive scams, but the greater threats the technology poses are still emerging, the country’s inaugural cyber coordinator says.

“By 2030 the global cyber threat environment is going to look very different to how it looks today,” Air Marshal Darren Goldie told The Australian Financial Review Cyber Summit.

“AI and quantum [computing] are coming like a freight train, presenting legislative and security challenges we’ve seen play out around the world.”

AI threats to cybersecurity are coming at Australia like a freight train, says National Cyber Security Coordinator Air Marshal Darren Goldie. Photo: Peter Rae

Should one powerful enough ever be built, a quantum computer could theoretically crack almost all the encryption that has secured financial and other data systems for decades.

But while that threat has yet to be realised, artificial intelligence is already changing the nature of cybercrime.


Ben Doyle, the chief information officer at defence and transport manufacturer Thales Group, said cyber criminals were already adopting AI tools such as ChatGPT “where they can”.

“When you look at spam and phishing and things like that, the writing style is a lot better because they are all using ChatGPT to write the phishing links.

“They’re not just doing it in English anymore, they’re also automatically translating it, so a lot more countries with different [languages] are actually seeing an increase in phishing [attacks]. It used to be focussed on English, now they’re doing it in Spanish, Vietnamese, because it’s easier for them to translate, giving them a greater threat capability,” he said.

Paul Jevtovic, the chief financial crime risk officer and group money laundering reporting officer at National Australia Bank, said even when the victim of a phishing attack suspected they were under attack, cyber criminals were using AI tools to stay a step ahead of them.

“The business email compromise is so sophisticated now that even when an individual suspects that they may have been compromised, and they go to make a phone call, organised crime can mimic [a voice] in under 30 seconds via tools that exist to them now.

“Even the ability to check when you’re being compromised, organised crime is already a step ahead of us,” he said.


But AI isn’t just a tool for criminals, said Cyber Security Minister Clare O’Neil.

“Machine learning and AI are going to create more pervasive and complex threats, but they’re also going to build new tools to help us manage them,” she said.

Rapidly sharing threat information across business and government will be a core part of the 2023-2030 Australian Cyber Security Strategy, and AI-based tools that can detect threats “at wire speed” will be an integral part of that response, she said.


