Australian AI startup is creating fake victims to fool real scammers

A scammer places a call, confident he’s about to swindle another victim with a well-rehearsed script, perhaps posing as a bank official, a broadband technician, or a courier confirming a suspicious purchase.

On the line is someone who seems confused but cooperative, fumbling with tech terms or asking questions.

But the scammer doesn’t realise he’s been duped. The voice belongs not to a real person but to an artificial intelligence bot created by Australian cybersecurity startup Apate.ai, a synthetic “victim” designed to waste the scammer’s time and learn how the con works.

Named after the Greek goddess of deceit, Apate.ai is deploying the same technology scammers increasingly use to deceive their targets. It aims to turn AI into a defensive weapon, thereby undermining fraudsters while protecting potential victims, according to Nikkei.

Bots with personality

Apate Voice, one of the company’s key tools, generates lifelike phone personas that mimic human behaviour, complete with varying accents, age profiles, and temperaments. Some sound tech-savvy but distracted, others confused or overly chatty.

They respond in real time, engaging with scammers to keep them talking, disarm them, and collect valuable intelligence on scam operations.

A companion product, Apate Text, handles fraudulent messages, while Apate Insights compiles and analyses data from interactions, identifying tactics, impersonated brands, and even specific scam details, such as bank accounts or phishing links.

Apate’s systems can distinguish legitimate calls from potential scams in under ten seconds. If a call is wrongly flagged, it’s quickly rerouted back to the telecommunications provider.

Small team, global impact

Based in Sydney, Apate.ai was co-founded by Professor Dali Kaafar, head of cybersecurity at Macquarie University. The idea emerged during a family outing interrupted by a scam call, a moment that sparked the question: What if AI could be used to strike back?

With just 10 employees, the startup has partnered with major institutions, including Australia’s Commonwealth Bank, and is trialling its services with a national telecom provider.

The company’s technology is already in use across Australia, the UK, and Singapore, handling tens of thousands of calls while collaborating with governments, financial institutions, and crypto exchanges.

Chief Commercial Officer Brad Joffe says the goal is to be “the perfect victim” – convincing enough to keep scammers engaged, yet smart enough to extract information.

A booming scam economy

The need is urgent. According to the Global Anti-Scam Alliance’s 2024 report, scammers stole over $1 trillion worldwide in 2023 alone. Fewer than 4% of victims were able to recover their losses fully.

Much of the fraud originates from scam centres in Southeast Asia, often linked to organised crime and human trafficking. Meanwhile, scammers are adopting sophisticated AI tools to mimic voices, impersonate loved ones, and deepen deception.

In the UK, telecom provider O2 has introduced its own AI decoy, a digital “granny” named dAIsy, who responds with rambling anecdotes about her cat, Fluffy.

With threats evolving rapidly, Kaafar and his team believe AI must play an equally dynamic role in defence. “If they’re using it as a sword, we need it as a shield,” Joffe says.
