Australia bets on facial recognition for problem gamblers

A person gambles on a poker machine at a pub in Sydney, Australia, September 19, 2022. REUTERS/Loren Elliott

What’s the context?

Clubs and pubs in the country's largest state are rolling out AI tech to spot gambling addicts, but critics flag privacy concerns

  • New South Wales uses AI cameras to spot problem gamblers
  • Critics say it brings privacy violations and data risks
  • Supporters say it can help addicts kick the habit

MELBOURNE – As guests arrive at eastern Australia's Warilla Hotel, a small camera equipped with facial recognition software scans their faces as part of a scheme to tackle problem gambling.

The tech – which uses artificial intelligence (AI) to identify addicts who have asked to be barred from gambling venues – is set to be rolled out across betting venues in the state of New South Wales next year.

Supporters say it will help curb problem gambling in a country where the addiction affects about 1% of the population and annual losses run to billions of dollars.

But the technology is "invasive, dangerous and undermines our most basic and fundamental rights", said Samantha Floreani, programme lead at Digital Rights Watch, a non-profit.

"We should be exceptionally wary of introducing it into more areas of our lives, and it should not be seen as a simple quick-fix solution to complex social issues," she said.

The Warilla Hotel did not respond to requests for comment. Its website states it supports "responsible" gambling.

The AI scheme's organisers, industry bodies ClubsNSW and the Australian Hotels Association NSW (AHA NSW), said "strict privacy protections" were in place.

'Best opportunity'

Facial recognition systems use AI to match live images of a person against a database of images – in this case a gallery of people who have voluntarily signed up to a 'self-exclusion' scheme for problem gamblers.

If the camera identifies someone on the state-wide database, a member of staff is alerted so that the person can be denied entry to casinos or escorted away from slot machines in hotels and bars.
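The matching step described above can be illustrated with a minimal sketch. Real systems use deep neural networks to convert a face image into a high-dimensional numeric "embedding" and compare it against a stored gallery; the function names, the three-dimensional vectors and the threshold below are all simplified, hypothetical stand-ins for illustration only.

```python
# Hypothetical sketch of facial-recognition matching: a live face,
# already reduced to a numeric embedding, is compared against a gallery
# of embeddings for people who have signed up to self-exclusion.
# All names, values and the threshold here are illustrative.
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

MATCH_THRESHOLD = 0.9  # illustrative; real systems tune this to limit false alerts

def check_against_gallery(live_embedding, gallery):
    """Return the ID of a matching self-excluded person, or None."""
    for person_id, stored_embedding in gallery.items():
        if cosine_similarity(live_embedding, stored_embedding) >= MATCH_THRESHOLD:
            return person_id  # at this point, venue staff would be alerted
    return None

# Toy example with 3-dimensional embeddings (real ones have hundreds of dimensions)
gallery = {"SE-1042": [0.61, 0.35, 0.71]}
print(check_against_gallery([0.60, 0.36, 0.70], gallery))  # close match -> "SE-1042"
print(check_against_gallery([0.10, 0.90, 0.40], gallery))  # no match -> None
```

The threshold is the crux of such systems in practice: set too low, it produces false alerts (misidentifying patrons who never opted in); set too high, it misses the people the scheme is meant to help.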

"We think this is the best opportunity we've got in preventing people who have self-excluded from entering the venues," said John Green, director of AHA NSW.

The data collected will be secured and encrypted, and will not be accessible to any third parties, including the police and even the gambling venues themselves, said Green.

However, digital rights groups said the tech was ineffective in stopping problem gambling and could go on to be used for wider surveillance, adding such projects underline the need for tougher privacy and data rights laws to protect citizens.

A woman plays a slot machine at a club in Central Sydney November 23, 2011. REUTERS/Daniel Munoz

"People who opt into self-exclusion programmes deserve meaningful support, rather than having punitive surveillance technology imposed upon them," said Floreani of Digital Rights Watch.

"And those who have not opted into these programmes ought to be able to go to the pub without having their faces scanned and their privacy undermined."

Digital rights campaigners want Australia's 1988 Privacy Act to be reformed to better address the use of facial recognition technology, and clarify when and how it can be used.

Mission creep?

Facial recognition technology is increasingly used for everything from unlocking mobile phones to checking in for flights. It has also been adopted by some police forces.

Advocates say it helps keep public order, solve crime, and even find missing people.

Critics say there is little evidence it reduces crime, and that it carries an inherent risk of bias and misidentification, especially for darker-skinned people and women.

Gambling industry bodies have said the facial recognition cameras would only be used to enforce the self-exclusion scheme.

But a draft law introduced in New South Wales' parliament last month that would formally legalise the tech in clubs and pubs includes language allowing other uses, such as identifying people banned for being too drunk.

"There's a capacity for scope creep, the capacity for this to facilitate further uses," said Jake Goldenfein, a senior lecturer at Melbourne Law School who studies technology.

He called for more regulation on facial recognition due to the sensitivity of the data captured and the heightened risks from data breaches.

"Facial templates are ... not something we can change. If we lose control over our biometric information, it becomes particularly dangerous," he said.

'Window dressing'

Advocates for reform have pushed for measures such as reduced opening hours of gambling venues and limits on the value of bets.

The use of facial recognition technology is the industry's way of delaying such reforms, and is unlikely to have a "practical effect" on problem gambling, said Tim Costello, chief advocate at the Alliance for Gambling Reform, a pressure group.

"The clubs are trying to look proactive ... it's complete window dressing to stop real reform," he said.

Green at AHA NSW said a survey of self-excluded gamblers found that more than eight in 10 felt that using facial recognition would be effective.

Call for reforms

There is growing pushback against facial recognition in Europe, the United States and elsewhere, with companies including Microsoft and Amazon ending or curbing sales of the technology to the police.

In Australia, retail giants Bunnings and Kmart halted the use of facial recognition technology to monitor customers in their stores earlier this year after the country's privacy regulator opened an investigation into whether they had broken the law.

Consumer rights group CHOICE, which referred the brands to the regulator, said the tech was "unreasonably intrusive" and "customers' silence cannot be taken as consent" to its use.

The Australian Human Rights Commission last year called for a ban on facial recognition technology until it is better regulated with "stronger, clearer and more targeted" human rights protections.

"There are questions that existing law doesn't have very good answers to," said law lecturer Goldenfein.

"There's so many ways to help problem gamblers that the idea that facial recognition technology is the solution is, frankly, preposterous."

(Reporting by Seb Starcevic; Editing by Rina Chandran and Sonia Elks.)

Context is powered by the Thomson Reuters Foundation Newsroom.
