Australia: Where Your “Faceprint” Is Tracked
By: Chenwei Ren
A new technology allows police in Western Australia to use facial recognition and GPS tracking to monitor citizens who have contracted Covid-19. The G2G app has been used by more than 150,000 people since its rollout in September 2020.
The same technology, though offered by different companies, has started to take off in the states of New South Wales, Victoria, South Australia, and Tasmania. Australia remains the only democracy using facial recognition to enforce Covid-19 isolation procedures, while other countries have rejected the idea of deploying such technology.
The Australian Human Rights Commission has called for a prohibition on the technology until Australia passes a specific law to regulate its use. Human rights campaigners say the personal data collected could be repurposed, warning of the risk of Australia becoming a surveillance state.
There have already been cases of facial recognition technology being deployed secretly. Back in October, 7-Eleven was found to have collected faceprints from 1.6 million Australian customers. And in 2016, Australia’s Department of Home Affairs began building a national facial recognition database.
“Facial recognition is on the cusp of relatively widespread deployment,” says Mark Andrejevic, a professor of media studies at Monash University in Melbourne. “Australia is gearing up to use facial recognition to allow access to government services. And among law enforcement agencies, there’s definitely a desire to have access to these tools.” Most state governments have provided the central database with their citizens’ driver’s licenses, and the database also stores visa and passport photos.
“And yet the technology is continuing to be deployed,” says Edward Santow, an Australian Human Rights Commissioner.
Governments around the world have taken different approaches to regulating the technology. The most common is to rely on a limited set of privacy protections, which Santow says fail to adequately address the issue; that is the case in Australia. “If [the privacy protections were suitable], this project would be a really simple one.”
Proponents argue that faceprints can make public places safer. “We have seen our technology used with great success by law enforcement to stop gun trafficking, and we are hopeful that our technology can be used to help prevent tragic gun crimes in the future,” says Hoan Ton-That, Clearview AI’s Australian founder and CEO.
Yet it remains unclear whether facial recognition can deliver those benefits reliably. The best face identification algorithms have an error rate of just 0.08%, according to tests by the National Institute of Standards and Technology. For individuals captured “in the wild,” however, the error rate can climb as high as 9.3%.
“It’s incredibly useful technology. But if somebody had asked us 20 years ago when the world wide web was starting up if we wanted to live in a world where our interactions and activity were collected and tracked, the majority of us probably would have said that it sounded creepy," says Garrett O'Hara, field chief technologist at security company Mimecast. "We're now replicating the tracking of online space to include physical space as well. And we're not asking the questions about it that we should be."
One of the most problematic characteristics of this technology is the potential racial discrimination and bias.
"In the early days, the datasets being used were all taken from white males or white people in general," says O'Hara. "And clearly, it leads to problems when you've got people of color or of different ethnicities or backgrounds that don't match the training models. At the end of the day, it's just mathematics. This is the problem."
As a result, facial recognition systems are prone to mistakes when trying to identify those belonging to an ethnic minority group, women, people with disabilities and the elderly.
As the technology grows more advanced, so do hackers’ attempts to manipulate it for personal gain. Deep fakes have emerged as an advanced fraud technique, particularly against digital facial recognition systems.
"It used to take several hours to create a deep fake using animation tools – now it takes a couple of minutes," says Francesco Cavalli, the co-founder of Sensity AI in Amsterdam. "All you need is one photo to create a 3D deep fake. This means that the fraudsters can scale their operations and attacks are skyrocketing. You don't even need to be a developer or an engineer. You can do it yourself. There are tons of apps that allow you to replicate anyone's face…At some point the fraudsters work out how to fool our different models, so we need to continuously come up with new techniques," he says.
Regardless of the challenges, Santow is optimistic that Australia could become a world leader in the regulation of facial recognition.