
Know better. Do better.

tech and society

Dataveillance

AI, privacy and surveillance in a watched world


Hi, it’s Sam. This week, we’ll take a look at function creep, which can transform seemingly benign devices into invasive surveillance tools and often occurs in secret – leaving us unsure of exactly what data a device is accessing and for what purpose.

Technology is not static. It changes, mutates, and develops new features and capabilities, often at breakneck speed. A device once intended for a single, straightforward purpose can be repurposed overnight.

Take, for example, CCTV cameras installed to keep an office secure. Over time, these devices may begin to monitor when you enter and leave the workplace, or even analyse your emotions while working.

This transformation is referred to as function creep, or “the expansion of a system or technology beyond its original purposes.” For some, there’s nothing creepy about it. It’s innovation, an attempt to make the most of new developments and, crucially, a new commercial opportunity that must be taken advantage of.

For privacy advocates, however, it’s an essential concept that helps warn of the potential hidden dangers of new technologies. It encourages us to consider how data extracted for one purpose may be used later, and how certain tools may lead to invasive practices, even if they are legitimate today.

When COVID-19 contact tracing apps were launched back in 2020, hundreds of articles discussed the apps’ potential impact on our privacy. A Bloomberg opinion piece by columnist Andy Mukherjee even went so far as to suggest that “the price of COVID freedom may be eternal spying.”

Because the concept of function creep is inherently forward-looking, it often relies on hypothetical arguments, and warnings of what could be done with a specific technology. Too often, this means the risks are ignored.

But function creep is a genuine threat that, without sustained scrutiny and increased transparency, may continue to transform unproblematic, even banal technology into hidden surveillance systems.

Border Patrol cameras look out across the borderlands in Nogales, Arizona, U.S., August 31, 2022. Thomson Reuters Foundation/Rebecca Noble

Smart cities, secret surveillance

In 2017, new ‘smart’ streetlights were installed across San Diego. The new technology promised to reduce the council’s energy bill, improve brightness on the streets, and measure air quality. Just two years later, it was revealed that the police had been mining the devices for video surveillance footage – a capability that few in the city were even aware of.

At first, the local police said the footage was only accessed for ‘serious crimes.’ Before long, however, that had widened to include vandalism and fly-tipping. Later, the footage was even used to monitor Black Lives Matter protesters.

This shift from a worthwhile city project to the creation of a mass surveillance tool is a powerful example of function creep and its potential to advance authorities’ surveillance reach in secret.

Licence plate recognition

Automatic Number Plate Recognition (ANPR) technology has been around for decades, and often receives little attention. However, its role and capabilities have transformed dramatically since its inception.

In the Netherlands, some cameras now capture images of such high quality that drivers' faces can be discerned, and police are using those photos in investigations despite questions over the practice's legality. In Denmark, meanwhile, ANPR cameras play a role in a range of controversial policing initiatives.

The UK’s Biometrics Commissioner has warned that developments in the field, including cameras that can monitor drivers, their habits, and their passengers, could significantly erode public support for the technology if they are adopted.

Regulatory constraints on the use of these tools lag far behind technological developments. Within this vacuum, authorities are free to use new tools in any way they deem appropriate, often without informing citizens.

An illustration photo shows - from left to right - a woman holding a national identity card, an eye on a phone screen, a man in a mask and a life vest, and a surveillance tower on a background of newspaper clippings and barbed wire. Thomson Reuters Foundation/Nura Ali

Facial recognition

Facial recognition is more controversial than ANPR, but its role has also shifted rapidly in recent years. Across Delhi, it was originally touted as a vital tool to help the police identify missing children. But thanks to relatively weak legal restrictions, it’s now being used for wider surveillance purposes, raising considerable concern among privacy advocates.

In Russia, facial recognition was initially used to fight crime. Since then, it’s been used to enforce COVID-19 measures, monitor protesters, and most recently catch people fleeing conscription.

With many manufacturers of facial recognition technology now touting ethnicity detection capabilities, it’s worth asking what will prevent authorities from using these tools for even more nefarious purposes in the future.

Police officers walk past a surveillance camera in central Moscow, Russia, January 26, 2020. REUTERS/Shamil Zhumatov

What can be done?

Earlier, I mentioned the intense debate that emerged over contact tracing apps and digital privacy. In many cases, fears of function creep have not materialised, though there are several exceptions, including in Singapore, India, and China.

Arguably, one of the reasons there hasn’t been greater misuse of this data is because of increased public scrutiny. In many countries, authorities were forced to explain how the tools worked, and implement safeguards to encourage people to use them. Increased scrutiny, in other words, forced transparency and proportionality from authorities.

To avoid the undemocratic imposition of surveillance technology in other areas, it’s clear we need to look beyond the most invasive, controversial technologies. Otherwise, the inconspicuous, banal technology that surrounds us may rapidly transform into systems capable of egregious invasions of our privacy in the future.

We're always happy to hear your suggestions about what to cover in this newsletter - drop us a line: newsletter@context.news

Recommended reading, watching and listening:

Bert-Jaap Koops, The Concept of Function Creep, Law, Innovation and Technology 13(1), March 16, 2021

Function creep is frequently invoked but rarely examined, says Koops. The article explores the meaning of the term and offers a critical examination of the concept, along with useful context on related terms such as mission creep, control creep, and the slippery slope argument.

No Tech for Tyrants, Surveillance Tech Perpetuates Police Abuse of Power, November 7, 2022

This report from UK-wide collective No Tech for Tyrants (NT4T) presents case studies from the UK, US, Mexico, Denmark, and India to show how police use of surveillance tools constitutes an abuse of power. It includes details on function creep and information on the smart streetlights discussed above.

Ausma Bernot et al., China’s ‘Surveillance Creep’, University of Wollongong Australia, August 31, 2021

This article traces the potential for function creep in China’s deployment of surveillance technology in response to COVID-19. It sets out in concrete terms what else the data could be used for, and argues for transparency and accountability to be embedded within these systems.

Any views expressed in this newsletter are those of the author and not of Context or the Thomson Reuters Foundation.

This week's top picks

Special series: Surveillance tech keeps tabs on world's migrants

From digital IDs to AI-based facial recognition cameras, tracking tools threaten migrants' basic freedoms, rights experts say

'They tried to erase us': Rohingya IDs deny citizenship

Myanmar's identity card system has been used as a surveillance tool against the persecuted minority, human rights groups say

Singapore online safety bill must embed human rights

Moves to regulate online safety and harmful content must not sacrifice the right of Singaporeans to freely express themselves online

The real problem with Twitter

The problems at Twitter show a wrong turn for the wider ideal of social media, with greater control concentrated in fewer hands

Australians fear for privacy as hackers steal personal data

Millions of people have had data stolen in a spate of cyberattacks targeting firms such as telco Optus and health insurer Medibank

Read all of our coverage here


Thank you for reading!

If you like this newsletter, please forward it to a friend or share it on social media.

We value your feedback - let us know what you think.