Why my marathon-running mother would have cared about data privacy
[Photo: General view at the start of the women's elite race, London Marathon, London, Britain, October 2, 2022. Action Images via Reuters/John Sibley]
Most apps track you to power algorithms that affect your life. It is time to adopt a global deal to protect our data privacy
Gabriela Ramos is the Assistant Director-General for the Social and Human Sciences of UNESCO.
My mother was a marathon runner. If she ran today, she would probably use a fitness app that leverages data from millions of runners to improve her performance. An app that helps you exercise sounds innocuous. But the transaction involved – handing over your data in return for free app features like average running speed or suggested new routes in your neighbourhood – has consequences for society. In 2018, for example, a fitness tracking app inadvertently revealed the locations of military bases by publishing information about its users' running routes.
The data we hand over to companies can be used to manipulate us. Humans have cognitive biases, even if we sometimes don't recognize them. We group into communities of similar tastes and opinions (like marathon runners). We pay closer attention to media that corresponds with our own beliefs. These patterns can be exploited politically to polarize voters and sow division, or by companies selling adverts to get us addicted to social media.
Algorithms that govern the way we behave and interact online can deepen discrimination and inequality. This goes beyond the grim digital divide between those who have access to the internet and those who do not. Inequality grows, for example, when artificial intelligence makes it possible to automate routine tasks carried out by workers, reducing their work opportunities and salaries.
In the gig economy, where millions of workers lack social security, an algorithm can become your manager, deciding when to offer you work and when to deprive you of it. The next wave of technological transformation, led by artificial intelligence, will generate enormous value, but its fair distribution is not assured, particularly for women and minority groups.
My mother would have wanted to log, and improve, her running ability but not if the cost was undermining national security or perpetuating inequality. So what are we going to do to protect the ones we love from the dark side of digital technology?
The solution is a new ethical deal, applied worldwide. To this day, a third of all nations lack basic data privacy laws. Those that do have them are locked in a race to the bottom, opting for the weakest regulations to attract the investments of technology companies. The effectiveness of existing data privacy laws is also hindered by limited digital literacy among ordinary people. Meanwhile, giant tech corporations that operate globally take advantage of this situation for their own interests.
Establishing a new global order is urgent. It must be based on protecting human rights and human dignity, instead of devolving responsibility to the users, and must be founded on international standards and global governance. UNESCO’s international AI ethics agreement, the first of its kind, calls for a business model where AI technologies prioritise the needs of people working across many sectors.
The blueprint is there. What we need is for governments and business leaders to step up and put these recommendations into practice. This calls for new laws, regulations and institutions, as well as public officials with the knowledge required to make this work. We will all need the courage to confront the interests of those who have managed to trap us in their opaque algorithms.
I am a runner like my mother. Bringing this change about is my marathon.
Any views expressed in this opinion piece are those of the author and not of Context or the Thomson Reuters Foundation.