FemTech: a healthcare revolution for women or a source of exploitation?
A woman poses as she observes a period calendar tracker app on her mobile phone at her home in Madrid, Spain, May 16, 2022. REUTERS/Isabel Infantes
While the FemTech industry is revolutionising healthcare access for historically marginalised populations, it can also be yet another avenue through which data mongers exploit and profit from women
By Lucy Purdon, senior fellow, Mozilla.
On International Women’s Day, let’s take a minute to assess the dire state of women’s healthcare: women receive worse healthcare than men, for example in the diagnosis of heart attacks and cancers.
Through an intersectional lens, the view is even worse: a 2021 study revealed the horrifying statistic that Black women in the UK are four times more likely to die in childbirth than white women. In the face of this shocking neglect, it is understandable that women are increasingly taking matters into their own hands and turning to technology to seek information and manage their health. Products that help them do just that are collectively known as “FemTech” – but will this technology help revolutionise women’s healthcare, or is it just another way to exploit their fears?
The FemTech industry is set to explode this decade, predicted to be worth $103 billion by 2030. FemTech entrepreneurs are predominantly women, so more focus on women’s health and increased female representation among investors, entrepreneurs, and researchers is a win-win, right? In 2022, McKinsey reported on the “Dawn of the FemTech Revolution”. But we need to be cautious. This revolution is a Trojan horse: under the guise of healthcare access, it will open up new avenues of data collection to be monetised, just as Google and Facebook did, and more data does not always equal more health.
Online data collection is gratuitous and out of control. The internet is built on advertising revenues, which means that many apps and websites are collecting information and passing it on to third parties. Data brokers then use this information for the purpose of advertising, decision making, and profiling – often without the user’s knowledge or informed consent.
The current cycle of data collection, profiling, and targeting has a unique impact on women. When it comes to personal information relating to reproductive rights – such as menstruation, pregnancy, birth, miscarriage, abortion and menopause – intimate details are being fed into algorithms for unknown purposes, producing unknown results. Trustworthy AI starts with the data that is collected, and women are being betrayed at every turn.
The practice is ubiquitous, yet the resulting targeted advertising can be deeply wrong and distressing. Tommy’s, the UK charity advocating for safer pregnancy and research into baby loss, has even published advice on how to stop pregnancy ads from following you around the internet after you have suffered a miscarriage.
Fortunately, the online advertising industry is undergoing welcome legislative reforms in the EU through the Digital Services Act. The UK government is currently reviewing the online advertising regulatory framework and regulators are watching the industry and data brokers closely.
A gender perspective needs to be one of the cornerstones of these reforms. Concerns about how data in the wild might affect women’s safety, jobs, education, children and futures reflect a power imbalance that has traditionally controlled, restricted, and ignored female autonomy. FemTech could be truly revolutionary for women’s healthcare by challenging the business model of a “traditional” online tech company and refusing to be a source for the data brokers that routinely exploit women.
Yet FemTech did not get off to a great start on privacy, embedding itself in this advertising ecosystem as a go-to source of data for data brokers. Mozilla’s recent *Privacy Not Included research demonstrated worrying trends of third-party data sharing among pregnancy, menstruation, and fertility apps. The UK charity Privacy International found that women in healthcare settings were being nudged to use pregnancy apps that collected a lot of data but did not function properly, and the Norwegian Consumer Council found that 10 of the most popular menstruation apps were sharing personal information with 135 companies involved in advertising or behavioural profiling.
Companies caught sharing data without a clear legal basis may change their policies under public pressure, or pay a fine imposed by a data protection regulator, but what then? The illegally collected data doesn’t magically disappear – it is still out there, being sold and resold by data brokers and making its way into the profiles used to target advertising.
There is little consideration of this under current data protection law. Losing track of personal data is a frightening prospect for many women. For example, the 2022 Goldacre review into the use of NHS data for health research highlighted how pseudonymisation (removing direct identifiers such as name, date of birth and address) is a weak privacy protection for women’s health records. Knowing a woman’s approximate age, her location, and when she had children is often enough to re-identify her record and then illegally view everything else in it. The review highlights numerous examples of this happening in healthcare settings, punished only with low-level penalties.
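To make that weakness concrete, here is a minimal, purely illustrative sketch in Python. The records, field names and matching rule are invented for this example (they are not drawn from the Goldacre review or any real dataset); the sketch simply shows how a “pseudonymised” record, stripped of name, date of birth and address, can still be linked back to a named individual using only approximate age, location and the year she gave birth.

```python
# Hypothetical linkage attack on a "pseudonymised" health extract.
# Direct identifiers have been removed and replaced with an opaque ID,
# but quasi-identifiers (birth year, postcode district, childbirth year) remain.
pseudonymised_records = [
    {"pseudo_id": "a91f", "birth_year": 1987, "postcode_district": "M14",
     "childbirth_year": 2019, "diagnosis": "..."},
    {"pseudo_id": "c02d", "birth_year": 1990, "postcode_district": "SW9",
     "childbirth_year": 2021, "diagnosis": "..."},
]

# Outside information an attacker might already hold or buy: a marketing list,
# a data-broker profile, or a public post announcing a birth.
known_person = {"name": "Jane Doe", "birth_year": 1990,
                "postcode_district": "SW9", "childbirth_year": 2021}

def reidentify(records, person):
    """Return every record whose quasi-identifiers match the known person."""
    return [
        r for r in records
        if r["birth_year"] == person["birth_year"]
        and r["postcode_district"] == person["postcode_district"]
        and r["childbirth_year"] == person["childbirth_year"]
    ]

matches = reidentify(pseudonymised_records, known_person)
if len(matches) == 1:
    # A unique match links the pseudonym back to a named individual,
    # exposing everything else held against that record.
    print(f"{known_person['name']} is pseudonym {matches[0]['pseudo_id']}")
```

The point is not the code but the arithmetic: the more quasi-identifiers a dataset retains, the fewer people share any given combination of them, and a single unique match undoes the pseudonymisation entirely.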
This fear is magnified in commercial settings. The overturning of Roe v. Wade in the United States has made women fearful of using menstruation apps, or even Google, in case the data tracked and collected is used to prosecute them for seeking an abortion.
There is work to do to regain trust and put women at the centre of technological developments, rather than exploiting them as data points and ignoring their lived reality. Today, and every day, let’s prioritise women’s rights, autonomy, health, and privacy by ensuring that FemTech doesn’t become another source of female exploitation. We demand better.
Any views expressed in this opinion piece are those of the author and not of Context or the Thomson Reuters Foundation.
Tags
- Tech and inequality
- Tech regulation