Roe v. Wade and the need for fair, meaningful protections for data privacy
A protester holds a placard as people gather for a rally in support of abortion rights following the U.S. Supreme Court overturning Roe v. Wade, in Paris, France, July 2, 2022. REUTERS/Benoit Tessier
Everyone deserves to have control over the information they leave on their phones and computers. It’s time our laws and practices changed to give all users meaningful and lasting data privacy.
Corynne McSherry is legal director at the Electronic Frontier Foundation.
Much has changed since the last time abortion was criminalized in many U.S. states. One of the biggest shifts is the emergence of widespread digital surveillance.
Today, the technologies that help people seek and share information, healthcare, funding, and travel also make it easier for companies to collect data about our activities. Meanwhile, our digital devices have become tracking mechanisms, storing information about our movements, searches, and communications.
Taken together, these tracking mechanisms create a digital footprint that can serve as evidence. People seeking, offering, or facilitating abortion access must now assume that any data they provide online or offline could be sought by law enforcement or private parties spurred by strict abortion laws like Texas’ S.B. 8.
Everyone deserves strong controls over the collection and use of the information they leave behind as they go about their normal activities on their phones and computers. The Supreme Court’s decision to overturn Roe v. Wade underscores the crucial importance of fair and meaningful protections for data privacy.
Individuals, service providers, and governments all have vital roles to play.
Individuals can help protect themselves and their communities by keeping reproductive health information compartmentalized from other aspects of their digital lives, making it harder to trace. That means using a separate browser with hardened privacy settings, such as Brave, Firefox, or DuckDuckGo on mobile, and turning off its ability to remember browsing history, site data, and cookies. It also means using a separate email address from a provider built with security in mind, such as Tutanota or Protonmail.
For messaging, use an end-to-end encrypted app like Signal, and turn on disappearing messages. Turn off ad identifiers to curtail individual apps’ ability to track your behavior, and limit the sharing of that information. Review the permissions apps have on your phone, and consider shutting off those you don’t need, especially location services and “Find My” features. If there is a heightened risk of arrest, reboot your phone so that it can be unlocked only with a passcode, not a face ID or fingerprint. Consider turning off cellular data, which will prevent apps from sending your real-time location data to their servers.
That said, individuals shouldn’t have to be security experts to access healthcare. Companies that collect user data can expect subpoenas and warrants seeking that data. They can also expect pressure to aggressively police the use of their services to share information that many states may now classify as facilitating a crime.
In this context, those companies can and should do far more, immediately, to protect their users. That starts with understanding that “if you build it, they will come”: so don’t build it, don’t keep it, dismantle what you can, and secure what you must retain. Allow anonymous access, stop behavioral tracking, don’t retain data you don’t need, enable end-to-end message encryption by default, stop location tracking unless the user asks for it, and ensure that users get notice when their data is being sought.
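To make one of those recommendations concrete, “end-to-end message encryption by default” means the provider only ever handles scrambled data it cannot read. The short sketch below, which uses the open-source PyNaCl library, is purely an illustration of the principle, not a description of any particular company’s system; the names and the message are made up.

```python
# Illustrative sketch of end-to-end encryption: messages are encrypted on the
# sender's device and decrypted only on the recipient's device, so the service
# in the middle relays ciphertext it cannot read. Requires PyNaCl (pip install pynacl).
from nacl.public import PrivateKey, Box

# Each user generates a keypair on their own device. Private keys never leave
# the device; only public keys are shared through the provider.
sender_key = PrivateKey.generate()
recipient_key = PrivateKey.generate()

# The sender encrypts with their private key and the recipient's public key.
sender_box = Box(sender_key, recipient_key.public_key)
ciphertext = sender_box.encrypt(b"appointment confirmed for Tuesday")

# The provider stores or forwards only this ciphertext. Without a private key,
# it cannot recover the message, and neither can anyone who subpoenas its servers.
print("what the provider sees:", ciphertext.hex()[:48], "...")

# The recipient decrypts on their own device.
recipient_box = Box(recipient_key, sender_key.public_key)
print("what the recipient reads:", recipient_box.decrypt(ciphertext).decode())
```

Because the provider never holds a decryption key, it has nothing readable to hand over in response to a subpoena.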
Social media platforms should also re-examine their speech and content moderation policies, and consider carefully how they might be abused. Any “community standards” policy or algorithm that permits depublication, downranking, or outright censorship of information or user accounts based on reproductive health issues or potential links to “criminal” activity should be rethought.
Meanwhile, state and federal policymakers must pass meaningful privacy legislation.
In Congress, EFF supports the My Body, My Data Act, which would restrict businesses and non-governmental organizations from collecting, using, retaining, or disclosing reproductive health information that isn't essential to providing the service someone asks them for. The bill would protect people who use fertility or period-tracking apps, or are seeking information about reproductive health services.
In the longer term, Congress must pass a broader privacy protection law that goes further. We all need data privacy, whether we are seeking reproductive healthcare, trying to learn more about gay rights, or looking for a new job.
The world has changed. It’s time our laws and practices changed with it to give all users meaningful and lasting data privacy.
Any views expressed in this opinion piece are those of the author and not of Context or the Thomson Reuters Foundation.