Context: Know better. Do better.

User, beware: Privacy-washing online is real

Opinion

People look at data on their mobiles as background with internet wire cables on switch hub is projected in this picture illustration taken May 30, 2018. REUTERS/Kacper Pempel

Many companies are adapting the “greenwashing” tactics of decades ago, affixing a veneer of privacy to their products while still scooping up, sharing, and selling our intimate data.

By Jen Caltrider, Mozilla Foundation

Online privacy is one of the biggest issues galvanizing consumers. In a world where everything from fitness trackers to therapy apps, and even toilets, is connected, consumers need to worry about what exactly these apps and gadgets do with all the personal information they collect.

“I have nothing to hide,” was once a popular rebuttal, but that’s no longer the case. There are too many examples of our online privacy and offline lives intersecting in terrible ways. Like the gay priest who was outed by Grindr data, or the mother and daughter facing criminal charges over an abortion procedure in the U.S. state of Nebraska, with Facebook messages as key evidence. 

Online privacy is also a topic for lawmakers, with U.S. bills such as the American Data Privacy and Protection Act (ADPPA) seeking to better protect consumers.


Yet U.S. consumers are still awash in a sea of untrustworthy apps and gadgets. Worse, many of the companies behind them are adapting the “greenwashing” tactics of decades ago, affixing a veneer of privacy to their products while still scooping up, sharing, and selling our intimate data. Call it “privacy washing.”

At Mozilla, I build *Privacy Not Included: a buyer’s guide that consumers can use to learn the privacy features and flaws of everything from dating apps to smart doorbells. I spend my days poring over privacy policies, testing products, and emailing companies with privacy questions. And what I see doesn’t match up with what consumers deserve. 

Mental health apps are a prime example. They bill themselves as safe and private: “Your privacy is extremely important to us,” reads the first line of one app’s privacy policy. After all, you might expect software that deals with suicide, domestic violence, and PTSD to have unparalleled privacy features. Not so.

In our recent research, 28 out of 32 mental health and prayer apps received Mozilla’s *Privacy Not Included warning label. Too many of these apps share highly personal data with third parties. They target vulnerable users with invasive ads, and they feature hard-to-read privacy policies. The result? The most vulnerable among us are put at even higher risk.

The same is true for other types of tech: When Mozilla researched dating apps, we found that apps geared toward vulnerable populations, like the LGBTQ+ community, had some of the worst privacy practices. Once again, the most vulnerable become even more so. 

Often, the worst privacy offenders are the biggest names, like Facebook and Amazon. They may use encryption, require strong passwords, and more, but that’s not the same as privacy — they still collect your data and share it with advertisers.

But privacy isn’t a lost cause. There are good guys. And consumers’ voices are being heard. 

In each *Privacy Not Included edition, we feature a “Best Of” category. There are companies out there that genuinely care about users’ privacy and are moving the needle in the right direction. Like Euki, a reproductive health app for women built by a nonprofit. And Wysa, a mental health app and chatbot that doesn’t monetize user data.

Meanwhile, consumers’ voices are growing louder. After Roe v. Wade was overturned, consumers objected to law enforcement’s unfettered access to their data. In response, Apple made sure the fertility tracking data collected by its Apple Watch Series 8 is stored only locally on a user’s device. So Apple collects no reproductive health data to hand over to law enforcement.

For another example, look at the case of the disappearing Facebook buttons: aware of consumer ire over Facebook’s poor privacy practices, other companies — like Nike and Best Buy — no longer allow customers to sign in using Facebook credentials.

All this is welcome — and sorely needed. We’re learning that everything is now connected: toilets, bathtubs, bikes, scooters, washers and dryers, and even dehumidifiers. So user, beware: make sure your next purchase has real privacy, and not just a veneer.


Any views expressed in this opinion piece are those of the author and not of Context or the Thomson Reuters Foundation.


Tags

Consumer protection
Tech and inequality
Social media
Data rights
Corporate responsibility
Cyberspace

