Kill Your Algorithm: Listen to the new podcast featuring tales from a more fearsome FTC

Kill Your Algorithm is a two-part Digiday podcast special exploring the implications of a more aggressive Federal Trade Commission. Often dismissed as weak and toothless in past years, the FTC is sharpening its fangs under the tough new leadership of Chair Lina Khan, who has already guided policy changes that could have a big impact on how the agency addresses privacy and antitrust abuses by data-hungry tech companies. But party-line votes among FTC commissioners signal heightened internal partisanship at an agency historically known for rising above the political fray. And some worry that getting too aggressive or political could backfire.

Episode One: Shocking Data Stories

When the FTC alleged that period tracking app maker Flo Health shared people’s private health information with Facebook and Google without permission, the agency’s settlement required the company to change how it gathers and uses people’s data. But some saw the deal as just another example of a feeble approach to enforcing the agency’s authority. The settlement soon led to a controversial enforcement policy update that could affect countless health and fitness app makers. And that was just one indication that the FTC is getting tougher on tech firms: it has already forced two companies to destroy their algorithms.

Transcript

Kill Your Algorithm credits:
Kate Kaye, reporter, scriptwriter and host
Sara Patterson, producer
Priya Rao, script editor
D. Rives Curtright, original music

PAM DIXON
For some people — for some women — this was a violation not just of privacy, but of spiritual beliefs, and religious beliefs. This was a huge problem for them and brought them great shame.

KATE KAYE
Pam Dixon is the executive director of World Privacy Forum, an organization that provides research and guidance related to all sorts of privacy issues.

When people found out that a period tracking app called Flo may have shared intimate data about their bodies without their permission, a lot of calls came into her group’s privacy hotline.

DIXON
When users of an app learn that their data is going to one of these large tech companies that they were not aware of when they signed up, it makes them very nervous, and I think that’s fair. They’ll call our office line, which is a voice line and takes a lot of messages.

KAYE
So, in case you don’t use one of these period trackers, they’ve become pretty common. Like many of the other period tracking apps, people use Flo to monitor their periods to see if they’re late, to know whether it’s prime time to try to get pregnant, to figure out the best dates for a beach vacation, or, if they’re a little on the older side, to measure how their menstrual cycles change as menopause comes into the picture.

To make the app’s predictions work, people submit all sorts of really personal information about their bodies — when they were sexually intimate, whether they had sex-related problems and even when they experienced premenstrual symptoms like bloating or acne or depression.

It was alleged that between 2017 and 2019, Flo Health, the maker of the Flo Period and Ovulation Tracker app, shared that type of personal health data with companies including Google and Facebook. 

And that data sharing may have affected a lot of people. Tens of millions around the world use the Flo app. 

Maria Jose is one of those many Flo app users. She lives in Medellín, Colombia. When we spoke in September she was 14 years old — about to turn 15. Because of her age, we’re only using Maria Jose’s first name.

She told me that the boys at school bullied her and other girls about their periods.  

MARIA JOSE
It’s not a good topic to talk about. You can get bothered a lot, like bullying. They’ll say, “Oh, you have that? That’s gross.”

When I started, like, my period I talked to my friends, and they recommended me the Flo app. I just started using it. I really don’t read the policy apps — the privacy. I just, like, started it. And, yeah, it has been very amazing, that app.

I like that it tells me when I’m about to start so I don’t get like any, in the school or anything.

KAYE
Yes, so you don’t have spots end up places you don’t want them to. I had that happen when I was about your age. I remember. 

The company was sharing information. So, for instance, when people like you use the app and you say, “Hey, my period started,” that information could have been shared with Facebook and Google and other companies. And there’s a chance that it could have been used for, say, targeting advertising or for Facebook to use for its product development and research — we don’t really know. What do you think about that?

MARIA JOSE
I’m not going to stop using the app because it’s very useful, but it worries me a little bit that, yeah, it can be linked very easily.

KAYE
Maria Jose explained to me that she didn’t like the idea of the Flo app linking data about her period or premenstrual symptoms to data that other companies — such as Facebook or Google — have. 

She was right to be concerned. When people enter data into an app like Flo, it usually doesn’t stay just in one place. It travels and often it’s combined and connected to other data. 

And when a period tracker app like Flo shares information with Facebook or other companies, it can be linked up with other information about someone — and used to paint a more vivid portrait of who they are and what’s going on in their lives. 

Facebook, for example, could have taken a piece of information, like that someone gained some PMS weight, and aimed an ad at them promoting a weight loss product. Or it could have even categorized them as someone at risk for fertility problems associated with weight gain and bloating.

Here’s Pam Dixon again.

DIXON
A lot of times, where the problems come in is when there’s unknown secondary uses of information you entrusted to, you know, a technology company or a retailer or to anyone, and I think that that’s where Flo has gotten in trouble here.

KAYE
And the thing is, information about periods, or fertility, or whether someone is trying to conceive a child — these aren’t just data points. They are personal, sensitive issues. 

People like Maria Jose are bullied. Women and girls in some parts of India are forced to stay in menstrual huts — exiled just for getting their periods. And information about when someone is on their period takes on a whole new level of risk for trans men or non-binary people.

DIXON
There is significant concern, and not just from people in the United States; there are people from other countries who are very concerned about this, and the anxiety, in some cases, is actually stronger in other countries — and there’s more anger.

In some cultures, periods are, they’re not controversial but they are very personal. In the U.S., I think we’re more open about these things, and we view it as, OK, well this is part of health, and you know, we talk about it, but it’s not that way everywhere. And in places where it isn’t that way, to have this kind of breach is a really big problem.

I think being told, well, “it’s just a number”... the problem is, once there is a breach of trust like this, it’s really hard to get it back. And because we don’t have enough transparency into what actually happened, I think there’s an ongoing lack of trust.

KAYE
So, you’re probably wondering — aren’t there laws against what Flo Health did? Can’t the government do something when a company shares sensitive personal health data without permission?

Well, yeah. There are laws against deceptive business practices like these. And there’s a government agency that is set up to protect people from the unfair data sharing that Flo Health allegedly enabled. 

In fact, that agency — the Federal Trade Commission, or the FTC for short — is exactly what we’re here to talk about. My name is Kate Kaye. I’m a reporter covering data and privacy issues for Digiday, and a lot of my reporting deals with the FTC and how it is changing to get a better grip on a largely untamed tech industry.

This is part one of Kill Your Algorithm, a two-part podcast about how the FTC is getting tougher.

About how it’s trying to lasso data-hungry tech. 

About what a more aggressive FTC might mean for tech companies and the people who use their apps and websites.

About how partisanship and politics are influencing the FTC’s future.

And about how its past could get in the way. 

The FTC investigated Flo Health and eventually lodged a complaint against the company that was made public in January 2021. 

The agency found that — even though Flo promised users it wouldn’t share intimate details about them — the company did. The FTC said that Flo disclosed information revealing things like when users of the app had their periods or if they had become pregnant.

A 2019 Wall Street Journal exposé that got the FTC interested in investigating Flo walked readers through the process: how software inside the Flo app records information — say, about when a user is ovulating — and passes it to Facebook, which can then use it to target ads, maybe for fertility services.
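To make that flow concrete, here is a minimal sketch of the pattern in Python. The endpoint, event name and function are hypothetical stand-ins for illustration, not Flo’s or Facebook’s actual code:

```python
import json
import urllib.request

# Hypothetical third-party analytics endpoint: a stand-in for any
# ad-tech or analytics service an app might bundle.
ANALYTICS_ENDPOINT = "https://analytics.example.com/events"

def log_app_event(event_name: str, properties: dict) -> None:
    """Send a custom in-app event to a third-party analytics service."""
    payload = json.dumps({"event": event_name, "properties": properties}).encode()
    request = urllib.request.Request(
        ANALYTICS_ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)  # the event leaves the device here

# The crux of the allegations: the event name itself can encode a
# sensitive health fact, which the receiving company can link to a user.
log_app_event("period_logged", {"cycle_day": 1})
```

The point of the sketch is that nothing exotic is happening: a single in-app tap can fire a named event to another company’s servers, and the name alone can reveal something intimate.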

KAYE
So, in the end the FTC did what it often does in these sorts of situations. It settled with Flo Health. 

Following the investigation, four of the FTC’s five commissioners voted in favor of finalizing a legal settlement with the company. It demanded that Flo Health make some changes to its app and its data practices to ensure it would never share people’s intimate health data without their permission again. 

It required the company to ask people in a clear and prominent way — like right up front when they download the app — if they’re OK with Flo sharing their health data. That meant Flo Health couldn’t continue to bury information about data sharing in a privacy policy that most users never read.

The settlement also said the company had to tell people using its app that their data had been disseminated to companies like Facebook without their knowledge or permission. 

Finally, the FTC ordered Flo Health to tell the other companies it shared its users’ data with, like Facebook and Google, that they’d have to destroy that data. 

Flo declined to be interviewed for this podcast, but the company sent a statement claiming that at no time did Flo Health ever sell user data or share it for advertising purposes. The company said it cooperated fully with the FTC’s inquiry, and stressed that the settlement was not an admission of any wrongdoing. 

But there’s a lot of stuff the FTC didn’t do to penalize Flo Health.

It didn’t slap any fines on the company. And it didn’t get money for the people whose privacy was violated when Flo Health — without permission — shared details about when they got cramps or felt bloated or were ovulating or got depressed.

Some people believed the settlement was more of a light slap on the wrist than any kind of tough penalty. They criticized the FTC for not enforcing a specific health privacy rule, one that would have forced the company to notify its app users in the future if their personal health data was shared or leaked. Even two of the FTC’s own five commissioners wanted the agency to go further by applying that rule: it’s called the Health Breach Notification Rule.

The Health Breach Notification Rule requires companies to notify people affected by a breach of health-related data, and violating it can pack a punch: companies can be fined more than $43,000 per violation, per day. But in the decade since it gained the authority to apply the rule, the FTC has never once done so. It wasn’t even applied against Flo.
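To get a feel for how those numbers compound, here is a back-of-the-envelope sketch. The user and day counts are hypothetical, and how violations are actually tallied in practice is a legal question:

```python
# Back-of-the-envelope: penalties under the rule scale per violation, per day.
per_violation_per_day = 43_000   # "more than $43,000," per the rule
affected_users = 100             # hypothetical: each affected user counted as a violation
days_out_of_compliance = 30      # hypothetical

exposure = per_violation_per_day * affected_users * days_out_of_compliance
print(f"${exposure:,}")          # $129,000,000
```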

FTC commissioner Rohit Chopra voted ‘yes’ on the settlement against Flo Health, with some caveats. He argued that the FTC should have charged the company with a violation of that rule. Enforcing it against Flo could have been a sign to other health app makers that the FTC is getting tougher on health data and app data privacy.

Chopra spoke about it during a September FTC meeting.

ROHIT CHOPRA 
Flo was improperly sharing extremely sensitive data with Facebook, Google and others. But rather than sending a clear message that the text of the Health Breach Notification Rule covers this activity, we demonstrated again that we would be unwilling to enforce this law as written.

KAYE
So, it turns out that during that meeting — just a few months after the Flo settlement — the FTC decided it would put more emphasis on that rule in the future when it comes to data sharing by health apps. 

Not everyone agreed. Two FTC commissioners voted against the idea of enforcing the rule against health app makers. They said that data sharing without permission isn’t the same thing as a breach of data security.

Even though the health breach notification rule seems kinda wonky and in-the-weeds, here’s why it’s important:

The FTC has a set of tools it can use to protect people when their privacy is violated, and this rule is one of those tools. So, it’s just the sort of thing people like commissioner Chopra and his fellow FTC commissioner, Rebecca Slaughter, want to see the FTC actually use in order to take full advantage of the rules and powers they have right now.

I spoke in July with commissioner Slaughter.

REBECCA SLAUGHTER
We don’t always need new rules. We have a lot of rules that we don’t always enforce, or don’t enforce as broadly or frequently as we could. So making sure we are really examining our entire toolbox, and applying everything that is applicable even before we get to adding new tools, is something that I have thought was important for several years and is particularly important as we tackle novel types of problems.

KAYE
She means new types of problems, and in some ways problems brought on by data-gobbling tech. The Flo case is just one example of why the FTC has garnered a reputation for being too weak.

Let’s talk Facebook.

The FTC has gone after Facebook more than once, but many believe it just hasn’t cracked down hard enough on the company. Back in 2012 the agency settled with Facebook, resolving charges that the company lied to people by repeatedly allowing their data to be shared and made public even though it told them their information would be kept private.

The FTC ordered Facebook not to do it again and said it would monitor the company closely to make sure it didn’t misrepresent the privacy controls or safeguards it has in place.  

But then Cambridge Analytica happened. 

Sound montage from news reports:

It’s an online information war where often unseen hands harvest your personal data, tapping into your hopes and fears for the greatest political yield.

In 2014, you may have taken a quiz online, and if you did, you probably shared your personal data and your friends’ personal data with a company that worked for President Trump’s 2016 campaign.

I found out that the information that was passed on to Cambridge Analytica was my public profile, my birthday, my current city and my page likes. 

Kogan combined the quiz results with your Facebook data to develop a psychometric model, a sort of personality profile. 

Zuck is finally speaking out about Facebook’s Cambridge Analytica scandal.

So, this was a major breach of trust and I’m really sorry that this happened.

KAYE
There was no shortage of media reports and investigations into Cambridge Analytica and how the company’s psychological ad targeting influenced voters in the 2016 election.

The FTC had authority to do something about it. They said, “Wait a minute, Facebook — by letting that data gathering happen on your platform, you violated our 2012 agreement.”

So, in 2019 the FTC charged Facebook with deceiving its users about how private their personal information really is, and it fined Facebook what the FTC called a “record-breaking” penalty: $5 billion. 

But not everyone was happy about it. Some said the settlement was another lame move by the FTC. Along with lots of FTC observers, both commissioners Chopra and Slaughter pushed back hard on what they saw as a feeble settlement with Facebook — one that did little to deter the company from engaging in the same old data tactics in the future.

Here’s commissioner Chopra speaking to CNBC.

CHOPRA
This settlement is full of giveaways and gifts for Facebook.

There’s a lot for their investors to celebrate. At the end of the day, this settlement does nothing to fix the fundamental incentives of their broken behavioral advertising model. It leads to surveillance, manipulation and all sorts of problems for our democracy and our economy.

KAYE
Commissioner Chopra echoed what lots of critics said: that a $5 billion penalty for one of the world’s biggest digital ad sellers — a company that took in more than $70 billion in revenue that year — was meaningless.

Slaughter, in her dissent, said she was skeptical that the terms of the settlement — without placing more limits on how Facebook collects, uses and shares people’s data — would have any meaningful disciplining effect on how the company treats data and privacy going forward. 

Slaughter told me that in future cases against companies, she expects the FTC to push for tougher remedies — in other words, restrictions and penalties that actually remedy the problems and violations companies are charged with.

SLAUGHTER
I anticipate pushing for remedies that really get at the heart of the problem and the incentives that companies face that lead them into the illegal conduct. Another thing we talk about a lot as a novel remedy is the deletion of not only data but algorithms that are built out of illegally collected data. 

So, another important case we had this year was called Everalbum, which involved a company misrepresenting how it was using facial photo data — facial recognition data about people. In our order, we required them not only to delete the data that they collected but also to delete the algorithm that they built from that data. That’s really important because in models that use data to build analytical tools like algorithms, the underlying data doesn’t actually become important at the end of the day — it’s the tool that they built from it.

KAYE
Yep. The FTC has begun to force companies to kill their algorithms. And it could be just the beginning. The agency might not only demand that companies delete data they gathered through deceptive practices, but also force them to destroy the algorithms they built with that data.

That means they’d have to get rid of the complex code and data flowing through their automated systems. This really scares tech companies because in many cases, the reason they’re collecting all this data in the first place is to build and feed algorithms that make automated decisions and learn as they ingest more and more information. 
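A toy example shows why deleting data alone doesn’t undo much. In this sketch (hypothetical numbers, with scikit-learn used purely for illustration), a model is trained on a handful of records; delete every record afterward and the model keeps right on predicting:

```python
# A toy illustration of why regulators now target the model, not just the data:
# once trained, the model retains what it learned even if the data is deleted.
from sklearn.linear_model import LogisticRegression

# Pretend these rows were collected without proper consent:
# [cycle_length_days, symptom_score] -> reported PMS (1) or not (0)
X = [[26, 7], [28, 2], [31, 8], [24, 1], [29, 9], [27, 3]]
y = [1, 0, 1, 0, 1, 0]

model = LogisticRegression().fit(X, y)  # the "algorithm" built from the data

del X, y  # the raw records are gone...

print(model.predict([[30, 8]]))  # ...but the model still makes predictions
```

The value extracted from the data lives on in the model, which is exactly what an order to kill the algorithm is aimed at.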

We experience algorithms in our lives every day. When Amazon recommends products, that’s an algorithm making those recommendations. When Spotify or Netflix serves up another song or film it thinks you’ll like, an algorithm did it. Even when we drive these days: that automatic driver-assist feature that helps your car stay in its lane on the highway? You guessed it: an algorithm.

And the reason people give apps like Flo personal health information, like when their period starts and whether they had cramps, is so the app and the algorithm it uses can make more accurate predictions and improve over time.

Here’s Rebecca Slaughter.

SLAUGHTER
Nobody talks about this, but that was something we required of Cambridge Analytica, too. In our order against Cambridge Analytica, we required them to delete not only the data but the algorithms that they built from the data, which was what made their tool valuable and useful.

That was an important part of the outcome for me in that case. I think it will continue to be important as we look at why companies are collecting data that they shouldn’t be collecting and how we can address those incentives — not just the surface-level practice that’s problematic.

KAYE
Cambridge Analytica effectively shut down after that. 

While the FTC won’t reveal specifics about how it monitors companies for compliance with its settlements, the approach was a sign of what a more-aggressive FTC could have in store — especially for companies whose businesses rely on data and algorithms. 

Alysa Hutnik heads up the privacy and information security practice at Kelley Drye & Warren, a law firm that represents tech companies. She and her clients are always on the lookout for changes at the FTC that could affect their businesses.

ALYSA HUTNIK
You don’t want to end up with a decision by the FTC that you violated the law, because that often starts with a settlement discussion, and the settlement is all about changing your business practices. If the FTC thinks that you’ve done something wrong, then one of the remedies that they are very much looking at now is, “Can we delete some of your models and your algorithmic decision-making?” Well, what does that do? I mean, if your model has to get erased, are you starting from scratch on some pretty substantive things? And that clearly affects the value of the business and really what you can do going forward.

KAYE
In the Flo case, the company did not have to destroy its algorithm. Even though Flo Health got caught sharing data with other companies without permission, it did, as far as the FTC is concerned, have users’ OK to use the data collected from them to help track their periods.

And Flo plans to continue improving its algorithm. When the company raised $50 million in venture capital funding in September, it said it would use the money to make its app even more customized and to provide users with advanced insights into their menstrual cycles and symptom patterns to help them manage and improve their health.

Flo Health is still actively marketing its app, trying to attract more users. It started running ads on Facebook in September promoting an update to its app. The company is even sending swag to influencers.

JAY PALUMBO
Hello, all. Can we discuss this box that I just received from Flo? Look at this, phenomenally on my period [laughs].

KAYE
In July, Flo sent a goodie box to Jay Palumbo, a stand-up comic and women’s health advocate who writes for Forbes and other publications. She told me she never did any work for Flo or promoted the company, but she tweeted out a video showing off the gifts she got from them.

So, even though Flo Health was charged with unfair and deceptive data sharing, the company doesn’t seem to have missed a beat. They even have a podcast.

FLO PODCAST SOUND
This is Your Body, Your Story, a podcast by Flo.

KAYE
But it’s not just privacy issues that people criticize the FTC for being too weak on. They also say the agency is ineffectual when it comes to its other main area of oversight: antitrust and competition, or ensuring market fairness.

Put it this way: it’s not difficult to find articles or, like, interviews with pundits calling the FTC a do-nothing agency, one that has failed to protect people from pharma price gouging and has let tech companies off with insufficient penalties.

NEWS SOUNDBITE
The FTC previously had been a fairly toothless agency in going up against some of these big tech companies.

KAYE
But that seems to be changing. 

And there’s one person in particular who is pushing for that change: Lina Khan.

Sound montage from news reports:

This was a sort of ‘oh wow’ moment for me when I heard the name Lina Khan this morning. Tell me more about why Lina Khan is such a big deal and why tech companies might be a little nervous about this news.

This was a controversial move led by the new FTC chair Lina Khan during her first public meeting and it could signal more aggressive action, especially against big tech in the future.

[Ohio Rep. Jim Jordan] The Federal Trade Commission, run by Biden Democrats who want to fix systemic racism, people who want your business to fail, Soros-backed individuals.

Facebook is seeking the recusal of FTC chair Lina Khan.

KAYE
In part two of Kill Your Algorithm, we’ll learn more about why this progressive former law school professor ruffled big tech’s feathers even before she was named chair of the FTC. We’ll talk about some of the FTC’s latest moves and how they could rein in the excessive data collection that propels tech power. And we’ll analyze why the FTC’s move into more partisan political territory could backfire.

That’s it for this first episode of our two-part series. Special thanks to our producer Sara Patterson, and to Portland, Oregon multi-instrumentalist and songwriter D. Rives Curtright for supplying our killer music. You can find him on most streaming platforms.
