‘Community is at the heart of everything we do’: A Q&A with Twitch vp of trust and safety Angela Hession

VP Angela Hession of Twitch

Twitch is the most-watched game livestreaming platform on the web, and it shows no signs of slowing its momentum in 2022.

Gamers watched over 24 billion hours of content on the livestreaming service last year, according to a recent report by StreamElements and Rainmaker.gg — a 45% increase between 2020 and 2021. During the same period, users viewed 5.3 billion hours of content on Facebook Gaming, with YouTube Gaming taking third place in the race for the attention of the livestreaming audience.

With such growth, the Amazon-owned streaming platform has experienced some growing pains. Last year, many Twitch creators were forced to endure orchestrated “hate raids,” waves of anonymous users inundating streamers’ broadcasts with mean-spirited messages. These raids primarily targeted smaller or mid-size streamers — particularly those in marginalized communities, such as female streamers and creators of color. “You have to constantly tell yourself that this is happening to everybody — it’s not just you,” said prominent Twitch creator ARUUU. “You’ve got to keep going and continue streaming, because the moment you start to show any weakness, it gets even more aggressive. They’re going to keep doing it until you stop streaming, which has happened with a bunch of Twitch streamers.”

Last month, Twitch’s global vp of trust and safety Angela Hession penned an open letter acknowledging the presence of hate raids on the platform and listing the tools and policy changes that Twitch has created to protect its community from targeted harassment. The letter outlined updates to the platform’s harassment policies and highlighted the creation of the streaming industry’s first off-service conduct policy.

Twitch is constantly updating its safety and privacy tools, per Hession, and the changes listed in the open letter are just the tip of the platform’s safety iceberg. Digiday caught up with Hession to learn more about Twitch’s current approach to user safety.

This interview has been lightly edited and condensed for clarity.

In your open letter, you wrote that Twitch’s updated hateful conduct and harassment policy takes a clearer and tougher stance on harmful behaviors. What exactly does this mean?

This was a policy that took a lot of time and a lot of feedback. Hateful conduct and harassment have always been prohibited on Twitch, and we’re constantly looking at our policies to see how we can make them clearer. What we did was break hateful conduct, harassment and sexual harassment out into separate categories, and we provided specific examples so people understand what we mean by each.

I think giving examples really helps our marginalized creators and people who are impacted by hateful conduct or harassment to understand how to report it. The fact that we broke all of those out and gave specific examples has really helped our report validity rate. For hateful conduct alone, our report validity rate went up by 4x. And for harassment — which I think is more difficult, because it’s very contextual — we saw user report validity go up 5-6x. So not only does it help our community, because they understand what we mean, but it also means people report the right things, and then we can do more enforcement.

What tools has Twitch developed as a result of the hate raids?

When we work on tools, they take months to develop, because when you think about building tools, you do it at a global scale, right? So when we build our technology, we’re always thinking about what this means from a global perspective. 

One of the tools that we released in the last year was phone-verified chat. This is a tool that allows our creators to require that anyone who wants to chat be phone-verified before they can, which empowers them to create the experience they want. Then we had suspicious user detection; we’ve received lots of great community feedback from people wanting this tool. It took months to develop and integrate because it uses machine learning to determine whether or not someone could be a ban evader [a user who creates a new account or uses a VPN service to get around a channel-level ban]. Both allow creators and moderators to decide whether or not they want that person in their channel. So it goes back to that customization, giving creators and moderators tools to help them think through the safety that they want on their channel, in their community.
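Twitch has not published how either tool works under the hood. Purely as an illustration of the kind of channel-level gate Hession describes, here is a minimal Python sketch that composes an opt-in phone-verification requirement with a ban-evasion likelihood score. Every name and threshold in it (ChatterProfile, evaluate_chatter, the 0.5 and 0.9 cutoffs) is hypothetical, not Twitch’s actual implementation.

```python
from dataclasses import dataclass
from enum import Enum


class Verdict(Enum):
    ALLOW = "allow"
    FLAG_FOR_MODS = "flag"   # suspicious: surface to channel moderators
    BLOCK = "block"          # fails the channel's opt-in entry requirements


@dataclass
class ChatterProfile:
    username: str
    phone_verified: bool
    ban_evasion_score: float  # 0.0-1.0, stand-in for an ML model's output


@dataclass
class ChannelSafetySettings:
    require_phone_verification: bool = False
    flag_threshold: float = 0.5   # hypothetical: above this, "possible evader"
    block_threshold: float = 0.9  # hypothetical: above this, "likely evader"


def evaluate_chatter(chatter: ChatterProfile,
                     settings: ChannelSafetySettings) -> Verdict:
    """Apply one channel's opt-in safety settings to an incoming chatter."""
    if settings.require_phone_verification and not chatter.phone_verified:
        return Verdict.BLOCK
    if chatter.ban_evasion_score >= settings.block_threshold:
        return Verdict.BLOCK
    if chatter.ban_evasion_score >= settings.flag_threshold:
        return Verdict.FLAG_FOR_MODS  # moderators make the final call
    return Verdict.ALLOW


if __name__ == "__main__":
    settings = ChannelSafetySettings(require_phone_verification=True)
    chatter = ChatterProfile("new_account_123", phone_verified=True,
                             ban_evasion_score=0.6)
    print(evaluate_chatter(chatter, settings))  # Verdict.FLAG_FOR_MODS
```

The design point the sketch mirrors is the one Hession emphasizes: the system only blocks or flags according to settings each channel chooses, and moderators, not the model, make the final call on borderline accounts.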

How are creators or viewers involved in the testing and development of these tools?

Community is at the heart of everything we do at Twitch; it’s what makes Twitch, Twitch. If you zoom out a bit, we have many signals and many engagement feedback loops here at Twitch. The first is Creator Camp. That’s basically where we’re meeting our community where they’re at, and where we share tools and information. Then we have Twitch UserVoice, where our community can go and give feedback and/or suggestions and we interact directly with them. We also have the Safety Advisory Council — that’s industry experts and creators who have regular sessions with our product teams. And we also have the Twitch Ambassadors; these are role models in our community, and we have feedback sessions with them as well.

What kind of hiring has Twitch done as it steps up its enforcement of anti-harassment rules?

We’ve partnered with a third-party law firm that helps us with investigations, and we also quadrupled our law enforcement response team. [Twitch declined to give specific team figures]. We are continually collaborating with other services and sharing information with them. 

One of the most interesting elements of the open letter was the creation of an off-service conduct policy allowing Twitch to monitor infractions that occur off-platform. Are there any legal considerations behind punishing people for things that happened away from Twitch?

It’s important that we’re very thoughtful, and that’s why we have that third-party law firm that helps us with investigations — to make sure that we’re using proper evidence and thinking it through. For us, it very much is about how we reduce harm for our community, wherever they’re at.

What are the accessibility considerations as Twitch develops these tools? Could creating more potential barriers to entry reduce people’s enthusiasm about participating in these communities?

We enable our creators to basically create the safety experience they want. They don’t have to use phone-verified chat; they can use the tools they want to create what safety means to them. So I hear you about the friction — but at the end of the day, you don’t have to do it. They have that tool if they feel like they need it, but it’s not required.

