Recode Decode: Twitter's Vijaya Gadde and Kayvon Beykpour (Live at Code 2019)

Decoder with Nilay Patel · with Vijaya Gadde and Kayvon Beykpour · June 14, 2019 · 49 min

Summary

This episode, recorded live at Code 2019, features Twitter executives Vijaya Gadde and Kayvon Beykpour discussing the complexities of content moderation on the platform. They delve into the challenges of balancing free speech with user safety, elaborating on Twitter's policies on hate speech and misinformation and on the de-platforming of problematic accounts. While the conversation focuses on social media, its insights on policy enforcement, safety-driven product development, and managing public perception apply to any platform dealing with user-generated content or community features.

Topics covered

content moderation policies · platform governance and safety · trust and safety teams · misinformation and disinformation strategies · de-platforming impact · balancing free speech and platform responsibility

Episode description

Vijaya Gadde, who leads the legal and trust and safety teams at Twitter, and Periscope co-founder Kayvon Beykpour, now the company's head of product, talk with Recode's Kara Swisher and Peter Kafka at the 2019 Code Conference in Scottsdale, Ariz. In this episode: Twitter's meeting with President Trump; CEO Jack Dorsey's level of contact with the policy team; cleaning the Twitter "cesspool"; could Twitter operate without letting everyone speak?; its new policies around elections and anti-vaxxers; how its responses to abuse compare to Facebook's and Google's; does de-platforming people like Alex Jones reduce their influence?; does Twitter radicalize people?; how Twitter is trying to get rid of white supremacists; and false equivalency in content moderation.

Frequently asked about this episode

What's takeaway #1 from this episode?
Twitter's approach to content moderation involves a delicate balance between enabling free speech and ensuring user safety, with policies constantly evolving to address new threats like misinformation and hate speech.
What's takeaway #2 from this episode?
The Trust and Safety team plays a critical role in developing and enforcing community guidelines, which directly impacts product development and platform features.
What's takeaway #3 from this episode?
De-platforming controversial figures like Alex Jones requires careful consideration of its effectiveness in reducing influence versus simply displacing the problem.
What's takeaway #4 from this episode?
The involvement of top leadership, like Jack Dorsey, in policy decisions highlights the strategic importance of content moderation for platform integrity and public perception.
What's takeaway #5 from this episode?
Twitter's strategies for combating misinformation and hate speech involve continuously evolving policies, robust enforcement mechanisms, and product interventions to reduce the visibility of harmful content.