This episode examines how social media's design, optimized for emotional engagement, inadvertently fuels online extremism and manipulates public discourse. Andrew Marantz discusses the dangers of techno-utopianism in Silicon Valley and why banning individuals isn't a comprehensive solution to these problems. Ecommerce operators should consider the ethical implications of platform design and content moderation, particularly how they shape user behavior and brand perception.
Key takeaways
Social media platforms, by prioritizing emotional engagement, can unintentionally create environments ripe for the spread of extremist ideologies and misinformation, impacting public trust and brand safety.
The "big swinging brains" culture in Silicon Valley, characterized by an uncritical belief in technology as a universal good, often overlooks the societal and ethical repercussions of platform development.
Content moderation strategies, such as banning users, are often insufficient standalone solutions for combating online extremism and require more fundamental changes in platform design and governance.
Understanding the psychological mechanisms that make users susceptible to online manipulation is crucial for developing responsible platform practices and fostering healthier online communities.
Ecommerce brands need to be aware of the broader online discourse and potential for platform manipulation, as it can directly impact brand reputation, customer engagement, and responsible advertising practices.
Andrew Marantz, a staff writer at the New Yorker, talks with Recode's Kara Swisher about his new book, Antisocial: Online Extremists, Techno‑Utopians, and the Hijacking of the American Conversation. He discusses the danger of designing social media platforms around emotional engagement, how people like Mike Cernovich and Milo Yiannopoulos exploited people's belief in a broad political "consensus," and technology's role in advancing hate and extremism online. Marantz also explains what he calls the culture of "big swinging brains" in Silicon Valley, and why banning people from Twitter — including President Trump — isn't a comprehensive solution.
Featuring:
Andrew Marantz (@andrewmarantz), staff writer at the New Yorker and author of Antisocial.
Hosts:
Kara Swisher (@karaswisher), Recode co-founder and editor-at-large
More to explore:
Subscribe for free to Pivot, Kara's podcast with NYU Professor Scott Galloway, which offers sharp, unfiltered insights into the biggest stories in tech, business, and politics.
About Recode by Vox:
Recode by Vox helps you understand how tech is changing the world — and changing us.
Follow Us:
Newsletter: Recode Daily
Twitter: @Recode and @voxdotcom