Recode Decode: Tristan Harris

Decoder with Nilay Patel · with Tristan Harris · May 6, 2019 · 64 min

Summary

This episode uncovers how tech platforms, seeking user engagement, are inadvertently "downgrading" human capacity and societal well-being. It critiques current "digital well-being" features as insufficient and advocates for systemic changes, including potential reforms to Section 230, to hold tech giants accountable for their profound societal impact. Ecommerce operators should consider the ethical implications of their platform choices and how engagement metrics might be subtly influencing user experience beyond conversion.

Key takeaways

Themes

ai & automation, founder & leadership

Topics covered

human downgrading, design ethics, digital well-being initiatives, techlash, section 230 reform, platform accountability

Episode description

Tristan Harris, co-founder of the Center for Humane Technology, talks with Recode's Kara Swisher about the latest problem he and his peers are trying to solve: "human downgrading." In this episode: Harris' background as a design ethicist at Google; how his previous movement, Time Well Spent, was co-opted by the tech industry; how Apple and Google's "digital well-being" features are doing; the utopian promise of tech; why Harris shifted his focus to "downgrading"; what the Center for Humane Technology does; whether Harris has gotten through to tech's leaders; why Facebook and YouTube are worse than Shell and Exxon; digital platforms as cities; what we need to do now; the techlash; the (sort of) good news; whether CEOs are just hoping this goes away on its own; and why removing Section 230 of the Communications Decency Act is "critical."

Frequently asked about this episode

What does this episode say about ai & automation?
"Human downgrading" describes the subtle erosion of human capabilities and societal well-being due to engagement-driven tech design, moving beyond the concept of "time well spent".
What does this episode say about founder & leadership?
Current "digital well-being" features from major tech companies are largely ineffective in addressing systemic issues, offering superficial solutions to deep-seated problems in platform design.
What does this episode say about ai & automation?
The societal impact of major tech platforms like Facebook and YouTube can be compared to that of large industrial polluters, necessitating similar levels of scrutiny and accountability.
What does this episode say about ai & automation?
Viewing digital platforms as "cities" helps frame the need for structured governance, ethical infrastructure, and a focus on public good within these virtual environments.
What does this episode say about ai & automation?
Reforming or removing Section 230 of the Communications Decency Act is presented as a critical step to increase platform accountability and drive more ethical design practices.