This episode features Charlton McIlwain, author of 'Black Software,' discussing the historical and ongoing relationship between technology, racial justice, and the Black community. It explores how technology has been used for activism and community building, as well as its role in policing, surveillance, and perpetuating algorithmic bias. The conversation offers insights into how e-commerce operators can understand the broader societal impacts of technology and weigh the ethical implications of their own digital strategies.
Key takeaways
Early online communities like AfroNet were foundational for Black tech engagement and activism.
Social media platforms have been instrumental in amplifying movements like Black Lives Matter, highlighting the power of online communities for social change.
Technology in policing, including facial recognition and predictive policing, disproportionately affects Black communities due to algorithmic biases.
Addressing racial injustice in tech requires understanding historical context, combating algorithmic bias, and advocating for equitable digital access.
Tech companies have a significant responsibility to moderate content, combat misinformation, and create safe online spaces, which directly impacts brand perception and customer trust.
In this episode of Decoder, Nilay Patel sits down with Charlton McIlwain, a professor of media, culture, and communication at NYU and the author of 'Black Software,' to talk about Black Lives Matter, Twitter, online communities, and policing.