121 episodes • Length: 45 min • Monthly
In our podcast, Your Undivided Attention, co-hosts Tristan Harris, Aza Raskin and Daniel Barcay explore the unprecedented power of emerging technologies: how they fit into our lives, and how they fit into a humane future.
Join us every other Thursday as we confront challenges and explore solutions with a wide range of thought leaders and change-makers — like Audrey Tang on digital democracy, neurotechnology with Nita Farahany, getting beyond dystopia with Yuval Noah Harari, and Esther Perel on Artificial Intimacy: the other AI.
Your Undivided Attention is produced by Executive Editor Sasha Fegan and Senior Producer Julia Scott. Our Researcher/Producer is Joshua Lash. We are a top tech podcast worldwide with more than 20 million downloads and a member of the TED Audio Collective.
The podcast Your Undivided Attention is created by Tristan Harris and Aza Raskin of the Center for Humane Technology. The podcast and the artwork on this page are embedded using the public podcast feed (RSS).
Historian Yuval Noah Harari says that we are at a critical turning point: one in which AI’s ability to generate cultural artifacts threatens humanity’s role as the shapers of history. History will still go on, but will it be the story of people or, as he calls them, ‘alien AI agents’?
In this conversation with Aza Raskin, Harari discusses the historical struggles that emerge from new technology, humanity’s AI mistakes so far, and the immediate steps lawmakers can take right now to steer us towards a non-dystopian future.
This episode was recorded live at the Commonwealth Club World Affairs of California.
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
RECOMMENDED MEDIA
NEXUS: A Brief History of Information Networks from the Stone Age to AI by Yuval Noah Harari
Further reading on the Stanford Marshmallow Experiment
Further reading on AlphaGo’s “move 37”
RECOMMENDED YUA EPISODES
This Moment in AI: How We Got Here and Where We’re Going
The Tech We Need for 21st Century Democracy with Divya Siddarth
Synthetic Humanity: AI & What’s At Stake
Two Million Years in Two Hours: A Conversation with Yuval Noah Harari
It’s a confusing moment in AI. Depending on who you ask, we’re either on the fast track to AI that’s smarter than most humans, or the technology is about to hit a wall. Gary Marcus is in the latter camp. He’s a cognitive psychologist and computer scientist who built his own successful AI start-up. But he’s also been called AI’s loudest critic.
On Your Undivided Attention this week, Gary sits down with CHT Executive Director Daniel Barcay to defend his skepticism of generative AI and to discuss what we need to do as a society to get the rollout of this technology right… which is the focus of his new book, Taming Silicon Valley: How We Can Ensure That AI Works for Us.
The bottom line: No matter how quickly AI progresses, Gary argues that our society is woefully unprepared for the risks that will come from the AI we already have.
RECOMMENDED MEDIA
Link to Gary’s book: Taming Silicon Valley: How We Can Ensure That AI Works for Us
Further reading on the deepfake of the CEO of India's National Stock Exchange
Further reading on the deepfake of an explosion near the Pentagon.
The study Gary cited on AI and false memories.
Footage from Gary and Sam Altman’s Senate testimony.
RECOMMENDED YUA EPISODES
Former OpenAI Engineer William Saunders on Silence, Safety, and the Right to Warn
Taylor Swift is Not Alone: The Deepfake Nightmare Sweeping the Internet
No One is Immune to AI Harms with Dr. Joy Buolamwini
Correction: Gary mistakenly listed the reliability of GPS systems as 98%. The federal government’s standard for GPS reliability is 95%.
AI is moving fast. And as companies race to roll out newer, more capable models with little regard for safety, the downstream risks of those models become harder and harder to counter. On this week’s episode of Your Undivided Attention, CHT’s policy director Casey Mock comes on the show to discuss a new legal framework to incentivize better AI, one that holds AI companies liable for the harms of their products.
RECOMMENDED MEDIA
The CHT Framework for Incentivizing Responsible AI Development
Further Reading on Air Canada’s Chatbot Fiasco
Further Reading on the Elon Musk Deep Fake Scams
The Full Text of SB1047, California’s AI Regulation Bill
RECOMMENDED YUA EPISODES
Former OpenAI Engineer William Saunders on Silence, Safety, and the Right to Warn
Can We Govern AI? with Marietje Schaake
A First Step Toward AI Regulation with Tom Wheeler
Correction: Casey incorrectly stated the year that the US banned child labor as 1937. It was banned in 1938.
[This episode originally aired on August 17, 2023] For all the talk about AI, we rarely hear about how it will change our relationships. As we swipe to find love and consult chatbot therapists, acclaimed psychotherapist and relationship expert Esther Perel warns that another harmful “AI” is on the rise — Artificial Intimacy — and that it is depriving us of real connection. Tristan and Esther discuss how depending on algorithms can fuel alienation, and then imagine how we might design technology to strengthen our social bonds.
RECOMMENDED MEDIA
Mating in Captivity by Esther Perel
Esther's debut work on the intricacies behind modern relationships, and the dichotomy of domesticity and sexual desire
The State of Affairs by Esther Perel
Esther takes a look at modern relationships through the lens of infidelity
Where Should We Begin? with Esther Perel
Listen in as real couples in search of help bare the raw and profound details of their stories
Esther’s podcast that focuses on the hard conversations we're afraid to have at work
A young man strikes up an unconventional relationship with a doll he finds on the internet
In a near future, a lonely writer develops an unlikely relationship with an operating system designed to meet his every need
RECOMMENDED YUA EPISODES
Big Food, Big Tech and Big AI with Michael Moss
The Three Rules of Humane Tech
Digital Democracy is Within Reach with Audrey Tang
CORRECTION: Esther refers to the 2007 film Lars and the Real Doll. The title of the film is Lars and the Real Girl.
Today, the tech industry is the second-biggest lobbying power in Washington, DC, but that wasn’t true as recently as ten years ago. How did we get to this moment? And where could we be going next? On this episode of Your Undivided Attention, Tristan and Daniel sit down with historian Margaret O’Mara and journalist Brody Mullins to discuss how Silicon Valley has changed the nature of American lobbying.
RECOMMENDED MEDIA
The Wolves of K Street: The Secret History of How Big Money Took Over Big Government - Brody’s book on the history of lobbying.
The Code: Silicon Valley and the Remaking of America - Margaret’s book on the historical relationship between Silicon Valley and Capitol Hill
More information on the Google antitrust ruling
More information on the SOPA/PIPA internet blackout
Detailed breakdown of Internet lobbying from Open Secrets
RECOMMENDED YUA EPISODES
U.S. Senators Grilled Social Media CEOs. Will Anything Change?
Can We Govern AI? with Marietje Schaake
The Race to Cooperation with David Sloan Wilson
CORRECTION: Brody Mullins refers to AT&T as having a “hundred million dollar” lobbying budget in 2006 and 2007. While we couldn’t verify the size of their budget for lobbying, their actual lobbying spend was much less than this: $27.4m in 2006 and $16.5m in 2007, according to OpenSecrets.
The views expressed by guests appearing on Center for Humane Technology’s podcast, Your Undivided Attention, are their own, and do not necessarily reflect the views of CHT. CHT does not support or oppose any candidate or party for election to public office.
It’s been a year and a half since Tristan and Aza laid out their vision and concerns for the future of artificial intelligence in The AI Dilemma. In this Spotlight episode, the guys discuss what’s happened since then–as funding, research, and public interest in AI has exploded–and where we could be headed next. Plus, some major updates on social media reform, including the passage of the Kids Online Safety and Privacy Act in the Senate.
RECOMMENDED MEDIA
The AI Dilemma: Tristan and Aza’s talk on the catastrophic risks posed by AI.
Info Sheet on KOSPA: More information on KOSPA from FairPlay.
Situational Awareness by Leopold Aschenbrenner: A widely cited blog from a former OpenAI employee, predicting the rapid arrival of AGI.
AI for Good: More information on the AI for Good summit that was held earlier this year in Geneva.
Using AlphaFold in the Fight Against Plastic Pollution: More information on Google’s use of AlphaFold to create an enzyme to break down plastics.
Swiss Call For Trust and Transparency in AI: More information on the initiatives mentioned by Katharina Frey.
RECOMMENDED YUA EPISODES
War is a Laboratory for AI with Paul Scharre
Jonathan Haidt On How to Solve the Teen Mental Health Crisis
Can We Govern AI? with Marietje Schaake
The Three Rules of Humane Tech
Clarification: The Swiss diplomat referred to in this episode as Nina Frey is Katharina Frey.
AI has been a powerful accelerant for biological research, rapidly opening up new frontiers in medicine and public health. But that progress can also make it easier for bad actors to manufacture new biological threats. In this episode, Tristan and Daniel sit down with biologist Kevin Esvelt to discuss why AI has been such a boon for biologists and how we can safeguard society against the threats that AIxBio poses.
RECOMMENDED MEDIA
Sculpting Evolution: Information on Esvelt’s lab at MIT.
SecureDNA: Esvelt’s free platform to provide safeguards for DNA synthesis.
The Framework for Nucleic Acid Synthesis Screening: The Biden admin’s suggested guidelines for DNA synthesis regulation.
Senate Hearing on Regulating AI Technology: C-SPAN footage of Dario Amodei’s testimony to Congress.
The AlphaFold Protein Structure Database
RECOMMENDED YUA EPISODES
U.S. Senators Grilled Social Media CEOs. Will Anything Change?
Big Food, Big Tech and Big AI with Michael Moss
Clarification: President Biden’s executive order only applies to labs that receive funding from the federal government, not state governments.
Will AI ever start to think by itself? If it did, how would we know, and what would it mean?
In this episode, Dr. Anil Seth and Aza discuss the science, ethics, and incentives of artificial consciousness. Seth is Professor of Cognitive and Computational Neuroscience at the University of Sussex and the author of Being You: A New Science of Consciousness.
RECOMMENDED MEDIA
A free, plain-text version of Shelley’s classic of gothic literature.
A video from OpenAI demonstrating GPT-4o’s remarkable ability to mimic human sentience.
You Can Have the Blue Pill or the Red Pill, and We’re Out of Blue Pills
The NYT op-ed from last year by Tristan, Aza, and Yuval Noah Harari outlining the AI dilemma.
Thomas Nagel’s essay on the nature of consciousness.
Are You Living in a Computer Simulation?
Philosopher Nick Bostrom’s essay on the simulation hypothesis.
Anthropic’s Golden Gate Claude
A blog post about Anthropic’s recent discovery of millions of distinct concepts within their LLM, a major development in the field of AI interpretability.
RECOMMENDED YUA EPISODES
Esther Perel on Artificial Intimacy
Climate change, political instability, hunger. These are just some of the forces behind an unprecedented refugee crisis that’s expected to include over a billion people by 2050. In response to this growing crisis, wealthy governments like the US and the EU are employing novel AI and surveillance technologies to slow the influx of migrants at their borders. But will this rollout stop at the border?
In this episode, Tristan and Aza sit down with Petra Molnar to discuss how borders have become a proving ground for the sharpest edges of technology, and especially AI. Petra is an immigration lawyer and co-creator of the Migration and Technology Monitor. Her new book is “The Walls Have Eyes: Surviving Migration in the Age of Artificial Intelligence.”
RECOMMENDED MEDIA
The Walls Have Eyes: Surviving Migration in the Age of Artificial Intelligence
Petra’s newly published book on the rollout of high risk tech at the border.
A report co-authored by Petra about Canada’s use of AI technology in their immigration process.
A report authored by Petra about the use of experimental technology in EU border enforcement.
Startup Pitched Tasing Migrants from Drones, Video Reveals
An article from The Intercept, containing the demo for Brinc’s taser drone pilot program.
The UNHCR
Information about the global refugee crisis from the UN.
RECOMMENDED YUA EPISODES
War is a Laboratory for AI with Paul Scharre
No One is Immune to AI Harms with Dr. Joy Buolamwini
Can We Govern AI? With Marietje Schaake
Clarification: The iBorderCtrl project referenced in this episode was a pilot project that was discontinued in 2019.
This week, a group of current and former employees from OpenAI and Google DeepMind penned an open letter accusing the industry’s leading companies of prioritizing profits over safety. This comes after a spate of high profile departures from OpenAI, including co-founder Ilya Sutskever and senior researcher Jan Leike, as well as reports that OpenAI has gone to great lengths to silence would-be whistleblowers.
The writers of the open letter argue that researchers have a “right to warn” the public about AI risks and laid out a series of principles that would protect that right. In this episode, we sit down with one of those writers: William Saunders, who left his job as a research engineer at OpenAI in February. William is now breaking the silence on what he saw at OpenAI that compelled him to leave the company and to put his name to this letter.
RECOMMENDED MEDIA
My Perspective On "A Right to Warn about Advanced Artificial Intelligence": A follow-up from William about the letter
Leaked OpenAI documents reveal aggressive tactics toward former employees: An investigation by Vox into OpenAI’s policy of non-disparagement.
RECOMMENDED YUA EPISODES
Right now, militaries around the globe are investing heavily in the use of AI weapons and drones. From Ukraine to Gaza, weapons systems with increasing levels of autonomy are being used to kill people and destroy infrastructure, and the development of fully autonomous weapons shows little sign of slowing down. What does this mean for the future of warfare? What safeguards can we put up around these systems? And is this runaway trend toward autonomous warfare inevitable, or will nations come together and choose a different path? In this episode, Tristan and Daniel sit down with Paul Scharre to try to answer some of these questions. Paul is a former Army Ranger, the author of two books on autonomous weapons, and he helped the Department of Defense write a lot of its policy on the use of AI in weaponry.
RECOMMENDED MEDIA
Four Battlegrounds: Power in the Age of Artificial Intelligence: Paul’s book on the future of AI in war, which came out in 2023.
Army of None: Autonomous Weapons and the Future of War: Paul’s 2018 book documenting and predicting the rise of autonomous and semi-autonomous weapons as part of modern warfare.
The Perilous Coming Age of AI Warfare: How to Limit the Threat of Autonomous Warfare: Paul’s article in Foreign Affairs based on his recent trip to the battlefield in Ukraine.
The night the world almost ended: A BBC documentary about Stanislav Petrov’s decision not to start nuclear war.
AlphaDogfight Trials Final Event: The full simulated dogfight between an AI and human pilot. The AI pilot swept, 5-0.
‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza: An investigation into the use of AI targeting systems by the IDF.
RECOMMENDED YUA EPISODES
Tech companies say that AI will lead to massive economic productivity gains. But as we know from the first digital revolution, that’s not what happened. Can we do better this time around?
RECOMMENDED MEDIA
Power and Progress by Daron Acemoglu and Simon Johnson
Professor Acemoglu co-authored a bold reinterpretation of economics and history that will fundamentally change how you see the world
Can We Have Pro-Worker AI?
Professor Acemoglu co-authored this paper about redirecting AI development onto the human-complementary path
Rethinking Capitalism: In Conversation with Daron Acemoglu
The Wheeler Institute for Business and Development hosted Professor Acemoglu to examine how technology affects the distribution and growth of resources while being shaped by economic and social incentives
RECOMMENDED YUA EPISODES
Suicides. Self harm. Depression and anxiety. The toll of a social media-addicted, phone-based childhood has never been more stark. It can be easy for teens, parents and schools to feel like they’re trapped by it all. But in this conversation with Tristan Harris, author and social psychologist Jonathan Haidt makes the case that the conditions that led to today’s teenage mental health crisis can be turned around – with specific, achievable actions we all can take starting today.
This episode was recorded live at the San Francisco Commonwealth Club.
Correction: Tristan mentions that 40 Attorneys General have filed a lawsuit against Meta for allegedly fostering addiction among children and teens through their products. The actual number is 42.
Clarification: Jonathan refers to the Wait Until 8th pledge. By signing the pledge, a parent promises not to give their child a smartphone until at least the end of 8th grade. The pledge becomes active once at least ten other families from their child’s grade pledge the same.
Beneath the race to train and release more powerful AI models lies another race: a race by companies and nation-states to secure the hardware to make sure they win AI supremacy.
Correction: The latest available Nvidia chip is the Hopper H100 GPU, which has 80 billion transistors. Since the first commercially available chip had four transistors, the Hopper actually has 20 billion times that number. Nvidia recently announced the Blackwell, which boasts 208 billion transistors - but it won’t ship until later this year.
RECOMMENDED MEDIA
Chip War: The Fight For the World’s Most Critical Technology by Chris Miller
To make sense of the current state of politics, economics, and technology, we must first understand the vital role played by chips
Gordon Moore Biography & Facts
Gordon Moore, the Intel co-founder behind Moore's Law, passed away in March of 2023
AI’s most popular chipmaker Nvidia is trying to use AI to design chips faster
Nvidia's GPUs are in high demand - and the company is using AI to accelerate chip production
RECOMMENDED YUA EPISODES
Future-proofing Democracy In the Age of AI with Audrey Tang
How Will AI Affect the 2024 Elections? with Renee DiResta and Carl Miller
The AI ‘Race’: China vs. the US with Jeffrey Ding and Karen Hao
Protecting Our Freedom of Thought with Nita Farahany
What does a functioning democracy look like in the age of artificial intelligence? Could AI even be used to help a democracy flourish? Just in time for election season, Taiwan’s Minister of Digital Affairs Audrey Tang returns to the podcast to discuss healthy information ecosystems, resilience to cyberattacks, how to “prebunk” deepfakes, and more.
RECOMMENDED MEDIA
This academic paper addresses tough questions for Americans: Who governs? Who really rules?
Recursive Public is an experiment in identifying areas of consensus and disagreement among the international AI community, policymakers, and the general public on key questions of governance
A Strong Democracy is a Digital Democracy
Audrey Tang’s 2019 op-ed for The New York Times
The Frontiers of Digital Democracy
Nathan Gardels interviews Audrey Tang in Noema
RECOMMENDED YUA EPISODES
Digital Democracy is Within Reach with Audrey Tang
The Tech We Need for 21st Century Democracy with Divya Siddarth
How Will AI Affect the 2024 Elections? with Renee DiResta and Carl Miller
Was it political progress, or just political theater? The recent Senate hearing with social media CEOs led to astonishing moments — including Mark Zuckerberg’s public apology to families who lost children following social media abuse. Our panel of experts, including Facebook whistleblower Frances Haugen, untangles the explosive hearing, and offers a look ahead, as well. How will this hearing impact protocol within these social media companies? How will it impact legislation? In short: will anything change?
Clarification: Julie says that shortly after the hearing, Meta’s stock price had the biggest increase of any company in the stock market’s history. It was the biggest one-day gain by any company in Wall Street history.
Correction: Frances says it takes Snap three or four minutes to take down exploitative content. In Snap's most recent transparency report, they list six minutes as the median turnaround time to remove exploitative content.
RECOMMENDED MEDIA
Founded by Julie Scelfo, Get Media Savvy is a non-profit initiative working to establish a healthy media environment for kids and families
The Power of One by Frances Haugen
The inside story of Frances’s quest to bring transparency and accountability to Big Tech
RECOMMENDED YUA EPISODES
Real Social Media Solutions, Now with Frances Haugen
A Conversation with Facebook Whistleblower Frances Haugen
Social Media Victims Lawyer Up with Laura Marquez-Garrett
Over the past year, a tsunami of apps that digitally strip the clothes off real people has hit the market. Now anyone can create fake non-consensual sexual images in just a few clicks. With cases proliferating in high schools, guest presenter Laurie Segall talks to legal scholar Mary Anne Franks about the AI-enabled rise in deepfake porn and what we can do about it.
Correction: Laurie refers to the app 'Clothes Off.' It’s actually named Clothoff. There are many clothes remover apps in this category.
RECOMMENDED MEDIA
Revenge Porn: The Cyberwar Against Women
In a five-part digital series, Laurie Segall uncovers a disturbing internet trend: the rise of revenge porn
In this provocative book, Mary Anne Franks examines the thin line between constitutional fidelity and constitutional fundamentalism
Fake Explicit Taylor Swift Images Swamp Social Media
Calls to protect women and crack down on the platforms and technology that spread such images have been reignited
RECOMMENDED YUA EPISODES
No One is Immune to AI Harms
Esther Perel on Artificial Intimacy
Social Media Victims Lawyer Up
The AI Dilemma
We usually talk about tech in terms of economics or policy, but the casual language tech leaders often use to describe AI — summoning an inanimate force with the powers of code — sounds more... magical. So, what can myth and magic teach us about the AI race? Josh Schrei, mythologist and host of The Emerald podcast, says that foundational cultural tales like "The Sorcerer's Apprentice" or Prometheus teach us the importance of initiation, responsibility, human knowledge, and care. He argues these stories and myths can guide ethical tech development by reminding us what it is to be human.
Correction: Josh says the first telling of "The Sorcerer’s Apprentice" myth dates back to ancient Egypt, but it actually dates back to ancient Greece.
RECOMMENDED MEDIA
The Emerald explores the human experience through a vibrant lens of myth, story, and imagination
Embodied Ethics in The Age of AI
A five-part course with The Emerald podcast’s Josh Schrei and School of Wise Innovation’s Andrew Dunn
Nature Nurture: Children Can Become Stewards of Our Delicate Planet
A U.S. Department of the Interior study found that the average American kid can identify hundreds of corporate logos but not plants and animals
AI is revolutionizing the world - here's how democracies can come out on top. This upcoming book was authored by an architect of President Biden's AI executive order
RECOMMENDED YUA EPISODES
How Will AI Affect the 2024 Elections?
The Three Rules of Humane Tech
2024 will be the biggest election year in world history. Forty countries will hold national elections, with over two billion voters heading to the polls. In this episode of Your Undivided Attention, two experts give us a situation report on how AI will increase the risks to our elections and our democracies.
Correction: Tristan says two billion people from 70 countries will be undergoing democratic elections in 2024. The number expands to 70 when non-national elections are factored in.
RECOMMENDED MEDIA
White House AI Executive Order Takes On Complexity of Content Integrity Issues
Renee DiResta’s piece in Tech Policy Press about content integrity within President Biden’s AI executive order
The Stanford Internet Observatory
A cross-disciplinary program of research, teaching and policy engagement for the study of abuse in current information technologies, with a focus on social media
Britain’s leading cross-party think tank
Invisible Rulers: The People Who Turn Lies into Reality by Renee DiResta
Pre-order Renee’s upcoming book that’s landing on shelves June 11, 2024
RECOMMENDED YUA EPISODES
The Spin Doctors Are In with Renee DiResta
From Russia with Likes Part 1 with Renee DiResta
From Russia with Likes Part 2 with Renee DiResta
Esther Perel on Artificial Intimacy
A Conversation with Facebook Whistleblower Frances Haugen
You asked, we answered. This has been a big year in the world of tech, with the rapid proliferation of artificial intelligence, acceleration of neurotechnology, and continued ethical missteps of social media. Looking back on 2023, there are still so many questions on our minds, and we know you have a lot of questions too. So we created this episode to respond to listener questions and to reflect on what lies ahead.
Correction: Tristan mentions that 41 Attorneys General have filed a lawsuit against Meta for allegedly fostering addiction among children and teens through their products. The actual number is 42.
Correction: Tristan refers to Casey Mock as the Center for Humane Technology’s Chief Policy and Public Affairs Manager. His title is Chief Policy and Public Affairs Officer.
RECOMMENDED MEDIA
Marietje Schaake curates this briefing on artificial intelligence and technology policy from around the world
President Biden’s executive order on the safe, secure, and trustworthy development and use of AI
Meta sued by 42 AGs for addictive features targeting kids
A bipartisan group of 42 attorneys general is suing Meta, alleging features on Facebook and Instagram are addictive and are aimed at kids and teens
RECOMMENDED YUA EPISODES
The Three Rules of Humane Tech
Two Million Years in Two Hours: A Conversation with Yuval Noah Harari
Inside the First AI Insight Forum in Washington
Digital Democracy is Within Reach with Audrey Tang
The Tech We Need for 21st Century Democracy with Divya Siddarth
Mind the (Perception) Gap with Dan Vallone
Can We Govern AI? with Marietje Schaake
Ask Us Anything: You Asked, We Answered
As AI development races forward, a fierce debate has emerged over open source AI models. So what does it mean to open-source AI? Are we opening Pandora’s box of catastrophic risks? Or is open-sourcing AI the only way we can democratize its benefits and dilute the power of big tech?
Correction: When discussing the large language model Bloom, Elizabeth said it functions in 26 different languages. Bloom is actually able to generate text in 46 natural languages and 13 programming languages - and more are in the works.
RECOMMENDED MEDIA
Open-Sourcing Highly Capable Foundation Models
This report, co-authored by Elizabeth Seger, attempts to clarify open-source terminology and to offer a thorough analysis of risks and benefits from open-sourcing AI
BadLlama: cheaply removing safety fine-tuning from Llama 2-Chat 13B
This paper, co-authored by Jeffrey Ladish, demonstrates that it’s possible to effectively undo the safety fine-tuning from Llama 2-Chat 13B with less than $200 while retaining its general capabilities
Centre for the Governance of AI
Supports governments, technology companies, and other key institutions by producing relevant research and guidance around how to respond to the challenges posed by AI
AI: Futures and Responsibility (AI:FAR)
Aims to shape the long-term impacts of AI in ways that are safe and beneficial for humanity
Studies the offensive capabilities of AI systems today to better understand the risk of losing control to AI systems forever
RECOMMENDED YUA EPISODES
A First Step Toward AI Regulation with Tom Wheeler
No One is Immune to AI Harms with Dr. Joy Buolamwini
Mustafa Suleyman Says We Need to Contain AI. How Do We Do It?
On Monday, Oct. 30, President Biden released a sweeping executive order that addresses many risks of artificial intelligence. Tom Wheeler, former chairman of the Federal Communications Commission, shares his insights on the order with Tristan and Aza and discusses what’s next in the push toward AI regulation.
Clarification: When quoting Thomas Jefferson, Aza incorrectly says “regime” instead of “regimen.” The correct quote is: “I am not an advocate for frequent changes in laws and constitutions, but laws and institutions must go hand in hand with the progress of the human mind. And as that becomes more developed, more enlightened, as new discoveries are made, new truths discovered, and manners and opinions change, with the change of circumstances, institutions must advance also to keep pace with the times. We might as well require a man to wear still the coat which fitted him when a boy as civilized society to remain ever under the regime of their barbarous ancestors.”
RECOMMENDED MEDIA
President Biden’s Executive Order on the safe, secure, and trustworthy development and use of AI
The summit brings together international governments, leading AI companies, civil society groups, and experts in research to consider the risks of AI and discuss how they can be mitigated through internationally coordinated action
An open letter calling for an international AI treaty
Techlash: Who Makes the Rules in the Digital Gilded Age?
Praised by Kirkus Reviews as “a rock-solid plan for controlling the tech giants,” readers will be energized by Tom Wheeler’s vision of digital governance
RECOMMENDED YUA EPISODES
Inside the First AI Insight Forum in Washington
Digital Democracy is Within Reach with Audrey Tang
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
In this interview, Dr. Joy Buolamwini argues that algorithmic bias in AI systems poses risks to marginalized people. She challenges the assumptions of tech leaders who advocate for AI “alignment” and explains why some tech companies are hypocritical when it comes to addressing bias.
Dr. Joy Buolamwini is the founder of the Algorithmic Justice League and the author of “Unmasking AI: My Mission to Protect What Is Human in a World of Machines.”
Correction: Aza says that Sam Altman, the CEO of OpenAI, predicts superintelligence in four years. Altman predicts superintelligence in ten years.
RECOMMENDED MEDIA
Unmasking AI by Joy Buolamwini
“The conscience of the AI revolution” explains how we’ve arrived at an era of AI harms and oppression, and what we can do to avoid its pitfalls
Shalini Kantayya’s film explores the fallout of Dr. Joy’s discovery that facial recognition does not see dark-skinned faces accurately, and her journey to push for the first-ever legislation in the U.S. to govern against bias in the algorithms that impact us all
How I’m fighting bias in algorithms
Dr. Joy’s 2016 TED Talk about her mission to fight bias in machine learning, a phenomenon she calls the "coded gaze."
RECOMMENDED YUA EPISODES
Mustafa Suleyman Says We Need to Contain AI. How Do We Do It?
Protecting Our Freedom of Thought with Nita Farahany
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
“This is going to be the most productive decade in the history of our species,” says Mustafa Suleyman, author of “The Coming Wave,” CEO of Inflection AI, and co-founder of DeepMind. But in order to truly reap the benefits of AI, we need to learn how to contain it. Paradoxically, part of that will mean collectively saying no to certain forms of progress. As an industry leader reckoning with a future that’s about to be ‘turbocharged,’ Mustafa says we can all play a role in shaping the technology, both in hands-on ways and by advocating for appropriate governance.
RECOMMENDED MEDIA
The Coming Wave: Technology, Power, and the 21st Century’s Greatest Dilemma
This new book from Mustafa Suleyman is a must-read guide to the technological revolution just starting, and the transformed world it will create
Partnership on AI is bringing together diverse voices from across the AI community to create resources for advancing positive outcomes for people and society
Policy Reforms Toolkit from the Center for Humane Technology
Digital lawlessness has been normalized in the name of innovation. It’s possible to craft policy that protects the conditions we need to thrive
RECOMMENDED YUA EPISODES
Can We Govern AI? with Marietje Schaake
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
Last week, Senator Chuck Schumer brought together Congress and many of the biggest names in AI for the first closed-door AI Insight Forum in Washington, D.C. Tristan and Aza were invited speakers at the event, along with Elon Musk, Satya Nadella, Sam Altman, and other leaders. In this update on Your Undivided Attention, Tristan and Aza recount how they felt the meeting went, what they communicated in their statements, and what it felt like to critique Meta’s LLM in front of Mark Zuckerberg.
Correction: In this episode, Tristan says GPT-3 couldn’t find vulnerabilities in code. GPT-3 could find security vulnerabilities, but GPT-4 is exponentially better at it.
RECOMMENDED MEDIA
In Show of Force, Silicon Valley Titans Pledge ‘Getting This Right’ With A.I.
Elon Musk, Sam Altman, Mark Zuckerberg, Sundar Pichai and others discussed artificial intelligence with lawmakers, as tech companies strive to influence potential regulations
Majority Leader Schumer Opening Remarks For The Senate’s Inaugural AI Insight Forum
Senate Majority Leader Chuck Schumer (D-NY) opened the Senate’s inaugural AI Insight Forum
The Wisdom Gap
As seen in Tristan’s talk on this subject in 2022, the scope and speed of our world’s issues are accelerating and growing more complex. And yet, our ability to comprehend those challenges and respond accordingly is not matching pace
RECOMMENDED YUA EPISODES
Spotlight On AI: What Would It Take For This to Go Well?
The AI ‘Race’: China vs. the US with Jeffrey Ding and Karen Hao
Spotlight: Elon, Twitter and the Gladiator Arena
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
Where do the top Silicon Valley AI researchers really think AI is headed? Do they have a plan if things go wrong? In this episode, Tristan Harris and Aza Raskin reflect on the last several months of highlighting AI risk, and share their insider takes on a high-level workshop run by CHT in Silicon Valley.
NOTE: Tristan refers to journalist Maria Ressa and mentions that she received 80 hate messages per hour at one point. She actually received more than 90 messages an hour.
RECOMMENDED MEDIA
Musk, Zuckerberg, Gates: The titans of tech will talk AI at private Capitol summit
This week will feature a series of public hearings on artificial intelligence. But all eyes will be on the closed-door gathering convened by Senate Majority Leader Chuck Schumer
Takeaways from the roundtable with President Biden on artificial intelligence
Tristan Harris talks about his recent meeting with President Biden to discuss regulating artificial intelligence
Biden, Harris meet with CEOs about AI risks
Vice President Kamala Harris met with the heads of Google, Microsoft, Anthropic, and OpenAI as the Biden administration rolled out initiatives meant to ensure that AI improves lives without putting people’s rights and safety at risk
RECOMMENDED YUA EPISODES
The AI ‘Race’: China vs the US with Jeffrey Ding and Karen Hao
The Dictator’s Playbook with Maria Ressa
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
In the debate over slowing down AI, we often hear the same argument against regulation. “What about China? We can’t let China get ahead.” To dig into the nuances of this argument, Tristan and Aza speak with academic researcher Jeffrey Ding and journalist Karen Hao, who take us through what’s really happening in Chinese AI development. They address China’s advantages and limitations, what risks are overblown, and what, in this multi-national competition, is at stake as we imagine the best possible future for everyone.
CORRECTION: Jeffrey Ding says the export controls on advanced chips that were established in October 2022 only apply to military end-users. The controls also impose a license requirement on the export of those advanced chips to any China-based end-user.
RECOMMENDED MEDIA
Recent Trends in China’s Large Language Model Landscape by Jeffrey Ding and Jenny W. Xiao
This study covers a sample of 26 large-scale pre-trained AI models developed in China
This paper argues for placing a greater weight on a state’s capacity to diffuse, or widely adopt, innovations
U.S. moves to cut research ties with China over security concerns threaten American progress in critical areas
Military technology has grown so complex that it’s hard to imitate
RECOMMENDED YUA EPISODES
The Three Rules of Humane Tech
A Fresh Take on Tech in China with Rui Ma and Duncan Clark
Digital Democracy is Within Reach with Audrey Tang
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
For all the talk about AI, we rarely hear about how it will change our relationships. As we swipe to find love and consult chatbot therapists, acclaimed psychotherapist and relationship expert Esther Perel warns of another harmful “AI” on the rise: Artificial Intimacy, which is depriving us of real connection. Tristan and Esther discuss how depending on algorithms can fuel alienation, and then imagine how we might design technology to strengthen our social bonds.
RECOMMENDED MEDIA
Mating in Captivity by Esther Perel
Esther's debut work on the intricacies behind modern relationships, and the dichotomy of domesticity and sexual desire
The State of Affairs by Esther Perel
Esther takes a look at modern relationships through the lens of infidelity
Where Should We Begin? with Esther Perel
Listen in as real couples in search of help bare the raw and profound details of their stories
Esther’s podcast that focuses on the hard conversations we're afraid to have at work
A young man strikes up an unconventional relationship with a doll he finds on the internet
In a near future, a lonely writer develops an unlikely relationship with an operating system designed to meet his every need
RECOMMENDED YUA EPISODES
Big Food, Big Tech and Big AI with Michael Moss
The Three Rules of Humane Tech
Digital Democracy is Within Reach with Audrey Tang
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
We are on the cusp of an explosion of cheap, consumer-ready neurotechnology - from earbuds that gather our behavioral data, to sensors that can read our dreams. And it’s all going to be supercharged by AI. This technology is moving from niche to mainstream - and it has the same potential for exponential growth.
Legal scholar Nita Farahany talks us through the current state of neurotechnology and its deep links to AI. She says that we urgently need to protect the last frontier of privacy: our internal thoughts. And she argues that without a new legal framework around “cognitive liberty,” we won’t be able to insulate our brains from corporate and government intrusion.
RECOMMENDED MEDIA
The Battle for Your Brain offers a path forward to navigate the complex dilemmas that will fundamentally impact our freedom to understand, shape, and define ourselves
Computer Program Reveals What Neurons in the Visual Cortex Prefer to Look At
A study of macaque monkeys at Harvard generated valuable clues based on an artificial intelligence system that can reliably determine what neurons in the brain’s visual cortex prefer to see
Understanding Media: The Extensions of Man by Marshall McLuhan
An influential work by a fixture in media discourse
RECOMMENDED YUA EPISODES
The Three Rules of Humane Tech
Talking With Animals… Using AI
How to Free Our Minds with Cult Deprogramming Expert Dr. Steven Hassan
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
Social media was humanity’s ‘first contact’ moment with AI. If we’re going to create laws that are strong enough to prevent AI from destroying our societies, we could benefit from taking a look at the major lawsuits against social media platforms that are playing out in our courts right now.
In our last episode, we took a close look at Big Food and its dangerous “race to the bottom,” which parallels AI. We continue that theme this week with an episode about litigating social media and the consequences of the race for engagement, in order to inform how we can approach AI harms.
Our guest, attorney Laura Marquez-Garrett, left her predominantly defense-oriented practice to join the Social Media Victims Law Center in February 2022. Laura is on the front lines of the battle to hold social media firms accountable for the harms they have caused in young people’s lives over the past decade.
Listener warning: there are distressing and potentially triggering details within the episode.
Correction: Tristan refers to the Social Media Victims Law Center as a nonprofit legal center. They are a for-profit law firm.
RECOMMENDED MEDIA
1) If you're a parent whose child has been impacted by social media, Attorneys General in Colorado, New Hampshire, and Tennessee are asking to hear your story. Your testimonies can help ensure that social media platforms are designed safely for kids. For more information, please visit the respective state links.
Tennessee
2) Social Media Victims Law Center
A law firm founded in 2021 in response to the testimony of Facebook whistleblower Frances Haugen
3) Resources for Parents & Educators
Overwhelmed by our broken social media environment and wondering where to start? Check out our Youth Toolkit plus three actions you can take today
4) The Social Dilemma
Learn how the system works. Watch and share The Social Dilemma with people you care about
RECOMMENDED YUA EPISODES
Transcending the Internet Hate Game with Dylan Marron
A Conversation with Facebook Whistleblower Frances Haugen
Behind the Curtain on The Social Dilemma with Jeff Orlowski-Yang and Larissa Rhodes
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
In the next two episodes of Your Undivided Attention, we take a close look at two industries, Big Food and social media, which represent dangerous “races to the bottom” and have big parallels with AI.
And we are asking: what can our past mistakes and missed opportunities teach us about how we should approach AI harms?
In this first episode, Tristan talks to Pulitzer Prize-winning journalist and author Michael Moss. His book Salt Sugar Fat: How the Food Giants Hooked Us rocked the processed food industry when it came out in 2013.
Tristan and Michael discuss how we can leverage the lessons learned from Big Food’s coordination failures, and whether it’s the responsibility of the consumer, the government, or the companies to regulate.
RECOMMENDED MEDIA
Salt Sugar Fat: How the Food Giants Hooked Us
Michael’s New York Times bestseller. You’ll never look at a nutrition label the same way again
Hooked: Food, Free Will, and How the Food Giants Exploit Our Addictions
Pulitzer Prize winner Michael Moss’s exposé of how the processed food industry exploits our evolutionary instincts, the emotions we associate with food, and legal loopholes in its pursuit of profit over public health
Center for Humane Technology’s recently updated Take Control Toolkit
RECOMMENDED YUA EPISODES
How Might a Long-Term Stock Market Transform Tech? (ZigZag episode)
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
What happens when creators consider what lifelong human development looks like in terms of the tools we make? And what philosophies from Sesame Street can inform how to steward the power of AI and social media to influence minds in thoughtful, humane directions?
When the first episode of Sesame Street aired on public television in 1969, it was unlike anything that had been on television before - a collaboration between educators, child psychologists, comedy writers and puppeteers, all working together to do something that had never been done: create educational content for children on television.
Fast-forward to the present: could we switch gears to reprogram today’s digital tools to humanely educate the next generation?
That’s the question Tristan Harris and Aza Raskin explore with Dr. Rosemarie Truglio, the Senior Vice President of Curriculum and Content for the Sesame Workshop, the non-profit behind Sesame Street.
RECOMMENDED MEDIA
Street Gang: How We Got to Sesame Street
This documentary offers a rare window into the early days of Sesame Street, revealing the creators, artists, writers and educators who together established one of the most influential and enduring children’s programs in television history
Sesame Street: Ready for School!: A Parent's Guide to Playful Learning for Children Ages 2 to 5 by Dr. Rosemarie Truglio
Rosemarie shares all the research-based, curriculum-directed school readiness skills that have made Sesame Street the preeminent children's TV program
This volume serves as a marker of the significant role that Sesame Street plays in the education and socialization of young children
The Democratic Surround by Fred Turner
In this prequel to his celebrated book From Counterculture to Cyberculture, Turner rewrites the history of postwar America, showing how in the 1940s and 1950s American liberalism offered a far more radical social vision than we now remember
Amusing Ourselves to Death by Neil Postman
Neil Postman’s groundbreaking book about the damaging effects of television on our politics and public discourse has been hailed as a twenty-first-century book published in the twentieth century
Sesame Workshop Identity Matters Study
Explore parents’ and educators’ perceptions of children’s social identity development
Effects of Sesame Street: A meta-analysis of children's learning in 15 countries
Commissioned by Sesame Workshop, the study was led by University of Wisconsin researchers Marie-Louise Mares and Zhongdang Pan
U.S. Parents & Teachers See an Unkind World for Their Children, New Sesame Survey Shows
According to the survey titled, “K is for Kind: A National Survey On Kindness and Kids,” parents and teachers in the United States worry that their children are living in an unkind world
RECOMMENDED YUA EPISODES
Are the Kids Alright? With Jonathan Haidt
The Three Rules of Humane Tech
When Media Was for You and Me with Fred Turner
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
You’re likely familiar with the modern zombie trope: a zombie bites someone you care about and they’re transformed into a creature who wants your brain. Zombies are the perfect metaphor to explain something Tristan and Aza have been thinking about lately that they call zombie values.
In this Spotlight episode of Your Undivided Attention, we talk through some examples of how zombie values limit our thinking around tech harms. Our hope is that by the end of this episode, you'll be able to recognize the zombie values that walk amongst us, and think through how to upgrade these values to meet the realities of our modern world.
RECOMMENDED MEDIA
Is the First Amendment Obsolete?
This essay explores free expression challenges
This blog post from the Center for Humane Technology describes the gap between the rising interconnected complexity of our problems and our ability to make sense of them
RECOMMENDED YUA EPISODES
A Problem Well-Stated is Half Solved with Daniel Schmachtenberger
How To Free Our Minds with Cult Deprogramming Expert Steve Hassan
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
There’s really no one better than veteran tech journalist Kara Swisher at challenging people to articulate their thinking. Tristan Harris recently sat down with her for a wide-ranging interview on AI risk. She even pressed Tristan on whether he is a doomsday prepper. It was so great, we wanted to share it with you here.
The interview was originally on Kara’s podcast ON with Kara Swisher. If you like it and want to hear more of Kara’s interviews with folks like Sam Altman, Reid Hoffman and others, you can find more episodes of ON with Kara Swisher here: https://link.chtbl.com/_XTWwg3k
RECOMMENDED YUA EPISODES
The Three Rules of Humane Tech
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
Democracy in action has looked the same for generations. Constituents might go to a library or school every one or two years and cast their vote for people who don't actually represent everything that they care about. Our technology is rapidly increasing in sophistication, yet our forms of democracy have largely remained unchanged. What would an upgrade look like - not just for democracy, but for all the different places that democratic decision-making happens?
On this episode of Your Undivided Attention, we’re joined by political economist and social technologist Divya Siddarth, one of the world's leading experts in collective intelligence. Together we explore how new kinds of governance can be supported through better technology, and how collective decision-making is key to unlocking everything from more effective elections to better ways of responding to global problems like climate change.
Correction:
Tristan mentions Elon Musk’s attempt to manufacture ventilators early on in the COVID-19 pandemic. Musk ended up buying over 1,200 ventilators that were delivered to California.
RECOMMENDED MEDIA
Against Democracy by Jason Brennan
A provocative challenge to one of our most cherished institutions
Technology platforms have created a race for human attention that’s unleashed invisible harms to society. Here are some of the costs that aren't showing up on their balance sheets
This blog post from the Center for Humane Technology describes the gap between the rising interconnected complexity of our problems and our ability to make sense of them
DemocracyNext is working to design and establish new institutions for government and transform the governance of organizations that influence public life
An incubator for new governance models for transformative technology
Transform community engagement through consensus
Kazm’s Living Room Conversations
Living Room Conversations works to heal society by connecting people across divides through guided conversations proven to build understanding and transform communities
A model for citizen participation in Ostbelgien, which was brought to life by the parliament of the German-speaking community
Asamblea Ciudadana Para El Clima
Spain’s national citizens’ assembly on climate change
The UK’s national citizens’ assembly on climate change
Citizens’ Convention for the Climate
France’s national citizens’ assembly on climate change
Polis is a real-time system for gathering, analyzing and understanding what large groups of people think in their own words, enabled by advanced statistics and machine learning
RECOMMENDED YUA EPISODES
Digital Democracy is Within Reach with Audrey Tang
They Don’t Represent Us with Larry Lessig
A Renegade Solution to Extractive Economics with Kate Raworth
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
A few episodes back, we presented Tristan Harris and Aza Raskin’s talk The AI Dilemma. People inside the companies that are building generative artificial intelligence came to us with their concerns about the rapid pace of deployment and the problems that are emerging as a result. We felt called to lay out the catastrophic risks that AI poses to society and sound the alarm on the need to upgrade our institutions for a post-AI world.
The talk resonated - over 1.6 million people have viewed it on YouTube as of this episode’s release date. The positive reception gives us hope that leaders will be willing to come to the table for a difficult but necessary conversation about AI.
However, now that so many people have watched or listened to the talk, we’ve found that there are some AI myths getting in the way of making progress. On this episode of Your Undivided Attention, we debunk five of those misconceptions.
RECOMMENDED MEDIA
In this New York Times piece, Yuval Harari, Tristan Harris, and Aza Raskin call upon world leaders to respond to this moment at the level of challenge it presents.
A deep dive into the game theory and exponential growth underlying our modern economic system, and how recent advancements in AI are poised to turn up the pressure on that system, and its wider environment, in ways we have never seen before
RECOMMENDED YUA EPISODES
The Three Rules of Humane Tech
Can We Govern AI? with Marietje Schaake
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
Despite our serious concerns about the pace of deployment of generative artificial intelligence, we are not anti-AI. There are uses that can help us better understand ourselves and the world around us. Your Undivided Attention co-host Aza Raskin is also co-founder of Earth Species Project, a nonprofit dedicated to using AI to decode non-human communication. ESP is developing this technology both to shift the way that we relate to the rest of nature, and to accelerate conservation research.
Significant recent breakthroughs in machine learning have opened up ways to encode human languages and to map out patterns of animal communication. The research, while slow and incredibly complex, is very exciting. Picture being able to tell a whale to dive to avoid a ship strike, or to forge cooperation across conservation areas.
These advances come with their own complex ethical issues. But understanding non-human languages could transform our relationship with the rest of nature and promote a duty of care for the natural world.
In a time of such deep division, it’s comforting to know that hidden underlying languages may yet unite us. When we study the patterns of the universe, we’ll see that humanity isn’t at its center.
Corrections:
Aza refers to the founding of Earth Species Project (ESP) in 2017. The organization was established in 2018.
When offering examples of self-awareness in animals, Aza mentions lemurs that get high on centipedes. They actually get high on millipedes.
RECOMMENDED MEDIA
Using AI to Listen to All of Earth’s Species
An interactive panel discussion hosted at the World Economic Forum in San Francisco on October 25, 2022. Featuring ESP President and Cofounder Aza Raskin; Dr. Karen Bakker, Professor at UBC and Harvard Radcliffe Institute Fellow; and Dr. Ari Friedlaender, Professor at UC Santa Cruz
What A Chatty Monkey May Tell Us About Learning to Talk
The gelada monkey makes a gurgling sound that scientists say is close to human speech
Lemurs May Be Making Medicine Out of Millipedes
Red-fronted lemurs appear to use plants and other animals to treat their afflictions
Two biologists set out on an undertaking as colossal as their subjects – deciphering the complex communication of whales
Earth Species Project is Hiring a Director of Research
ESP is looking for a thought leader in artificial intelligence with a track record of managing a team of researchers
RECOMMENDED YUA EPISODES
The Three Rules of Humane Tech
Synthetic Humanity: AI & What’s At Stake
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
When it comes to AI, what kind of regulations might we need to address this rapidly developing new class of technologies? What makes regulating AI and runaway tech in general different from regulating airplanes, pharmaceuticals, or food? And how can we ensure that issues like national security don't become a justification for sacrificing civil rights?
Answers to these questions are playing out in real time. If we wait for more AI harms to emerge before proper regulations are put in place, it may be too late.
Our guest Marietje Schaake was at the forefront of crafting tech regulations for the EU. In spite of AI’s complexity, she argues there is a path forward for the U.S. and other governing bodies to rein in companies that continue to release these products into the world without oversight.
Correction: Marietje said antitrust laws in the US were a century ahead of those in the EU. Competition law in the EU was enacted as part of the Treaty of Rome in 1957, almost 70 years after the US.
RECOMMENDED MEDIA
Tristan Harris and Aza Raskin’s presentation on existing AI capabilities and the catastrophic risks they pose to a functional society. Also available in the podcast format (linked below)
This blog post from the Center for Humane Technology describes the gap between the rising interconnected complexity of our problems and our ability to make sense of them
The EU’s Digital Services Act (DSA) & Digital Markets Act (DMA)
The two pieces of legislation aim to create safer and more open digital spaces for individuals and businesses alike
RECOMMENDED YUA EPISODES
Digital Democracy is Within Reach with Audrey Tang
The Three Rules of Humane Tech
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
In our previous episode, we shared a presentation Tristan and Aza recently delivered to a group of influential technologists about the race happening in AI. In that talk, they introduced the Three Rules of Humane Technology. In this Spotlight episode, we’re taking a moment to explore these three rules more deeply in order to clarify what it means to be a responsible technologist in the age of AI.
Correction: Aza mentions infinite scroll being in the pockets of 5 billion people, implying that there are 5 billion smartphone users worldwide. The number of smartphone users worldwide is actually 6.8 billion now.
RECOMMENDED MEDIA
We Think in 3D. Social Media Should, Too
Tristan Harris writes about a simple visual experiment that demonstrates the power of one’s point of view
Let’s Think About Slowing Down AI
Katja Grace’s piece about how to avert doom by not building the doom machine
If We Don’t Master AI, It Will Master Us
Yuval Harari, Tristan Harris and Aza Raskin call upon world leaders to respond to this moment at the level of challenge it presents in this New York Times opinion piece
RECOMMENDED YUA EPISODES
Synthetic humanity: AI & What’s At Stake
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
You may have heard about the arrival of GPT-4, OpenAI’s latest large language model (LLM) release. GPT-4 surpasses its predecessor in terms of reliability, creativity, and ability to process intricate instructions. It can handle more nuanced prompts than previous releases, and it is multimodal, meaning it was trained on both images and text. We don’t fully understand its capabilities, yet it has already been deployed to the public.
At Center for Humane Technology, we want to close the gap between what the world hears publicly about AI from splashy CEO presentations and what the people who are closest to the risks and harms inside AI labs are telling us. We translated their concerns into a cohesive story and presented the resulting slides to heads of institutions and major media organizations in New York, Washington DC, and San Francisco. The talk you're about to hear is the culmination of that work, which is ongoing.
AI may help us achieve major advances like curing cancer or addressing climate change. But the point we’re making is this: if our dystopia is bad enough, it won’t matter how good the utopia we want to create is. We only get one shot, and we need to move at the speed of getting it right.
RECOMMENDED MEDIA
AI ‘race to recklessness’ could have dire consequences, tech experts warn in new interview
Tristan Harris and Aza Raskin sit down with Lester Holt to discuss the dangers of developing AI without regulation
This made-for-television movie explored the effects of a devastating nuclear holocaust on small-town residents of Kansas
The Day After discussion panel
Moderated by journalist Ted Koppel, a panel of present and former US officials, scientists and writers discussed nuclear weapons policies live on television after the film aired
Zia Cora - Submarines
“Submarines” is a collaboration between musician Zia Cora (Alice Liu) and Aza Raskin. The music video was created by Aza in less than 48 hours using AI technology and published in early 2022
RECOMMENDED YUA EPISODES
Synthetic humanity: AI & What’s At Stake
A Conversation with Facebook Whistleblower Frances Haugen
Two Million Years in Two Hours: A Conversation with Yuval Noah Harari
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
A few months ago on Your Undivided Attention, we released a Spotlight episode on TikTok's national security risks. Since then, we've learned more about the dangers of the Chinese-owned company: we've seen evidence of TikTok spying on US journalists, and proof of hidden state media accounts used to influence US elections. Congress has banned TikTok on most government-issued devices, more than half of US states have done the same, and dozens of US universities have blocked TikTok access on their wifi networks. More people in Western governments and media who once dismissed TikTok as an overblown threat are changing their minds as evidence of national security risks plays out, and there’s even talk of banning TikTok itself in certain countries. But is that the best solution? If we opt for a ban, how do we, as open societies, fight accusations of authoritarianism?
On this episode of Your Undivided Attention, we do a deep dive into these questions with Marc Faddoul. He's the co-director of Tracking Exposed, a nonprofit investigating the influence of social media algorithms on our lives. His work has shown how TikTok tweaks its algorithm to maximize partisan engagement in specific national elections, and how it blocks international news in countries like Russia that are fighting propaganda battles inside their own borders. In other words, we don't all get the same TikTok, because different geopolitical interests may shape which TikTok you see. That is a kind of soft power TikTok wields on a global scale, and it doesn’t get talked about often enough.
We hope this episode leaves you with a lot to think about in terms of what the risks of TikTok are, how it's operating geopolitically, and what we can do about it.
RECOMMENDED MEDIA
Tracking Exposed Special Report: TikTok Content Restriction in Russia
How has the Russian invasion of Ukraine affected the content that TikTok users see in Russia? [Part 1 of Tracking Exposed series]
Tracking Exposed Special Report: Content Restrictions on TikTok in Russia Following the Ukrainian War
How are TikTok’s policy decisions affecting pro-war and anti-war content in Russia? [Part 2 of Tracking Exposed series]
Tracking Exposed Special Report: French Elections 2022
The visibility of French candidates on TikTok and YouTube search engines
The Democratic Surround by Fred Turner
A dazzling cultural history that demonstrates how American intellectuals, artists, and designers from the 1930s-1960s imagined new kinds of collective events that were intended to promote a powerful experience of American democracy in action
RECOMMENDED YUA EPISODES
When Media Was for You and Me with Fred Turner
A Fresh Take on Tech in China with Rui Ma and Duncan Clark
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
It may seem like the rise of artificial intelligence, and increasingly powerful large language models you may have heard of, is moving really fast… and it IS.
But what’s coming next is when we enter synthetic relationships with AI that could come to feel just as real and important as our human relationships... And perhaps even more so.
In this episode of Your Undivided Attention, Tristan and Aza reach beyond the moment to talk about this powerful new AI, and the new paradigm of humanity and computation we’re about to enter.
This is a structural revolution that affects way more than text, art, or even Google search. There are huge benefits to humanity, and we’ll discuss some of those. But we also see that as companies race to develop the best synthetic relationships, we are setting ourselves up for a new generation of harms made exponentially worse by AI’s power to predict, mimic and persuade.
It’s obvious we need ways to steward these tools ethically. So Tristan and Aza also share their ideas for creating a framework for AIs that will help humans become MORE humane, not less.
RECOMMENDED MEDIA
Cybernetics: or, Control and Communication in the Animal and the Machine by Norbert Wiener
A classic and influential work that laid the theoretical foundations of cybernetics and modern control theory
New Chatbots Could Change the World. Can You Trust Them?
The New York Times addresses misinformation and how Siri, Google Search, online marketing and your child’s homework will never be the same
This paper proposes and explores the possibility that language models can be studied as effective proxies for specific human sub-populations in social science research
Earth Species Project, co-founded by Aza Raskin, is a non-profit dedicated to using artificial intelligence to decode non-human communication
Her
A science-fiction romantic drama film written, directed, and co-produced by Spike Jonze
What A Chatty Monkey May Tell Us About Learning To Talk
NPR explores the fascinating world of gelada monkeys and the way they communicate
RECOMMENDED YUA EPISODES
How Political Language is Engineered with Drew Westen & Frank Luntz
It’s easy to tell ourselves we’re living in the world we want – one where Darwinian evolution drives competing technology platforms and capitalism pushes nations to maximize GDP regardless of externalities like carbon emissions. It can feel like evolution and competition are all there is.
If that’s a complete description of what’s driving the world and our collective destiny, that can feel pretty hopeless. But what if that’s not the whole story of evolution?
This is where evolutionary theorist, author, and professor David Sloan Wilson comes in. He has documented moments where an enlightened game of cooperation, rather than competition, is possible. His work shows that humans can and have chosen values like cooperation, altruism and group success over individual competition and selfishness at key moments in our evolution, proving that evolution isn’t just genetic. It’s cultural, and it’s a choice.
In a world where our trajectory isn’t tracking in the direction we want, it's time to slow down and ask: is a different kind of conscious evolution possible?
On Your Undivided Attention, we’re going to update the Darwinian principles of evolution using a critical scientific lens that can help upgrade our ability to cooperate – ranging from the small community-level, all the way to entire technology companies that can cooperate in ways that allow everyone to succeed.
RECOMMENDED MEDIA
This View of Life: Completing the Darwinian Revolution by David Sloan Wilson
Atlas Hugged: The Autobiography of John Galt III by David Sloan Wilson
Governing the Commons: The Evolution of Institutions for Collective Action by Elinor Ostrom
WTF? What’s the Future and Why It’s Up to Us by Tim O’Reilly
Hard Drive: Bill Gates and the Making of the Microsoft Empire by James Wallace & Jim Erickson
RECOMMENDED YUA EPISODES
An Alternative to Silicon Valley Unicorns with Mara Zepeda & Kate “Sassy” Sassoon
A Problem Well-Stated is Half-Solved with Daniel Schmachtenberger
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
Welcome to our first-ever Ask Us Anything episode. Recently we put out a call for questions… and, wow, did you come through! We got more than 100 responses from listeners to this podcast from all over the world. It was really fun going through them all, and really difficult to choose which ones to answer here. But we heard you, and we’ll carry your amazing suggestions and ideas forward with us in 2023.
When we created Your Undivided Attention, the goal was to explore the incredible power technology has over our lives, and how we can use it to catalyze a humane future. Three years and a global pandemic later, we’re more committed than ever to helping meet the moment with crucial conversations about humane technology - even as the tech landscape constantly evolves and world events bring more urgency to the need for technology that unites us, invests in democratic values, and enhances our well-being.
We’ve learned from our guests alongside all of you. Sixty-one episodes later, the podcast has over 16 million unique downloads! That’s a lot of people who care about the promise of humane technology and are working to construct a more humane version of technology in their lives, their family’s lives, and within their communities and society at large. We’re a movement!
Thank you to everyone who submitted questions and comments for us. We loved doing this, and we’re looking forward to doing it again!
Correction:
When discussing DeepMind’s recent paper, Aza said the premise was four people entering their views and opinions, with AI finding the commonality between all of those viewpoints. It was actually three people entering their views and opinions.
RECOMMENDED MEDIA
CHT’s Recommended Reading List:
Foundations of Humane Technology
Our free, self-paced online course for professionals shaping tomorrow’s technology
The Age of Surveillance Capitalism by Shoshana Zuboff
Foundational reading on the attention economy
Algorithms of Oppression by Safiya Umoja Noble
Seminal work on how algorithms in search engines replicate and reinforce bias online and offline
Amusing Ourselves to Death by Neil Postman
Written in 1985, Postman’s work shockingly predicts our current media environment and its effects
Attention Merchants by Tim Wu
A history of how advertisers capture our attention
Doughnut Economics by Kate Raworth
A compass for how to upgrade our economic models to be more regenerative and distributive
Thinking in Systems by Donella Meadows
This excellent primer shows us how to develop systems thinking skills
What Money Can’t Buy: The Moral Limits of Markets by Michael Sandel
Sandel explores how we can prevent market values from reaching into spheres of life where they don’t belong
Essay: Disbelieving Atrocities by Arthur Koestler
Originally published January 9, 1944 in The New York Times
Humane Technology reading list
Comprehensive for those who want to geek out
ORGANIZATIONS TO EXPLORE
Integrity Institute
Integrity Institute advances the theory and practice of protecting the social internet, powered by their community of integrity professionals
All Tech Is Human job board
All Tech Is Human curates roles focused on reducing the harms of technology, diversifying the tech pipeline, and ensuring that technology is aligned with the public interest
Denizen brings together leaders across disciplines to accelerate systemic change
New_Public is a place for thinkers, builders, designers and technologists to meet and share inspiration
Psychology of Technology Institute
PTI is a non-profit network of behavioral scientists, technology designers, and decision-makers that protects and improves psychological health for society by advancing our understanding and effective use of transformative technologies
RxC is a social movement for next-generation political economies
The School for Social Design
The School for Social Design offers three courses on articulating what’s meaningful for different people and how to design for it at smaller and larger scales
TechCongress is a technology policy fellowship on Capitol Hill
RECOMMENDED YUA EPISODES
An Alternative to Silicon Valley Unicorns
https://www.humanetech.com/podcast/54-an-alternative-to-silicon-valley-unicorns
A Problem Well-Stated is Half-Solved
https://www.humanetech.com/podcast/a-problem-well-stated-is-half-solved
Digital Democracy is Within Reach
https://www.humanetech.com/podcast/23-digital-democracy-is-within-reach
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
When you look at the world, it can feel like we're in a precarious moment. If you’ve listened to past episodes, you know we call this the meta-crisis — an era of overlapping and interconnected crises like climate change, polarization, and the rise of decentralized technologies like synthetic biology. It can feel like we’re on a path to destroy ourselves.
That's why we’re talking to Rick Doblin, the founder and executive director of the Multidisciplinary Association for Psychedelic Studies, or MAPS. They’re a nonprofit focused on educating and researching the benefits of using psychedelic therapy to address PTSD and promote humane ways of relating worldwide.
Doblin’s vision is for nothing less than a transformation of society through psychedelic-assisted therapy – not for the drugs themselves, but for their ability to help us react to one another with compassion, appreciate differences, and accept criticism.
Given the perma-crisis we face, it’s provocative to think about a tool that, when prescribed and used safely, could help us overcome rivalrous dynamics out in the world and on social media. If we rescue our hijacked brains, we can heal from the constant trauma inflation we get online, and shrink the perception gap that splits us into tribes.
Both MAPS and Center for Humane Technology want to understand what helps minds heal and be free. We invite you to keep an open mind about a different kind of humane technology as you listen to this episode.
Correction:
Doblin attributes a quote to Stan Grof about psychedelics helping your ego be “transparent to the transcendent.” In his book Pathways to Bliss, Joseph Campbell wrote, "When a deity serves as a model for you, your life becomes transparent to the transcendent as long as you realize the inspiring power of that deity. This means living not in the name of worldly success and achievement, but rather in the name of the transcendent, letting the energy manifest through you.” Grof was likely paraphrasing Campbell’s work and applying it to psychedelics.
Additional credits:
The episode contains an original musical composition by Jeff Sudakin. Used with permission.
RECOMMENDED MEDIA
Multidisciplinary Association for Psychedelic Studies (MAPS)
The non-profit founded by Rick Doblin in 1986 focused on developing medical, legal, and cultural contexts for people to benefit from the careful uses of psychedelics and marijuana. MAPS has some open clinical trials; see details on their website.
In this fascinating dive into the science of psychedelics, Doblin explains how drugs like LSD, psilocybin and MDMA affect your brain - and shows how, when paired with psychotherapy, they could change the way we treat PTSD, depression, substance abuse and more.
How to Change Your Mind by Michael Pollan
Pollan writes of his own consciousness-expanding experiments with psychedelic drugs, and makes the case for why shaking up the brain's old habits could be therapeutic for people facing addiction, depression, or death.
How to Change Your Mind on Netflix
The docuseries version of Pollan’s book
Breath by James Nestor
This popular science book provides a historical, scientific and personal account of breathing, with special focus on the differences between mouth breathing and nasal breathing.
A free app for sleep, anxiety, and stress
RECOMMENDED YUA EPISODES
You Will Never Breathe the Same Again with James Nestor
https://www.humanetech.com/podcast/38-you-will-never-breathe-the-same-again
Two Million Years in Two Hours: A Conversation with Yuval Noah Harari
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
When it comes to social media risk, there is reason to hope for consensus. Center for Humane Technology co-founder Tristan Harris recently helped launch a new initiative called the Council for Responsible Social Media (CRSM) in Washington, D.C. It’s a coalition between religious leaders, public health experts, national security leaders, and former political representatives from both sides - people who just care about making our democracy work.
During this event, Tristan sat down with Facebook whistleblower Frances Haugen, a friend of Center for Humane Technology, to discuss the harm caused to our mental health and global democracy when platforms lack accountability and transparency. The CRSM is bipartisan, and its kickoff serves to boost the solutions Frances and Tristan identify going into 2023.
RECOMMENDED MEDIA
Council for Responsible Social Media (CRSM)
A project of Issue One, CRSM is a cross-partisan group of leaders addressing the negative mental, civic, and public health impacts of social media in America.
Twitter Whistleblower Testifies on Security Issues
Peiter “Mudge” Zatko, a former Twitter security executive, testified on privacy and security issues relating to the social media company before the Senate Judiciary Committee.
Beyond the Screen is a coalition of technologists, designers, and thinkers fighting against online harms, led by the Facebook whistle-blower Frances Haugen.
Our campaign to pressure Facebook to make one immediate change — join us!
RECOMMENDED YUA EPISODES
A Conversation with Facebook Whistleblower Frances Haugen
https://www.humanetech.com/podcast/42-a-conversation-with-facebook-whistleblower-frances-haugen
A Facebook Whistleblower: Sophie Zhang
https://www.humanetech.com/podcast/episode-37-a-facebook-whistleblower
Mr. Harris Zooms to Washington
https://www.humanetech.com/podcast/episode-35-mr-harris-zooms-to-washington
With Great Power Comes… No Responsibility?
https://www.humanetech.com/podcast/3-with-great-power-comes-no-responsibility
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
The weekly American news show 60 Minutes recently invited Center for Humane Technology co-founder Tristan Harris back to discuss political polarization and the anger and incivility that social media elevates as a matter of corporate profit. We're releasing a special episode of Your Undivided Attention this week to dig into some of the important nuances of this complex problem.
CHT’s work was actually introduced to the world by Anderson Cooper on 60 Minutes back in 2017, and we’re honored to have been invited back. In this new interview, we cover the business model of competing for engagement at all costs - the real root of the problem that we’re thrilled to be able to discuss on a far-reaching platform.
We also busted the myth that if you’re not on social media, you don’t need to be concerned. Even if you're not on social media, you likely live in a country whose elections are shaped by other people’s collective choices and behaviors. And we know that media shapes the people who consume it.
RECOMMENDED MEDIA
60 Minutes: “Social Media and Political Polarization in America”
https://humanetech.com/60minutes
Amusing Ourselves to Death by Neil Postman
https://www.penguinrandomhouse.com/books/297276/amusing-ourselves-to-death-by-neil-postman/
Neil Postman’s groundbreaking book about the damaging effects of television on our politics and public discourse has been hailed as a twenty-first-century book published in the twentieth century.
60 Minutes: “Brain Hacking”
https://www.youtube.com/watch?v=awAMTQZmvPE
RECOMMENDED YUA EPISODES
Elon, Twitter, and the Gladiator Arena
https://www.humanetech.com/podcast/elon-twitter-and-the-gladiator-arena
Addressing the TikTok Threat
https://www.humanetech.com/podcast/bonus-addressing-the-tiktok-threat
What is Civil War In The Digital Age? With Barbara F Walter
https://www.humanetech.com/podcast/50-what-is-civil-war-in-the-digital-age
With Elon Musk, CEO of Tesla and SpaceX, looking more and more likely to take ownership of Twitter, we wanted to do a special episode about what this could mean for Twitter users and for our global digital democracy as a whole.
Twitter is a very complicated place. It is routinely blocked by governments who fear its power to organize citizen protests around the world. It’s also where outrage, fear and violence get amplified by design, warping users’ views of each other and our common, connected humanity.
We’re at a fork in the road, and we know enough about humane design principles to do this better. So we thought we would do a little thought experiment: What if we applied everything we know about humane technology to Twitter, starting tomorrow? What would happen?
This is the second part of a two-part conversation on Your Undivided Attention about Elon Musk’s bid for Twitter and what it could mean for the push to move in a more humane direction.
RECOMMENDED MEDIA
On Liberty by John Stuart Mill
Published in 1859, this philosophical essay applies Mill's ethical system of utilitarianism to society and state
Elon Musk Only Has “Yes” Men by Jonathan L. Fischer
Reporting from Slate on the subject
Foundations of Humane Technology
The Center for Humane Technology's free online course for professionals shaping tomorrow's technology
RECOMMENDED YUA EPISODES
A Bigger Picture on Elon and Twitter
https://www.humanetech.com/podcast/bigger-picture-elon-twitter
Transcending the Internet Hate Game with Dylan Marron
https://www.humanetech.com/podcast/52-transcending-the-internet-hate-game
Fighting With Mirages of Each Other with Adam Mastroianni
https://www.humanetech.com/podcast/56-fighting-with-mirages-of-each-other
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
We often talk about the need to protect American democracy. But perhaps those of us in the United States don't currently live in a democracy.
As research shows, there's pretty much no correlation between the percentage of the population that supports a policy and its likelihood of being enacted. The strongest determinant of whether a policy gets enacted is how much money is behind it.
So, how might we not just protect, but better yet revive our democracy? How might we revive the relationship between the will of the people and the actions of our government?
This week on Your Undivided Attention, we're doing something special. As we near the election, and representation is on our minds, we're airing a talk by Harvard Law professor and Creative Commons co-founder Larry Lessig. It's a 2019 talk he gave at the Politics and Prose bookstore in Washington, DC about his book, They Don't Represent Us.
The book title has two meanings: first, they — as in our elected representatives — don't represent us. And second, we — as in the people — don't represent ourselves. And this is where social media comes in: we don't represent ourselves because the more we use social media, the more we see extreme versions of the other side, and the more extreme, outraged, and polarized we ourselves become.
Last note: Lessig's talk is highly visual. We edited it lightly for clarity, and jump in periodically to narrate things you can’t see. But if you prefer to watch his talk, you can find the link below in Recommended Media.
RECOMMENDED MEDIA
Video: They Don't Represent Us
The 2019 talk Larry Lessig gave at Politics and Prose in Washington, DC about his book of the same name
They Don't Represent Us by Larry Lessig
Lessig’s 2019 book, which elaborates the ways in which democratic representation is in peril and proposes a number of solutions to revive our democracy, from ranked-choice voting to non-partisan open primaries
Testing Theories of American Politics: Elites, Interest Groups, and Average Citizens
Princeton's Martin Gilens and Benjamin I. Page study measuring the correlation between the preferences of different groups and the decisions of our government
RECOMMENDED YUA EPISODES
Digital Democracy is Within Reach with Audrey Tang
https://www.humanetech.com/podcast/23-digital-democracy-is-within-reach
How Political Language Is Engineered with Drew Westen and Frank Luntz
https://www.humanetech.com/podcast/53-how-political-language-is-engineered
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
The next frontier of the internet is the metaverse. That's why Mark Zuckerberg changed the name of his company from Facebook to Meta, and just sold $10 billion in corporate bonds to raise money for metaverse-related projects.
How might we learn from our experience with social media, and anticipate the harms of the metaverse before they arise? What would it look like to design a humane metaverse — that respects our attention, improves our well-being, and strengthens our democracy?
This week on Your Undivided Attention, we talk with two pioneers who are thinking critically about the development of the metaverse. Professor Jeremy Bailenson is the founding director of Stanford’s Virtual Human Interaction Lab, where he studies how virtual experiences lead to changes in perceptions of self and others. Dr. Courtney Cogburn is an Associate Professor at Columbia's School of Social Work, where she examines associations between racism and stress-related disease. Jeremy and Courtney collaborated on 1000 Cut Journey, a virtual reality experience about systemic racism.
RECOMMENDED MEDIA:
Experience on Demand: What Virtual Reality Is, How It Works, and What It Can Do
https://www.amazon.com/Experience-Demand-Virtual-Reality-Works/dp/0393253694
Jeremy Bailenson's 2018 book exploring how virtual reality can be harnessed to improve our everyday lives
Experiencing Racism in VR
https://www.ted.com/talks/courtney_cogburn_experiencing_racism_in_vr_courtney_d_cogburn_phd_tedxrva
Courtney Cogburn's 2017 TEDx talk about using virtual reality to help people experience the complexities of racism
Do Artifacts Have Politics?
https://faculty.cc.gatech.edu/~beki/cs4001/Winner.pdf
Technology philosopher Langdon Winner’s seminal 1980 article, in which he writes, "by far the greatest latitude of choice exists the very first time a particular instrument, system, or technique is introduced."
RECOMMENDED YUA EPISODES:
Do You Want To Become A Vampire? with LA Paul
https://www.humanetech.com/podcast/39-do-you-want-to-become-a-vampire
Pardon the Interruptions with Gloria Mark
https://www.humanetech.com/podcast/7-pardon-the-interruptions
Bonus - What Is Humane Technology?
https://www.humanetech.com/podcast/bonus-what-is-humane-technology
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
Have you ever lost a friend to misperception? Have you lost a friend or family member because your views seemed to grow so different that it was time to end the relationship — perhaps by unfriending each other on Facebook?
As it turns out, we often think our ideological differences are far greater than they actually are. Which means: we’re losing relationships and getting mired in polarization based on warped visions of each other.
This week on Your Undivided Attention, we're talking with Adam Mastroianni, a postdoctoral research scholar at Columbia Business School who studies how we perceive and misperceive our social worlds. Together with Adam, we're going to explore how accurate — and inaccurate — our views of each other are. As you listen to our conversation, keep in mind that relationship you might have lost to misperception, and that you might be able to revive as a result of what you hear.
CORRECTIONS: In the episode, Adam says in 1978, 85% of people said they'd vote for a Black president, but the actual percentage is 80.4%. Tristan says that Republicans estimate that more than a third of Democrats are LGBTQ, but the actual percentage is 32%. Finally, Tristan refers to Anil Seth's notion of cognitive impenetrability, but that term was actually coined by the Canadian cognitive scientist and philosopher Zenon W. Pylyshyn.
RECOMMENDED MEDIA
Widespread Misperceptions of Long-term Attitude Change
https://www.pnas.org/doi/abs/10.1073/pnas.2107260119
Adam Mastroianni's research paper showing how stereotypes of the past lead people to misperceive attitude change, and how these misperceptions can lend legitimacy to policies that people may not actually prefer
Experimental History
https://experimentalhistory.substack.com/
Adam's blog, where he shares original data and thinks through ideas
Americans experience a false social reality by underestimating popular climate policy support by nearly half
https://www.nature.com/articles/s41467-022-32412-y
Academic study showing that Americans are living in what researchers called a “false social reality” with respect to misperceptions about climate views
RECOMMENDED YUA EPISODES
Mind the (Perception) Gap with Dan Vallone
https://www.humanetech.com/podcast/33-mind-the-perception-gap
The Courage to Connect. Guests: Ciaran O’Connor and John Wood, Jr.
https://www.humanetech.com/podcast/30-the-courage-to-connect
Transcending the Internet Hate Game with Dylan Marron
https://www.humanetech.com/podcast/52-transcending-the-internet-hate-game
Imagine it's the Cold War. Imagine that the Soviet Union puts itself in a position to influence the television programming of the entire Western world — more than a billion viewers.
While this might sound like science fiction, it’s representative of the world we're living in, with TikTok being influenced by the Chinese Communist Party.
TikTok, the flagship app of the Chinese company ByteDance, surpassed Google and Facebook as the most popular site on the internet in 2021, and is expected to reach more than 1.8 billion users by the end of 2022. The Chinese government doesn't control TikTok, but it has influence over the company. What are the implications of that influence, given that China is the main geopolitical rival of the United States?
This week on Your Undivided Attention, we bring you a bonus episode about TikTok. Co-hosts Tristan Harris and Aza Raskin explore the nature of the TikTok threat, and how we might address it.
RECOMMENDED MEDIA
Pew Research Center's "Teens, Social Media and Technology 2022"
https://www.pewresearch.org/internet/2022/08/10/teens-social-media-and-technology-2022/
Pew's recent study on how TikTok has established itself as one of the top online platforms for U.S. teens
Axios' "Washington turns up the heat on TikTok"
Article on recent Congressional responses to the threat of TikTok
Felix Krause on TikTok's keystroke tracking
https://twitter.com/KrauseFx/status/1560372509639311366
A revelation that TikTok has code to observe keypad input and all taps
RECOMMENDED YUA EPISODES
A Fresh Take on Tech in China with Rui Ma and Duncan Clark
https://www.humanetech.com/podcast/44-a-fresh-take-on-tech-in-china
A Conversation with Facebook Whistleblower Frances Haugen
https://www.humanetech.com/podcast/42-a-conversation-with-facebook-whistleblower-frances-haugen
From Russia with Likes (Part 1). Guest: Renée DiResta
https://www.humanetech.com/podcast/5-from-russia-with-likes-part-1
From Russia with Likes (Part 2). Guest: Renée DiResta
https://www.humanetech.com/podcast/6-from-russia-with-likes-part-2
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
At Center for Humane Technology, we often talk about multipolar traps — which arise when individuals have an incentive to act in ways that are beneficial to them in the short term, but detrimental to the group in the long term. Think of social media companies that compete for our attention, so that when TikTok introduces an even-more addictive feature, Facebook and Twitter have to mimic it in order to keep up, sending us all on a race to the bottom of our brainstems.
Intervening at the level of multipolar traps has extraordinary leverage. One such intervention is the Long Term Stock Exchange — a U.S. national securities exchange serving companies and investors who share a long-term vision. Instead of asking public companies to pollute less or be less addictive while holding them accountable to short-term shareholder value, the Long-Term Stock Exchange creates a new playing field, which incentivizes the creation of long-term stakeholder value.
This week on Your Undivided Attention, we’re airing an episode of a podcast called ZigZag — a fellow member of the TED Audio Collective. In an exploration of how technology companies might transcend multipolar traps, we're sharing with you ZigZag’s conversation with Long Term Stock Exchange founder Eric Ries.
CORRECTION: In the episode, we say that TikTok has outcompeted Facebook, Instagram, and YouTube. In fact, TikTok has outcompeted Facebook, but not yet YouTube or Instagram — TikTok has 1 billion monthly users, while YouTube has 2.6 billion and Instagram has 2 billion. However, we can say that TikTok is on a path toward outcompeting YouTube and Instagram.
RECOMMENDED YUA EPISODES
An Alternative to Silicon Valley Unicorns with Mara Zepeda & Kate “Sassy” Sassoon: https://www.humanetech.com/podcast/54-an-alternative-to-silicon-valley-unicorns
A Problem Well-Stated Is Half-Solved with Daniel Schmachtenberger: https://www.humanetech.com/podcast/a-problem-well-stated-is-half-solved
Here’s Our Plan And We Don’t Know with Tristan Harris, Aza Raskin, and Stephanie Lepp: https://www.humanetech.com/podcast/46-heres-our-plan-and-we-dont-know
When you hear the word cyber-attack, what comes to mind? Someone hacking into your email, or stealing your Facebook password?
As it turns out, our most critical infrastructure can be hacked. Our banks, water treatment facilities, and nuclear power plants can be deactivated and even controlled simply by finding bugs in the software used to operate them. Suddenly, cyber-attack takes on a different meaning.
This week on Your Undivided Attention, we're talking with cyber-security expert Nicole Perlroth. Nicole spent a decade as the lead cyber-security reporter at The New York Times, and is now a member of the Department of Homeland Security’s Cybersecurity Advisory Committee. She recently published “This Is How They Tell Me The World Ends” — an in-depth exploration of the global cyber arms race.
CORRECTIONS: In the episode, Nicole says that "the United States could have only afforded 2 to 3 more days of Colonial Pipeline being down before it ground the country — our economy — to a halt." The correct number is actually 3 to 5 days. She also refers to a 2015 study researching why some countries have significantly fewer successful cyber-attacks relative to cyber-attack attempts. That study was actually published in 2016.
RECOMMENDED MEDIA
This Is How They Tell Me The World Ends
Nicole Perlroth’s 2021 book investigating the global cyber-weapons arms race
Reporter Page at the New York Times
Nicole’s articles from her decade as the lead cyber-security reporter at The New York Times
The Global Cyber-Vulnerability Report (in brief)
Brief of a 2016 study by the Center for Digital International Government, Virginia Tech, and the University of Maryland that researched why some countries have significantly fewer successful cyber-attacks relative to cyber-attack attempts
RECOMMENDED YUA EPISODES
The Dark Side Of Decentralization with Audrey Kurth Cronin: https://www.humanetech.com/podcast/49-the-dark-side-of-decentralization
Is World War III Already Here? Guest: Lieutenant General H.R. McMaster: https://www.humanetech.com/podcast/45-is-world-war-iii-already-here
A Problem Well-Stated Is Half-Solved with Daniel Schmachtenberger:
https://www.humanetech.com/podcast/a-problem-well-stated-is-half-solved
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
Why isn't Twitter doing more to get bots off their platform? Why isn’t Uber taking better care of its drivers?
What if...they can't?
Venture-capital-backed companies like Twitter and Uber are held accountable for maximizing returns to investors. If and when they become public companies, they become accountable for maximizing returns to shareholders. They’ve promised Wall Street outsized returns — which means Twitter can't purge bots if doing so would significantly lower its user count and, in turn, its advertising revenue, and Uber can’t treat its drivers like employees if doing so cuts into profits.
But what's the alternative? What might it look like to design an ownership and governance model that incentivizes a technology company to serve all of its stakeholders over the long term – and primarily, the stakeholders who create value?
This week on Your Undivided Attention, we're talking with two experts on creating the conditions for humane business, and in turn, for humane technology: Mara Zepeda and Kate “Sassy” Sassoon of Zebras Unite Co-Op. Zebras Unite is a member-owned co-operative that’s creating the capital, culture, and community to power a more just and inclusive economy. The Zebras Unite Co-Op serves a community of over 6,000 members in about 30 chapters across 6 continents. Mara is their Managing Director, and Kate is their Director of Cooperative Membership.
RECOMMENDED MEDIA
Zebras Fix What Unicorns Break
A seminal 2017 article by Zebras Unite co-founders, which kicked off the movement and distinguished between zebras and unicorns — per the table below.
Zebras Unite’s 2019 thought experiment of exiting Meetup to community
Zebras Unite Crowdcast Channel
Where you can find upcoming online events, as well as recordings of previous events.
RECOMMENDED YUA EPISODES
A Renegade Solution to Extractive Economics with Kate Raworth: https://www.humanetech.com/podcast/29-a-renegade-solution-to-extractive-economics
Bonus — A Bigger Picture on Elon & Twitter: https://www.humanetech.com/podcast/bigger-picture-elon-twitter
Here’s Our Plan And We Don’t Know with Tristan Harris, Aza Raskin, and Stephanie Lepp: https://www.humanetech.com/podcast/46-heres-our-plan-and-we-dont-know
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
This week on Your Undivided Attention, we’re doing something different: we’re airing an episode of another podcast that’s also part of the TED Audio Collective.
Backing up for a moment: we recently aired an episode with Dylan Marron — creator and host of the podcast, Conversations With People Who Hate Me. On his show, Dylan calls up the people behind negative comments on the internet, and asks them: why did you write that?
In our conversation with Dylan, we played a clip from episode 2 of Conversations With People Who Hate Me. In that episode, Dylan talks with a high school student named Josh, who’d sent him homophobic messages online. This week, we're airing that full episode — the full conversation between Dylan Marron and Josh.
If you didn’t hear our episode with Dylan, do give it a listen. Then, enjoy this second episode of Conversations With People Who Hate Me.
RECOMMENDED YUA EPISODES
Transcending the Internet Hate Game with Dylan Marron: https://www.humanetech.com/podcast/52-transcending-the-internet-hate-game
A Conversation with Facebook Whistleblower Frances Haugen: https://www.humanetech.com/podcast/42-a-conversation-with-facebook-whistleblower-frances-haugen
The Cure for Hate. Guest: Tony McAleer: https://www.humanetech.com/podcast/11-the-cure-for-hate
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
Democracy depends on our ability to choose our political views. But the language we use to talk about political issues is deliberately designed to be divisive, and can produce up to a 15-point difference in what we think about those issues. As a result, are we choosing our views, or is our language choosing them for us?
This week, Your Undivided Attention welcomes two Jedi Masters of political communication. Drew Westen is a political psychologist and messaging consultant based at Emory University, who has advised the Democratic Party. Frank Luntz is a political and communications consultant, pollster, and pundit, who has advised the Republican Party. In the past, our guests have used their messaging expertise in ways that increased partisanship. For example, Luntz advocated for the use of the term “death tax” instead of “estate tax,” and “climate change” instead of “global warming.” Still, Luntz and Westen are uniquely positioned to help us decode the divisive power of language — and explore how we might design language that unifies.
CORRECTIONS: in the episode, Tristan refers to a panel Drew Westen and Frank Luntz were on at the New York Public Library. He says the panel was “about 10 years ago,” but it was actually 15 years ago in 2007. Also, Westen refers to a news anchor who moderated a debate between George H. W. Bush and Michael Dukakis in 1988. Drew mistakenly names the anchor as Bernard Kalb, when it was actually Bernard Shaw.
RECOMMENDED MEDIA
The Political Brain: The Role of Emotion in Deciding the Fate of the Nation
Drew Westen's 2008 book about the role of emotion in determining the political life of the nation, which influenced campaigns and elections around the world
Words That Work: It's Not What You Say, It's What People Hear
Frank Luntz's 2008 book, which offers a behind-the-scenes look at how the tactical use of words and phrases affects what we buy, who we vote for, and even what we believe in
New York Public Library's Panel on Political Language
A 2007 panel between multiple 'Jedi Masters' of political communication along the political spectrum, including Frank Luntz, Drew Westen, and George Lakoff
RECOMMENDED YUA EPISODES
The Invisible Influence of Language with Lera Boroditsky: https://www.humanetech.com/podcast/48-the-invisible-influence-of-language
How To Free Our Minds with Cult Deprogramming Expert Dr. Steven Hassan: https://www.humanetech.com/podcast/51-how-to-free-our-minds
Mind the (Perception) Gap with Dan Vallone: https://www.humanetech.com/podcast/33-mind-the-perception-gap
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
The game that social media sets us up to play is a game that rewards outrage. It's a game that we win by being better than other players at dunking on each other, straw-manning each other, and assuming the worst in each other. The game itself must be transformed.
And, we can also decide to step out of the game, and do something different.
On this week’s episode of Your Undivided Attention, we welcome Dylan Marron — who has been called by Jason Sudeikis "a modern Mr. Rogers for the digital age." Dylan is the creator and host of the podcast Conversations With People Who Hate Me.
On the show, he calls up the people behind negative comments on the internet, and asks them a simple question: why did you write that? He just published a book by the same name, where he elaborates 12 lessons learned from talking with internet strangers. Together with Dylan, we explore how transforming the game and transforming ourselves can go hand-in-hand.
RECOMMENDED MEDIA
Conversations With People Who Hate Me (podcast)
Dylan Marron’s podcast where he calls up the people behind negative comments on the internet, and talks to them. In this episode, we heard a clip of Episode 2: Hurt People Hurt People.
Conversations With People Who Hate Me (book)
Dylan’s book where he elaborates 12 lessons learned from talking with internet strangers.
Feature documentary chronicling the work and legacy of Fred Rogers.
RECOMMENDED YUA EPISODES
A Conversation with Facebook Whistleblower Frances Haugen: https://www.humanetech.com/podcast/42-a-conversation-with-facebook-whistleblower-frances-haugen
The Cure for Hate. Guest: Tony McAleer: https://www.humanetech.com/podcast/11-the-cure-for-hate
The Fake News of Your Own Mind with Jack Kornfield and Trudy Goodman:
https://www.humanetech.com/podcast/19-the-fake-news-of-your-own-mind
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
How would you know if you were in a cult?
If not a cult, then at least under undue influence?
The truth is: we're all under some form of undue influence. The question is to what degree, and how aware we are of it. Social media only exacerbates this influence. In an era of likes, followers, and echo chambers, how can we become aware of undue influence and gain sovereignty over our minds?
Our guest this week is Dr. Steven Hassan, an expert on undue influence, brainwashing, and unethical hypnosis. He’s the founder of the Freedom of Mind Resource Center — a coaching, consulting, and training organization dedicated to helping people freely consider how they want to live their lives. Dr. Hassan was himself a member of a cult: the Unification Church (also known as the Moonies), which was developed in Korea in the 1950's. Since leaving the Moonies, Dr. Hassan has helped thousands of individuals and families recover from undue influence.
RECOMMENDED MEDIA
Freedom of Mind website: The website for Dr. Hassan’s Freedom of Mind Resource Center, which includes resources such as his Influence Continuum, BITE model of authoritarian control, and Strategic Interactive Approach for freeing people from undue influence
The Influence Continuum with Dr. Steven Hassan: Dr. Hassan’s podcast exploring how mind-control works, and how to protect yourself from its grips
Reckonings: A podcast that told the stories of people who’ve transcended extremism, expanded their worldviews, and made other kinds of transformative change. Start with episode 17 featuring a former paid climate skeptic, or episode 18 featuring the former protégé of Fox News chairman Roger Ailes
RECOMMENDED YUA EPISODES
Can Your Reality Turn on a Word? Guest: Anthony Jacquin: https://www.humanetech.com/podcast/34-can-your-reality-turn-on-a-word
The World According to Q. Guest: Travis View: https://www.humanetech.com/podcast/21-the-world-according-to-q
The Cure for Hate. Guest: Tony McAleer: https://www.humanetech.com/podcast/11-the-cure-for-hate
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
If Elon Musk owns Twitter, what are the risks and what are the opportunities? In order for Twitter to support democracy — and Musk’s goal of making humanity a multi-planetary civilization — we need a radical redesign that goes beyond free speech.
Note: this conversation was recorded on April 21, 2022. That was 3 days prior to the official purchase announcement, which revealed that Elon Musk will buy Twitter for $44 billion.
Clarification: In the episode, we talk about the creation of The Daily Show, featuring Jon Stewart. To be clear, The Daily Show was created by writer and producer Madeleine Smithberg and comedian and media personality Lizz Winstead — for comedian and host Craig Kilborn. Jon Stewart took over in 1999, which is when he had the conversation with executives that we reference in the episode, where he didn't want to see the viewership numbers.
RECOMMENDED MEDIA
Examining algorithmic amplification of political content on Twitter
Polarization of Twitter (Knight Foundation)
Pew Research on the political extremes drowning out centrist voices on Twitter
Chronological feed vs algorithm (Computational Journalism Lab)
RECOMMENDED YUA EPISODES
A Conversation with Facebook Whistleblower Frances Haugen: https://www.humanetech.com/podcast/42-a-conversation-with-facebook-whistleblower-frances-haugen
Here’s Our Plan And We Don’t Know: https://www.humanetech.com/podcast/46-heres-our-plan-and-we-dont-know
A Problem Well-Stated Is Half-Solved: https://www.humanetech.com/podcast/a-problem-well-stated-is-half-solved
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
Civil war might be the most likely escalation pathway towards disaster for our country. On the flip side, learning how to avoid civil conflict — and more ambitiously, repair our civic fabric — might have the greatest leverage for addressing the challenges we face.
Our guest Barbara F. Walter is one of the world's leading experts on civil wars, political violence, and terrorism. She’s the author of How Civil Wars Start: And How To Stop Them, which provides insight into the drivers of civil war, how social media fuels conflict, and how we might repair our broken democracies.
Together, we explore what makes for a healthy liberal democracy, why democracies worldwide are in decline, and the role of resentment and hope. Join us in an exploration of the generator functions for civil war in the digital age, and how we might prevent them.
RECOMMENDED MEDIA
How Civil Wars Start: And How To Stop Them
Barbara F. Walter’s latest book and the subject of our conversation, identifying the conditions that give rise to modern civil war in order to address them
Political Violence At A Glance
An award-winning online magazine about the causes and consequences of violence and protest, co-authored by Barbara and other experts
Publications, analysis, and other resources from the organization that measures democracies and anocracies on a 21-point scale
RECOMMENDED YUA EPISODES
A Conversation with Facebook Whistleblower Frances Haugen: https://www.humanetech.com/podcast/42-a-conversation-with-facebook-whistleblower-frances-haugen
The Courage to Connect. Guests: Ciaran O’Connor and John Wood, Jr.: https://www.humanetech.com/podcast/30-the-courage-to-connect
Mind the (Perception) Gap with Dan Vallone: https://www.humanetech.com/podcast/33-mind-the-perception-gap
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
“The fundamental problem of humanity is that we have paleolithic emotions, medieval institutions, and God-like technology.” — E. O. Wilson.
More than ever, we need the wisdom to match the power of our God-like technology. Yet, technology is both eroding our ability to make sense of the world, and increasing the complexity of the issues we face. The gap between our sense-making ability and issue complexity is what we call the “wisdom gap."
How do we develop the wisdom we need to responsibly steward our God-like technology?
This week on Your Undivided Attention, we're introducing one way Center for Humane Technology is attempting to close the wisdom gap — through our new online course, Foundations of Humane Technology. In this bonus episode, Tristan Harris describes the wisdom gap we're attempting to close, and our Co-Founder and Executive Director Randima Fernando talks about the course itself.
Sign up for the free course: https://www.humanetech.com/course
RECOMMENDED YUA EPISODES
A Problem Well-Stated Is Half-Solved with Daniel Schmachtenberger: https://www.humanetech.com/podcast/a-problem-well-stated-is-half-solved
A Conversation with Facebook Whistleblower Frances Haugen: https://www.humanetech.com/podcast/42-a-conversation-with-facebook-whistleblower-frances-haugen
Here’s Our Plan And We Don’t Know with Tristan Harris, Aza Raskin, and Stephanie Lepp: https://www.humanetech.com/podcast/46-heres-our-plan-and-we-dont-know
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
[This episode originally aired on July 23rd, 2020.] Imagine a world where every country has a digital minister and technologically-enabled legislative bodies. Votes are completely transparent and audio and video of all conversations between lawmakers and lobbyists are available to the public immediately. Conspiracy theories are acted upon within two hours and replaced by humorous videos that clarify the truth. Imagine that expressing outrage about your local political environment turned into a participatory process where you were invited to solve that problem and even entered into a face to face group workshop.
Does that sound impossible? It’s ambitious and optimistic, but that's everything that our guest this episode, Audrey Tang, digital minister of Taiwan, has been working on in her own country for many years. Audrey’s path into public service began in 2014 with her participation in the Sunflower Movement, a student-led protest in Taiwan’s parliamentary building, and she’s been building on that experience ever since, leading her country into a future of truly participatory digital democracy.
Is decentralization inherently a good thing?
These days, there's a lot of talk about decentralization. Decentralized social media platforms can allow us to own our own data. Decentralized cryptocurrencies can enable bank-free financial transactions. Decentralized 3D printing can allow us to fabricate anything we want.
But if the world lives on Bitcoin, we may not be able to sanction nation states like Russia when they invade sovereign nations. If 3D printing is decentralized, anyone can print their own weapons at home. Decentralization takes on new meaning when we're talking about decentralizing the capacity for catastrophic destruction.
This week on Your Undivided Attention, we explore the history of decentralized weaponry, how social media is effectively a new decentralized weapon, and how to wisely navigate these threats. Guiding us through this exploration is Audrey Kurth Cronin — one of the world’s leading experts in security and terrorism. Audrey is a distinguished Professor of International Security at American University, and the author of several books — most recently: Power to the People: How Open Technological Innovation is Arming Tomorrow’s Terrorists.
Clarification: in the episode, Tristan refers to a video of Daniel Schmachtenberger's as "The Psychological Pitfalls of Working on Existential Risk." The correct name of the video is "Psychological Pitfalls of Engaging With X-Risks & Civilization Redesign."
RECOMMENDED MEDIA
Power to the People: How Open Technological Innovation is Arming Tomorrow's Terrorists
Audrey Kurth Cronin's latest book, which analyzes emerging technologies and devises a new framework for analyzing 21st century military innovation
Psychological Pitfalls of Engaging With X-Risks & Civilization Redesign
Daniel Schmachtenberger's talk discussing the psychological pitfalls of working on existential risks and civilization redesign
Policy Reforms Toolkit
The Center for Humane Technology's toolkit for developing policies to protect the conditions that democracy needs to thrive: a comprehensively educated public, a citizenry that can check the power of market forces and bind predatory behavior
RECOMMENDED YUA EPISODES
22 – Digital Democracy is Within Reach with Audrey Tang: https://www.humanetech.com/podcast/23-digital-democracy-is-within-reach
28 – Two Million Years in Two Hours: A Conversation with Yuval Noah Harari: https://www.humanetech.com/podcast/28-two-million-years-in-two-hours-a-conversation-with-yuval-noah-harari
45 – Is World War III Already Here? Guest: Lieutenant General H.R. McMaster: https://www.humanetech.com/podcast/45-is-world-war-iii-already-here
One of the oldest technologies we have is language. How do the words we use influence the way we think?
The media can talk about immigrants scurrying across the border, versus immigrants crossing the border. Or we might hear about technology platforms censoring us, versus moderating content.
If those word choices shift public opinion on immigration or technology by 25%, or even 2%, then we’ve been influenced in ways we can't even see. Which means that becoming aware of how words shape the way we think can help inoculate us from their undue influence. And further, consciously choosing or even designing the words we use can help us think in more complex ways – and address our most complex challenges.
This week on Your Undivided Attention, we're grateful to have Lera Boroditsky, a cognitive scientist who studies how language shapes thought. Lera is an Associate Professor of Cognitive Science at UC San Diego, and the editor-in-chief of Frontiers in Cultural Psychology.
Clarification: in the episode, Aza refers to Elizabeth Loftus' research on eyewitness testimony. He describes an experiment in which a car hit a stop sign, but the experiment actually used an example of two cars hitting each other.
RECOMMENDED MEDIA
How language shapes the way we think
Lera Boroditsky's 2018 TED talk about how the 7,000 languages spoken around the world shape the way we think
Measuring Effects of Metaphor in a Dynamic Opinion Landscape
Boroditsky and Paul H. Thibodeau's 2015 study about how the metaphors we use to talk about crime influence our opinions on how to address crime
Subtle linguistic cues influence perceived blame and financial liability
Boroditsky and Caitlin M. Fausey's 2010 study about how the language used to describe the 2004 Super Bowl "wardrobe malfunction" influences our views on culpability
Why are politicians getting 'schooled' and 'destroyed'?
BBC article featuring the research of former Your Undivided Attention guest Guillaume Chaslot, which shows the verbs YouTube is most likely to include in titles of recommended videos — such as "obliterates" and "destroys"
RECOMMENDED YUA EPISODES
Mind the (Perception) Gap: https://www.humanetech.com/podcast/33-mind-the-perception-gap
Can Your Reality Turn on a Word?: https://www.humanetech.com/podcast/34-can-your-reality-turn-on-a-word
Down the Rabbit Hole by Design: https://www.humanetech.com/podcast/4-down-the-rabbit-hole-by-design
The meta-crisis is so vast: climate change, exponential technology, addiction, polarization, and more. How do we grasp it, let alone take steps to address it?
One of the thinking tools we have at our disposal is science fiction. To the extent that we co-evolve with our stories, science fiction can prepare us for the impending future — and empower us to shape it.
This week on Your Undivided Attention, we're thrilled to have one of the greatest living science-fiction writers — Kim Stanley Robinson. His most recent novel is The Ministry for the Future, a sweeping epic that reaches into the very near future, and imagines what it would take to unite humanity and avoid a mass extinction. Whether or not you've read the book, this episode has insights for you. And if this episode makes you want to read the book, our conversation won't spoil it for you.
Clarification: in the episode, Robinson refers to philosopher Antonio Gramsci's "pessimism of the intellect, optimism of the will." The phrase originated with novelist and playwright Romain Rolland. Gramsci made it the motto of his newspaper because he appreciated its integration of radical intellectualism with revolutionary activism.
RECOMMENDED MEDIA
The Ministry For The Future
Robinson's latest novel and the subject of our conversation — which reaches into the near future, and imagines what it would take to unite humanity and avoid a mass extinction
A Deeper Dive Into the Meta Crisis
CHT's blog post about the meta-crisis, which includes the fall of sense-making and the rise of decentralized technology-enabled power
The Half-Earth Project
The project based on E. O. Wilson's proposal to conserve half the land and sea — in order to safeguard the bulk of biodiversity, including ourselves
Global tech worker community mobilizing the technology industry to face the climate crisis
RECOMMENDED YUA EPISODES
18 – The Stubborn Optimist’s Guide to Saving the Planet: https://www.humanetech.com/podcast/18-the-stubborn-optimists-guide-to-saving-the-planet
Bonus – The Stubborn Optimist’s Guide Revisited: https://www.humanetech.com/podcast/bonus-the-stubborn-optimists-guide-revisited
29 – A Renegade Solution to Extractive Economics: https://www.humanetech.com/podcast/29-a-renegade-solution-to-extractive-economics
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
Renowned quantum physicist Richard Feynman once wrote, "It is our capacity to doubt that will determine the future of civilization."
In that spirit, this episode is a little different – because we're talking openly about our doubts, with you, our listeners. It's also different because it’s hosted by our Executive Producer Stephanie Lepp, with Tristan Harris and Aza Raskin in the hot seats.
How have we evolved our understanding of our social media predicament? How has that evolution inspired us to question the work we do at Center for Humane Technology? Join us as we say those three magic words — I don't know — and yet pursue our mission to the best of our ability.
RECOMMENDED MEDIA
Leverage Points: Places to Intervene in a System
Systems theorist Donella Meadows' seminal article, articulating a framework for thinking about how to change complex systems.
Winning Humanity’s Existential Game
The Future Thinkers podcast with Daniel Schmachtenberger, where he explores how to mitigate natural and human-caused existential risks and design post-capitalist systems
Ledger of Harms of Social Media
The Center for Humane Technology's research elaborating the many externalities of our technology platforms' race for human attention
Foundations of Humane Technology Course
CHT's forthcoming course on how to build technology that protects our well-being, minimizes unforeseen consequences, and builds our collective capacity to address humanity's urgent challenges
RECOMMENDED YUA EPISODES
36 - A Problem Well-Stated Is Half-Solved: https://www.humanetech.com/podcast/a-problem-well-stated-is-half-solved
42 - A Conversation with Facebook Whistleblower Frances Haugen: https://www.humanetech.com/podcast/42-a-conversation-with-facebook-whistleblower-frances-haugen
43 - Behind the Curtain on The Social Dilemma: https://www.humanetech.com/podcast/43-behind-the-curtain-on-the-social-dilemma
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
Would you say that the US is in war-time or peace-time? How do you know? The truth is, the nature of warfare has changed so fundamentally that we're currently in a war we don't even recognize. It's the war that Russia, China, and other hostile foreign actors are fighting against us — weaponizing social media to undermine our faith in each other, our government, and democracy itself. World War III is here, it's in cyberspace, and the US is unprepared — and largely unaware.
This week on Your Undivided Attention, we're fortunate to be speaking with Lieutenant General H. R. McMaster. General McMaster was the United States National Security Advisor from 2017 to 2018. He has examined the most critical foreign policy and national security challenges that face the United States, and is devoted to preserving America's standing and security.
Who do you think the Chinese government considers its biggest rival? The United States, right? Actually, the Chinese government considers its biggest rival to be its own technology companies.
It's China's tech companies who threaten its capacity to build a competitive China. That's why the Chinese government is cracking down on its tech sector — for example, by limiting the number of hours youth can play video games, and banning cell phone use in schools. China's restrictions on technology use may be autocratic, but they may also protect users more than anything we've seen from the US government.
It’s a complicated picture.
This week on Your Undivided Attention, we're having a surprising conversation about technology in China. Here to give us a fresh take are two guests: investor, analyst, and co-host of the Tech Buzz China podcast Rui Ma, and China internet expert and author of Alibaba: The House That Jack Ma Built, Duncan Clark.
How do you make a film that impacts more than 100 million people in 190 countries in 30 languages?
This week on Your Undivided Attention, we're going behind the curtain on The Social Dilemma — the Netflix documentary about the dark consequences of the social media business model, which featured the Center for Humane Technology. On the heels of the film's one-year anniversary and its two Emmy Award wins, we're talking with Exposure Labs' Director Jeff Orlowski-Yang and Producer Larissa Rhodes. What moved Jeff and Larissa to shift their focus from climate change to social media? How did the film transform countless lives, including ours and possibly yours? What might we do differently if we were producing the film today?
Join us as we explore the reverberations of The Social Dilemma — which we're still feeling the effects of over one year later.
We are now in social media's Big Tobacco moment. And that’s largely thanks to the courage of one woman: Frances Haugen.
Frances is a specialist in algorithmic product management. She worked at Google, Pinterest, and Yelp before joining Facebook — first as a Product Manager on Civic Misinformation, and then on the Counter-Espionage team. But what she saw at Facebook was that the company consistently and knowingly prioritized profits over public safety. So Frances made the courageous decision to blow the whistle — which resulted in the biggest disclosure in the history of Facebook, and in the history of social media.
In this special interview, co-hosts Tristan and Aza go behind the headlines with Frances herself. We go deeper into the problems she exposed, discuss potential solutions, and explore her motivations — along with why she fundamentally believes change is possible. We also announce an exciting campaign being launched by the Center for Humane Technology — to use this window of opportunity to make Facebook safer.
In seven years of working on the problems of runaway technology, we’ve never experienced a week like this! In this bonus episode of Your Undivided Attention, we recap this whirlwind of a week — from Facebook whistleblower Frances Haugen going public on 60 Minutes on Sunday, to the massive outage of Facebook, Instagram, and WhatsApp on Monday, to Haugen’s riveting Congressional testimony on Tuesday. We also make some exciting announcements — including our planned episode with Haugen up next, the Yale social media reform panel we’re participating in on Thursday, and a campaign we’re launching to pressure Facebook to make one immediate change.
This week it truly feels like we’re making history — and you’re a part of it.
What helps you make meaning in challenging times? As you confront COVID, the climate crisis, and all of the challenges we discuss on this show, what helps you avoid nihilism or fundamentalism, and instead access healing, inspiration, and connection?
Today on Your Undivided Attention, we're joined by anthropologist and writer Jamie Wheal. Wheal is the author of Recapture the Rapture: Rethinking God, Sex and Death In a World That's Lost Its Mind. In the book, he makes the case that in order to address the meta-crisis (the interconnected challenges we face, which we talked about in Episode 36 with Daniel Schmachtenberger), we must address the meaning crisis — the need to stay inspired, mended, and bonded in challenging times. Jamie argues that it doesn't matter whether we stay inspired, mended, and bonded through institutionalized religion or other means, as long as meaning-making is inclusively available to everyone.
What we hope you'll walk away with is a humane way to think about how to address the challenges we face, from COVID to climate — by enabling us to make meaning in challenging times.
On September 13th, the Wall Street Journal released The Facebook Files, an ongoing investigation of the extent to which Facebook's problems are meticulously known inside the company — all the way up to Mark Zuckerberg. Pollster Frank Luntz invited Tristan Harris along with friend and mentor Daniel Schmachtenberger to discuss the implications in a live webinar.
In this bonus episode of Your Undivided Attention, Tristan and Daniel amplify the scope of the public conversation about The Facebook Files beyond the platform, and into its business model, our regulatory structure, and human nature itself.
What is the goal of our digital information environment? Is it simply to inform us, or also to empower us to act?
The Solutions Journalism Network (SJN) understands that simply reporting on social problems rarely leads to change. What they’ve discovered is that rigorously reporting on responses to social problems is more likely to give activists and concerned citizens the hope and information they need to take effective action. For this reason, SJN trains journalists to report on “solutions angles.” More broadly, the organization seeks to rebalance the news, so that people are exposed to stories that help them understand the challenges we face as well as potential ways to respond.
In this episode, Tina Rosenberg, co-founder of SJN, and Hélène Biandudi Hofer, former manager of SJN’s Complicating the Narratives initiative, walk us through the origin of solutions journalism, how to practice it, and what impact it has had. Tristan Harris and Aza Raskin reflect on how humane technology, much like solutions journalism, should also be designed to create an empowering relationship with reality — enabling us to shift from learned helplessness to what we might call learned hopefulness.
How do we decide whether to undergo a transformative experience when we don’t know how that experience will change us? This is the central question explored by Yale philosopher and cognitive scientist L.A. Paul.
Paul uses the prospect of becoming a vampire to illustrate the conundrum: let's say Dracula offers you the chance to become a vampire. You might be confident you'll love it, but you also know you'll become a different person with different preferences. Whose preferences do you prioritize: yours now, or yours after becoming a vampire? Similarly, whose preferences do we prioritize when deciding how to engage with technology and social media: ours now, or ours after becoming users — to the point of potentially becoming attention-seeking vampires?
In this episode with L.A. Paul, we're raising the stakes of the social media conversation — from technology that steers our time and attention, to technology that fundamentally transforms who we are and what we want. Tune in as Paul, Tristan Harris, and Aza Raskin explore the complexity of transformative experiences, and how to approach their ethical design.
When author and journalist James Nestor began researching a piece on free diving, he was stunned. He found that free divers could hold their breath for up to 8 minutes at a time, and dive to depths of 350 feet on a single breath. As he dug into the history of breath, he discovered that our industrialized lives have led to improper and mindless breathing, with cascading consequences from sleep apnea to reduced mobility. He also discovered an entire world of extraordinary feats achieved through proper and mindful breathing — including healing scoliosis, rejuvenating organs, halting snoring, and even enabling greater sovereignty in our use of technology. What is the transformative potential of breath? And what is the relationship between proper breathing and humane technology?
In September of 2020, on her last day at Facebook, data scientist Sophie Zhang posted a 7,900-word memo to the company's internal site. In it, she described the anguish and guilt she had experienced over the last two and a half years. She'd spent much of that time almost single-handedly trying to rein in fake activity on the platform by nefarious world leaders in small countries. Sometimes she received help and attention from higher-ups; sometimes she got silence and inaction. “I joined Facebook from the start intending to change it from the inside,” she said, but “I was still very naive at the time.”
We don’t have a lot of information about how things operate inside the major tech platforms, and most former employees aren’t free to speak about their experience. It’s easy to fill that void with inferences about what might be motivating a company — greed, apathy, disorganization or ignorance, for example — but the truth is usually far messier and more nuanced. Sophie turned down a $64,000 severance package to avoid signing a non-disparagement agreement. In this episode of Your Undivided Attention, she explains to Tristan Harris and Aza Raskin how she ended up here, and offers ideas about what could be done at these companies to prevent similar kinds of harm in the future.
We’ve explored many different problems on Your Undivided Attention — addiction, disinformation, polarization, climate change, and more. But what if many of these problems are actually symptoms of the same meta-problem, or meta-crisis? And what if a key leverage point for intervening in this meta-crisis is improving our collective capacity to problem-solve?
Our guest Daniel Schmachtenberger guides us through his vision for a new form of global coordination to help us address our global existential challenges. Daniel is a founding member of the Consilience Project, aimed at facilitating new forms of collective intelligence and governance to strengthen open societies. He's also a friend and mentor of Tristan Harris.
This insight-packed episode introduces key frames we look forward to using in future episodes. For this reason, we highly encourage you to listen to this unedited version along with the edited version.
We also invite you to join Daniel and Tristan at our Podcast Club! It will be on Friday, July 9th from 2-3:30pm PDT / 5-6:30pm EDT. Check here for details.
We’ve explored many different problems on Your Undivided Attention — addiction, disinformation, polarization, climate change, and more. But what if many of these problems are actually symptoms of the same meta-problem, or meta-crisis? And what if a key leverage point for intervening in this meta-crisis is improving our collective capacity to problem-solve?
Our guest Daniel Schmachtenberger guides us through his vision for a new form of global coordination to help us address our global existential challenges. Daniel is a founding member of the Consilience Project, aimed at facilitating new forms of collective intelligence and governance to strengthen open societies. He's also a friend and mentor of Tristan Harris.
This insight-packed episode introduces key frames we look forward to using in future episodes. For this reason, we highly encourage you to listen to this edited version along with the unedited version.
We also invite you to join Daniel and Tristan at our Podcast Club! It will be on Friday, July 9th from 2-3:30pm PDT / 5-6:30pm EDT. Check here for details.
Back in January 2020, Tristan Harris went to Washington, D.C. to testify before the U.S. Congress on the harms of social media. A few weeks ago, he returned — virtually — for another hearing, Algorithms and Amplification: How Social Media Platforms’ Design Choices Shape Our Discourse and Our Minds. He testified alongside Dr. Joan Donovan, Research Director at the Harvard Kennedy School’s Shorenstein Center on Media Politics and Public Policy, and the heads of policy from Facebook, YouTube and Twitter. The senators’ animated questioning demonstrated a deeper understanding of how these companies’ fundamental business models and design properties fuel hate and misinformation, and many of the lawmakers expressed a desire and willingness to take regulatory action. But, there’s still room for a more focused conversation. “It’s not about whether they filter out bad content,” says Tristan, “but really whether the entire business model of capturing human attention is a good way to organize society.” In this episode, a follow-up to last year’s “Mr. Harris Goes to Washington,” Tristan and Aza Raskin debrief about what was different this time, and what work lies ahead to pave the way for effective policy.
Can hypnosis be a tool to help us see how our minds are being shaped and manipulated more than we realize? Guest Anthony Jacquin is a hypnotist and hypnotherapist of over 20 years, author of Reality is Plastic, and he co-runs the Jacquin Hypnosis Academy. He uses his practice to help his clients change their behavior and improve their lives. In this episode, he breaks down the misconceptions of hypnosis and reveals that despite the influence of hypnotizing forces like social media, we all still have the ability to get in touch with our subconscious selves. “What can I say with certainty is true about me — what is good, true and real about me?” Anthony asks. “Much of what we’ve invested in is actually transient. It will change. What is unchanging?” Anthony draws connections between hypnosis and technology and the impacts of both on our subconscious minds but identifies a key difference — technology is exploiting us. But maybe a little more insight into one more dimension of how our minds work underneath the hood can help us build better, more humane and conscious technology.
[This episode originally aired May 21, 2020] Internationally recognized global leader on climate change Christiana Figueres argues that the battle against global threats like climate change begins in our own heads. She became the United Nations’ top climate official after she had watched the 2009 Copenhagen climate summit collapse “in blood, in screams, in tears.” In the wake of that debacle, Christiana began performing an act of emotional Aikido on herself, her team, and eventually delegates from 196 nations. She called it “stubborn optimism.” It requires a clear and alluring vision of a future that can supplant the dystopian and discouraging vision of what will happen if the world fails to act. It was stubborn optimism, she says, that convinced those nations to sign the first global climate framework, the Paris Agreement. In this episode, we explore how a similar shift in Silicon Valley’s vision could lead 3 billion people to take action for the planet.
What do you think the other side thinks? Guest Dan Vallone is the Director of More in Common U.S.A., an organization that’s been asking Democrats and Republicans that critical question. Their work has uncovered countless “perception gaps” in our understanding of each other. For example, Democrats think that about 30 percent of Republicans support "reasonable gun control," but in reality, it’s about 70 percent. Both Republicans and Democrats think that about 50 percent of the other side would feel that physical violence is justified in some situations, but the actual number for each is only about 5 percent. “Both sides are convinced that the majority of their political opponents are extremists,” says Dan. “And yet, that's just not true.” Social media encourages the most extreme views to speak the loudest and rise to the top—and it’s hard to start a conversation and work together when we’re all arguing with mirages. But Dan’s insights and the work of More in Common provide a hopeful guide to unraveling the distortions we’ve come to accept and correcting our foggy vision.
The film Coded Bias follows MIT Media Lab researcher Joy Buolamwini through her investigation of algorithmic discrimination, after she accidentally discovers that facial recognition technologies do not detect darker-skinned faces. Joy is joined on screen by experts in the field, researchers, activists, and involuntary victims of algorithmic injustice. Coded Bias was released on Netflix April 5, 2021, premiered at the Sundance Film Festival last year, and has been called “‘An Inconvenient Truth’ for Big Tech algorithms” by Fast Company magazine. We talk to director Shalini Kantayya about the impetus for the film and how to tackle the threats these challenges pose to civil rights while working towards more humane technology for all.
How many technologists have traveled to Niger, or the Balkans, or Rwanda, to learn the lessons of peacebuilding? Technology and social media are creating patterns and pathways of conflict that few people anticipated or even imagined just a decade ago. And we need to act quickly to contain the effects, but we don't have to reinvent the wheel. There are people, such as this episode’s guest, Shamil Idriss, CEO of the organization Search for Common Ground, who have been training for years to understand human beings and learn how to help them connect and begin healing processes. These experts can share their insights and help us figure out how to apply them to our new digital habitats. “Peace moves at the speed of trust, and trust can’t be fast-tracked,” says Shamil. Real change is possible, but as he explains, it takes patience, care, and creativity to get there.
Disinformation researchers have been fighting two battles over the last decade: one to combat and contain harmful information, and one to convince the world that these manipulations have an offline impact that requires complex, nuanced solutions. Camille François, Chief Information Officer at the cybersecurity company Graphika and an affiliate of the Harvard Berkman Klein Center for Internet & Society, believes that our common understanding of the problem has recently reached a new level. In this interview, she catalogues the key changes she observed between studying Russian interference in the 2016 U.S. election and helping convene and operate the Election Integrity Partnership watchdog group before, during and after the 2020 election. “I'm optimistic, because I think that things that have taken quite a long time to land are finally landing, and because I think that we do have a diverse set of expertise at the table,” she says. Camille and Tristan Harris dissect the challenges and talk about the path forward to a healthy information ecosystem.
It’s no revelation that Americans aren’t getting along. But it’s easier to diagnose the problem than come up with solutions. The organization Braver Angels runs workshops that convince Republicans and Democrats to meet, but not necessarily in the middle. “Conflict can actually be a pathway to intimacy and connection rather than division, if you have the right structure for bringing people together,” says Ciaran O’Connor, the organization’s Chief Marketing Officer. We’re delighted to have Ciaran and the Braver Angels National Ambassador John Wood, Jr. on the show to describe their methods, largely based on marriage counseling techniques, and talk about where to go next. “How do you scale that up and apply that to the digital space, given that that is the key battlefield?” asks John. Technology companies play a role here, and the wisdom of the people doing the work on the ground is a valuable guide.
When Kate Raworth began studying economics, she was disappointed that the mainstream version of the discipline didn’t fully address many of the world issues that she wanted to tackle, such as human rights and environmental destruction. She left the field, but was inspired to jump back in after the financial crisis of 2008, when she saw an opportunity to introduce fresh perspectives. She sat down and drew a chart in the shape of a doughnut, which provided a way to think about our economic system while accounting for the impact to the world around us, as well as for humans’ baseline needs. Kate’s framing can teach us a lot about how to transform the economic model of the technology industry, helping us move from a system that values addicted, narcissistic, polarized humans to one that values healthy, loving and collaborative relationships. Her book, “Doughnut Economics: Seven Ways to Think Like a 21st Century Economist,” gives us a guide for transitioning from a 20th-century paradigm to an evolved 21st-century one that will address our existential-scale problems.
Yuval Noah Harari is one of the rare historians who can give us a two-million-year perspective on today’s headlines. In this wide-ranging conversation, Yuval explains how technology and democracy have evolved together over the course of human history, from paleolithic tribes to city states to kingdoms to nation states. So where do we go from here? “In almost all the conversations I have,” Yuval says, “we get stuck in dystopia and we never explore the no less problematic questions of what happens when we avoid dystopia.” We push beyond dystopia and consider the nearly unimaginable alternatives in this special episode of Your Undivided Attention.
You’ve heard us talk before on this podcast about the pitfalls of trying to moderate a “global public square.” Our guest today, Eli Pariser, co-director of Civic Signals, co-founder of Avaaz, and author of "The Filter Bubble," has been thinking for years about how to create more functional online spaces and is bringing people together to solve that problem. He believes the answer lies in creating spaces and groups intentionally, with the same kinds of skilled support and infrastructure that we would enlist in the physical world. It’s not enough to expect the big revenue-oriented tech companies to transform their tools into something less harmful; Eli is encouraging us to proactively gather in our own spaces, optimized for togetherness and cooperation.
We are in the midst of a teen mental health crisis. Since 2011, the rate of U.S. hospitalizations for preteen girls who have self-harmed is up 189 percent, and with older teen girls, it’s up 62 percent. Tragically, the numbers on suicides are similar — 151 percent higher for preteen girls, and 70 percent higher for older teen girls. NYU social psychologist Jonathan Haidt has spent the last few years trying to figure out why, working with fellow psychologist Jean Twenge, and he believes social media is to blame. Jonathan and Jean found that the mental health data show a stark contrast between Generation Z and Millennials, unlike any demographic divide researchers have seen since World War II, and the division tracks with a sharp rise in social media use. As Jonathan explains in this interview, disentangling correlation and causation is a persistent research challenge, and the debate on this topic is still in full swing. But as TikTok, Instagram, Snapchat and the next big thing fine-tune the manipulative and addictive features that pull teens in, we cannot afford to ignore this problem while we sit back and wait for conclusive results. When it comes to children, our standards need to be higher, and our burden of proof lower.
Today’s extremists don’t need highly produced videos like ISIS. They don’t need deep pockets like Russia. With the right message, a fringe organization can reach the majority of a nation’s Facebook users for the price of a used car. Our guest, Zahed Amanullah, knows this firsthand. He’s a counter-terrorism expert at the Institute for Strategic Dialogue, and when his organization received $10,000 in ad credits from Facebook for an anti-extremism campaign, they were able to reach about two-thirds of Kenya’s Facebook users. It was a surprising win for Zahed, but it means nefarious groups all over the African continent have exactly the same broadcasting power. Last year, Facebook took down 66 accounts, 83 pages, 11 groups and 12 Instagram accounts related to Russian campaigns in African countries, and Russian networks spent more than $77,000 on Facebook ads in Africa. Today on the show, Zahed will explain how the very tools that extremists use to broadcast messages of hate can also be used to stop them in their tracks, and he’ll tell us what tech and government must do to systematically counter the problem. “If we don’t get in front of this,” he says, “this phenomenon is going to amplify beyond our reach.”
A new documentary called The Social Dilemma comes out on Netflix today, September 9, 2020. We hope that this film, full of interviews with tech insiders, will be a catalyst and tool for exposing how technology has been distorting our perception of the world, and will help us reach the shared ground we need to solve big problems together.
This summer, Facebook unveiled “2Africa,” a subsea cable project that will encircle nearly the entire continent of Africa — much to the surprise of Julie Owono. As Executive Director of Internet Without Borders, she’s seen how quickly projects like this can become enmeshed in local politics, as private companies dig through territorial waters, negotiate with local officials and gradually assume responsibility over vital pieces of national infrastructure. “It’s critical, now, that communities have a seat at the table,” Julie says. We ask her about the risks of tech companies leading us into an age of “digital colonialism,” and what she hopes to achieve as a newly appointed member of Facebook’s Oversight Board.
In 1940, a group of 60 American intellectuals formed the Committee for National Morale. “They’ve largely been forgotten,” says Fred Turner, a professor of communications at Stanford University, but their work had a profound impact on public opinion. They produced groundbreaking films and art exhibitions. They urged viewers to stop, reflect and think for themselves, and in so doing, they developed a set of design principles that reimagined how media could make us feel more calm, reflective, empathetic; in short, more democratic.
Imagine a world where every country has a digital minister and technologically-enabled legislative bodies. Votes are completely transparent, and audio and video of all conversations between lawmakers and lobbyists are available to the public immediately. Conspiracy theories are acted upon within two hours and replaced by humorous videos that clarify the truth. Imagine that expressing outrage about your local political environment turned into a participatory process where you were invited to solve that problem and even entered into a face-to-face group workshop. Does that sound impossible? It’s ambitious and optimistic, but that's everything that our guest this episode, Audrey Tang, digital minister of Taiwan, has been working on in her own country for many years. Audrey’s path into public service began in 2014 with her participation in the Sunflower Movement, a student-led protest in Taiwan’s parliamentary building, and she’s been building on that experience ever since, leading her country into a future of truly participatory digital democracy.
#StopHateforProfit is an important first step, but we need to go much further.
What would inspire someone to singlehandedly initiate an armed standoff on the Hoover Dam, or lead the police on a 100-mile-an-hour car chase while calling for help from an anonymous internet source, or travel hundreds of miles alone to shoot up a pizza parlor? The people who did these things were all connected to the decentralized cult-like internet conspiracy theory group called QAnon. Our guest this episode, Travis View, is a researcher, writer and podcast host who has spent the last few years trying to understand the people who’ve become wrapped up in QAnon and the concerning consequences as Q followers increasingly leave their screens and take extreme actions in the real world. As many as six candidates who support QAnon are running for Congress and will be on the ballot for the 2020 elections, threatening to upend long-held Republican establishment seats. This just happened to a five-term Republican congressman in Colorado. Travis warns that QAnon is an extremism problem, not a disinformation or political problem, and dismissing QAnon as a fringe threat underestimates how quickly their views can leapfrog into mainstream debates on the left and the right.
The sound of bullies on social media can be deafening, but what about their victims? “They're just sitting there being pummeled and pummeled and pummeled,” says Fadi Quran. As the campaign director of Avaaz, a platform for 62 million activists worldwide, Fadi and his team go to great lengths to figure out exactly how social media is being weaponized against vulnerable communities, including those who have no voice online at all. “They can't report it. They’re not online,” Fadi says. “They can't even have a conversation about it.” But by bringing these voices of survivors to Silicon Valley, Fadi says, tech companies can not just hear the lethal consequences of algorithmic abuse, they can start hacking away at a system that Fadi argues was “designed for bullies.”
[This episode originally aired on November 5, 2019] Maria Ressa is arguably one of the bravest journalists working in the Philippines today. As co-founder and CEO of the media site Rappler, she has withstood death threats, multiple arrests and a rising tide of populist fury that she first saw on Facebook, in the form of a strange and jarring personal attack. Through her story, she reveals, play by play, how an aspiring strongman can use social media to spread falsehoods, sow confusion, intimidate critics and subvert democratic institutions. Nonetheless, she argues Silicon Valley can reverse these trends, and fast. First, tech companies must "wake up," she says, to the threats they've unleashed throughout the Global South. Second, they must recognize that social media is intrinsically designed to favor the strongman over the lone dissident and the propagandist over the truth-teller, which is why it has become the central tool in every aspiring dictator's playbook.
When you’re gripped by anxiety, fear, grief or dread, how do you escape? It can happen in the span of a few breaths, according to meditation experts Jack Kornfield and Trudy Goodman. They have helped thousands of people find their way out of a mental loop, by moving deeper into it. It's a journey inward that reveals an important lesson for the architects of the attention economy: you cannot begin to build humane technology for billions of users, until you pay careful attention to the course of your own wayward thoughts.
How can we feel empowered to take on global threats? The battle begins in our heads, argues Christiana Figueres. She became the United Nations’ top climate official after she had watched the 2009 Copenhagen climate summit collapse “in blood, in screams, in tears.” In the wake of that debacle, she began performing an act of emotional Aikido on herself, her team and eventually delegates from 196 nations. She called it “stubborn optimism." It requires a clear and alluring vision of a future that can supplant the dystopian and discouraging vision of what will happen if the world fails to act. It was stubborn optimism, she says, that convinced those nations to sign the first global climate framework, the Paris Agreement. We explore how a similar shift in Silicon Valley's vision could lead 3 billion people to take action.
How does disinformation spread in the age of COVID-19? It takes an expert like Renée DiResta to trace conspiracy theories back to their source. She’s already exposed how Russian state actors manipulated the 2016 election, but that was just a prelude to what she’s seeing online today: a convergence of state actors and lone individuals, anti-vaxxers and NRA supporters, scam artists and preachers and the occasional fan of cuddly pandas. What ties all of these disparate actors together is an information ecosystem that’s breaking down before our eyes. We explore what’s going wrong and what we must do to fix it in this interview with Renée DiResta, Research Manager at the Stanford Internet Observatory.
An information system that relies on advertising was not born with the Internet. But social media platforms have taken it to an entirely new level, becoming a major force in how we make sense of ourselves and the world around us. Columbia law professor Tim Wu, author of The Attention Merchants and The Curse of Bigness, takes us through the birth of the eyeball-centric news model and ensuing boom of yellow journalism, to the backlash that rallied journalists and citizens around creating industry ethics and standards. Throughout the 20th century, radio, television, and even posters elicited excitement, hope, fear, skepticism and greed, and people worked together to create a patchwork of regulation and behavior that attempted to point those tools in the direction of good. The Internet has brought us to just such a crossroads again, but this time with global consequences that are truly life-and-death.
We agree more than we think we do, but tech platforms distort our perceptions by amplifying the loudest, angriest and most dismissive voices online. In reality, they’re just a noisy faction. This Earth Day we ask Anthony Leiserowitz, Director of the Yale Program on Climate Change Communication, how he shifts public opinion on climate change. We’ll see how tech platforms could amplify voices of solidarity within our own communities. More importantly, we’ll see how they could empower 2 billion people to act in the face of global threats.
How can tech companies help flatten the curve? First and foremost, they must address the lethal misinformation and disinformation circulating on their platforms. The problem goes much deeper than fake news, according to Claire Wardle, co-founder and executive director of First Draft. She studies the gray zones of information warfare, where bad actors mix facts with falsehoods, news with gossip, and sincerity with satire. “Most of this stuff isn't fake and most of this stuff isn't news,” Claire argues. If these subtler forms of misinformation go unaddressed, tech companies may not only fail to flatten the curve — they could raise it higher.
What difference does a few hours of Congressional testimony make? Tristan takes us behind the scenes of his January 8th testimony to the Energy and Commerce Committee on disinformation in the digital age. With just minutes to answer each lawmaker’s questions, he found that conveying the urgency and complexity of humane technology issues to Committee members is an immense challenge. Tristan returned hopeful, and though it sometimes feels like Groundhog Day, each trip to DC reveals evolving conversations, advancing legislation, deeper understanding and stronger coalitions.
We are in the middle of a global trust crisis. Neighbors are strangers and local news sources are becoming scarcer; institutions that used to symbolize prestige, honor and a sense of societal security are ridiculed for being antiquated and out of touch. To replace the void, we turn to sharing economy companies and social media, which come up short, or worse. Our guest on this episode, academic and business advisor Rachel Botsman, guides us through how we got here, and how to recover. Botsman is the Trust Fellow at Oxford University, and the author of two books, including “Who Can You Trust?” The intangibility of trust makes it difficult to pin down, she explains, and she speaks directly to technology leaders about fostering communities and creating products the public is willing to put faith in. “The efficiency of technology is the enemy of trust,” she says.
“You can binge watch an ideology in a weekend,” says Tony McAleer. He should know. A former white supremacist, McAleer was introduced to neo-Nazi ideology through the U.K. punk scene in the 1980s. But after his daughter was born, he embarked on a decades-long journey from hate to compassion. Today’s technology, he says, makes violent ideologies infinitely more accessible and appealing to those who long for acceptance. Social media isolates us and can incubate hate in a highly diffuse structure, making it nearly impossible to stop race-based violence without fanning the flames or driving it further underground. McAleer discusses solutions to this dilemma and the positive actions we can take together.
Brittany Kaiser, a former Cambridge Analytica insider, witnessed a two-day presentation at the company that shocked her and her co-workers. It laid out a new method of campaigning, in which candidates greet voters with a thousand faces and speak in a thousand tongues, automatically generating messages increasingly aimed at an audience of one. She explains how these methods of persuasion have shaped elections worldwide, enabling candidates to sway voters in strange and startling ways.
Maria Ressa is arguably one of the bravest journalists working in the Philippines today. As co-founder and CEO of the media site Rappler, she has withstood death threats, multiple arrests and a rising tide of populist fury that she first saw on Facebook, in the form of a strange and jarring personal attack. Through her story, she reveals, play by play, how an aspiring strongman can use social media to spread falsehoods, sow confusion, intimidate critics and subvert democratic institutions. Nonetheless, she argues Silicon Valley can reverse these trends, and fast. First, tech companies must "wake up," she says, to the threats they've unleashed throughout the Global South. Second, they must recognize that social media is intrinsically designed to favor the strongman over the lone dissident and the propagandist over the truth-teller, which is why it has become the central tool in every aspiring dictator's playbook.
What causes addiction? Johann Hari, author of Chasing the Scream, travelled some 30,000 miles in search of an answer. He met with researchers and lawmakers, drug dealers and drug makers, those who were struggling with substance abuse and those who had recovered from it, and he came to the conclusion that our whole narrative about addiction is broken. "The opposite of addiction is not sobriety," he argues. "The opposite of addiction is connection." But first, we have to figure out what it really means to connect.
Every 40 seconds, our attention breaks. It takes an act of extreme self-awareness to even notice. That’s why Gloria Mark, a professor in the Department of Informatics at University of California, Irvine, started measuring the attention spans of office workers with scientific precision. What she has discovered is not simply an explosion of disruptive communications, but a pandemic of stress that has followed workers from their offices to their homes. She shares the latest findings from the “science of interruptions,” and how we can stop forfeiting our attention to the next notification, and the next one, ad nauseam.
In the second part of our interview with Renée DiResta, disinformation expert, Mozilla fellow, and co-author of the Senate Intelligence Committee’s Russia investigation, she explains how social media platforms use your sense of identity and personal relationships to keep you glued to their sites longer, and how those design choices have political consequences. The online tools and tactics of foreign agents can be very precise and deliberate, but they don’t have to be -- Renée has seen how deception and uncertainty are powerful agents of distrust and easy to create. Do we really need the ease of global amplification of information-sharing that social media enables, anyway? We don’t want spam in our email inbox so why do we tolerate it in our social media feed? What would happen if we had to copy and paste and click twice, or three times? Tristan and Aza also brainstorm ways to prevent and control disinformation in the lead-up to elections, and particularly the 2020 U.S. elections.
Today’s online propaganda has evolved in unforeseeable and seemingly absurd ways; by laughing at or spreading a Kermit the Frog meme, you may be unwittingly advancing the Russian agenda. These campaigns affect the integrity of our elections, public health, and relationships. In this episode, the first of two parts, disinformation expert Renée DiResta talks with Tristan and Aza about how these tactics work, how social media platforms’ algorithms and business models allow foreign agents to game the system, and what these messages reveal to us about ourselves. Renée gained unique insight into this issue when, in 2017, Congress asked her to lead a team of investigators analyzing a data set of texts, images and videos from Facebook, Twitter and Google thought to have been created by Russia’s Internet Research Agency. She shares what she learned, and in part two of their conversation, Renée, Tristan and Aza discuss what steps can be taken to prevent this kind of manipulation in the future.
When we press play on a YouTube video, we set in motion an algorithm that taps all available data to find the next video that keeps us glued to the screen. Because of its advertising-based business model, YouTube’s top priority is not to help us learn to play the accordion, tie a bow tie, heal an injury, or see a new city — it’s to keep us staring at the screen for as long as possible, regardless of the content. This episode’s guest, AI expert Guillaume Chaslot, helped write YouTube’s recommendation engine and explains how those priorities spin up outrage, conspiracy theories and extremism. After leaving YouTube, Guillaume’s mission became shedding light on those hidden patterns on his website, AlgoTransparency.org, which tracks and publicizes YouTube recommendations for controversial content channels. Through his work, he encourages YouTube to take responsibility for the videos it promotes and aims to give viewers more control.
Aza sits down with Yael Eisenstat, a former CIA officer and a former advisor at the White House. When Yael noticed that Americans were having a harder and harder time finding common ground, she shifted her work from counter-extremism abroad to advising technology companies in the U.S. She believed as danger at home increased, her public sector experience could help fill a gap in Silicon Valley’s talent pool and chip away at the ways tech was contributing to polarization and election hacking. But when she joined Facebook in June 2018, things didn’t go as planned. Yael shares the lessons she learned and her perspective on government’s role in regulating tech, and Aza and Tristan raise questions about our relationships with these companies and the balance of power.
In part two of our interview with cultural anthropologist Natasha Dow Schüll, author of Addiction by Design, we learn what gamblers are really after a lot of the time — it’s not money. And it’s the same thing we’re looking for when we mindlessly open up Facebook or Twitter. How can we design products so that we’re not taking advantage of these universal urges and vulnerabilities but using them to help us? Tristan, Aza and Natasha explore ways we could shift our thinking about making and using technology.
Natasha Dow Schüll, author of Addiction by Design, has spent years studying how slot machines hold gamblers spellbound, in an endless loop of play. She never imagined the addictive designs which she had first witnessed in Las Vegas would go bounding into Silicon Valley and reappear on virtually every smartphone screen worldwide. In the first segment of this two-part interview, Natasha Dow Schüll offers a prescient warning to users and designers alike: How far can the attention economy go toward stealing another moment of your time? Farther than you might imagine.
Technology has shredded our attention. We can do better.