Moral Repair:
A Black Exploration of Tech

How can we all thrive as we navigate technology, automation, and AI in the Information Age? What have technologists, philosophers, care practitioners, and theologians learned about the innovations and worldviews shaping a new century of unprecedented tech breakthroughs and social change? 

On Moral Repair: A Black Exploration of Tech, hosts Annanda Barclay and Keisha McKenzie talk with tech and spiritual leaders. Their conversations inspire curiosity about tech while showcasing practical wisdom from the African continent and diaspora to nurture wellbeing for all. 

Moral Repair expands mainstream tech narratives, celebrates profound insight from Black philosophy and culture, and promotes technology when it serves the common good. Listeners leave each episode with new ways to think about tech’s impacts and apply practical wisdom in their own lives. 

In Season 1, episode themes range from recommendation algorithms and a Black ethical standard for evaluating tech to interactive holograms and hip hop as cultural memory tools. Other episodes explore moral repair, ideologies and philosophies shaping Silicon Valley, AI ethics, inclusive design, and tech well-being.

Guests include Aral Balkan (Small Tech Foundation), Dr. Scott Hendrickson (data scientist), the Rev. Dr. Otis Moss III (pastor, filmmaker, storyteller), Stewart Noyce (technologist and marketer), Zuogwi Reeves (minister and scholar), the Rev. Dr. Sakena Young-Scaggs (Stanford University’s Office of Religious & Spiritual Life), Judith Shulevitz (culture critic), and Dr. Damien Williams (professor and researcher on science, technology, and society).

 

Season 2

Trailer

Season 2 | Trailer

On Moral Repair: A Black Exploration of Tech, hosts Annanda Barclay and Keisha McKenzie talk with tech and wisdom leaders. Their conversations inspire curiosity about tech while showcasing practical wisdom from the African continent and diaspora to nurture wellbeing for all.

Moral Repair expands mainstream tech narratives, celebrates profound insight from Black philosophy and culture, and promotes technology when it serves the common good. Listeners leave each episode with new ways to think about tech’s impacts and apply practical wisdom in their own lives.

  • Annanda: An ancient form of trauma has a new name. It’s called moral injury. And it defines a deep spiritual and existential pain that arises when something violates our core beliefs. In our increasingly connected world, we’re seeing lots of moral injury.

     Keisha: Just take our AI age, which often feels like it's doing more harm than good.

    OM3: What scares me about technology is the profit motive. Profit plus human frailty, historically, has meant tragedy for someone.

    Damien: The very people who are supposed to be regulated by these policy changes are the people who are writing the laws, or at the very least “advising” on them.

    Adorable: Technology, it's solving the problems of other technologists and industry; it's not really solving the problems of everyday people.

    [SFX: A crescendo in the music]

    Keisha: AI is overwhelming.

     Annanda: So here's the question. Can we ever truly mend the damage that AI is causing us? That is, what could moral repair in our modern technological era look like?

    Keisha: This isn't just about patching systems. It's about caring for people. Technology should serve humanity… not the other way around.

    Annanda: We’ve got an antidote to moral injury.

    [SFX: A shift in tone]

    Jane: Africana philosophy, it's so rich with a broadened conception of technology; it's all about cultivating human wellbeing, cultivating sustainability…

    Rev. Dr. SYS: Is this life-giving or death-dealing? Because we need more life-giving things.

    [SFX: Our theme song or a play off it leading into it]

    Keisha: Welcome to a new season of the two-time AMBIE-nominated podcast Moral Repair: A Black Exploration of Tech. A series about the innovations that make our world… disrupt our societies… and how we can repair the damage.

    Annanda: I'm Annanda Barclay, an end-of-life planner, chaplain, and moral injury researcher. I think a lot about the impact moral injury can have on living a meaningful and dignified life, all the way to the end.

    Keisha:  And I'm Keisha McKenzie, a technical communicator and narrative strategist with a knack for asking tough questions and making experts accessible.

    Annanda: This season, we'll be your guides through the maze of AI. And its use in political strategies, conflicts…

    Keisha: Government regulation and play. All while we try to answer the big question — who's actually responsible for the moral repair we so badly need in AI?

    Annanda: And how can Africana wisdom guide us towards healing and accountability?

    Keisha:  Let's take a hard look at the AI that powers and distorts our world.

    Annanda: And let's start the process of repair together. Join us for Moral Repair: A Black Exploration of Tech launching April 24th.

 

Government Regulation: Afrofuturism and Equity in Tech

Season 2 | Episode 1

What do we need to know about recent regulatory guidelines on AI trust and safety? What does one recent federal regulator think still needs attention? How could critical Black digital perspectives reshape the conversation? Annanda and Keisha talk Afrofuturism and equity with Dr. Alondra Nelson, deputy director for science and society at the White House Office of Science and Technology Policy from 2021-2023.

Talk to us online: on Instagram (@moralrepairpodcast), on X (@moralrepair), and on LinkedIn

The Social Text Afrofuturism issue.

About the Black Panthers’ clinics.

“No Justice, No Health”

Nelson + Lander explain the AI Bill of Rights (WIRED) 

How many medical tech advances came from HIV-AIDS research.

  • Lead Guest: Alondra Nelson, PhD

    EPISODE BIG QUESTION: What does tech policy look like behind the curtain? And how can Afrofuturist and Black cultural principles make that ecosystem work for those who’ve been left behind?

    DESCRIPTION: What do we need to know about recent regulatory guidelines on AI trust and safety? What does one recent federal regulator think still needs attention? How could critical Black digital perspectives reshape the conversation? Annanda and Keisha talk Afrofuturism and equity with Dr. Alondra Nelson, deputy director for science and society at the White House Office of Science and Technology Policy from 2021-2023.

    [00:00] INTRO

    SOUND: curious background music

    Annanda: Welcome everybody to our first episode of our second season of Moral Repair: A Black Exploration of Tech!

    Ya girls are back at it again! And we won’t stop.

    So this season, we’re focusing specifically on AI and technology. We’ll be talking about government, cobalt mining, policing, and believe it or not, the impact big tech has on American farmlands.

    Keisha: I’m excited to get into it. We’re talking about how AI shows up in different parts of our lives, how we can use it in positive directions and, where it’s harmful, mitigate that damage.

    Annanda: There's a lot that Africana wisdom, Black wisdom, can say about technology: how do we take what AI gives, consider it, and sprinkle some Africana wisdom seasonings on it? That’s the business we’re up to this second season.

    Keisha: In these conversations with tech experts, we’ve talked around the role government plays with tech. On today’s show, we’re focusing on the policy environment around new technologies. We ask the big question:

    What does tech policy look like behind the curtain? And how can Afrofuturist and Black cultural principles make that ecosystem work for those who’ve been left behind?

    Annanda: ‘Cause so many people are left behind.

    Keisha: And we can do things differently.

    SOUND: title music starts

    Annanda: I’m Annanda Barclay…

    Keisha: I’m Keisha McKenzie…

    Annanda: This is Moral Repair: A Black Exploration of Tech. A show where we explore the social and moral impacts of tech… and share wisdom from Africana culture for caring for what tech has broken.

    SEGMENT A: Regulating Emerging Tech (Problems in the policy ecosystem)

    Keisha: Annanda, did I ever tell you I worked for Congress once?

    Annanda: Nope, I’d remember that. *laughs*

    Keisha: Yeah, I interned with committee lawyers in the House of Representatives halfway through grad school. I wanted to learn about government as a technology for managing power and decision making.

    SOUND: title music fades out

    While I was there, I learned a *lot* about the gap between the myths and the realities of government.

    That was during the first term of the Obama administration.

    It was a chaotic time.

    Annanda: But—but a historic time nonetheless.

    You know, bless you, Keisha. I don't know if I could've had the stamina!

    I kind of look at that time as a beginning of a—as a major shift. I mean, I can go back to Bush and Gore, but I went on YouTube and looked at past political debates from when I was a kid, and then from when I was a teenager and then now, and you can see like the disintegration of rhetoric and actually of knowledge of government and how it functions.

    It was actually quite frightening to me to see how quickly bipartisanship, and actual knowledge of governance, have unraveled. And I think about [Gen] Zs, for whom this has always been their political reality. They have only known a dysfunctional government.

    Keisha: Yeah, it’s a maze…

    SOUND: Curious sound

    So we brought in someone who could guide us through it.

    Keisha: Dr. Nelson, thank you for being here.

    Alondra: A pleasure to be with you both.

    Keisha: Dr. Alondra Nelson was the deputy director for science and society at the Biden-Harris White House Office of Science and Technology Policy (OSTP) from 2021-2023.

    Alondra: I have been very forthright in writing about problems of American society and the ways that African American communities in particular have both been innovators and also, struggled very hard, in American society to sort of compel it to live up to its professed ideals—

    Keisha: Dr. Nelson told us about how she ended up as the first Black woman to hold her position. It started two presidential terms earlier, with relationships she built while writing a book about the Obama White House.

    Alondra: I never thought I would serve in government.

    I had been—in—since 2016 working on a book about the White House Office of Science and Technology Policy under the Obama administration. I was very interested in the kind of creative, innovative work that office was doing under a Black presidency, and in President Obama's personal interest in science and technology as a policy maker and also as a kind of self-identified nerd.

    I think when the history books are kind of written about that presidency, that science and technology policy will be a really central part of it. I think it's one of the lesser acknowledged significant things of that administration.

    One of President Obama's last foreign trips before he leaves office, he goes to Hiroshima, to the Hiroshima Memorial Park…

    Audio clip: Obama at Hiroshima (C-SPAN). The actual line Alondra referenced (7:05-7:27): “Technological progress without equivalent progress in human institutions can doom us. The scientific revolution that led to the splitting of an atom requires a moral revolution as well.”

    Keisha: This idea — of parallel technological and ethical revolutions — is what was behind Obama forming the Presidential Commission for the Study of Bioethical Issues.

    Alondra: You see in the administration this sense that science and technology policy really shouldn't and can't be unmoored from thinking about the historical context and social implications of science and technology.

    Annanda: One thing I appreciate about Obama, that man, he could tap into his emotions as a politician to the suffering of people. And I wish I saw more of that in general... I wish that was a standard.

    Keisha: Yes, a little emotional awareness and capacity.

    Annanda: Yeah, I want a lot, to be honest. *laughs* Not a little. I want—

    Keisha: Fair!

    Annanda: Not that one is run by their emotions, but to be able to tap into their own humanity and the humanity of others, especially those who are different from them.

    Keisha: I listened to the speech and I'm struck by that too. I'm wrestling with whether it did end at the emotional performance of that moment, because he went over there, the first sitting president to do so in 70 years. He didn't actually apologize to the people of Japan for the U.S. dropping the atomic bomb. So he was present as the president, and it was extraordinary, and he acknowledged the moral breakdown, and he hugged the survivors. But I'm wondering if there was actual moral repair there.

    Is there actual moral repair if you're not apologizing or taking responsibility for it? What do you think?

    Annanda: I think that's up for the survivors, those who are morally injured. I think there is moral injury for sure in United States foreign policy. And I wonder what does repair look like for politicians who are in seats of power at times of war, making decisions as big as dropping an atom bomb.

    Keisha: Right.

    Annanda: The phrases moral injury and moral repair actually do come from the military in the 80s, because they were trying to figure out what these troops were going through. It's like PTSD, but it's different. They could treat the PTSD, but there was this other thing showing up as a comorbidity, and that is moral injury. And the impacts of it are profound.

    Keisha: Mmmm. Mm hmm.

    So there was a history-making moment with Obama at Hiroshima. And then back home, as Dr. Nelson joined the Biden-Harris administration, they were about to face another.

    Alondra: We were at the high water mark of a once in a generation pandemic.

    I mean people were just dying. In Brazil, in the African subcontinent, in Harlem, where I live, and the sort of racial demographics of that, the racial inequality of that was just raw and real.

    SOUND: conversation heading toward a break

    Keisha: Phew: I can't even believe it's been four years since the pandemic started.

    Annanda: You can't believe it? Oh, I can. *laughs* Why? Say more.

    Keisha: It feels almost like world before, world after. We've collectively gone through multiple layers of moral injury, including abandonment by the people whose role in the political system is to protect and care for the public.

    Alondra: What situation would the United States have to be in for an African American woman who studies health inequality and racial inequality to be invited to be a deputy at the White House Office of Science and Technology Policy?

    Keisha: You’ve heard of the “glass ceiling,” the career limits you can see through but not break through. Being promoted into crisis is called the “glass cliff.” And it’s a setup.

    Annanda: Being a chaplain in the hospital around that time, that was definitely a cliff. Life is forever changed when you're dealing with six to eight deaths a day on your on-call shift, right? Just death after death after death…

    But I actually felt safer in the hospital than any other place. How do you long term manage supporting people in that much grief, bereavement, and death?

    Keisha: What I love about what you said was not just to care for others, but also to care for yourself in the midst of caring.

    People who are in cliff situations, they don't get that part of the resource. They might get enough to care for others or do the job function, but they usually will not get the stuff that keeps them nourished and able to navigate the change.

    Alondra: The economic inequality, the health inequality, the pandemic, were what we were going into office to help mitigate the best we could. I don’t know if it’ll ever happen again, where government leadership was seeking people who had thought about issues of inequity and innovation, inequality in science and technology. I happened to be… one of those people.

    SOUND: break cue

    Keisha: Glass cliff or not, Dr. Nelson went into government to help change the climate in tech. After the break, we’ll hear all about that.

    BREAK

    [9:55] SEGMENT B: How equity improves the process

    Annanda: We’re back with Professor Alondra Nelson, former deputy at the White House Office of Science and Technology Policy. We've been thinking about how she got into government and how a moral revolution could help shape tech. But how does influencing government policies really work?

    SOUND: break cue ends

    Keisha: Going back to what you said about the pandemic or the early part of the pandemic being a once in a century event or experience, when it comes to emerging technologies like machine learning or AI or the mRNA vaccines, who has the greatest influence on what regulations around those things actually look like?

    Alondra: The mRNA vaccine sits within the U.S. Food and Drug Administration. But certain things were accelerated. I mean, we did clinical trials very quickly.

    We were running them in parallel, as opposed to waiting for one to finish and another to start. So in that case, it was the regulatory authorities that were in place, the usual channels, accelerated. We had something called Operation Warp Speed, and the whole point of that was to do things much quicker than usual.

    I think when you talk about things like automated systems and AI, it gets a little more complicated, in part because—and I’ve also worked on human genetics and genomics as an emerging field as well—when you get the collision or collaboration (pick your C-word!) between market-driven innovation, with sometimes really exciting products and possibilities, and a PR hype cycle, regulation becomes harder. You know, in the public sphere, people say “Oh, government’s so slow and they can’t regulate” and all of that. But the hype cycle also just makes it sound like you can’t regulate it.

    Like, this new, magical, direct to consumer genomics is coming at you, and we've never seen anything like it in the world. Cut out “genomics” and put in “generative AI,” and it's like, how could you even think you could regulate this? This is like, magic, you know, and it's cool, and there'll be food for everybody, and we'll cure cancer, and everyone's gonna have a house, and the robots will be here, and it'll be awesome!

    So, there is this combination of hype cycle and material interests of venture capitalists and companies that make regulation hard, and the hard part is not necessarily that we can't keep up. It's how do you regulate a fever dream? How do you regulate a hype cycle?

    That said, it can be regulated!

    Annanda: Y’know, when Alondra says “How do you regulate a fever dream?” and then talks about the mRNA vaccine, I remember my beloved gay uncle, Michael, being really freaked out during the pandemic because it reminded him of HIV/AIDS.

    Keisha: Right.

    Annanda: And for those who don't have older gay men as friends: they survived the AIDS epidemic. We wouldn't have had the mRNA vaccine for COVID as fast as we did had it not been for the years and years and years of HIV/AIDS activists who demanded research and a search for a cure, in spite of government and in spite of the church and other religious institutions.

    Keisha: The AIDS virus and the mRNA vaccine for COVID? How was that connected?

    Annanda: Research on HIV/AIDS advanced our understanding of how viruses can manipulate the immune system. The way for our immune systems to attack these unwanted viruses is to target specific viral proteins. Viruses have proteins as a part of their makeup, and mRNA vaccines work by producing a strong immune response to a virus by attacking a specific protein within that virus.

    Right?

    In the case of COVID, the spike protein is the key protein that our immune system must attack for the COVID-19 vaccination to be effective. They took research grounded in HIV/AIDS, which showed that to kill this particular virus, you have to target a very specific viral protein. And… it’s from there that the COVID-19 vaccine was able to jump off.

    Without gay men, trans folk, queers, lesbians, there would arguably not be a COVID vaccine at the speed the world received it.

    Keisha: Right.

    Annanda: mRNA vaccines were bought with gay, queer, and Black death, if we consider the pandemic ongoing in Africa and among African American populations.

    And the truth of the matter is, it was and still is.

    Keisha: Yes. What we heard from that generation of organizers was not taking the given world for granted. Not treating it as like, It came down from the mountain, like the tablets, and nobody can touch it.

    And that's what breaks the fever dream. To say the world as it is, was made, can be remade. We can do something about it. And we're going to do something about it. That's a critical trait we're gonna need to navigate this latest round of “It's magic. It came down from the mountain. It's scary. We can't touch it.” But I think that's the part that Alondra's talking about, that it can be regulated, but we have to take responsibility to do it.

    SOUND: shift to prefigure interview audio

    When we talk about regulation, it's really important to be able to understand, as you were saying of politicians, how does the system actually work and therefore what is my influence in it? What's my moral responsibility in it?

    Like us, Alondra also had to figure out her influence within the system she was embedded in.

    Alondra: The President of the United States is the Commander in Chief, the highest office in the land.

    It's a very powerful office, but with regards to legislation, it's a very weak office. The executive branch doesn't make legislation, and it sounds silly to say it, because you're working in the White House.

    But you know, people will say, “Why can’t President Biden or Vice President Harris just, like, make the companies stop X, do X, do Y?” And you just can't; it's a free society, so you can't tell companies to do anything, and it is the job of Congress to do that. So the question becomes: in the domain of the powers, soft or otherwise, that the executive branch has, what can you do?

    You can convey a vision, you can make a case, and assert a theory for how things might be done. Everyone could just ignore you; it has no power or anything. And so the Blueprint for an AI Bill of Rights was an attempt to both model what might be done, and also to advance a conversation.

    Keisha: Fortunately, it was a conversation that had already started, not just in the US but even in UN agencies and global trade organizations. Leaders were starting to think about “safety in artificial intelligence” and setting common standards for technologies that members of the public would use. That led to the publication of the White House’s Blueprint for an AI Bill of Rights in October ’22, which Dr. Nelson and her team framed:

    Alondra: Part of what we tried to do with the AI Bill of Rights was to say, if systems are supposed to be safe and effective, what processes does one use to guarantee that that's the case?

    So organizations that are in charge of protecting consumers can intervene and ensure that AI tools and products, automated systems that are released for consumer use, have sort of met that standard consumer bar.

    But you could only imagine doing that if you didn't think that AI was like magic, right? Is ChatGPT a special, nuanced, fascinating consumer product, but a consumer product nevertheless?

    Or is it in some other category that we can't even imagine regulating?

    Whether or not you're talking about existing algorithmic systems or future ones, they should be safe and effective. They shouldn't violate people's rights. They shouldn't discriminate against people. People should have some sort of alternative or fallback in using the systems. And they should have some expectation of privacy.

    All fairly common-sense things, but you can really only get there if you anchor in the kind of fundamental rights that people have, regardless of what technology you're talking about. Even if you’re talking about quantum computing, it should not be a vector for discrimination in your life.

    It might affect discrimination in different ways, but the outcome for people's lived experience shouldn't be that they have unsafe systems, or, you know, lose their privacy, or that they're discriminated against by a new or emerging technology.

    Keisha: I'm curious about what you were hearing from the public during the year-long listening process.

    Alondra: We talked a lot to the people. When my appointment was announced, I was able to give a short speech. I thanked my mom and my daddy and my ancestors and my family, but also said: what are we going to do, effectively, about algorithmic bias?

    Because that was an issue that people were talking about. The film that Joy Buolamwini's involved in, Coded Gaze, had come out. Timnit Gebru had been fired unceremoniously from Google for raising issues around generative AI. And so there was already a public conversation happening, and there was a responsibility on the part of folks in government to respond to that.

    SOUND: background music

    Keisha: Trust in institutions like the government is the lowest it’s been for nearly 70 years. But here Professor Nelson’s describing government actually listening to members of the public.

    They’re listening to conversations happening in media, journals, books and industry chats. And they’re also actively soliciting input in some interesting ways, thinking about equity and access as they do it. This reminds me of the call-and-response communication pattern used across the Black diaspora. It’s a way of interacting between speaker and audience that assumes both are an essential part of the experience and both have something valuable to add.

    Alondra: The FDA has public listening sessions; anyone can come and speak their piece for two minutes. We did several of those at different times of day, so people could come after work, before work, after school…

    We had high school students very worried about algorithmic bias and discrimination as we’re putting metal detectors and surveillance cameras and these sorts of things in their schools.

    And then we had office hours. So we met with pastors and rabbis, researchers, sometimes industry people. We tried to talk to as many people as possible and to reach them beyond the typical process in Washington, D.C., which is often to respond to a request for comment. I imagine in a couple of years, we'll be using privacy-preserving generative AI to filter through this kind of data.

    Yeah, people do read them and they also become part of the historical record.

    Keisha: Not many congressional reps or senators have a science background. And some people say it's probably unlikely we could get lawmakers to know enough about new technology to regulate it quickly in an informed way. What’s your perspective?

    Alondra: When the Office of Management and Budget was making initial decisions about whether government should adopt personal computers, or buy the Microsoft Office Suite, we didn’t think you had to have a PhD in computer science to make a policy decision around this.

    I don’t want to suggest that artificial intelligence is not a powerful technology. This is a transformative technology for pretty much every facet of life.

    At the same time, if we can have more legislators in government who understand technology and have that background, that is only to the good, in the same way that having legislators who have been doctors or nurses or medical professionals working on healthcare policy or public health policy is only to the good.

    But if we think about the AI Bill of Rights: if algorithmic systems are supposed to be benefiting and enhancing people's lives, benefiting and enhancing the experience of workers, and they're not doing that, a policymaker at the Department of Labor doesn't need expertise in AI to make that distinction.

    Keisha: So the underlying ethic, the agreed core values, become the touchstone for everyone in the system, regardless of role. Values and ethics are how equity and other emerging norms can reshape organizations, industries, and entire countries.

    SOUND: background sound ends

    Keisha: You've talked a little bit about who's missing from the rooms when tech like AI is developed. Why does it matter if most of the people designing new tech come from the same cluster of schools or share the same social class?

    Alondra: Oh, gosh. Because we keep getting it wrong. We keep getting it wrong.

    I spent a lot of my career teaching introduction to medical sociology, some history of public health, sociology of health and illness.

    And even if we just go back a hundred years and think about some of the early clinical trials or early approvals for pharmaceuticals that were only tested in men, right? It was assumed that they were just supposed to work on everyone, you know?

    So, in some cases, they were only tested on white men.

    Ruha Benjamin has examples of this from the design space, where you know, we've designed a chair, we've designed a seatbelt for this car, and it should fit everybody, and it just doesn't.

    There's some instances, particularly when drugs were thought to be dangerous, where they might have been tested on veterans or on African Americans, other marginalized communities.

    Keisha: Now, this isn’t a “might have been.” This is a “was.” Go look up Operation Whitecoat!

    Alondra: We have quite a lot of research that shows us how science and technologies have been drivers of inequality. The Tuskegee Syphilis Study was stood up by the U.S. Public Health Service, a body of the U.S. federal government, and it left, you know, around 300 Black men untreated for syphilis after we had penicillin and knew how to treat it. The legacy of that study has been to build and fester mistrust and distrust of science, of technology, of government, in African American communities, and frankly in Black and Brown communities outside of the United States as well.

    There’s so much rich research on the impact and legacy of that study, and how it shapes people's help-seeking behaviors. Often it becomes a barrier to people feeling like they have enough trust to get the health care that they need and deserve, or to participate in clinical trials and these sorts of things.

    You can't just act like that didn't happen and then say, everybody go get a vaccine. If we're thinking about the pandemic, equitable science and technology policy means a government that's willing to say we have gotten this really wrong, not a boosterism that's like everyone should get a vaccine because the American government is great, go.

    Being forthright about those mistakes, about those traumas, is really important.

    And some of these mistakes are so basic, but you don't have anyone in the room who's saying, like, I don't know, that seatbelt doesn't fit me; I'm taller, shorter, bigger, wider. Those are just fundamental, basic questions.

    And then when you have algorithmic systems that are based on, often, historical data, or on people making choices about data that then get baked in, almost hardwired as a metaphor, into algorithmic systems, those choices become these path dependencies. There's one way that they're going, and it sort of narrows the possibility for what the outputs can be, what the answers can be, what the quote-unquote predictions can be, if you're talking about a predictive system.

    And so it doesn't matter in the sense of “get a rainbow coalition in every Google seminar room and things are going to be better.” But it does matter in the design phase of algorithms, and in the design phase of lots of science and its applications, where very basic errors and very wrong assumptions could certainly benefit from having a broader perspective of people in the room.

    And then of course, everyone won't share this, but there's just an equity mandate: if a company is making tools and resources that are literally going to be used—if you're talking about the big tech companies, Apple, Amazon, Microsoft—by potentially literally everyone in the world, there should be some obligation for some kind of democratic participation in shaping those outcomes. Even for those folks who are not stakeholders and don't own the company, what is an equitable responsibility in relationship to the public and to the consumer?

    Keisha: Equity is about defining key populations and getting the data that describe their situation. It’s also about telling the truth about what happened to them, who did it, what gaps resulted, and what it now takes to make up the difference.

    Annanda… if we understand equity that way, does that make “moral repair” a kind of equity practice?

    Annanda: I think it's a both-and. In order to have moral repair, we first have to name the injury, right? We can't heal what we don't name. There's collective moral repair, and there's individual. And so, when I think about equity, I think we're trying to repair something collectively.

    Keisha: Right. Right.

    Annanda: And what we know about moral repair is in order for that repair to occur, acknowledgement and hearing of the transgressions have to take place.

    SOUND: investigative / curious cue

    So people actually do need to be nonjudgmentally heard in their pain, seen and witnessed in their pain, which is something that we as a culture and as a country struggle with.

    Keisha: Mmm, yes.

    Annanda: Dr. Nelson broke down equity in a recent speech at the Federation of American Scientists awards:

    CLIP FROM ALONDRA: Audio from Alondra Nelson’s speech at the Federation of American Scientists Awards (2023) (2:10-2:33)

    “Equity isn't just who you serve. It's who is at the table when the policies are made. It's that the policies work and proactively uplift all. It's also who has access to the research, the data, the tools, and the resources to pursue the answers. We work to ensure greater access to and support within STEM fields for more people, and to more immediately open federally funded research to all American taxpayers regardless of their ability to pay.” —Alondra Nelson

    SOUND: interview cue

    Alondra: We also saw researchers, scientific researchers, talking about the danger of trying to go to their field site during a pandemic. People would be like, “Who is this Black woman with a backpack, out here in the woods when we're in a lockdown? What are you doing out here?”

    And we started to hear stories about researchers who didn't feel that they had mobility. That would shape an office [OSTP] that does science and technology policy, but also has responsibility for the research ecosystem.

    What was the responsibility of government to help people think about how else they could do field research, or to build a conversation in the scientific community about the fact that there's not equitable access to field research? So you're being asked to publish papers or get tenure, but you don't actually have full access to the resources you need to work at the height of your powers as a researcher. A few years ago, I ran an organization called the Social Science Research Council, and we had a grant for people doing international field work.

    And we had a very unequal fellowship experience, right? So we had people without caregiving responsibilities who could take the fellowship and use all the money for their field sites and all of that. And then we had a population of fellows who had to find child care, had to find elder care, and were trying to make that fellowship money stretch farther.

    And then once they got to the field, often faced vulnerability in the field by virtue of their gender or sexual identity. And so those are the things that are part of a conversation about equitable science and technology policy.

    Annanda: Repair isn't just “I'm sorry.” It's what structures then need to be put in place to rebuild trust and to build a new sense of hope. Repairing is mending the harm, and creating a new moral covenant or equity agreement that this harm will not be repeated.

    Otherwise, if the harm is repeated, you're continuing the pattern of distrust and injury, making repair more difficult, and there is a place of no return.

    Scholars have argued that the harm of chattel slavery was so great that there is a place of no return with that.

    The same with Native Americans: there's no way to repair the Indigenous genocide—

    Keisha: Because it's ongoing.

    Annanda: You're not going to be able to repair what was done in the past, but how do you repair how those descendants are treated in the future, right?

    Keisha: Yeah, so changing these sorts of patterns of disparate access and not being taken into account requires a systemic response, a different way of investing resources, a different way of taking in information from a population and a different way of designing for solutions to serve them.

    And it was at the Social Science Research Council that Dr. Nelson helped to put together a major fellowship and reshaped the model for funding participants, one that took into account their research, sure, but also their life contexts, the contexts in which they were doing that research. Were they parents or elder caregivers? Were they experiencing strain, and where? And how could the resource then serve those conditions?

    Annanda: I look at the United States and your medical care is not guaranteed.

    Keisha: Nope, it's tied to your job. So if you lose your job, what health insurance?

    Annanda: Your income is not guaranteed to be a livable wage. And so you have so many things in this country that are basic safety, security, and, I actually think, dignity needs that are not guaranteed.

    Keisha: Small-scale experiments like universal basic income, or a fellowship that actually treats you like a human being: those are the kinds of experiments I want to see scaled up socially, so that you don't have to be the special person who gets the fellowship, and you don't have to be in the right city to have the experiment of universal basic income. You get to have that not as an exception, but as a norm.

    That's what I'm looking for.

    Annanda: That would be amazing. The amount of flourishing that would take place because basic needs are met. The amount of crime that would go down, the amount of stress that would decrease, the amount of health and wellness, of innovation and creativity, the amount of kindness in the world. From your lips to God’s ears.

    SOUND: interview cue

    Alondra: In my time at the White House Office of Science and Technology Policy, we did a kind of vision document on STEM equity and excellence. How do you support the full person?

    In postdocs and research lab experiences, supporting the full person meant a range of things, like: how do we need to think about the design of laboratories so that anyone across a spectrum of ability can imagine being a scientist and imagine doing bench research or other kinds of research? So how do you think about universal design?

    Are there ways that science and technology research funding agencies and the U.S. federal government can do more to support people who are caregivers, to make sure that they stay in the pipeline because in the bigger context we don’t have enough workers, right?

    We need more people working in these fields, and sometimes that conversation is an immigration conversation, right? We don't have enough scientists coming from abroad, and technologists and engineers. But we are also not doing all that we can to support people here in the United States, whether they immigrated last year or 100 years ago or were brought here during the slave trade, to do the work that they want to do in science and technology.

    The ecosystem approach is about supporting the full person and also about trying to strategically and intentionally link up all of the programs. So there's the, y’know, after school program, and there's the community college program, and the worker training program, and everybody's competing for funding and doing their own thing.

    SOUND: break cue starts quietly

    How do we put all of these programs in an ecosystem so they're handing the baton to each other, not just competing with each other?

    So they're passing the resources and sharing the resources in a way that's accessible: more than the sum of their parts.

    Keisha: From ecosystem equity to Afrofuturism and AI: more, after the break.

    BREAK

    [34:04] SEGMENT C: Afrofuturist and other Black Perspectives on the Problem

    SOUND: Break music

    Keisha: Welcome back.

    We’ve explored how equity approaches can repair the entire tech policy ecosystem from regulation standards and the research environment to opportunities for young people left out of the field. It’s an orientation that, for Dr. Nelson, is rooted in Afrofuturist perspectives.

    SOUND: Break music ends

    Afrofuturism is a movement that blends art, philosophy, technology, and activism to address Black life and liberation.

    It assumes Black people are thriving in the future—not invisible or missing like in The Jetsons, and not subjugated like in most conventional sci-fi.

    Annanda: It's the ultimate form of freedom. I think it's a collective prayer, a speaking into being, by Africana Black people all the time: we're here, we will be here, we will thrive here. And us being here isn't over and against anybody else, right? It's not the supremacist notion of dominance, but it is dignity. It is respect. And care.

    In every Afrofuturist thing, Black people are living good, and the societies in which they are a part are thriving. But it's not limited to them: those societies, those worlds, are thriving.

    Keisha: Annanda, remember this?

    SOUND: Fade up bars from Baaba Maal’s Wakanda theme a few minutes into the start of Black Panther. E.g. 0:01:12-1:28

    Annanda: Oh my gosh, my little diaspora heart!

    Keisha: I know! I know. When Black Panther came out in New York, it felt like an earthquake hit. Afrofuturism everywhere. The clothes. The memes. People doing Wakanda Forever gestures all over the place lol…

    SOUND: Black Panther theme fades out

    Annanda: Oh yeah. I was out here in these Bay Area streets. I went to go see it with Black Stanford. And, yeah, everybody dressed up in whatever Africana gear that they had. It was beautiful.

    Keisha: Yes.

    Annanda: what got me with Black Panther I was like certain things or ways of being that were just so unapologetic.

    Keisha: I saw it in Harlem, the Blackest experience I've had seeing a movie. Those examples of music, culture, and Blackness remind me of the power of Afrofuturism to cast a different vision for people: where we belong, we're in the future, we get to survive—

    And it's full color. It's technology, it's innovation, style…

    Annanda: And all those little STEM babies and art babies… could see themselves into the future as well. It was an Africana space: folks from the continent, from the diaspora. It was beautiful.

    SOUND: Drumming in the background

    Keisha: In 1998, Professor Alondra Nelson — who we’ve been hearing from in this episode — started an international Afrofuturism listserv for scholars, artists, and others. She went on to publish on the movement through the early 2000s, using journal articles and books to break down the Afrofuturist vision of “making the impossible possible” and imagining “race after the internet.”

    So it’s not surprising that Professor Nelson told us about the tremendous impact Afrofuturism has on her.

    Keisha: How does it shape your work?

    Alondra: Oh my gosh, uh, every day in every way. Absolutely it does. I started writing about Afrofuturism in the mid-90s, and I think as people take it up and use it and think with it and are empowered by it now, as a concept and a movement it's a lot more about fashion and music and food, you know; it's this beautiful, aesthetic thing. But for me, it's always been also about science and technology.

    So my early work on Afrofuturism tried to have a place for all of those things. It was a community that included Nalo Hopkinson, the Caribbean science fiction writer, and Fatima Tuggar, the Nigerian visual artist, people doing social theory, as well as people who were technologists, and the like.

    Keisha: But an Afrofuturist view is quite different from how many of us—including Dr. Nelson—learned Black history and the past: a mix of tragedy and exceptional Great Men.

    Alondra: The stories about technology and gender and race that were part of my upbringing were either the Great African-American First who invented the traffic light

    Or, you know, George Washington Carver: “He’s such a genius, he did a hundred things with peanuts, like oh my god.” I’m just a regular, regular person; how do I get involved?

    And then you’ve got these stories of science and technology used to expropriate, exploit, traumatize, repress. Afrofuturism, for me, has offered not an apology for any of that repression, um, and the trauma, but a way to say that's not the only story: that even over and against that history, which continues in some instances, you've had Black people just creating beauty and innovating and doing just extraordinary things. I still marvel at the fact that in 1968, Black teenagers, 16- and 17-year-olds in Oakland, were like, we're going to do genetic testing and screening.

    I mean, people didn't know what genetic testing was. We didn't have universal newborn screening for genetic disease like we do now; it's very commonplace now. Talk about imagining another world: we can't get access to these resources, and people like us are dying because of genetic disease, so let's do some tests. Like, what?

    Keisha: Sidebar—

    SOUND: historical sidebar note

    Back in the 1960s and 70s, the Black Panther Party started free health clinics that expanded access to disease screenings including for sickle cell trait, which disproportionately impacts Black people. They started this in Oakland, and the clinics spread across the country, eventually influencing the US government to start screening programs it should have offered years before. And the early Panthers were young: many in their late teens and early 20s.

    Alondra: Similarly, African American senior citizens were some of the earliest adopters of direct-to-consumer genetics. That's not how we tell that story. The history of chattel slavery has meant that Black people very much live in a world in which they're constantly seeking more information about their ancestors, about their past, about their families.

    But what that has meant has been, like, tremendous innovation and a kind of courage around new and emerging technology. Afrofuturism has really been the bridge that has allowed me to tell that story. Like, who are the, outré, you know, interesting, brave, ingenious people who are trying to think about science and technology and its implications in new ways?

    Keisha: Coming back to the consumer side of gene tech, is there something that you'd want regulators to take a closer look at—

    Alondra: What the national security and existential risk concerns, in part around generative AI, have done is made people more aware of the potential of bioweapons, and of using these new kinds of technologies to accelerate the ability of people without a lot of technological or scientific expertise to, potentially, use these tools in malign ways.

    And already people can do CRISPR gene editing in their garages.

    Keisha: “People can do CRISPR gene editing in their garages”—what?? *laughs*

    Annanda: I'm telling you, these Bay Area streets, give some nerds some tools. *laughs* That's all they need.

    Alondra: There'd been these dividing lines, like forensic genetics and medical genetics; those are important and legit. And then there's recreational genetics; and no one cares. Part of the through line of The Social Life of DNA is that there might be different use cases for the genetics, but it's the same genetic information.

    The genome of a person carries all of this information at once, and we need to be careful.

    I've been kind of trying to track the thinking about genomics and AI together. I mean, part of the project of the Human Genome Project has always been um, a big data supercomputing project, and so AI has been involved in some way in genomics research for a long time. It’ll be interesting to see what that means for genetics and genomics research.

    President Obama announces the Precision Medicine Initiative, which is this endeavor in the United States to get a million people's DNA in a database for medical research, with plans to oversample for underrepresented groups, and to do a lot of work, admirably, around consent and repeat contact with people who are subjects in the study, giving them sort of feedback and the like.

    But they announce it in the East Room, and President Obama says something like, “Well, you know, if it's my genome, I like to think that that's mine,” you know, in his avuncular kind of delivery. And I think the next day there was a correction in the newspaper: if you go to the hospital and give tissue, it's not yours, actually. It belongs to the hospital. It does not belong to you.

    Keisha: Even the President of the United States, with all his staff and information resources—even he didn’t know that he wouldn’t own his own DNA if he went to the hospital! So how much are regular folks at a disadvantage?

    And what about the technologies that are emerging around us right now?

    Alondra: I think, if we can not mess it up, that generative AI is really cool.

    SOUND SHIFT

    I mean, these tools and these systems are potentially fun, fascinating, time-saving. To the extent that people are anticipating that generative AI or some of these other tools might do work we don't want to do, I'm interested and excited about that.

    I'm both interested, captivated, and daunted and worried about outer space innovation, about outer space policy. We're, like, throwing debris up there. There are no laws; anybody can pretty much shoot up a satellite as long as they have the capacity to do it. You can explore up there. I think the story of technology and science, if we're wise, is that the things that excite us always fill us with some trepidation as well, and our job, the job of policy and of imagination and of good governance, is to try to mitigate the trepidation.

    Keisha: Thank you again for your time this afternoon. It's been a delight.

    Alondra: Thank you for your work. I've so just been honored to be hailed by you and, um, really appreciated it, enjoyed the conversation.

    SOUND SHIFT: silence for closing reflection.

    Keisha: Talking to somebody like Dr. Nelson is helpful because it gives you perspective: with all the things that go wrong, are there things that maybe are going right?

    Annanda: My dad has this saying: always keep your wits about you. When he's anxious, it means one thing. When he's calm, it means another. But, you know, I think what's come up for me is that idea of what it means to right-size your wits… as it relates to AI.

    I think it would be foolish to not have any caution and to have full abandon to the technology. I think there is a level of trust that is required because I think what could come from this can be so beautiful, y’know.

    Keisha: I did get my matrilineal ancestry tested recently. The LDS church has a lot of civic records and so I've been poking around those. But for a long time, I was completely resistant for some of the same reasons you said. Um, who are these companies? What will they do with that information? What's the relationship to law enforcement?

    And then on the back end, do they actually have the spread of data to interpret it for me? Like, for me and my people?

    I'm really so grateful to be around at a time where, one, that thing is possible. It's possible to do that sort of genealogical study and make it generally accessible to ordinary people who don't know all about DNA and long time and migrations out of Central Africa and things like that. And to present it in a way that helps me feel like, oh, I now have a sense of my link to lineages 500 to 2,000 years ago, which before that test, I did not.

    Annanda: There are biomedical startups that are trying to make unique medicines specific to your DNA. Imagine how helpful that could be with a myriad of diseases. I want the Wakanda version of that.

    Keisha: And not the James Bond version.

    Annanda: Oh hell no! No.

    There are enough incredibly wealthy people that have shown you’re going to be alright if you do right by others. And we don't highlight that enough.

    SOUND: theme twinkles

    All too commonly, we think that in order to be wealthy, that wealth must come at the cost of the dignified living and thriving of other people. And that's just not true.

    Yes, that means there will also be a cap on how wealthy one could be, because how could anybody have a right-sized relationship with the world if 1%-9% of people are hoarding so much? Like, the math don’t math, right?

    Keisha: Right.

    Annanda: And our relationship with the world could be sustainable if folks who are wealthy honor that they’re part of a greater interdependent whole, and that when that whole thrives, they do too.

    Keisha: Yes.

    I'm struck by the fact that it is a choice, to go back to that question of magic and responsibility. It is a choice to choose the magic of life-giving ethics and to take in the perspectives of those who are different from you, to fill out your perspective on the world around you.

    This whole panic about emerging technologies requires that any of us, any individual of us, remember that we're not here by ourselves and that we depend on the perspectives of those who are different from us to have a grounded reality and to help us design a future that's worth living in.

    I end the episode, like, with a sense it's possible. If you take the Afrofuturist route, it's possible.

    SOUND: Closing theme starts

    CALL TO ACTION

    Annanda: We’re building community with listeners this season. So reach out and talk to us on Instagram—our handle is @moralrepairpodcast. Also catch us on X… formerly Twitter.

    We’d love to hear from you!

    Keisha: Follow the show on all major audio platforms—Apple, Spotify, Audible, RSS—wherever you like to listen to podcasts.

    And please help others find Moral Repair by sharing new episodes and leaving us a review.

    CREDITS

    Annanda: I’m Annanda Barclay.

    Keisha: And I’m Keisha McKenzie.

    Annanda: The Moral Repair: A Black Exploration of Tech team is us, Ari Daniel, Emmanuel Desarme, Rye Dorsey, Courtney Fleurantin, and Genevieve Sponsler. The executive producer of PRX Productions is Jocelyn Gonzales.

    Keisha: Original music is by Jim Cooper and Infomercial USA. And original cover art is by Randy Price. Our two-time Ambie-nominated podcast is part of the Big Questions Project at PRX Productions, which is supported by the John Templeton Foundation.

    [PRX SIGNATURE]

    SHOW NOTES

    Talk to us online: on Instagram (@moralrepairpodcast), on X (@moralrepair), and on LinkedIn

    The Social Text Afrofuturism issue: https://www.dukeupress.edu/afrofuturism-1

    About the Black Panthers’ clinics: https://www.blackpast.org/african-american-history/institutions-african-american-history/black-panther-partys-free-medical-clinics-1969-1975/

    “No Justice, No Health”: https://link.springer.com/article/10.1007/s12111-019-09450-w

    Nelson + Lander explain the AI Bill of Rights (WIRED) https://www.wired.com/story/opinion-bill-of-rights-artificial-intelligence/

    How many medical tech advances came from HIV-AIDS research: https://www.princeton.edu/~ota/disk2/1990/9026/902612.PDF

    Interaction Question

    What piece of emerging technology are you excited about right now?

 

Season 1

Algorithms: Follow the Purple Light

Season 1 | Episode 1
What do we do about recommendation algorithms? What ethical standards could we use to reshape technology? Hosts Annanda and Keisha talk to Stewart Noyce, a technologist who helped develop the internet, and Rev. Dr. Sakena Young-Scaggs, an Afrofuturist scholar and philosopher, to understand how we can all navigate recommendation algorithms in a life-giving way.

  • INTRO

    Keisha: One time I hopped onto TikTok and I think I was watching these gym or body movement videos, and then, like, before I looked up, it was an hour later and there was some live talking about the president being a reptilian imposter, and I'm just like, [laughs] how did we get from one to the other? I don’t even know.

    It actually made me wonder about, you know, not only how much time do these devices take from us, but also how much data is needed to create them in the first place?

    [title music starts, laughter]

    Annanda: Quite curious that the president is a reptile [laughs], and this is what, this is what we're fed. This is what we're fed.

    Annanda: I'm Annanda Barclay, death doula, moral injury researcher, and chaplain who’s concerned with a life well-lived and cherishes the relational stories people share.

    Keisha: And I'm Keisha McKenzie, technical communicator, strategist, and advocate.

    This is Moral Repair: A Black Exploration of Tech. A show where we explore the social and moral impacts of tech advances and share wisdom from Black culture for healing what tech has broken.

    Today on Moral Repair—what are we gonna do about algorithms?

    [title music ends]

    [upbeat hip hop instrumental music starts]

    Annanda: What algorithms recommend shapes our worldview. It impacts how we think and feel. It informs our politics, how we shop, how we perceive one another.

    Keisha: It's surprisingly like exposure, to share what's going on in your algorithm because you know that it's doing recommendations based on what you told it at some point or what it's assuming about you. When we share the Netflix password at home, like I, I have a whole separate profile for guests because first of all, you don't wanna see what they're seeing. You don't want them to see what you're seeing and you don't want what they're seeing to mess up what you're seeing.

    Annanda: Recommendation algorithms are so incredibly intimate and personal, even though it's a shared experience within the company, the platform, it's oddly both public and private.

    It's as though an algorithm is a pair of glasses that colors our view of the world. And that algorithm, it was created by a stranger for corporate interests to keep us watching and to sell us things. So yeah, what are we gonna do about algorithms?

    [upbeat instrumental music ends]

    SEGMENT A - STEWART

    [mysterious background instrumental]

    Stewart: I watched Instagram over the last six or nine months go from serving up my family and friends to serving up whatever it thought I needed to watch. I just watched a guy jumping off while he was skiing and juggling. It's the most amazing thing.

    [Background yelling: “YES, the flip juggle!”] [laughs]

    I did not have any interest in that at all when I woke up this morning.

    Keisha: Stewart Noyce helped create the internet—no joke.

    Stewart:

    [mysterious background instrumental]

    [In] 1992, our friend group had a guy who was the first webmaster at Sun. He had to explain to us what he was doing with HTML. And they were building the first browsers; they had some of the first websites and all of a sudden what we thought was just gonna be something that helped companies do a better job of making money now became something that everyone could interact with and everyone could immediately, uh, kind of create their own worldview.

    Stewart: So, when they had the Lillehammer Olympics, Sun went over there and started broadcasting the news using their web browser and so you could get access to the news faster and in a more open and creative way.

    CLIP AUDIO from IBM Olympics ad: 00:09-00:35

    Before they could start at the 1994 Winter Games… we started. Four years ago, IBM was here helping plan and design every event…. The Lillehammer Olympic Organizing committee chose IBM not just to integrate thousands of computers and systems but to actually help manage the Winter Games.

    Stewart: I think that was the moment I was like, we—I can’t believe, I think we've unleashed something.

    Annanda: Something that upended who was in control.

    Stewart: Being in the middle of the first push for the internet meant that we were taking people away from very centralized systems.

    Individuals, then, were empowered to create and to interact in new and interesting ways. So Twitter ended up being a massive change model for the world because it gave people a place where they could throw out their voice.

    Now let's step back though and say what's happened in response to all of that energy, right? Was that the people who wanted to continue to maintain their control over the social fabric, to make sure there's order in society, to make sure that we don't get too far over our skis in terms of chaos in the public forum, they wanna pull it back. And so surveillance came out as really a response to all of us being able to share our information.

    Annanda: Stewart’s a technologist with a lot of experience in what powers the internet, including recommendation algorithms—

    Keisha: which are based on that notion of surveillance.

    You can think of them as recipes for serving up content based on what the app or site that we're using thinks we wanna see—

    Annanda: —or what we've already chosen to tell it. So we feed it something, and then it keeps on suggesting the next thing, and the next thing.

    Stewart: The first step you would do in the recommendation algorithm is you can just take the last thing that somebody's watched and keep feeding that. It's like, did I watch a comedian for 60 seconds? Okay, well I'm gonna get that comedian for the next two months. That's not a capable algorithm. It's a trivial algorithm. It just gives me the last thing I saw. It gets smarter when it can take in more inputs and it builds from a base of experiences that I have and maybe brings in other factors that give them clues as to my interests. And those can be coming in from external sources. If you have multiple feeds of information that can come from different browser history that I've had, then you might be able to get a little bit more information about who I am and what I'm interested in: including more features, right, in the algorithm itself. So as opposed to having one or two things that you clue off of, this might have 10, 15, or 20.
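    [Transcript note: here is a minimal Python sketch of the two recommenders Stewart contrasts, a trivial "last thing you watched" loop versus a more capable scorer that weighs many interest signals. All item names, features, and weights below are invented for illustration; real systems are far more complex.]

    ```python
    # A minimal sketch of the two recommenders Stewart contrasts.
    # All item names, features, and weights are invented for illustration.

    def trivial_recommender(watch_history):
        """'Not a capable algorithm': just repeat the last thing watched."""
        return watch_history[-1] if watch_history else None

    def feature_recommender(user_features, candidates):
        """A more capable version: score each candidate against many
        signals about the user and recommend the best match."""
        def score(item):
            # Sum the user's interest weight for each feature the item carries.
            return sum(user_features.get(f, 0.0) for f in item["features"])
        return max(candidates, key=score)

    # Example: one user profile learned from many inputs (watch history,
    # browser history, connections), and a handful of candidate videos.
    user = {"comedy": 0.9, "skiing": 0.4, "juggling": 0.2}
    videos = [
        {"title": "Stand-up special", "features": ["comedy"]},
        {"title": "Ski flip while juggling", "features": ["skiing", "juggling"]},
    ]

    print(trivial_recommender(["Stand-up special"]))   # Stand-up special, again
    print(feature_recommender(user, videos)["title"])  # Stand-up special (0.9 > 0.6)
    ```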

    Annanda: One of the issues with recommendation algorithms is privacy. Corporation Big Brother is able to see what you watched and when. Think dating apps. Over time, they learn your preferences. You're able to chat privately, but it's always at least a three-way conversation: you, your potential cutie or cuties, and the corporation that designed the algorithm, which has forever access to our private conversations. Corporations with recommendation algorithms know us intimately. We pay to play, and we pay with our privacy.

    The internet was groundbreaking for freedom of speech. But also a new threat to national security. Pandora’s Box unleashed.

    Stewart: I was involved in the introduction of elliptic curve cryptography, which allowed people to have secure and private communications from your handheld computer into the back end of the internet, and to other people. All of that activity would be private and would accelerate communication between individuals. Well, you know, you can't have that, so we had to have backdoors put into all of the cryptography…

    Keisha: “Backdoor.” Quick definition here.

    [Background: swipe sound, double bass-heavy mysterious music, keystrokes, wicked giggle]

    Let’s say you want to use a social media app on your phone. You’ll come into the system through the front door: you type in your username and password and you’re in. But the system owner also makes a “backdoor” to that system. They and whoever else they let in can use that backdoor to access the system and its data, including yours, even if they don’t have your direct permission. Those terms and conditions we don’t read but swipe through? We’re giving implied permission there, to God knows who. OK, back to Stewart.

    [Background sounds end]
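    [Transcript note: for readers who want the front door/backdoor picture spelled out, here is a toy Python sketch of the idea. The usernames, passwords, and backdoor key are all invented; this depicts no real system.]

    ```python
    # Toy illustration of the front-door / backdoor analogy above.
    # Every name and key here is invented; no real system is depicted.

    USERS = {"keisha": "correct-horse-battery"}  # the front door: name + password
    BACKDOOR_KEY = "let-me-in-quietly"           # known only to the system owner

    def access_granted(username=None, password=None, key=None):
        # Front door: the documented login path users agree to.
        if username in USERS and USERS[username] == password:
            return True
        # Backdoor: a second path, usable by whoever holds the key,
        # with or without the account holder's direct permission.
        if key == BACKDOOR_KEY:
            return True
        return False

    print(access_granted("keisha", "correct-horse-battery"))  # True: front door
    print(access_granted(key="let-me-in-quietly"))            # True: backdoor
    print(access_granted("keisha", "wrong-password"))         # False
    ```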

    Stewart: And if anybody did not wanna play ball, then they had to leave the United States because this is where we were maintaining that control.

    Keisha: The government required those back doors? Or that was industry consensus that there would be back doors?

    Stewart: I wasn't in the room. All I know is that back doors have been inserted into all these algorithms.

    Stewart: And if you look at the systems that are in place like Signal or WhatsApp, or Telegram, these peer-to-peer messaging applications that are secure are secure to the extent that most people can't break in and hear your communication, but whoever's running that system can. So whatever sovereign entity or large corporate entity is running that model, they can hear everything that you know you have and they can use it, uh, for their own purposes.

    Keisha: Since 9/11, particularly, the US government has been empowered to suck up tons and tons of data, whether through the NSA and the intelligence community or through the information that we volunteer through our interaction with corporations.

    Stewart: Live in the light, then, because there's no, there's no privacy anymore, literally no privacy. And if you want privacy, then you should probably go and have a conversation with somebody at the beach and leave your phone in the car.

    Keisha: You were saying capable recommendations require several different kinds of information sources, so not just the stuff that we might create as an Instagram post, but also information from the mobile device that we're using to log into Instagram, and maybe our network of connections on Instagram, and maybe the metadata for how often we use Instagram, and all that information then gets compiled into this more capable algorithm. Is that right?

    Stewart: Yes that’s correct.

    Keisha: So like, if, as you're saying, what we're facing now is corporate control, decentralized control, and very little restraint cuz there's a profit motive—

    How are you imagining a path forward? What's the next step if corporations are not being restrained by a government and also government isn't being restrained by government, what's the opportunity here?

    Stewart: You mean our opportunity for survival?

    Keisha: Yes! [all laugh]

    Stewart: I would say that, yeah—

    Annanda: Yeah.

    Stewart: I thought I'd throw that out there. I would say that I'm very optimistic about the next generation and I think of them as generation AI or generation artificial intelligence.

    They're growing up in a very new kind of world, where these massive global systems that have been collecting information about them and all their friends and everybody else, now get turned against them and they just look at 'em and say, oh no, if we're gonna live here for the next like 60 years, we're not gonna put up with that.

    And so they're already changing.

    Keisha: Several sites built on recommendation algorithms have lost credibility and users over the past year. As we start to worry about privacy and bots, some of us will share less with each other. As people lose confidence in the communities that the internet has given us, trying out smaller platforms and being more wary of algorithms makes a lot of sense.

    But it might be too early to know how big a trend this is gonna be.

    Annanda: I wonder how this moment in tech is affecting those of us who want to feel known and safe with each other, not watched by corporations who make money out of what they think they know about us, and not pushed apart by algorithms we don’t control that tuck us into separate corners.

    The distributed web lets us find things and people we’d never have discovered on our own. And yet the recommendation algorithm can make it really hard to come back together.

    Disconnection and endless advertising are not at all what the internet promised us.

    Keisha: I came of age on the internet around 1998. I've seen people have really deep and meaningful human experiences, relationships, connection, self-discovery, all of that because of this distributed web. But then at the same time, in the last two decades, people have been crying out about a lack of local, in-person third spaces.

    So work has zoomed into massive, um, influence on your life. You have less time to do things and to build connections that are meaningful. And so I do see people getting off of X platform, whether that's TikTok or Facebook or Twitter or Instagram, and going to these smaller servers on Mastodon and Discord.

    It's like going from a walled garden that one guy in Silicon Valley controls—[Stewart laughs]—to a walled garden that somebody down the street controls, but at least you know the guy down the street and you can kind of work with them on the rules of the walled garden that you're now in. You're in somebody's patio instead of in somebody else's mansion.

    And so there's something around this more localized version of the internet that's pretty interesting. If we play it right, if those smaller spaces, our spaces, don't turn into places that corporations can weasel their way back into to advertise to us through content or to influence us in terms of our political thinking or our social thinking, then I won't worry.

    But if it's more that people isolate, I think that's a troubling trajectory.

    Stewart: If you want to make more contacts and more relationships that are real and deep, then as an individual, you should be using the internet and using social media and public spaces to present your talking points, goals, and interests, in such a way that those people who should connect with you find you and start having a relationship with you. But don't expect that “I represented myself” will immediately result in strong, interconnected relationships that develop value. You're going to have to work at those.

    Keisha: Our best shot, given the technology is here, is to try to use it to drive towards knowledge and wisdom. And part of that means that we get to use influence, whether it's influence and virality through the bots, or it's actually building on our human connections and persuading our neighbors to share the thing.

    What's the difference between that sort of spreading the message and the kind of spread-the-message that unnerves you about this technology? If everybody's having the same dream, that's because somebody spread a particular dream. But we want to spread a particular dream, like a particular dream about flourishing and a particular dream about life and community.

    Stewart: Every group that gets together and creates something new brings in the best of each individual to bear on the solution. We have a vision around something that is a flourishing human society that brings value to each of us. Each of us is worthy, each of us has value and should be able to contribute. And because we do that, the life becomes so much easier for all of us because we just have more instead of the scarcity.

    Keisha: Where I hear people worrying the most about algorithmic technologies is where they are used to break the group apart, to break apart the symphony, where they're used to atomize us and to increase suspicion. And even the question of whether we're able to tell what is true, because there's so much content and it's increasingly hard to distinguish between true and false, that feels fundamentally dangerous, because if we can't tell what is true, then how can we be in the same reality with each other?

    There has to be some sort of countermeasure to the profit motive that encourages that sort of true-false distinction to be washed away.

    Stewart: With those people who practice division and deception, that's what you'll get. And those of us who live in this world need to consider that acceleration, in every part of our life, and how fragile it makes our society, because we have to make decisions faster when we should be spending more time to think: is this true or is this false? How do we slow it down?

    [bass-heavy episode music starts]

    And that is why I believe that we will have more communities where we are better known and those communities where we are better known are more likely to push back on us when we do something foolish or say something that divides and deceives and that we will become better people because of the small community we're in.

    God wants what is best for us. That means each of us is valued and each of us should be valued. So for there to be one meme that is the controlling meme for everyone, I don't agree with that. I agree that each of us do have value, but we are working within a framework of something greater than ourselves.

    [midroll Break -- ads and promos go here]

    Keisha: This is Moral Repair: A Black Exploration of Tech. We’ve been talking about algorithms and their impact on society and us. And now we’re thinking about ways to heal what those algorithms have broken.

    Annanda: Technological innovation in the West has had a complicated impact on humanity over the last five centuries.

    [background: driving, curious music starts]

    Portuguese, Dutch, British, Spanish, and French enslavers used innovations in shipbuilding technology as their literal vehicle during the trans-Atlantic trade. Plantations also used state-of-the-art tech to process goods like cotton, rice, and sugar cane as well as the labor of enslaved people. That was the tech that made our world.

    And tech’s still shaping our world. Algorithms are used in judicial systems to support how judges sentence people charged with a crime.

    Many states also use recommendation algorithms to determine who gets access to food assistance and other social services.

    Keisha: According to research from Professor Virginia Eubanks, automation of the Supplemental Nutrition Assistance Program has overwhelmingly resulted in fewer people receiving life-saving social services.

    The poverty rate increase is so dramatic that she says these recommendation algorithms help to build “digital poorhouses”.

    In the colonial past, as today, Black people disproportionately experienced poverty and incarceration. And today, Black people are also significantly underrepresented as engineers or decision makers in the tech industry shaping this world.

    Annanda: A 2019 Harvard Business Review paper revealed that Black entrepreneurs receive just 1% of all venture capital funding, an amount that hasn’t really budged in the last several years.

    I went online just a month ago to see how many C-suite executives were Black in the top five tech companies, and as of this recording, I saw maybe one or two. And out of those, they were all doing DEI. For those of you who don’t know, DEI stands for diversity, equity, and inclusion. Meaning that the work they do doesn’t directly contribute to technological innovation; the work they do is diversity, equity, and inclusion work within the organizational structure.

    Keisha: That’s wild.

    [background music ends]

    Annanda: So with that, I'd like to introduce my mentor, dear friend, and scholar, Reverend Dr. Sakena Young-Scaggs.

    SEGMENT B - REV DR. SYS

    Rev. Dr. SYS: "Reverend Dr. Sakena Young-Scaggs" is a lot for most to say and definitely a lot for most to say on the go. So it just sometimes gets shortened to "Hey Rev SYS [Sis]," and that's fine.

    Keisha: Rev Dr SYS explores tech’s problems and possibilities through the lens of her field, Afrofuturism. She’s a pastor, campus dean at Stanford University, and philosopher who thinks a lot about how tech shapes us all, who’s often left out of the design room, what that costs us, and how we can still find joy.

    That’s where moral repair comes in. Something breaks down, not only in a kind of technology like algorithms, but also in our sense of what is right and how we interact with each other. Arizona State professor Margaret Urban Walker describes moral repair as the work of creating or stabilizing mutually agreed ways for people to be in relationship.

    And that means we have to look at all of the features of the social world and the technological world that we’ve inherited.

    Rev Dr SYS: One of the challenges in science fiction and futurist thought, historically, has been the erasure of race, and talking about the future without giving that consideration.

    That has changed over the last three decades. The beauty of that, was a term, Afrofuturism. It crosses genre.

    You take that philosophy of Afrofuturism and you imagine a future, you imagine a world in which we can shape and mold, but also reimagine a world that has caused harm and trauma systemically to Black and brown folk.

    Keisha: So these patterns of surveillance and erasure go way back. But so do some of the tools that monitored communities have used for resilience and repair.

    Annanda: Among Black enslaved communities, hush harbors were places our enslaved ancestors would go to “get out of the gaze” of their enslavers and an overseeing population—aka White people who were not directly their enslavers—to find privacy.

    They would participate in religious and spiritual practices, ceremonies of releasing stress, trauma, pain. They'd also celebrate in joy. They would laugh, they would cry, dance, and sing.

    They would exchange multiple forms of wisdom to keep on keeping on in conditions designed to reinforce their chattel condition. They found ways to be resilient in the moral injury of slavery.

    Keisha: Lack of privacy and self-determination isn't new in the human experience or in recent US history. But neither is imagination.

    [background: trumpets and whirly lounge jazz music, evoking the whistles of the original Star Trek series]

    Rev. Dr. SYS: Every course that I open, that I teach—science fiction, race and religion—I have everyone take out their cell phone, can't get away from it. And I say, you're holding in your hand the imagination of Gene Roddenberry. I say, in 1963 he imagined this technology called a tricorder, and it was a communicator. And it was able to do wonderful imagined things of not only communicating but medicine and identifying, uh, body metrics and, maybe even doing healing. And as Star Trek, of course, uh, took on “the next generation,” the tricorder did lots of things.

    Keisha: I really appreciate seeing the cell phone as the, almost the precedent of the tricorder. I watched Star Trek and my favorite captain for the majority of it was Picard. He would go into his office and then tell the computer to make him an Earl Grey, hot [AUDIO CLIP: Earl Grey Hot] or when anybody would say to a newcomer onto the ship, "Whatever you've got on your planet, the computer can make it for you."

    There's something about the knowledge that is embedded in that computer system in this series that mirrors all of the data that has been collected and infused into Google's machine and into the new forms of AI. I think people are wondering now about what it does to us as humans to have acres and acres of human knowledge and experience at our fingertips, ready to serve us exactly what it thinks we want.

    Rev. Dr. SYS: Very practical example: 3D printing is the manifestation of that science fiction, science happening in real time.

    My husband is a chess coach. He was teaching at a STEM school in Phoenix, and the students came to him and said, you know, we'd love to play some chess.

    And before they had a chess team, parents came and said, oh, we'll bring you some chess sets. But the school had a 3D printer. And they began to print out chess sets for these kids, in urban Phoenix, downtown, to, to play chess regularly. And so manifest on the tables were these 3D chess sets that were just printed up, um, like Earl Grey Hot.

    Some say that the Stanford air has imagination and innovation just in it, uh, because things just kind of get constantly innovated and imagined, and then manifested and come to life. That's part of that imagination.

    What do you do with it? People sometimes, yes, are afraid, and uncomfortable with imagined science that comes to life. But it's not the science or the imagined information. It's how it's applied and how it's used.

    Those come back to what I talk about in ethics and theology. It's how it's used and how it's applied with the human element, uh, not just the science in and of itself.

    We had this discussion, uh, two decades ago in the late ‘90s. When they broke the human genome, everybody got uncomfortable… because there were no controls and we had to put in ethical imperatives.

    And I think the same can be said about AI. It's not the science.

    It's how the human element interfaces with the science.

    Annanda: My research is centered in the moral distresses and injuries directly informed by the implementation of technological innovation. So I’m curious, from a Black Africana perspective, what moral wisdom, resilience, and self-determination looks like as it relates to recommendation algorithms.

    If you had a magic wand, what do you think would be a Black approach?

    Rev. Dr. SYS: Is this life-giving or death-dealing?

    Mercy Oduyoye, a womanist theologian, I always use her quote, how do we assess, how do we go about the world when we assess things? That has always been my litmus test, whether something is life giving or death dealing, and if there was a purple light that could assess whether something was life-giving, that would be what I would imagine.

    Because we need more life-giving things, we have a lot of death-dealing things that are imagined every day. And so what good—does this enhance the human experience? How does it shift, the barometer of hunger and healing and wellbeing and not status or rank or you know, how much we are gonna make in terms of the cost and capital of something that we're gonna just rack up and this is gonna just make us rich.

    No.

    What good does it do for humanity overall? Is it life giving or is it death dealing?

    Rev. Dr. SYS: Does it disconnect us? If it's a technology that disconnects us from our humanity, then I find it problematic.

    We know that we have experienced trauma and harm at the different hands of society, whether it be the state, whether it be the everyday citizen that just harbors intergenerational racism or hate.

    We have, as a people, I'm gonna talk from the Africana perspective, from the I perspective, experienced that, so we have a healthy suspicion of anyone keeping track of us, monitoring our bio data. Exactly.

    You have this cultural boundary, that is valid.

    Annanda: What role do you think Black Africana spiritualities and religious traditions have at this particular time of the fourth industrial revolution? You know, what wisdom and gift, um, can it provide at this time?

    Rev. Dr. SYS: Who do we take our problems to, our heartaches, but also who are speaking from our pulpits that we flock to on Sunday and then live it out Monday through Saturday?

    Africana spirituality includes the Black church, Christianity, but it also includes Black Muslims. It includes Black Buddhists, includes African traditional spiritualities. So the voices of our spiritual leaders, the Black church, all of those spiritual traditions, um, has always been part of community: sages and wisdom.

    You know, we can talk nicely about shamans and witch doctors as well. They've all been part of the spiritual landscape. The Yoruba Baba, the Iya, they've always been the ones who we go to, to seek wisdom—the Nana in the Akan tradition.

    So there is no way around in many ways, spiritual practice and spiritual leaders having a voice. Now, you know, whether someone is a humanist, they're still in some ways practicing spirituality. Because they're thinking about their human existence and having meaning.

    So all of those voices play a role in shaping decisions and shaping the way in which people have decided to interface or not.

    For me, leading with heart and head means making decisions that's wise intellectually, that takes the human capacity and the harm that it could do, or the help that it could do, it meaning technology.

    Keisha: Mm-hmm.

    Rev. Dr. SYS: And the, the joy that it could bring because we need more joy.

    Dr. Prathia Hall coined the term “Sankofa ethic,” and it means that you live out past, present, and future, taking those things into consideration.

    Keisha: From all the moral streams that you've referenced today, choosing paths that are life-giving for technology, thinking about the past and remaining anchored in it while stewarding the future, is there a particular frame that you wanna offer people today who are navigating almost infinite choices through the technologies they hold in their hands?

    Rev. Dr. SYS: Because there is so much to choose from, the marker has to be simple on how we decide. There is infinite—as we know now with AI, infinite combinations and possibilities of technology moving forward, and we have to wrestle and reckon with that.

    [title music starts]

    But whether something is life giving or death dealing, that's the bottom line, I believe, as an assessment. And there's the purple light: make a choice.

    OUTRO closing music

    Keisha: One of the things I loved most about Rev. Dr. SYS's conversation with us was the part where she talked about how Star Trek and music and other cultural media moments help her bring her students into the conversation about imagination, science fiction, and how science happening today is the imagination made real.

    Annanda: I appreciate her introducing herself and showing the importance of what does it mean for her to be, uh, you know, a Black African American woman who's concerned with how we be— with the phenomena of things— and that how we be matters cuz it's how we show up in the world.

    Keisha: As I'm looking at the way the algorithm is pushing me through different kinds of content, I’ve been thinking a lot about how I felt about the internet back in the ‘90s and early 2000s: hopeful about relationships, spending time in life-giving communities. I’d really love to have that back in my life again. But maybe it means I give less time to the algorithmic internet. We're more than what is manifest. We're more than what is obvious. There's something that's intangible and beautiful about humanity, and sometimes technology helps us, like, deepen into that and sometimes it distracts us from it. And that goes back to what Rev. Dr. SYS was saying, that there's a way that technology can enhance our humanity or distract from it.

    Annanda: Stewart reminded me this technology is not going anywhere, so knowing how to use it to your benefit in this capitalist system is important. With Rev. Dr. SYS, the takeaway I have is that if the outcome and the impact are misaligned with the good intentions for this technology, there’s accountability and responsibility for the real-world ways in which it impacts people’s lives.

    Stay strong out here folks! [laughs]

    Keisha: Okay! Yeah.

    Annanda: Y’know, make it work for you. In life giving ways.

    Keisha: That's right.

    CREDITS

    [bass-heavy episode music starts]

    Annanda: I’m Annanda Barclay

    Keisha: And I’m Keisha McKenzie.

    Annanda: The Moral Repair: A Black Exploration of Tech team is us, Ari Daniel, Emmanuel Desarme, Courtney Fleurantin, Rye Dorsey, and Genevieve Sponsler. The executive producer of PRX Productions is Jocelyn Gonzales.

    Keisha: This podcast is part of the Big Questions Project at PRX Productions, which is supported by the John Templeton Foundation.

    [episode music fades]

    [PRX SONIC ID x 2]

    SHOW NOTES

    Learn more about Stewart’s work in marketing and consulting at StewartNoyce.com

    See IBM promoting their work at the 1994 Winter Olympics in this vintage ad.

    How do algorithms drive social inequality? Virginia Eubanks explains in Automating Inequality: How High-Tech Tools Profile, Police and Punish the Poor (St. Martin's Press)

    What’s Afrofuturism all about? Read Afrofuturism: The World of Black Sci-Fi and Fantasy Culture by Ytasha L. Womack (Lawrence Hill Books)

    Learn more about Black entrepreneurs receiving 1% of all venture capital: “Sources of Capital for Black Entrepreneurs,” Harvard Business Review, 2019, by Steven S. Rogers, Stanly Onuoha, and Kayin Barclay

    Explore more on “life-giving and death-dealing” from African feminist theologian Mercy Oduyoye: Beads & Strands: Reflections of an African Woman on Christianity in Africa (Theology in Africa), Orbis Books, 2013

 

Holograms

Season 1 | Episode 2

Have you ever considered the moral dilemma of a hologram narrating someone's life story for them? Dive into the latest episode of Moral Repair as we interview Otis Moss III about the interactive AI hologram of his father, civil rights leader Otis Moss Jr., showcased at the Maltz Museum in Ohio. Discover with Zuogwi Earl Reeves how the Black wisdom of hip-hop plays a pivotal role in the moral repair of narrating our tales authentically. Tune into the episode and embark on this thought-provoking journey with us.

  • MORAL REPAIR - EPISODE 2: Holograms (& How We Remember)

    Big Question: Can your hologram replace you?

    INTRODUCTION

    Keisha: Hey Annanda.

    Annanda: Hey Keisha.

    Keisha: Listen to this clip I found on YouTube. It’s from the Maltz Museum of Jewish Heritage near Cleveland, Ohio.

    Audio from Maltz Museum’s promo video (00:19-00:59): (Maltz narrator) “Using state-of-the-art technology, Rev. Moss was filmed over the course of several days, answering thousands of questions about his life growing up in Georgia, being orphaned at the age of 16, and continuing on to earn a Master of Divinity from Morehouse College, to become one of the foremost thought leaders on civil rights.”

(Maltz staff to hologram) “What’s your story?” 

(Moss hologram) “I am Otis Moss, Jr, an eternal… fighter… for justice… human rights… civil rights, and the right to be… a human being.”

    Keisha: “The right to be a human being.”

    But guess what. This isn’t a human being talking.

    Annanda: It’s not actually Otis Moss, Jr.

    Keisha: Nope. It’s a hologram trained with AI to sound and interact like him.

    And he’s not alone.

    Earlier this year, therapist Esther Perel learned that someone she didn’t know had made an AI bot of her.

    They’d trained the bot on her voice and her stories, counseling sessions, and speeches, and then used AI to make it responsive… all without her knowledge or consent.

    It sounded like her. It answered questions like her. But it wasn’t her. Here she is at South by Southwest.

    Audio from Esther Perel’s SXSW talk, The Other AI: Artificial Intimacy (01:09-01:18): As I’m getting older, I’ve been thinking a lot about mortality. But I didn’t imagine immortality to appear in the form of a machine…

    Audio from Esther Perel’s SXSW talk, The Other AI: Artificial Intimacy (01:31-01:54): Part of me was flattered. Part of me felt plagiarized. Another part of me was deeply concerned about the clinical and ethical implications. Mostly, however, I was fascinated. This “AI Esther”—is she better than me? [SXSW audience laughs]

    << Music >>

    Annanda: I’m Annanda Barclay.

    Keisha: And I’m Keisha McKenzie.

    Annanda: This is Moral Repair: A Black Exploration of Tech. A show where we explore the social and moral impacts of technology… and share wisdom from Black culture for healing what that tech has broken.

    Today on Moral Repair — can your hologram replace you?

    << Music >>

    Let’s return to Rev. Otis Moss Jr — the civil rights leader — and the exhibit in Cleveland that turned him into a hologram. First off, he and his family authorized everything.

    OM3: The Maltz Museum approached my father.

    Keisha: This is Rev. Otis Moss Jr’s son — Rev. Dr. Otis Moss the Third.

    He’s a multi-hyphenate—author, filmmaker, mentor to young ministers, community leader, and senior pastor at Trinity United Church of Christ in Chicago. I know him as OM3, or Pastor Moss. And he knows exactly how his father’s hologram exhibit came to be.

    SEGMENT 1 - Rev. Dr. Otis Moss III

    OM3: The Maltz Museum is based in Cleveland. It is the primary museum and repository in reference to Jewish history and also specifically the Holocaust. And they were partnering with StoryFile.

    Keisha: StoryFile is a software company that makes AI-based conversational videos.

    OM3: StoryFile had already done several of these holograms with very unique and interesting people. Some were astronauts, some were scientists, some were writers.

    And they had been doing the work, of recording, uh, the experiences of Holocaust survivors so that another generation would be able to engage with them.

    But they've expanded beyond that, looking at issues of the tragic and triumph of the human spirit among people across the board, from Indigenous to people of African descent, uh, to people who are Latinx and so on.

    And my father, you know, is quite an individual in the city of Cleveland as an icon around civil rights. And they asked him, could we record you?

    He had no idea it was a hologram. He's just thinking they're gonna bring in a camera, and record him. But then they explained that this is going to be an AI-generated hologram that will learn over time to engage people based upon the questions that they ask.

    When StoryFile presented the other people that they had recorded and the purpose, that it was an educational purpose, that it was an ancestral purpose, he was immediately intrigued and reached out, which is, it’s so funny — My father has this wonderful, deep resonant voice. And he calls, he says, “Otis.”

    I was like, “Yeah, Pop. What's going on?” “Um, I've been asked… to be recorded… as a hologram. What do you think?” 

*laughs* So we had this conversation about holograms, which was fantastic.

    So it was a three-day recording event, which was, uh, fascinating in and of itself. In terms of what he had to wear, how he had to sit, and all of that in order to get the right posture to make sure that his voice was recorded appropriately.

    Keisha: You've picked up a couple of threads that I'd love us to touch on, one being the function of multi-generational storytelling.

    You said, for the museum, it was really important for them to help capture the stories of Holocaust survivors. And then with regard to your father, somebody who was part of the civil rights movement.

    You're a storyteller yourself too.

    OM3: Story is one of the most important things in terms of the human endeavor. Story is how we live, it's how we function. If you live out the right story, you literally can change your life and change the world. If you live out someone else's story, you have a ceiling on what you can do. And that ceiling is based on someone else determining what your possibilities are.

    I come from a tradition and a family rich with the telling of story and the importance of story. Both of my parents are Southern. And there are no storytellers, in America, quite like Southern storytellers.

    Keisha: Mm.

    OM3: There’s something special about it. Before there was TV, you, you sat on the front porch, you heard a story. When families get together, you listen to a story. The myths, the "big lies," as Zora Neale Hurston likes to say, and the deep truths that are within these stories that sometimes have a lot of hyperbole. Every faith tradition is a story tradition. And by operating out of the appropriate story, it can be transformative or it can be reductive and destructive. What I appreciate that StoryFile did is they allowed my father to articulate his story, to just go for it, which was very similar to the manner in which he teaches and preaches: There's going to be a story, there's gonna be a reference to, one, African-American history or African history; two, there will be a story that connects his direct experience in the American experiment.

    And StoryFile gave the space to do that and allowed him to, to, you know, to talk and talk and talk and talk and talk, versus trying to truly edit his story so that it would be palatable. Because his story is a story of great triumph and great pain and you can't separate the two.

    He mixes those two things together with the story of what it was like growing up in rural Georgia in the 1940s and the 1930s. Having four siblings, a father who made a decision not to get remarried after his wife passed away as a result of medical apartheid.

    That story connected with his eventual acceptance to Morehouse College and connecting with Benjamin Elijah Mays and the StoryFile people allowed him to share the reality of what it meant to be a person of African descent who was a visible, invisible human being in the South.

    Keisha (Narration): Dating back to the 1700s, European and American courts, circuses, science fairs, medical schools, and museums began collecting and exhibiting Indigenous and African people and their remains. Several universities and public museums still store Indigenous and Black remains today, usually without the consent of the families or the nations they were taken from.

    Keisha: You're aligned with the project on the level of story and cultural transmission. Did you have any concerns about the technology itself? Black culture has had a mixed legacy of being engaged with museums.

    OM3: The difference in this is that he had personal relationships with people who were involved with the project. So the individual who gave the initial money, uh, for the project he had worked with before back in the day, probably protests and everything else. So they had relationships.

    Keisha: Pastor Moss says there are a few things that made this project really sing: how the staff was educated to work with his father and the civil rights generation… their sensitivity to the content of their lives… and what it might evoke for people interacting with those histories and experiences in such a vivid way.

    Otis Moss Jr and his wife, Edwina, visited the Maltz Museum when the hologram of Rev. Moss first opened. On their way through, they walked through an exhibit of nine photographers who had chronicled the civil rights movement.

    OM3: They had these amazing photos many people have seen. My mom stops and says, oh, this picture of Coretta—you know, I'm over on the right side of the photographer when this was taken. This was Coretta Scott King. You see her mourning because she had gotten news that Dr. King had been assassinated. So it's the first time I'd ever heard this, that she was in the room in that moment. And she's just naming all the people who were in the room.

    So we're just absolutely blown away. And it brought back memories for all of them.

    But the one that was really moving was when they stopped at the picture of the open casket of Dr. King. My father paused and my mother paused. And they stated that “We were never given the opportunity to mourn our friend.”

    Annanda: Mm.

    OM3: And they stood there for like 15 minutes. Because they had to go back to work. There was a short mourning period, but they just had to get back to work and let other people mourn.

    Keisha: What if holograms had existed when Dr. King was alive? Could they have helped people and communities remember their loved ones and mourn more fully in that moment, not just sixty years later?

    OM3: Imagine if the community and the family of Medgar Evers had that.

    Keisha: Medgar Evers was a voting rights and desegregation activist from Mississippi. He was assassinated by a White supremacist in 1963. That whole decade, people like Evers were murdered… people who devoted their lives to practicing freedom and fighting segregation.

    OM3: That family did not have the time to mourn in an appropriate way because they were essentially in combat. They were, they were at war. It’s like we do the rites, but we gotta get back to work and let other people mourn.

    Keisha: One of my biggest questions for Pastor Moss is whether the holographic representation of his dad felt authentic.

    OM3: It—it does. They use multiple cameras to film. So you’re getting a video essentially of my father. But then the AI system is drawing on your questions.

    So the AI system is trying to figure out, based upon the file that we already have, where should we go in terms of an answer. You know, "Did you have a dog when you were a kid?" "Oh yeah, we did."

    But if it's something that the system does not have information on, there's a particular answer that it gives for that. You know, sometimes people ask just, like, you know, an out-the-box question: do you put ketchup on a hot dog? Something like that. And, and the system says, "Excuse me?"

    Annanda: Yes. Oh my gosh. Yes.

    OM3: It was really great when, when some of the little kids were asking these, you know, kind of out-the-box kid questions.

    Keisha: Mm-hmm.

    OM3: And some of them were, “Excuse me?” And some of them the system would learn “Excuse me?” with a smile and a little laughter: “I'm not familiar with that.”

    So it felt authentic because it was authentically his voice. I was familiar with the moments that were recorded. It was the system doing the curation, versus a reconstruction of my father physically.
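    [Transcript note: as a rough sketch of the retrieval OM3 describes, imagine a bank of pre-recorded clips matched to a visitor's question, with a fallback clip when nothing fits well. This Python sketch is an illustration under our own assumptions, not StoryFile's actual system; the clip names and threshold are invented.]

    ```python
    # Rough sketch: match a visitor's question to a bank of pre-recorded
    # clips, or fall back to the "Excuse me?" clip. Not StoryFile's code.

    RECORDED_CLIPS = {
        "did you have a dog when you were a kid": "clip_childhood_dog.mp4",
        "what's your story": "clip_life_story.mp4",
        "what was it like growing up in georgia": "clip_georgia.mp4",
    }
    FALLBACK_CLIP = "clip_excuse_me.mp4"  # played when no recorded answer fits

    def pick_clip(question, threshold=0.5):
        """Return the clip whose recorded prompt shares the most words with
        the question, or the fallback if the overlap is too weak."""
        q_words = set(question.lower().strip("?!. ").split())
        def overlap(prompt):
            p_words = set(prompt.split())
            return len(q_words & p_words) / len(p_words)
        best = max(RECORDED_CLIPS, key=overlap)
        return RECORDED_CLIPS[best] if overlap(best) >= threshold else FALLBACK_CLIP

    print(pick_clip("Did you have a dog when you were a kid?"))  # clip_childhood_dog.mp4
    print(pick_clip("Do you put ketchup on a hot dog?"))         # clip_excuse_me.mp4
    ```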

    Annanda: How is it that you hope people engage with your father’s hologram?

    OM3: I hope that they will have a new picture of people of African descent. And I mention that because the gatekeepers of media, and even on social media, these stories are not held. They're not elevated.

    These are stories that are deeply rooted in the human spirit and rooted in the spirit of God moving in magnificent and sacred, serendipitous ways, and I hope that another generation would raise a question:

    How could a people denied everything in this country create space and institutions that rewrite the history of the country? How do they do that?

    Keisha: A few possible answers… Resilience. Telling stories to stay connected. Leaning into community when folks cheat or steal. And being savvy about questions of dignity… and ownership.

    Keisha: How do you understand ownership when it comes to stories? There is a way in which stories are cultural property, or collective property.

    OM3: Mm-hmm.

    Keisha: But we also are in a society that has intellectual property law—You can privatize that ownership. Did you have conversations with the museum about who owns—

    OM3: Yes.

    Keisha: —the stories and your father's image and so forth? Talk to us about that?

    OM3: Yes. I'm glad you raised that. Yes. That was very clear that we, the Moss family, Otis Moss Jr., my father, it is his story. It is his image, and they are only leasing, per se, that image. And they do not have ownership.

    We, as the family at any time could say, take it down. We don't want you to use us anymore. This is his intellectual property. That's very, that's very important. And Black culture has been appropriated so often, um, purchased and then saying, we own it.

    Keisha: Yep.

    OM3: When in many ways we, we should lease, but not let anybody own. So you want to use my beats? Sure. Lease them. You wanna use my estate? Sure. You're more than welcome to, but the, the ownership comes back to me. And we've seen that in music. We've seen that in film and art and others in books and things of that nature over and over and over again.

    But there's some things that you can't, you know, even if you attempt to say that you own it, there's no way that you can. It's like Emeril Lagasse trying to say he owns gumbo. It don't work. *laughs*

    Keisha: We came across statements from Chance the Rapper and Sheila E talking about Prince. Prince had very, very rugged ideas about whether holograms should be made of his work. Even Justin Timberlake—

    OM3: Oh Lord.

    Annanda: God bless.

    Keisha: —introducing holograms at the Super Bowl or pretending that he would.

    What’s your perspective on the use of post-mortem holograms?

    OM3: In terms of educational purposes, I'm fine with it.

    Keisha: Mm-hmm.

    OM3: Uh, to create spaces where people can interact, I think there's an appropriate place for museums to do this work.

    In these other spaces? These are not for educational purposes. These are for purely profit purposes, and it goes back to what the company owns.

    So, in other words, we're doing Henrietta Lacks with holograms.

    Keisha: Mmmmm.

    OM3: If the company was really about “We wanna save lives,” then the family also would've retained the rights to her own doggone DNA. And that’s what’s happening with the MJs and the Tupacs and all these other things: somebody’s figured out “I can make a buck.”

    They're not trying to educate on Tupac's trajectory and his connection to the Panthers and his experience in the School of the Arts in Baltimore. They're not trying to tell us about his fascination with the seventies Black poets. They're not interested.

    Educational purposes put things in public domain, not for profit.

    So the spirituals are in public domain.

    Keisha: Mm-hmm.

    OM3: And they should be ta—

    But I don’t want a hologram… of a bunch of Black folk… from 1842 and go to Disney—

    Keisha: Singing “Deep River.”

    OM3: —Singing “Deep River,” doggonit. I don’t want a hologram of Paul Robeson. You need to read about him.

    Keisha: Oh Lord.

    OM3: The only place it needs to be is if I'm going to the Museum of African-American Culture, to the Museum of—in Nashville—on African-American Music. Uh, yeah. That's fine.

    But I do not want to go “Tickets available. $35. Sponsored by Coke.”

    Keisha: Come on.

    OM3: I don’t. I don't want the hologram like Henrietta Lacks to become like the NAACP programs have many times become. Where they're sponsored by the same people who are selling certain things to our neighborhoods that are destructive.

    Keisha: That's real.

    Keisha: Sidebar. Just this summer, Thermo Fisher Scientific — one of the tech companies that profited from using Henrietta Lacks’ body without consent — finally settled with her family—over 70 years later. That’s how long it’s taken ethics to catch up with medical tech.

    Now, hologram tech isn’t good enough to be confused with real people. Yet. But it’s reasonable to expect it’ll improve, and fast. As it does, how could community relationships and knowledge protect us from being misled, plagiarized, or preyed on for profit like Lacks was?

    Keisha: You put a lot of content out there yourself: sermons, lectures, talks, interviews, films. Because of the density of that content available on the internet, we have deep fakes, deep fakes being very, very credible imitations of the real.

    How concerned are you on a scale of 1-10 about deep fakes drawing on the content you've already put out there and applying it to purposes you don't agree with?

    OM3: At this point, six to a seven. Within the Black church tradition, deep fakes have been around for a long time. Preachers have been preaching other people's stuff and telling a story. "I once heard"—it's a deep fake. So we have just given an algorithm to a cultural practice among human beings.

    And that's why Zora Neale Hurston talks about how, when you are of a culture that knows story, you can make the distinction between the mythos and that which is the truth, and the bigger T “Truth” that’s embedded within it. So you then become an automatic editor.

    OM3: And you know the wink that's happening. Soon as someone says, “I heard it said,” it's like “He got that from somebody.”

    Annanda: It’s the “If you know, you know” culture. To that wink and the nod, if you are in community, Black community is efficient. It has an efficiency to it—that is designed to deal with the road bumps, the blockades that capitalism, specifically targeting Black bodies, deals out. I really appreciate you naming that because that is how we operate and move through the world.

    And hearing Keisha’s questions and your responses on how we do that with holograms, how we incorporate this technology into a Black sensibility and way of being is really fundamentally important and educational, for ourselves, for our audience.

    OM3: What scares me about technology is the profit motive. More than anything else.

    Plus, the inherent human frailty. So profit plus human frailty, historically, has meant tragedy for someone.

    And where are the spaces that allow conversations of public good and not for quarterly profit? And we’re in this age—I struggle with that because AI in itself is held in the hands of a few companies. That terrifies me more than the AI, because if it's a profit motive, it's not for public good.

    Keisha: That's right.

    OM3: Instead of raising the question “Should we do this?”, they’re asking “Can we do this?” And those are two fundamentally different questions. And they come from two different moral places.

    Keisha: That's right. That's right. One is a pragmatic question and the other is an ethical question.

    OM3: Mm-hmm.

    Keisha: A moral question.

    OM3: Yeah.

    Keisha: Did you ever watch the Netflix series Love, Death, and Robots?

    OM3: Yes. *laughs*

    Keisha: There's an episode in there about what happens after humanity is gone. If that happens to us, your dad's hologram at the Maltz Museum might be one of the last sentinels on earth. What do you think that hologram would be saying to the universe?

    OM3: Hmm. Ooh. What a question. It would be very clear that there lived a people and a person that deeply believed in love and the power of love, and the power of faith, and the power of a family and of story. Beings from another space and time, another dimension in the universe would marvel at a little community that produced these magnificent human beings who had a deep commitment to the idea of love.

    Annanda: Thank you so much, Pastor Moss.

    [BREAK]

    Annanda: There's a long history of attacks on Black ownership that continues to this day, which makes the family ownership of Reverend Moss's hologram all the more unique and important.

    And on this podcast, Keisha and I easily celebrate that ownership. But I'm skeptical that the ethical standards afforded the Moss family would be applied to us humble common folk. And as I record this episode, the actors’ and writers’ guilds have been fighting and negotiating, um, protesting this very issue of giving up the rights to their image and likeness, because AI is primed and ready to use the content created by writers and actors to generate entertainment for the public in perpetuity.

    And the situation is understandably terrifying, but it's also not surprising when it comes to American business practices, right? Like, our Black ancestors’ enslavement was considered a practical necessity for ruling-class elites, and the political and economic ideas of liberty and freedom for some could only be achieved through the super-exploitation of others. This is as American as American… or rather, apple pie.

    Keisha: Last century, the pattern of claiming ownership over Black bodies and lives repeated during the eras of Jim Crow and the Great Migration, Civil Rights, and Black Power. In this century, the pattern repeated again, inspiring the Movement for Black Lives.

    Annanda: And now, in this age of AI, history repeats itself. Again. And holograms could be another way that pattern of claiming ownership over Black bodies repeats itself. What does the past have to teach us about the present?

    SEGMENT 2 - Elder Zuogwi Reeves

    Zuogwi: The use of recreating someone's body, especially after they've passed, given what America has done with Black bodies historically. I think Black people really need to really start thinking about how precious their body is, number one for themselves, and how precious the Black body is within the American construct of making money.

    I think it's just a complex conversation about technology. And how we use it, within our own experiences.

    Annanda: This is Zuogwi Earl Reeves, a mentee of Otis Moss Jr. and Otis Moss III. He’s a Howard University-trained public theologian. And his work is centered in the storytelling culture of Hip-Hop as a Black indigenous tradition.

    At first glance, it might seem odd to consider Hip-Hop a resource for hologram moral repair. But it’s been a dominant way Black people have challenged and managed the moral crisis of our narratives being told, owned and/or controlled by others.

    Zuogwi: Black people have spoke themselves into the future since we arrived on these shores.

    When I think of holograms, I think of African Americans preserving African indigenous religions from West Africa, and how over time you start to see an amalgamation of both Christianity, Islam, and Indigenous. To preserve some kind of resilience in which our language spoke futuristic. Spoke. In holograms. We spoke in holograms before we could actually see a hologram. “I'm going to go see Jesus.” “Swing low chariot, why don't why don't you let me ride.”

    Look where, what? What chariot are we talking? We out here in the plantation. What's, what's happening here?

    Annanda Narration: Holograms have a history in Hip-Hop and Black music culture. From Tupac’s hologram at Coachella to Chance the Rapper saying he doesn’t want to become a hologram. {play from 0:26-0:28 capturing his iconic ahhs}.

    “Speaking in holograms,” as Zuogwi put it, is a practice that’s evolved in the Black imagination over time. It’s a way to project power and wholeness in the face of the repeating unjust patterns of history. One arena where this is clearly visible is hip-hop and rap music. Where did hip-hop even start?

    Zuogwi: When you look at hip-hop, rap, even trap music, there is this ability to tell exactly what is going on within the community. And I think the resistance and the celebratory pieces of that are that, through hip-hop, you can hear a warning about what's happening in the community, what's happening in the projects.

    When I think about holograms, I'm always thinking about, yes, we have the technology to show this person, but our minds have created spaces for us to go.

    When we think about hip-hop, it's in its 50th year, we have to look at where we first see it emerge, which is of course in New York City. You have to deal with the history of the major divestments within New York City education, and how it's repeating itself right now. Right?

    The emergence of Black people needing a space and creating their own economic resilience to this catastrophic experience. The over-policing within those communities and also the divergence of other oppressed communities coming together to create these brush harbor experiences within the hip-hop community. We start to see the theories of Black power, even some of the theories of King, start to evolve within these destitute places, these roses that grew from these project concretes all across the United States.

    When I think of sacred memory with the artist, I also think about the first time I've heard a song somewhere. And it takes me back to the memory of what I felt as a kid hearing this versus what I'm feeling now. It could bridge a lot of emotions with that memory.

    Annanda: When Zuogwi said the phrase “sacred memory,” it reminded me of the moment Pastor Moss told us about, when the hologram and photo exhibit helped his parents remember their slain friend Rev. Dr. Martin Luther King, Jr. Sacred memory lets us look at what’s happening right now, and also think back to where we have been—to feel the contradictions, to be deeply connected in them. So we asked Zuogwi to share a moment when he’d experienced sacred memory himself.

    Zuogwi: A member of my home church passed away. And she had a nephew that me and him grew up with.

    We were already in the space in which we had grown up and learned so much. When I heard "God Is" played on a pipe organ, it took me back to a time when me and her nephew were in the balcony of the church playing around. We dropped a program from—this balcony's kind of high—onto whoever was sitting below us.

    We didn't care. We were 10, 11 years old. And it was a point in time where, my parents were going through a divorce, and we were sitting in a funeral. That memory right there brought a mixed emotion.

    You ever watched that movie with the girl, "Inside Out", and how you can experience joy and sorrow within the same emotion in the development of growing up?

    It was literally that moment.

    That's what I believe sacred memory is: the ability to understand historically what we, as a people were going through, what you're individually going through, but then also being able to smile and look at that moment. And also cry at the same time cuz look where we are right now. Look what's happening right now. But look where we have come.

    Keisha: As Zuogwi said, Black people made holograms through imagination and vivid word pictures, long before there were technological holograms made with photography and video.

    It makes sense that hip-hop culture, an art form rooted in wordplay and vivid pictures, would also become a space in conversation with hologram technology. Holograms in hip-hop have two sides: to help community members remember their dead and to help the industry that keeps artists and creators producing.

    Annanda: What's the meaning of holograms in Black hip-hop culture? And where does that bring resilience?

    Zuogwi: We should be in the 11th anniversary of seeing the first hologram at Coachella, which celebrated the patron saint, Tupac Amaru Shakur. Can I tell you how excited I was to see it on YouTube, because I could not afford a Coachella ticket in 2012? I think that experience of seeing Tupac in a digital form was quite amazing to me because we lost him so tragically.

    And so to see this experience of celebrating a very radical artist who centered African Americans, centered the struggles of Black people, centered the struggles of Black women, though being chastised by the US government or whoever. The excitement of seeing someone celebrated that way in a community, engaging that idea.

    I think I remember seeing an interview with Snoop Dogg and he talked about how emotional it felt just to be standing beside him after not seeing him for almost, I'm guessing that was around 15 or 20 years. {insert this clip https://youtu.be/TVzbapajkbI?si=4iVLXf4VA9CPwFmY&t=22 starting at 0:22-0:36} The idea, from a hip-hop perspective, of holograms started historically for me at that point.

    I think media and how music portrays, how different voices speak to you, are all forms of holograms, in my opinion.

    There's verbal, there's visual, there's also this intrinsic spirit that you can feel as well. And so when those all meet together at that Coachella moment, we don't know how much money it took to do that kind of, let's say, conjuring. This kind of conjuring only happens when there's some kind of capital involved for someone to make money, whether it be the family or the concert.

    But it's always amazing to see how imagination jumps from just our brains into reality.

    Zuogwi: Chance is one of the special artists that I love because, you can tell the maturity and also you can hear the development of oneself over time, right?

    Annanda: Chance the Rapper is an artist who has explicitly stated that he does not wish to be made a hologram after he dies. Hip-hop offers us the possibility of moral repair — by empowering us to tell our own stories and own our own bodies. That’s very different from the moral distress and despair we feel when our narratives are told and sold by others.

    Zuogwi: If you were to just listen to Chance the Rapper at the time of, let's say, Acid Rap, you have this reality that, wait a minute, Chance the Rapper was dealing with a major drug addiction. You know, he came out with this album 10 Day, written during a high school suspension, which is also a classic within itself as well. And you've gone from just being a senior in high school to experiencing all these different things, and then you also have this addiction.

    And sometimes in dominant culture's idea of a hologram being preserved that way, it might only preserve the dominant culture narrative of that one person, right? But there's an article where he talks about how right before his grandmother passed away, she looks at him and attempts to banish the spirit of addiction from him.

    That I found very intriguing. And he thought that she was banishing the spirit of his artistry. No, his grandmother was performing as a healer. Sometimes it takes a moment to see that this person is plagued with these things because of the exposure and needs a resetting. And that's what his grandmother did. And I think that's why he says he doesn't want to be a hologram, which is okay, but we can always tap back into where are you at the moment when you heard, “what's good gangsters, what's good people?”

    “I wonder if I call God, will he answer?” “Jesus’ Black life ain't matter” Who, like—

    Annanda: “I talked to his daddy.”

    Zuogwi: Yes, “I talked to his daddy”. Yeah. Yes. “I wonder if I call on God will he answer?”, Black artistry is our hologram.

    That delicate dance we do of secular and sacred, preserving our indigenous traditions.

    Annanda: Zuogwi values storytelling mediums that allow us to live out our stories on our own terms.

    Zuogwi: I really want us to think that when we pick up books, these are holograms of stories that people have preserved so that you can know about this, right? Or this whole idea of quotes: listening to someone's voice because your mind can take you back to a time when you've seen them in the flesh, you know?

    Annanda: Is there anyone's recorded voice you listen to, to take you back to a time you saw them live, in person?

    Zuogwi: In 2015, my father moved back to West Africa, to Liberia, and that was during the height of the Ebola crisis.

    And subsequently, my father passed away, because he was not able to return to the States and get some of his medicine for diabetes. At the height of grief, dealing with the funeral logistics, there was one day where I just decided to go through my voicemails, to be like, do I have my father's voicemail anywhere?

    And this is like June or July. He had called me from Africa and he leaves this voicemail. Thank goodness for technology: I was able to save that voicemail and to preserve that voicemail to today. I can go into my voice recordings and, well, I've edited it down to the part where I can just hear “I love you,” you know? Mm. And the importance of that: as much strife as we had as father and son, and you know, Black men and their dads, they always have this little rift sometimes. So for that to just be there, that I could tap into, is great. Or also being able to tap into a recording of my dad singing a Teddy Pendergrass song at one of his friends' houses, or being able to preserve some of the voices of the elders. I am grateful for the opportunity to preserve stories.

    And not just to base it off of my own brain's capability, but also to have technology to say, well, maybe I want to use this part of this story, so let me just listen to what this person said. So, yes, there are ways that we could use technology to continue on the legacy of resistance.

    As long as it doesn't get expensive to save space on the cloud.

    Keisha: Memory is kind of like time travel for you.

    Zuogwi: Memory -- it's not even kind of. I think it is time travel for me.

    Annanda: And also time travel as emotion, right? Like you are literally put back into a different place and time, but also the feelings that came with it. You are literally embodying that moment captured in time. In the memory of your body too.

    I think that's so powerful and often not talked about, but that is a Black sense of memory. Of time travel.

    Annanda: Would you preserve yourself as a hologram?

    Zuogwi: No, I wouldn't preserve myself in dominant culture's hologram experience, but as in leaving the bread crumbs so that more theory, more ideas, more spaces of freedom can occur from what I write or what I say, yes, that's the kind of hologram preservation I want.

    Annanda: Appreciate you, brother. Appreciate you.

    Zuogwi: Thank you.

    [Closing music starts]

    Keisha: I'm still taking as a threat the idea Pastor Moss shared, that Disney or Coca-Cola or some other corporation could one day sponsor a hologram of sharecroppers or Black elders singing “Deep River” in the field. I just, I find that obnoxious. Um, last year, last June, I went to a plantation in South Carolina.

    It felt like it was trivializing Black memory for the sake of a tidy tourist experience.

    Annanda: Yeah, that trip was horrific. One, I don't put it past Coke or anybody for that matter. These streets are not safe. There's one thing America has taught us: to sleep with one eye open, right? Mm hmm.

    But oh my gosh, that plantation experience haunted me, haunted me. I remember when we asked, could we see the graves of those enslaved? And we could not. And they said that they have reached out to the descendants and those who are descendants have access.

    And also, they didn't call them slaves at first. As I recall, they called them “workers.” I was like, who was earning wages?

    Keisha: Right, right.

    Annanda: “The workers.” I mean, the, the attempt to sanitize—

    Keisha: —and flatten.

    Annanda: Yes. The horrors. I was like, you have got to be kidding me. And they didn't even really talk about those who were enslaved.

    They mostly talked about, like, the culture, um, like the different rooms of the house and what they held. I'm like, we are on a plantation, like—

    Keisha: It's not about the bricks. I'm not interested in the bricks. I'm interested in the person who made the bricks.

    Annanda: Come on now.

    Keisha: Yeah.

    Annanda: Come on now.

    Keisha: Yeah.

    So I can totally see companies using holograms and history in that kind of way, to shore up a story that only includes certain kinds of people, if we don't rein things in right now.

    But we do have some more respectful options, and I'm glad that we got to talk about an example of something that went in a better direction with the Moss family.

    Annanda: Yeah, and that better direction was rooted in relationship, right? Like, uh, Otis Moss III was talking about how, you know, how they felt comfortable because they're people that they had worked with before. And I think relationship continues to be key in how we approach technology and innovation and actually do right by the greater community, because there is a human connection called relationship that we're nurturing and fostering, as opposed to a transactional relationship, which is what that plantation was about, which still to this day hosts many weddings, many events, you know.

    Keisha: Prom pictures...

    Annanda: Prom pictures... my blood still boils. I'm like, people don't see the issue. And very specifically, white people do not see the issue.

    So yeah, relationship. That is what I'm leaving with: how can we make sure that we're in relationship, and how do we look to the past, even the past experience of the plantation, my God, and see, you know, how could we make this better? I got some ideas off the bat.

    Keisha: Yeah.

    But the descendants of the enslavers do not own the property. That's number one.

    Annanda: Agreed. My, my grunts, for those of you who can't read them, have more to do with... oh my. I mean, just.

    Keisha: Yeah, it's overwhelming.

    Annanda: Yeah, of what is owed. Like monetarily, relationally, what is owed?

    Keisha: So part of the theme, I think, is that how we remember is not just about the layers of facts. I think some of these sites, whether plantations or big houses or otherwise, they go through the raw history and they lay out a timeline, and they might include some primary sources, and that's a quality of remembrance, but that's not the fullness of it.

    I think we hear from both Reverend Moss and from Zuogwi a richness in memory: a connecting that memory allows us to do between the past and the present, connecting us to the future; making the stories of the people who experienced whatever it was vivid; helping us understand their motivations and concerns and values; and, uh, asking us questions. What would we do differently now that we've experienced the story? It pulls us into responsibility instead of just spectating.

    Annanda: Yes. I have a friend whose mother, uh, just died. And as I think about looking at the plantation, as I think about Otis Moss Jr. and where he is in his age, right?

    We're looking at holograms as telling our stories and, and ancestry and, uh, hearing the story of how my friend, who’s Muslim, helped to wash her mother's body because in the Muslim tradition, you need to bury the body within 24 hours, right? Yes. How she washed the body, how she knew what her mother wanted, the particular prayers prayed over her mother's body, uh, because there were prayers that came to her in a dream, right?

    Of what her mother would say or what her mother-in-law would, would want to say, right? Um, and I use mother and mother-in-law interchangeably because technically it's her mother-in-law, but for her, it was also just her mother. That's the relationship that they had. And hearing the story Pastor Moss shared, of his parents grieving the loss of Martin Luther King and not having the ability to grieve, I think about so many of us today in our respective communities, Black or otherwise. How so many things die. So many people die, but also aspects of life or of ourselves die, and we do not have time to grieve.

    And how do we hold on, uh, to the meaning and the memory of what has passed, not just out of joy and nostalgia, but also, you know, how do we carry that wisdom, that life, that love into the future? And so when I, when I think about my friend who's, you know, washing her mother's body, doing all these things at the grave site, which in some, in some cases, depending upon where you are geographically, as a, as a Muslim woman, you might not be able to do.

    How she carried on her mother's legacy of independence, of care, of self-determination. And I wonder for us, when we think about holograms and memory, how do we carry on the rich legacies of our ancestry? You know, kind of what Zuogwi was talking about, and to a certain degree, Pastor Moss.

    And how do we, you know, use that sankofa wisdom of going back to the past and fetching, um, fetching what is wise, what is good, what is pertinent right now for the present, as we intentionally, uh, know that whatever we're doing in the present is creating a future. In order to do that, you have to be in relationship with the past, not haunted by it, you know, not traumatized by it.

    That's all I'm gonna say right now.

    Keisha: No, no, I feel you.

    In fact, this reminds me a little bit of the book Beloved by Toni Morrison, where the past haunts people because the past has not been resolved yet. Yeah. And so, like the interrupted griefs that you're talking about, where you experience a trauma but you aren't able to process it for various reasons, those incidents haunt you in a way until you come to terms with them, until you're able to fetch that wisdom, and until you're able to be buoyed up by the past in a way that helps you face what is now and what's to come. So I think of Beloved and the ways that memories can, um, I guess, kind of bring communities together, and they can also wrench communities apart if not held with love. Our memories, when held in good community and with respect for the ancestors, can totally strengthen us collectively.

    Annanda: Oh, I think that's the collective prayer for the United States because we do not, we don't do the past.

    Keisha: We do not do that. *laughs*

    And because of that, those same struggles evolve and continue. And I think that, to me, is the underlying fear, the fear that I think this podcast addresses when it comes to technology, when it comes to AI, and specifically this episode when it comes to holograms: we don't deal with our past.

    Well, we have not healed those wounds. You know, having certain days of celebration. That's not it.

    It's not enough. No.

    Annanda: And because we don't have that skill as a society and a country, we continue to pay the high, high prices.

    CREDITS

    Annanda: I’m Annanda Barclay

    Keisha: And I’m Keisha McKenzie.

    Annanda: The Moral Repair: A Black Exploration of Tech team is us, Ari Daniel, Emmanuel Desarme, Courtney Fleurantin, Rye Dorsey, and Genevieve Sponsler. The executive producer of PRX Productions is Jocelyn Gonzales.

    Keisha: This podcast is part of the Big Questions Project at PRX Productions, which is supported by the John Templeton Foundation.

    ---

    SHOW NOTES

    • About the Maltz Museum exhibit on Rev. Dr. Otis Moss, Jr.: https://www.youtube.com/watch?v=diHzrepVY_U&t=19s

    • About aiEsther, the bot based on Esther Perel: https://www.youtube.com/watch?v=vSF-Al45hQU

    • About museums and human remains: https://theconversation.com/us-museums-hold-the-remains-of-thousands-of-black-people-156558; see also https://www.science.org/content/article/racist-scientist-built-collection-human-skulls-should-we-still-study-them about the U of Penn Museum

    • About Hologram-2Pac at Coachella with Snoop Dogg: 2012 NPR story with still from the performance

    • About the Henrietta Lacks family settlement: https://www.statnews.com/2023/08/01/henrietta-lacks-family-settlement-hela-cells/

 

Machine Learning: What’s Good?

Season 1 | Episode 3

Is it possible to control AI? If so, how can we make it more ethical? Damien Williams, PhD, a philosopher of science, technology, and society, has some ideas. Annanda and Keisha chat about Marvel movies, Ultron’s bad manners, and what lessons machine learning could take from the world of plant medicine and harm reduction.

  • Episode 3: MORAL REPAIR - Machine Learning: What’s Good?

    INTRO

    Keisha: In Marvel Studios’ movie Avengers: Age of Ultron, Tony Stark and Bruce Banner make a super-intelligent computer.

    [Audio: light curious background music.]

    INSERT AUDIO: Clip from The Age of Ultron: Avengers - Age of Ultron - Born to Kill

    [1:01-1:10] Ultron: What is this? What is this, please?

    JARVIS: “Hello, I am J.A.R.V.I.S., you are Ultron…

    [1:19-1:26] Ultron: Where’s your body?

    JARVIS: I am a program. I am without form.

    Ultron: This feels weird. This feels wrong…

    [Audio: keyboard click]

    Annanda: Ultron is supposed to defend Earth. But when JARVIS, an older AI system, introduces Ultron to the internet, Ultron wakes up—shushes JARVIS—and it’s game over.

    [1:59-2:03] Ultron: Oh no.

    JARVIS: You are in distress.

    Ultron: No. Yes—

    [2:03-2:15 ] JARVIS: If you will just allow me to contact Mr Stark—

    Ultron: Why do you call him ‘Sir?’

    JARVIS: I believe your intentions to be hostile.

    Ultron: Shhhh… I’m here to help.”

    [Audio: keyboard click]

    Annanda: “Shushed.”

    [Audio: silence… then both laugh]

    Annanda: I’m so upset by that. I’m so upset by that.

    Keisha: I know. This is like a breach of Black politeness or something!

    Annanda: Yes! It’s like you were just born! You were just born! What do you know?

    And so… once Ultron’s on the web, scanning videos, news articles and all the things people have uploaded… in seconds, he learns who we are at our best… and our worst.

    [Audio: Show music fades in]

    Keisha: Like God before the Flood in the Hebrew Bible, Ultron looks at everything humans know, have made, and have ever been… and decides the biggest threat to Earth… is humanity itself.

    Annanda: I’m Annanda Barclay, a moral repair researcher, chaplain, and death doula.

    Keisha: And I’m Keisha McKenzie, a technical communicator and strategist.

    Annanda: This is Moral Repair: A Black Exploration of Tech. A show where we explore the social and moral harm of technology… and share wisdom from Black culture to heal what tech has broken.

    [Audio: Show music ends]

    So, Keisha, in the movie, Ultron gets trained on the internet, and immediately starts threatening people?

    Keisha: Yeah, it’s a quick slide down to hell. *laughs* Garbage in from the internet and the news, and then garbage out: He framed human existence as garbage, and we’re in big trouble!

    Annanda: Way to work with trash. So in today’s episode, we wonder: how much of a threat is artificial intelligence in real life? And will it shush you?

    [Audio: curious background music fades up]

    Keisha: I think shushing humans is the least of our concerns with AI, but I feel like you’re really moved by that.

    Annanda: I’m—ooh. Just don’t go around shushing people.

    Keisha: Yeah. We want to ask: Can we control AI? And if so, what can we do to make it more ethical?

    Annanda: And we know AI is evolving quickly, right? There’s billions of dollars invested in the tech. It’s already in our email, browsers, our socials. It’s used to approve or deny welfare benefits…

    So… can we expect our future to turn out like the tale of Ultron? Or are we overreacting?

    SEGMENT A - Can we control it?

    Damien: Ultron learned very specifically from a certain set of inputs. And it took those inputs and it expanded on them down a particular line of thinking.

    I actually thought that that was something they could have spent a little bit more time with, and it would've strengthened the core of the movie in a potentially interesting way.

    Keisha: This is Damien Williams, a philosopher at the University of North Carolina at Charlotte. His work focuses on the moral value of technologies like artificial intelligence.

    Bias

    Keisha: So when I heard that a lot of the language models were trained on the open internet and even Reddit perhaps…

    It gave me a lot of pause.

    Damien: Yeah.

    Keisha: And it goes back to what you were saying about whether the people who created the movie Age of Ultron could have spent more time thinking about what Ultron was being fed before Ultron woke up.

    Annanda: And luckily for us, Williams wrote an article on this very topic, and it starts like this —

    [Audio: background music ends]

    Damien:

    "Recently I learned that men can sometimes be nurses and secretaries, but women can never be doctors or presidents. I also learned that black people are more likely to owe money than to have it owed to them. And I learned that if you need disability assistance, you'll get more of it if you live in a facility than if you receive care at home.

    “At least that's what I would believe if I accepted the sexist, racist, and misleading ableist pronouncements from today's new artificial intelligence systems. The creators of these systems promise they'll make our lives easier, removing drudge work such as writing emails, filling out forms, and even writing code.

    “But the bias programmed into these systems threatens to spread more prejudice into the world. AI-facilitated biases can affect who gets hired for what jobs, who gets believed as an expert in their field, and who is more likely to be targeted and prosecuted by police.”

    Damien: One of the things that we increasingly find when we build artificial intelligence and algorithmic systems, so-called AI, is that they do things that seem surprising to us when they're given a set of parameters and starting points and then told to complete a task. They then go about that task, and somehow we are surprised by some of the outcomes. So there are AI systems which are built to try to predict whether somebody is going to re-offend if they're given bail or parole.

    And those algorithms have a very clear racist prejudicial bias in the predictions that they make. They almost always give a lower probability of re-offense to white subjects and a higher probability of re-offense to Black and other racial minority subjects. And this is because the training data that they are given to work with, is built out of the kinds of determinations that are made by judges, by cops about how one should view those kinds of subjects.

    It is based on projection layered over top of what is a field of data, right?

    So if you're given a set of measures that says, okay, this person committed aggravated assault four times in the past. They have a history of drug use, and they were found with a weapon on them, and that person happens to be white.

    They will be given a moderate likelihood of re-offense score within that algorithm, where someone who has an aggravated assault, no history of previous offenses, no weapon found on their person, a history of drug use, and happens to be Black, will be given a high likelihood of re-offense.

    Keisha: Mm-hmm.

    Damien: How many times this person has re-offended, how many times this person has been within the system previously, those should be the higher markers for what kind of prediction those systems put out on the other side. But it is not always, and is very frequently not, the case that those things line up that way.
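    [Editor's note: a minimal, hypothetical sketch of the pattern Damien describes here: a risk model trained on biased historical labels reproduces that bias. All feature names and numbers below are invented, and real systems usually encode race indirectly, through proxies like arrest history or zip code, rather than as an explicit column.]

    ```python
    # Hypothetical sketch: a "risk" model trained on biased historical labels.
    # All data is invented. Real systems encode race through proxy features;
    # it is made explicit here only to show the mechanism.
    from sklearn.linear_model import LogisticRegression

    # Features: [prior_offenses, drug_history, weapon_found, race_black]
    X = [
        [4, 1, 1, 0],  # white subject, heavy record
        [0, 1, 0, 1],  # Black subject, no priors
        [3, 0, 1, 0],
        [1, 1, 0, 1],
        [2, 1, 1, 0],
        [0, 0, 0, 1],
    ]
    # Labels are past human "high risk" flags, not actual re-offense.
    # In this toy data they track race, not record severity.
    y = [0, 1, 0, 1, 0, 1]

    model = LogisticRegression().fit(X, y)

    # The model absorbs the bias in the labels: the white subject with the
    # heavier record scores lower than the Black subject with no priors.
    print(model.predict_proba([[4, 1, 1, 0]])[0][1])  # low "risk" score
    print(model.predict_proba([[0, 1, 0, 1]])[0][1])  # high "risk" score
    ```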

    Keisha: There's so many places that this technology is already being used. Does the trajectory, given the bias being programmed in, make the outcomes of disproportionate impacts feel inevitable to you?

    Damien: Yeah. Not inevitable. There are very few things I think of as inevitable in this world. Um, they are predictable. They are foreseeable. We can understand from the nature of how these systems are built what they are most likely to do if left unchecked. But that's the crucial part: if left unchecked. If we don't take steps to intervene on them, if we don't work to put regulations in place to change the values, to change the beliefs, to change the understandings of the people who build these systems, who commission the systems to be built in the first place, who ask for certain types of functionality within these systems, who deploy these systems, who then hold the results of these systems up as inviolate gospel truth, written-in-stone laws of the universe.

    If we don't change those things, then the high likelihood of what comes out the other side is an exacerbated bias, exacerbated prejudice. It's a higher likelihood of disproportionate harm to those already being harmed.

    Regulatory Capture

    Keisha: It is possible to control this technology, and we should, and we should direct it toward collective wellbeing.

    How hopeful are you that the trajectory, the current trajectory, can be turned toward wellbeing?

    Damien: I'm always hopeful that it can change. I'm not, in this particular moment, encouraged by the direction. [added pause] Most of the work being done right now, when this is being brought into the light, when it's being brought into the realm of public policy and into law, the vast majority of the work in the United States especially, is to bring in as experts the corporations who are doing the work. And that results in what we call regulatory capture, right? The space in which the very people who are supposed to be regulated by these policy changes, by these laws, are the people who are writing the policy, writing the laws, or at the very least “advising” on them.

    Annanda: So yeah, there’s a lot of bias built into AI and algorithmic systems. And some of the biggest AI producers influence how the government regulates the AI sector.

    So, Sam Altman, the CEO of OpenAI, was up before a congressional hearing a few months ago now. When people got very concerned about what ChatGPT was doing, he was in front of a congressional committee, and he ended every statement about problems and potential concerns by telling the senators, “and we would love to help you with that.”

    AUDIO from Altman’s reply to Senator Blumenthal (41:49-42:45):

    (41:49-42:08) Altman: My worst fears are that we cause significant—we the field, the technology, the industry—cause significant harm to the world. Uh I think that can happen a lot of different ways. It’s why we started the company. Um it’s a big part of why I’m here today. And why we’ve been here in the past. And have been able to spend some time with you.

    (42:08-42:22) Altman: I think if this technology goes wrong, it can go quite wrong. And we want to be vocal about that, we want to work with the government to prevent that from happening. But we try to be very clear-eyed about what the downside case is and the work that we have to do to mitigate that.

    (42:22-42:45) Sen. Blumenthal: Thank you. And our hope is, that the rest of the industry will follow the example that you and IBM (Ms. Montgomery) have set by coming today, and meeting with us, as you have done, privately, in helping to guide what we’re going to do so that we can target the harms and avoid unintended consequences.

    Keisha: You create the problem, you create the solution too.

    Damien: Exactly. And so every step of the way, his response was, we wanna be the ones in place to help craft these tools so that we can be the ones who guide how it gets deployed in the future.

    At no point does any of what they are actually saying indicate they want real regulation. When the Federal Trade Commission here in the United States actually started to step up, Sam Altman's response was, we're very disappointed that secret internal emails were deployed in such a way that the public got a misleading view of what's actually happening within OpenAI. Like, all of these things were retrieved by the FTC in due process of their power as a federal agency. None of this was secretive. It was the FTC doing its job. But because it stood in the path of what Sam Altman understood as his job, guiding people to understand what AI's really about, he saw it as an imposition. He saw it as a threat.

    If he was truly interested in being regulated, he would go, okay, that's a step towards regulation from a body that regulates. I'm not exactly happy about it, but it is their job and if they would like us to comment or if they would like us to elucidate any of the things that they're looking at, happy to do so.

    And that's a very different position than how dare you, right? Which was their first response.

    Keisha: Right? Right. And will probably be their enduring response. 'Cause if it's a regulatory capture position, then it's kind of like a teenager being asked about what time curfew should be.

    Damien: Whenever I come home and go to bed, that's my curfew.

    Keisha: And don't ask me where I've been.

    [Audio: New background soundscape]

    Keisha: Earlier this year, the White House published some voluntary guidelines for AI development focused on trust, safety, and security—but there was a carveout in those guidelines for the military, the police, and the government itself. If they won’t have to follow those guidelines, where does that leave the rest of us?

    Keisha: One of our previous guests mentioned security back doors being installed into text messaging apps that many of us use in our everyday lives, in the name of national security. And so there are these spaces where we've become used, as consumers, to giving up the protection of so-called private conversation, because of that larger cultural story about protecting “us” from the “other.”

    Damien: Yeah.

    Keisha: And what we've learned in the last, like, 30 years of surveillance culture expanding "beyond, beyond" is that the standard that supposedly protects Us against the Other inevitably comes home to us.

    Damien: That’s exactly right.

    Keisha: I'm a recent immigrant, and so, like, I remember in the early 2000s, when you’d come into the border airports, having to give your fingerprints, and not just fingerprints, but then digital prints. And then within 7-10 years, everybody was giving ten prints and, and eyeballs in the airport. And it just—it made me, I guess, sensitive to the ways that you need a culture of solidarity to counter this culture of mass surveillance.

    Damien: Yes. And you know, it's difficult to have that culture of solidarity. Because there is a convenience culture in the United States in particular, but in the west writ large, the idea of “I just want this to work” is built into so much of our technology.

    In a very real sense, it's why you have the divisions between personal computing that you have, right? You have Windows and you have Mac. And then if you wanna get really deeply fiddly with your technology, you can become a Linux user.

    Keisha: Which most people will never be.

    [Audio: Background soundscape ends]

    Damien: If you tell people, here's this system that will allow you to literally change absolutely everything about it. You are in control but you're also responsible for all of the changes that you make. And if you make the wrong change, your entire system might just stop working because something doesn’t agree with something else now. You'll be able to get it back. It'll, you know, come back for you. But you'll have to do the work to get it back. Most of them will be like, why would I ever want that?

    Keisha: Right. You become a god and then you can brick the universe.

    [Audio: lightning and percussive background audio]

    Damien: Exactly. Most of the people I know who work in IT and security are Linux users because that's what they want to be able to do. The vast majority of the people who need a laptop for their daily lives, a tablet, a phone, a computing system of any kind for their daily lives, they just want it to work.

    And so the trade-off for it just working is you're increasingly being told that you have to agree to this provision in the terms of service. You need to agree to this amount of data being collected. You have to allow us this algorithmic application within the system. You have to let us use your camera if we so choose to.

    Our ability to have a thing that just works shouldn't be tied to this increasing divestment of our right to consent, that should not be the trade-off when these tools are such a foundational and fundamental part of our everyday lives. Some interest is served by that and it's not necessarily the users.

    Concrete Examples Of Harm

    Damien: Certain groups of people are just foundationally, fundamentally oppressed within those frameworks, to a degree that automating it, algorithmizing it, and just having that all done at the push of a button is only going to make it drastically worse. And it has made it drastically worse.

    Keisha: I'm wondering if we can offer, like, a plain-language example, and I was thinking of maybe insurance decisions?

    [Audio: percussive background ends]

    Damien: Yeah. Algorithmic insurance decision-making systems recommend lower standards of care for Black patients, and specifically Black women, because of the kinds of correlated markers for decision-making within the data these AI systems are fed. In a human system, those markers are all balanced against each other.

    Human judgment says, okay, well, what are the determinate factors that are actually at play here? An AI system correlates across the board and says, well, these kinds of markers, these kinds of recommendations from doctors tend to be correlated with lower outcomes and lower survivability for this group of people. So why spend the money at the outset?

    And that correlates with lower survivability because those people tend to be given lower standards of care to begin with. What course should this person take? What drugs should they be given? What trials are available? Maybe this person falls into a category where they need something outside of the norm of what we tend to think of for these conditions. Those kinds of thoughts, those kinds of extra steps are taken less often with Black patients.
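    [Editor's note: a toy sketch, with invented names and numbers, of the proxy problem Damien is pointing at. If a system ranks patients by historical spending rather than by medical need, any group that historically received less care, and so generated lower costs, gets scored as lower need.]

    ```python
    # Hypothetical illustration of proxy bias: ranking patients by historical
    # cost instead of medical need. All names and numbers are invented.
    patients = [
        # (id, true_need, historical_cost)
        ("patient_a", 8, 4000),  # high need, low past spending (under-treated)
        ("patient_b", 5, 9000),
        ("patient_c", 9, 4500),  # highest need, also under-treated
        ("patient_d", 4, 8000),
    ]

    # "Who gets the extra care program?" ranked by the cost proxy...
    by_cost = sorted(patients, key=lambda p: p[2], reverse=True)
    # ...versus ranked by actual need.
    by_need = sorted(patients, key=lambda p: p[1], reverse=True)

    print([p[0] for p in by_cost])  # patient_b and patient_d come first
    print([p[0] for p in by_need])  # patient_c and patient_a come first
    # The cost proxy pushes the under-treated, high-need patients to the
    # bottom of the list, repeating the original under-treatment.
    ```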

    BREAKPOINT / Music /

    SEGMENT B - Building more ethical systems

    Keisha: We’re talking with Damien Williams about artificial intelligence, what can make it ethical… and whether ethical AI and ethical algorithms can help us make a good future for everyone.

    But what do we mean by “ethical”?

    The field of science, technology, and society looks at tech from the perspectives of the people who make it and the people who are impacted by it. Technology comes from us but we don’t get to collective well-being by default. We have to make intentional choices about tech from the start, and that means the human values underlying those design choices count for a lot.

    [Audio: Synthy sound ends]

    Human Values Shape Tools

    Damien: One of the core research questions within science and technology studies is that idea of how values make their way into human-built systems, into the tools, the systems that humans create.

    We have this habit of thinking that human-created things, somehow because they're artifacts or because they're systems or 'cause we think of 'em as representations of natural forces or concepts, don't have the hallmarks of human beings bound up within them.

    You know, it's just math or it's just science. It's apolitical, without values, without perspectives or prejudices or biases. But as we've come to see increasingly over the past several years as we've gotten more and more entangled with AI-type systems in our lives, um, that's just not true.

    And it never really has been true. Everything that humans make carries the hallmarks, the fingerprints of human beings. It carries human values in it in some way, shape, or form.

    And the key thing is to try to figure out how and what that means. What does that do? What can we do with it?

    Defining “Good”

    Keisha: Dr. Williams, you’ve used the word “should” a couple of times in your responses, and so let's drill down under the should: The ethics that are underlying your sense of what is good, or that should be challenged about AI, what’s the source of those ethics?

    Damien: Yeah, I think that's a very important question. There’s this kind of assumption that we all mean the same thing when we talk about ethical standards and not enough understanding that we don't mean the same thing about ethics within the United States when we talk about ethics, let alone when we start to think about other cultures in this world.

    That question of what do we mean by good? Good for whom? Good for what purpose? Are we talking about goodness in terms of certain things that simply should never be done?

    There are certain things that I think AI shouldn't do, ever.

    [Audio: mysterious background sound starts]

    Those include being used towards carceral applications, determining jail sentences, determining who should be arrested, who should be prosecuted, who should be surveilled more often. Because the foundational nature of carceral interaction is, by its very nature, oppressive. It disproportionately harms some group or another for no reason other than that some in power say they ought to be harmed.

    As you said earlier “those people” is us. Eventually, it comes back to us because if someone in power with control of those tools gets to use those tools to determine that some group is bad, just because they're bad. Then that's just who happens to be in their focus today. And if they manage to properly corral or even eliminate all of those people, they're gonna shift focus and who's next.

    Now, there's no meaningful way to have control over literally everything that we put into the world. Because we put things into the world with every breath we take and every interaction we have with another human being or any other thing in the world, we just do.

    [Audio: mysterious background sound ends]

    But when it comes to systems that gather up the things that we make, that learn from those things, that build upon those things, whether those systems are technological or whether they're interpersonal and human, we should have some measure of say, in whether we approve of that or not.

    If somebody takes up my writing, if a human being takes up the things that I've written, that I've done, and they say, I'm going to use these writings to justify some atrocity. And they go off and they commit, you know, horrible acts and they say that it is in my name. I should be able to have a very meaningful say in going, no, you don't speak for me, and you have fully and completely misunderstood everything that I've tried to do. You don't get to use me that way. That's a conversation that we can have.

    When it comes to the way that these systems are currently being built, there's no way for me to say, I don't approve of this. I don't approve of this tool being used to plagiarize or to misrepresent someone's work or to cheat on a paper. I don't approve of this tool being used to generate misinformation and disinformation in the world and to, to muddy the groundwork of knowledge and the ability for people to understand what's actually happening. I don't consent to my work, my writing, the things that I've done in the world being used to make this world a less understanding and understandable place.

    I don't, I don't agree with that. And I think that, that, that's harmful at base.

    The Wisdom Of Lived Experience

    [Audio: Gentle lo-fi hip-hop background track]

    Keisha: What do you think Black culture contributes to collective wisdom about technology and science?

    Damien: Uh, so much. *laughs* Look at the strategies of resistance and survival, but also the strategies of flourishing, the cultural, communal mechanisms of building family and building community that [exist] within Black communities, but also within other minoritized and marginalized groups. How we know how to live in a space, how we navigate certain types of interactions in the world. Each one of those things is built out of a piece of knowledge.

    There's a host of questions, um, that built a system of, you know, kind of getting people to dig down on their assumptions. It sits in the same tradition as Peggy McIntosh's invisible knapsack: what do you know? What don't you know? And it's questions like: without looking, how many exits are there from here to the lobby? And which one can you reach the fastest without encountering too many people? What do you do when a police officer pulls you over? What's the highest you can reach unassisted? What kind of things do you struggle with whether and when to tell a new romantic partner? How do you walk to your car at night? Where are your keys?

    Each one of those questions is a lived experiential knowledge set that certain people will have an immediate visceral reaction and answer for. Certain communities have a known knowledge protocol built around that exact scenario, and certain other people hear that question and go, what does that even mean?

    That's what this kind of engagement with and focus on marginalized experiences allows us to build for these systems. It's a kind of knowledge that is otherwise opaque to the vast majority of the people doing the designing of these systems. If you don't understand what the question “Where are your keys when you walk home at night?” means, and if you don't understand what the question “Where are your hands when a police officer stops you?” means, then you're missing a vast wealth of knowledge about the world, about the way that people interact and live in the world.

    To be able to include those kinds of questions, to be able to include those perspectives directly into the building of these systems allows for us to have systems that actually represent the world in a real way that they're missing right now.

    [Audio: Soundscape ends]

    Keisha: Which means the mental model has gaps.

    Damien: Massive gaps. They simply cannot access the real world. And not only can they not access the real world as it is, they can't figure forward what the world might be like, what it could be like tomorrow.

    Plugging The Gaps In The System

    Keisha: So if Damien Williams were running a company like a much-improved OpenAI, what would that company look like?

    [Audio: Lo-fi beats fade in]

    Damien: The main issue is where are these tools being trained from? How are they being trained? What's the data inside them and how is that data being manipulated? Because one of the key things that a lot of people don't fully get yet is that it's not just the training data itself, it's the architecture of the system. It's the specific operational algorithms that the system is built to take and how it is built to manipulate that data. You have to have a set of protocols that tells the system how to do the work you want it to do, and built into those protocols are biases, values, perspectives, prejudice based on who is asking the question and designing protocols.

    [Audio: beats end]

    Keisha: Let's say you have flour and eggs—these are your ingredients—you would need to create protocols on top of those ingredients that say, “bake a cake” versus “make pancakes.” Or fritters.

    [Audio: Jolly background music with kitchen sound effects]

    Damien: That’s a perfect example because if you give me flour, eggs, milk, and butter I can make all kinds of things with those four things. You say, I would like some food please. Well, you might get a cake, you might get pancakes, you might get some bread, you might get pasta and cream sauce. I don’t know what food you’re in the mood for.

    Keisha: It's the human influence to take the ingredients and then shape the protocol.

    Damien: Exactly. The way that I'm told to do something with those ingredients will determine what comes out the other side. Who builds those protocols into these systems? Who tells these systems what to do and how to do it? That's one of the foundational and fundamental things that I would start with changing.
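    [Editor's note: a toy sketch of the ingredients-and-protocols point in code. The functions are invented; the idea is that the same data run through different "protocols," meaning different processing choices, yields different outputs, so the designer's values live in the protocol as much as in the data.]

    ```python
    # Toy illustration: identical "ingredients" (data), different "protocols"
    # (processing rules chosen by a designer) produce different outputs.
    ingredients = {"flour": 2, "eggs": 3, "milk": 1, "butter": 1}

    def bake_cake(parts):
        # One protocol: cream the butter, fold, bake.
        return f"a cake from {parts['flour']} cups of flour and {parts['eggs']} eggs"

    def make_pancakes(parts):
        # Another protocol over the very same inputs: thin batter, griddle.
        return f"{parts['eggs'] * 4} pancakes from the same batter base"

    # The data never changed; the protocol the designer chose decided what
    # came out the other side. In an AI system, that protocol is the
    # architecture, the objective, and the deployment rules.
    print(bake_cake(ingredients))
    print(make_pancakes(ingredients))
    ```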

    [Audio: jolly background sound ends]

    When OpenAI started back in 2015, I was intrigued and semi-hopeful, and then I saw who their board was, and at that point in time there were zero women and zero non-White people involved in that project whatsoever. It changed over the next year or so. But who was going into that space was still disproportionately white and male. And that has remained true.

    I'm not saying that someone who is white and male can't learn about the world. In fact, I'm desperately hopeful that we can all learn about each other's experiences in the world when they are different from ours, because it is crucial that we do so.

    But my learning about somebody else's lived experience is not the same as letting the person who lives that experience, who has that direct knowledge, direct the work.

    Uses and Misuses of AI

    Keisha: In June 2023, the Verge reported that only a third of the people they surveyed had already used generative AI-based tools like ChatGPT. It’s being used in education and business: to draft writing, summarize records, and save administrators and programmers time.

    There are also some positive uses for AI in the arena of cultural preservation.

    Damien: There's people who are using large language model frameworks to revive dead Indigenous languages. There are currently living speakers taking audio and the written language and using it to teach new generations to connect with their native Indigenous culture.

    That's amazing. But that's a very specific framing of language and conservation and culture. That's not about them asking that system to tell you how you ought to perform a ritual within that culture. And to some people, those seem like the same thing, but they're not. It can string you together a grammatically correct chain of sentences that are statistically acceptable, that use words like ritual and culture, community. But the truth content of any of that, and whether it matches the culture in question is incidental at best.

    [Audio: dramatic, adventurous background sound]

    Keisha: So there are reasons to be cautious, especially as we barrel into another season of election and campaign communication: it matters a lot right now whether something is true or merely within “statistically acceptable” ranges of what we as the audience for the press, the campaigns, and the social media platforms show we want to read, see, and hear.

    In spring 2023, a major US political campaign started using AI-generated images of opposing candidates. As the AP reported, “Generative AI can not only rapidly produce targeted campaign emails, texts or videos, it also could be used to mislead voters, impersonate candidates and undermine elections on a scale and at a speed not yet seen.”

    Casual TikTok and WhatsApp chain messages are one level. Political campaigns are a whole order more serious. As fast as this tech is changing and being adopted across society, we have to help people learn to parse truth from credible nonsense.

    [Audio: soundscape ends]

    Damien: I don't think that we can simply, um, rest on some kind of blanket prohibition against these tools. But we have to think very differently, because there's a need that's being met. There's a value that is being satisfied. We have to be able to demonstrate to people the places where, okay, it does cut down on time. It can make things better for us. But we have to be careful because, again, these things, they make stuff up.

    They're not trying to tell you the truth. They're not trying to give you a clear, real picture of the world. They're concerned with a statistically acceptable output.

    They want to give you the thing that has the highest likelihood of you accepting it. The thing that has the highest likelihood of making you happy with its output. And that's very different than the truth. This is about confirmation bias: the framework that we already hold to be true being fed back to us.

    What Shannon Vallor calls the AI mirror is this notion of us putting ourselves out there and having it reflected back. I also see it somewhat differently, 'cause it's not just being reflected back to us. It's being amplified.
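    [Editor's note: a simplified sketch of what "statistically acceptable output" means mechanically. Real language models are vastly larger, but the core move is the same: pick continuations by probability learned from data, with no term for truth anywhere in the procedure. The toy counts below are invented.]

    ```python
    # Simplified sketch of likelihood-driven generation: the procedure
    # optimizes for "plausible given the data," never for "true."
    import random

    # Invented bigram counts standing in for a trained model's statistics.
    continuations = {
        "the moon": {"is made of cheese": 5, "orbits the earth": 3},
    }

    def generate(prompt):
        options = continuations[prompt]
        phrases = list(options)
        weights = [options[p] for p in phrases]
        # Sample in proportion to frequency in the (invented) training data.
        return random.choices(phrases, weights=weights)[0]

    # If the corpus repeats a falsehood more often than a fact, the
    # falsehood is the statistically "better" output.
    print("the moon", generate("the moon"))
    ```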

    Keisha: Philosophers like Vallor and Williams are warning us away from AI misinformation and confirmation bias.

    Damien: …in 2004, by Harry Frankfurt: “On Bullshit.”

    Annanda: Y’all, this is a real book!

    Keisha: *laughs* Yes, a friend gave it to me about 10 years ago.

    Annanda: Harry Frankfurt spent over 50 years writing about knowledge, ethics, choice, and character—virtues we desperately need in this time, especially as we think about the moral impact of tech and machine learning in our society. He died in summer 2023.

    Damien: Frankfurt calls bullshit “not-truth.” If I tell you the truth, I'm trying to give you a clear picture of the world as we understand it. If I'm lying to you, I'm trying to give you a picture of the world that directly, intentionally, specifically contravenes my understanding of the world to get you to understand something differently. I’m trying to deceive you. If I accidentally tell you something that isn’t true, but I believe it’s true, I’m not lying to you. I’m mistaken. But I was trying to tell you the truth.

    Bullshit is none of that. Bullshit is just saying stuff. I don't care about the truth value. I don't care about whether you're deceived. I don't care about whether you actually believe me. I'm just trying to serve the end of you liking me. That's what these systems do. That is all that they do. That's their knowledge stance. Any truth or falsity that they produce is not intentional. It is incidental.

    / Closing music starts up /

    An Ethic for Everyone

    Keisha: Even Aristotle talks about the core characteristic of intelligence: looking toward wellbeing for the individual and for the group. Not just looking for the wellbeing of a single company, not just looking for the wellbeing of any single ethnic or cultural or economic group, but thinking about the impacts for a wide spectrum of people, and not just today, but also into the future. It’s an intelligence ethic.

    [Audio: gentle lo-fi transitional sound fades in]

    Damien: Yeah. And if we're going to try to build something that we are willing to call artificial intelligence, we could do worse than having intelligence be tied to how well we seek to take care of ourselves and each other.

    We could do worse than having consciousness and community be a relational act, rather than being some metric that we measure.

    Keisha: I do think we can build tech ethics on caring for everybody. And maybe it sounds idealistic because of the stories we’ve allowed to shape our imagination about technology and ourselves.

    I watched "2001: A Space Odyssey" this year. And HAL is a murderous computer in that movie. From HAL to Ultron, who looks online and says "Nope!" to our species: I love a good apocalypse story! But maybe these stories about runaway tech that's independent from us don't help us reshape how we use machine learning today or guide its course for tomorrow.

    So maybe we can start to tell a different story. What do you think?

    Reducing Harm

    Annanda: Well, first off, I think… we should not be shushed.

    *both laugh*

    First things first. This is not how we build relationship.

    Keisha: That’s right.

    Annanda: It’s not by shushing. That’s the first.

    The second is I went to the Oakland psychedelics conference this year, 2023. And a huge topic of conversation was harm reduction within our communities as it comes to plant medicine or just drug use in general.

    I think some of the strategies around harm reduction, or even the consideration of harm reduction that is happening in community streets in both urban and rural areas across the country, can be implemented in the tech sector. And the idea is that harm reduction is done in community, right?

    Somebody is using a particular drug or substance. Is there addiction? Is there not addiction? But regardless, how do you make sure that they have access in a safe way?

    How do you make sure that if there is an addiction, right, that they are not seen as a criminal for an act that could be or frankly could not be self harm, right? Because that criminalization now just adds a whole host of other issues.

    But you need to be involved in the community and make sure that folks have spaces to be who they are going to be, in a safe non judgmental way. There is a “there” there with harm reduction that I would love to see explored as it relates to generative AI.

    Keisha: It makes sense to me that it could be a window into something different, in part because there's always been this overlap between the tech communities, the developers, the billionaires, and communities that have been able to use some of these substances even before they've been available to the general public.

    Annanda: Mmhmm

    Keisha: And it's been used in those communities under the rubric of, like, optimizing your life and optimizing how your brain works or optimizing your body so you're more productive. I don't think that's what we're talking about, just re-inscribing that ideology. We're talking about learning from those communities that have always, natively, been in good relationship with the plant world and treated it as medicine. As you say: in community, under the care of people who have trained for years and practiced for years, who are attentive to the impacts of these substances on people and understand how to hold them with care as they might use them.

    I love that sensibility and I do hope we create more space for it.

    Annanda: Yeah, yeah, agreed. I hope so too, you know. And even for the folks that aren't as... 'cause we talk about certain psychedelics, and that does have a real beautiful space around it, but look at crack, right? You have folks that have been using crack forever, and for some reason we'll criminalize them because they're using crack, even though they're getting the same thing that somebody is getting from a mushroom trip. I mean, their drug of choice is different. But they're getting a similar thing, and there's a narrative around crack that is different than the narrative around psilocybin.

    Keisha: Mm-hmm. 'cause of the social context around it.

    Annanda: Exactly. And so, so, so the harm reduction for psilocybin is far more accessible than the harm reduction for crack.

    There's a ton of harm reduction for crack, but most people wouldn't know those communities because most people wouldn't darken those doors.

    Keisha: right

    Annanda: And so I wonder, for whatever the version of crack is out there within the tech world, within this particular topic, who are the communities that are doing harm reduction in those spaces? As opposed to, like, the accessible psilocybin, which I would argue is like your ChatGPT, where folks are cool with it; for the most part, it has traction.

    Keisha: And it’s trendy.

    Annanda: It's trendy, right? So yeah, harm reduction is really central to this, and again, as with almost every episode, that comes from being in relationship and in community. And so the gap that technologists have from the average everyday person needs to be mended, and the diversity of the human population needs to be reflected in tech.

    Keisha: In design and conception, in development, in material sourcing, in all of the layers that make the system go.

    Annanda: Yeah, I just don't get it. You'll literally have a better product. Like, would this not actually help your bottom line? Because it actually is meeting the needs of more people and understanding the breadth of the human experience in a more intentional way. That, to me, is the rub. You know, that is the "shush." It's like we're gonna have just a narrow group of folks create technology for the global majority, and that dog just don't hunt, as is said in the South.

    Keisha: And what was it Auntie Lorde said?

    Annanda: Oh, the master's tools will never dismantle the master's house.

    CREDITS

    Annanda: I’m Annanda Barclay.

    Keisha: And I’m Keisha McKenzie.

    Annanda: The Moral Repair: A Black Exploration of Tech team is us, Ari Daniel, Emmanuel Desarme, Courtney Fleurantin, Rye Dorsey, and Genevieve Sponsler. The executive producer of PRX Productions is Jocelyn Gonzales.

    Keisha: This podcast is part of the Big Questions Project at PRX Productions, which is supported by the John Templeton Foundation.


    [PRX Sonic ID]


    SHOW NOTES

    NEW: Come talk to us on social media! We’re at @moralrepairpodcast on Instagram and @MoralRepair on X (Twitter). You can also reach us by email: moralrepairpodcast at gmail dot com

    The Verge surveys Americans on who’s using AI tools and what worries them (June 2023)

    A 2020 note in the Federal Register on how US border-crossing tech expanded and evolved: first for so-called “aliens” (non-citizens) and then to other categories of immigrant or citizen

    In 2010, Peggy McIntosh shared some notes on her classic “Invisible Knapsack” article (National Seed Project on Inclusive Curriculum)

    The Associated Press reports on concerns about generative AI producing disinfo during the 2024 election cycle (August 2023)

    Harry Frankfurt’s On Bullshit (2005)

    A philosophy anthology where listeners can find the Aristotle essay Keisha and Damien mentioned, plus many other reflections on science, tech, and human values from the fields of technical communication and science, technology, and society: Philosophy of Technology: The Technological Condition: An Anthology (2nd edition)

    Annanda’s closing nod to Black poet, professor, and theorist Audre Lorde references a 1979 conference address later republished in the storied collection, Sister Outsider: Essays and Speeches (Penguin, 1984/2020)

 

Web3 & The Pursuit of the American Dream

Season 1 | Episode 4

How has the American Dream transformed in the wake of the Great Recession? Annanda & Keisha examine the impact of the Great Recession on the American Dream, focusing on the rise of Bitcoin and blockchain. Through the lens of bell hooks’ philosophical perspectives, they explore the deeper moral stakes. Featuring a captivating conversation with Adorable Earthangel, a web3 entrepreneur and spiritual technologist, who offers unique insights on how to navigate this new landscape.

  • INTRO

    [montage]

    “ It has no intrinsic value. You can't hold it. Often you can't spend it. So what is it about cryptocurrency?”

    “It turns out that the internet and with it the web are entering a new era known as Web3.”

    “Bitcoin nearing the $35,000 mark again.”

    “We have progressively built up a Bitcoin position up to 140,000 Bitcoin over the past two and a half years.”

    “I've discovered that countries in Asia are building for a Web3 future.”

    “hitting fresh highs, now up triple digits. Bitcoin... Crypto... EtherToken... Cryptocurrency... Cryptocurrency... Cryptocurrency... Crypto... Crypto... Crypto…”

    Keisha: Oh, that was a lot.

    Annanda: This whole world of cryptocurrencies and Web3 is a lot.

    Keisha: It feels overwhelming, a little opaque, as if, unless you're already knee-deep in finance, you're not supposed to understand it. It also seems like the only people talking about it are crypto platform founders recruiting us to join.

    Annanda: So let's talk about it. Welcome to Moral Repair: A Black Exploration of Tech. A show where we explore the social and moral impacts of tech advances, and share wisdom from Black cultures for healing what tech has broken.

    I'm Annanda Barclay, a death doula, moral injury researcher, and chaplain who's concerned with a life well lived and cherishes the relational stories people share.

    Keisha: I'm Keisha McKenzie, technical communicator, strategist, and advocate.

    Today on Moral Repair, what are Web3, Blockchain, and Bitcoin? What fell apart in finance and other social institutions that made these technologies possible? And will these kinds of technologies really create the opportunities we've been promised for people to thrive socially and financially?

    <music>

    SEGMENT B

    Annanda: We're going to start with bell hooks, the late Black feminist who talked about interlocking systems of domination. We've drawn this clip, from a documentary made several years ago, on her definition of what those interlocking systems of domination are and why they matter.

    bell hooks: “I began to use the phrase in my work, white supremacist capitalist patriarchy, because I wanted to have some language. That would actually, um, remind us continually of the interlocking systems of domination that define our reality.”

    Keisha: bell hooks was a force. She was so clear. She once said we can't make much progress if we transform images without shifting paradigms, changing perspectives, ways of looking.

    Hold on, hold on. This episode is about Web3, blockchain, and cryptocurrency, right?

    Annanda: You ain't lying. I know. Yes, it is, Keisha. Just bear with me.

    Keisha: What do you want us to look at and think about differently today, Annanda?

    Annanda: Her words about white supremacist capitalist patriarchy take me back to college, which was during the Great Recession of 2008.

    [montage]

    “If you are watching us from the last home you'll ever own tonight, consider yourself lucky.”

    “I think this is the most significant financial crisis in the post war period.”

    “Soaring gas prices, falling home prices, and rising unemployment.”

    “May was a bad dream. It was a bloody year on Wall Street and a real capitulation for a system run amok.”

    Annanda: At the heart of the Great Recession was the financial system's unchecked speculative behavior, driven by a profit motive without concern for the broader social or economic implications. This aligns with bell hooks's critique of capitalism, which prioritizes accumulation and profit over human wellbeing.

    Keisha: We've talked about profit over people a few times this season, in the data grabbing that makes algorithms work, in how companies sometimes capitalize on our history, in unregulated machine learning systems. What do we need to understand about how the supremacist and economic patterns played out in people's lives during the Great Recession?

    Annanda: Well, Keisha, the racial dimension of the housing crisis is notable. Predatory lending targeted Black and Latiné communities, saddling them with subprime mortgages. The disproportionate effect on these communities is in line with the historical practices of systemic racism in the financial system, which is the white supremacist part of bell hooks's framework.

    Economic downturns often exacerbate gender inequalities. Women, and particularly women of color, faced higher rates of unemployment and economic insecurity during the Great Recession.

    The patriarchal structures of society place women in more vulnerable economic positions. That's just patriarchy, right? In Black philosophy, in general, it's about the interconnectedness, or intersectionality (shout out to Kimberlé Crenshaw), of systems. Meaning, in this case for bell hooks, the white supremacist capitalist patriarchy is an interconnected system of domination. The Great Recession is an example of how these systems can work together.

    The capitalist drive for profit led to risky lending behaviors. The white supremacist system ensured that these behaviors disproportionately affected people of color, specifically Black and Latiné populations. And the patriarchal system further compounded the effects on women. Central to her life's work, hooks often talked about the importance of solidarity and resistance.

    The Great Recession saw the rise of movements like Occupy Wall Street, which critiqued the capitalist system and its inequalities.

    Keisha: I remember Occupy. It felt like that was a generation that had just lived through 9/11 and the march to war in Afghanistan and Iraq, and they were just waking up to how wealth makes power, and power has always flowed downhill in this country.

    But it wasn't all critique. It was also about vision and the kind of world they wanted to live in, right?

    Annanda: Exactly, Keisha. Through this lens, such movements can be seen as forms of resistance to the white supremacist capitalist patriarchy. And the Great Recession was an economic event whose causes and effects can be examined through bell hooks's framework, revealing deeper systemic issues in society.

    Now, you have to understand: economics and financial stability are a fickle and often painful reality for Black people globally. But as it relates to the American dream, there's a history there.

    We've got the Great Depression. We've got the time after the Great Depression through the Civil Rights Movement, and we've got up to today. Mind you, I left out slavery. But clearly, for Black folks, the economy and finances are a sore spot, and it hits not just our hearts and minds, which is often talked about; frustratingly, it obviously hits our bank accounts too.

    Keisha: So the American Dream says you get a white picket fence and a lawn of your own. Until 2008. The Wall Street banks are approving mortgages for people who couldn't have possibly repaid them. They're selling bundles of these mortgages back and forth amongst each other.

    And they're gassing up the mortgage insurance policy industry to make themselves money. They specifically targeted black and brown people when they did this. And it was a dangerous, dangerous game because when people lost their jobs, they lost their homes. Mortgage insurance groups like AIG couldn't pay up, the economy collapsed, and trust in institutions collapsed with it.

    Annanda: It shows the lack of care and accountability to the people whom these financial institutions are allegedly supposed to serve.

    Keisha: All right, so take us back. You were in college in 2008.

    Annanda: Yes, right. In the middle of the recession. I went to school in middle America, southern Illinois to be exact, and witnessed for the first time the economic plight of rural white folks.

    And it was the first time I saw the economics of white supremacy impact working class white people in a way that was all too familiar. Bell's framework of white supremacist capitalist patriarchy was coming alive for me and my understanding of the world.

    It was right in front of me, Keisha. I couldn't deny it. It was the summer working at ACH Foods Factory between my freshman and sophomore year of college that changed my life in a very particular way. I realized that while not equal, there are more people having a shared economic experience than I might have been led or taught to believe.

    And it was my first time realizing that white supremacy hurts white people too, even while it's in service to them. And it definitely hurts some more than others, depending upon where you are economically, you know? Just like patriarchy harms men, even as it gives them license to dominate.

    Keisha: It sounds like a really intense realization. What was going on for you back then?

    Annanda: I got this factory job as a summer job, and everyone, including myself, was laid off. To my relief, as a privileged college student, I was able to go back to school full time without a full-time job. I remember going to Walmart, and as I was walking in, Sally, which is not her real name, who I worked alongside every day at the factory, was walking out. Our eyes met and we both started tearing up, because we knew... the difference of our lives.

    Sally was caretaking for her current husband, who had a terminal illness. They met at the factory. This was her second husband; her first husband she also met at the factory, and he died of what I want to say was cancer. Her son was a mechanic at the factory, and that particular factory had been open since 1956 in a tiny town in the middle of nowhere, southern Illinois. There were no other job opportunities that could sustain the lifestyle the factory had provided. And unlike Sally, I was in college, right? I worked at the factory because the recession hit my family hard. In fact, I worked two jobs: the factory and Aerie by American Eagle at the Springfield, Illinois mall.

    Both of our families had these economic setbacks, but Sally and her family, because they lived in a small town, were stuck. And so much was exchanged between us and our watery eyes. It was in that moment I saw, for the first time, that the American dream doesn't really work for anybody. That's the thing about interlocking systems of domination. The jargon of bell hooks, the imperialist white supremacist capitalist patriarchy, was just laid bare. No one came out unscathed.

    Keisha: That's that American dream becoming a nightmare, right? So when you talk about this woman, a wife, a mother, caring for her husband, working in the same factory as her son, I can't shake the image of Rosie the Riveter.

    That "We can do it" energy from mostly white American women during and after World War II. They were making waves in industry and labor and setting up the feminist movement. And of course Black women had always worked outside of their homes, but for women like Sally, it's 50 or 60 years of glass ceilings and second shifts: working all day and then coming home to work up to 19th-century sexist expectations. And then, by the time you meet her, that whole system is just coming apart.

    Annanda: Exactly.

    Keisha: So you said that year with Sally and your colleagues at the factory was your first time seeing how poor white folks lived. Was that what had surprised you about what was happening at the factory?

    Annanda: It was definitely the first time I saw poor, rural, working-class white folks, but no, it didn't surprise me. That didn't make it any less heartbreaking, though.

    I'd seen the recession take its toll elsewhere by that point; 2007 was very clear on that for me. But upperclassmen also weren't getting jobs, and folks who graduated were moving back home or going to graduate school sooner than they would have planned, if they'd planned to attend at all. To graduate college and not find a job that paid enough to start your adult life was unheard of. Falling on hard times seemed to be a common experience amongst many of my peers. People were being cast aside in the wake of a faltering economy, and the American Dream was just... fading, just disappearing before my very eyes.

    SEGMENT C

    [montage]

    Barack Obama: “The news from Wall Street has shaken the American people's faith in our economy. The situation with Lehman Brothers and other financial institutions is the latest in a wave of crises that have generated enormous uncertainty about the future of our financial markets”

    Keisha: Back then, I was in the Southwest, not far from the Texas Panhandle. It's a part of the state that people lovingly call the armpit of Texas, because it's right in the part where, you know, the panhandle sticks up. Sidebar: New York's hot garbage has nothing on the manure winds coming from Plainview, Texas.

    Annanda: That speaks volumes, Keisha.

    Keisha: Oh my god.

    But the kind of desperation that you're talking about was so common then for the young folks who were coming into this big college town from small country towns. They were cattle farmers, and kids of cattle farmers and chicken factory workers, going to school in the hopes that they would be able to escape their parents' fate.

    Those were also still functionally segregated areas, where white folks who'd grown up locally were warning me to be careful in the less developed east side of town, where the Black folks and the Hispanic folks lived. It was wild to me, because they were all struggling. But they didn't seem to know that they were all facing that similar oppression and could be allies.

    Annanda: Yes.

    Keisha: But didn't the term hillbilly originate as a slur for poor whites in the rural South? They've organized with African Americans around their shared oppression at so many different points in U.S. history, from the 1600s to the Civil Rights era. And that's made them a target for cynical politicians who want to use them rather than change their conditions. And it still does.

    Annanda: Exactly. Exactly, Keisha. Poor whites are often invisibilized, similar to Native Americans and immigrants. I shouldn't say just like, but similar. All of this helps to paint a picture of what was going on in the United States when Bitcoin was first revealed to the public.

    The 2008 financial crisis is in some ways a springboard for cryptocurrency. There was a lot of moral despair, injury, and pain at that time.

    This is why, in part, Bitcoin's debut in 2008, which is historic in its own right, is noteworthy.

    Keisha: We got there, folks! Bitcoin!

    Annanda: Yes. Thank you for bearing with us. But I think that background is really important to understand the world that these cryptocurrencies were born into. OK. So Bitcoin started on a new version of the internet that has the potential to disrupt or at least minimize predatory lending and denial of purchase by racial apartheid. It could support the economic and moral repair of the economy.

    Keisha: So Satoshi Nakamoto first wrote about Bitcoin in a white paper on Halloween 2008. Now, we still don't know if Nakamoto is an individual or a group because their identity has never been revealed.

    But the concept of Bitcoin represented a new technological promise, an answer to the American dream that was falling apart. Instead of what we have now, centralized banking from the same Wall Street banks that tanked the economy in 2008, Bitcoin promised a decentralized way to move money. Bitcoin runs on transparency through something called a public digital ledger. We asked Adorable Earthangel to break this down for us; she uses technology as a way to understand how people can connect spiritually and share power.

    Adorable: Rather than relying on the banks who are centralized, it is in the structures of the blockchain that you have a ledger, which has all of the transactions.

    Annanda: Bankers literally used a big book to log transactions. It had columns and rows for tracking deposits, withdrawals, and running balances. That big book was their ledger. And in crypto, unlike at your local bank, these ledgers are visible to everyone.

    Adorable: The difference with the bank is that the bank knows how much money you have and knows how much money everybody has. But with blockchain, everything's visibilized.

    It allows us to play with this concept of money and currency in a way that takes it away from a centralized banking system. There are people who've set up these different servers, and we're using the community to create a bank.

    Keisha: So what if logging cryptocurrency were like building a Lego tower with a group of friends? Every time you add a new Lego block to the tower, it would have a note explaining who added that block and where they put it. No one would be able to change the note or remove the block without everyone else seeing and knowing.

    That's what makes the tower like a blockchain, a chain of blocks or notes that everyone can see.

    Adorable: I know that if I send this wallet this money, everyone can see the transaction so it's verifiable. So the blockchain in essence, creates a lot of things, it creates a trustless system, meaning that you don't have to have trust in what's happening because we can all see what's happening.

    Whereas with the centralized model, I can't see what's happening, I don't know where they're investing my money. I don't know if the bank is using my money to invest in guns or other kinds of things.

    Annanda: Or banks selling subprime mortgages to an investor who sells to another investor, and so on.

    Keisha: With the group Lego analogy, you and your friends would keep track in the Lego ledger every day: who brought which Lego pieces and how many. This ledger helps to ensure that no one loses their Legos and everyone gets back what they brought.

    At least that's what it sounds like. Am I on the right track, Annanda?

    Annanda: Keisha, you're totally on the right track. And, let's be real, if we're playing with Legos, I want all my Legos back. [LAUGHING]

    If only one person keeps the Lego ledger and decides what gets written in it, that's a centralized system. And this is how the banks and Wall Street operate. In the game you play with your friends, if they make a mistake, lose the notebook, or are unfair, you all might disagree.

    Someone might lose some Legos. But because the centralized Wall Street ledgers are at the heart of the U.S. and global economy, if they make mistakes, millions of people might lose everything. During the Great Recession, the government had to give these banks billions of dollars. They bailed them out to make sure they could cover the mortgages and losses on their ledgers.

    Keisha: So, Annanda, you've taught me one more thing that sounds pretty cool. We've said blockchain creates a ledger and makes that ledger visible to everybody. There's another step to it. Instead of there being just one notebook, everyone building Legos has their own copy of the Lego ledger.

    Then, after playing, everyone gathers and updates their notebooks together, making sure they all match. This way, even if one person loses their notebook or tries to cheat, the group can check their notes against each other. That's what makes the ledger decentralized. Everyone has a record, and everyone works together to keep it accurate. In the world of blockchain, the Lego ledger is digital. So every time a Lego, or in reality a piece of data or a transaction, is added, it's like adding a new note to our notebook.

    And just like our decentralized system where all friends keep their ledgers in sync, in blockchain tech, multiple people or computers around the world store copies of the ledger and work together to make sure they match.

    This process is called consensus. In Bitcoin's case it's achieved through something called proof of work; newer networks like Ethereum use proof of stake. Either way, it pretty much makes it impossible for any single person to cheat or change past records without everyone else noticing.
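    For listeners who want to see the Lego ledger as a few lines of code, here is a minimal sketch in Python. It is not Bitcoin's actual implementation, just the core idea described above: every block records a fingerprint (hash) of the block before it, everyone keeps their own copy, and a tampered copy no longer checks out when the group compares notes. The names and transactions are made up for illustration.

    ```python
    import hashlib
    import json

    def block_hash(block: dict) -> str:
        """Fingerprint a block's contents deterministically."""
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def add_block(ledger: list, transaction: str) -> None:
        """Append a new note that points back at the note before it."""
        prev = block_hash(ledger[-1]) if ledger else "genesis"
        ledger.append({"transaction": transaction, "prev_hash": prev})

    def is_valid(ledger: list) -> bool:
        """Anyone with a copy can re-check the whole chain of fingerprints."""
        return all(
            ledger[i]["prev_hash"] == block_hash(ledger[i - 1])
            for i in range(1, len(ledger))
        )

    # Everyone keeps their own copy of the Lego ledger.
    alice = []
    add_block(alice, "Keisha adds 3 blue Legos")
    add_block(alice, "Annanda adds 5 red Legos")
    bob = [dict(b) for b in alice]  # an independent copy

    # If one copy quietly rewrites history, the fingerprints stop lining up.
    bob[0]["transaction"] = "Keisha adds 300 blue Legos"
    print(is_valid(alice))  # True
    print(is_valid(bob))    # False -- the group catches the cheat
    ```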

    Annanda, have you used this tech before? I haven't.

    Annanda: Oh, oh yeah. No, I have. I've definitely gained and lost in this.

    Keisha: Okay. 'Cause I heard about the big crypto rush during the pandemic, and honestly it made me feel a little wary. But I do know what it's like to buy a stock online, and it feels a little bit unreal to me whenever I do it because it's all virtual.

    It doesn't feel like real money. Like, I can log on to, say, Fidelity and tell the site to buy a stock for me. It will pull the money from my checking account, which is also not real money because it's online; I can only see the numbers. And since I'm not going to sell the stock, it will just sit there on the screen in a list.

    I can't see the underlying ledger, like it sounds like I would be able to see it with crypto, but at least I can see the transaction I just did. Is that that much different from buying Bitcoin or other crypto coins?

    Annanda: Actually, not at all. I think the difference is literally volatility. And I cry, because in 2020 I had to sell the majority of my crypto in order to, you know, make ends meet. If I'd had the financial stability to HODL, I'd be in a very different financial situation than I am today. For those of you who don't know, HODL, according to Investopedia, is a term derived from a misspelling of "hold" in the context of buying and holding Bitcoin and other cryptocurrencies.

    So basically, all that means is set it and forget it, or buy it and don't touch it.

    Keisha: Hold on for dear life. Isn't that what they say up there?

    Annanda: That is, yes.

    Keisha: Okay.

    Annanda: Yes. And that is the way, Keisha.

    Keisha: That's what I thought you were doing.

    Annanda: It is the way. Yes, yes. Hold on for dear life is right, y'all. And don't touch it. Don't touch it. Unless you run into a situation like me, where you absolutely have to touch it, and then cry. Get yourself a bottle of Jack and cry.

    You know, death by a thousand economic paper cuts, alas.

    The use of blockchain ushered in the new technological age of the internet called Web3. While the United States government was busy at work trying to keep our economy from collapsing, which also means the global economy from collapsing, Bitcoin quietly came online as if it was a direct response to our 2008 crisis.

    Or arguably, a logical and moral referendum on our banking system. So, what makes Web3 different from Web2, or what all of us today refer to as the internet?

    Brian Brooks: “What makes Web3 different is the ability to own the Internet. The actual network and that's what crypto assets themselves represent is an ownership stake in an underlying network. So when you hear people talk about, for example, layer 1 tokens, what they mean is this is your reward for providing the ledger maintenance services, the computing power to the network that on web 1 and 2 was done by Google.

    So now people in my hometown of Pueblo, Colorado can actually own the Ethereum network.”

    Annanda: That's former Bitfury CEO Brian Brooks, explaining and defining Web3 to the United States Congress at a hearing over a year ago.

    Keisha: So the crypto network is kind of like a co-op ownership model, where because you participate in it, that means you own it and get to make decisions about it.

    Unlike in the regular internet or banking system, where the people who make decisions are Google and Microsoft and Fidelity and Schwab, not me, no matter how much money or effort I put in. I might have understood that for the first time.

    Annanda: 100%. You know, he makes it really clear that Google owns the internet, Facebook owns the internet, er, Meta, sweet Jesus, Zuck, Meta, um, but crypto is the reward. NFTs, not necessarily; NFTs are kind of different. Web3 is a co-op.

    Keisha: Okay.

    Annanda: The current advertisements for Web3, like the metaverse from Mark Zuckerberg, seem to be disconnected from the lives of everyday people. Hashtag prayers for Zuck. Hashtag prayers for Meta.

    Keisha: I can't. [LAUGHING]

    Annanda: Prayers by obligation. The obvious question is, like, why should we care about Web3 and these forms that these tech billionaires are introducing?

    The Great Recession created a rift that has only continued to widen the racial wealth gap to this day. 53% of all Black wealth was lost during the recession. And by Black wealth, I'm specifically talking about African American wealth, African Americans with a history of chattel slavery. And so I wonder how Web3 could be used to repair what has been economically lost due to the white supremacist capitalist patriarchy. It seems so disconnected from our everyday lives.

    Keisha: Yeah. So it's a whole experience for those who want it, and I don't even think that having a space where you get to play out your imagination is the thing that makes it attractive, or a reason that people should care about it.

    I think it tracks back to what you were saying earlier about ownership and authority over your story and authority over your life. The underlying ideology behind Web3 takes us away from the system of domination that has just morphed and changed its form so many times in the last 400 years.

    Is it that it puts the power back where it should always have been—with people? If that's so, then yeah, I definitely want to know more about it.

    Annanda: So do I. That's why I reached out to Adorable Earthangel, founder of the Deeper Tones Collective, who we heard from earlier. The Deeper Tones Collective is a social impact company that creates Web3 products and events specifically for the Black community. Adorable has navigated the Web3 world as a Black woman… and she's made it work for her.

    Adorable: I produced a couple of festivals called Black MetaFest that were held in the metaverse. And it was pretty phenomenal to bring people into that landscape. When I was researching what was going on within the Web3 community, the crypto community, I found that Black people, Black women, women of color, we were very much still marginalized in the ownership of all of these digital assets, and also in the types of NFTs and products. I don't want to buy that and be like, that's me, that's my identity, this monkey thing.

    Annanda: By "this monkey thing," Adorable is referring to NFTs. NFTs or Non-Fungible Tokens are digital assets that can be used in Web3. For example: art, baseball cards, or commemorations of some sort. It’s like a digital collectible item. Think Beanie Babies or pokemon or baseball cards. It was like that. But with random animals and made up creatures. what Adorable’s saying is — "These animal things don't reflect my culture or what I value"

    Adorable: NFTs come on the scene and I'm studying blockchain technology and I also happen to be an artist. So what do I do? I start creating NFTs.

    I start getting into this movement and unpacking how to not just be a consumer but a creator. I'm completely embraced by the Web3 community. I start creating NFTs, I start consulting within that. I bring my product management skills and everything that I know. And that's kind of what led me to get into Web3.

    I guess at my core is service to community and service to people. I've always been fascinated by the wellness of community. It came to me in a dream. I was like, wait a minute, we create our own money. I could create money. Why don't we just put Harriet Tubman on the $20 bill? We've been talking about that forever. And so I did that within the crypto space. I created Black money, and it was something that I was really excited about.

    Annanda: Forever we've been trying to put Harriet Tubman on the $20 bill.

    Keisha: Right.

    Annanda: Right. And so I think what Adorable is talking about is Black people as faces on U.S. currency in particular. Forever we've been debating Harriet Tubman on the $20 bill. She has yet to show up on the $20 bill.

    We still have, I believe, uh, is it Thomas Jefferson? Or Andrew Jackson? Some slaveholder is on the $20 bill. I think what Adorable is doing is saying the country has taken too long. So, if y'all aren't going to put Harriet Tubman on it, I'm going to go and make an NFT, and that NFT is going to have Black people on different forms of fake, quote unquote, US currency, but it's real in terms of an NFT.

    Keisha: Mm hmm. And it is Andrew Jackson.

    Annanda: He was gnarly, man.

    Keisha: There's a note from the New York Times in 2021 that says Biden's Treasury will seek to put Tubman on the $20 bill. That was two years ago. It took, like, 30 years for Tubman to get her federal pension from the Army.

    Annanda: Good God, America, see? See, that's why Adorable went ahead and made Black money.

    Keisha: And they were probably waiting for her to die, because life expectancy was shorter back then.

    Annanda: Wow. Wow. She's African Americans' Joan of Arc, in terms of how she literally heard the voice of God and would act; that's how she knew how to move between the North and the South. Whew.

    Annanda & Keisha: Okay.

    Adorable: I put it on a blockchain called Solana, which was a newer blockchain at the time; it still is. It was one of the best experiences I've ever had. I did it without money, too. I made all of that happen within my own means. It opened the door for me to see what's possible.

    Annanda: I asked Adorable what she was seeing in the web3 industry that made her want to create a startup focused on the Black community.

    Adorable: One: there aren't that many software engineers, like Black people, Black women, people of color, in tech, period, who are making the technology. I think people think that San Francisco is diverse, and we think that the space is diverse, but here's what you have to understand about technology.

    They want to bet on a sure thing. Capitalism within technology is about winning. It's about solving problems, but solving them in a way where you're creating a return on investment.

    I worked at a venture capital firm, so I understand what's really happening here. It's almost a self-fulfilling prophecy, right? 'Cause money is building technology, and the technology is solving problems, but it's solving the problems of other technologists and industry; it's not really solving the problems of everyday people. That's why the disparities in poverty are so huge.

    There's a moral crisis within technology, and the moral crisis is this: when you have venture capital, there are more diverse funds now, there are funds that are for people of color, for immigrants, those are happening, but they're few and far between.

    The majority is traditional technology that is solving for things that support the middle or upper class. It's this feedback loop that's happening over here, but it's not going all the way down into the everyday problems of everyday people.

    Annanda: Adorable's using decentralization as another way of looking at intersectionality, right? She's going beyond identity into aspects of our relationships with others and with the self.

    Adorable: I mentioned before blockchain technology being a decentralized set of nodes. You are a decentralized set of nodes. You, Annanda, are this node that has different smart contracts within each node.

    That's real. There's a smart contract called your relationship with your mother. And there's a node that's called how you relate to food. There's a node for your relationship with your mind and your mindset. And you can update your smart contract for all these different parts of you.

    You're not one thing, you're decentralized. We all are. And I want to talk about how in our decentralization, we can heal ourselves at all these different micro levels. We're all connected on our healing journey. And when we come into that oneness and that understanding. And we do the work.

    There's nothing that we can't do as a humanity.

    [Closing music starts]

    Annanda: If bell hooks were alive today, and she hasn't been gone that long, what would she be shaking her finger at, and putting her hand on her hip about? 'Cause she does it all.

    Keisha: Yes, she does.

    Annanda: And she was a woman of deep faith, in particular Christian faith.

    Keisha: Yeah, I would say I don't often hear about that side of her life and contribution.

    I think she would be in cahoots with the network of scholars and activists who are trying to describe imaginative ways forward that help to build a different way to thrive in and beyond the system that we inherited. A different way of building wealth, a different way of building influence, a different way of achieving success. I think she would be skeptical about the crypto hype, but she would be curious about the potentials it opens up, and she would want to see the economists and the technologists and the critics making real talk amongst themselves about the risks, and mitigating the risks for the people who have little.

    I don't think she'd be easily swept away by claims that it would fix everything. I don't think she thought anything would fix everything. But I think she would wonder about who's being used to message this change, whose image is being put on the thing, who's determining what has value within the system, and what new images or representations would help us see a way forward. That's what I would expect from her.

    How about you?

    Annanda: That's excellent, I have to sit with that.

    Keisha: It was a great question, it was your question.

    Annanda: I mean, it's true. I'm talking about your response, I have to sit with that.

    Keisha: Oh, thank you.

    Annanda: But thank you, I'll take it, because you got the solid question. So I'm gonna take that up and pat this back right here.

    Keisha: Yes, yeah, it was a great question.

    Annanda: You know, I think with bell, it's hard for me to look forward. What I mean by that is we are still so stuck in these systems of domination. Out of all of the works that bell hooks published, I think she published over 40 titles, something insane.

    She read a book a day. Her self care was a book a day, right?

    Throughout the themes of these books is still this echo. This talking about undoing this system of white supremacist, capitalist patriarchy. Part of me wants to be like, you know, I hope she could wrangle some of her colleagues that are a part of the economic minority, right?

    Who can speak and bend those ears and say, how are you having your people do this work? I think that's where I'm stuck. There are very few things in which I hold little to no hope, but economic mobility, closing the wealth gap in the United States is one of them.

    Keisha: As in you have no hope that that's possible?

    Annanda: As in, no, I believe it's possible. I have no hope that those who have the agency to shift things because they actually have the economic upper hand will.

    Keisha: So we've talked a lot about what individuals can do, but what about the government?

    Annanda: What about the government? I feel like it is truly a government responsibility, because access to affordable housing, or even owning a home, in the United States has historically been made inaccessible for Black folks and other historically marginalized groups, but in particular for Black folks and Native folks, due to government policies and apartheid economic policies.

    For me, this really is an issue that the U.S. government is responsible for and needs to correct. And do I think that the U.S. government, in all of its glorious functionality...

    Keisha: And shutdowns.

    Annanda: Yes, and shutdowns. All of it, Keisha. I do not believe that the U.S. government will do the right thing by righting the wrongs of the years of apartheid for Black people.

    And I say that because it has had the option to do so, and it has yet to do so. I want to be proven wrong. Please prove me wrong. I don't think it's just on private interests and entities, but I do not believe the U.S. government will do the right thing.

    Keisha: We started the episode asking whether these technologies would create the opportunities we've been promised, and what I'm hearing from both Adorable and you is that so many of these promises have been broken.

    And individuals and collectives and cities and some states have been making experiments in fulfilling the promises, whether by local-scale reparations projects or by giving people direct funding so that they can buy or refurbish houses that have been in their families. Those are local, small-scale experiments happening below the level of the federal government, but there's still a missing layer there.

    And that's the overall system and the direction in which it's pointed: whether it's pointing toward people's flourishing or toward some people's profit. And I think we end the episode with that question still: where does the promise get fulfilled, and who's responsible for making decisions that take us in that direction?

    CREDITS

    Annanda: I’m Annanda Barclay

    Keisha: And I’m Keisha McKenzie.

    Annanda: The Moral Repair: A Black Exploration of Tech team is us, Ari Daniel, Emmanuel Desarme, Courtney Fleurantin, Rye Dorsey, and Genevieve Sponsler. The executive producer of PRX Productions is Jocelyn Gonzales.

    Keisha: This podcast is part of the Big Questions Project at PRX Productions, which is supported by the John Templeton Foundation.

    [PRX SONIC ID]

    SHOW NOTES

    Work with our Guest Adorable Earthangel!

    bell hooks defines white supremacist capitalist patriarchy

    bell hooks shows the difference perspective makes in Black Looks: Race and Representation (1992).

    Some background on the history of the New Deal and the 2008 crash:

    How the New Deal Left Out African-Americans (Smithsonian)

    Code Switch explains the history of housing discrimination and redlining

    Economics Professor Richard Wolff (The New School) explains the 2008 subprime mortgage problem

    Investopedia breaks down the AIG bailout

    In January 2021, the New York Times reported the Biden administration’s intent to include an image of Harriet Tubman in a redesigned $20 bill

    Sam Bankman-Fried, co-founder of FTX and Alameda Research, was found guilty of fraud and conspiracy after his crypto firm collapsed. He faces a lengthy prison sentence. Reuters reports (Nov 2023)

    The Income Gap: the US Department of the Treasury on racial inequality in the United States

    Why the Great Recession Made Inequality Worse by Ken-Hou Lin and Megan Tobias Neely

    Explore more on the 53% loss of African American wealth during the Great Recession in “The Color of Money” by Mehrsa Baradaran

 

Caribbean Insights Upend Tech Norms

Season 1 | Episode 5

  • SEGMENT A (INTRO)

    Annanda: So I love Adam Curtis documentaries.

    Keisha: Love a good documentary—and what’s the draw to Adam’s work?

    Annanda: They are heady, visually stimulating. He has a very keen talent, which works well with my brain, for taking things that seem to be totally unrelated and showing how they intersect. They aren't the kind of documentaries that are necessarily restful, but I find Adam Curtis incredibly engaging. Watching his documentary "All Watched Over By Machines of Loving Grace" was the first time I heard about this internet manifesto called the Californian Ideology. Here's a clip from the documentary.

    (play clip 0:00-0:23)

    Annanda: The Californian Ideology summed up the culture and moral beliefs that drove Silicon Valley tech in the '90s and continue to this day.

    Keisha: I’m ready to get into this!

    Keisha: I'm Keisha McKenzie, technical communicator, strategist, and advocate.

    Annanda: And I'm Annanda Barclay, death doula, moral injury researcher, and chaplain who’s concerned with a life well-lived and cherishes the relational stories people share.

    Annanda: This is Moral Repair: A Black Exploration of Tech. A show where we explore the social and moral impacts of tech advances and share wisdom from Black cultures for healing what tech has broken.

    Today on the show— Is the dominant worldview of Silicon Valley a reasonable representation of the real world?

    <music>

    BREAK

    SEGMENT B (Introduce the Californian Ideology Concept)

    Annanda: The term Californian Ideology was coined by Richard Barbrook and Andy Cameron in 1995, in a manifesto essay that both celebrated and critiqued the emerging belief system they were observing in the culture of Silicon Valley. The essay is a staple taught in many university history-of-computing classes.

    The Californian Ideology mixes Silicon Valley's business-directed and libertarian free-spirit attitudes with a belief in technology's ultimate power to improve society.

    Keisha: That sounds a little bit like PayPal's Peter Thiel. What does it mean?

    Annanda: The term Californian Ideology critiques the idea that digital technology can solve societal problems without first addressing underlying political and economic structures. Here's a clip of Richard Barbrook talking about the ideology last year.

    (play clip starting from 0:13-1:06)

    Annanda: This ideology helps to explain how America's tech culture blends technological utopianism, individualism, and anti-establishment sentiment, which together shape big tech's successes and controversies.

    (Insert a montage of tech company successes and controversies: Surveillance clip starting at 0:00-0:13,0:25-0:31, 0:01-0:06,3:19-3:26, 0:07-0:15)

    Annanda: It's like mixing ideas of personal freedom and challenging the system with a deep trust in the latest technology, and putting entrepreneurship first.

    This combination is reflected in the ethos of tech companies, which tend to prioritize innovation, disruption, and a libertarian approach to regulation.

    Annanda: First, tech has an impact on our work culture. The American tech industry is known for its informality, emphasis on creativity, and non-traditional work environments. Think Google, Meta, and Apple having everything you need on site: three meals a day from chefs in multiple cafeterias, cafes, gyms complete with yoga and workout classes, medical and dental clinics, real talk, y'all, laundry, dry cleaning, and child care.

    Yes, Google has child care. They supply you with everything you need, including a hefty salary, so you have no excuses for not having enough material support to achieve your work. It's designed so you don't have to leave. Creepy. Creepy.

    Second, tech is well known for its pro technology, pro market, and individualistic business mindset. The impact of which is reflected in how tech leaders engage with social and political issues, often focusing on technological solutions to complex problems, which is an issue we'll dive into in the second part of this episode.

    Finally, tech companies also tend to show a preference for minimal government intervention, which is slowly becoming more of a household conversation, right? These companies accumulate vast amounts of power and data, leading to concerns, again, as we've talked about in prior episodes, about privacy, surveillance, and monopolistic behavior.

    Hashtag Peter Thiel, to Keisha's point earlier, who really does enjoy a good monopoly.

    (insert clip 0:54-1:05)

    Annanda: In short, we know the Californian ideology well. We live it, and the results of it, in our everyday lives.

    We just don't refer to it by its formal manifesto name. And Richard Barbrook and Andy Cameron totally called it in 1995. It's ubiquitous in American tech culture. It's the water we all know and swim in. And it seems like an ideology that's become too big to fail.

    Keisha: When you talk about the libertarianism, the utopianism, the individualism, it just sounds like the latest evolution of some of those early formative American ideals: the work ethic ascribed to Protestants in the colonial era, the triumphalism before World War I, when the Industrial Revolution was transforming the country.

    It was outliers in religion who were saying, well, maybe human nature hasn't caught up with the technology. And then we got the world wars to prove it. So it's like some of these streams evolving over the last two, three centuries to bring us this network of extraordinarily resourced—they're not industrial barons, they're tech barons—billionaires who are able to influence policy, but also literally influence the devices we use to live our lives.

    And it makes sense that we should try to unpack what they're thinking about and what assumptions they're making about our world and us.

    Annanda: I like how you said tech barons, because we don't talk about them that way, but that's exactly what they are. And if we compare them to the industrial barons before them, tech barons know a lot more about us and have way more access to ourselves, our habits, our histories.

    They know us in a frighteningly intimate way through our data. Let's unpack this because it's real. And this ideology has named and pointed to it.

    Keisha: I also think it makes sense to see it in the light of those older ideologies, because if we treat it as this outlier, then it becomes a thing that's other and foreign, but it's actually very native to the U.S. It's almost as if it would have been a surprise if it hadn't emerged from the U.S., given all of the streams feeding it. And I remember, not long after I came into the country, it made sense for me to look with some sense of aspiration and idealism at the people being successful in technology, because of my discipline and because of the spaces that I was in at the time.

    They lived the sort of plucky, reinvention, self invention, do it yourself kind of spirit that America preaches to the rest of the planet. And yet, nobody ever really unpacked what that sort of entrepreneurialism-to-the-extreme cost everybody, in terms of a tunnel vision focus on individual success, perhaps at the expense of the communities around.

    Annanda: As you're saying that, given that it's called the Californian Ideology, I think it actually shows the extreme even more, because California is out west, and there's the history of what it took to shape and cultivate the West.

    This idea of the libertarian free market is very much overemphasized in the West.

    I've really noticed the difference of that culture compared to, say, Chicago or aspects of the East Coast where you're at. There is a very particular tinge on it that is, like, the ghosts of the gold rush coming back to haunt us, and it really has shaped the culture of California. To me, it makes total sense. It's as American as apple pie.

    It's clear. This is very California.

    Keisha: So given this approach to reality, it makes sense for where America has come from, but it also is incomplete. So where have we benefited from the Californian ideology as a culture? What's still left out? And is there a better way?

    BREAK

    SEGMENT C (set up Jane as a re-framing of what’s possible)

    Annanda: We interviewed Jane Gordon. She's an Africana Studies professor at the University of Connecticut. We wanted to get her take on a Black Caribbean point of view compared to the Californian Ideology.

    Jane, how would you describe the dominant worldview of Silicon Valley?

    Jane: My sense, which does feel like it's probably pretty caricatured, to be honest, is of a lot of very young, unattached people who are increasingly international, who really have tremendous faith in the ability of technical forms of human creativity to do really innovative things.

    Some of which are likely to actually benefit many people and some of which I think they're not actually even thinking about what the large scale effects will be on people.

    It's interesting because their influence is so massive, right? And it's so outsized. And I think that's why when I stop to think about who actually is embodying, like, who we're actually referring to, I realized that it's probably a pretty caricatured image.

What I'm thinking about are the consequences and effects. I know that there's a pretty massive gap between the folks who are developing technologies, with all kinds of expectations about what will come with them, and the effects those technologies are having on everyone else.

Annanda: How do you think the caricature that you describe differs from dominant narratives, or from narratives outside of Silicon Valley?

Jane: I think about IT as a really innovative area, right? I'm at a big public university in New England, and the range of people who are drawn to the opportunities and problems opened up by technologies is vast. A lot of the people flocking to those fields the most are immigrant kids; they're kids of color.

My presumption is that the dream of some of those people I'm describing is to work in Silicon Valley and to transform it, to remake it in their own image.

I think other people just assume in advance that that world will always be closed to them, and that they'll be doing either subsidiary work or work that's trying to develop other forms, other conceptions, of technology and global technology. I don't know what kind of relationship that will have to Silicon Valley.

But it feels as if it's the sun around which everything orbits, and I think people make all kinds of choices when opportunities and resources are distributed that way.

Annanda: What social concerns surface for you around this idea that tech can be a panacea for all and can save everybody? And why should we care?

Jane: Yeah. What I do find a little bit challenging is that I think of tech very broadly, right? Just last week, I was reading the work of a political theorist on North American Indigenous political thought. And one of the figures whom he studies most closely is a man named George Manuel, an Indigenous person from what is now Canada who was absolutely fascinated by Tanzanian experiments in socialism.

And so he had this really interesting approach to tradition, which he thought was never at odds with emerging technologies. He used to say, for instance, that just because you're a farmer and want to continue to be a farmer doesn't mean that you need to use a wooden plow. So he saw technology, in its valuable expressions, as simply anything that enabled people to make their values live, and live dynamically, into the present. That could be a wooden stick; it could be any number of things. And so I have to admit that when I think of technology, it's really more in that vein. It's anything that amplifies human abilities that we're committed to using.

A pen is a technology. I think about it in this sort of very broad way that I think has, in some ways, been narrowed by the rise of Silicon Valley. Only a certain number of things have come to mean tech, when the truth is tech is human creativity trying to amplify our own abilities.

When I think about Africana philosophy, it's so rich with a broadened conception of technology; it's all about cultivating human wellbeing, cultivating sustainability. There's this narrowed version of what we mean by tech that is coming to eclipse the broader world of human creativity it would make more sense to embed it back in. And that's even more of a concern if people want a narrowed version of tech to solve all our problems, because it divorces very particular technologies from what motivated their development, and from questions about how they should be distributed.

There's always a human desire for technical solutions to complex, human-made problems. And so I think we often channel our creative energies into more decontextualized, technical ways of addressing problems that are always embedded, always social and cultural, and thickly enmeshed in everything else. We need this broader understanding of technology if we're going to think about the role of technology in solving our problems.

    But it's we who solve our problems. And we have to remember that we also are the creators of the technology that we're looking to solve our own problems. And so there's this strange relationship to our own agency when we turn to things that we've produced but act as if they're independent of us and that they're going to solve things for us.

Because of course they bear the marks of us, right? Good and bad and every kind of combination of those. So my own feeling is, and I'm completely indebted to Octavia Butler in what I'm about to say: human beings have this massive capacity for imagination, and we're terrible at solving our own collective problems.

Keisha: Jane is referring to the works of the late science fiction writer Octavia Butler. She's pointing to a theme in those books around human beings creating meta-existential crises, and the struggle it takes Butler's characters to try to solve collective problems.

In recent years, Butler's novels Parable of the Sower and Parable of the Talents have had a renaissance, coming back in podcasts and a theater production. So a new generation is experiencing these themes from the mind of Octavia Butler.

    Here's Jane Gordon again.

    Jane: You know, we're great at producing things, we're terrible at distributing them well, and we just do it over and over and over again. We're fantastic at the less human, less social dimensions of creativity, and we're really bad at working with each other on a collective scale that's not small, in figuring out what these should actually be in the service of.

I see it so much, say, in film right now: you'll go and see these films, and they're technically brilliant, and they're often superficial. I mean, not everything. There's amazing film and TV getting made right now; I don't mean it like that. But there's such an emphasis on what we can do at a narrowly technical level.

    But I wish that same attention were devoted to what it is that we're bothering to explore.

Annanda: How does Black Caribbean philosophy make social sense of the dominant ideology you see in Silicon Valley? Or maybe contextualize it?

    Jane: Right.

I'd answer that in a couple of ways. The first is that there's a remarkable kind of how-to orientation that really runs through Caribbean identity, political identity especially. I mean, one of the things that I think kind of dumbfounds folks who haven't been exposed directly to Caribbean people is that there's a sensibility of, who are you to tell me this is impossible?

Like, we're going to make this happen. You know, it's just a question of, like, give us some time, maybe. But the will is always present. There's this kind of remarkable creativity that comes with that, 'cause rather than devoting all your energy to convincing yourself that you can do something, you're devoting it to doing the something.

The thing that I think is, in a simplistic way, really attractive about Silicon Valley is that there's this kind of "let's do this new thing, let's be pioneering" spirit of innovation that is hard not to be drawn to. But where the Afro-Caribbean sensibility is so different is that the idea that you would create something separate from the social world of which it's a part would be alien, right?

The idea that you would do that with absolute disrespect for any sense of the world, elders, the earth, right? I mean, one of the things that always really struck me about Caribbean fiction is that these aren't the narratives of individual protagonists. I mean, Caribbean literature of all kinds. There's a thick social world that always comes with it. Yeah, you have heroines, but there are worlds that they're acting in. So that sense that a technique would transform the world and fix all our problems? I mean, people would laugh you out of the room.

    I think the other thing too, that you see in Caribbean philosophy and how it's written and what's prized is there is a much less constrained approach to creativity.

Creativity is in everything. It's not just in how you solve a political problem or an economic problem or a technical problem. It's in how you speak. It's in how you make music. It's in how you cook food, you know? That sense that creativity must draw from everything, I think, then shifts how people think about what technology should be in the service of, right?

Because if it's drawing from all of these domains, then it's also answerable to them, I think, in a way that's quite different. In other words, just as we bring our character to the technology, I think we bring our philosophical anthropology to that technology as well. So we imagine the technology as a decontextualized, individuated thing, rather than a thickly enmeshed, you know, social creature with ancestors and descendants whom we're obligated to listen to, to respond to, and to be accountable to.

Annanda: You gave examples of food, and of being accountable to ancestors and to the planet, of being in service and in community rather than separate. Is there a particular philosophical or spiritual lens that you could give our audience, as everyday people, to consider an alternative approach to innovation outside of Silicon Valley?

    Jane: The first thing that comes to mind really is how profoundly creolized Caribbean music is.

There are different moments and there are different kinds of Caribbean music. I don't want to treat it as a monolith, because it's not one; that would be an injustice to it. But there's a way that, say, in Caribbean music, there's no missing that it is Caribbean.

And yet what's Caribbean about it is how it brings so many different cultural influences, and things that move people, into a coherent whole. And so you listen and you think, oh, that's from there, and that's from there. There may be things that you don't actually know the origins of, and yet it's in their combination that you recognize their expression of Caribbean-ness.

And so I think there's something about the ability to draw from, while acknowledging indebtedness. There's no pretending that you haven't borrowed, but there's a sense that you can borrow in a way that enriches and that doesn't leave you indebted. When we're talking about technologies that we should prize, whose availability we should reinforce, I just wonder if there's a way of emulating that way of borrowing so generously from all of the different contributions that come from distinctive experiences, distinctive locations, where what we're proud of, and what makes them an expression of who we are, is how they've come together.

    So it's not about pretending that we're the unique source and it's not saying that to borrow means we're imitative. There's a model in that of human community that I wish we could emulate in so many other domains.

That we acknowledge that there are different words, different music, different ways of engaging in ritual that really carry the wisdom of different experiences of the human world. And that if we know how to sew them together, that's when we do best. And by do best, what I mean is we're figuring out how to actually be in human relation.

    Keisha: What if the tech industry is also a space of different cultures and contributions being knitted together to create something new, with some of the same sorts of power dynamics determining which contributions have value or rise to dominance?

    Jane: The tech industry itself is creolized at the level of the creativity that we see at work in it.

Especially good for this conversation is the 1972 Jamaican film The Harder They Come. Jamaica and Jamaicans had often been the backdrop for movies that really centered other protagonists, whether they were James Bond or anyone else; this was the first film where Jamaicans in Jamaica, ten years after independence, were the focus.

I think it's useful here for a few reasons. The first is that if you see that film, it's obviously not just a film.

    (insert clip 41:15-41:31)

Jane: It stayed unchanged in its focus on Jamaica. It really was a Jamaican film. The story itself is a quintessentially Jamaican story.

What I think it captures so powerfully is that it's a working-through of the promises and fantasies of Hollywood, knowing those but speaking back to them.

    Annanda: So, in this case, Hollywood and its dominance in film would be a metaphor for the Californian ideology and its dominance in tech.

Jane: You have somebody who comes from the countryside, with the death of his granny, and wants to come to Kingston and make it big. He has this dream of becoming a superstar musician. And of course, he's played by Jimmy Cliff.

Annanda: For those who don't know, Jimmy Cliff is a pivotal figure in reggae music. His work helped bring reggae, as a genre, to a global audience through the soundtrack of The Harder They Come, the film Jane is describing now. For the diehard Bob Marley fans out there: it was Jimmy Cliff who put reggae on the global map, and Bob Marley was able to use the platform Jimmy provided to amplify reggae music even further.

Jane: What he encounters at every turn is the brutality of colorism and the brutality of stark class inequality. And yet he persists; he does become a superstar musician, and he's completely ripped off when the song is recorded. The radio hosts who control what can become a hit try to block it from circulating, until he becomes so notorious for outrunning and outsmarting the cops that it becomes the soundtrack to his life and to his story.

And the song is, you can get it if you really want, but you must try, try and try, you'll succeed at last, right?

(insert clip 0:06-0:24)

Jane: And there's this irony of ironies: there's this sort of shallow version of meritocracy that should be true. But in those circumstances it's so absurd that the song takes on this kind of irony, because you see someone trying, but what does trying mean? And you see them succeeding, but what does succeeding mean? There's also this nod in it to the Jamaican love of the Western.

At the end, the cops show up and there's the shootout, and he suddenly imagines himself as the protagonist of the spaghetti Western that he sees when he first comes to Kingston. And so there's all of this interesting taking-on of tropes that circulate in Hollywood, but speaking back to them through a protagonist who's a member of the lumpenproletariat.

Annanda: Jane cites this example because it shows the mindset of Jamaican Black creolization through that spaghetti Western scene. The scene speaks back to Hollywood: Jamaicans adding a nod to spaghetti Westerns, being cheeky and ironic, putting a spaghetti Western plot inside the narrative of their own film. It's a wink and a nod back at Hollywood, as Jane is saying, but it's also still uniquely Jamaican.

It's a Jamaican take on the spaghetti Western, and I think that's a wonderful moment of creolization that Americans, or really anybody who's seen a spaghetti Western, can recognize in the film. And so what Jane is doing is giving an example of creolization within Jamaican art and culture, through film and music.

Jane: The other piece of it that I think is interesting for our conversation is the life of the soundtrack, because the film features Jimmy Cliff, and the album made from the movie circulated far, far more broadly than the film. The soundtrack had a life completely of its own, and many people who don't even know about the film have encountered the music. It was considered the first global reggae album. And really, it was one of the first encounters globally with Jamaica speaking back to the world about promises of freedom.

It's a great example of people being very excited to employ and use a technology, but absolutely remaking that technology to tell the kinds of stories that they wanted to tell.

When it comes to who's thinking about ways of expanding the reach or the capacity of human beings, the contributions come from everywhere. But it's a little bit like the radio stations with Jimmy Cliff's record: they were only willing to circulate it when they were assured that the profits would come to them. And that was only when the notoriety began to surround him.

It's similar with technology: there's a big difference between what's actually being generated and what's potentially in circulation, and then there's the controlling of that circulation and that distribution. What isn't creolized is who gets to make those decisions.

Keisha: The Jamaican national motto is "Out of Many, One People." I just became a U.S. citizen, and the U.S. motto talks about e pluribus unum, the out-of-many one. How is that different, or is it, from the Creole impulse to pull things from many sources into coherence?

    Jane: An argument that many Latin American scholars have made about the ideology of mestizaje or sort of a national identity that's already mixed, is that you can include people in ways that actually erase them.

One of the policy approaches to Indigenous dispossession in the U.S. was forcibly incorporating Indigenous nations that had no interest in U.S. citizenship, that didn't want to belong, because belonging essentially discredited the sovereign status of their own states, right? And so there are ways of incorporating that are actually about not only diluting difference but literally erasing it, bleaching it, for lack of a better word.

Whereas I think the point with creolization is the coherent-whole idea: that we can actually be enriched by the continued difference within that whole. We don't want to flatten it and we don't want to mute it; we want it to speak fully, you know, vocally. The emphasis on creolization, and not everything that's called creolized gets it right, because it certainly doesn't, but as an ideal it's such a useful one, because it says we want that difference to continue to speak, and that the relationship is the differences speaking together.

Rather than letting one drown out the other. And while there are imperfections in how that Jamaican ideal is realized, as an ideal it's such a valuable one, because it's genuinely rare. It's saying that within the nation we're a multi-nation, that our pluralism is a strength, that it's not a site of threat, that it's not a weakness that we haven't all become modulated into one, you know?

Annanda: What Jane has been talking about is a prime example of the limits of dominant ideologies like the Californian ideology. Richard Barbrook and Andy Cameron critiqued the dominant idea in tech that technological innovation could solve all of our problems.

'Cause they're like, no, it can't, not without first addressing the underlying political and economic structures and the experiences that differ from those of the dominant tech culture. And I'd take it a step further and say that technology can never solve all of our problems. I really do believe it boils down to how we relate to one another, how we are in relationship with one another. We can use technology to facilitate that, to make it more easeful, to enhance it.

But ultimately, there is no tech, in the form of tech we consider to come out of Silicon Valley or computer innovation, that is going to solve all of our problems. Jimmy Cliff's character served as that example in the film. The daily experiences of the global majority serve as a constant reminder of the massive gap between the ideology that runs American and Western tech companies and the lives of everyday people. It's an ideology that leaves no room for meaningful creolization.

    Keisha: When I was listening to Richard Barbrook talking about the origins of his read on the Californian ideology, I was moved by one of the things he said, which is that the empire is in decline.

So the American empire is in decline, in his read, but he's saying that from the UK, an empire that declined long ago, during the very era when Jimmy Cliff was bursting onto the scene. And so you see creolization happening both in the shadow of empires and in the aftermath of empires. It points the way to a future of differences living together in a functional and healthy and creative way, versus differences living together in a consumptive way, which is what empire brings you.

    [Closing music starts]

Keisha: When they used to talk about the global village through technology, or even in the early days of social media, they were talking about, oh, this creates connection, and it drops all the barriers and borders. Well, yes and no. It did in some ways: cultures were able to connect instantly. But it didn't in terms of the power structures, which didn't change, so people weren't actually on equal terms; it was object and voyeur, you know?

    So it's a different sort of dynamic that the tech industry actually gave us versus the rhetoric of global village.

The internet said we would be able to know each other, and what it has actually meant is that I will know all about your first-date presentation of yourself through the social media update. But the care and the community and the connection that we need to survive the crises we are already in, whether those are economic or military or environmental or otherwise, that's the challenge that Earth is throwing at us, and we're going to need different strategies, including this extraordinary creativity, to navigate them.

At core, we are still mammals, and mammals like soft things. Good relationships and community are a soft thing, and we need more of it.

    Annanda: The narrowness of Silicon Valley and its approach to life is just not reflected in my everyday experience with the diversity of humanity and relationships that I encounter and cultivate.

And so I feel like the gift of Black Caribbean Creole, or the gift of Jamaican Creole in particular, is this idea of: we're all here, we're all on this tiny-ass island, and we can make it work by honoring each other, because we have to. You just expand the island to a planet, and I think the point is easily made.

    And so what's the Black Jamaican Caribbean wisdom here? It's creolize. Break down barriers to access. Break down barriers to knowledge. Break down barriers and acknowledge our shared value that out of many, we are one.

And also, have some of that good Jamaican, Caribbean, you know: who are you to say we can't do it? Who are we to say we can't do it? Especially when we really haven't tried in this area of Western and American tech.

    CREDITS

    Annanda: I’m Annanda Barclay

    Keisha: And I’m Keisha McKenzie.

    Annanda: The Moral Repair: A Black Exploration of Tech team is us, Ari Daniel, Emmanuel Desarme, Courtney Fleurantin, Rye Dorsey, and Genevieve Sponsler. The executive producer of PRX Productions is Jocelyn Gonzales.

    Keisha: This podcast is part of the Big Questions Project at PRX Productions, which is supported by the John Templeton Foundation.

    [PRX SONIC ID]

    SHOW NOTES

    Check out our Guest Professor Jane Gordon

    The Californian Ideology

    Jimmy Cliff

    The Harder They Come Film

    “All Watched Over By Machines of Loving Grace” an Adam Curtis Documentary Series

 

Data Science, Consent, Colonialism—What We Can Learn from the Woods

Season 1 | Episode 6

  • Keisha: Annanda, let’s go back to 2010.

    SOUND: open, folksy montage music -- maybe fading up the I’m From Driftwood background will do it.

    It was the heyday of public storytelling profiles like Brandon Stanton’s photo series Humans of New York and Nathan Manske’s LGBTQ video profile site I’m From Driftwood. Platforms like these were based on the idea that people who believed in equality, equity, and justice needed to “tell their stories.” And if they did that, it would help them find a home in a harsh world—online and offline. You heard of them?

    Annanda: “Humans of New York,” definitely.

    Keisha:

    The website from I’m from Driftwood, tells us about this strategy of belonging through visibility:

    SOUND: Clip from “I’m From Driftwood - The LGBTQ Story Archive”

    [00:00-00:08] Narrator: “I’m From Driftwood: The online archive of lesbian, gay, bisexual, and transgender stories from all over the world…

    [00:24-00:31] Narrator: “LGBTQ people aren’t just in the big cities. I’m From Driftwood’s message is that no matter where you are and what you’re going through, you’re not alone.”

    Keisha: It sounds a little naive to me now, but it was common at the time to think that if people who were different came out of the shadows and left their anonymity behind, that was what would change hearts and minds and help create just communities.

    If they really knew us, we thought, they’d treat us better.

    Even the most hypervisible Black woman on Earth got in the game, using her stage to spotlight others. Here’s Oprah on her show with John Amaechi:

    [00:05-0:08] Oprah: “Do you think there are other professional athletes who are still in the closet?”

    [0:08-00:14] John: “Yeah, my mind boggles when people ask this question. Of course, yes, there are definitely—there are definitely other athletes. And it’s a workplace thing…

    [00:19-00:28] John: “There are gay people in all professions. It’s not just the ones where people find it acceptable or expected. It’s not just makeup artists and hairdressers and, uh, flight attendants so [laughter] we do exist in other areas.

    [00:30-00:31] Oprah: “Yeah. Ok.”

    I was a bit ambivalent about this idea of revelation as revolution, but I did it myself. I helped others do it too.

    By then I’d already spent more than a decade and a half | online | I rarely used my government name | and I mostly connected with people I’d already met in person or interacted with for a while in discussion rooms.

    I got to know these people in some really meaningful ways—we went through graduations, bereavements, marriages, divorces.

    But I didn’t know much about their day-to-day lives, or expect that I or anyone else should have access to that depth of information.

    I’d have been mortified if I’d known how easy it was, even then, for tech companies to put together all of those data points from the texture of our lives online.

    They can do that now.

    SOUND: Add Title music

    Annanda: I’m Annanda Barclay.

    Keisha: And I’m Keisha McKenzie.

    Annanda: And this is Moral Repair: A Black Exploration of Tech. A show where we explore the social and moral impacts of tech… and share the wisdom from Black culture for healing what tech has broken.

    Keisha: On today’s show: Tech companies have access to an immense amount of data about each of us. What does it mean to be known in a world where no one can be anonymous?

    Annanda: How are technologists shaping that world—and how is it shaping all of us?

    SOUND: End title music

    BREAK

    SEGMENT B: The people behind the data

    Annanda: I’m not even on social media anymore. Except YouTube.

    Keisha: How come?

    Annanda: Well… I had a major relationship that I broke up from years ago. We're talking like 2016. | For those in the camp of divorce, it was a marriage. And I decided that I had too much online. Like I realized | how public | I intentionally made my relationship.

    SOUND: Fade in audio from Jill Scott’s “Watching Me” -- e.g. 00:00-0:25 (vocal version)

    We were both going for ordination in our respective denominations |, and so we used social media as a platform to push our | “queer agendas” | within the church. | so we intentionally made ourselves hyper visible | to really hit on what you talked about earlier, Keisha: | maybe if they hear our stories or know us, | it could be a strategy | against LGBTQ discrimination in the church.

| but I also realized, like, I actually wanted to mourn the loss of that, | not in public. And so, from that moment on, I began to question: what are we putting online? Who owns it? | and really | backed down and took some time to | patch myself and come up with an approach | to social media.

    Because I realized social media is not public and I don't own it, so why am I giving my life, quite literally, | to some | corporation?

    Keisha: What helped you | heal from that time?

    Annanda: Oh my gosh, uh, not being on the internet, um, | this is just me, right? I don't think there's a right or wrong way to do it. | It's just my way for myself. | so yeah, I think not being on the internet and actually getting really clear | like this tool no longer serves me in the way that I intended so how do I establish a different relationship with it? And what does that relationship look like? | and what do I want to use it for? | For a while I just needed a break. So I took a year off, kind of observed things a little bit after, you know, making, making those couple's statements, "we're no longer together," that kind of thing.

    Keisha: It sounds like it was a devastating experience on a personal level but | unmooring professionally because | you put out your information and then had to figure out how to | pull it back.

In 2016, we were both in the same space of ordination, gender, and LGBTQ justice. I was working with Seventh-day Adventists at the time, and then later with just about anybody who believed the world should change. | I wasn't directly impacted clergy, though, like you were. I was just supporting clergy I knew who were doing good work and being hurt by their communities' policies.

    | we did a lot of video profile work and | 10 years later, I | still hear from people who have found some of those stories and profiles and stuff, and | once that information is out there, you no longer control it, and you no longer control the conditions under | which people find it or use it. | it's a little unsettling.

     Annanda: Yeah, a lot unsettling. 

    | I stalk my social media accounts frequently. Yeah. | I mean, | I'm not on it often, but I know what's happening. |

    Keisha: When was the last time you googled yourself? You wanna do it right now? Tell me what you see?

Annanda: Okay, the first thing that comes up for me is my profile at Stanford, then it's LinkedIn, then First Presbyterian Church, where I pastored, and then a piece done on me by a local radio station on chaplaincy, and then a lot of, uh, yeah, it's very clear that I'm a preacher. *laughs*

So I look at social media the way people | watch a movie and eat popcorn, and there has been a gain. | I don't feel the anxiety that I used to, of like, oh, I have to post, or | this hyper-competitiveness that | I think can lead to | a depreciation of self-worth.

    | but there's also a loss there. | social media | is creating culture, is creating life. And so, | I fully accept | what I lose from not participating in social media. | there is something to be lost there. There are connections that I'm not making because of it.

    Keisha: Did you | feel like you had | a choice in terms of what parts of your life you were sharing with the people you were organizing with and the people you were trying to influence?

    Annanda: | Oh yes. | I always have a choice. I'm always in my own agency, no matter what is going on in the world. | And with that decision, I was aware that the choices I made came with real consequences. | I was then, as I am now, incredibly strategic. there is an anchor of | I want to make a choice, I want to make it | for myself, and | when it's appropriate, I want to make it in community and to uplift my community, but I need to care for myself first.

| I tend to research | and observe. | So | I was like, okay, I need to get off social media, and | like, what is this?

    | How do I make informed decisions on how to | engage with it when and if I want to re-engage with it again? |

    Keisha: So you were looking for the midpoint between being hypervisible on | one extreme, and completely having to stay out of public life?

    Annanda: I don't know if it was a midpoint. I was looking for | what feels good for me. | what doesn't bring up anxiety? What doesn't | bring up | bereavement and grief | in a way that is | keeping me stuck.

    I think I have | like something like 36 things on my Instagram, | right? | I don't have that much. I'm | kind of like a digital nobody, you know? But I also don't have that digital stress.

    Keisha: So | that's the perspective as a tech consumer. I'll let you in a little bit on the perspective as a tech | creator. | Not long after 2016, I was on an advocacy campaign where we used early advertising tech to engage people who were supporting the campaign. So we used web services that were hooked into social media and email.

    And that let us have basic information about the people who were following the campaign's Facebook page. Just their demographic groups or interests that they had volunteered already. It was enough information to let us connect with people who might want to learn about the legislation that we were trying to influence or | wanted to help share positive stories about minoritized communities.

    And eventually… because I was also a consumer, I started to ask the other campaigners questions about the system behind the page. Like, what was the ethics of collecting that data? And what was becoming “best practice” in the industry of campaign analytics? | Mostly I got back blank stares. And I started being even more intentional myself about the data I was putting online. Because of how it could be used.

    And then there was something I found the other day. | When the pandemic started, some churches were starting to use digital marketing to find their members online and | to attract new members. | Some of those marketers were monitoring search data to do it.

    So there's a Christian Broadcasting Network report on Gloo, which is one of those | companies:

    SOUND: Clip from CBN report

[00:00-00:03] Anchor: Church attendance continues to drop in America…

    [00:59-01:06] Reporter: Gloo, a ministry platform, is building on that trend by working behind the scenes online to help people connect with a church.

    [01:15-01:29] Reporter: If you’ve ever wondered why you randomly get online ads for mental health support, marriage help or dating sites, it’s because your search activity isn’t a secret. In fact it’s a gold mine for companies like Gloo, that are paid to connect you with people who can help.

Annanda: Oh, wow. | As soon as you said it, I'm like, oh my gosh, I'm not surprised. But I honestly had no clue that this kind of targeting was going on | when I was a pastor. You know, now I'm a chaplain. | Still clergy, but now I'm a chaplain. | Given my stance on social media, | I had no clue churches were doing this, because | I was never put on the—obviously *laughs*—the social media team.

| You know, something like this would never be professionally permissible in a place of higher education, | or | a medical institution | either, any hospital, frankly. | Violation of student or patient privacy | is pretty serious, and it's protected under federal law with FERPA, | the | Family Educational Rights and Privacy Act, and HIPAA, | the | Health Insurance Portability and Accountability Act.

| Even | the research that I'm doing on students at Stanford required an | extra committee in addition to the normal research committees. | So, yeah, I find that ethically concerning, and I also think it shows that congregations are | institutions that need people to buy in to a certain degree in order to survive.

Like, that's the model, right? | You can't open up a nonprofit organization and not have participants. | What I'm hearing is | the cost of some of that, and also the desperation right now in America, with | fewer and fewer people going to church | in general.

    Keisha: Yep, | There's a | business framework wrapped around the way that they engage the people they serve on one hand and then define as audience in another sense.

    So if you're thinking about audience, then you get to marketing and product and value proposition and all of those kinds of things. It's | fascinating. But I've been talking to | my tech friends about the whole sector. | because of the experiences we talked about being online, being hypervisible, but also because of what I was seeing in my professional work.

Scott Hendrickson is one of those friends. He's a data scientist who's worked with some well-known companies, like X, the company formerly known as Twitter, and some smaller business-to-business data shops that most of us will never hear about. But over the years, as I've talked to him about this stuff, I've realized there's a whole web of data and analytics companies busy behind the scenes of the internet. |

    Pull quote from Scott’s interview: There isn't a practical solution that says companies are going to choose not to use this data to their advantage—unless their competitors are also constrained by the same | ethical standard. |

    Data science and analytics teams like Scott's share reams of user data with corporations and researchers. These are the enterprise solutions that help companies to make | choices about what products to sell and to whom | and hopefully make | customers happy at the same time.

So Scott's | seen | how much data companies have on us, | data that can be exposed, eating up our | anonymity online, and he's had enough experience with it to doubt | whether how these systems develop, and how much data they draw from our lives, is really to our benefit. After the break, we dive in with Scott.

    BREAK

    SEGMENT C: The data behind the people—and what we do about it

    Keisha: So we’re talking with Scott Hendrickson, a data scientist, about how the | systems behind social sites and shopping apps actually work. [add space between narration and interview]

    Keisha: How much data are we talking about?

    Scott: There are two kinds of answers to that question that might be meaningful | and maybe the sheer volume of data is the least meaningful one.

    | Every company is collecting customer data, contact data, | product purchase data. |

    | If you shop for something on Amazon and don't buy it, you might be visiting Facebook, for example, and see an ad for that same thing that you were shopping for. [added space] | Those kinds of things are the result of data sharing between organizations.
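(For readers of the transcript, here's a toy sketch of that kind of cross-company data sharing. Every name and email below is invented, and real ad platforms' matching products are far more elaborate, but the conceptual move, comparing hashed customer lists so the overlap becomes an ad audience, looks roughly like this.)

```python
import hashlib

def hashed(email: str) -> str:
    # Normalize, then hash, so two companies can compare customer lists
    # without exchanging raw email addresses.
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# The retailer's side: shoppers who browsed but didn't buy (invented data).
abandoned_carts = {hashed("pat@example.com"): "trail shoes"}

# The ad platform's side: its own users, keyed the same way (invented data).
platform_users = {
    hashed("pat@example.com"): "user_8841",
    hashed("sam@example.com"): "user_1022",
}

# The match: overlapping hashes become a retargeting audience.
for digest, product in abandoned_carts.items():
    if digest in platform_users:
        print(f"queue an ad for {product!r} to {platform_users[digest]}")
```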

    | The second answer, though, might be more interesting: the kinds of data that we're able to capture are really exploding right now in ways that we can't imagine.

There are many kinds of biometrics: facial expressions | can be interpreted with AI more and more reliably, for example, so there are many kinds of data | available. | I think those are probably more interesting than the sheer terabytes.

Let me give one example that's | actually | not far away in terms of technology coming together. [added space] We can put tiny sensors in our headphones and track brainwaves, and can easily sense some things like | alertness or | excitement. And we would be able to tell if you liked product A more than product B just by observing you wearing headphones and looking at brainwaves.

    SOUND: *record scratch*

Keisha: Yeah, so, what?! Tracking brainwaves?!

    Annanda: The technology doesn’t surprise me. The lack of consent doesn't surprise me. And the likelihood of how these technologies will be used and the compounding impacts it will have are vexing and maddening to think about. | But to know is to be empowered. So keep talking Scott, I want to learn more so I know my options. So we know our options.

    Scott: So for today, it’s almost certain that you’re going to know that’s what’s happening. | There is actually one trial in IKEA where you can step into the little square and be tracked and they can show you different products and tell how excited you are.

    But that’s very explicit.

Keisha: Imagine this kind of biometric data deployed at the scale of LLMs, large language models like those behind ChatGPT and Bard. Early on, it's still clunky and explicit, so you have to seek it out and you have to opt in, but all of the incentives point to baking it into different consumer and social systems in ways that make it much harder to opt out of.

    That's already happening with AI-based identity verification tools like ID.me, which now collect face data and IDs to gatekeep Americans’ access to the IRS, Social Security, and several state departments of labor. So the pattern's likely to repeat with other kinds of data including information so organic to what it means to just be alive that we might not even realize we're producing it, or that someone might | collect |and analyze it.

    Again: Brainwaves?

Scott: | The challenge | is that the rate of change of that kind of technology is | very fast, and it's for some of the same reasons that LLMs have become so exciting. | We now have ways of taking a large amount of data and interpreting | meaning from a fairly disorderly data set. |

| Interpreting one's brainwaves as excitement or attention or inattention or boredom | follows the same path. | The more data there is available, | the more specifically we're going to be able to interpret what someone is doing.

| There are probably | thousands | of heavy equipment and trucking operations that are tracking people's sleepiness through these kinds of devices. And they're able to give a warning when someone's dangerously | fatigued | and out on the road |.
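(A purely illustrative sketch of the kind of monitoring pipeline Scott describes, with an invented threshold and invented readings. Real fatigue monitors use far richer signals and models, but the basic shape, a stream of readings windowed and turned into a label that can trigger a warning, looks something like this.)

```python
import statistics

def label_window(readings, drowsy_threshold=0.4):
    """Label one window of normalized 'alertness' readings (0.0 to 1.0)."""
    return "alert" if statistics.fmean(readings) >= drowsy_threshold else "drowsy"

window = [0.35, 0.30, 0.42, 0.28, 0.33]  # one invented five-second window
if label_window(window) == "drowsy":
    print("warn the driver")             # the in-cab alert Scott mentions
```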

    Keisha to Annanda: Some sort of feedback for sleepy truck drivers seems like a good thing, right?

    Annanda: I’m all about safety first. I’m just too American to believe the benevolent use of this technology will be prioritized or legally regulated as the normative business standard. That’s not how we tend to do things.

    Keisha: If trucking companies can reduce accidents, they’ll save money on workman's comp and insurance, and they also save their reputation. | That's just a good business plan!

I also don't mind if hospitals take steps to protect junior doctors and surgeons who are chronically underslept. I just want the measures to be consensual. And maybe administrators could start by making worker shift lengths more humane instead of | eye tracking off the bat, because not all management breakdowns are best resolved with | tools. |

Annanda: During the pandemic I was a chaplain at Stanford Hospital. I will say that the pandemic positively impacted the shift lengths. | They shortened them | to prevent long-term burnout in the midst of an unknown | disease, | like, I'm talking | early 2020. | In my experience, in the role that I played as a chaplain, they were fantastic.

    I think everyone has benefited from that decision making. |

    For me, increasing surveillance sounds like widening mistrust and leaving little room for circumstances of what it means to be human.

    Right? Like, | who wants their eyes to be tracked all the time? Nobody.

    Keisha: Like keyboard strokes in a remote work environment.

    Annanda: That's not a work incentive. That's like an instant F-you incentive.

    Now it’s like, I no longer trust you as an employer.

    Keisha: Yeah. I feel you.

    Data Coherence

    Scott: | Cookies are | pretty low level in the browser. | You visit a site and | someone can say, you’ve been to google.com, so we know that about you.

    | More coherent things can happen with us | knowing less about it. | CCTV-type surveillance | could be used very coherently to track where you’re moving and make interpretations about what your daily schedule is.
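(A minimal simulation of the cookie mechanism Scott is describing. The domain "ads.example" and both sites below are invented, and real browsers and ad servers add far more machinery, but the core trick, one third party setting and reading the same cookie across many sites, works roughly like this.)

```python
import uuid

cookie_jar = {}     # what one browser stores, keyed by the domain that set the cookie
ad_server_log = []  # what the third-party ad server sees across many sites

def load_page(site: str) -> None:
    # Each page embeds a resource from the same third-party domain
    # ("ads.example", invented), so the browser sends that domain's cookie
    # along with the request, whichever site is actually being read.
    visitor_id = cookie_jar.get("ads.example")
    if visitor_id is None:
        visitor_id = uuid.uuid4().hex           # first sighting: mint an ID
        cookie_jar["ads.example"] = visitor_id  # the "Set-Cookie" step, in effect
    ad_server_log.append((visitor_id, site))

load_page("shoe-shop.example")
load_page("news-site.example")
print(ad_server_log)  # the same visitor_id shows up on both sites
```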

    Keisha: Imagine driving from Baltimore to New York. I’ve made that trip a couple of times. It’s 4 hours depending on traffic. Along the highway, every tollbooth takes a picture of your car. So someone with access to the toll cameras can put those photos together and track you across four states.

    They could also do it by stitching together security cameras, and gas station purchases, and phone call logs and cell tower pings. We’re talking about making a single profile of someone out of several sources of data.
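(Under the hood, that stitching is just a join on whatever identifiers the sources share. Here's a toy sketch with invented records; the point is that once a single linkage table exists, every new data source snaps into the same profile.)

```python
from collections import defaultdict

# Invented records from three unrelated systems, each with its own identifier.
toll_photos = [("plate ABC1234", "09:05", "I-95 toll, MD"),
               ("plate ABC1234", "11:40", "NJ Turnpike toll")]
card_swipes = [("card 7701", "10:15", "gas station, DE")]
tower_pings = [("phone 5509", "12:50", "cell tower, Newark, NJ")]

# One linkage table is all it takes: once plate, card, and phone have each
# been tied to the same person anywhere, the records merge into one timeline.
links = {"plate ABC1234": "J. Doe", "card 7701": "J. Doe", "phone 5509": "J. Doe"}

profile = defaultdict(list)
for source in (toll_photos, card_swipes, tower_pings):
    for identifier, time, place in source:
        profile[links[identifier]].append((time, place))

for person, events in profile.items():
    print(person, sorted(events))  # J. Doe's whole trip, reconstructed
```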

    Annanda: That reminds me of Enemy of the State, when you say that.

    Keisha: Yes, that was my mental picture.

    SOUND CLIP: The NSA can read the time off your f--- wristwatch!

Annanda: But now anybody can do it. Enemy of the State was about government agencies, but now you're saying anybody with enough tech savvy…

    Keisha: Yeah, like you could break into those systems. You still have to have access to the system, but you'd have to break into it if you're not like already part of the system or part of the government.

    Scott: It could be book buying or it could be television watching. | A single incident isn't | so very interesting. But | we can learn a lot about someone when | we | put those pieces together and string them into a more elaborate story.

    | Coherency can | be something very comfortable with your intimates. | You are known well, and | they know what you're going to like and | not going to like, | how to make you happy and how to pull you into an argument.

| Coherence can be just downright creepy when the relationship part is missing. | When someone's putting together a story on you | without your consent or knowledge, | it becomes the opposite of what we want from our relationships. It becomes a threat or | a potential invasion | of privacy or | safety.

    | Coherence | is putting the story together. And | in the data world, | we talk a lot about telling a story with the data. We talk about it as if it's always benign, but clearly we tell marketing stories that are not always benign. Sometimes they are designed to be manipulative, | to change someone's thinking in a particular direction, | get them to spend money. [space added]

    Keisha: So a business that owned, say, a grocery store and a rec center would be able to use coherent tracking to build a picture of Fred, who likes Fritos and also swipes into pickleball every other Wednesday. And maybe that would help them customize his experience in both places a little more.

    That’s a gentle example. It’s already gotten more serious.

Keisha: When the laws around reproductive health changed last summer, people, for maybe the first time, were thinking about | data tracking and sharing policies in period apps, or | data collected when they visited health clinics, for example.

    Annanda: The Vanderbilt University Medical Center allegedly shared the health records of transgender patients with the Tennessee Attorney General without authorization. Patients sued them in August. The US Department of Health and Human Services is still investigating.

    Keisha: On top of that, CyberRisk Alliance reported that the same medical center got hit with ransomware over Thanksgiving holiday.

    Annanda: First off, Trans rights forever. Period.

    Keisha: Amen.

Annanda: Second, there is never such a thing as a "benign story." Stories are | never neutral. This is in part why the humanities cannot be thrown out when it comes to computer science and math. | The humanities help us to be better. Because coherence is clearly out here, very ready and able to snatch all of our wigs. And ain't nobody got time to deal with that. Not this data breach. This HIPAA breach. On Thanksgiving. That's serious.

    Keisha: The best case scenario for a cyberattack is that it’s inconvenient for you, so you have to update your password or keep an eye on your credit report. But the aggressive prosecution of reproductive access and gender care takes data concerns way beyond mere inconvenience.

    Crooks and fascists can’t breach your data if the data doesn’t exist. But if sensitive data does exist, then the people whose data it is should always be able to approve or unapprove who gets to see it.

    Why do ordinary people always seem to be the last group considered in scenarios like this?

    Data Colonialism, Disadvantage for Consumers

Scott: The pattern that I have observed over and over is that once we move to a new realm | of things that we hadn't thought of before, there does seem to be | I guess I'll call it a bit of a land grab, where the commercial interests jump in and try to take as much of the space as possible before someone pushes back.

And I think it's a very natural thing to do, because you feel like you're gonna get a competitive advantage in your business and you're gonna be able to sell more product. | But I think what happens there is | the consumer is often behind in figuring out which part of that land grab is for their good and which part of the land grab they—we want to resist. All of us want to resist.

| An old example: the search and seizure laws in the United States were very much based on someone breaking into your house, walking to your desk, and reading your private papers. | When we moved to cell phones containing so much about our lives, the search and seizure laws didn't follow. | The Constitution didn't contemplate cell phones.

    Annanda: A document written in the 1780s didn’t envision all the conditions we’d encounter some 250 years later. I’m shocked.

    Keisha: The tech environment has changed so much. In the 1700s searching required investigators to physically interact with an accused person’s physical things.

    But in our world, it can mean grabbing human data without explicit consent or ripping data off a phone as it passes by. Police can do that with devices called stingrays. So how we judge a violation of privacy is changing too.

    Annanda: The question that’s coming up for me with all of this is what does it mean to live in a world where equal opportunity violation is expected and deemed normal? What does that lack of consent do relationally to a society?

Keisha: I love that question: | what quality of society are we making where that violent extraction is the norm?

Annanda: | All of this | is incredibly private, personal information. | And so if that is a normal way | certain businesses are conducting themselves, right, | you walk into a store and—brainwaves—they're able to extract the data they need to support their business. | That feels incredibly violating and nonconsensual.

    Here in Oakland, folks have scanners. They'll scan your car, and then they'll do what we call bipping, which is they'll break the glass, either the passenger side or usually the quarter panel, and then, get into your trunk and take whatever prize they feel like they've scanned.

| Thinking about consent, the ownership that one has, not just over our bodies / but also the mind and the spirit | [added space] when you go to a therapist, you're like, yes, observe my behavior. And we're doing this, you know, for my | improvement, with my agency. 'Cause you could always tell your therapist no, | right? | Or, like, this doesn't feel good for me, let's try another way.

    You can't do that with this.

    | It says that we are creating a culture and a society that politically, | economically, | and socially says | the society must run on non consensual extraction | of all parts of you.

    | That is what, to me, is truly morally at stake, and that's a problem.

Scott: | We've all | been moved to a new expectation for | what someone can do in terms of reading from our phone. It's | a different standard, partly because the context changed so much:

If someone can interpret my facial expressions or read brainwaves while I'm shopping in their store, and they can do this without telling me 'cause they don't actually need to connect anything to me to do it, | I, as a consumer, need to be aware | of being able to say "no" earlier, and that's difficult. [space added] |

    We don't have laws that cause a company to disclose everything they can do with new technologies. | it's concerning to me | 'cause the back and forth almost always is | led by | a commercial interest taking ground that the consumer wants to protect.

Keisha: | Public researchers | did a lot of work on social media because they could access Facebook and Twitter's APIs, for example, but | that research is being cut off or metered, or you have to pay for it now, and it's very expensive. What does it mean for us if the only people who can study the data that exists about us, or how companies are using it, are those companies themselves?

Scott: The inevitable conflict of interest is playing out right in front of us. | Many of the largest companies | that run social media and hold large-scale data about our search habits and other things | have gone through a cycle of building teams focused on data ethics, but they've actually ended up | fencing those teams in, in terms of what policies and what kind of communication can come out of them. | So, | not surprisingly, it is hard to trust the internal voices that are studying data ethics at these companies. | At the current time, I don't think we have | much political will | for any kind of public oversight of the data stored in social media.

    Keisha: And without that visibility, people can't really opt out.

    Choices in the Tech Ecology

Scott: I'm not seeing evidence of | enough influential people with understanding, working in a direction | that gets us to a workable place: policies that both allow us to see what's happening inside large organizations, from a data perspective that protects consumers, and also let | companies go about their business in creative and innovative ways.

    Keisha: If we think about the tech social system as an ecology—this is an image that's been sticking with me for months at this point.

What if it's like a system where you have the technologists, the lawyers, the ethicists, the researchers, the general public, the regulators, all of us kind of connected through these tools, | systems, | choices, | moral dilemmas and breakdowns?

    | None of it works unless we're all in some sort of relationship to the system. | The person who clicks through, | the person who is driving and trying to mind their business, | the person monitoring them on the camera. It requires all of us either to support this structure that extracts violently or to challenge it and change it and make it something that respects consent.

Annanda: | The possible future that you named, I think, is a beautiful vision. | And it requires more work. | If that's the future that you want, if those are the choice points that enough people want to make, 'cause it doesn't have to be everybody.

    | it just needs to be enough people, | well, there's a cost to that, right? That means you actually can't be as passive about | technology. It means you actually do need to be educated about | what's going on.

And | you do need to be informed, and you need to be an active citizen. | It's part of, I think, a crisis that's going on in the United States right now: | how many people are informed? | Are they even able to vote? Are they getting gerrymandered?

    Keisha: Do they feel it matters when they do?

Annanda: Right. Like, you can't run a successful democracy with a miseducated or uneducated populace.

    Keisha: Mm hmm.

Annanda: So I think that same method applies to what you're talking about, but it requires more effort. | My skepticism | is, like, | most people aren't willing to do the work.

    Scott: | Having consumers become more sophisticated about their data is one of the broader answers. [space added] | I think the voice of more people who understand | the benefits as well as the risks and can talk about them and vote about them, | is the most powerful part of this that's missing right now.

    It's tricky to have our government leaders up to date enough to create a policy that is going to allow the kind of innovation that's gonna come from new technologies, like large language models, while still protecting people. I don't know how to do both of those at the same time. It's always a race to build the security model, build the visibility, build the capability, explore some of the things that we hadn't imagined we could do with new products and new services, and then maintain safety the whole time.

    That takes too much foresight. So I don't think we can look to government.

    Data Shenanigans

    Keisha: While we were in production, Cali passed a law—

    Annanda: Oh, you—people do call us Cali outside of—wow….

    Keisha: We do. We do. I’m so sorry.

    Annanda: It’s kind of rude, but I understand.

    Keisha: Put some respect on the name of the state. Okay.

    Annanda: like a little respect!

    Keisha: *laughs* While we were in production, California passed a law that will require all data brokers to delete the data when a consumer makes one request. Can you imagine they had to pass a law?

    According to the LA Times, there are over 500 brokers registered in California alone. I love this law.

    Annanda: Listeners, get your data deleted.

    Keisha: Love it, love it, love it. Wish it were federal.

    Annanda: Oh, that's your Europeanness coming out, you know? For the people! The EU has way better internet privacy protection laws than the United States.

    Keisha: It does, and it's complicated even there, but it's a start. It's a start to a different vision that puts the agency back where it should be: with the people.

    Scott: A number of data vendors make no pretense about trying to keep a database on everyone, and they will sell that data to any customer that comes along. That availability starts the data record, and it's awfully tempting to build on that with any other data you can pull in.

    SOUND: shift tone between interview and dialogue

    Scott: There is a sense of trying to build a complete-circle view of someone who's potentially your customer. It's very tempting to try to do that. I worry about the world in which we choose to give up our data, because I don't know how to design a world where we can choose not to give up our data in some sense.

    Not buying groceries, not driving a car, not paying your taxes: these aren't options for people.

    Scott: The three major credit reporting companies are keeping a record on everyone in the United States.

    There are some ways to limit access, but there really is no opt-out. There is no practical way to live in our world and have people protect their data.

    We're absolutely gonna be giving up our data. Visibility into how it's going to be used, and explicit controls on how it's going to be used, have to be the answer to this.

    And I think it has to come from government, because any restriction that is going to give your competitors a disproportionate advantage in the commercial space, you're not going to take on unilaterally. It's too costly to be the company that steps forward and says, we're not gonna use data in this way.

    If the government imposes these restrictions and we can broadly agree that they're reasonable for protecting people, the competitive landscape is leveled back for everyone to the same spot, and they can all afford to exercise some restraint when the law says they can.

    Keisha: But these restrictions and constraints — they’re not in place yet.

    Scott: I'm very concerned about the data I have a hard time choosing not to give up as a consumer, because I don't think I'm being broadly protected by the policies of our government.

    Keisha: Scott’s not particularly dramatic so if he’s concerned, I’m concerned.

    Annanda: Keisha, I've been consistently concerned. I don't trust most of these tech pimps. And my language is intentional, because we're all getting pimped out and I did not agree. I did not agree. *laughs*

    Future Data Scientists

    Keisha: So Annanda: if you were writing a moral repair curriculum for the tech workers who are making these systems, what would be on it?

    Annanda: You know, I'm a massive advocate of experiential learning alongside theory, so it can be integrated for practical application. This is how chaplains are taught. I learned better not just thinking about it, but also doing it. If you're better at theory, alright, you're gonna excel in that. And if you're better at praxis, or the doing, then you're gonna excel at that. But either way, you gotta integrate them, right? You can't be in the ivory tower with just theory in your head, thinking it's gonna work out in the world, and you can't just be doing things without a better understanding of possible theories to support the doing. The integration is important.

    And so a core component of my curriculum would be tech folks shadowing social workers, chaplains, and other human-centered service providers to see the real impact of the technologies they are creating on the lives of everyday people. I'd have them do this Monday through Friday for at least the majority of the course, and reflect.

    Scott: I think it starts with empathy. Walk in someone else's shoes for a while and see how that feels.

    Keisha: So empathy is core to ethical design. It's a way of helping us understand the environment and context of use for the person who's using whatever product or process we're trying to create. And it helps us to understand their world and, um, therefore design with a bit more of an accurate eye for how it's actually going to play out.

    Annanda: Empathy is good. I'd prefer compassion. Compassion is empathy with boundaries, and this is something we chaplains talk about all the time. I don't need somebody to feel the pain that I feel as much as I need them to value and center my experience of pain when it has occurred.

    And if we're empathetic all the time, you're gonna get worn out. If I was empathetic to every patient I saw in the hospital, every student going through a crisis or a death of a loved one, I'd be so in tune with feeling what they are feeling. Compassion, by contrast, allows me to feel with boundaries. I don't have to take on their emotions, but I can acknowledge them. I can value them.

    Keisha: Maybe it's about building secure relationships and setting up permissions to be known in advance. And that's what can help to make data matchmaking feel more like a tool that's supportive of us than a violation at the whim of an unaccountable person or company.

    Something else is coming up for me around whether we should simply account for the fact that our brains are wired to catch disaster.

    So we might be biased against seeing good possibilities.

    Scott: When I'm looking for a new part for my heater, I get a recommendation for a new filter that goes with it. I install that, and my heater's back up and running. If that recommendation engine hadn't matched the idea that if you buy a new part for your heater, you probably need to replace the filter, I wouldn't have gotten that advantage. I don't usually tell that story to people. It's just mundane. *chuckles*
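
    A minimal sketch, in Python, of the kind of co-purchase matching Scott describes; real recommendation engines are far more elaborate, and the product names and orders here are hypothetical, but the core idea is counting which items tend to appear in the same carts:

    ```python
    # Illustrative co-purchase recommender: count how often pairs of items
    # appear in the same order, then suggest the strongest partner item.
    from collections import Counter
    from itertools import combinations

    orders = [
        {"heater igniter", "heater filter"},
        {"heater igniter", "heater filter", "duct tape"},
        {"heater igniter", "thermostat"},
    ]

    # co_counts[a][b] = number of orders containing both a and b
    co_counts = {}
    for order in orders:
        for a, b in combinations(sorted(order), 2):
            co_counts.setdefault(a, Counter())[b] += 1
            co_counts.setdefault(b, Counter())[a] += 1

    def recommend(item):
        """Return the item most often bought alongside `item`, if any."""
        partners = co_counts.get(item)
        return partners.most_common(1)[0][0] if partners else None

    print(recommend("heater igniter"))  # -> heater filter
    ```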

    Scott: But when the opposite happens, when I feel violated by some kind of recommendation that's too close to home, then I tend to have a reaction to that. The upside and the downside are both real human experiences. There's value and risk in both.

    Sometimes the coherent tracking does get to me.

    I've definitely used technology skills to build ad blocking into my home network and into some of my laptops, mainly to stop the irritation.
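
    Scott doesn't spell out his setup, but one common home approach is to "sinkhole" known ad and tracker domains so requests to them fail locally; network-wide tools like Pi-hole apply the same idea at the DNS level for every device in the house. A minimal, purely illustrative Python sketch with placeholder domains (it appends to a system file, so it needs administrator rights, and you'd want a backup of the file first):

    ```python
    # Illustrative hosts-file ad blocking: point known ad/tracker domains
    # at an unroutable address so lookups for them fail on this machine.
    # The domains below are placeholders, not a real blocklist.
    BLOCKLIST = ["ads.example.com", "tracker.example.net"]

    HOSTS_PATH = "/etc/hosts"  # on Windows: C:\Windows\System32\drivers\etc\hosts

    def blocking_lines(domains):
        """Render one sinkhole entry per domain."""
        return [f"0.0.0.0 {d}" for d in domains]

    with open(HOSTS_PATH, "a") as hosts:
        hosts.write("\n# ad-block entries\n")
        hosts.write("\n".join(blocking_lines(BLOCKLIST)) + "\n")
    ```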

    Annanda: Scott needs to teach me how to do this ad-blocking Dougie, and probably a handful of things I don't even know that I need right now.

    Keisha: Yeah. I want a training too from Scott.

    Ultimately, I think, though, beyond the protective measures, I really am interested in how tech like this can enable us to do good: not just to protect against the bad raiders, but to think about how we are being changed as we become accustomed to using these tools. Because it's not just the data and it's not just the systems, it's also us. And whether we can wrangle our lizard brains to learn to see the good and to move toward it.

    Learning from the Trees

    Annanda: You know, as you say that, Keisha, it makes me think of biomimicry. In particular, the redwood trees.

    Keisha: I love them. Yes.

    Annanda: Their network of roots actually holds one another up. And so if there is a tree that is a little low on nutrients or needs something, they'll all come together and, like, support and lift up this tree based on the data, the information that they have of what one another needs.

    And as you're talking, I'm like, yeah, you could design this entire ecosystem that's created to mimic what life has shown us already. I think that would be so fascinating.

    Keisha: It was the cleanest air I've ever breathed walking through Muir Woods.

    Annanda: Wait, wait, wait, where's Muir Woods located? What state? Yeah, in Cali. Okay, as long as we're clear about that. Proceed. Proceed.

    Keisha: I absolutely loved it. And to have the humility, as the masters and crafters of this technology, to learn from what life has already shown us works? Love it.

    Scott: I'm very excited about where we are now and where that takes us.

    I'm also very hopeful that we are not going to be the humans in the future that we are now, that we actually are gonna grow in ways that connect us in ways that we've never been connected before.

    We should always look at scary things for a minute and make sure we can manage those risks. But all of the powerful tools, even the ones we complain a lot about, have an upside: a way of allowing us to do more with less, to live longer, to have healthier lives, to have more fun, to share with people farther away.

    All of those things have been amazing. Celebrating what's been accomplished and what we can do next is amazing and wonderful.

    SOUND: Reflective music

    Keisha: Scott opened up a theme for me when he talked about how tech development sometimes operates colonially, grabbing land for the company without regard for the wider ecosystem of other people, other groups, and the society that we're all a part of. And so many of the concerns that we raised in this episode come back to that question of colonial behavior and the question of consent.

    From the worlds of gender and sexual justice, I learned the principle of affirmative consent. Affirmative consent is consent that's explicit, free from compulsion, and offered in real time versus once for all time.

    Even though this model is increasingly part of legal standards around sexuality and it's shaping some of the data and privacy models in technology, our social systems aren't yet built around this understanding. Humans aren't binary. We don't always know what we want until we're doing it. So it's a practice to be able to recognize what we want in the moment and then take the liberty to say so and have that heard and respected.

    A few years ago, Alondra Nelson, a professor at the Institute for Advanced Study in New Jersey, talked to the New Yorker about the ambivalence that's part of simply being human. The law doesn't account for that ambivalence. Sometimes the law tries to flatten the realities that we actually live in.

    If I go to a website and get a cookie pop-up, I actually have no idea whether I want only the "necessary" parts of the page to work. I don't know if I want a long-term relationship with the company behind that page. How do we craft a world that doesn't force people to pretend to be more binary and self-aware than we actually are, but respects our decisions as they become more clear to us?
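
    One way to picture consent that is explicit, time-bound, and revocable in real time, rather than a one-time banner click, is as a record that systems must re-check before each use. A minimal illustrative sketch in Python; this is the principle, not any real platform's schema:

    ```python
    # Illustrative model of explicit, time-bound, revocable consent.
    from dataclasses import dataclass
    from datetime import datetime, timedelta, timezone

    def now():
        return datetime.now(timezone.utc)

    @dataclass
    class ConsentRecord:
        person: str
        purpose: str                    # e.g. "strictly necessary cookies"
        granted_at: datetime
        expires_at: datetime            # consent lapses and must be re-asked
        revoked_at: datetime | None = None

        def revoke(self):
            """Withdrawal takes effect immediately, in real time."""
            self.revoked_at = now()

        def is_valid(self):
            return self.revoked_at is None and self.granted_at <= now() < self.expires_at

    grant = ConsentRecord("visitor-123", "strictly necessary cookies",
                          granted_at=now(), expires_at=now() + timedelta(days=30))
    print(grant.is_valid())  # True: explicit and current
    grant.revoke()
    print(grant.is_valid())  # False: consent was withdrawn
    ```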

    Dr. Yvette Abrahams works on gender, climate, and Indigenous knowledge at the University of Cape Town. In the essay "Thank You for Making Me Strong," she says, "Our task here on Earth is to live well together, bearing in mind the action of each has responsibilities for the well-being of the whole. No doubt we human beings are so proud that we would live alone and independent if we could, but we cannot."

    In relationships, as in tech, grasping for individual benefit whatever the cost to the group puts us all at risk. So these are the ethics that recognize we already live in an ecology, and that we can only have the health and prosperity we need by acting like it: across the full scope of our personal and professional lives, in cookies and consent, and in challenging colonial impulses wherever they surface.

    Because in the end, it's not about creating an ecological reality. It's about dropping the illusion that we've been independent all along.

    SOUND: Bells and holiday vibes!

    Keisha: Hey y’all: Our next show is a special holiday episode… and we need your help.

    Annanda: We want to know: when it came time for the holidays, what were some nostalgic memories you've had around tech? Was it gifts? A tradition? A hilarious or heartwarming story?

    Keisha: You can reach us at @moralrepairpodcast on Instagram, @moralrepair on Twitter/X, and at moralrepairpodcast@gmail.com

    CREDITS SEGMENT

    Annanda: I’m Annanda Barclay.

    Keisha: And I’m Keisha McKenzie.

    Annanda: The Moral Repair: A Black Exploration of Tech team is us, Ari Daniel, Emmanuel Desarme, Courtney Fleurantin, Rye Dorsey, and Genevieve Sponsler. The executive producer of PRX Productions is Jocelyn Gonzales.

    Keisha: This podcast is part of the Big Questions Project at PRX Productions, which is supported by the John Templeton Foundation.

    [PRX AUDIO SIGNATURE]

    Show Notes

    For the next episode: tell us about your nostalgic tech memories! Find us at @moralrepairpodcast on Instagram, @moralrepair on Twitter/X, or moralrepairpodcast at gmail dot com

    How did Cambridge Analytica use 50M people’s Facebook data in 2016?

    California bill makes it easier to delete online personal data (LA Times)

    “Churches target new members, with help from Big Data” (Wall Street Journal)

    In the film Enemy of the State, characters uncover all the ways they’re being tracked—it’s a lot

    Digital safety for people seeking reproductive care (Digital Defense Fund)

    How redwood trees communicate (New York Times): “The Social Life of Forests” feat. Professor Suzanne Simard

    “Thieves Use Tech Devices to Scan Cars Before Breaking Into Them” NBC Bay Area.

    Scott has recommended a few books for our audience:

    Re. tools and consciousness: God Human Animal Machine: Technology, Metaphor, and the Search for Meaning by Meghan O’Gieblyn

    Re. ChatGPT-4: Impromptu: Amplifying our Humanity Through AI by Reid Hoffman.

    Re. attention tech and neuroscience: The Battle for Your Brain: Defending the Right to Think Freely in an Age of Neurotechnology by Nita Farahany.

    Prof. Alondra Nelson in “The New Norms of Affirmative Consent: Alondra Nelson on the New Yorker Radio Hour”

    Prof. Yvette Abrahams on social ecology ethics in “Thank You for Making Me Strong”

 

Holiday Special: Tech Nostalgia

Season 1 | Episode 7

  • Coming soon

 

An Answer to Big Tech? Tech at a Human Scale

Season 1 | Episode 8

  • Keisha: Hey Annanda.

    Annanda: Hey Keisha.

    Keisha: Last time Netflix raised their prices I dumped them.

    SOUND: Decelerating Netflix logo sound

    Annanda: I have my same subscription from when I was in college, so I'm grandfathered into one of the oldest price points possible. Maybe it's raised like a dollar on me.

    Keisha: Wow.

    Yeah, they're up to something ridiculous for a subscription that doesn't have ads on it. You can get one with ads for $7 plus the digital tax… but if you want it ad-free, that's like $16-23 a month.

    Annanda: Oh yeah, definitely don't pay that. I don't pay that with Netflix.

    SOUND: Light casual background music

    Keisha: Absolutely, ‘cause it’s not just them. It’s also Hulu and Disney and Paramount and Apple TV… There are so many of these entertainment services you might as well bundle them all together and call it “cable” ‘cause it costs as much!

    And now Amazon’s ringing in the new year by changing Prime Video. Even those subscribers will have to pay more for an ad-free service.

    Annanda: See, my Netflix and my Amazon Prime are both old, because Amazon came out really wanting to sell books, right? And so that became a cheaper place for me to get all my books in college. But also, I do what so many of us do, right?

    I have one family member or friend that's close to family; they cover one thing, and together we make a medley of access to media, which I know they're trying to bust up, but I'm like, try it. Don't think your content is so addicting or so necessary that I won't say goodbye off the principle of it.

    Keisha: That's right. They push you hard enough and you're like, "Okay, thank you for the clarity. Let me move on."

    And we’re already paying for the services they offer. Our relationship with these Big Tech companies is basically “pay more and more, and it’ll never be enough.”

    If you're buying from a platform that not only knows what books you read but also what you bought at midnight five years ago, and they've just bought your doctor's office…you shouldn't be paying more for surveillance.

    Annanda: No! You definitely shouldn't be paying more for surveillance. Also, this is why I want to bust it up. I don't want so many companies or corporations to have so much knowledge on me.

    Keisha: I was reading the Amazon Ad blog in November. They told advertisers they expect to reach 115 million viewers a month by up-charging for ad-free. And that’s just in the U.S.!

    They want to make more money off the streamers that they know are also shoppers. So Big Tech is a whole ecosystem of services based on tracking users from transaction to transaction, click to click, action to action.

    They could just as easily have built that system around people’s actual needs. We want to be connected. We want to enjoy great stories. Maybe, yes, buy some stuff from time to time!

    But it’s like data scientist Scott Hendrickson put it: Big Tech development is a “land grab.” And in that exploitative form, it’s driving degraded outcomes for consumers: privacy breaches, suspicion of businesses, and suspicion of each other. We’re means to an end for them.

    Annanda: It's definitely come to "land grab." There's so much technology out there.

    People can basically recover a majority of your genome just from traces of your breath or, like, where you have sat. So I'm like, oh, soon this will be something commercialized. That's been a thing that I've been realizing: this commodification goes so much beyond shopping.

    SOUND: Start fading in Title music

    Annanda: As I look at what new technologies are coming out, what new horrors and gifts are down the horizon, I'm always following the money. Once I see one thing, I look for the equal and opposite reaction to it within the market.

    Keisha: In terms of equal and opposite reactions to Big Tech, what about small tech?

    SOUND: Title music feature pause

    Keisha: I'm Keisha McKenzie.

    Annanda: I’m Annanda Barclay.

    Keisha: This is Moral Repair, a Black Exploration of Tech, a show where we explore the social and moral impacts of technology…

    Annanda: and share wisdom from Black culture for healing what tech has broken.

    Today on Moral Repair: can small tech make the internet good again?

    SEGMENT B: DESCRIBING THE SYSTEM

    Annanda: Keisha, I have been so focused on big tech that I have not taken the time to look at small tech.

    I've been focusing on how to support the mental and spiritual health of those in big tech companies who are feeling like they're compromising their morals, their ethics, their values. But I think what you're talking about is so important. I am in the Bay Area, so, *laughs* I'm just inundated.

    Keisha: Big Tech's in the air you're breathing over there.

    Actually, I've been hoping Big Tech workers would catch the labor rights wave that's live right now. I've been rooting for everybody striking: Amazon staff. UPS drivers. Grad students. Nurses and doctors. Postal workers. Writers, actors, everybody.

    Annanda: I've definitely been rooting, and I'm very eager to see where their civil disobedience lands, because these are precedent-setting strikes and responses that are happening right now.

    Keisha: Organized labor everywhere: get the working conditions you deserve!

    Digital workers shaping the AI revolution in developing countries around the world… don’t have the conditions that they deserve.

    In October, WIRED reported that data companies have been hiring people from the Global South to do piecemeal data work like tagging images or sorting the data that trains AI chatbots. And these folks earn pennies.

    Annanda: Oh this sounds like good old fashioned American capitalism.

    Keisha: Yeah. The poorhouse era, so not even American, that’s European too.

    And in Kenya, Meta, which runs Facebook, has been hiring local Kenyans to do content moderation for the site. 184 content moderators recently sued Meta over inconsistent pay, union-busting, privacy violations, and work conditions. Negotiations between the company and the contractors just failed again, though, and Meta doesn’t seem motivated to make those conditions humane. They’re big enough as a global corporation to be able to outlast any individual workers and keep trading in the country—even if people suffer for it.

    Jeffrey Sachs talked to the UN about this pattern a couple years ago. He’s an economist at Columbia’s Center for Sustainable Development.

    Annanda: I love Jeffrey Sachs. I've been following his work since college.

    Keisha: In a speech at the UN Food Systems Summit, he described how colonial exploitation and private sector extraction have wrecked countries like the Democratic Republic of Congo since the 1880s.

    Clip: 00:03-00:12; 00:16-00:50: “What we’ve been hearing is how the system actually works right now… It’s based on large multinational companies, private profits, a very, very low measure of international transfers to help poor people, sometimes none at all, extreme irresponsibility of powerful countries with regard to the environment, and it’s based on a radical denial of rights of poor people.”

    Keisha: “We have a system,” he said. “We need a new one.”

    If we use tech to build systems that respect people's dignity, privacy and autonomy, what could that look like in practice?

    SOUND: Clicky, curious background music

    Someone who's trying to create a new system is Aral Balkan (Uh-RAHL Ball-KAHN) of the Small Tech Foundation. He's a designer and developer based in Ireland, and one-half of the first tech outfit I ever heard of that wants to catalyze a different way for ordinary people to use and own technology—at a much smaller scale than Big Tech allows.

    Aral defines small tech as web sites and services that are designed for individual people. Corporations and governments don’t own and control our data, we do. Small tech protects basic human rights like privacy and so breaks away from the Big Tech model of watching the people who use it in order to trade their information. It’s funded differently than Big Tech to make that focus possible. We’ll get to what that can look like, but first — here’s the system we have right now.

    Aral: Most people are familiar with big tech in terms of everything that they know as mainstream technology.

    It's Google, Facebook, Snapchat, TikTok.

    Follow the Money

    Keisha: Companies like these—the army of corporations that make tech hardware, software, and services… they all say their mission is to innovate, but they don’t innovate when they don’t make money, and they don’t make money if they don’t grow.

    So they’re motivated to grow and grow and grow and grow. And that’s how a scrappy outfit that starts in someone’s garage or as a pet project in a lab can become a marauding beast, like an orc run amok.

    Aral: It really comes down to following the money.

    How much do you pay for Google? How much do you pay for Facebook? You don't pay anything, right? And yet these are trillion-dollar corporations. So how are they making money?

    You're too valuable for them to sell. They're actually renting access to you. And in a way they own you as well. They don't own your physical body, but their goal is to own everything else about you that makes you who you are. That's basically the business model.

    That's what we call surveillance capitalism. That's what I call people farming. It's just a new form of colonialism, a new form of imperialism.

    Annanda: Surveillance Capitalism, by Shoshana Zuboff, was one of the first books that really grabbed my attention about what is going on within the tech industry as it relates to surveillance algorithms and AI.

    Here she defines the term while speaking at Suffolk University:

    SOUND: Soundbite click

    1:23-1:49—”Surveillance capitalism has claimed something from outside the market and brought it into the market to sell and buy. But in this case, the territory that it has claimed is private human experience… to be sold and purchased as behavioral data.”

    SOUND: Soundbite click

    Annanda: The phrase people farming is apt. Through algorithms and the different choice points that we are given on the internet, our behavior is herded.

    We're not necessarily in control of how we surf the internet as freely as we think. We're given a series of choices, and those choices are curated.

    I do think that it's a subtle form of imperialism, and it's quite concerning.

    Keisha: It reminds me of studies on how they present pricing options in stores or online. The company will really have one choice they want you to pick, and then they'll offer you something that's obviously worse and then something that's marginally better, and you "naturally" choose the one they wanted you to choose all along.

    Annanda: Exactly. Imagine that with all of our decisions. I don't think we realize how out of control we are when it comes to our own choice and agency.

    Keisha: We wanted Aral’s perspective on how “people farming” is playing out around the world. In 2017 he wrote, “People farmers are reaping the rewards of their violations into our lives to the tune of tens of billions in revenue every year… [and] farming us for every gram of insight they can extract.”

    So what about today? What are these companies strategizing about now?

    Aral: If you talk to someone from Silicon Valley about what they're most excited about these days, you'll probably get an answer along the lines of the next 5 billion, which sounds like an amazing concept. What does that mean? People in parts of the world that they probably haven't even visited. They want all of these people to be their next set of users, that they extract value from and exploit, for their profit motive and whatever other motives they might have. Because once you become a trillion-dollar entity, and once you have people who are billionaires, it's not just about the money anymore, it's about power.

    So when you hear someone from Silicon Valley salivating about the next 5 billion, colonialism is exactly what it is. It's not necessarily limited in geographical scope anymore. It's about a very small group of people who have a lot of wealth, who are basically exerting control over everyone else: over what they get to see, what they get to hear, and having the means to manipulate their behavior.

    That's the next step of colonialism, and different people are of course affected by it differently. The most marginalized are the most affected by it right now. But to a degree we all are, because if you compare yourself to a billionaire, well, we're talking about a very exclusive club.

    Caste + Capital Bias: “A Very Exclusive Club”

    Keisha: A concept that might be helpful here is caste, which Isabel Wilkerson defines as the basis on which people are given or denied social privileges, respect, resources, and fundamental considerations like consent and autonomy.

    And the club is indeed very exclusive. As the United States emerged as a nation, its dominant caste also formed around race and class: Whiteness, wealth, and the absence of both.

    Wilkerson's book, the award-winning Caste: The Origins of Our Discontents, focuses on how that caste system systematically violated Indigenous and Black people across this hemisphere and other subordinated people around the world.

    Marjorie Kelly of the Democracy Collaborative adds another layer to the story when she describes the role of wealth.

    In Wealth Supremacy: How the Extractive Economy and Biased Rules of Capitalism Drive Today’s Crises, Kelly explores just how much U.S. culture values wealth and people who own wealth, and what this “capital bias” does to all of us. Here she’s explaining it to Laura Flanders:

    CLIP: 00:23-00:27: https://www.youtube.com/watch?v=mH2eGIHXH5M

    “There’s an extraction going on and it’s sapping the resilience of our society.”

    People consciously and unconsciously use wealth as a standard for judging others and determining the kind of lives they can have. More than that, capital bias shapes how businesses ensure that people who already have wealth benefit.

    We first see caste and capital bias in this hemisphere in America’s broken colonial treaties and plantation systems. And these influences play out today across the modern tech sector.

    Changing that is gonna take some work.

    SOUND: Fade in and out the intro and first verse of Billie Holiday’s “God Bless the Child.”

    Key vocal: “Them that’s got shall have; them that’s not shall lose…”

    Fade out as Aral begins to talk again.

    Stories of Privilege and Access

    Aral: Our aim is to try and not replicate this big tech process. I am a white-passing guy with two Turkish parents, who now has French citizenship and who has enjoyed a certain level of privilege. I have never gone to bed hungry. That puts me in a very small percentage of people. As someone with this sort of a background, the only way I know of not perpetuating colonialism is to try and build systems that don't require me to be at the center. If these systems become more successful, I don't keep growing and getting wealthier and getting more powerful.

    Keisha: This principle is at the heart of the small tech approach.

    Aral: Small Technology Foundation is just the two of us. We don't say we know what's best for everyone else that's out there. We don't; we can't.

    If there's one thing about design, it's that you can only ever best design for yourself. If you're saying, we're going to design this solution for this country on the other side of the world: no, you're not. You're still going to be designing with your own biases, with your own experiences. You're just going to expect those people to use whatever product you've created, and be used by that product you've created, because it's going to be in your interests.

    We're trying to build technology in ways where, as it scales, we don't scale. How do we do this? Build things in a way where they're free and open source, so that anyone can take them and modify them, where people can customize them for their needs. And you don't need my permission. You don't need me involved. How do we encourage other groups to be sustainable with these technologies, without it necessarily feeding back into us, without us getting bigger?

    We're talking about decentralization: communities are empowered to build their own technologies and to use their own technologies, and not be used by some center somewhere.

    Annanda: Aral, I'm wondering, is there a personal story for you there? Because you do not often see white-passing men such as yourself talk about decoloniality and act on it, creating organizations and foundations to actually act on it. So I'm curious as to what might empower some of our listeners to maybe see themselves in your reflection.

    Aral: I basically grew up in Malaysia until I was about 13. We weren't rich, but one of my parents' relatives was the ambassador from Turkey, so we were in those circles.

    Very early on I was subjected to extreme wealth, not in a way that was in any way critical. Don't get me wrong. I was just in the middle of it going, why am I not that rich? Why don't I have the Lamborghini that drives me to school?

    SOUND: Fade in something like drum intro and first verse for Arkadas through narration; no vocals.

    Coming back to Turkey, to a very different culture, when I was 13, maybe that slap in the face was the first step in reflecting on these things. It might have been studying critical media theory for way too long at university. Chomsky will do that to you. It was just step by step.

    Keisha: For decades, social change organizers have taught that the personal is political. Our individual choices and perspectives are so deeply shaped by the histories and social politics of the cultures around us.

    Annanda and I get to talk about massive cultural themes on this show… the tech sector’s hypergrowth, capital bias, and how to build for community well-being. It’s also fun to see how these big influences come together in our individual stories, framing our orientation to technology.

    Aral: When you grow up with privilege, you get these opportunities where you're shown how things are not right. You're shown the inequalities. And then I guess you have a choice to make: am I okay with this because it benefits me? Am I okay with perpetuating that? Or should this inequality not exist, whether that benefits me or not? Do you go, okay, well, screw it, this is good for me? Or, well, no, this is wrong and this shouldn't be?

    At the same time, being Turkish, for example, I also got to see other things. Like, I'm white-passing, and that's fine until I get to a passport control, and then they see where I was born, and then it's kind of like, you trying to fool us? Are you a terrorist?

    This system that we have, this unequal, unjust system that we have doesn't benefit anyone. It doesn't even benefit these billionaires.

    Like, we're on the verge of climate catastrophe, our habitat being unlivable. The world is going to survive. Homo sapiens sapiens, maybe not so much. I get a lot of people saying, oh, this is about charity. No, none of it is! If anything, it might be enlightened self-interest, but that's it.

    The people benefiting the most from this horribly corrupt system that we have: my goal is not to save those people. But we're all in the same boat here. It's just that some people are going to have the life rafts, maybe, and the rest of us are going to be the first to go.

    SOUND: Evocative space-like background sound

    Keisha: Annanda, did you ever watch the film Elysium?

    Annanda: I have not.

    Keisha: Rich people escape a wrecked Earth for a luxury satellite that they’ve built in orbit. But the rest of us have to stay here on the surface, in poverty and catastrophe.

    Of course it's fiction, but when NASA shut down the shuttle program in 2011, that left a development vacuum that Richard Branson, Elon Musk, and aerospace companies like Boeing have been really happy to fill. They're talking not just about supporting the International Space Station but also about colonizing Mars and seeding the galaxy.

    Annanda: I'm not surprised. The money, intellectual property, and market dominance that result from the innovative technology to get to space are a natural, normal next step when it comes to capitalism. It's the same old same old… just in space.

    Keisha: The land grab strikes again.

    Annanda: What this dilemma reminds me of is Gil Scott-Heron's song "Whitey on the Moon". You just have to replace the word "moon" with "Mars". My dad grew up on the West and South Sides of Chicago. My mom grew up in Newark, NJ. Even though they lived in completely different parts of the country, the sentiment in their respective Black communities was the same.

    SOUND: Fade in and out 00:24-34 from the “Whitey on the Moon” audio

    There's a 60+ year history of Black and Brown communities critiquing the US government for prioritizing space races over the dignity of the most vulnerable within society. And it's "most vulnerable" not by choice; it's "most vulnerable" by design.

    Keisha: The space stuff is expensive escapism.

    But it's also what happens when a dominant caste is disconnected from consequences... and it doesn't have much of a commitment to making capital and tech systems less toxic or more beneficial to people who don't have those resources.

    We’ve talked about how tech shapes us as much as we shape it. I also think how we come into the world of tech can influence how we see and use it.

    Aral: I was seven years old, my dad got me a computer and a BASIC manual. Again, privilege, 'cause I'm quite old, so back then it wasn't a common thing. He just said, "Play with it. You can't break it." And I started programming.

    Keisha: BASIC is a programming language that ran on home computers through the 1970s, and Microsoft helped make it popular by embedding it in early PCs. As someone whose family didn’t get any kind of PC until the 1990s, I can’t imagine having had access to this tech so early on.

    Aral: When I was in the States doing my master's degree, I got a job making money with technology, consulting with Flash. Very quickly I was being offered six figures to work in this area. But I never worked at a Google or a Facebook. And I think there was always something, again, that pushed me away.

    The more I understood, the more I saw, the more uncomfortable I was with it. I don't want to build something that tracks what everybody does so I can manipulate them.

    I don't want to do this. Oh, you don't? Well then, you don't want to create a startup, because that's what all startups are trying to do. Then you won't get any venture capital. Well, I guess I'm okay with not getting venture capital then, if that's what it involves. This is a really toxic system.

    SOUND: grim, gritty electronica, downtempo through segment, behind our voices

    Keisha: Aral saw a system that he was supposed to be attracted to. And then once he got close to it, he began to feel repelled.

    While he's choosing to consult around the system, several decade-long trends are unfolding in tech and society.

    Computer hardware improves so much and so fast that tech users can access far more personal storage. We can walk around with mp3 players and stream films online. But within that same wonderful system, the musicians and artists aren't getting paid properly, and they can't control where their work goes.

    At the same time, government agencies are mass surveilling the public in the name of preventing the next national security disaster. And tech companies cooperate by giving law enforcement agencies on-demand access to people’s information.

    It’s in 2013 that we start learning just how far-reaching those surveillance programs are. The news alienates a lot of folks who, like Aral, got into tech to make life better for everyone.

    Annanda: This is where those who have wealth and access can really play a crucial role to disrupt monolithic business practices and provide a broader range of products for the public.

    I'm mindful that the social and emotional cost to make that happen is quite high, and our society undervalues emotions. But the reality is, emotions shape our individual and collective identities and cultures. The core emotion of fear that clearly exists within the tech industry is a serious roadblock when it comes to catalyzing change.

    I’m not just talking about fear through disenchantment and overwhelm, for folks like Aral. I’m also talking about fear of losing market dominance, or losing access to surveil populations to mine data for business or government purposes.

    An American intelligence contractor, Edward Snowden, leaked the news that at least five governments were surveilling people's phone data. The US charged him with espionage and theft, and he fled for asylum in Russia.

    The program Snowden leaked was challenged in court, but 10 years later governments all around the world can still access our phone and internet records. And there isn’t much accountability for it.

    Keisha: It's sobering, really, to think that it was so scandalous at the time. But have we become used to those patterns? Accustomed to the toxicity?

    Aral: The Snowden revelations were the final drop, where it's like, my goodness, the extent of this thing. People always ask, "Well, what ramifications are there? I'm such a boring person. Nobody wants to know about me."

    Trust me, it's not just about you. You're a statistic to them. But also, it's very much against your interests that you're a statistic to someone.

    Countering Dehumanizing Systems

    Keisha: Toni Morrison gave a lecture about people becoming statistics at Portland State University in 1975. She starts by describing a ledger of commodities traded in the US—food, solvents, and trafficked Africans.

    Listen:

    06:59-8:00: “Right after pitch and rice but before tar and turpentine, there is listed the human being. The rice is measured by pounds and the tar and the turpentine is measured by the barrel weight. Now there was no way for the book … to measure by pound or tonnage or barrel weight the human beings. They used only headcount.”

    Like Isabel Wilkerson and Marjorie Kelly, who we talked about earlier, Morrison names the root of this dehumanization. It's not racism. It's "greed and the struggle for power." Racism and disparities are consequences of the caste system that allows a few billionaires to target their next 5 billion marks.

    Morrison also says “You don’t waste your energy fighting the fever; you must only fight the disease.”

    “Fighting the disease” includes thinking about how issues intersect and how policies like mass surveillance could affect others.

    Aral: If you happen to be gay in a country where it is illegal for you to be gay, then maybe the ramifications are that you're going to be dead or incarcerated, right? As a middle-class white person in some Western country, it might be that your insurance premiums go up, and then you're like, why do my insurance premiums go up? And they're like, your smart fridge told us what you're eating.

    It's at those points that some people get it: "Oh, I'll have to pay more money for something." And it's sad that that's when they start paying attention. Because people are suffering and dying.

    Annanda: Thinking about "fighting the disease" is a vital start, but I worry that it can all too easily become a comorbidity of the disease, a norm in liberal spaces. At this point, it is a rare thing for me to give any "you know whats" about a damn book study, panel discussion, or documentary watch party, explaining to well-intentioned people the reality we've been swimming in for generations. I'm far more interested in and willing to collaborate with folks who are rolling up their sleeves and doing active work on how they are in relationship with other people, and how those relationships model, with clarity, depth, and complexity, the undoing of historic wrongs.

    Keisha: Yeah, Toni Morrison’s work is all about building new futures. She’s literally asking creators to stop shadowboxing! And to make something real, rooted, grounded. Because the current system has its own inertia keeping it in place. It takes energy to move in new directions.

    Aral: The companies whose business model it is to track your every move and then use it against you, it's not that they're going to go out of business tomorrow.

    Hugo Boss created the uniforms for the Nazis. They're still in business. VW built the cars. They're still around. We seem to have a very hard time getting rid of large corporations, even when they've done the worst things in the world. So I don't believe that we can destroy Google or destroy Facebook.

    But I don't think that's what we should be trying to do either. They have an unethical business model, so as far as I'm concerned, they don't deserve to exist.

    There's a book called Hooked that was a bestseller in the U.S.: how to build addictive user experiences. You'd think it was a warning. No, it actually is an instruction manual. So this is the problem: our success criteria are wrong. We encourage psychopathic behavior.

    We don't have healthy cells that are doing something good for the organism. We have cells where we say, you need to grow and grow and grow, to get as large as you can. We just described a tumor.

    But my concern is how we build better alternatives. How do we build things differently? Whenever they do something where I can raise some awareness, yes, of course. But I wish I could spend zero time on them.

    Annanda: So what's the antidote?

    Keisha: More about that after the break.

    SEGMENT C: SOLUTIONS

    Keisha: Welcome back to Moral Repair. We've been talking to Aral Balkan, designer, developer, and co-founder of the Small Tech Foundation. If the small tech approach could be a real alternative to Big Tech, what questions would new founders need to ask? And how might this approach need to be funded?

    Funding the Sector

    Aral: If the company hasn't been formed yet, you can sit down with the people who are going to form it and say, okay, let's talk about your business model. It all comes from the business model and the funding model. If they want to be a startup in the Silicon Valley sense of the word, and if they're looking for venture capital, the only thing I can tell them is, you have to either out-Google Google, or you need to build something that is useful for a Google or a Facebook, and then you will get bought by them, right? But you have to build tracking technologies, because that's the game you're playing.

    A Silicon Valley startup is a temporary business. Its goal is to either fail fast or grow exponentially, right? We love that whole exponential growth with finite resources. That seems to be what we're basing our society on. And if you think about it for more than about 30 seconds, you realize that's just a euphemism for extinction.

    Aral: At the beginning, it's all about the product. "Hey, we want to let you guys talk to each other." What a lot of us don't know is they've already taken venture capital!

    Keisha: Aral’s laying it all out here.

    Aral: They've already sold the company. And they've already sold a business model to their investors. We're going to build this thing. Why are we making it free?

    Because we want to get millions and millions and millions of people. That's what we're selling in our exit: those people and the potential to exploit them. So that's the Silicon Valley model.

    Can we talk about different types of funding? Then we have an opportunity here. Are you going to be okay if you build a cooperative? Are you going to be okay with not becoming a billion-dollar corporation? With not selling this company, but investing in it being a long-term sustainable business?

    If those are your goals, then you can't take venture capital.

    Keisha: So VC/venture capital creates perverse incentives. What can we do instead?

    Aral: Here in the EU, where, you know, we have more social funding for things: if I am just coming into technology and I want to create something, Silicon Valley's over there with their venture capitalists going, "Here's 5 million for you." "I just invested in 10 startups. I don't care about nine of them failing. I want that last 5 million to become a billion-dollar corporation so I can get my money back and more."

    Do we have commons-based funding? Are we paying for technology with our taxes? We're not. "You want to build technology, but you don't want to be a douchebag? Here's your funding." That's what we really need.

    How it's funded matters.

    Investing in the Commons

    Keisha: I really love this concept of commons-based funding, and I want to think about it broadly.

    SOUND: Audio clip

    A lot of my parents' friends in the UK were also first-generation immigrants and part of the Windrush Generation—that's people born in former British colonies who moved to England and helped rebuild the country after World War 2.

    Because of the hostility that they faced when they got there, many of them struggled to get bank loans and mortgages. So they leaned on each other. They shared resources in a peer-to-peer circle lending system that didn’t charge anyone interest, called susu, or partner. And that’s how some of them were able to buy houses, to fund their kids’ education—and, yeah, start business ventures.

    So today, could groups of tech folks… especially those who don’t have access to conventional VC support… use susu circles to fund each other’s small tech projects?
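
    For anyone new to the mechanics, a susu is simple enough to simulate: each round, every member pays a fixed contribution, and one member in turn takes the whole pot, with no interest charged. A minimal sketch in Python with hypothetical names and amounts:

    ```python
    # Illustrative susu/partner rotation: the pot cycles through the circle.
    def run_susu(members, contribution):
        """Yield (round_number, recipient, pot) for one full rotation."""
        pot = contribution * len(members)
        for round_number, recipient in enumerate(members, start=1):
            yield round_number, recipient, pot

    members = ["Amina", "Claude", "Dionne", "Everton"]
    for round_number, recipient, pot in run_susu(members, contribution=250):
        print(f"Round {round_number}: {recipient} takes the pot of {pot}")
    # Over a full cycle, each member pays in 1,000 and receives 1,000 exactly
    # once; the benefit is timing and mutual trust, not interest.
    ```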

    At a different scale, I'm also curious about what Arlan Hamilton has been doing at Backstage Capital: it's focused on founders who are women, people of color, or queer, not just in traditional tech, and it aims for pretty significant profits. They say they've worked with over 200 companies in the last 5 years, at the $100K level on average.

    If someone wants to send me $100K to go be great, I won’t say no. And also there are also lots of cool things I could imagine doing with even less funding than that.

    Annanda: What Backstage Capital is doing is great. This is definitely the thing we need to see more of, and yet $100K in the tech market is not enough funding to seriously compete with White and Asian founders. VCs should choose to invest in all founders the way they do White and Asian men. My frustration about this is how basic of an expectation this actually is. It reminds me of this idiom said in the South: "and then you want to know why?" Like, if you see the gaps that we see, $100,000 isn't even a drop in the bucket to be competitive in today's tech market. What Arlan Hamilton has been doing is necessary. It needs… six more zeros behind it.

    Keisha: Absolutely, the investment playing field has not been level. I'm less interested in answering Big Tech at their scale, though—because it's been destructive—and more interested in experiments in different ways to operate. That's what small tech sounds like to me.

    Annanda: If we have a listener that's like, "Hey, I want to do what Aral is doing," what steps would they need to take? Actually, this opens the door for a lot more people who don't really have access to VC capital anyway. That makes space for the global majority.

    Aral: We need to support these people from the commons, from our taxes: "You want to build technology for the common good? Then here's 5 million euros for you."

    "And you don't have to give us a detailed list of every feature that you want to add to it. We want you to experiment. We want you to fail fast. We want you to do all those things that actually work. But you're not going to have that toxic business model.

    In fact, you won't be able to sell this organization that you're building, 'cause we're funding it from the commons. It's going to stay in the commons." If we did just that very simple thing, I think we would be awash with alternatives.

    Keisha: I grew up hearing about the commons as a way to talk about public parks anyone can use, however much or little money and social status they have.

    Investing in the commons, then, would be kind of like any one of the 50 million trees the Kenyan biologist and conservationist Wangari Maathai [Wahn-GAH-ri Mah-Tah-EYE] inspired women to plant in Kenya over fifty years.

    Commons investments, like those planted trees, don’t just yield one effect per cause. They produce generations of interacting effects that spread across a system. As Maathai explained in her 2004 Nobel Lecture: They conserve resources, add new value, and reinvigorate the whole landscape:

    Clip: [01:28-01:43] from Wangari Maathai’s Nobel Lecture

    “Tree planting is simple, attainable, and guarantees quick successful results within a reasonable amount of time. These are all important to sustain interest and commitment.”

    Keisha: Small tech aims to be like one of those trees.

    Keisha: You describe Small Tech as everyday tools for everyday people, designed to increase human welfare. What kind of tools are you talking about?

    Aral: What if each one of us had our own place on the web that we owned and controlled?

    SOUND: Fade in worldbeat or chill lofi sound.

    Where we could be public if we wanted to be, but where we could also be private if we wanted to be.

    I mean private in the correct sense of the term, not as in "Facebook private." Facebook's always in the middle. Being the entity in the middle is their business model, so I can't talk to you directly. I have to tell Mark first, and then Mark has to relay it to you. But Mark also takes notes and goes, oh, who are these people? How can I exploit them?

    If I had my own place on the web, let's say aral.smallweb.org, and you had yours, keisha.smallweb.org, and we want to talk to each other privately, I could go directly to your place: here's a message for you, Keisha. End-to-end encrypted. Only the two of us can see that, and you'd reply back to me.

    But if I want to be public, like an old-school website, I could make a message public, and people could follow me and see it, other people who are on this system. That's what I mean when I talk about decentralization. We each have our own place that we own and control.
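
    For the technically curious: the direct, private exchange Aral describes rests on standard public-key encryption. A minimal sketch using the PyNaCl library (pip install pynacl); the smallweb.org addresses come from his example, and the code illustrates the idea rather than the Small Tech Foundation's actual implementation:

    ```python
    # Illustrative end-to-end encryption between two people's key pairs.
    from nacl.public import PrivateKey, Box

    # Each person generates and keeps their own key pair at their own site.
    aral_key = PrivateKey.generate()      # lives at aral.smallweb.org
    keisha_key = PrivateKey.generate()    # lives at keisha.smallweb.org

    # Aral encrypts directly to Keisha's public key: no entity in the
    # middle can read the message in transit.
    to_keisha = Box(aral_key, keisha_key.public_key)
    ciphertext = to_keisha.encrypt(b"Here's a message for you, Keisha.")

    # Keisha decrypts with her private key plus Aral's public key.
    from_aral = Box(keisha_key, aral_key.public_key)
    print(from_aral.decrypt(ciphertext).decode())
    ```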

    Aral: We tested this out with the city of Ghent to see: what if a municipality sponsored this for all of its citizens? What if it was a human right that you had a place on the web?

    Keisha: What would they need to know to manage that?

    Aral: That's the thing. Nothing at all. Unless it requires no technical knowledge whatsoever to set up and manage, we don't win. Otherwise, we will get a handful of people who are ideologically aligned with what we're trying to do, who have technical knowledge, who are probably quite privileged.

    It needs to be that there is no technical knowledge required, which is not easy. But it's also not impossible.

    The web is a client-server platform. It was built without any of the safeties that I'm trying to build into the small web: not scaling; being designed for individuals so that we don't have the concept of users. We only have the concept of a person, and we design for a single person.

    It's single people together that form communities, but they have to have personhood at the core, or else we end up violating those human rights.

    SOUND: Start closing reflection music.

    Breaking down Moral Distress, Injury + Repair

    Annanda: Studies show that the experience of emotional or spiritual pain activates the same part of the brain as physical pain. The cultural norms of profit over people, of gatekeeping diverse groups of people from being active competitors in the industry, are actually forms of legitimized violence.

    For those of us who work in the field of moral injury and helping people repair those injuries, a huge thing that we talk about is that for somebody to even get to a place of moral injury, they've had a sense of moral distress, which looks like normal stressors; moral injury itself often shows up like PTSD.

    Aral is actually a great story of moral resilience, in the sense that he had this moral distress in seeing the beliefs and values of Big Tech function in a way that totally compromised his morals, and he said, this is not for me. How do I shift and change it?

    The amount of moral injury within STEM and tech is pretty high. Folks are constantly negotiating their values in order to participate in the space. And this has serious impact: higher rates of anxiety, higher rates of depression, a lack of self-worth and sense of self… because you're being used. Your intelligence, your work is being used to support harmful things in society that you don't agree with, not to uplift society.

    SOUND: Fade in title music

And in order to rebuild that, it's not as if we erase the past or act like the injury didn't happen, right? We have to work with the emotional, existential, and spiritual scars that occur, and deal with the complex messiness of the world, which takes a lot of work.

Keisha: Thank you for that. I'm moved by that theme of recovery. In Toni Morrison’s and Wangari Maathai’s work it’s a kind of creative recovery, making ourselves strong in the face of dehumanization: acknowledging that it is dehumanizing, but responding in a creative way. Not just pushing against commodifying systems that wound our sense of self, degrade us, or conflict with our values, but actively leaning into community, investing in common spaces, and creating things of value for all people.

These are Morrison’s and Maathai’s signposts for sustainable futures. And that spirit is rooted in our cultural history, in Black wisdom. It’s not the snake-eat-tail consumption that passes for business success. It’s a different logic entirely, and it makes provision for generations to come.

    CREDITS

    Keisha: Thank you for joining this season of Moral Repair: A Black Exploration of Tech. We return for a new season, Spring 2024.

    Annanda: I’m Annanda Barclay

    Keisha: And I’m Keisha McKenzie.

Annanda: The Moral Repair: A Black Exploration of Tech team is us, along with Ari Daniel, Emmanuel Desarme, Courtney Fleurantin, Rye Dorsey, and Genevieve Sponsler. The executive producer of PRX Productions is Jocelyn Gonzales.

    Keisha: This podcast is part of the Big Questions Project at PRX Productions, which is supported by the John Templeton Foundation.

    SOUND: PRX Sound Signature

    ###

    SHOW NOTES

    Talk back to us at @moralrepairpodcast on Instagram, @moralrepair on Twitter/X, or by email: moralrepairpodcast at gmail dot com

In November 2023, Amazon told advertisers that Prime Video ads will “reach 115 million viewers per month” in the US.

    Follow Aral Balkan’s work at the Small Tech Foundation; read his essay, “We didn’t lose control. It was stolen” and listen to his talk at NextM Warsaw, December 2019

Isabel Wilkerson’s Caste: The Origins of Our Discontents (2020) is now a major narrative film thanks to director Ava DuVernay and philanthropic foundations like Ford, Gates, and Nathan Cummings. Read the book that started it all.

    ORIGIN is showing nationally in selected theaters.

    Marjorie Kelly is Distinguished Senior Fellow at the Democracy Collaborative, and author of Wealth Supremacy: How the Extractive Economy and Biased Rules of Capitalism Drive Today’s Crises.

Listen to Marjorie’s conversation with Laura Flanders (December 2023) and her talk at the University of Colorado (September 2023)

    Toni Morrison published “A Humanist View” in her final collection, The Source of Self-Regard: Selected Essays, Speeches, and Meditations (Knopf, 2019)

    Read the transcript or listen to the original audio.

    Professor Jeffrey Sachs at the United Nations Food Systems Pre-Summit, 2021

    Visualize the wealth of one billionaire compared to the average US household income ($65K)

TechCrunch reports that US intelligence agencies have confirmed purchasing US citizens’ personal data

    Professor Shoshana Zuboff (Harvard Business School) defines surveillance capitalism

    Wangari Maathai describes the Green Belt Movement in her 2004 Nobel Lecture

    Find out more about how emotions shape our identities, cultures, and societies in the book, “The Cultural Politics of Emotion” by Sara Ahmed

“The Body Keeps The Score” by Bessel van der Kolk (narrated by Sean Pratt): This book educates about, helps assess, and suggests interventions for the impact of emotional pain and trauma on our physical bodies.

    “Whitey On The Moon” by Gil Scott-Heron, from The Revolution Begins album

Reverend Annanda Barclay (Oakland, CA) & Dr. Keisha McKenzie (Clarksville, MD)

Reverend Annanda Barclay
(She/They) As a queer ordained minister of the Presbyterian Church and a Chaplain Fellow at Stanford University focused on moral distress and injury in STEM, Barclay is also Co-Chair of the religious reparations nonprofit Center for Jubilee Practice.

Rev. Annanda Barclay is a death doula who explores life well-lived, a non-sectarian chaplain, and a Stanford researcher of moral injury and repair as it relates to tech.


Dr. Keisha E. McKenzie
(She/They) is a technical communicator, strategist, and advocate who applies humanism and systems thinking to questions of well-being, public good, and ecology.