Computer Says Maybe
Alix Dunn
86 episodes
5 days ago
Technology is changing fast. And it's changing our world even faster. Host Alix Dunn interviews visionaries, researchers, and technologists working in the public interest to help you keep up. Step outside the hype and explore the possibilities, problems, and politics of technology. We publish weekly.
Technology
Society & Culture
Episodes (20/86)
Computer Says Maybe
Who Knows? Fact-Finding in a Failing State w/ HRDAG and Data & Society

Everything is happening so fast. And a lot of it’s bad. What can research and science organizations do when issues are complex, fast-moving, and super important?

More like this: Independent Researchers in a Platform Era w/ Brandi Geurkink

Building knowledge is more important than ever in times like these. This week, we have three guests. Megan Price from the Human Rights Data Analysis Group (HRDAG) shares how statistics and data science can be used to get justice. Janet Haven and Charlton McIlwain from Data & Society explore the role research institutions can play in bridging research knowledge and policy prescription.

Further reading & resources:

  • HRDAG’s involvement in the trial of José Efraín Ríos Montt
  • A profile of Guatemala and timeline of its conflict — BBC (last updated in 2024)
  • To Protect and Serve? — a study on predictive policing by William Isaac and Kristian Lum
  • An article about the above study — The Appeal
  • HRDAG’s stand against tyranny
  • More on Understanding AI — Data & Society’s event series with the New York Public Library
  • About Janet Haven, Executive Director of Data & Society
  • About Charlton McIlwain, board president of Data & Society
  • Bias in Computer Systems by Helen Nissenbaum
  • Center for Critical Race and Digital Studies
  • If you want to hear more about the history of D&S, the full conversation is up on YouTube (link to be added).

**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**

Post Production by Sarah Myles | Pre Production by Georgia Iacovou

6 days ago
59 minutes

Computer Says Maybe
Who Knows? Independent Researchers in a Platform Era w/ Brandi Geurkink

Imagine doing tech research… but from outside the tech industry? What an idea…

More like this: Nodestar: Turning Networks into Knowledge w/ Andrew Trask

So much of tech research happens within the tech industry itself, because it requires data access, funding, and compute. But what the tech industry has in resources, it lacks in independence, scruples, and a public interest imperative. Alix is joined by Brandi Geurkink from the Coalition for Independent Technology Research to discuss her work at a time when platforms have never been so opaque, and funding has never been so sparse.

Further Reading & Resources:

  • More about Brandi and The Coalition
  • Understanding Engagement with U.S. (Mis)Information News Sources on Facebook by Laura Edelson & Damon McCoy
  • More on Laura Edelson
  • More on Damon McCoy
  • Jim Jordan bringing in Nigel Farage from the UK to legitimise his attacks on EU tech regulations — Politico
  • Ted Cruz on preventing jawboning & government censorship of social media — Bloomberg
  • Judge dismisses ‘vapid’ Elon Musk lawsuit against group that cataloged racist content on X — The Guardian
  • See the CCDH’s blog post on getting the case thrown out
  • Platforms are blocking independent researchers from investigating deepfakes by Ariella Steinhorn

Disclosure: This guest is a PR client of our consultancy team. As always, the conversation reflects our genuine interest in their work and ideas.

**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**

1 week ago
48 minutes

Computer Says Maybe
Très Publique: Algorithms in the French Welfare State w/ Soizic Pénicaud

Governments around the world are using predictive systems to manage engagement with even the most vulnerable. Results are mixed.

More like this: Algorithmically Cutting Benefits w/ Kevin De Liban

Luckily, people like Soizic Pénicaud are working to prevent the modern welfare state from becoming a web of punishment for the most marginalised. Soizic has worked on algorithmic transparency both inside and outside government, and this week she shares her journey from incrementally improving these systems (boring, ineffective, hard) to escaping the slow pace of government and looking at the bigger picture of algorithmic governance, and how it can build better public benefit in France (fun, transformative, and a good challenge).

Soizic is working to shift political debates about opaque decision-making algorithms to focus on what they’re really about: the marginalised communities whose lives are most affected by these systems.

Further reading & resources:

  • The Observatory of Public Algorithms and their Inventory
  • The ongoing court case against the French welfare agency's risk-scoring algorithm
  • More about Soizic
  • More on the Transparency of Public Algorithms roadmap from Etalab — the task force Soizic was part of
  • La Quadrature du Net
  • France’s Digital Inquisition — co-authored by Soizic in collaboration with Lighthouse Reports, 2023
  • AI prototypes for UK welfare system dropped as officials lament ‘false starts’ — The Guardian Jan 2025
  • Learning from Cancelled Systems by Data Justice Lab
  • The Fall of an Algorithm: Characterizing the Dynamics Toward Abandonment — by Nari Johnson et al, featured in FAccT 2024

**Subscribe to our newsletter to get more stuff than just a podcast — we host live shows and do other work that you will definitely be interested in!**

2 weeks ago
52 minutes

Computer Says Maybe
Straight to Video: From Rodney King to Sora w/ Sam Gregory

Seeing is believing. Right? But what happens when we lose trust in the recorded media put in front of us?

More like this: The Toxic Relationship Between AI and Journalism w/ Nic Dawes

We talked to Sam Gregory, a global expert and leading voice on this issue for the past 20 years, to get his take. We started way back in 1991, when Rodney King was assaulted by four police officers in Los Angeles. Police brutality was (and is) commonplace, but something different happened in this case. Someone used a camcorder and caught it on video. It changed our understanding of the role video could play in accountability. And in the past 30 years, we’ve gone from using video for evidence and advocacy to AI slop threatening to seismically reshape our shared realities.

Now apps like Sora provide impersonation-as-entertainment. How did we get here?

Further reading & resources:

  • More on the riots that followed the Rodney King verdict — NPR
  • More about Sam and Witness
  • ObscuraCam — a privacy-preserving camera app from WITNESS and The Guardian Project
  • C2PA: the Coalition for Content Provenance and Authenticity
  • Deepfakes Rapid Response Force by WITNESS

Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

Post Production by Sarah Myles

3 weeks ago
1 hour

Computer Says Maybe
The Toxic Relationship Between AI & Journalism w/ Nic Dawes

What happens when AI models try to fill the gaping hole in the media landscape where journalists should be?

More like this: Reanimating Apartheid w/ Nic Dawes

This week Alix is joined by Nic Dawes, who until very recently ran the non-profit newsroom The City. In this conversation we explore journalism’s newfound toxic relationship with AI and big tech: can journalists meaningfully use AI in their work? If a model summarises a few documents, does that add a new layer of efficiency, or inadvertently oversimplify? And what can we learn from big tech positioning itself as a helpful friend to journalism during the Search era?

Beyond just the accurate relaying of facts, journalistic organisations also represent an entire backlog of valuable training data for AI companies. If you don’t have the same resources as the NYT, suing for copyright infringement isn’t an option — so what then? Nic says we have to break out of the false binary of ‘if you can’t beat them, join them!’

Further reading & resources:

  • Judge allows ‘New York Times’ copyright case against OpenAI to go forward — NPR
  • Generative AI and news report 2025: How people think about AI’s role in journalism and society — Reuters Institute
  • An example of The City’s investigative reporting: private equity firms buying up property in the Bronx — 2022
  • The Intimacy Dividend — Shuwei Fang
  • Sam Altman on Twitter announcing that they’ve improved ChatGPT to be mindful of the mental health effects — “We realize this made it less useful/enjoyable to many users who had no mental health problems, but…”

**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**

1 month ago
41 minutes

Computer Says Maybe
Unlearning in the AI Era w/ Nabiha Syed at Mozilla Foundation

Mozilla Foundation wants to chart a new path in the AI era. But what is its role now and how can it help reshape the impacts and opportunities of technology for… everyone?

More like this: Defying Datafication w/ Abeba Birhane

Alix sat down with Nabiha Syed to chat through her first year as the new leader of Mozilla Foundation. How does she think about strategy in this moment? What role does she want the foundation to play? And crucially, how is she stewarding a community of human-centered technology builders in a time of hyper-scale and unchecked speculation?

As Nabiha says, “restraint is a design principle too”.

Plug: We’ll be at MozFest this year broadcasting live and connecting with all kinds of folks. If you’re feeling the FOMO, be on the lookout for episodes we produce about our time there.

Further reading & resources:

  • Watch this episode on YouTube
  • Imaginative Intelligences — a programme of artist assemblies run by Mozilla Foundation
  • Nothing Personal — a new counterculture editorial platform from the Mozilla Foundation
  • More about Mozfest
  • Nabiha on the Computer Says Maybe live show at the 2025 AI Action Summit
  • Nabiha Syed remakes Mozilla Foundation in the era of Trump and AI — The Register
  • Nabiha on why she joined MF as executive director — MF Blog

**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**

1 month ago
45 minutes

Computer Says Maybe
You Seem Lonely. Have a Robot w/ Stevie Chancellor

Loneliness and mental illness are rising in the US, while access to care dwindles — so a lot of people are turning to chatbots. Do chatbots work for therapy?

More like this: The Collective Intelligence Project w/ Divya Siddarth and Zarinah Agnew

Why are individuals confiding in chatbots over qualified human therapists? Stevie Chancellor explains why an LLM can’t replace a therapeutic relationship — but often there’s just no other choice. It turns out the chatbots designed specifically for therapy are even worse than general models like ChatGPT; Stevie shares her ideas on how LLMs could potentially be used — safely — for therapeutic support. This is a really helpful primer on how to evaluate chatbots for specific, human-replacing tasks.

Further reading & resources:

  • Stevie’s paper on whether replacing therapists with LLMs is even possible (it’s not)
  • See the research on GitHub
  • People are Losing Their Loved Ones to AI-Fuelled Spiritual Fantasies — Rolling Stone (May 2025)
  • Silicon Valley VC Geoff Lewis becomes convinced that ChatGPT is telling him government secrets from the future
  • Loneliness considered a public health epidemic according to the APA
  • FTC orders online therapy company BetterHelp to pay damages of $7.8m
  • Delta’s plan to use AI in ticket pricing draws fire from US lawmakers — Reuters July 2025

**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**

1 month ago
52 minutes

Computer Says Maybe
Local Laws for Global Technologies w/ Hillary Ronen

What’s it like working as a local representative when you live next door to Silicon Valley?

More like this: Chasing Away Sidewalk Labs w/ Bianca Wylie

When Hillary Ronen was on the San Francisco Board of Supervisors, she had to make lots of decisions about technology. She felt unprepared. Now she sees local policymakers on the frontlines of a battle over resources and governance in an AI era, and is working to upskill them to make better decisions for their constituents. No degree in computer science required.

Further reading & resources:

  • Local Leadership in the Era of Artificial Intelligence and the Tech Oligarchy by Hillary Ronen
  • More on Hillary’s work as a Supervisor for SF
  • Hillary Ronen on progressives, messaging, hard choices, and justice — interview in 48Hills from January 2025
  • More about Local Progress
  • Confronting Preemption — a short briefing by Local Progress
  • What Happens When State and Local Laws Conflict — article on state-level preemption by State Court Report

**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**

1 month ago
58 minutes

Computer Says Maybe
Gotcha! Enshittification w/ Cory Doctorow

Welcome to the final boss of scams in the age of technology: Enshittification.

More like this: Nodestar: The Eternal September w/ Mike Masnick

This is our final episode of Gotcha! — our series on scams, how they work, and how technology both amplifies and obscures them. For this final instalment we have Cory Doctorow on to chat about his new book Enshittification.

Is platformisation essentially just an industrial-scale scam? We deep-dive into the enshittification playbook to understand how companies lock users into decaying platforms, and get away with it. Cory shares ideas on what we can do differently to turn the tide. Listen to learn what a ‘chickenised reverse centaur’ is…

Further reading & resources:

  • Buy Enshittification now from Verso Books!
  • Picks and Shovels by Cory Doctorow
  • On The Media series on Enshittification
  • Pluralistic — Daily Links and essays by Cory Doctorow
  • Conservatism Considered as a Movement of Bitter Rubes — Cory on why conservatism creates a friendly environment for scams
  • How I Got Scammed — Cory on his personal experiences of being scammed
  • All of Cory’s books
  • All (Antitrust) Politics Are Local — the entry to Pluralistic that Cory wrote on the day of recording
2 months ago
55 minutes

Computer Says Maybe
Gotcha! ScamGPT w/ Lana Swartz & Alice Marwick

Thought we were at peak scam? Well, ScamGPT just entered the chat.

More like this: Gotcha! The Crypto Grift w/ Mark Hays

This is part three of Gotcha! — our series on scams, how they work, and how technology is supercharging them. This week Lana Swartz and Alice Marwick join Alix to discuss their primer on how generative AI is automating fraud.

We dig into the very human, very dark world of the scam industry, where the scammers are often being exploited in highly sophisticated human trafficking operations — and are now using generative AI to scale up and speed up.

We talk about how you probably aren’t going to get a deepfake call from a family member demanding a ransom, but the threats are still evolving in ways that are scary and, until now, largely unregulated. And, as ever, even though the problems are made worse by technology, we explore the limits of technology and law in stemming the tide.

Further reading & resources:

  • Read the primer here!
  • More about Lana Swartz
  • More about Alice Marwick
  • New Money by Lana Swartz
  • Scam: Inside Southeast Asia's Cybercrime Compounds by Mark Bo, Ivan Franceschini, and Ling Li
  • Revealed: the huge growth of Myanmar scam centres that may hold 100,000 trafficked people
  • Al Jazeera True Crime Report on scamming farms in South East Asia
  • Scam Empire project by the Organised Crime and Corruption Reporting Project

**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**

2 months ago
54 minutes

Computer Says Maybe
NYC Live: Let Them Eat Compute

This just in on data centers: energy grids are strained, water is scarce, utility costs are through the roof — ah well, let them eat compute, I guess!

More like this: AI Thirst in a Water-Scarce World w/ Julie McCarthy

It was just Climate Week in NYC, and we did a live show on data centers with four amazing guests from around the US…

Thank you to the Luminate Foundation for sponsoring this live show, and to all of our NY-based friends and network from around the world who made it to Brooklyn for a magical evening. You can also watch the live recording on YouTube.

  • KeShaun Pearson (Memphis Community Against Pollution) breaks down how Elon Musk’s xAI supercomputer is polluting the air of historically Black neighborhoods in Memphis, and how organizers are fighting back against yet another chapter of corporate extraction in their communities.
  • KD Minor (Alliance for Affordable Energy) demystifies the energy impacts of data centers in Louisiana and shares organizing strategies to mobilize community opposition to Big Tech and Big Oil infrastructure.
  • Marisol (No Desert Data Center) talks about their grassroots coalition’s recent win in Tucson to stop Amazon’s Project Blue data center proposal, which threatened the city’s scarce water supply, and how they’re organizing for future protections.
  • Amba Kak (AI Now Institute) talks us through the bigger picture: what’s behind Big Tech’s AI data center expansion, who stands to benefit from this boom, and what we sacrifice in return.

Further reading & resources:

  • Amazon Web Services is the company behind Tucson’s Project Blue, according to a 2023 county memo — from Luminaria
  • Tucson to create new policies around NDAs following the council’s regret over not knowing more about Project Blue — from Luminaria
  • How Marana, also in the Tucson area, employed an ordinance to regulate water usage after learning about data center interest in the area
  • xAI has requested an additional 150 MW of power for Colossus in Memphis, bringing it to a total of 300 MW
  • Time reports on the increase in nitrogen dioxide pollution around Memphis due to xAI turbines
  • Keshaun and Justin Pearson on Democracy Now discussing xAI’s human rights violations
  • Meta’s Mega Data Center Could Strain Louisiana’s Grid — and Entergy Isn’t Prepared — report by the Alliance for Affordable Energy
  • 'A Black Hole of Energy Use': Meta's Massive AI Data Center Is Stressing Out a Louisiana Community — 404 Media

**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**

2 months ago
52 minutes

Computer Says Maybe
Are AI Companies Cooking the Books? w/ Sarah Myers West

OpenAI just secured a bizarre financial deal with Nvidia — but the math is not mathing. Is the AI sector an actual market, or a series of high-profile announcements of circular relationships between a tiny number of companies?

More like this: Making Myths to Make Money w/ AI Now

Alix sat down with Sarah Myers West to go through the particulars of this deal, and other similar deals that are propping up AI’s industry of vapour. This is not your traditional bubble that’s about to burst — there is no bubble; it’s just that The New Normal is to pour debt into an industry that cannot promise any returns…

Further reading & resources:

  • More on the Nvidia OpenAI deal — CNBC
  • Analysts refer to the deal as ‘vendor financing’ — Insider Monkey
  • Spending on AI is at Epic Levels. Will it Ever Pay Off? — WSJ
  • OpenAI, Softbank, and Oracle spending $500bn on data centre expansion in Abilene — Reuters
  • How Larry Ellison used the AI boom and the Tony Blair Institute to bolster his wealth
  • Oracle is funding OpenAI data centers with heaps of debt and will have to borrow at least $25bn a year — The Register

**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**

2 months ago
39 minutes

Computer Says Maybe
Gotcha! How MLMs Ate the Economy w/ Bridget Read

Multi-level marketing schemes have built an empire by enticing people with promises of self-realisation and economic freedom. The cost is simple: exploit and be exploited.

More like this: Worker Power & Big Tech Bossmen w/ David Seligman

This is part two of Gotcha! — our series on scams, how they work, and how technology is supercharging them. This week, Bridget Read came to Alix with a very exciting business opportunity. Bridget authored Little Bosses Everywhere — a book on the history of MLM.

We explore how door-to-door sales in the mid-20th-century US took on the business model of a Ponzi scheme, and transformed the sweaty salesman into an entrepreneurial recruiter with a downline.

MLM originators were part of a coordinated plan to challenge the New Deal in favour of radical free enterprise, where the only thing holding you back is yourself, and the economy consists solely of consumers selling to each other in a market of speculation. The secret is, no one is selling a product — they’re selling a way of life.

Further reading & resources:

  • Buy Bridget’s book: Little Bosses Everywhere: How the Pyramid Scheme Shaped America
  • Family Values by Melinda Cooper
  • The Missing Crypto Queen: a podcast by BBC Sounds, about a large scale crypto scam, where there wasn’t even any crypto
  • LuLaRoe — the pyramid scheme that tricked American mums into selling cheap clothes to their friends and family with the promise of financial independence.
  • My Experience of Being in a Pyramid Scheme (Amway) — a personal account by Darren Mudd on LinkedIn
  • Watch our recent live show at NYC Climate Week

Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

2 months ago
50 minutes

Computer Says Maybe
Gotcha! The Crypto Grift w/ Mark Hays

Hey you! Do you want some free internet money? If this sounds too good to be true, that’s because it is!

More like this: Making Myths to Make Money w/ AI Now

This is Gotcha! A four-part series on scams, how they work, and how technology is supercharging them. We start with Mark Hays from Americans for Financial Reform (AFR), and get into one of the biggest tech-fuelled financial scams out there: cryptocurrencies.

Like many things that require mass buy-in, crypto started with an ideology (libertarianism, people hating on Wall Street post-2008). But where does that leave us now? What has crypto morphed into since then, and how does it deceive both consumers and regulators into thinking it’s something that it’s not?

Further reading & resources:

  • Seeing Like a State by James C. Scott
  • Capital Without Borders by Brooke Harrington
  • The Politics of Bitcoin by David Golumbia
  • Learn more about Americans for Financial Reform
  • Check out Web3 Is Going Just Great by Molly White
  • Line Goes Up by Folding Ideas — an excellent survey of all the tactics and rug-pulls during the height of the NFT boom
  • The Missing Crypto Queen: a podcast by BBC Sounds, about a large scale crypto scam, where there wasn’t even any crypto

**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**

2 months ago
55 minutes

Computer Says Maybe
Gotcha!

Gotcha! is a four-part series on scams, how they work, and how technology is supercharging them — running through to October 10.

In the series we look at:

  1. Crypto: Mark Hays on how a thing touting financial freedom ended up being a kind of fin-cult, rife with scamming
  2. Multi-Level Marketing: Bridget Read on the history of the biggest and most successful type of scam that still plagues us today
  3. Generative AI: Data & Society’s primer on how generative AI is juicing the scam industrial complex
  4. Enshittification: Cory Doctorow on his upcoming book, and how the process of enshittification represents user-hostile practices that scam people into paying more and lock them into ecosystems
2 months ago
1 minute

Computer Says Maybe
Nodestar: Turning Networks into Knowledge w/ Andrew Trask

What if you could listen to multiple people at once, and actually understand them?

More like this: The Age of Noise w/ Eryk Salvaggio

In our final instalment (for now!) of Nodestar, Andrew Trask shares his vision for a world where we can assemble understanding from data everywhere. But not in a way that requires corporate control of our world.

If broadcasting is the act of talking to multiple people at once, what about broad listening? Listening to multiple sources of information, and actually learning something, without trampling over the control that individuals have over who sees what, when.

Andrew says that broad listening is difficult to achieve because of three huge problems: information overload, privacy, and veracity — and we are outsourcing these problems to central authorities, who abuse their power in deciding how to relay information to the public. What is Andrew doing at OpenMined to remedy this? Building protocols that decentralise access to training data for model development, obviously.

Further Reading & Resources

  • The Computer as a Communication Device by J.C.R. Licklider and Robert W. Taylor, 1968
  • World Brain by H.G. Wells
  • Learn more about OpenMined
  • We’re gonna be streaming LIVE at Climate Week — subscribe to our YouTube

**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**

3 months ago
43 minutes

Computer Says Maybe
Nodestar: Building Blacksky w/ Rudy Fraser

Social media isn’t really social anymore. But that might be changing. Rudy Fraser over at Blacksky Algorithms has built something new: infrastructure that provides a safe online space for the Black community and, in the process, challenges the idea of hierarchical, centralised networks. His work — even outside the very cool development of Blacksky — is an amazing, concrete example of how the abstract ambitions of decentralisation can provide real value for people, and it sets us up for a new kind of tech politics.

More like this: How to (actually) Keep Kids Safe Online w/ Kate Sim

This is part two of Nodestar, our three-part series on decentralisation. Blacksky is a community built using the AT Protocol by Rudy Fraser. Rudy built this both out of a creative drive to make something new using protocol thinking, and out of frustration over a lack of safe community spaces for Black folks where they could be themselves, and not have to experience anti-Black racism or misogynoir as a price of entry.

Rudy and Alix discuss curation as moderation, the future of community stewardship, freeing ourselves from centralised content decision-making, how technology might connect with mutual aid, and the beauty of what he refers to as ‘dotted-line communities’.

Further reading:

  • Blacksky Algorithms
  • Blacksky the app — if you want an alternative to Bluesky
  • More about Rudy Fraser
  • Open Collective — a fiscal host for communities and non-profits
  • Paper Tree — community food bank
  • The Implicit Feudalism of Online Communities by Nathan Schneider
  • Flashes — a 3rd party Bluesky app for viewing photos
  • The Tyranny of Structurelessness by Joreen

Rudy is a technologist, community organizer, and founder of Blacksky Algorithms, where he builds decentralized social media infrastructure that prioritizes community-driven safety, data ownership, and interoperability. As a Fellow at the Applied Social Media Lab at Harvard’s Berkman Klein Center for Internet & Society, he advances research and development on technology that empowers marginalized communities, particularly Black users.

3 months ago
41 minutes

Computer Says Maybe
Nodestar: The Eternal September w/ Mike Masnick

How did the internet become three companies in a trenchcoat? It wasn’t always that way! It used to be fun, and weird, and full of opportunity. To set the scene for the series, we spoke to a stalwart advocate of decentralisation, Mike Masnick.

More like this: Big Tech’s Bogus Vision for the Future w/ Paris Marx

This is part one of Nodestar, a three-part series on decentralisation: how the internet started as a wild west of decentralised exploration, got centralised into the hands of a small number of companies, and how the pendulum has begun its swing in the other direction.

In this episode Mike Masnick gives us a history of the early internet — starting with what was called the Eternal September, when millions of AOL users flooded the scene, creating a messy, unpredictable, exciting ecosystem of open protocols and terrible UIs.

Further reading & resources:

  • Protocols, Not Platforms by Mike Masnick
  • List of apps being built on AT Protocol
  • Graze — a service to help you make custom feeds with ads on AT Proto
  • Otherwise Objectionable — an eight-part podcast series on the history of Section 230
  • Techdirt podcast
  • CTRL-ALT-SPEECH podcast

**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**

3 months ago
52 minutes

Computer Says Maybe
Big Tech’s Bogus Vision for the Future w/ Paris Marx

What’s the deal with Silicon Valley selling imagined futures and never delivering on them? What are the consequences of an industry all-in on AI? What if we thought more deeply than just ‘more compute’?

More like this: Big Dirty Data Centres with Boxi Wu and Jenna Ruddock

This week, Paris Marx (host of Tech Won’t Save Us) joined Alix to chat about his recent work on hyperscale data centres, and his upcoming book on the subject.

We discuss everything from the US shooting itself in the foot with its lack of meaningful industrial policy, to how decades of lacklustre political vision from governments created a vacuum that has now been filled with Silicon Valley's garbage ideas. And of course, how the US’s outsourcing of manufacturing to China has catalysed China’s domestic technological progress.

Further reading & resources:

  • Buy Road To Nowhere: What Silicon Valley Gets Wrong About the Future of Transportation by Paris Marx
  • Data Vampires — limited series on data centres by Tech Won’t Save Us
  • Apple in China by Patrick McGee

**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**

3 months ago
41 minutes

Computer Says Maybe
Short: UK Groups Sue To Block Data Center Expansion

Foxglove and Global Action Plan have just sued the UK government over their YOLO hyperscale data center plans.

More like this: Net0++: Data Centre Sprawl

The local government rejected the data center, but Starmer’s administration overruled them. They want to force the development of a water-guzzling, energy-draining data center on a local community that has said no. And all of this is on the green belt. The lawsuit filed this week might put a stop to those plans.

Alix sat down with Ollie Hayes from Global Action Plan and Martha Dark from Foxglove to discuss the legal challenge filed this week. Why now? Isn’t the UK aiming for net zero? And how does this relate to the UK government’s wider approach to AI?

Further reading & resources:

  • Read the Guardian article about the suit
  • Read the Telegraph piece about the suit
  • Donate to the campaign
  • Data Centre Finder on Global Action Plan

Computer Says Maybe Shorts bring in experts to give their ten-minute take on recent news. If there’s ever a news story you think we should bring in expertise on for the show, please email pod@themaybe.org

3 months ago
13 minutes
