The Fight Against Surveillance Capitalism: A Conversation with Meredith Whittaker
Released on 12/04/2024
Hi, thank you all for being here today,
to echo what Katie said.
I'm so excited to be talking with Meredith Whittaker.
Meredith is the President of the Signal Foundation,
which, for those who don't know, oversees, runs,
and maintains the Signal Messenger,
which is an encrypted messaging app and platform
that is used by hundreds of millions of people globally.
As far as WIRED's concerned,
it's the gold standard in encrypted messaging;
we all use it, and recommend that you do too.
But Meredith also co-founded the AI Now Institute at NYU,
which focuses on the social implications of AI and tech
and spent over a decade at Google,
which I think is also especially relevant
given what we're talking about today, which is the idea of,
it says surveillance capitalism.
[Meredith] Oh, it did? Does it? Where?
[Brian] The name of the talk is about surveillance capitalism,
if you check your program, if there's a program.
But I think we wanna draw a little bit of distinction there
about what we're really talking about today, Meredith,
which is more of a surveillance business model
that has kind of defined the last couple of decades of tech.
Yeah, well, it's great to be here with you, Brian,
and thank you everyone.
I think, you know,
as someone with a bit of an academic background
who really pays attention to definitions,
this is just a clarification I often make,
surveillance capitalism is a more or less academic term
that is very freighted with sort of disciplinary histories.
And when we at Signal,
when I talk about the surveillance business model,
we're being a bit more diagnostic and a bit more, I think,
schematic about what we're talking about.
Ultimately, Signal exists in an industry
where the profit incentive is defined by surveillance,
and that's not a scary term.
In this case, we're talking about data collection.
You collect as much data as you can
about whatever your domain is, your customers,
your environment, what have you.
You use that to create models, to target ads, perhaps,
or to train AI models or what have you.
Or if you don't do that, you provide goods and services,
tool chains and libraries to those who do.
So there's a flywheel,
there's sort of a center of this industry
that is defined economically
by the collection and creation of data
with this sort of "the more, the better" paradigm.
And in Signal's case, we reject that out of hand,
of course, because we are motivated and dedicated
to preserving privacy, to creating technology
that maintains the hundreds of thousands of years of,
you know, private communications as a norm
into an age where privacy is vanishingly scarce
in our daily lives and communities and activities.
So when we talk about the surveillance business model,
we are talking about a paradigm
within which Signal both exists and which it rejects.
And I think what we see here
is Signal almost as the acid test for what's wrong
with the current incentive structure in tech,
when something as valuable as Signal
can't figure out how to make a profit.
When we can be core infrastructure for human rights work,
for journalism, for every military in the world,
if we're gonna be frank, for governments, for boardrooms,
when it's clear we are the gold standard
for a value that everyone cares about.
The Apple billboards proclaiming privacy,
Meta's sort of turned to privacy.
This is something,
it is clear there is a market for and people want,
and yet we have to squeeze ourselves into a nonprofit shape
that is not quite conducive
for the high availability realtime tech
we are maintaining and developing,
simply because there's no real way to make a profit
in the tech industry given the surveillance incentives.
So I think that introduces
why we focus on that paradigm so much
and kind of the distinction practically speaking
between how we use it
and how it might be used in different academic discourses.
Well, and I think that is the central question, right?
Is you are presenting a model
of how to do things differently
and beyond just messaging,
but you are providing the messaging, a messaging model
and then maybe other people can follow that.
But I think what's interesting
and why I'm glad to talk to you
is that it's not just hypothetical,
you are currently making it work
and you genuinely have this sense that things are changing.
You wrote in an essay for WIRED
that we published a couple of weeks ago
that next year will be big tech's finale.
Do you feel like we are at a sea change?
Do you feel like we are that close
to things unwinding for big tech or, you know,
or is that a little more hyperbolic,
and it takes a few more steps to get there?
Yeah, well, look, y'all asked me in the middle of 2024
to write a prediction for 2025, like, you know,
past data does not necessarily predict a volatile future.
And what I really wanted to write was,
you know, like I sat down and I thought,
well, what is there to be optimistic about right now, right?
Like, how do I actually
sort of sincerely communicate optimism?
And what I came to is, well, you know,
let's write something that is half manifestation
and half optimism as invitation, right?
That's not a prediction where we say,
oh, it looks like, you know, probability approaching one,
we can all sit back
and let the gears of the world take shape.
No, this is an invitation to action
and to marshal the forces that I do truly see fomenting
in which I have never seen a broader
and more heterogeneous coalition
looking at concentration of power in tech,
looking at the deep dangers of pooling vast dossiers
of surveillance data on every person on earth
in the hands of a few companies,
looking at these choke points and single points of failure
where, you know, one bug in a kernel driver
can take down core infrastructure globally across medicine,
transportation, governments, you know, what have you.
Like, none of those things are chill or nice.
No one is championing them,
certainly not in the way they were 10 years ago
or even five years ago,
so, you know, the first step to a strategy for change
is an honest map, and that's what I think we have now.
No one is pretending this is okay.
And so that is the opportunity
and the invitation for change.
And again, to take things like Signal,
which is, you know, we're existing as a nonprofit.
We exist on donations.
We persevere at being kind of the keystone species
in the tech ecosystem around privacy and tech
that is rights preserving and works with integrity,
but we're not looking to be alone, right?
We don't wanna be the outlier that proves the rule.
We want to be a new set of rules
leading the way to a much more open
and diverse tech ecosystem that,
again, isn't reliant on like,
you know, 5 companies and 15 guys
in a paradigm that is very, very stale,
and ultimately not healthy for the world or the future.
Well, and I wanna talk about Signal and how it is,
you know, it is ubiquitous,
especially among the privacy conscious.
But that ubiquity is expensive, right?
Signal cost about $36 million to run last year,
and about $50 million now, which is a pretty big jump.
About two thirds of that,
by my understanding at least in 2023,
came from contributions.
Is that?
[Meredith] It all came from contributions.
[Brian] All, okay.
[Meredith] It's fully donor supported.
[Brian] So as you grow,
things get more expensive, you need more donor support
to fund all of this, these features, these servers,
how sustainable is that?
Do you feel like you're on a flywheel right now
where as you grow the donations grow in kind
or does there need to be another model that steps in
that makes you sustainable for the long term?
Well, I mean, it's kind of a,
it's a bit of a tricky question
because it's sustainable right now
because there are a lot of people who value privacy
and who get why what we're doing is so critical, right?
Like, one thing to take home, we couldn't make Signal again
if the heartbeat stopped, right?
If Signal went away for a year, you can't just restart
a globally interconnected communications network
where you can pick up your phone
and know your friend also has it installed on their phone.
That network effect is huge, it is not replicable.
And when Signal came up over 10 years ago
when it emerged as two apps, then one app,
and the sort of history of Signal
and the masterful work of its founder, Moxie Marlinspike,
who basically carried this thing through for a decade,
it was coming up in an environment
where its competition was web clients.
Like remember Jabber and Google Chat, right?
It was coming up in an environment
where less than 30% of human beings had a smartphone.
Most people left their desktop at their work
when they went home.
A vastly, dramatically different ecosystem
than what we have today where we have incumbent monopolies
who've saturated the market, right?
And so we're talking about preserving something
that grew through the cracks into sort of a mighty tree,
let's use that metaphor,
and needs to continue to be fed and watered
because without that, you know,
we can't just snap our fingers and introduce tech
that will actually serve the purpose of Signal, right?
Because everyone has to use Signal for you to use it.
You can be the most, you know,
the most virtuosic cryptographer
who created the most elegant
and beautiful private communications technology.
You go out there, you're like, hey, use it, right?
But if your friend doesn't use it,
it doesn't matter to you, right?
So that network effect is hugely important
and the reputation that Signal has developed
because of the care and attention
and sometimes agitation of the hacker and InfoSec community
is equally important, 'cause we develop all in the open,
our cryptographic implementations and our core technologies
are open source, every change to our repo gets scrutinized
by, like, communities of dudes
who are, like, trained spotters for Signal code pushes, right?
And that like, we love them for it
because they will find a little bug and they will report it,
so it's really making good on the promise
that many eyes make more secure code.
And again, that's, you know, similar to CUDA
or some of the other standards and infrastructures
you might hear about in other sectors like AI,
what we're talking about is a community of practice,
a community of labor norms, and a sort of,
you know, standardized understanding of tech
that has come through this, you know,
Signal as infrastructure over about 10 years,
and again, can't just be recreated, right?
I can publish anything on GitHub,
but I can't force anyone to read it or care about it, right?
So these things are very, very precious
and they need to be maintained.
They need to be supported,
and right now we do have support.
We're looking to diversify that,
you know, now is a very critical moment
for privacy-preserving communications.
You know, the more geopolitically volatile the world is,
the more it's important for us to be able to have real talk
that is not interrupted by surveillance.
And so I think, you know,
we're working on growing the base of supporters
and thinking through are there models
that can marshal the type of money we need, right?
It's absurd that we are core infrastructure,
you know, for everything from human rights to defense,
and then you see these companies
that are basically white labeling some like AI API
getting billion dollar valuations,
and we can't figure out how to like make money on that
to sustain a thing that if it went away
would literally change the fates of nations,
like y'all, right?
Well, and can I, you wrote also,
I know you wrote it a little bit ago, but about this idea
of this nonprofit critical infrastructure ecosystem, right?
So not just Signal, but other aspects
where we are too reliant on private companies.
Where's the, where's the nonprofit version of that?
And as part of that,
you mentioned a need for state capital to get involved
at some point.
Is that when you talk about
alternative and diversifying funding,
is that something that Signal's looking at?
And are there any concerns with that too
about taking money from governments
that you're also protecting people from with,
you know?
[Meredith] Yeah, none of this is simple, friend.
Yeah, oh yeah, no for sure.
Like what I'm looking at
is there's the type of capital we need,
how do we get it?
Not how do we partner with states, absolutely not,
not how do we sort of enter into shady agreements.
I think there is, you know,
the issue here is not nonprofit or for-profit.
Nonprofit is a tax status,
it's not actually like comfortable or amenable.
Like we are constantly, you know,
I feel like we're like fitting inside a dollhouse.
We're like a massive, you know, like tech,
not massive in size, but like, we have to maintain,
we have to act and function like a tech company.
We are for all intents and purposes a tech company
with a nonprofit tax structure,
because again, that means that my board
isn't gonna go to Davos
or isn't gonna be pushed to have sort of,
you know, a profit motive come back to me
and be like, I love you Meredith.
I love this privacy thing real cool, real nice,
but ultimately I have a fiduciary duty
and just as the board sold Twitter,
our board is gonna have to strip some privacy protections
because you have to be making some revenue, right?
That's the risk that we are warding against
with the nonprofit structure.
We are not celebrating nonprofit
as some sort of alternative to private enterprise,
and those are the two,
you know, the cage match there, right?
This is a structure that we have to have
if we're gonna preserve our obsessive mission on privacy,
because otherwise it would be whittled away
because those dudes on my board
ultimately have no choice or they want an AI strategy
or they're, you know, whatever, whatever.
There's a lot of ways we could make money
that we simply are not gonna make money doing
because we're privacy preserving.
So the call here is not everyone come be a nonprofit.
Nonprofit is not the perfect form.
It is a prophylactic against the incentives,
the perverse incentives that define the industry we're in.
And that, you know, how do we create better incentives?
How do we marshal things like state capital, which is,
you know, let's be clear
that's going to tech all day every day.
If you look at the subsidies,
if you look at the research history,
if you look at the massive cloud contracts
that are now ultimately the pipeline of revenue
for most of these companies, or a big pipeline of revenue,
like there's no state capital involved here.
I mean, you look at their government sales departments
in any one of these large companies,
they're, you know, [indistinct] an order of magnitude
bigger than Signal's team, right?
So we also have to stop with these bright lines,
as if there's some, you know,
clear, I guess, difference between sort of private industry
and state control, what have you, yeah.
So I think the question then is how do we marshal that,
you know,
how do we marshal state capital without state control?
How do you get long-term commitments, guarantees?
How do you do things like the Sovereign Tech Agency
in Germany, which was set up ultimately
to take government appropriations,
but then has a lot of clear boundaries
where they independently allocate those funds
outside of state control.
So we're, you know, we're interested there,
but I had to, I had to kind of go into a long digression,
I think, to explain what we're actually talking about,
[Brian] It was a good one.
[Meredith] which is not a clean schematic, not a clean divide
between for-profit and non-profit.
[Brian] Clean is boring.
I think that was really illuminating.
[Meredith] Yeah, clean is boring. Yeah, right?
While we're on the topic of governments,
we obviously had an election here in the US recently.
[Meredith] Donald Trump, yeah. What, when?
[Brian] It's true, it's true.
Read about it on wired.com.
[Meredith] Is that a website?
[Brian] It is, it is.
So, but I wanna talk to you about it.
I think, well, the Trump administration
the last go-round was hostile to encryption,
and, you know, several governments worldwide are.
You've got Kash Patel coming in potentially as FBI director.
How does this affect Signal, its users?
Like what is Signal's role in this moment?
How are you feeling about the incoming administration,
both for your users and for Signal itself?
I mean, look, Signal knows who we are.
Signal will continue being Signal.
Signal has one thing we do and we do it really well
and we do it pretty obsessively, and that is provide
truly private communications infrastructure
to everyone everywhere globally, full stop.
We're not changing.
That's how I feel.
[audience applauds]
Is Signal going to stay in the United States
no matter what in terms of where it's based?
I mean, we're a global, we're a global operation
in so far as we, you know,
people use Signal everywhere around the world,
across the globe.
We're an open source project.
We have people who, you know,
we don't routinely take third party commits,
and when we do, we scrutinize them really closely, right?
But we do have people who sort of,
you know, contribute to Signal globally.
We're in a US jurisdiction and we don't, you know,
we don't have plans to share on that, but, you know,
a globally interconnected communication network
has to touch as many people as possible,
so you can make that call from like Bangkok
to Kingston, Jamaica,
and then make it from Kingston to London
and then London to New York.
So, you know, we don't have any plans to move.
We are focused on markets around the globe
and we are of course aware of the way that,
you know, geopolitical volatility
informs our ability to function in one or another market.
We've been fighting things like the Online Safety Bill,
now Act, in the UK, which threatened,
and still sort of threatens, though much more in the distance,
to mandate back doors in end-to-end encrypted communication,
you know, a disastrous cybersecurity vulnerability
that would ultimately undermine Signal anywhere it's used
irrespective of jurisdiction.
We're fighting Chat Control,
which is the European version of that law.
It's sort of like this, it's this zombie that honestly,
you know, since 1976 when Diffie and Hellman
tried to publish their paper
introducing public-key cryptography
and the US government freaked out
and tried to suppress publication of that paper
has been this sort of, you know,
ever-present specter behind the idea
that there could be private communication networks
or other privacy preserving technology
that would protect interpersonal communications
and activities,
not simply government or corporate activities.
And this is a tension that threads
through the crypto wars in the nineties
with the specter of evil
that encryption is said to be protecting,
changing over time, you know, it's terrorists,
it's people who harm children, it's what have you.
But with the goal remaining steadfastly focused
ultimately on destroying the only technology we have
to preserve the human right of privacy
in an age of unprecedented mass surveillance.
So we fight that wherever it emerges.
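The 1976 idea she references is compact enough to sketch. Below is a toy Diffie-Hellman exchange in Python with deliberately tiny, insecure numbers, purely to illustrate the insight that so alarmed the government: two parties can agree on a shared secret without ever sending it over the wire. This is a pedagogical sketch, not anything Signal uses (Signal's protocol builds on elliptic-curve variants of this idea).

```python
# Toy Diffie-Hellman key exchange. The numbers are deliberately tiny and
# insecure; real deployments use large primes or elliptic curves.
p, g = 23, 5              # public values: prime modulus and generator

alice_secret = 6          # Alice's private exponent, never transmitted
bob_secret = 15           # Bob's private exponent, never transmitted

# Each party sends only their public value over the (monitored) wire.
alice_public = pow(g, alice_secret, p)   # g^a mod p
bob_public = pow(g, bob_secret, p)       # g^b mod p

# Each side combines the other's public value with their own secret.
alice_shared = pow(bob_public, alice_secret, p)   # (g^b)^a mod p
bob_shared = pow(alice_public, bob_secret, p)     # (g^a)^b mod p

assert alice_shared == bob_shared  # identical secret, never sent anywhere
print(alice_shared)                # 2
```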
With the time we have left,
I want to talk a little bit about the future for Signal
on a product level,
which sort of feels less grand than,
you know, global fights against [indistinct],
but is fundamentally what it is: it is Signal itself.
What I wanna ask is, as you grow out features you have,
it feels more complicated
because you have to build them out in a secure way
whereas your competitors do not,
so it's more expensive, more time-consuming,
harder to keep parity in certain ways.
What do you see as future, as Signal's future as a product?
What's next on the roadmap?
What do you care about keeping parity on?
What do you not care about?
Which is really a way to ask about AI,
'cause no one has yet, I get to be the first person today.
Everyone's gonna do it, so I just, I get to do it here.
What does Signal need to do to keep up with and expand
and get to a billion users that it's not doing right now?
Well, we have to provide a product
that when anyone picks it up, looks and feels normal,
that works, that when you wanna send
a cat dancing on a birthday cake to your auntie thread,
you can go to GIF search
and type in "cat birthday cake dance"
and the right little thing will pop up
and you send it to your aunties, right?
It has to do that.
That is fundamental,
as fundamental as ensuring that our crypto systems
are robust and not susceptible to attack.
Because again, if your friends don't use Signal,
it doesn't matter how ideologically committed
to privacy you are, you don't use Signal either,
unless you know, I don't know,
you open it to send a Note to Self,
but Note to Self is a great feature, by the way.
So, you know, what we are aiming for
is a product that just works,
'cause no one picks up their phone to use Signal.
They pick up their phone to talk to their partner
or talk to their dad or, you know,
the actual imperative of messaging
is human relationships and connection,
and so honoring those as a first order
is what Signal has to do
on the same level as ensuring privacy preservation.
And this is where I think,
you know, the early Signal team, Moxie, Tyler,
some of the people who are really at the founding
of this product got it right where very few people did.
'Cause there was a whole crop of sort of privacy-preserving,
command-line-looking, PGP-looking, you know,
technology that no one used.
I was a person trying as hard as I could
to get anyone I know
to like email me back with PGP and crickets, right?
Not because people didn't care,
but because ultimately when they wanted to talk to me,
they didn't wanna do that.
And so from its inception,
Signal has really focused on usability, on design,
and on being extremely stringent about making an app
that looked and felt normative.
And so we will continue doing that and I think,
you know, that means that we're doing something, you know,
you should see in the next few months or so: encrypted backups,
which is a, you know, that's a gap, you know,
so you can back things up
and if your phone falls in a river,
everything isn't on device and lost, right,
which is a, you know, a thing that we wanna do.
It's obviously much harder to do in a privacy-preserving way,
where we have full deniability around,
you know, our ability to, you know, see people's data;
backups are tricky there,
so we're looking at closing some of those gaps.
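To make the general idea concrete: a client-side encrypted backup means the data is encrypted on the device, under a key derived from something only the user holds, before anything is uploaded, so the server only ever stores opaque ciphertext. Here is a minimal sketch of that pattern using Python's cryptography library; the function names and parameters are illustrative, and this is not Signal's actual backup design.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.scrypt import Scrypt

def encrypt_backup(plaintext: bytes, passphrase: bytes) -> bytes:
    """Encrypt a backup blob on-device; only ciphertext leaves the phone."""
    salt = os.urandom(16)
    # The key is derived from a user-held passphrase; the server never
    # sees the passphrase or the key, so it cannot read the backup.
    key = Scrypt(salt=salt, length=32, n=2**15, r=8, p=1).derive(passphrase)
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return salt + nonce + ciphertext  # this opaque blob is what gets uploaded

def decrypt_backup(blob: bytes, passphrase: bytes) -> bytes:
    """Recover the backup on a new device from the blob plus the passphrase."""
    salt, nonce, ciphertext = blob[:16], blob[16:28], blob[28:]
    key = Scrypt(salt=salt, length=32, n=2**15, r=8, p=1).derive(passphrase)
    return AESGCM(key).decrypt(nonce, ciphertext, None)

blob = encrypt_backup(b"message history", b"correct horse battery staple")
assert decrypt_backup(blob, b"correct horse battery staple") == b"message history"
```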
I don't, you know, AI,
what I see across a lot of these apps right now is a,
I would say almost a glassy-eyed company imperative
that is coming down from the top
to integrate AI into everything.
And I can imagine how the OKRs get written,
you know, it's the year of AI, and then your little VP,
right, is like, "Okay guys, it's the year of AI.
Integrate this into your group level or your product level."
Okay, well then there's somebody here who has to take on,
you know, integrating an AI feature into this,
but like, that's not how good innovation happens, right?
And that's why we have these little prompters,
these little like, "Hey, let me write your mom for you."
And I'm like, "You don't need to write my mom.
Like, stop it," right?
And so I think what we're seeing is like this,
like it's annoying at this point.
It is cluttered, it is swirly and swoopy and unclear
and it doesn't actually solve a problem for me.
The problem it solves is AI takes vast amounts of capital
in the bigger-is-better form we have,
and there is a need for a consumer market fit, right?
And they're trying, right?
They have people who are like,
well, they said, it's the year of AI.
Let's try to shoehorn this into an interface
that ultimately would serve us better if it were cleaner,
would serve us better
if it weren't cluttering up with suggestions
and like little prompts, right?
And so that's the thing Signal's not gonna do.
And Signal doesn't have to, again, 'cause we, you know,
we didn't go out and invest a trillion dollars
in data centers on a bet that this stuff was gonna,
you know, return as a good product feature,
which is another benefit of kind of our model.
So we're laser focused on doing one thing
and doing it very, very, very, very, very well,
and when you know, something comes along
that doesn't serve that we don't have to pick up that mantle
because we're not chasing quarterly returns.
We're not chasing sort of investor approval,
you know, to make sure that we continue with the valuation
that we'll, you know, keep our C-suite employed, right?
And so, you know, we do use a little bit of AI
and it's an on-device face detection model.
If you go to the Face Blur feature,
it works kind of half the time, it's fine,
but you know, again, when we talk about AI right now,
we're talking about these bigger-is-better
sort of transformer models, everything is generative,
and ultimately I think a lot of that is gonna fade away
because it's not proving useful in a consumer context.
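For the curious, the general technique behind a feature like Face Blur is straightforward: run a face detector over the image locally, then blur whatever regions it flags before the photo leaves the device. Below is a rough sketch of that pattern using OpenCV's bundled Haar cascade detector; the file names are placeholders, and this illustrates the approach rather than Signal's actual on-device model.

```python
import cv2

# Load the frontal-face Haar cascade that ships with opencv-python.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

img = cv2.imread("photo.jpg")  # placeholder input path
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Detect candidate face rectangles. Like any detector, this is imperfect:
# it will miss some faces and flag some non-faces.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    # Replace each detected region with a heavy Gaussian blur, in place.
    img[y:y + h, x:x + w] = cv2.GaussianBlur(
        img[y:y + h, x:x + w], (51, 51), 0)

cv2.imwrite("photo_blurred.jpg", img)
```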
Meredith, thank you so much.
That's all we have time for.
I really enjoyed talking with you.
I really appreciate it.
Thank you. [audience applauds]