In the past, if you wanted to change the world, you had to pass a law or start a war. Now you create a hashtag.
Ethan Zuckerman studies how people change the world, or attempt to, by using social media or other technological means. As director of the Center for Civic Media at MIT and an associate professor at the MIT Media Lab, he tries to help his students make sense of these issues. Zuckerman is also writing a book about civic engagement during a time when we have a lot less trust in institutions—government, businesses, banks, and so on.
Maybe that lack of trust is reasonable. After all, we’ve spent the last decade-plus slowly turning our data over to large corporations like Facebook and Google without quite realizing we were doing it.
Zuckerman knows what it’s like to build technology that pisses a lot of people off. Back in the 1990s he created what became one of the most hated objects on the internet: the pop-up ad. The aim was to show an ad on a web page without making it look as though the advertiser necessarily endorsed the content on the page. “Our intentions,” he later wrote in an apology to the internet at large, “were good.”
Zuckerman spoke with MIT Technology Review about how social media started controlling us rather than the other way around.
How are people using technology—rather than, say, lobbying for laws to be passed—to force change in new ways?
We used to make change mostly using law as our primary lever. Now we use the legal lever less; we use the levers of norms, markets, and technology more often. #MeToo is an example of a norms-based campaign. It’s basically saying, “We’re going to challenge how people talk about sexual assault and sexual harassment.” And once we change that norm, there are other legal pieces, market pieces, that’ll come into play. But at its heart it’s trying to change how we have certain conversations.
The point in all of these is that if you can’t get social change done through the traditional model of civics, there is a whole new set of tools, and people are starting to learn how to use these things.
But social networks like Facebook and Twitter control, or at least direct, the information we see by using algorithms to filter what we see in our feed. You worked with two colleagues—Chelsea Barabas of MIT’s Center for Civic Media and Neha Narula at the Media Lab—to build a tool called Gobo that lets people aggregate and filter their feeds on their own. Why?
What this is meant to do is to say, “Look, it’s really a mistake to give one or two companies control over our digital public sphere.” Instead, we need competing platforms. We’re trying to make the case that you want those different social networks because you want more control over the filters that determine what you see and what you don’t see.
If we need competing platforms, we need tools that would let us use those competing platforms. Gobo is one of those tools. Gobo is an aggregator. It aggregates Twitter and the “aggregateable” parts of Facebook—the public pages.
So first we built the aggregator. And then we built the algorithms [that determine which posts you’ll see]. And rather than making them a top-secret black box, we made it an open box where you can reach in and set the sliders and experiment and say, “Oh, I like how this works. Now let me change it this way and see if it works better for me.”
Where we want to get in the longer term is even more of an open box; we built Gobo so that other people can write filters for it.
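The “open box” idea described above can be sketched in code. In this hypothetical illustration, each filter is just an ordinary, inspectable scoring function, and the user’s sliders are plain weights they can read and change. All names and the scoring scheme here are assumptions for illustration only; this is not Gobo’s actual code or API.

```python
# Hypothetical sketch of an "open box" feed: filters are plain functions,
# and the user tunes their weights (the "sliders") directly.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Post:
    text: str
    source: str                     # e.g. "friend", "brand", "news"
    features: dict = field(default_factory=dict)

# A filter is any function scoring a post from 0.0 to 1.0.
FilterFn = Callable[[Post], float]

def seriousness(post: Post) -> float:
    return post.features.get("seriousness", 0.5)

def from_friends(post: Post) -> float:
    return 1.0 if post.source == "friend" else 0.0

class OpenFeed:
    """Aggregated feed whose ranking weights are user-visible and user-set."""

    def __init__(self) -> None:
        self.filters: dict[str, FilterFn] = {}
        self.sliders: dict[str, float] = {}   # the user-facing "sliders"

    def add_filter(self, name: str, fn: FilterFn, weight: float = 1.0) -> None:
        # Third parties could register their own filters here.
        self.filters[name] = fn
        self.sliders[name] = weight

    def score(self, post: Post) -> float:
        # Weighted sum of every registered filter's score.
        return sum(w * self.filters[name](post)
                   for name, w in self.sliders.items())

    def rank(self, posts: list[Post]) -> list[Post]:
        return sorted(posts, key=self.score, reverse=True)

feed = OpenFeed()
feed.add_filter("seriousness", seriousness, weight=0.7)
feed.add_filter("friends", from_friends, weight=0.3)

posts = [
    Post("Quarterly earnings call", "brand", {"seriousness": 0.9}),
    Post("Dinner pics!", "friend", {"seriousness": 0.1}),
]
ranked = feed.rank(posts)           # serious brand post ranks first

# Because the sliders are exposed, the user can retune the feed:
feed.sliders["friends"] = 2.0
reranked = feed.rank(posts)         # now the friend's post ranks first
```

The point of the sketch is that nothing is a black box: the `filters` dict is the extension point where other people could plug in their own filters, and `sliders` is the knob the user experiments with.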
After a lot of criticism related to the ways its news feed filters content, Facebook has started pushing posts from users’ friends and family more and deemphasizing ones from brands. Do you feel this move shows Facebook is actually starting to shift its focus?
I don’t believe that this is changing yet, and I won’t believe it until I see a credible business model based on something other than targeted advertising.
I think that building an internet where we didn’t have to pay for anything, because our attention was going to be the commodity that was traded, is one of the most destructive and shortsighted decisions that we could have made. And I do mean “we,” because I was very much part of that. Until I see Facebook saying, “Look, you’re going to use this as a service and you’re going to pay us for the service,” as opposed to “We’re going to capture your attention and repackage and sell it,” I won’t believe it.
A growing chorus of former Facebook executives and investors has been speaking out against Facebook—saying, for example, that social media is “ripping apart the fabric of how society works.”
I think what’s happening is that some of these people who are stepping out of the really intense “I’ve been in the process of building it” are starting to look at it from the outside and say, “Oh, wow, okay; now I can see the politics from the outside, and I’m not thrilled about what I’ve been associated with.”
We need to figure out how to have those conversations a lot earlier. We should be having those conversations with people who are working at these companies and who are making these design decisions. I want to be having those conversations with my students, because my students are often going to these companies and often find themselves with the opportunity to make those design decisions.
Why is it so hard for anyone who’s not Facebook, Instagram (which is owned by Facebook), Twitter, or Snapchat to compete in this social sphere?
Network effects basically say, “I gotta be on Facebook ’cause everybody I know is on Facebook.” Because Facebook’s so friggin’ big, they get all sorts of advantages that make it very hard to catch up with them. They get more bandwidth, they get cheaper servers.
So when someone shows up as a meaningful competitor, [Facebook is] more likely to buy them and eat them up than they are to actually have to fight them in the marketplace.
You wrote a piece in the Atlantic that suggested a publicly supported social network as a potential solution to social media’s echo-chamber effect. Could this actually happen?
I think it’s wholly unrealistic in the United States. It’s something that could be realistic in Europe, [where] you have a public media culture that accepts the idea that you might want to invest money in people having some basic knowledge about politics, the world, the people around them. I could imagine an innovative European public broadcaster saying, “Maybe we build a social network that’s compatible with other social networks, has algorithms designed to help you tune whether you’re getting news about the world, news about your community, and makes those levers visible and controllable.”