New Rules

John Wihbey ’98 works at the intersection of media and technology, with increasing focus on AI’s power to shape and influence media and communications. His new book, Governing Babel, places these high-stakes questions in the context of history with an eye toward new rules and smart regulation.

John Wihbey ’98 is director of the AI-Media Strategies Lab and associate professor at Northeastern University and is also cofounder of Northeastern’s Institute for Information, the Internet, and Democracy and a faculty researcher at the Ethics Institute. Photo by Jared Leeds.

Bowdoin: How do you think journalism has responded to AI?

Wihbey: The narrative is that the internet happened to us—and we don’t want it to happen to us again. There was the static internet, then the interactive web, and then social and mobile. This is the next phase. Maybe it’s truly revolutionary. At the same time, people are concerned that the technologies are not ready to render the accuracy, fidelity, and trust you need. They’re not rule-based like a spreadsheet; they’re probabilistic predictive models that can make mistakes. They also have a tendency to kind of shave truth.

Bowdoin: You put the rise of social media into historical context. Tell me how you see all this.

Wihbey: I try to put social media into what I refer to as the 100-year journey. That begins at the end of World War I, when courts start to define the First Amendment. We had a First Amendment through the eighteenth and nineteenth centuries, but it really didn’t have specific meaning. It was often left to local courts and leadership to decide. So, you have that, then you have the rise of broadcast technologies, and through the FCC in 1934, the first inkling of regulation.

I think there’s a big question about what the United States is going to do. My thesis is that there are some deep principles that might allow us to put some regulation on the books. I think there’s a chance that a presidential candidate or candidates might run on something like that. Nobody quite has. It polls well to say you’re going to try to get social media under control. Whether you can is another question.

Bowdoin: Why do you think the US is so slow or so silent?

Wihbey: We’ve always had a libertarian ethos around regulating speech and media in any way. We also, in 1996, passed a law allowing websites—later, platforms—that host user-generated content to avoid liability for content they host but did not produce. And we’re sort of stuck in the ’90s. We have not gotten the momentum to pass anything that would update these archaic laws.

It’s often said about social media that there just is no remedy. It doesn’t map onto anything in our history. There’s no way we could imagine something that would be First Amendment-respecting but also bring things roughly into line with broadly shared norms. I say, most of the past moments where we’ve made big decisions about speech and the public sphere were moments of invention and creative response to a new set of circumstances. I believe we could do it again. You see a lot of states moving against deceptive AI. The states are laboratories of democracy, and there’s a lot of action there.

Bowdoin: You said there’s a tendency to think of this as a young person’s world, but this is not in fact their revolution. What does that mean?

Wihbey: The social media revolution in the early 2000s was largely youth driven, whether by founders like Zuckerberg or by early adopters. It’s not clear that AI is a youth revolution. It seems more that a managerial class is adopting these technologies. Maybe it’s just an anxiety about where the job market is going. I think there are media narratives that are scaring a lot of young people: there are going to be no entry-level jobs, you’re going to be replaced, there’s going to be no work. I hope that students will be users but also bring a degree of skepticism so we don’t give up important things—cultural values, great books and great ideas and literature and art and music and history. But I hope they recognize there’s tremendous power in these technologies to help us synthesize large amounts of information, to blend ideas, to give us the basis of new ideas. Generative AI models are fantastic for stress-testing ideas and providing critical counterpoints. This is the ideal of the liberal arts education, actually—to sort of hold contradictory things in one’s mind and still proceed.

Bowdoin: What gives you optimism?

Wihbey: The story of our country is a story of dealing with new technologies and making mistakes but ultimately figuring out rules of the road to help us grapple with harms and disparities and inequities. If we look back at history, we see it often takes a while for that to happen. But if we keep our institutions strong and healthy—public institutions but also private and cultural ones—we have a strong foundation for figuring out these problems. That’s no guarantee that in the short term we will, but I think we stand a good chance of coming up with reasonable norms and rules if we believe in the power of human beings to come together collectively, to deliberate, and to try to come to some rough running consensus about how to help humans flourish.


Bowdoin Magazine Winter 2026
This story first appeared in the Winter 2026 issue of Bowdoin Magazine. Manage your subscription and see other stories from the magazine on the Bowdoin Magazine website.