When Brandy Zadrozny, a librarian-turned-journalist, began covering internet misinformation and extremism in 2013, she had no idea that the twisted ideologies that festered in online spaces would so thoroughly permeate American culture.
Since joining NBC five years later, Zadrozny has documented the mainstreaming of fringe internet movements, from fake alternative cures to anti-vaccine groups. She’s broken stories at the local and national levels, like how a Facebook group rife with misinformation became the de facto news source in a small Pennsylvania town, and how the Jan. 6 insurrection triggered the defection of a QAnon believer. In doing so, she also tracked the rise of white supremacists from obscure corners of the web to the White House.
“When it was just about the misogynistic side of the internet, or the trollish side of the internet, it always ruined people’s lives,” Zadrozny, now a senior reporter, said. “But the fact that it’s become so mainstream is pretty shocking.”
Today, newsrooms across the country, from Vice to NPR, are investing in the rapidly growing disinformation beat. It’s one of the areas where outlets are expanding, spurring journalism classes and training workshops to teach young journalists to identify and cover fake news.
Zadrozny spoke with NBCU Academy about the skyrocketing popularity of a beat she has covered since its inception, the most challenging stories she’s reported on and the need for local reporting. The following discussion has been lightly edited for clarity and brevity.
How did you begin reporting on disinformation and extremism?
I didn’t have a beat when I began at the Daily Beast, so I had to find stories no one was reporting on. A lot of reporting between 2013 and 2016 was on weird parts of the internet, like the manosphere [a network of online communities that promulgates misogynistic beliefs] and pickup artists. People who were sort of internet weirdos then became politically important because they started doing internet shenanigans with national politics. A lot of the “alt-right” disinformation sites that were founded by people like Mike Cernovich [a far-right blogger best known for popularizing conspiracy theories like “pizzagate”] and people who ended up being affiliated with January 6 were my beat early on. NBC recognized before other outlets that this was going to be an important beat: “internet weirdos” was about more than just weird things on the internet.
How do you find story ideas?
I have a bunch of academic sources, people who were studying this from early on at universities like Harvard and Stanford. I want to know what they were looking at in terms of the qualitative and quantitative things happening on the internet. And they’ll often put me on stories or just give me general background I can use.
A lot of other stories come from tips. A big part of my reporting is about regular people and how the internet has affected their lives. I usually receive tips because I’m talking on TV or tweeting about a wider issue, and regular people say, “Hey, that thing is happening to me,” and they get in touch. Often, I’ll get a call from somebody who has a loved one who’s really into QAnon, or about a sister who dove down a rabbit hole of pregnancy misinformation and lost a baby. When I “break” stories, like the recent attacks from far-right activists on children’s hospitals that treat trans youth, I’m more interested in exclusives than scoops. I want to highlight a bigger problem through a story that no one else can get.
What are some of the most challenging stories you’ve reported on?
Some stories are technically challenging — there’s a piece of misinformation or disinformation out there: Where does it come from? We get to use all our fancy tools and the knowledge we’ve amassed about how people make things on the internet and the trails they leave. For example, there’s the Hunter Biden laptop story in which we discovered a fake “intelligence” report and traced the laptop’s origins to an anti-Chinese Communist Party operative.
Then there’s the emotionally taxing part of the job, like telling the story of the woman who lost her baby because of online disinformation or the mother whose son died by suicide on a livestream. Spending weeks, months, sometimes years with these people, connecting with them over their losses — that’s really hard.
What’s the toughest part of your job?
For a while, it was dealing with harassment and abuse. Our beat focuses on media manipulators, so I’m enemy number one for a lot of them. Now, it’s mostly the sheer volume of the work. Extremism has never been a bigger problem: We had all of those mass shootings in 2019; election disinformation, Covid and the anti-vax movement in 2020; and now we’re waiting for the fallout of the next disinformation campaign. It feels like the break is never coming.
How do you protect your privacy and take care of your mental health?
I’ve learned to put space between my personal life and work. I practice really good digital security. Most people don’t know much about my real life, and I never post photos of my children online. During the summer of mass shootings, there was a moment I had to change the way I reported. I’d been watching all of these livestreams of mass shootings, and it really started to affect my mental health. Now, I’m very conscious about when I have to log on and view extremist content for work. I have a special place in my home for it. I have this thing I say to myself before I watch or report on this stuff, which is basically like, “This is not happening to me.”
How do you feel about the rising popularity of your beat?
It’s definitely become a hot beat. In a way, I think that’s great. But I fear that people will become fatigued when we throw the words “misinformation” and “disinformation” around.
For years, we would watch these extremist websites, but we wouldn’t report on them unless something bad happened. A good example is Kiwi Farms [an online forum known for being an epicenter of anti-trans harassment campaigns]. That site had been well known on our beat for years, but we never wrote about it because it would attract harassment toward the people already targeted, and probably help the site recruit new followers. When an activist was close to taking the site down, that’s when it became newsworthy and appropriate to cover, and that coverage, by my colleagues Kat Tenbarge and Ben Collins, had a tangible positive impact.
I hope the next step on this beat is more solutions-driven stories, like this New York Times feature on TikTok creators debunking rampant health misinformation and this NBC profile on a librarian who fought back in court against conservative activists.
What kind of advice do you have for aspiring journalists hoping to cover the internet and disinformation?
Some of the most interesting work in this space right now is being done by the verification community. Following the OSINT [open source intelligence] community on Twitter, you get a lot of cool tips and tools on how to verify information.
What are some stories we should watch out for in 2023 and beyond?
What we saw from 2010 to 2016 was the mainstreaming of troll tactics and misinformation campaigns. We saw it happen at a national level with the Trump campaign. The trend we’re seeing now is the localization of misinformation, disinformation and extremism. We’re seeing local politicians adopt these tactics. There’s anti-CRT (critical race theory) book banning, and extremists across the country are winning school board seats and local elections — that’s what’s next. We can’t cover all of these things at NBC. We need local reporters who know how to do this work and have the support to tell these stories.