How to Fight the Resurgence of Neo-Nazism in the Age of the Internet

Two women who have researched neo-Nazis on the internet for years share why hate groups are now being pushed onto the "dark web" and how "free speech" needs to be redefined for the digital age.

Sep 5 2017, 7:29pm

Photo by Anadolu Agency via Getty Images

Joan Donovan, a research lead at the Data & Society Research Institute, spent the last two years dutifully reading the recently shuttered message board Stormfront. The neo-Nazi website, first established in 1995, claimed 300,000 registered members and, according to comScore, received a few hundred thousand visitors a month before its domain registrar shut it down for violating its terms of service.

Now that the site's plug has been pulled, Donovan says that every single link to Stormfront on the web is broken. "Twenty years of hyperlinking Stormfront—none of that goes to Stormfront anymore."

Though she doesn't believe the community will vanish—she jokes, "Maybe it'll end up as Stormfront2017.ru"—she does think corporate denials of service are "really interesting political tactics to stop the spread of white supremacists online."


While studying the favorite virtual haunt of Nazi murderers, Donovan would log on to Stormfront for four hours every day, "dipping her toes in" during breakfast and returning at night to see how different conversation threads had evolved. "In the lead up to the election, I just kept telling people, 'He's going to win,'" she says. "No one believed me."

According to Donovan, the white supremacist community found its inspiration in Trump long before its members mobilized en masse in Charlottesville last month. The first time Donovan remembers seeing the message boards light up was back in June of 2015, when then-candidate Trump said that Mexico was sending rapists and criminals to America.

"There was a flurry of activity on Stormfront—people were shocked that he would say this stuff and no one was holding him accountable," says Donovan. "People were saying, 'If I said that, I would be fired. If I said that, my family would hate me. I could never say that in public.'" That was the moment, she says, white supremacists realized that Trump was on their side.

The love affair hasn't really abated. Sure, Trump has let down white supremacists in myriad ways, but they were all very "chatty" about the speech he gave on Charlottesville, Donovan tells Broadly. "They've been watching what he's been saying about them very closely, trying to understand where they stand with him." For his part, she says, Trump seems to understand exactly what the far right wants to hear. "I think that he's been informed of the concerns of the people on the far right and he understands—and is very current with—the concerns of the community," she says. She cites Trump's distribution of far-right memes—the Hillary photo with the Star of David, the GIF of Trump wrestling the CNN logo to the ground—as evidence.

Since the election, and especially since Charlottesville, the far-right community has garnered greater exposure, but researchers like Donovan have been trying to warn the public about its online tactics for years. In 2017, the Data & Society Research Institute released a study examining how social media platforms aided disinformation campaigns like Pizzagate. Donovan doesn't think platforms like Facebook will ban the alt-right without public pressure. "It's not outside the purview of these companies to draw the final line [by denying these groups access to their platforms], but there's been a lot of foot-dragging," she says.

Jessie Daniels, a professor at CUNY who has been studying white supremacy for decades, agrees. "I think the spate of responses from social-media platforms to these hate groups online is a positive development in an otherwise pretty bleak political landscape in the US," she says. "Sure, it can be a game of whack-a-mole, but I think it's important to keep whacking."

Over the years, Daniels has watched as social-media platforms increased the reach of groups like the KKK and Stormfront. "[Stormfront founder] Don Black and David Duke, they were computer geeks early on," she says. "You needed some tech skills to be a white supremacist online in the 90s. But [now] with the growth of Facebook—that takes no skill."

White supremacist demonstrators at the Charlottesville "Unite the Right" rally. Photo by Anthony Crider via Wikimedia Commons

Daniels believes pushing neo-Nazi sites onto the "dark web" is a worthwhile goal. "If it's harder—if [potential members] have to download the Tor browser and figure out all of that—that's going to be a barrier to a significant number of people," she says.

Daniels adds that being able to easily find Stormfront is what helped make Dylann Roof a murderer. (Her view is backed up by a Southern Poverty Law Center study.) After having a conversation with friends about black-on-white crime, Roof typed the phrase into Google, and the search engine—which autocompletes "black on w-" to "black on white crime"—led him to the white supremacist message board. "It wasn't that the Internet was recruiting him, it's that he sat down and typed in what he was interested in and Google helped him find other white supremacists who were there waiting to tell him, 'Yeah, your instincts are right,'" says Daniels.

Devoting years to scouring some of the darkest corners of the web hasn't exactly made Donovan and Daniels free-speech advocates. Both think the high-minded debate we are having over First Amendment protections for hate speech misunderstands the way white supremacist groups operate online. "It's easy to say everyone has a right to free speech, but once you look at the content of the speech, and the content of what they're saying in these spaces, it's hard not to react viscerally to the calls for violence and the endless parade of violent memes," says Donovan.

"What most people don't realize is that the 'absolutist' version of free speech is not supported by the US Supreme Court, which has ruled that a burning cross is not protected speech," adds Daniels. "The question becomes, what is a burning cross in the digital age?"

"What is a burning cross in the digital age?"

While some on the left have argued that the US should re-examine "free speech absolutism," others have rightly pointed out that any attempts to weaken the First Amendment would only embolden the current administration to go after its adversaries.

"When I first started studying hate groups, I'd say that I had the same view of freedom of speech as most liberals, which is that it is a constitutional right and we should defend even the most heinous speech," Daniels says. "Today, I think that if any reasonable person, even a staunch First Amendment advocate, looked at what goes on at these white supremacist sites, they would blanch. They would just be appalled and say, 'There's just no place for this in civil society. There's no right for someone to call for the genocide of an entire people. That's not what the First Amendment was meant to protect.'"

For a long time, Donovan says, white supremacist groups have been able to flourish because the media enforced an "unwritten code of strategic silence" around them. If the groups committed a crime egregious enough that the public needed to know about it, the media would acknowledge it, but for the most part it was thought best not to give them too much attention, lest the publicity help them recruit new members.

When the far right was given airtime in the 90s, the coverage was usually sensationalist and self-serving, Daniels says. She remembers when members of white-supremacist groups were a mainstay on tabloid TV shows like Geraldo, Sally, Oprah, and The Phil Donahue Show. In one episode of Geraldo, a white supremacist group incited an on-air brawl, clocking host Geraldo Rivera in the nose. The episode was a ratings boost, Daniels says.

"When I interviewed the producers of those shows, they said they saw themselves 'doing a public service' by 'shedding light' on these groups," Daniels says. "However, my reading of white supremacist publications during the time suggested the groups exploited their appearances to gain a measure of legitimacy."

Today, we don't need talk show hosts to legitimize the far right, and now, hate groups simply can't be brushed aside. "I'm very alarmed by what I see happening," says Daniels. "The movement is definitely growing in a way I haven't seen in the 25 years I've been studying it...I shudder to think what lies ahead."