A violent mix of white supremacists, counter-protesters and law enforcement in riot gear took over the small Virginia city of Charlottesville six months ago, bringing questions about race and racism to the surface and exposing a number of societal divisions.
“I think for a lot of Americans, they did not realize that there are so many of these people until they saw Charlottesville,” said Heidi Beirich, the director of the Intelligence Project at the Southern Poverty Law Center (SPLC).
The rally, organized by white nationalists under the banner "Unite the Right," started on Friday, Aug. 11, and carried into the next day, with the imagery sparking questions and concerns nationwide.
“Fringe elements of society were rising up and getting emboldened and so it all sort of came together in Charlottesville. And the lasting images that people have of young white men talking about how ‘The Jews will not replace us,’ the violence, the sheer volume of extremists that showed up were a wake-up call to other Americans across the country,” said Oren Segal, the director of the Anti-Defamation League's Center on Extremism.
Reverberations from the rally and the ensuing violence -- which included the death of a woman deliberately run down by a car -- continued for months, including online.
In the months since the violence, social media companies and other internet-based operations worked to curtail the use of their platforms by individuals associated with white supremacist and other hate groups, experts said.
“I’ve heard other officials at technology companies say Charlottesville was so shocking and so in-your-face that they realized they didn’t want to play a role in furthering it,” Beirich told ABC News.
One example came days after a motorist fatally rammed counter-protester Heather Heyer in Charlottesville, when Google notified the neo-Nazi news website the Daily Stormer that it had to find a new web host.
Social media sites, including Twitter, took similar actions. Twitter updated its rules in December and removed users whose content violated them.
This led to the banning of a number of individuals associated with neo-Nazi or white supremacist groups, according to the ADL.
Segal said the moves by tech companies, as well as other issues like infighting among members of the hate groups, make it “not surprising that the movement is not as coherent as they'd hope it would be.”
“The harmony in the movement was pretty short-lived,” he said.
“But they are still around, and they are still trying to mainstream their message. They're still trying to find ways to amplify their narratives and their voices,” he said.
President Donald Trump’s immediate reaction to the violence in Charlottesville -- silence at first, followed by a statement that condemned the violence “on both sides” of the protests -- drew criticism of the administration, but Trump’s statements were not the only official response to come out of it.
In September, Congress passed a joint resolution "condemning the violence and domestic terrorist attack" in Charlottesville, which was later signed by Trump. The resolution rejected white nationalists, members of the Ku Klux Klan, neo-Nazis and other hate groups.
“My hope is that the resolution will stand the test of time ... that for the first time ever we've got this demonstrable statement that this stuff is bad and everybody agrees,” Beirich said.