The shooting massacre earlier this month at a Buffalo grocery store has put the spotlight back on social media companies, some of which are under investigation by the attorney general, who called for the industry to “take responsibility” and “be more vigilant” in monitoring content.

The suspect targeted Black shoppers at Tops Friendly Market on Jefferson Avenue and live-streamed his rampage online. He also left behind at least two screeds – more than 700 pages – filled with antisemitism and white supremacist theories. Ten people were killed and three more were injured in the assault.

Twitch, a livestream gaming service, said it removed the live-stream video within two minutes, but that was still enough time for others to copy and share it on other websites.

“Once these are out there, it’s impossible to put the genie back in the bottle,” said Richard McNeil-Willson, a research fellow at Leiden University in the Netherlands, who studies extremism and counter-extremism.

Discord, a chat app with invite-only channels, said the suspect created a private server as his personal diary and, approximately 30 minutes prior to the attack, invited a “small group” to join the server. The Washington Post reported that 15 people accepted the invite. Meanwhile, The Buffalo News first reported, and News 4 independently confirmed, that a former federal agent believed to be from Texas is being investigated to determine whether the individual had prior notice of the attack plans.

‘Wild West’ of hate

The Buffalo shooter revealed that social media websites 4chan and Reddit were his chief sources for information on the antisemitic, anti-immigrant, racist theories he espoused.

As a result, State Attorney General Letitia James is investigating social media platforms that she said are “responsible obviously for feeding this young man all of his hate.”

“The fact that an individual can post detailed plans to commit such an act of hate without consequence, and then stream it for the world to see is bone-chilling and unfathomable,” James said.

As calls get louder for government and big tech to act with more urgency, some experts believe that thwarting these types of attacks is a very challenging and complex balancing act that is further complicated by the lack of a unified path forward.

“But also, the tech companies need to stand up and take responsibility for the fact that we know that there is toxic and potentially deadly activity occurring on their platforms,” said Jonathan Lacey, a former FBI agent who operates a private security consulting firm.

The experts interviewed by News 4 Investigates all agreed that social media platforms need to be held accountable and that government must consider different approaches to address the problems caused by the rise in lone wolf actors and far-right extremism.

“These social media sites are like the Wild West of hate and extremist speech and planning,” Lacey said.  “And law enforcement needs tools to be able to adequately address this threat.”

Who to hold accountable?

Arun Vishwanath, a cybersecurity expert, said a national policy for social media accountability is overdue, but he would like to see action taken against those who amplify the hate speech, too.

“So, it’s not just the platforms,” Vishwanath said. “There are individuals, too. Let’s hold them both accountable and it’s time we do this because there’s going to be another shooting. Like it or not unfortunately it’s the saddest thing to say, but it’s the reality we live in.”

McNeil-Willson co-authored two papers on the Buffalo mass shooting: one examining the lessons the horrific act holds for developing new responses to violent white supremacy, and another analyzing the screeds the shooter left behind.

“Current governmental and private sector responses struggle to adequately deal with such documents due to a combination of factors: over-reliance on takedowns, difficulties in coordination between governments and social media platforms – but also because current approaches exceptionalise the nature of extreme-right violence as disconnected from mainstream discourse,” McNeil-Willson’s paper determined. “Challenging this conceptualization may offer a path to a more effective, less security-centric response against extreme-right violence.”

McNeil-Willson said much of the focus is put on prevention measures aimed at stopping attacks before they happen, something he does not believe is possible.

“I think there should be more concern and more funding and support around broader issues of anti-racism, countering white supremacy in local communities,” he said.

In European countries, he said, there is a broader discussion of community responses to violent extremism and violent extremist ideologies. Programs that focus on community work to divert young people and violent groups away from racist narratives and toward more positive forms of social engagement might be a better use of already thin resources, he said. Social support networks that help communities become more resilient to extremism should be considered over a prevention narrative focused on stopping any one person from becoming involved in violence.

He suggested creating community hubs that can operate in areas where far-right extremism is more prevalent to help younger people develop responses and resiliency through other skills, such as positive engagement through the democratic process.

“It’s better to say why is the majority of the community not engaging in violence or why are the majority of people not engaging in violence and what can that teach us about how we better create resilient communities in response to violent extremism?” McNeil-Willson said.

In addition, he suggests communities and decisionmakers work toward having meaningful discussions around what constitutes hate speech and how to respond to it with a more holistic approach.

“It doesn’t mean banning it, but it means having a serious discussion around the implications of that language and how it gets used,” he said. “An awful lot of far-right language is dressed up often in order to make it palatable to mainstream audiences.”

Lacey said big tech needs to understand that it can strike a balance between privacy and public safety.

“This case is a terrible reminder that these sites are being used for the planning of horrific attacks against people in our communities,” Lacey said. “Something needs to be done.”

Dan Telvock is an award-winning investigative producer and reporter who has been part of the News 4 team since 2018.

Luke Moretti is an award-winning investigative reporter who has been part of the News 4 team since 2002.