Today, Rep. Bennie G. Thompson, chairman of the House Homeland Security Committee, sent letters to executives at tech companies, including Facebook and YouTube, over their response to the Christchurch shooting and how video of the horrific event was able to spread online.

“I was deeply concerned to learn that one of the shooters live-streamed this terror attack on Facebook, and the video was subsequently re-uploaded on Twitter, YouTube, and other platforms,” Thompson said. “This video was widely available on your platforms well after the attack, despite calls from New Zealand authorities to take these videos down.”

“You must do better,” Thompson added. “It is clear from the recent pattern of horrific mass violence and thwarted attempts at mass violence — here and abroad — that this is not merely an American issue but a global one.”

Thompson’s letter called for Facebook CEO Mark Zuckerberg, YouTube CEO Susan Wojcicki, Twitter CEO Jack Dorsey, and Microsoft CEO Satya Nadella to appear before Congress and lay out how they’ll ensure something like this doesn’t happen again.

Although there’s great irony in anyone from Homeland Security calling out tech platforms for being a tool to spread Islamophobic hate, the letter reveals how big a problem this has become. It also shows that the United States government is under growing pressure to act.

Viral trends are a staple of the internet, with people carefully planning how to maximize the number of views on a moment. Unfortunately, these viral moments are not limited to new games or jokes. As the Christchurch shooting demonstrated, it’s not difficult to make hate go viral.

Video of the shooting, which left 50 people dead, was broadcast on Facebook Live. At first, it was difficult to know how many people watched the video as it streamed. Facebook now says about 200 people watched the live broadcast, and none of them flagged it to moderators.

The shooter also uploaded a 17-minute video to Facebook, Instagram, Twitter, and YouTube. Since then, the video has exploded across social media, with companies scrambling to keep up.

In a tweet, Facebook said it removed 1.5 million videos of the New Zealand shooting in the 24 hours after the original broadcast. YouTube also said it removed thousands of uploads of the video, and Reddit banned its infamous r/watchpeopledie community after the video surfaced there.

Those who shared the video, or didn’t report it when it surfaced online, are part of the reason it spread as widely as it did. There shouldn’t be any entertainment in watching people die. But the big tech companies that own and operate the platforms where these hateful messages spread shoulder some of the responsibility as well.

The video’s rapid spread shows that there is a disturbing culture of promoting hate, in this case Islamophobia, perpetuated online. Most alarmingly, this problem isn’t new (the shooter cited his own online influences in a manifesto), yet tech companies have done little to nothing to stop it.

For example, Twitter is infamous for its reluctance to ban Nazis, hatemongers, and other extremists. In some cases, known members of the alt-right have even been verified, a process meant to signify their position as an account of public interest. In a 2018 article, TechCrunch referred to Twitter as a “Nazi haven.”

Facebook, meanwhile, has allowed advertisers to target people based on their interest in Nazis. Hate speech has been a problem on the platform since its earliest days, but little has been done about it.

The Christchurch shooting occurred because Islamophobia has crept into every aspect of media and people’s lives. The dehumanization of Muslims that casts them as an inherent threat to social order isn’t limited to a particular party or platform.

The video was allowed to spread because tech companies have shown time and time again that they don’t take hate seriously. There is a belief that online hate is all talk and stays online, but Christchurch is a harsh reminder that it doesn’t.

Tech companies can’t shrug off responsibility any longer or pretend that they’re unaware of what’s festering on their platforms. They need to be held accountable for what they allowed to spread and for the consequences that follow.