Letter to Social Media CEOs - Schneider, Gonzales, Higgins Lead Bipartisan Letter Urging Social Media Companies to Respond to Violent Threats on Their Platforms

Letter

Dear Industry Leaders,

We write to you in the wake of the recent mass shootings in our districts -- Highland Park, Illinois; Uvalde, Texas; and Buffalo, New York. As you may know, on July 4th, as families -- parents, grandparents, grandchildren -- lined the streets of Highland Park, a single, evil man climbed a ladder to a store rooftop and monstrously opened fire into the crowd. Seven loving people were murdered, dozens were wounded, and an entire community was traumatized.

Sadly, and unacceptably, this tragedy is all too familiar in far too many American communities. On May 24th, a gunman entered Robb Elementary School in Uvalde, Texas and murdered 21 people, 19 of whom were children, and wounded 17 others. Ten days earlier, a shooter in Buffalo, New York, opened fire in a grocery store, murdering 10 people and injuring 3. To date, there have been over 350 mass shootings in the United States this year alone.

As Members of Congress, we continue to grieve the lives lost and the tragic impact on our communities, but these tragedies are also a call to action. As representatives, we must always ask whether there are ways to prevent mass shootings like these. The more we learned about the shooters, the more we learned that there were warning signs which, if properly heeded, might have prevented these tragedies.

Sources revealed that the shooter in the Highland Park incident frequently used the social media platform Discord to share violent materials, including videos depicting him committing mass murders. The shooter in Buffalo also used Discord to document his plans for the shooting and live-streamed the attack on Twitch.

In Uvalde, while friends and family may have been unaware of the shooter's deadly plans, he had a robust online presence where he detailed his violent inclinations and plans to harm others. He sent messages threatening to kidnap, rape, and kill. He was particularly active on the Yubo app, where he posted images of dead cats, joked about sexual assault, and sexually harassed young women. Although several people reported him for bullying and threats on Yubo, he remained active on the platform.

The details from the Highland Park, Uvalde, and Buffalo shootings echo what we have heard after many prior mass shootings: shooters often make their violent intentions known online via social media. This violent content is often reported to your companies, usually to no avail.

As leaders of the social media industry in a world that is increasingly online, we respectfully ask for your responses to the following questions:

1) Content moderation

How many content moderators do you employ or contract?
How do you train content moderators to recognize harmful content?
Can users report multiple pieces of content as part of a single incident?
Do you offer real-time support for urgent, time-sensitive incidents or attacks?
Do you provide resources to users, such as warnings, when they click on outbound links to sites that contain content or activity that violates platform rules?
How do you engage users posting harmful content? Do you provide users who violate platform rules with a detailed explanation of why content was removed? Are there escalating penalties for repeat offenders?
What proactive steps do you take to ensure that users with a history of hateful and abusive behavior are unable to continue these practices on your site?
2) Flagged content

What is the average time it takes to review flagged content from employees, contractors, trusted flaggers, and platform users? Do your review protocols differ depending on the source?
Do you have an abuse reporting portal or ticketing system for users to track your response to reported incidents?
When you receive reports from users about troubling content posted by other users, how do you determine whether to bring the content to the attention of local or federal law enforcement?
In how many instances in the U.S. have you flagged user content for local law enforcement?
3) Process improvements

What actions have you taken to refine reporting mechanisms, content removal processes, or other content moderation practices in the wake of these recent attacks?

We all have a responsibility to do what we can to keep our communities safe. We would appreciate a thoughtful response on how you are doing your part and welcome the opportunity to work together on this shared goal.

