Mass Violence, Extremism, and Digital Responsibility
10:00 AM Hart Senate Office Building 216
U.S. Sen. Roger Wicker, R-Miss., chairman of the Committee on Commerce, Science, and Transportation, will convene a hearing titled, “Mass Violence, Extremism, and Digital Responsibility,” at 10:00 a.m. on Wednesday, September 18, 2019. In light of recent incidents of mass violence, this hearing will examine the proliferation of extremism online and explore the effectiveness of industry efforts to remove violent content from online platforms. Witnesses will discuss how technology companies are working with law enforcement when violent or threatening content is identified and the processes for removal of such content.
Witnesses:
- Ms. Monika Bickert, Head of Global Policy Management, Facebook
- Mr. Nick Pickles, Public Policy Director, Twitter
- Mr. George Selim, Senior Vice President of Programs, Anti-Defamation League
- Mr. Derek Slater, Global Director of Information Policy, Google
*Witness list subject to change
Hearing Details:
Wednesday, September 18, 2019
10:00 a.m.
Committee on Commerce, Science, and Transportation
This hearing will take place in the Hart Senate Office Building 216. Witness testimony, opening statements, and a live video of the hearing will be available on www.commerce.senate.gov.
*Note: Witness added 9/16/19
Majority Statement
Chairman Roger Wicker
Over the past two decades, the United States has led the world in the development of social media and other services that allow people to connect with one another. Open platform providers like Google, Twitter, and Facebook and products like Instagram and YouTube have dramatically changed the way we communicate and have been used positively in providing spaces for like-minded groups to come together and in shedding light on despotic regimes and abuses of power throughout the world. No matter how great the benefits to society these platforms provide, it is important to consider how they can be used for evil at home and abroad.
On August 3, 2019, twenty people were killed and more than two dozen were injured in a mass shooting at an El Paso shopping center. Police have said that they are “reasonably confident” that the suspect posted a manifesto to a website called “8chan” 27 minutes prior to the shooting. 8chan moderators removed the original post, though users continued sharing copies. Following the shooting, President Trump called on social media companies to work in partnership with local, state, and federal agencies to develop tools that can detect mass shooters before they strike. I certainly hope we talk about that challenge today.
Sadly, the El Paso shooting is not the only recent example of mass violence with an online dimension. On March 15, 2019, 51 people were killed and 49 were injured in shootings at two mosques in Christchurch, New Zealand. The perpetrator filmed the attacks using a body camera and live-streamed the footage to his Facebook followers, who began to re-upload the footage to Facebook and other sites. Access to the footage quickly spread, and Facebook stated that it removed 1.5 million videos of the massacre within 24 hours of the attack, 1.2 million of which were blocked before they could be uploaded. Like the El Paso shooter, the Christchurch shooter also uploaded a manifesto to 8chan.
The 2016 shooting at the Pulse Nightclub in Orlando, Florida, killed 49 people and injured 53 more. The Orlando shooter was reportedly radicalized by ISIS and other jihadist propaganda through online sources. Days after the attack, the FBI Director stated that investigators were “highly confident” that the shooter was self-radicalized through the internet. According to an official involved in the investigation, analysis of the shooter’s electronic devices revealed that he had “consumed a hell of a lot of jihadist propaganda,” including ISIS beheading videos. Shooting survivors and family members of victims brought a federal lawsuit against three social media platforms under the Anti-Terrorism Act. The Sixth Circuit dismissed the lawsuit on the grounds that the shooting was not an “act of international terrorism.”
With over 3.2 billion internet users worldwide, this Committee recognizes the challenge facing social media companies and online platforms in their ability to act on and remove content threatening violence from their sites. Tracking a user’s online activity raises difficult questions: does it invade an individual’s privacy, thwart due process, or violate constitutional rights? The automatic removal of threatening content may also impair an online platform’s ability to detect possible warning signs. Indeed, the First Amendment offers strong protections against restricting certain speech, which undeniably adds to the complexity of our task.
I hope our witnesses will speak to these challenges and how their companies are navigating them.
In today’s internet-connected society, misinformation, fake news, deepfakes, and viral online conspiracy theories have become the norm. This hearing is an opportunity for witnesses to discuss how their platforms go about identifying content and material that threatens violence and poses a real and potentially immediate danger to the public. I hope our witnesses will also discuss how their content moderation processes work. This includes addressing how human review or technological tools are employed to remove or otherwise limit violent content before it is posted, copied, and disseminated across the internet.
Communication with law enforcement officials at the federal, state, and local levels is critical to protecting our neighborhoods and communities. We would like to know how companies are coordinating with law enforcement when violent or extremist content is identified. And finally, I hope witnesses will discuss how Congress can assist ongoing efforts to remove content promoting violence from online platforms and whether best practices or industry codes of conduct in this area would help increase safety both online and offline.
So, I look forward to hearing testimony from our witnesses, and I hope we engage in a constructive discussion about potential solutions to this pressing issue.
Minority Statement
Ranking Member Maria Cantwell
Opening Statement at Commerce, Science, & Transportation Committee Hearing on Mass Violence, Extremism, and Digital Responsibility
Witnesses: Ms. Monika Bickert, Head of Global Policy Management, Facebook;
Mr. Nick Pickles, Public Policy Director, Twitter;
Mr. George Selim, Senior Vice President of Programs, Anti-Defamation League;
Mr. Derek Slater, Global Director of Information Policy, Google
September 18, 2019
CANTWELL: Thank you, Mr. Chairman, for holding this important hearing, and thank you to our witnesses for being here this morning.
Across the country, we are seeing and experiencing a surge of hate, and as a result we need to think much harder about the tools and resources we need to combat this problem both online and offline. While the First Amendment to the Constitution protects free speech, speech that incites imminent violence is not protected, and Congress should review and strengthen laws that prohibit threats of violence, harassment, stalking, and intimidation to make sure that we stop the online behavior that does incite violence.
In testimony before the Senate Judiciary Committee in July, Federal Bureau of Investigation (FBI) Director Chris Wray said that white supremacist violence is on the rise. He said the FBI takes this threat “extremely seriously” and has made over 100 arrests so far this year.
We’ve seen this in my state over the last several years. We’ve suffered a shooting at the Jewish community center in Seattle, a shooting of a Sikh man in Kent, Washington, a bombing attempt at the MLK Day parade in Spokane, and, over the last year, a rise in the desecration of both synagogues and mosques. The rise in hate across the country has also led to multiple mass shootings, including at the Tree of Life congregation in Pittsburgh, the Pulse nightclub in Orlando, and most recently, the Walmart in El Paso.
Social media is used to amplify that hate. The Parkland high school shooter posted images of himself with guns and knives on Instagram and wrote social media posts prior to the attack on his fellow students. In El Paso, the killer published a white supremacist, anti-immigration manifesto on the 8chan message board, and my colleague just mentioned the live-streaming of content related to the Christchurch shooting and the horrific incidents that happened there. In Myanmar, the military engaged in a systematic campaign on Facebook, using fake names and sham accounts to promote violence against the Muslim Rohingya. These human lives were all cut short by the deep hatred and extremism that we have seen become more common.
This is a particular problem on the dark web, where certain websites like 8chan host 24/7, 365 hate rallies. Adding technology tools to mainstream websites to stop the spread of these dark websites is a start, but there needs to be a concentrated and coordinated effort to ensure that people are not directed into these cesspools. I believe calling on the Department of Justice to make sure that we are working across the board, on an international basis and with companies as well, is an important step in fighting this issue. We don’t want to push people off of social media platforms only to have them end up on the dark web, where we can find fewer of them. The Department of Justice needs to do more to shut down these dark websites, and social media companies need to work with us to make sure that we are doing this.
I do want to mention that, amid much discussion here in Washington about initiatives, the state of Washington has passed three gun initiatives by a vote of the people, closing background check loopholes, regulating private sales, and establishing extreme risk laws, all of which passed with the support of a majority of voters in our state. So I do appreciate that, just last week, representatives from companies of all sizes in the tech industry sent the Senate a letter asking for passage of bills requiring extensive background checks. I very much appreciate that, and your support of extreme risk laws to keep guns out of the hands of people whom a court has determined are too dangerous to possess them.
So this morning, we look forward to asking you about ways in which we can better fight these issues, and I want us to think about how we can work together to address them. I believe that, working together, we can deploy successful tools in the fight against the extremism that exists online.
Thank you Mr. Chairman for the hearing.
Testimony
- Ms. Monika Bickert, Head of Global Policy Management, Facebook (Download Testimony, 62.44 KB)
- Mr. Nick Pickles, Public Policy Director, Twitter (Download Testimony, 167.16 KB)
- Mr. George Selim, Senior Vice President of Programs, Anti-Defamation League (Download Testimony, 275.13 KB)
- Mr. Derek Slater, Global Director of Information Policy, Google (Download Testimony, 82.11 KB)