Technology companies that fail to take down online terrorist content could face heavy fines in the European Union.
The European Commission on Wednesday proposed new legislation that would penalize internet platforms like Facebook and YouTube for not removing extremist content quickly.
The proposal would require technology companies to remove such content within an hour of it being flagged by national authorities.
“One hour is the decisive time window, when the greatest damage can take place,” European Commission President Jean-Claude Juncker told the European Parliament during his state of the union address Wednesday.
Under the proposed rules, companies that do not remove content within the one-hour timeframe would face fines of up to 4 percent of their global revenue for the previous year.
“You wouldn’t get away with handing out fliers inciting terrorism on the streets of our cities — and it shouldn’t be possible to do it on the internet, either,” EU security commissioner Julian King said in a statement.
The proposal marks a departure from the EU’s previous strategy, which relied on encouraging technology companies to voluntarily remove extremist content from their platforms.
The proposal still requires approval from the EU’s Parliament and member states before it becomes law.
Facebook and Google, which would be among the tech companies most directly affected by the proposal, declined to explicitly endorse it but said that they support and share the European Commission’s goals.
“There is no place for terrorism on Facebook, and we share the goal of the European Commission to fight it, and believe that it is only through a common effort across companies, civil society and institutions that results can be achieved,” a Facebook spokesperson said, noting that the company has made strides in improving its crackdown on terror content.
“We welcome the focus the Commission is bringing to this and we’ll continue to engage closely with them, member states and law enforcement on this crucial issue,” a Google spokesperson said, noting that the company has been investing in detecting and removing terrorist content.
Both companies have taken steps to prevent terrorists from using their platforms to recruit and communicate. Reports in May, however, found that Google and Facebook were still struggling to stamp out extremist accounts and communities.