Any tech company responsible for content people post online will have three months to report back to the European Union on what it is doing to meet the new targets the bloc has set.
The European Commission has set out new recommendations for technology companies in a bid to tackle illegal online content, including the requirement that "terrorist content" be removed within an hour of notification.
The recommendations include provisions for companies to remove terrorist content within one hour of it being flagged, faster overall procedures to detect and remove illegal content, as well as safeguards for freedom of expression and data protection. "What is illegal offline is also illegal online", the Commission said. "As the latest figures show, we have already made good progress removing various forms of illegal content".
While these recommendations are non-binding, they could factor into future legislation.
Tired of waiting while social media platforms and websites hem and haw over what to do about illegal content, the European Commission on Thursday set the bar high, at least when it comes to terrorist content: pull it down within an hour.
Some companies have already been stepping up self-regulation of their content: YouTube uses machine learning to catch extremist videos, and Facebook uses artificial intelligence to match a post's material against known terrorist content.
European Digital Rights, a civil rights group, described the Commission's approach as putting internet giants in charge of censoring Europe.
The Commission, responsible for managing day-to-day European Union operations, said it will monitor action taken in response to the recommendations and determine whether legislation will be required. Joe McNamee, executive director of European Digital Rights (EDRi), said the Commission is pushing "voluntary" censorship onto internet firms "to avoid legislation that would be subject to democratic scrutiny and judicial challenge".
"We need to monitor that is taken down from one space and make sure it doesn't pop up elsewhere, even when content is automatically deleted", said Gabriel.
The recommendations also say larger platforms should share detection technology and best practices with smaller companies.
"EDiMA fails to see how the arbitrary Recommendation published by the European Commission, without due consideration of the types of content; the context and impact of the obligation on other regulatory issues; and, the feasibility of applying such broad recommendations by different kinds of service providers can be seen as a positive step forward", said the association in a statement emailed to BuzzFeed News.