
The #SaveYourInternet fight against Article 13 continues

On 12 September 2018, all 751 Members of the European Parliament (MEPs) got a chance to shape the European copyright reform with a plenary vote.

The outcome: 366 MEPs blatantly ignored your calls asking them to #SaveYourInternet, as they adopted the copyright #CensorshipMachine.

What’s next: The JURI Committee Rapporteur, MEP Axel Voss, has been granted a mandate to start informal negotiations with the representatives of the EU Member States (Council) and the European Commission (EC), the so-called ‘trilogue negotiations’, the black box of the EU policymaking process. See EDRi’s explainer for more details on the remainder of this process.

Article 13 only benefits big businesses

Due to the collateral damage created by the vague and overly broad wording of Article 13, only big platforms and powerful rightholders will benefit from its adoption, to the detriment of all other stakeholders. 

Bad for Users

Users will have access to less content and will be unable to share their content with others, even when it’s legal. Moreover, any complaint mechanisms will be easily bypassed if blocking is done under the pretense of a terms and conditions violation, rather than as a result of a specific copyright claim.

Bad for Creators

If platforms become directly liable for user-uploaded content, they will arbitrarily remove content based on their terms and conditions. As a result, many creators will see their content get blocked too. And, as fewer platforms survive the burden of this provision, creators will have fewer choices about where to share their creations.

Bad for Competition

Only platforms with deep pockets will be able to comply with the Article 13 requirements. Even if small enterprises get an exemption from its scope, this simply means they are not allowed to scale up and compete with the big US platforms, under the motto ‘in Europe, small is beautiful’!

By requiring Internet platforms to perform automatic filtering of all of the content that their users upload, Article 13 takes an unprecedented step towards the transformation of the Internet from an open platform for sharing and innovation, into a tool for the automated surveillance and control of its users. (…) we cannot support Article 13, which would mandate Internet platforms to embed an automated infrastructure for monitoring and censorship deep into their networks. For the sake of the Internet’s future, we urge you to vote for the deletion of this proposal.

+70 Internet and computing luminaries Open Letter

Although the latest proposed versions of Article 13 do not explicitly refer to upload filters and other content recognition technologies, they couch the obligation to prevent the availability of copyright protected works in vague terms, such as demonstrating ‘best efforts’ and taking ‘effective and proportionate measures.’ (…) I am concerned that the restriction of user-generated content before its publication subjects users to restrictions on freedom of expression without prior judicial review of the legality, necessity and proportionality of such restrictions. Exacerbating these concerns is the reality that content filtering technologies are not equipped to perform context-sensitive interpretations of the valid scope of limitations and exceptions to copyright, such as fair comment or reporting, teaching, criticism, satire and parody.

David Kaye UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression

I think that the overfiltering problem is huge and the norms are so vague. Article 13 is doomed to failure. The Digital Single Market Directive draft is some speculation that if we put these really strict rules in place, all the tech companies and platforms that can afford to license content will do that. I think that’s naive.

Professor Pamela Samuelson Director of the Berkeley Center for Law & Technology - President of the Authors Alliance

The lesson, for me, is: Don’t tear down the building, be the landlord. It’s far more beneficial for me to embrace the community that is remixing my art, to set my own rules about how my work is used, and to embrace the shared creativity and profits that come from it. It wasn’t easy for me to adapt my thinking, but today I work with a number of online services to give fans what they want while still getting paid.

Wyclef Jean Grammy-award winning musician and founding member of hip-hop group The Fugees

The concern of the vzbv: out of fear of completely unclear liability rules, much content will disappear from the net. Dubious content, so-called fake news, on the other hand, will find it even easier to spread on the internet in the future.

Klaus Müller Executive Director of The Federation of German Consumer Organisations (Verbraucherzentrale Bundesverband e.V. – vzbv)

Article 13 of the proposal on Copyright in the Digital Single Market includes obligations on internet companies that would be impossible to respect without the imposition of excessive restrictions on citizens’ fundamental rights. (…) Article 13 appears to provoke such legal uncertainty that online services will have no other option than to monitor, filter and block EU citizens’ communications if they are to have any chance of staying in business.

+50 NGOs representing human rights and media freedom Open Letter

Claim: The Internet will not be filtered

Assessment: Not true. Upload filters will become an obligation for platforms that want to enter the market. The distinction between Internet and platforms is artificial. There is hardly any internet service without active user involvement. The spectrum of user generated content ranges from newspaper websites, blogs and social networking sites to online forums and cloud solutions.

+200 academics from over 25 research centres Open Letter

Claim: There is no problem relating to freedom of expression

Assessment: Not true. (…) Article 13 motivates firms to use cheap upload filters which will block legitimate content. Complaint and redress mechanisms are insufficient to cope with this problem. Expressions such as permissible parodies will be affected.

+200 academics from over 25 research centres Open Letter

The world should be concerned about new proposals to introduce a system that would automatically filter information before it appears online. Through pre-filtering obligations or increased liability for user uploads, platforms would be forced to create costly, often biased systems to automatically review and filter out potential copyright violations on their sites. We already know that these systems are historically faulty and often lead to false positives.

Professor María Sefidari Huici Chair of the Wikimedia Foundation

(…) under the new Directive, activity that is core to Reddit, like sharing links to news articles, or the use of existing content for creative new purposes (r/photoshopbattles, anyone?) would suddenly become questionable under the law. (…) Protecting rights holders need not come at the cost of silencing European internet users.

Reddit Blog Post

What’s at Stake?

Article 13’s various versions create a system whereby platforms face increased (direct) liability for the content uploaded by their users if it infringes copyright. As a result, these platforms are likely to overblock even legal content and use automated techniques to avoid being sued, which means users will no longer be able to share and experience the content they are used to finding online.

Our Ability To Post Content On The Internet Will Be Limited By A Censorship Machine

Some of the content uploaded on the Internet infringes the copyright of rightholders (who are often not the content creators but intermediaries and investors such as recording or film studios), and content creators complain that, due to the digital evolution, they make less money than they used to (the so-called ‘value gap’). This does not reflect reality accurately, specifically in the case of the music industry, which year after year announces that its revenues keep increasing. What they claim, however, is that some platforms (YouTube, Vimeo…) do not pay them enough when they stream copyrighted content: that is what they call the “value gap” (the gap between what rightholders think would be fair compensation and what platforms pay them).

Article 13 claims to address these problems but does so in a way that hampers the way the Internet has been functioning so far, by asking platforms to put in place costly and opaque solutions to pre-screen our content. This proposal would require intermediaries such as Facebook and YouTube to constantly police their platforms with censorship machines, often with no human element involved in the process. It will mean that you will no longer be able to upload or enjoy the same content as you used to, as automated blocking is likely to stop (legitimate) content from ever making it online. Analyses by EDRi of the European Commission and JURI proposals show the underlying threats in Article 13’s logic.

And what’s worse: none of the versions of Article 13 make life better for creators. Article 13 actually makes no mention of creators: only rightholders.

The scope of application of Article 13 is excessively broad and does not comprise any mechanism that constrains inappropriate or unreasonable claims by rightholders. To solve this, some of the proposed versions include carve-outs for specific platforms in a more or less defined manner (for example, for online encyclopaedias like Wikipedia), but this approach means that only those platforms that are known and valued today get a ‘pass’ from the censorship machine.

New Censorship Machines Should Not Be ‘Encouraged’ And Existing Ones Should Have User Safeguards

The measures required by Article 13 to avoid liability will be expensive to implement and will thus make it harder for European start-ups to grow and compete with big US platforms that already have these filters in place (such as YouTube with ContentID).

Moreover, while most of the ‘complaints’ seem to come from the music and film industry, Article 13 applies to all types of platforms and all types of content, including text, software code, sheet music, architectural blueprints, etc.

As organisations such as GitHub and Wikimedia raised their voices, carve-outs have been written to try to prevent them from becoming collateral damage of Article 13. But what about the companies that have not raised their voice or have not been heard (e.g. WordPress, Airbnb)? What about the platforms that do not exist yet but could bring the same benefit to society in the future as Wikipedia does today? The carve-outs show the collateral damage is real. Its extent, however, is currently unfathomable, as shown by an infographic by trade association EDIMA (note: some versions of the text of Article 13 include partial carve-outs for code-sharing platforms, online encyclopaedias, online retail platforms and (B2C) cloud services, but these are not without loopholes).

The copyright rules in the European Union are extremely complex and nuanced, as evidenced by a solid body of case law from the highest European court, the Court of Justice of the European Union. Much of what we currently do on social media relies on exceptions to copyright (such as parody or quotation), which are not identifiable by algorithms because they require ‘context’ (Is this funny? Are you acting in a non-commercial manner? Did you use this for the purpose of criticism?) and are not implemented in the same manner in each EU Member State.

Algorithms and Filters Have a Proven Track Record of Being Bad at Nuance

Creativity and free speech will be harmed by Article 13 because algorithms struggle to tell the difference between infringement and the legal use of copyrighted material vital to research, commentary, parodies and more. This is far too high a cost for enforcing copyright.

No filter can possibly review every form of content covered by the proposal including text, audio, video, images and software. Article 13’s mandate is technically infeasible and it is absurd to expect courts in 27 EU Member States to be constantly working out what the “best” filters might be.

Moreover, it is a bad idea to make Internet companies responsible for enforcing copyright law. To ensure compliance and avoid penalties, platforms are sure to err on the side of caution and overblock. To make compliance easier, platforms will adjust their terms of service to be able to delete any content or account for any reason. That will leave victims of wrongful deletion with no right to complain – even if their content was perfectly legal.

Finally, the proposed censorship machines are a disproportionate and ineffective ‘solution’ to the problem: this has been highlighted by the highest European Court, the Court of Justice of the European Union, in a decision called SABAM v Netlog (CJEU C-360/10), which ruled that social networks and other web hosting providers cannot be required to monitor and filter activities that occur on their sites to prevent copyright infringement. This would be a breach of freedom of expression and of privacy.

What’s on the table?

Looking at both the European Parliament and Council drafts, 3 key flaws can be identified:

  • the text is not balanced with fundamental rights;
  • Article 13 puts an end to the e-Commerce Directive for a vast array of platforms, and it does so without a proper Impact Assessment; and,
  • the provision makes platforms directly liable for user uploaded content, which implies upload filtering to avoid this liability.

Moreover, rightholders do not even have to identify the works that platforms need to take down, and user safeguards have been reduced to a paper tiger, which will leave users without any recourse to push back against wrongful blocking. See the CopyBuzz.com analysis of an early draft of the adopted proposal, as well as this handy flowchart on it by COMMUNIA.

So what’s next?

The JURI Committee Rapporteur, MEP Axel Voss, has been granted a mandate to start informal negotiations with the representatives of the EU Member States (Council) and the European Commission (EC), the so-called ‘trilogue negotiations’. These negotiations are often considered the black box of the EU policymaking process, because they happen behind closed doors with little to no public accountability, usually running late into the night to broker a deal, and without the negotiation documents being publicly available. The process is therefore prone to opaque horse trading. See EDRi’s explainer for more details on the remainder of this process. But so far nothing is set in stone. The fight against the #CensorshipMachine is thus far from over: we need to keep the pressure on both MEPs and national governments to ensure that they find a sensible compromise in the end and #SaveYourInternet.

Jan-Feb 2019

Agreement between Council and European Parliament

After the 12 September vote, the Copyright Directive text is being negotiated in what is called 'trilogues' between the European Parliament and the Council (=governments of the Member States), with the European Commission as 'honest broker'.

March-April 2019

Final Vote – European Parliament Plenary

Once the trilogue is concluded, the final text will be submitted for a plenary vote to all Members of the European Parliament. This will be the ultimate chance to stop this unbalanced review of the copyright framework.

Want to be kept up-to-date?
Follow FixCopyright on Twitter: @Fixit_EU