Read this article in German
Authors: Petra Sitte, Simon Weiß
In this article series, we will discuss the contents of the new draft of the DSM Directive. What was the Directive initially about, what does the current draft say, what has been changed compared to earlier drafts, and what needs to be amended in our view? This part will focus on Article 17, i.e. on upload filters, while in part 2 we will turn to copyright contract law.
What is it about?
There is no single part of the EU copyright reform that has drawn as much attention and criticism as Article 17. The core issue is this: platforms to which users can upload content, such as YouTube, must in future obtain licenses for all forms of content and prevent unlicensed content from being uploaded. This results in an obligation to install upload filters, i.e., to have uploaded content checked and blocked by automated software filters.
These upload filters are dangerous, as there exists no technical procedure capable of identifying the contexts that define whether a particular publication violates, or respects, copyright law. This inevitably leads to “overblocking”, i.e., the blocking of actually permitted content, and thus limits users’ freedom of expression.
We, The Left Party Parliamentary Group in the German Bundestag, have therefore strictly opposed Article 17 from the outset and still hope that the European Court of Justice will overturn this provision given that it violates fundamental rights. The German government has announced in a detailed protocol statement that it seeks to make upload filters “largely unnecessary” when implementing the provision.
What does the draft say?
The Ministry of Justice has certainly made a serious effort to deliver an implementation that is manageable in practice and safeguards users' rights. But good intentions do not make good policy if their foundations are flawed. And under the current draft, there is no doubt that upload filters will be permitted.
The current draft addresses some of the issues raised in the protocol statement, but not all of them; in particular, questions regarding data protection and open interfaces remain unaddressed. A key question is whether the German government will be able to implement its ideas in the dialogue processes at European level – but so far, contrary to its announcements, not much seems to have happened in this regard.
The draft contains good approaches in two areas. Firstly, the introduction of a de minimis rule, which allows short excerpts of works to be freely used on platforms. This is crucial, especially in order to protect an online culture that thrives on memes and remixes, but also to exempt many other everyday practices such as screenshots shared on social media. Unfortunately, European legal provisions prevent this exception from becoming the general rule, which could lead to paradoxical situations: users might, for example, be allowed to post content on Facebook but not on their own website. This goes to show that we still need new regulations at the European level to protect everyday activities from copyright sanctions.
Secondly, the draft provides for a “pre-flagging” mechanism, which allows users to indicate when uploading copyrighted content that they have used this content legally. This would then exclude uploaded content from automated filtering, provided the protected work being uploaded has been significantly altered. Such considerations had already been suggested in the government’s statement and in commentaries coming from the scientific community, and in theory they are able to prevent widespread overblocking. But while pre-flagging is well-intentioned, it is far from well implemented in the draft.
One of the reasons for this is that the draft fails to provide a clear definition of "extensive matches". It merely states that uploaded content shall be blocked automatically if it matches "at least 90 percent" of the information provided by the rightholder – but what this figure refers to in the context of automated pattern matching is left undefined. Beyond this issue, however, the new draft bill introduces a change that gives rise to a much more fundamental problem.
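To illustrate why the undefined reference point matters, here is a purely hypothetical sketch. The function names, durations, and the two competing interpretations are our illustrative assumptions, not anything specified in the draft: depending on whether "90 percent" refers to the share of the rightholder's reference work that reappears in the upload, or to the share of the upload that consists of matched material, the very same excerpt is either waved through or blocked.

```python
# Hypothetical illustration of the ambiguity in "at least 90 percent".
# A 30-second clip taken verbatim from a 2-hour film matches the
# rightholder's reference either fractionally or completely, depending
# on which denominator the filter uses.

def match_ratio_of_reference(overlap_seconds: float, reference_seconds: float) -> float:
    """Share of the rightholder's reference work found in the upload."""
    return overlap_seconds / reference_seconds

def match_ratio_of_upload(overlap_seconds: float, upload_seconds: float) -> float:
    """Share of the upload that consists of matched material."""
    return overlap_seconds / upload_seconds

THRESHOLD = 0.9  # the draft's "at least 90 percent"

# Example: a 30-second verbatim excerpt from a 7200-second film,
# uploaded as a standalone 30-second clip.
overlap, reference, upload = 30.0, 7200.0, 30.0

blocked_by_reference_rule = match_ratio_of_reference(overlap, reference) >= THRESHOLD
blocked_by_upload_rule = match_ratio_of_upload(overlap, upload) >= THRESHOLD

print(blocked_by_reference_rule)  # False: only about 0.4 % of the film is used
print(blocked_by_upload_rule)     # True: the clip is 100 % matched material
```

The two readings lead to opposite blocking decisions for an upload that could well be a lawful quotation, which is exactly why leaving the reference point undefined is a problem.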
What has changed?
The previous version of the discussion draft would have given users a general option to pre-flag content during upload. Now, by contrast, users may only pre-flag content after an upload filter has determined that the uploaded content matches a copyright-protected work. On the surface, this may seem like a harmless change. But it is precisely this change that establishes the use of upload filters as a default mechanism, which small providers previously could have avoided by invoking the principle of proportionality. In addition, with this procedure in place, users can no longer challenge content blocked during upload. Julia Reda from the Society for Civil Rights (Gesellschaft für Freiheitsrechte) has provided a detailed description of the problems involved and the way in which the draft helps to increase the market power of large platforms.
The remaining changes to the previous draft concern various details, some of which do indeed bring about improvements. But there are two further modifications which are problematic.
The first provides for sanctions in the case of repeated false labelling in the context of pre-flagging. The discussion draft already allowed platforms to exclude users from the pre-flagging mechanism in the event of repeated false labelling. Now they would even be obliged to do so. In other words: laypeople who repeatedly fail to correctly assess a possibly complex legal situation will automatically be restricted in their basic rights. This is unacceptable, especially given that any actual copyright infringements can already be prosecuted under existing civil and criminal law.
The second change is that remuneration obligations now also include pastiches, a term newly introduced into German copyright law. It covers certain forms of imitation and adaptation of content, such as literary homage, but is also intended to cover certain online uses. The problem here, of course, is not the obligation to remunerate creatives per se. We expressly welcome the fact that the draft provides for direct remuneration of creative individuals for licensed content. Remunerating pastiches is, however, problematic for two reasons. Firstly, pastiches are not remunerated outside of platforms, and with good reason, because nobody should have to pay for a certain form of creative expression. This makes it difficult to justify remuneration here. Secondly, this regulation would require drawing a clear distinction between pastiches and other exceptions, such as quotations, parodies and caricatures, in order to remunerate some forms of creative expression but not others. This is hardly possible in practice.
Therefore, the Federal Government had rightly announced in its protocol statement that it would not provide for remuneration of pastiches, since the rightholders would not suffer “any relevant economic losses anyway”. But here, too, the commitment made in the protocol statement is undermined.
What needs to be done?
The fact remains: upload filters are an inadequate means of enforcing copyright law, and they jeopardise basic rights. It would probably be best to refrain from implementing Article 17 for the time being, at least until the European Court of Justice has ruled in the current proceedings.
Nevertheless, it still makes sense to push for every inch of improvement. Comments on the draft bill can still be submitted until 6 November.