Study – Paul Keller
Article 17 is supposed to address a so-called “value gap” identified by the music industry. According to this industry, online platforms such as YouTube and Facebook make huge profits by selling advertisements alongside copyrighted content uploaded by their users, all without adequately rewarding the copyright owners. While this general observation is most likely correct, the measures proposed to remedy the value gap have proven to be politically divisive, and there are serious doubts that they will indeed result in a more equal distribution of the value generated by these platforms.
The core of Article 17 is a provision making platforms that host content uploaded by their users liable for copyright infringements committed by those users. Before the DSM Directive, online platforms that host content uploaded by users could rely on Article 14 of the E-Commerce Directive to provide their services. Under this provision, platforms can host uploaded content without any risk, as long as they remove such content once they receive information that it infringes someone else’s rights. This limitation of liability for copyright infringements committed by users provided the legal foundation for the development of a wide range of online platforms that allow user uploads. A large number of platforms rely on this safe harbour, while at the same time many platforms have also entered into licensing agreements with copyright owners to ensure the continued availability of copyrighted content on their services and to show advertisements alongside that content.
Article 17 (formerly Article 13) of the Directive, on the one hand, excludes certain for-profit content-sharing platforms from the above-mentioned liability protections and, on the other hand, makes them liable for infringing content uploaded by their users. As a result, these platforms have two options:
(a) they obtain authorizations from copyright owners to communicate such content or, if no authorization is granted,
(b) they take a set of steps to be exempted from liability for such infringing content, such as actively searching for infringing content by filtering or other mechanisms.
The first option is the preferable one from the users’ rights perspective, but it will only be effective if Member States do not rely on individual licensing to grant authorizations to platforms for every piece of content that is available on their services. The second option may require the use of automated filtering technology and may thus result in widespread overblocking of users’ uploads, interfering with uses made under copyright exceptions and with fundamental freedoms such as freedom of expression.
Taking into account that users’ rights will be at greater risk if platforms rely on filters than if they obtain authorization to communicate their users’ uploads, national lawmakers should fully explore legal mechanisms for granting those authorizations and limit, to the extent possible, the application of filtering technologies. Turning the exclusive right granted by Article 17 into a remuneration right, or into a copyright exception or limitation subject to remuneration, would be the ideal solution.
In the current situation it seems relatively unlikely that Member States can implement Article 17 in such a way that the application of filtering technologies is prevented. This raises important questions about the balance between the obligations that Article 17 imposes on platforms and the rights of the users of these platforms. In this regard, the text of the Article contains a number of conflicting provisions that may be difficult to reconcile for Member States seeking to implement the Directive. For this analysis it is important to note that, in spite of its public reputation as a draconian measure limiting user freedoms, the second half of Article 17 actually contains a number of provisions that significantly strengthen the rights of users sharing content via online platforms. Two groups of provisions are especially interesting in this regard: provisions dealing with the relationship between exceptions and limitations and the filtering obligations introduced by Article 17 (these can be found in paragraphs 7 and 9), and provisions establishing procedural safeguards for these user rights (in paragraph 9). Both sets of provisions have been carefully analysed by a group of European copyright academics who, in November 2019, issued Recommendations for Safeguarding User Freedoms in Implementing Article 17, signed by more than 50 academics from the relevant fields.
Exceptions and limitations in the context of Article 17
As mentioned above, Article 17(7) requires Member States to ensure that users of Online Content Sharing Service Providers (OCSSPs) can rely on the existing exceptions for “quotation, criticism, review” and “use for the purpose of caricature, parody or pastiche”. While there has been some doubt about how to interpret “existing” in this context, the above-mentioned academic statement shows that “existing” “refer[s] to those E&Ls already contained in EU law”. This means that, at least with regard to the making available of works via OCSSPs, Member States that have not yet done so must implement the parody exception (Art 5.3(k) of the InfoSoc Directive). In order to fully safeguard users’ freedom of creative expression on online platforms falling within the scope of Article 17, the academic statement also recommends that “Member States should consider clarifying in their national laws that the E&L for incidental use applies fully in the context of acts of making available by users on OCSSP platforms”. This recommendation is in line with the recommendation made in the section on Article 25 above.
Article 17(9) of the Directive further provides that the filtering requirements established by Article 17 “shall in no way affect legitimate uses, such as uses under exceptions or limitations provided for in Union law”. The wording of this provision (“in no way”) makes it clear that the protection of the user rights derived from these exceptions and limitations must be given priority over the requirement imposed on online platforms to filter works uploaded by their users.
Procedural safeguards for user rights
In order to protect user rights, Article 17(9) establishes a number of procedural safeguards: OCSSPs must implement “effective and expeditious” complaint and redress mechanisms for users in the event of disputes. These mechanisms entail obligations for both rightholders and OCSSPs. On the one hand, rightholders that request the disabling or removal of content must “duly justify” their requests. On the other hand, OCSSPs that administer complaint and redress mechanisms must process submitted complaints “without undue delay” and subject decisions to disable or remove content to human review. These safeguards significantly strengthen the position of users with regard to filtering and blocking by platforms, as the existing legal framework does not impose any such requirements on platform operators and rightholders. Existing filtering systems, such as YouTube’s Content ID, operate on the basis of rules determined by the platforms themselves.
In order to meet the requirement not to “affect legitimate uses, such as uses under exceptions or limitations”, Member States must implement these safeguards in such a way that, for OCSSPs, the requirement to protect users’ rights prevails over the requirement to remove or block access to content uploaded by their users. Since current technology is not capable of assessing the legality of individual uses of copyrighted works, this means that platforms must be prevented from automatically disabling access to, or preventing the upload of, works unless it is certain that the use in question is infringing. In all other cases, users must have a meaningful opportunity to assert their rights, and uploads must remain available to the public until it has been established that a use is indeed infringing.
While such an implementation would be beneficial from the perspective of users, it represents a huge challenge from the perspective of the European legislator. Given the different policy positions of the Member States during the legislative proceedings, there is a high likelihood that implementations of the user rights safeguards will differ substantially between Member States. This would run counter to the overall objective of creating a Digital Single Market. Establishing uniform rules for implementing the user rights safeguards must therefore be one of the priorities of the stakeholder dialogue on the implementation of Article 17 that is currently ongoing. The European Parliament, on whose initiative these user rights safeguards were introduced into the text of the Directive, must carefully monitor the proceedings of the stakeholder dialogue and demand that the Commission fully takes these important safeguards into account in the guidelines to be issued based on its outcome.
During the ongoing stakeholder dialogue, another important aspect that requires attention at the European level has emerged. Article 17(4) gives rightholders unprecedented abilities to require OCSSPs to prevent the availability of works (block uploads) and to remove works from their services. Requests for blocking or removal must be based on collaboration between rightholders and platforms, as part of which rightholders must provide OCSSPs with “relevant and necessary information”. Apart from the issues related to user rights under exceptions and limitations discussed above, this raises another important concern: the information provided by rightholders may be erroneous, misleading or conflicting, leading to unjustified actions by OCSSPs. In order to prevent unjustified removals or blocking based on ownership claims by rightholders over works that they do not own, requests for blocking or removal should be made by submitting the “relevant and necessary information” to a centralised, publicly accessible database. This would allow public scrutiny of ownership claims and reduce the risk of misappropriation of works. It would also provide a mechanism for resolving conflicting or contradictory rights claims, thereby improving the quality of the rights information. While these effects are important in the context of protecting users from erroneous blocking and removal, such a public database would also serve another function that benefits both rightholders and (smaller) OCSSPs: it would significantly reduce the cost associated with managing the “relevant and necessary information”, as such information would only need to be provided once (by rightholders) and would be consolidated in a single source (for OCSSPs).
In order to increase the quality of the rights information and to reduce the overhead costs for rightholders and small OCSSPs, the European legislator should require the use of a single public database for the storage of the “relevant and necessary information” shared between rightholders and OCSSPs in the context of the blocking and removal actions undertaken in line with Article 17(4).
Notes

1. It is worth noting that to date the CJEU has never clarified whether the big platforms targeted by Article 17, such as YouTube, are indeed eligible for this protection.
2. Because of this, the Polish government filed an action for annulment of Article 17 with the CJEU, which means the court still has a say about the fate of this controversial provision. In the meantime, EU countries have to implement it.
3. This is contrary to the expectation expressed in the statement of the German government on the occasion of the adoption of the Directive by the Council.
4. Most of these provisions were added to the text by the European Parliament during late-stage trilogue negotiations and are the result of significant pressure from internet users and civil society groups.
5. The Academic Statement on Safeguarding User Freedoms in Implementing Article 17 recommends that Member States implement these exceptions for all types of uses: “A rational national lawmaker implementing the E&Ls in Article 17(7) in line with the above recommendations should take this opportunity to fully harmonize the respective national E&Ls beyond uses concerning OCSSPs.”
6. They also have to implement the quotation exception, but this exception is one of the very few that is already implemented by all EU Member States.
7. This recommendation is echoed in the Academic Statement on Safeguarding User Freedoms in Implementing Article 17: “We recommend that where preventive measures […] lead to the filtering and blocking of uploaded content before it is made available to the public, Member States should, to the extent possible, limit their application to cases of prima facie copyright infringement. In this context, a prima facie copyright infringement means the upload of protected material that is identical or equivalent to the ‘relevant and necessary information’ previously provided by the rightholders to OCSSPs, including information previously considered infringing. […] In the remaining cases (no prima facie infringement) there should be no presumption that the uploaded content is infringing, meaning that such content should remain available to the public in the OCSSP until its legal status is determined, following a procedure consistent with Article 17(9).”
8. At the time of writing, two meetings of the stakeholder dialogue had taken place. These meetings (and the two meetings planned for the remainder of 2019) have not yet addressed concrete implementation questions related to Article 17 but have so far focussed on discussing “existing practices”. COMMUNIA has published critical summaries of these meetings.
9. From the perspective of the bigger OCSSPs, such as YouTube and Facebook, having their own private databases with rights information is a competitive advantage. Requiring all platforms to work with a public database would diminish the competitive advantage that Article 17 affords to the existing dominant platforms.