The European Union (EU) describes the Digital Single Market as one in which the free movement of persons, services and capital is ensured, and in which individuals and businesses can easily access and engage in online activities under conditions of fair competition and a high level of consumer and personal data protection, irrespective of their nationality or place of residence.[1] Through a variety of policy interventions, the Digital Single Market Strategy (DSMS) seeks to move the EU from twenty-eight national markets to a single one by “bringing down barriers to unlock online opportunities”.[2] This review evaluates the balance of rights between intellectual property rights owners and the competing interests and rights at stake under Article 13 (now Article 17) (as amended) of the EU Directive on Copyright in the Digital Single Market.[3] The discussion focuses on Article 13 (17) of the proposed Copyright Directive and the recitals that affect it.

Given the extensive impact of intellectual property infringement, the EU is determined to take the steps necessary to restore the proper functioning of the Single Market through the harmonisation of copyright and its enforcement.[4] The Directive has three aims: to modernise Europe’s copyright regulatory framework, to achieve a fully functional Digital Single Market, and to close the Value Gap.[5] A value gap is the difference between the actual market value of an object and the value the owner expects to realise from it.[6] Some authors, however, believe that the value gap is non-existent. The EU wants to move from a regime of ISP liability to one of ISP responsibility, mainly in relation to illegal content such as child pornography, terrorism, hate crime and piracy.[7] The strategy lists steps to be taken, particularly policy and legislative action regarding the protection of IP rights.

People use the internet to locate information, network and communicate, work from home or telecommute, shop for or market goods, access social media, upload memes and images, and use artistic works to express themselves non-commercially. This usage has further developed the internet and expanded the horizons of technology. Against this background, the EU has proposed the Copyright Directive to give original content creators protection over their works. While protection is guaranteed under the Copyright Directive, in line with Article 17 of the Charter of Fundamental Rights of the European Union on the protection of IP rights, the question remains: is that protection proportionate,[8] considering the rights of other users and OCSSPs under the proposed Directive?

The Promusicae case[9] was the first in which the right to intellectual property conflicted with other fundamental rights. The CJEU confirmed that Member States are permitted, but not obliged, to enact legislation laying down obligations to disclose personal data in civil proceedings. In the L’Oréal case, the CJEU stated that measures must not create barriers to legitimate trade if they are to comply with the requirement of striking a fair balance between the fundamental rights of the EUCFR.[10] To achieve a fair balance, the actions required must be effective and proportionate to ending the copyright infringement, while still ensuring that the very substance of the opposing right is not impaired.


The CJEU also stated that an injunction requiring an intermediary to suspend an ongoing infringement of intellectual property rights, and to prevent similar further infringements, can be issued if the intermediary does not suspend the infringement on its own initiative.[11] From this decision, one can discern that the measures taken must not make it impossible or excessively difficult to exercise the rights provided by EU law.[12] Such measures must therefore not impair the very substance of the protection of personal data or the freedom to conduct a business. The CJEU came to the same conclusion in the Tommy Hilfiger case.[13]

While the Directive has been lauded as a step in the right direction by major copyright owners[14] such as Warner Music Group and the IFPI, it has been lampooned by individuals and big data-hosting companies as violating the safe harbour principles already entrenched in the E-Commerce Directive. The provisions of Article 11 (now 15) and Article 13 (now 17) have been regarded as particularly contentious. Article 11 is the “link tax” provision, which requires news aggregators such as Google News to pay news websites for featuring their articles. Big content hosts like YouTube[15] have described Article 13 as “discouraging or even prohibiting platforms from hosting user-generated content”.[16] Article 2(5) defines OCSSPs[17] as large companies without any clear delineation of what makes a hosting company “large”; the Directive nevertheless focuses on large OCSSPs, which are then mandated to police users’ data with the aim of stopping copyright infringement at all costs.

The existence of safe harbour legislation under the E-Commerce Directive is fundamental to a properly functioning online ecosystem.[18] The E-Commerce safe harbour provisions are modelled on the provisions protecting service providers under section 512 of the Digital Millennium Copyright Act.[19] The E-Commerce Directive protects intermediaries, in this case OCSSPs, under its safe harbour provisions, which treat them as “mere conduits”,[20] allow them to use caching to make websites faster[21] and exempt them from liability for the storage of information.[22] This principle has served as a shield allowing online providers to avoid filtering and policing mandates, reinforced by the obligation not to engage in general monitoring of users.

Article 13 looks like a big step towards creating a copyright registry, which may sooner or later become centralised. This may be a good thing, as the right to intellectual property under Article 17 of the Charter of Fundamental Rights of the European Union could be better protected by a copyright registry maintained by an OCSSP; however, it may lead to a duplication of registries, particularly if no central body manages them. The proposed Article 13 is motivated by the proliferation of unlicensed copyrighted material on online platforms that generate revenue through advertising, thereby short-changing IPR owners and allegedly creating the value gap against which the IFPI has protested.[23]

The adoption and promulgation of Article 13 could render the safe harbour doctrine under the E-Commerce Directive meaningless, destroy the balance between the affected fundamental rights and freedoms, and reduce investment in new online services. It would also disrupt novel services developed by small and medium-sized enterprises, producing anticompetitive outcomes in favour of providers that already hold a strong market position and of organisations such as the IFPI.

Article 15 of the E-Commerce Directive provides that Member States shall not impose on providers, when providing the services covered by Articles 12, 13 and 14, a general obligation to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating illegal activity. This position sharply contrasts with Article 13 option 2 of the proposed Directive, which stipulates a notice-and-stay-down procedure involving the filtering of users’ information by OCSSPs.

In addition, general filtering systems intended to protect copyright challenge several rights[24] protected under the Charter of Fundamental Rights of the European Union. In SABAM v Netlog, the CJEU held that a general filtering requirement of the kind implied by Article 13, which would oblige a digital platform to install a complicated, costly, permanent computer system at its own expense, would violate Article 15 of the E-Commerce Directive.

Article 13, if passed, would force big data organisations such as YouTube, or any organisation falling within the meaning of a large data organisation, to install copyright filters for every upload, and would impose harsh penalties for infringing content that slips through. YouTube already runs Content ID, a notice-and-stay-down mechanism. This legislation would effectively move OCSSPs from a regime of liability to one of direct responsibility.

The court in SABAM v Netlog further reiterated that the use of a general filtering mode of the kind proposed by Article 13 would affect users’ right to the protection of their personal data guaranteed under Article 8 of the Charter of Fundamental Rights of the European Union, as it would involve the identification, systematic analysis and processing of information connected with the profiles created on an intermediary.[25] It is important to note that such a system might not always be able to distinguish unlawful from lawful content, eventually blocking lawful communications. If Deep Packet Inspection (DPI) is deployed for monitoring on an OCSSP and used on a typical social media user, privacy issues may result. This is because content recognition and filtering generally rely on fingerprinting and watermarking technology, which entails “real-time” monitoring and identification of unlawful material en route to the user, before blocking access.[26] However, the CJEU has been known to subordinate the right to privacy under Article 8 in cases involving banking secrecy,[27] and in specific instances may prioritise the right to IP over privacy.[28]

The Court in SABAM v Netlog also stated that it is not sufficient for a file exchange to be declared unlawful merely because the work concerned is protected by copyright. A transmission may be lawful depending on the application of statutory exceptions to copyright, which vary from one Member State to another. Moreover, in some Member States certain works fall within the public domain or may be posted online free of charge by the authors concerned.

In Netlog and Scarlet Extended, the CJEU held that filtering measures and obligations cannot attain a “fair balance” between fundamental rights and copyright. Such measures would undermine users’ freedom of expression[29] and would significantly curtail users’ freedom to receive and impart information. Automatic infringement-assessment systems might undermine users’ enjoyment of exceptions and limitations; Articles 8 and 11 of the Charter therefore also had to be considered in balancing the fundamental freedoms. Automated systems cannot replace the human judgment needed to flag a specific use as fair or as falling within the scope of an exception or limitation, and complexities regarding the rights in certain works may escape the discerning capacity of content recognition technologies.

From the above, it can be seen that the CJEU has already found that imposing general filtering obligations on OCSSPs can interfere with the fundamental rights of end-users, although the circumstances of each case determine the outcome. While the CJEU may change its mind, or be persuaded by the Directive’s attempts to safeguard the rights of end-users, the Court retains the option of invalidating the Directive if it finds it incompatible with other legislation, including the EU Charter of Fundamental Rights. This has happened before, for example in Schrems[30] and Digital Rights Ireland.[31]

Regarding the freedom to conduct a business, a general filtering obligation would also go against Article 3 of the IPR Enforcement Directive, which provides that the procedures and remedies necessary to ensure the enforcement of IPR shall not be unnecessarily complicated or costly and should avoid creating barriers to trade.[32]

Frosio explains that such technology, especially where its monitoring is all-encompassing and indefinite as Article 13 suggests, would be costly and unfeasible, especially for those OCSSPs that do not own any content recognition technology. The obligation would not be too burdensome on intermediaries which already have content recognition technologies, such as YouTube’s Content ID, but it would hurt upcoming platforms, widening the gap between established and emerging providers.

The guarantees introduced for freedom of expression are not enough to address the issues highlighted by the CJEU in the SABAM cases. In particular, it remains unclear how automated filtering systems could ever make freedom-of-expression assessments. According to several academics,[33] these systems are simply not smart enough to distinguish permissible quotations, parodies, remixes, mashups and the like from impermissible copying, in light of the fragmentation of national copyright legislation across Member States.

The protection of user data might also be under threat: the CJEU in Scarlet Extended found that a filtering system would collect users’ IP addresses, which are protected personal data[34] because an IP address can be used to identify a user.[35] Article 13 option 1 is neither clear nor precise, as all laws should be. “Large amounts of data” is not quantified, and no proper definition is given of the small organisations that fall outside the definition of OCSSPs. It is a general principle of European Union law that a law must be specific, in that it is clear and precise and its legal implications foreseeable.[36] The adoption of laws which will have legal effect in the European Union must have a proper legal basis, and legislation implementing European Union law must be worded so that it is easily understandable by those who are to obey it. It is therefore in the interest of justice that the legislature prevent situations of uncertainty; to fulfil their functions, proper limitations must be fixed in advance.[37]


The extent to which the implementation of Article 13 would be legitimate depends on its compatibility with the three parts of the Court of Justice of the EU’s non-cumulative test. Under the EU Charter, any interference with human rights must be “provided for by law”, pursue a legitimate aim, and be “necessary” and “proportionate”. When assessing whether Article 13 is compatible with this test, a number of principles must be applied. Specifically, Article 13 must: (i) be accessible; (ii) be foreseeable; (iii) comply with the rule of law; (iv) pursue a legitimate aim; (v) be a last-resort measure; (vi) be proportionate; and (vii) strike a fair balance between all fundamental rights at stake.

Another conflict arises because Article 13 does not distinguish between passive hosting service providers, as in Google France v Louis Vuitton,[38] and active hosting providers, as in L’Oréal v eBay.[39] In L’Oréal v eBay it was stated that the nature of the means used by an intermediary may bring it within or outside the safe harbour protection. In other words, Recital 38 profoundly impacts the concept of neutrality underpinning the E-Commerce Directive’s safe harbour regime and clashes with the distinction between passive and active hosting providers.

Obtaining authorisation from copyright holders is tricky, as no universal licensing model exists. The Directive nevertheless expects intermediaries to enter into agreements with right holders before their content can be allowed on their platforms. The possibility of licensing every work uploaded to the internet is slim, as collecting societies do not count every creator or IP right holder among their members.

The complaint and redress mechanism also lacks any inkling of fair balance and contradicts Article 47 of the EU Charter. While Google has noted that very few people contest the results of its Content ID tool, users who do dispute Content ID takedown measures win about 60%[40] of the contests. The proposal recommends an automated tool to filter infringing content, and the use of such a tool may lead to a plethora of false positives and negatives. Evaluating a proper balance between rights is further complicated by the existence of automated and opaque detection and removal systems which, although they have proven effective, have also produced consistent false positives and false negatives, particularly regarding works that fall under copyright exceptions.


Article 13 would entrench the economic positions of the current major platforms, make life difficult or impossible for smaller ones, and erode the distinctions between big platforms and the state, as they may come to operate in the same way. It would mandate content removals using filters that are guaranteed to misapply the law.[41]

In Google Brazil v Dafra, the court took the view that since Google created the “monster”, it should tame it, upholding a liability regime for alleged copyright infringement occurring on YouTube.[42] The Directive defines optimisation in Recital 37a as including the promotion, display, tagging, curating and sequencing of works. Until now, the position was that such acts did not give a platform an “active role” of the kind that would expose it to liability.[43]

In my opinion, the balance of rights between the stakeholders in question – intellectual property rights holders, those conducting a business, and users entitled to freedom of expression and the protection of personal data – was not struck; the Directive leans towards the right of the IPR holder while being too burdensome on the other parties. I believe the general monitoring prohibition of Article 15 of the E-Commerce Directive (ECD) must be respected, as must the limits set by the Charter of Fundamental Rights of the EU.

Although Article 13 only creates obligations for platforms rather than end-users, filtering will undoubtedly have a profound impact on consumers, who may still try to upload works to social media platforms; those uploads will never reach the platform if the filtering mechanisms identify them as infringing.[44] Rather than the responsibility-based approach being proposed at EU level, a negligence-based system could better protect users’ fundamental rights. As Van Eecke remarked, “the notice-and-take-down procedure is one of the essential mechanisms through which the eCommerce Directive achieves a balance between the interests of rightsholders, online intermediaries and users”.[45] Given this, I suggest several ideas for minimising the harm that Article 13 would do to users, creators and the internet:

The scope of application should be as narrow as possible, with the aim of a targeted application with specific timelines, frames and users.[46] While I acknowledge the efforts made to give right holders protection, legislators should ensure that any proposed legislation is tailor-made to address the specific IP addresses or terminals used by repeat copyright infringers.

A general liability exposure would make platforms liable for all works uploaded by their users.[47]

The agonising concern with the requirement to implement upload filters is that these filters are only capable of identifying works; they are simply not capable of determining whether the use of a work is infringing or whether it is allowed under an exception or limitation to copyright. Yet the Council and the Parliament have stated that any measures ratified must respect the rights users enjoy under exceptions and limitations. Given the current state of technology, this amounts to paying lip service to the Charter. The legislation should therefore include proper redress mechanisms that allow users to challenge unjustified blocking or removal of their uploads promptly.

Given that over-filtering is likely to occur, it is not right to place the burden of rectifying it on users, who can only act after their rights have been violated. To ensure that the measures respect user rights, platforms and rights holders must both face significant damages for unjustified content removal or blocking. The legislation currently fails this condition: given the current state of technology, meeting it would mean that platforms cannot rely on automated blocking or removal and will instead need to be fully licensed.

Article 13 is structured such that copyright enforcement and the safeguarding of fundamental user rights are left to private entities (rights holders and online platforms). Privatising enforcement and rule-setting in the hands of for-profit entities undermines the idea of a democratic digital media space. To ensure that rights holders and platforms do not abuse the measures, users and creators must have full transparency regarding any blocking or removal of content by platforms. In the interest of full transparency, measures should be based on publicly accessible repertoire information available to all platform operators. This ensures that rights holders can be held accountable for unjustified blocking or removal resulting from faulty repertoire claims, and that all platforms have access to the same repertoire information.

The failure to adequately address the rights of users and OCSSPs would result in legislation that may fail to achieve its objective and damage the EU internet ecosystem rather than strengthen it. It would further emphasise that the EU copyright framework protects the legacy business models of a few at the expense of the freedom of expression and innovation of the majority.

Article 13, once fully legitimised, could leave the freedom of expression, the freedom to conduct a business and the protection of personal data in a weaker position than the right to intellectual property.

[1] ‘Shaping The Digital Single Market’ (Digital Single Market, 2018) <> accessed 13 December 2018

[2] COM (2016)593

[3] European Commission, ‘Proposal for a Directive on copyright in the Digital Single Market’ COM(2016) 593 final (14 September 2016)

[4] ‘Shaping The Digital Single Market’ (Digital Single Market, 2018) <> accessed 13 December 2018

[5] Recital 37 European Commission, ‘Proposal for a Directive on copyright in the Digital Single Market’ COM(2016) 593 final (14 September 2016)

[6] Cashman M, ‘Valuation Gap – Aspen Grove Investments’ (Aspen Grove Investments, 2018) <> accessed 13 December 2018

[7] Digital Single Market Strategy.

[8] Article 52

[9] Case C-275/06 Promusicae (n 116).

[10] Case C-324/09 L’Oréal and Others (n 32), paras 139-141.

[11] Case C-324/09 L’Oréal and Others (n 32), para 141

[12] Case C-324/09 L’Oréal and Others (n 32), para 136 and Joined cases C‑222/05 to C‑225/05 van der Weerd and Others [2007] ECLI:EU:C:2007:318, para 28

[13] Case C-494/15 Tommy Hilfiger Licensing and Others (n 37).

[14] Resnikoff P, ’84 European Music Organizations Declare Their Support For Article 13′ (Digital Music News, 2018) <> accessed 13 December 2018

[15] ‘Youtube: New EU Copyright Law Could “Drastically Change The Internet” – Torrentfreak’ (TorrentFreak, 2018) <> accessed 13 December 2018

[16] ‘European Parliament Backs Copyright Changes’ (BBC News, 2018) <> accessed 10 December 2018.

[17] Online content sharing service provider

[18] Urban J Karaganis, ‘Notice And Takedown In Everyday Practice’ [2016] SSRN Electronic Journal

[19] 1998

[20] Article 12

[21] Article 13

[22] Article 14

[23] To buttress the value gap argument, the European Commission Impact Assessment revealed that rights holders are not always able to decide under what conditions rights holders can make their content available on these services and obtain fair remuneration for the use of their rights.

[24] Darren Meale, ‘(Case Comment) SABAM v Scarlet: Of Course Blanket Filtering of the Internet is Unlawful, But This Isn’t the End of the Story’ (2012) 37 Europ Intell Prop Rev 429, 432

[25] Scarlet Extended, paragraph 51

[26] Internet Society “Perspectives on Policy Responses to Online Copyright Infringement(2010).” <> accessed 13 December 2018.

[27] Case C‑580/13 Coty Germany [2015] ECLI:EU:C:2015:485.

[28] Dietrich Kamlah, ‘Banking secrecy does not have unlimited priority over the protection of intellectual property’ (2016) 11(1) Journal of Intellectual Property Law & Practice 61-63 <> accessed 13 December 2018.

[29] Netlog (n 5) § 55. See Charter of Fundamental Rights of the European Union, C326/391 (26 October 2012) art 8 and 11.

[30] Case C-362/14 Schrems v Data Protection Commissioner (ECJ, 6 October 2015),

[31] Digital Rights Ireland Ltd v Minister for Communications, Marine and Natural Resources and others [2014] All ER (EC) 775 at 776

[32] IPR Enforcement Directive (n 24) Article

[33] ‘The Copyright Directive: Misinformation And Independent Enquiry’ (, 2018) <> accessed 12 December 2018

[34] Article 2(a) of the Data Protection Directive; Article 4(1) of the GDPR an IP address is regarded as personal data

[35] Case C-70/10 Scarlet Extended (n 80), para 51.

[36] Kaczorowska A, European Union Law (Routledge-Cavendish 2009)

[37] Case T-22/02 Sumitomo Chemical, 6 October 2005, §82. The CJEU confirmed that limitation periods fulfil the function of ensuring legal certainty (Case C-367/09 SGS Belgium and Others, 28 October 2010, §68).

[38]  Case C-324/09 L’Oréal and Others (n 32), paras 139-141.

[39] C-324/09 L’Oréal v eBay International [2011] RPC 27 (2011)

[40] 2018 edition of How Google Fights Piracy

[41]Daphne Keller, ‘Counter-Notice Does Not Fix Over-Removal Of Online Speech’ (, 2018) <> accessed 1 December 2018.

[42] Justice Luis Felipe Salom Google Brazil v Dafra, Special Appeal No. 1306157/SP (Superior Court of Justice, Fourth Panel, 24 March 2014) <>.

[43] Case C-324/09 L’Oréal and Others (n 32).

[44] ‘The Copyright Directive: Misinformation And Independent Enquiry’ (, 2018) <> accessed 12 December 2018

[45] Patrick Van Eecke, ‘Online Service Providers and Liability: A Plea for a Balanced Approach’ (2011) 48(5) Common Market L Rev 1455, 1486-1487.

[46] Association C, ‘Article 13: Four Principles For Minimising Harm To Users, Creators And The Internet – International Communia Association’ (International Communia Association, 2018) <> accessed 12 December 2018

[47] This would include works that cannot be licensed by the platforms. From the platforms’ perspective, being held liable for infringements amounts to a blanket order, meaning that platforms will block all copyrighted works uploaded by their users for which they do not hold a licence. This would not benefit the parties involved and could severely limit the freedom of creative expression of millions of EU internet users. To prevent this, the liability of platforms must be limited to those works that they are able to license.
