The European Union is a political and economic union of 28 member states, each subject to the obligations and privileges of membership. Member countries are characterised by their ongoing efforts to create a common European area with an internal market that promotes scientific and technological development.

What is more, the European Union and the United States are tied not only by the world’s largest bilateral trade relationship, but also by mutual investments, which form part of the most integrated economic relationship in the world. Developments within the EU are therefore seen as important building blocks of this relationship. One of the most recent developments concerns the European Union’s copyright reform, which has been underway for the past couple of years.

Apart from the ancillary copyright for press publishers introduced in Germany in 2013, Article 17 (formerly Article 13) has also raised a few eyebrows, as it obliges internet platforms to use so-called upload filters. While supporters of the clause see this as an essential technological development that ensures copyright is respected for films, music, and texts, its opponents fear that it will weaken internet culture and threaten the right to freedom of expression, with unforeseeable consequences.

We must therefore ask ourselves what upload filters are all about, how they work, where they have already been implemented, and, most importantly, why they cause such heated debate.

The current status: The EU has decided that upload filters will come

Despite much protest, on 26 March 2019 the European Parliament voted to introduce the copyright reform. Shortly before the vote, opponents of the reform tried to sway opinion by holding numerous public protests: demonstrations took place around Europe the weekend before the vote, and in Germany alone, more than 100,000 people took to the streets. Wikipedia Germany also shut down its encyclopedia for a day in a sensational protest: visitors trying to access the site were redirected to an information page about the protests. In the end, these actions were fruitless: 384 MEPs voted in favour, 274 against, and 36 abstained.

Article 17, which deals with content filtering, was formerly known as Article 13 and is still widely referred to by that label. The directive does not explicitly mandate upload filters, but its wording leaves few other options. Platform operators are obliged to check content for copyright infringements before publishing videos, music, or pictures; otherwise, the operators are liable for those infringements. Theoretically, it would also be conceivable to check each upload by hand, but critics consider this an unrealistic option, especially for larger providers like YouTube.

Exceptions apply to online encyclopedias (notably Wikipedia) and other educational offerings, platforms for the development of open source software, and services that have been available for less than three years or that generate less than €10 million (approx. £8.5 million) in revenue per year.

Time will tell what providers like Google and Facebook will do. First, the European Council needs to approve the reform, but this is more or less a formality and is due to take place at the beginning of April. The directive will then have to be transposed into national law: EU member states will have two years to incorporate the reform into their respective national laws.

EU copyright reform: upload filters not mentioned explicitly

Upload filters have long been discussed at the European level, as they could play a role in the digital single market under copyright law. In July 2018, the European Parliament rejected a corresponding draft law. On 12 September 2018, a new version of the draft was put to the vote, stipulating that the obligation to check content for copyright infringement would apply to large sites, while smaller ones would be spared. Online encyclopaedias such as Wikipedia are also to be exempted from the obligation to check content. The European Commission has also presented another directive in which upload filters play a key role: as part of the fight against terrorism, internet platforms are to be forced to examine all content for terrorist propaganda. That regulation, however, contains no exception for small website operators or open source offerings, meaning the upload filter requirement would apply across the board.

In the context of upload filters, Article 17, formerly Article 13, is of particular interest, even though upload filters are not explicitly mentioned in it. The European Parliament does not tell operators of online platforms how to ensure copyright protection. But critics and observers assume that there is no other option: according to the draft, platforms must check their content for copyright infringements before publishing anything. Due to the enormous volume of data, this is practically impossible without automated upload filters.

In a vote, the European Parliament adopted the bill with 438 votes in favour, 226 against, and 39 abstentions.

What is upload filtering?

Upload filters are automated computer programs that scan data either while it is being uploaded or before it is published on a platform, and check it against certain criteria. When content does not conform to the previously defined criteria, there are three possible outcomes: the content is blocked, the user is prevented from uploading it, or the content is modified so that it conforms. Upload filters can be installed on individual sites and apps, or implemented by web hosts or by the user’s internet provider. They can be used for the following purposes:

  • Prevention of extremist and criminal content
  • Limitation of false reports, insults, and cyberbullying
  • Filtering of pornographic or violent content
  • Identification of copyrighted material
  • Censorship (in the event of misuse)
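The three possible outcomes described above can be sketched as a simple decision routine. This is a minimal illustration only; all names here (`filter_upload`, `Action`, the example criterion) are hypothetical and not part of any real filtering system.

```python
from enum import Enum, auto

class Action(Enum):
    BLOCK = auto()    # content is blocked after upload
    REJECT = auto()   # the upload itself is refused
    MODIFY = auto()   # content is altered until it conforms
    ALLOW = auto()    # content passes the filter

def filter_upload(content: bytes, violates_criteria, policy: Action) -> Action:
    """Return the platform's configured action if the content matches the criteria."""
    if violates_criteria(content):
        return policy
    return Action.ALLOW

# Usage: a filter configured to refuse uploads containing a banned phrase.
print(filter_upload(b"clip with banned phrase", lambda c: b"banned" in c, Action.REJECT))
# → Action.REJECT
```

Which of the three actions applies is a policy choice made by the platform operator, not something the detection step itself decides.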

It is the last point that has recently triggered heated debates about upload filters among members of the European Parliament in the context of copyright law.

How do upload filters work?

Upload filters require two essential components, one of them being a database of impermissible data stored in the form of hash values. In the case of the current EU initiative, such a database would contain copyright-protected material.

Fact

Hash values, familiar from their use in storing passwords, are strings of letters and numbers generated from source material by mathematical functions. The same source material always produces the same hash value, but it is not possible to deduce the source material from a hash value.
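This determinism is easy to demonstrate with Python’s standard `hashlib` module (SHA-256 here is just one common choice of hash function, used for illustration):

```python
import hashlib

h1 = hashlib.sha256(b"original work").hexdigest()
h2 = hashlib.sha256(b"original work").hexdigest()
h3 = hashlib.sha256(b"original work!").hexdigest()  # one character appended

assert h1 == h2   # identical source material, identical hash value
assert h1 != h3   # even a tiny change yields a completely different hash
print(h1[:16])    # a short prefix of the 64-character hexadecimal digest
```

The one-way property mentioned above means there is no practical way to reverse `h1` back into the bytes `b"original work"`.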

An algorithm compares the hash values of copyrighted material with those of the uploaded data; a match prevents the file from being uploaded. However, upload filters are not triggered only by completely identical or very similar files. With the aid of various machine learning methods, they can also recognise individual components of images, videos, song excerpts, or texts, and can even model the underlying works to a certain extent. Take cat images as an example: from a database of cat images, algorithms are able to learn what a cat looks like and subsequently recognise new cat images that are not yet stored in any database.
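The exact-match part of this comparison can be sketched as a set lookup over hash values. This is a deliberately simplified assumption: a cryptographic hash such as SHA-256 only catches byte-identical copies, which is exactly why real filters fall back on perceptual fingerprints and machine learning for modified material.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Exact fingerprint of a file; any modification changes it completely."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of hashes of copyright-protected works.
blocked_hashes = {fingerprint(b"protected song"), fingerprint(b"protected film")}

def is_blocked(upload: bytes) -> bool:
    """Block the upload if its hash matches a protected work exactly."""
    return fingerprint(upload) in blocked_hashes

print(is_blocked(b"protected song"))   # → True
print(is_blocked(b"new cat video"))    # → False
```

Note that `is_blocked(b"protected song!")` would return False: a single changed byte defeats the exact-match check, illustrating why the similarity-based methods described above are needed.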

Where have upload filters already been implemented?

An EU-wide obligation to use upload filters would be a far-reaching step. However, in order to check the large amounts of data uploaded to various platforms on a daily basis, large internet companies have already been using this technology for a long time.

YouTube

YouTube’s upload filter, Content ID, checks all newly uploaded videos for copyright infringements. When content that violates the relevant regulations is detected, copyright owners can act in three different ways:

  • Block the video so that it can no longer be accessed
  • Earn revenue from advertisements shown before the video starts
  • Stay informed about view counts and other corresponding statistics

In particular, this method is intended to prevent the unauthorised distribution of films, shows, songs, and music videos. According to YouTube, the algorithm in question does the work of 180,000 human inspectors.

Facebook

The world’s largest social network, Facebook, predominantly uses upload filters to spot violent, offensive, or age-inappropriate posts, images, and videos before they are published. To combat terrorist and extremist content, Facebook, Twitter, Microsoft, and YouTube rely on a shared database operated in cooperation with Europol, the European police agency.

Microsoft OneDrive

When individual files are uploaded to its cloud, the file hosting service runs an automated analysis called PhotoDNA, which is predominantly used to combat child pornography. In this way, in 2015 for example, the German Federal Criminal Police Office was able to identify a paedophile with the aid of evidence provided by Microsoft.

ResearchGate

At the request of various publishers, this social network for scientific publications was forced to introduce upload filters to identify unauthorised secondary publications and plagiarism. The algorithm decides whether publications are made available only to certain research groups or deleted entirely.

Criticism of upload filters

Although using upload filters to fight child pornography, extremism, and copyright infringement initially sounds like a worthy cause, the filters also pose considerable risks, which opponents of the new EU copyright law incessantly point out.

Susceptibility to errors and manipulation

Upload filters are relatively easy to trick when smuggling copyrighted material past blocking filters. What seems even more worrying, however, is that computer programs often censor permissible content. Algorithms fail to recognise parodies, remixes, or tributes, which are generally permitted under copyright law. Critics therefore speak not only of restrictions on artistic freedom, but also of an end to ‘meme culture’. This internet phenomenon is often based on recontextualising copyright-protected images, videos, and songs, and sometimes even redistributes content that has been changed in its entirety.

It is also possible to fraudulently claim ownership of material and register it in a database. Content not actually protected by copyright could then not be spread until the true owner was determined.

Possibility of censorship

Upload filters simultaneously provide governments with a means of pre-censoring and controlling information. If misused, upload filters could restrict the right to freedom of expression and of the press. For instance, if databases were filled not with copyrighted material but with unpopular statements and other forms of criticism directed at the state, such views could no longer be freely voiced on the internet. The implementation of such technology gives a taste of what China’s nationwide content-filtering policy feels like.

What is the current debate on upload filters actually all about?

The redesign of copyright law in the European Union has considerably heightened the public’s awareness of upload filters. Copyright holders such as publishers, film distributors, and the music industry demand improved protection of their copyrighted works on digital distribution channels and urge the prevention of unauthorised data disclosure (already taking place on platforms such as YouTube).

On the other side of the fence, however, are various associations, internet activists, civil rights campaigners, Wikipedia operators, and critically minded politicians from various parties. Although they are in favour of the aims of the proposed act and support the protection of intellectual property, they point out that upload filters are not the right way to achieve them. They believe that the filters in question go far beyond their main objective, are not yet fully developed, and would pose a threat to freedom of expression.

