Is "Licences for Europe" already falling apart? The case of text and data mining
It was just a few months ago that the EU Commission proudly released its (quite ambitious) plans to modernise EU copyright.
Following an orientation debate on content in the digital economy, at the end of 2012 the Commission agreed to pursue two parallel tracks of action.
Besides discussion of medium-term issues for decision-making in 2014, the Commission announced its intention to conduct a structured stakeholder dialogue [Merpel wonders whether this has any actual meaning or has just become something akin to mantra-sounding terms in the EU copyright debate, like "evidence-based", "sound economics" or "fit for purpose in the digital context"] to address six issues where rapid progress is needed: cross-border portability of content, user-generated content (UGC), text and data mining (TDM), private copy levies, access to audiovisual works and copyright licensing.
Shortly after this announcement and the release of a Communication from the Commission, the "Licences for Europe" initiative was launched last February.
An example of the drama caused by lack of cross-border portability: how can you stand the grief of missing MasterChef Italia when you are outside Italy?
“Licences for Europe” is intended to work as a forum in which relevant stakeholders, gathered into four working groups, can contribute to delivering rapid progress through practical industry-led solutions in certain areas. These include:
- Cross-border portability of services [this is a cause this Kat truly cares about, as probably do all those who travel across the EU but cannot access the services they subscribe to in a certain Member State outside its territory];
- UGC and licensing for small-scale users of protected works;
- Facilitating the deposit and online accessibility of films in the EU;
- Promoting efficient TDM for scientific purposes.
As often happens in these cases, shortly after the launch of the “Licences for Europe” initiative a certain feeling of dissatisfaction with how things were being conducted in Brussels began to grow, and the IPKat itself published an anonymous complaint.
A couple of days ago stakeholders representing the research sector, European technology SMEs, and open publishers announced their withdrawal from “Licences for Europe”, owing to disagreement over the overall approach to text and data mining. In their own words:
How data mining worked in the good old analogue world
"any meaningful engagement on the legal framework within which data driven innovation exists must, as a point of centrality, address the issue of limitations and exceptions. Having placed licensing as the central pillar of the discussion, the “Licences for Europe” Working Group has not made this focused evaluation possible. Instead, the dialogue on limitations and exceptions is only taking place through the refracted lens of licensing.
This incorrectly presupposes that additional relicensing of already licensed content (i.e. double licensing) – and by implication also licensing of the open internet – is the solution to the rapid adoption of TDM technology.
[...]
Urgent steps are now needed to remove existing legal, technological and skills barriers that prevent TDM technology from being adopted. In order to do this in a way that best serves the public interest in facilitating new medical discoveries, creating new jobs in a vibrant EU technology industry, and maximising the investment of public money in research and innovation, we believe the Commission needs to conduct a rigorous and comprehensive evaluation exercise of TDM, its potential applications and the conditions required to encourage its adoption. [Has this ever been done at the Commission level? Merpel has indeed been unable to find any relevant documents]"
While this Kat agrees that debate about TDM goes well beyond the topic of licensing, she wonders how things are going in the Working Group dealing with UGC, as this too is certainly not something confined to licensing alone.