An ethical approach to fundamental rights

If you believe the words sometimes attributed to Gandhi, law is codified ethics. But effective laws and ethical standards are guidelines accepted by the members of a society, and they require a social consensus. I believe that technology is changing, or at the very least influencing, our ethics, and this is a phenomenon we urgently need to address.

On 31 May this year, I wrote about the first EDPS-Ethics Advisory Group (EAG) workshop that was taking place that day as part of a broader discussion we were launching, both in the EU and globally, on the digital environment and its ethical implications.

I am delighted at the workshop’s success and the feedback we received on it. I know that the work of the Group will yield tangible results in due course.

For me the workshop was an occasion to listen, take note of a variety of views and consider the relationship between ethical reflection and the work of the EDPS. It was also an occasion for experts from the wider data protection community to become more familiar with the flagship project of my institution’s third mandate.

The workshop highlighted that, while the new EU General Data Protection Regulation (GDPR) is a landmark in human rights law, its adoption is all the more reason for us - data protection authorities such as the EDPS - to think about the effects of data protection in practice and to ask the difficult questions it raises.

The GDPR underlines the need for ethical reflection on the digital environment and fundamental rights: it reinforces the requirement for organisations to be accountable. What if accountability also implies a responsibility to take ethical considerations into account, as part of an organisation’s corporate social responsibility?

As much as the GDPR is designed to grapple with the realities of global, ubiquitous data in the internet era, laws generally address societal needs after innovations have been widely adopted.

While we must welcome innovation and the benefits it brings us, we must also remain committed to sustainable development, taking into account issues of inequality, human dignity and inclusiveness.

Emerging technologies inevitably have wide implications not only for security and ethics, but also for our definition of human dignity and the equality of individuals.

This is why the EAG was launched as part of the EDPS’ broader initiative to consider the ethical impact of the digital era, which challenges not only existing data protection principles but also our values and mores as a society.

At the very least, ethics can help to ensure that the GDPR has robust effects in practice. More effective still would be for individuals, organisations and society at large to take a broader approach, one that ensures technology does not dictate our values.

I believe we must collectively analyse how we implement data protection principles ethically and supplement them where necessary. In our digital world, data protection cannot be the sole responsibility of data protection authorities.

As part of its mandate to explore how to ensure the integrity of our values while embracing the benefits of new technologies, the EAG met in October to discuss how ethics can contribute to a data protection regime confronted by a digital world.

The debate raised many interesting questions.

Is compliance with the GDPR, or indeed any law supporting data protection or privacy, only about avoiding harm or fault?

Does compliance offer protection to the individuals the law is designed to protect or does it simply mitigate risks for organisations? The weighing up of harm and risk also involves an ethical assessment.

The Group’s discussions emphasised the importance of complying with the GDPR, but also of building on that compliance and considering what lies beyond it:

  • Effective enforcement is necessary to ensure the proper application of data protection principles.  What if companies are compliant with data protection rules but are not ethical?
  • The main actors in the online environment have the power to monitor, predict and influence individuals and their private lives. They also contribute to designing our public space and society. Given the breadth and depth of their reach, should they be accountable on a larger scale?
  • Because lawyers and engineers approach problems from different perspectives, their dialogue is often lost in translation. But is this gap between law and innovation really a viable excuse for big companies? There is a need to bridge the gap, and perhaps ethics could help to do so.

The balance of power between individuals and big business is tipped in favour of the internet giants, and holding fast to our values requires more energy and commitment today than it did before the onset of the digital age.

It is high time that technology developers and data processing actors were ethically motivated. I am confident that the EAG will contribute well-founded arguments to help us to define this motivation.

In light of this, the Group is working to identify the ethical responsibilities of online actors. The greatest challenge is to encourage long term, ethical analysis and prospective thinking towards technological innovation - a holistic approach, if you will, to the digital project.

I anticipate that the first interim report of the EAG, to be published next year, will be a fascinating read.

The report and videos of the first workshop on 31 May 2016 are available here.

A second EDPS-EAG workshop, with experts from the scientific research community, is being planned for spring 2017.

Big Brother, Big Data and Ethics

Privacy is dead, they say.

But of course it isn’t. It is well and truly alive. However much we share on social media, in reality we are still selective about what we do share. Even online, we find ways to secure, conceal or protect ourselves, whether through ad blockers, security settings, the dark web or other means. That is privacy.

While technology and the internet influence the way we behave, there is no evidence that they have diminished our values as a society. So much so that the EU continues to uphold our fundamental rights to privacy and data protection.

As technologies and personal data become ever more intertwined, the need for an ethical reflection on our fundamental rights, technology, markets and business models is long overdue.

So I'm proud that we're initiating a worldwide debate on how to ensure the integrity of our values while embracing the benefits of new technologies.

As part of this flagship project of my independent organisation's third mandate, the Ethics Advisory Group (EAG) was launched in January 2016. With the help of this group, we intend to identify a new ethical approach in the coming years so that individuals are no longer reduced to mere data subjects in the digital environment.

At the time of publishing this blogpost, the first EDPS-EAG workshop is underway. Experts from the data protection community have gathered with the distinguished members of the EAG and other eminent ethics experts to explore the main concerns of the wider data protection community. Today’s discussions will form valuable input for the continued work of the Group.

I am honoured that so many respected people were eager to take part in today's workshop. Our venerated moderator for the day is my predecessor as European Data Protection Supervisor, Peter Hustinx. 

Leading the breakout discussion sessions are four honourable members of the EAG: Peter Burgess, Luciano Floridi, Aurélie Pols and Jeroen van den Hoven. They will invite participants to examine four areas in which the rapidly changing digital landscape requires new thinking on ethics and data protection. The two other members of the Group, Antoinette Rouvroy and Jaron Lanier, will also play an active role in the debates.

1) What is digital ethics?

George Orwell warned against Big Brother. He didn’t realise at the time that we’d also need to pay attention to big data.

The internet has evolved such that the tracking of people’s behaviour has become routine for many intelligence agencies, not to mention an essential revenue stream for some of the most successful companies. I've said it before but it's worth emphasising: we are each more than the sum of our data and yet we are more defined by our quantified selves than ever.

Algorithms based on the data collected about us are not objective; they reflect choices about what is being measured, how, and by whom. The same can be said of their interpretation. Sometimes, decisions by state authorities or private companies are made only on the basis of what can be measured. But humans are unpredictable; we cannot be assessed by algorithm alone. Should the efficiency associated with technology override fairness, dignity and the common good? Should algorithms be subject to ethical critique? These are, obviously, rhetorical questions, but I hope that our project will guide us in dealing in practice with the ethical consequences of technology.

2) Human dignity in the digital age

Human dignity is the cornerstone of fundamental rights. We value privacy not to hide something but because the control of our personal information is central to our sense of self.

But in a society characterised by massive data sharing, do we need to revise our notions of human dignity, privacy and personal data?

While we accept that technology transforms the norms of human behaviour, it also blurs our ability to give free and, sometimes, informed consent. How do we reconcile this?

Can we introduce moral responsibility in the vacuum created between people and automated processes such as surveillance or data collection?

3) Technology as a driver and an actor

In our technology-driven society, it is easy to get excited by the frequent novelties introduced on the digital market. But should ethical considerations determine the direction of innovation?  Should human values play a part in the development of new technologies?  Remembering the human element in innovation was certainly the message of Stephen Hawking and the Future of Life Institute in their powerful letter of January 2015.  We ignore these luminaries at our peril.

4) Ethics and the law

In Europe, privacy and data protection are separate rights enshrined in law.
But is it enough that a practice affecting our privacy, our personal data or both is legal? What if the law were only the minimum standard? The new General Data Protection Regulation (GDPR) introduces the concept of accountability into EU law; to what extent should ethics play a part in this concept?

Big data is one example of how personal data is driving technologies and practices in the public and private sectors. These technologies raise profound questions not only about human rights but also about what it is to be human.

In its fourth recital the GDPR states, 'Data processing should be designed to serve mankind'. This, I argue, is an ethical sentiment to be shared around the world. With today's workshop we have begun the dissemination.

Agenda for the first EDPS-EAG Workshop.
A report from today’s meeting will be prepared and made public in due course. A second EDPS-EAG workshop, with experts from the research community, is being planned.