Digitalisation means better access and free data exchange. It also means that our personal details are becoming – for many tech companies – a valuable resource. What concerns you most in digital societies?
One of my biggest concerns is – although I would call myself an optimist through and through – that we fail to discuss our social, moral and legal guidelines, to formulate them clearly and to apply them under the new circumstances. For this, politics, business and society must work very closely together. This is far from happening sufficiently yet.
I am also concerned about society dividing into different groups of people: those who have understood digitalisation and use it for their own ends, those who do not want to deal with it and thus are indifferent to it, and those who are afraid of what is going on and what will come and try to fight it. We must prevent this social fragmentation.
We have to approach this problem from different angles. First, people must be equipped as early as possible with the necessary knowledge and understanding of the role data play today. They need to know exactly how, as media users, they have to behave in various situations and contexts.
Secondly, we need meaningful, effective but also practical consumer protection laws with which politics defines the legal framework. And finally, companies must adhere to a kind of code of conduct and internalise a clear sense of what is morally unacceptable.
Personalised prevention and treatment, access to medical data, Big Data analyses for research and virtual trials – digital health promises benefits for patients and healthcare in general. What can we learn from other industries concerning data misuse or data protection?
Healthcare data are particularly sensitive and highly personal. If we consider how serious the consequences of cases like Cambridge Analytica were, we can imagine how much greater the impact of such data misuse would be in fields like the health sector.
It is precisely in these areas that people’s trust is even more critical than in other areas of digitalisation. If this trust is lost because of abuse, it is almost impossible to regain it.
That is why we need a clear, well-thought-out and well-regulated consent system that enables patients to make the final decision on the use of their data and prevents them from losing control.
Here, too, we need clear legal rules, violations of which must be sanctioned not only in theory but also in practice, with full consequences.
From your experience, what issues should be on the top of the digitalisation agenda of healthcare in Europe right now?
I firmly believe that the desire and the ideas to create an innovative healthcare system have long been present in many places. What we need is a framework that enables us to harness this potential and put such ideas into practice in a way that people can trust them and see the benefits of technological progress.
To this end, science must be intensively involved. We need to ask: What is possible? Do these possibilities make sense, and will they benefit people?
The GDPR was the first step towards making personal data secure. But technologies like AI, IoT, Big Data, mobile apps, robotics etc. pose new challenges. Don't we need something like a European constitution for AI? Right now we are experimenting with innovations without a road map.
The idea of the GDPR is absolutely right, and it is good that it has been enacted. But not everything went perfectly, and there is still a lot for us to learn – for example when it comes to communicating and implementing regulations like this.
What also became apparent during the process of working out the GDPR is that a national legal framework is no longer sufficient. There is no doubt – we need global, or at least European, approaches.
But in addition to that, there are entirely different challenges: What, for example, do we do with data that we never expected to play a role in data protection? I am thinking of things like brain data. In fields like that, we need to clarify terminology as quickly as possible and develop a road map for the times to come. If we fail to do so, progress will overtake us in no time.
How can we ensure the ethical use of data in healthcare?
We need to know what is happening; we need to understand correlations and then assess possible consequences. We must conduct social discourses that are scientifically substantiated. It is not conspiracy theories but scientific knowledge that should guide us when we define moral, social and legal guidelines.
But we must also be aware that we cannot look into the future. The day after tomorrow, entirely new questions may arise that we do not even suspect today. We need to be honest about this, but at the same time optimistic, as technology in itself is not bad. I am sure that if we all work together, we will see the immense potential of the digital transformation. And if we seriously accompany the many developments with a generally positive attitude, then we can also meet the most significant challenges.
Philipp Otto is one of Germany's leading experts on digitalisation. He is head of the Innovation Office of the Federal Ministry for Family Affairs, Senior Citizens, Women and Youth. His work focusses on the strategic development of concepts and models for responding promptly and constructively to the challenges posed by digitalisation and the internet – for politicians as well as for public and private institutions. He was a Visiting Researcher at the Berkman Centre for Internet & Society at Harvard University.