#4 Ethics of technology

Which ethical dilemmas are at play in today's data-driven world? And how can companies deal with them in a responsible way? In 4TU.techtalk #4, Karin Bos of Achmea and Philip Brey of the University of Twente discuss this topic.

Text: Nienke Beintema | Photography: Dieuwertje Bravenboer

In the 4TU.techtalk series, a 4TU professional discusses a topical subject with someone from another discipline or societal domain – exploring each other’s viewpoints and commonalities, the triumphs but also the dilemmas.

Avoiding the ethical pitfalls of a data-driven world

4TU.techtalks | Remarkable new technologies are developed every single day. From robotics to precision medicine and from IT to climate technology, many of the innovations are transforming the way we live and interact.

The advantages are manifold, but there are also risks. Insurance company Achmea wants to prepare for these developments in a responsible way and is supported in this process by 4TU.

“We want to exclude certain risks from insurance, and we want to work on prevention. But which data can we use for this? And how far can we go in this regard?”
Karin Bos, Achmea

Nobody can possibly object: new technologies must be sustainable, democratic and fair. They must ensure their users’ autonomy, security and privacy. But what do those values actually mean? Are they universal, or subject to interpretation? In fact, who decides what is allowed and which risks we accept in technological designs? And what about our standards and values themselves – to what extent are they influenced by technological developments?

These are all extremely complex questions that have no simple answers. Yet they need to be addressed when setting frameworks for new developments. Increasingly, it’s not only engineers and politicians who are looking into these questions, but also ethicists and philosophers. They are trained to look beyond the boundaries of disciplines, business sectors, cultures and generations.

“The world is completely different now compared to some 20 years ago,” says Philip Brey, professor of Philosophy and Ethics at the University of Twente. “Many developments are based on what we now call 'big data': enormous amounts of data. These have become very important for many organizations, who use them to extract all kinds of valuable information. At the same time, there are many complex dilemmas involved. Technological, but certainly also socio-ethical.”

“We are now transitioning to being a ‘digital and data-driven insurance company’,” says Karin Bos, Director Non-Life Insurance Private Individuals at Achmea. “We think it is very important that we don’t let ourselves be guided by the technological push, but that we remain in control of the implications. That is why we called in the help of the University of Twente in 2019.”


Who?

Karin Bos (1969) is Director Non-Life Insurance Private Individuals at insurance company Achmea. After studying accountancy, she first held various board positions at KPMG and ABN AMRO Bank. She then became financial director of the IT branch at Achmea, a position in which she says she developed a love for technology. This led her to initiate a transition towards ‘digital and data-driven insurance’.

Philip Brey (1966) studied philosophy and is currently professor of Philosophy and Ethics at the University of Twente. He is also president of the International Society for Ethics and Information Technology (INSEIT). From 2013 to 2018 he was scientific director of 4TU.Ethics (see text box). Since January 2020, he has been leading the NWO Gravitation Program Ethics of Socially Disruptive Technologies (see text box).

How do you collaborate?
Philip: “We are developing a 'Data Ethics' training course for the insurance industry. There is a high demand for that; not only at Achmea and other insurers, but also, for example, at the national Tax and Customs Administration. In this project, we are fortunate to be able to draw on the achievements of both the 4TU.Ethics Center and the NWO Gravitation Programme, in which the strongest researchers work together. Working with the insurance and tax professionals is a great way to let the academic developments find their way into practice.”

Karin: “In the transition within our company, we focus on four overarching themes: Data Science, Data and Ethics, Business IT and Security. For the ethics theme, we turn to Philip and his colleagues. That is really a subject that requires the involvement of experts.” 

Which ethical dilemmas are at play in the insurance industry?

Karin: “Our business model, in fact also our entire raison d'être, rests on predicting risks. The essence of insurance is that people bear a risk together, based on solidarity. That concept only works if you weigh the interests properly. You want to keep risks insurable. Thanks to increasing amounts of data, we can calculate and price the risks more and more accurately.

We want to exclude certain risks from insurance, and we want to work on prevention. But which data can we use for this? And how far can we go in this regard?”
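
To make that pricing mechanism concrete, here is a minimal, purely illustrative sketch in Python; all figures and segment names are invented for this example and are not Achmea's actual model. The pure premium of a group is simply its expected claim frequency times its average claim size, so the finer the data-driven segmentation, the further individual premiums drift apart from the pooled, solidarity-based price.

```python
# Illustrative only: invented figures, not Achmea's actual pricing model.
# Pure premium per segment = expected claim frequency x average claim severity.

segments = {
    # name: (claims per policy per year, average claim size in euros)
    "whole pool":        (0.05, 2000),
    "low-risk drivers":  (0.02, 1800),
    "high-risk drivers": (0.12, 2500),
}

for name, (frequency, severity) in segments.items():
    pure_premium = frequency * severity  # expected yearly claim cost
    print(f"{name:18s} -> pure premium ≈ €{pure_premium:.0f} per year")

# One pooled price spreads the risk over everyone (solidarity);
# finer data-driven segments pull the prices apart, which is exactly
# the tension described above.
```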

Can you give an example of this complexity?

Karin: “Take your car insurance, for instance. You pay a lower premium if you cause less damage. New technology is taking this further and further. If you use an app or a device in your car that demonstrates that you drive carefully, for example, you pay a lower premium. This arrangement requires you to share data with your insurer. That sounds logical, but we have to make sure that customers are not forced to always share all of their data for all kinds of insurance. Where do you draw the line? As soon as you start differentiating between customers, it clashes with the principle of sharing the risks together.”
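
As a purely hypothetical illustration of such a pay-how-you-drive arrangement (the scoring rule, the 20% discount cap and all figures below are invented for this sketch and do not describe an Achmea product): a driving score from an app translates into a bounded premium discount, and only for customers who have explicitly agreed to share the data.

```python
# Hypothetical sketch of a pay-how-you-drive discount; the score scale,
# discount formula and 20% cap are invented for illustration only.

def premium_with_driving_score(base_premium: float,
                               driving_score: float | None,
                               consented_to_sharing: bool) -> float:
    """Return the yearly premium, discounted by a capped amount
    if the customer consented to share telematics data."""
    if not consented_to_sharing or driving_score is None:
        return base_premium  # no data sharing: keep the pooled price
    discount = min(0.20, 0.20 * driving_score / 100)  # score 0-100, capped at 20%
    return base_premium * (1 - discount)

print(f"{premium_with_driving_score(600.0, 92, True):.2f}")    # careful driver: 489.60
print(f"{premium_with_driving_score(600.0, None, False):.2f}") # no data sharing: 600.00
```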

Philip: “The insurance industry is very interesting from an ethical point of view. The point is: how do you collectively deal with individual risks? The financial interests of the insured party may be considerable. Insurance in itself is a very nice addition to society, but it is important that it is fair. It must guarantee the autonomy, integrity and privacy of the customers. Suppose a company had full knowledge of its customers and knew exactly who was going to get sick and who would have accidents; it would then want to exclude all kinds of groups from insurance. That is not a desirable situation. How can a company take its social responsibility, but at the same time earn money? That is an interesting field of tension.”


“How can a company take its social responsibility, but at the same time earn money? That is an interesting field of tension.”
Philip Brey, University of Twente


Is this currently an issue in the insurance world?
Karin: “The more data we have, the more we can differentiate. Achmea was founded over two centuries ago, when a small group of farmers in Friesland shared the risk of a hay fire. The farmers knew each other, and they held each other accountable for their actions. They did their utmost to prevent a fire, but if a fire did break out on one of their properties, they would share the cost of the damage. Today, this solidarity principle is a lot less tangible. People sometimes feel less responsible for preventing damage. But things may also go wrong on the insurer’s side. Intensifying competition among car insurers, for instance because of price comparison sites, can cause premiums to plummet. A possible consequence is that these companies don’t earn enough money to be able to pay out in the event of damage. This may create a perverse incentive to look for loopholes in the contracts. In other words, you need to find the middle ground. You want to offer a product that is fair for both parties.”

Philip: “What Karin is demonstrating here is that ethical dilemmas are often also economic, social or organizational in nature.”

“If you search in a database, especially using artificial intelligence, you will always find something.”
Karin Bos

How do you deal with those complex dilemmas?
Philip: “In a very elementary way: by discussing them openly. By making your standards and values explicit. You can do this very systematically, and then work just as systematically towards a solution. In the end, it’s a question of recognizing, analyzing and discussing a dilemma, and then working together on a targeted solution that also addresses the economic, social and all those other aspects.”

Are there protocols for that?
Philip: “Yes, but they are different for every industry and every issue. In many industries you have ethical guidelines or codes that provide a framework of reference. Those are still fairly crude tools. In addition, there are methods for systematic ethical analysis, with specialist instruments per context.”

Karin: “Our industry has quite a few regulations already. Many of those have a legal basis. For example, before a new product can receive approval, we must demonstrate that we have made a balanced consideration of interests. There are tools for this: indicators you can use to weigh up the interests of all parties involved. And this is an evolving process: the more data we gather, the more important this will become.”

What is the added value of Achmea’s collaboration with the university?
Philip: “Over the years, we have acquired a great deal of knowledge about issues such as big data and artificial intelligence and how to deal with them. We try to incorporate this knowledge into our training courses for the insurance industry and the tax authorities. People often think of privacy as an issue, but that is by no means the only ethical aspect. Justice is also a very important one. How do you avoid treating certain groups unjustly? In the Netherlands this is now very clearly the case with the Childcare Supplement Affair, for example.”

Karin: “As an insurer, we’re also addressing the issue of fraud. Our computer systems look for patterns that may indicate fraud. But if you search in a database, especially using artificial intelligence, you will always find something. That is why there must always be a human control step in the process. And you have to keep an eye on the human dimension. For example, we have an Unlucky Client procedure for people who claim an above-average amount of damage. We look at each case individually: how is it possible that this person is claiming so much damage, and how can we help prevent this? This is a completely different approach from automatically assuming fraud.”
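
Karin's point that a search across a large database 'always finds something' is, at bottom, base-rate arithmetic: even a reasonably accurate detector, applied to millions of mostly honest customers, produces far more false alarms than real cases. A minimal sketch with invented numbers (not Achmea's) shows why the human control step matters.

```python
# Invented numbers, for illustration only: why automated fraud flags
# need a human review step.

customers = 1_000_000        # policies screened
fraud_rate = 0.001           # 0.1% of claims are actually fraudulent
true_positive_rate = 0.90    # detector catches 90% of real fraud
false_positive_rate = 0.02   # and wrongly flags 2% of honest customers

real_fraud  = customers * fraud_rate
true_flags  = real_fraud * true_positive_rate
false_flags = (customers - real_fraud) * false_positive_rate

precision = true_flags / (true_flags + false_flags)
print(f"flagged customers: {true_flags + false_flags:,.0f}")
print(f"of which actual fraud: {precision:.1%}")  # only ~4% of flags are real fraud
```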

Philip: “I’m chuckling here because I myself was a multi-claimer for a while. I had a few consecutive cases of bad luck with houses and furniture, and then I also suffered damage due to the Enschede fireworks disaster. Anyway. Excesses such as the Childcare Supplement Affair… Everything starts with awareness. People should systematically reflect on the question: what are we doing, and where can things go wrong?”

Philip, you bring knowledge to those companies. Do you also learn from them?
Philip: “Achmea is a very pleasant cooperation partner. They’ve already given the issues considerable thought, and collected many case studies. These are very useful for the company’s development, but also for our research. These cases give us unique insights into the issues at hand in the industry. We are constantly looking for this link with practice. We don't want to just produce beautiful theories, sitting in our ivory tower, but want to develop something that really benefits practice.”

Karin: “For us, the advantage is that we are constantly forced to look outward, including across the boundaries of our industry.”

Philip: “In addition, this collaboration allows us to investigate how our basic concepts are challenged by new technologies. What exactly do we mean when we say ‘human’, ‘natural’, ‘privacy’, or ‘justice’? New technology can change our understanding of those concepts. Privacy really means something completely different now than it did a hundred years ago. Similarly, you can ask yourself whether the concept of ‘risk’ is also changing.”

Karin: “We recently had an interesting discussion about this. Shouldn't we offer cyber insurance for consumers? A product that offers consumers assistance when all their data is suddenly out in the open?”

Philip: “That’s a very interesting question. Fifty years ago we were still an industrial society, now we’re an information society. There are very different risks involved. The risks are less and less material, more and more informational. This calls for an entirely new ethical point of view.”

Karin: “At the same time: running risks is a timeless phenomenon. The degree of risk is also a matter of perception. There are fewer burglaries and car accidents now than a few decades ago, but new risks have taken their place.”

“Context, motivation, and complex cause-and-effect relationships: big data cannot deal with those. People will continue to have that responsibility.”
Philip Brey

Still, the general trend is that, thanks to big data, you can predict risks with increasing precision.
Karin: “That's right. So the question is how far you want to take this. Behavioral profiling is allowed, but if you’re not careful, you’ll end up with groups that are uninsurable, based on your data analysis. Again, that is where the human dimension comes into play.”

Philip: “Context, motivation, and complex cause-and-effect relationships: big data cannot deal with those. People will continue to have that responsibility. At the same time: people can be just as bad as machines – as was sadly illustrated by the Childcare Supplement Affair.”

Karin: “It remains a question of the human scale and solidarity. And one more thing: this conversation has mostly been about the dark side of big data and the ethical dilemmas. But let’s not forget the many positive sides to this technological development. Thanks to big data, we can offer our customers tailor-made solutions and help prevent damage.”

4TU.Centre for Ethics and Technology

The 4TU.Centre for Ethics and Technology focuses on research, education and information exchange on the ethical aspects of technological developments. Research themes include sustainable ethics for future energy systems, accountability for and use of advanced medical images, and the ethics of biosecurity. Projects focus, for instance, on information and communication technology, biomedical, environmental and nanotechnology, industrial design, architecture and urban planning, and neuro and cognitive technology.

The center actively interacts with media, NGOs, policymakers, companies and the general public, among others, to promote awareness of ethical aspects of technology.


NWO Gravitation Program Ethics of Socially Disruptive Technologies

Top scientists in the field of ethics and philosophy were awarded an NWO Gravitation grant in 2019, amounting to 17.9 million euros for research into the ethical aspects of new technologies. The program, which will run for ten years, is a collaboration between researchers from the University of Twente, TU Delft, TU Eindhoven and Utrecht University. Wageningen University & Research, Leiden University and University Medical Center Utrecht also participate.

The researchers will revise ancient philosophical core concepts such as autonomy, justice and responsibility, which are challenged by technological developments. This should lead to an innovative view and a better understanding of the major changes that new technologies can bring about in areas such as artificial intelligence, synthetic biology and climate technology.