
Let’s make 2019 the year of fair data

How could private individuals manage their own data? Who will guarantee that the platform economy giants do not misuse the data they have amassed? Who will take care of the ethical use of data in companies? What can you and I do to make 2019 the year of fairer data?


Laura Halenius

Project Director, Competitiveness through data


It has been described as the new lubricating oil that keeps the corporate wheels turning, or as the rainfall that makes everything flourish. Discussions on data tend to inspire speakers to use quite poetic metaphors. This is no wonder, since data has become so valuable that, for example, Andrew Ng, Baidu’s chief artificial intelligence scientist, says that companies sell products below their production cost in order to collect data. The voice-controlled Amazon Echo speaker serves as a good example of such a product.

Data is not only zeros and ones but increasingly often it is information about you, me and us. It is about what we believe in, how we behave, what we fear and what we dream about. Anyone with access to such data may define our future. Therefore, this year we really must discuss the issue and make major decisions about who has access to such data and under which conditions.

What should we take into account this year as regards the use of data and, first and foremost, what could you and I do to make 2019 the year of fairer data? I have compiled a list of three perspectives and a few concrete actions we should start with.

Let’s become aware of our personal rights when using digital services

Along with the growth of the data economy, concern over its risks is also growing. Facebook and several other major international platform companies were repeatedly accused last year of a lack of transparency and the misuse of data. Accordingly, questions related to data became one of the hottest topics of discussion last year. Even though 2018 brought data leaks and the misuse of data into the public debate, surprisingly few people changed their behaviour. In the middle of all these news reports, it is baffling how indifferent we are towards the loss of our privacy.

Europe is protecting its citizens in many ways. As individuals, we have plenty of opportunities to influence how our personal data is used. For example, the General Data Protection Regulation (GDPR) gives us new tools for controlling our personal data. However, we must first understand that the data shared through the services we use belongs to us. This year, I hope that we will understand that without us there would be no data. We create the data, and therefore we have the right to decide who will use it and under which rules.

Could 2019 finally be the year when the users begin to require that operators respect their personal rights? I believe that campaigns like #Deletefacebook are just the initial impulse for this kind of development, and this year we will see more concrete action, when users begin to vote with their feet and purposely seek services that treat their data fairly and transparently. I hope that in 2019 we will see several campaigns associated with the fair use of data prick the conscience of ordinary users on a large scale, in the same manner as, for example, the Meatless October (link in Finnish) campaign has brought vegetarianism into the public discussion in Finland.

Let’s make the use of fair data the topic of discussion of this election year

With several forthcoming elections, both in Finland and the EU, this election year will be one when many people address their concerns about Europe being left behind in the development of the data economy. Of the 200 leading digital companies in the world, only eight are European. Online shops, social media and cloud services in particular are under the control of American and Chinese companies (link in German).

Europe has made a major value judgement by protecting the privacy of its citizens with legislation like the GDPR. However, there is a growing concern about what will happen if international data giants keep the business in their own hands, and Europe fails to create an alternative model for the data economy. At the same time, the power of these giants to dictate the rules of the data economy will grow, and these rules will not necessarily be built upon European values.

In Europe, the availability and usability of data has become an obstacle to the development of new services. So far, it has been difficult, especially for small companies, to gain access to data. A new kind of platform economy model based on open ecosystems would significantly accelerate the growth and diversification of the data economy in Europe.

The EU is striving for a dynamic, safe, value-based and integrated digital Europe. Europe is none of these things yet. Even though the EU is a single economic area, data does not yet move freely across national borders. Now we need to get it moving, while ensuring that data is used safely and responsibly. One solution might be found with the help of Sitra’s IHAN project, which aims to create Europe-wide rules and solutions for a fair and human-driven data exchange.

The election year and Finland’s EU Presidency, to begin in July, will enable raising the debate on the data economy to a whole new level.

Let’s force the ethical use of data onto corporate and public-sector agendas

In companies, the use of data is increasing rapidly. At the same time, new kinds of ethical questions are arising as organisations introduce new solutions that use data. The ethics of artificial intelligence are already being discussed, and the IEEE, the world’s largest technology-sector standardisation association, is developing certifications for the ethical implementation of intelligent technologies.

What needs to be discussed more are the complex ethical questions concerning the huge volumes of data that systems using artificial intelligence will be collecting and sharing with each other in the future.

Systems collecting data have been built with the good intention of making people’s everyday lives easier. People also willingly submit their data to these services, because they benefit from using them in their daily lives. For example, by submitting location data, we have access to tailored route plans.

In the future, data will be used on a totally different scale to today.

For instance, the volume of data produced by different sensors and cameras in smart cities is already vast, and it only keeps growing. Such data opens up a totally new kind of perspective on people’s privacy. How should we react to this kind of collection and dissemination of data? In which situations would the use of such data be allowed? And when the collection of data on a totally new scale begins, are we still really of the opinion that we have nothing to hide as individuals?

A good example of software that raises both legal and ethical concerns is facial recognition. Facial recognition software can be used in many ways, such as for the identification of criminals (link in Finnish) and for making payments. But what if we take one step further and start connecting different types of data with the aim of, for example, rating people based on their behaviour? Even though this may seem like a far-fetched idea, China is already running trials of a social rating system for its citizens (link in Finnish). One could make a long list of different application areas for the use of data and their potential challenges.

However, the ultimate question is: What kind of a society do we want to build with these services?

In the past, as awareness of environmental impacts increased and companies made environmental reporting a part of their everyday activities, there was a lot of “greenwashing” – marketing a product or service as environmentally friendly on misleading grounds. Now would be the right time to include the ethical use of data as part of corporate responsibility and to raise the issue to senior management levels.

Many parties are currently working to give all of us an opportunity to have a say on the future of the data economy and on what will be done with the information collected on us. Every one of us may have a role to play in this development – everyone’s contribution is needed.
