The definition of corporate responsibility is expanding again as the importance of data grows with new technologies and digitalisation. In recent years, data-related questions have been incorporated into the concept of corporate responsibility. Put simply, it is about how companies collect, store and use data in a responsible way from the point of view of individuals, other stakeholders and society as a whole.
In English-language academic research, the terms corporate digital responsibility and data ethics have been used to describe this phenomenon. The former is seen as an extension of traditional corporate social responsibility that takes into account questions relating to digitalisation and the data economy. The latter can be seen as a form of corporate social responsibility concerned with the ethical use of data.
The themes relating to data responsibility can also be classified using the ESG (environmental, social, governance) framework. There are many examples of this. The environmental perspective includes the problem of digital waste, meaning the energy consumed to store unused applications, files and other content. From a social perspective, one example is digital wellbeing, which refers to the physical, mental and social wellbeing of individuals in everyday lives dominated by digital devices and technologies. A third example is data management and privacy.
The different concepts are closely interlinked and illustrate how the sustainable use of data is increasingly becoming part of corporate responsibility.
Competition law as a guarantee of data responsibility
The European Commission has taken action to promote a fairer playing field in the data market. So far, the EU has been able to intervene retrospectively to address unfair business practices under competition rules and, to some extent, the EU General Data Protection Regulation (GDPR).
The Commission’s next step is to proactively prohibit certain types of corporate behaviour and address harmful market structures. In 2020, the Commission published its data strategy, aiming to build a data-driven European single market. The strategy has since been translated into five legislative proposals issued by the Commission: the Digital Markets Act, Digital Services Act, Data Act, Artificial Intelligence Act and Data Governance Act.
There is debate about expanding competition law, i.e. shifting its focus from an exclusively price-oriented approach to a broader spectrum that covers even some aspects of responsibility.
The Digital Markets Act, for example, aims to ensure that certain dominant platform companies, the so-called gatekeepers, such as Amazon and Google, operate in line with fair rules in the data market. The regulation imposes specific obligations on such gatekeepers, including on data availability and transferability.
A gatekeeper is not allowed to combine personal data from core platform services, such as social networking services, search engines and cloud services, with personal data from other services it provides or from third-party services. In this context, a national court in Germany already ruled in 2020 that Facebook violated the prevailing competition rules by combining user data collected from Instagram and WhatsApp, among others. The Digital Markets Act, for its part, requires gatekeepers to give their corporate users access to the data they generate when using the gatekeeper’s platform.
The Commission’s legislative proposals supplement existing competition law. This development is part of a broader debate on the functioning of markets and the objectives of competition policy.
The concept of consumer benefit, or perhaps more aptly consumer wellbeing, is also prompting debate. Consumer benefit is deemed to be a key objective of competition law, and it has traditionally referred to economic benefit, such as the low prices of goods and services. The concept of consumer wellbeing is also seen as encompassing qualitative or social elements, such as privacy and responsibility.
These concepts are central to the objectives of competition law, as they influence the indicators by which competition and the functioning of markets are assessed.
The rise of the metaverse raises new legal issues
Legislative proposals such as the Digital Markets Act are one way of attempting to promote the sustainable use of data. But compliance with legislative obligations is only the starting point for data responsibility: companies are expected not only to comply with regulation but also to take voluntary action to promote responsibility.
Ideally, regulation would encourage companies to use data voluntarily in an increasingly ethical way.
The increasing use of data and the new ways in which it is exploited will give rise to ever new legal questions. Decentralised models such as the metaverse and Web 3.0, where data is no longer just stored in databases managed by companies and organisations but in decentralised blockchains, create enormous opportunities but also challenges. They raise entirely new legal issues for which there are no ready answers or practices.
For example, the potential competition law challenges posed by the metaverse depend largely on how metaverse platforms are built and how they interact with each other. It is possible that gatekeepers will also emerge in metaverses if developments produce closed ecosystems that would restrict or prevent the movement of consumers from one virtual space to another.
It is therefore clear that data regulation and related practice will have to adapt in the coming years to changes such as decentralised networks.
Blockchain transparency is an asset
Blockchain technology has considerable potential in terms of data responsibility. The transparency and immutability of the blockchain create better conditions for transparent operations, and the data stored in the blockchain can make it easier to monitor compliance.
Decentralisation, i.e. storing information across numerous servers, also makes a blockchain more secure than centralised digital databases, which are more vulnerable to hackers.
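The immutability referred to above comes from each block carrying a hash of the previous one, so a single altered record invalidates every later link. The following is a minimal sketch in Python of that hash-linking idea only, not a real blockchain (no network, no consensus); the function names and sample records are invented for illustration.

```python
import hashlib
import json

def block_hash(data, prev_hash):
    # Hash the block's contents together with the previous block's hash,
    # so tampering with any record changes every subsequent hash.
    payload = json.dumps({"data": data, "prev_hash": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def append_block(chain, data):
    prev = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"data": data, "prev_hash": prev,
                  "hash": block_hash(data, prev)})
    return chain

def verify_chain(chain):
    # Recompute each hash and check the links; one edited record breaks
    # verification, which is what makes such a ledger auditable.
    prev = "0" * 64
    for block in chain:
        if block["prev_hash"] != prev:
            return False
        if block_hash(block["data"], block["prev_hash"]) != block["hash"]:
            return False
        prev = block["hash"]
    return True

chain = []
append_block(chain, {"event": "consent_given", "user": "u1"})
append_block(chain, {"event": "data_shared", "user": "u1"})
assert verify_chain(chain)

chain[0]["data"]["event"] = "consent_revoked"  # tamper with a stored record
assert not verify_chain(chain)
```

This is why a shared ledger can support compliance monitoring: auditors can recompute the hashes themselves instead of trusting the record keeper.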
From the corporate point of view, data responsibility will in practice mean new business obligations but also new opportunities. Companies’ operating models will have to reflect ethical data use both internally and externally, for example through clear rules on who has the right to access which data.
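One way to make such access rules concrete is a default-deny policy table that states which roles may access which categories of data. The sketch below is purely illustrative: the role names and data categories are hypothetical, and a real implementation would sit inside an identity and access management system rather than a dictionary.

```python
# Hypothetical policy: each role is mapped to the data categories it may
# access. Anything not listed is denied by default.
ACCESS_POLICY = {
    "analyst": {"aggregated_usage"},
    "support": {"contact_details"},
    "dpo": {"aggregated_usage", "contact_details", "consent_records"},
}

def may_access(role, data_category):
    # Default-deny: access is granted only if the policy explicitly allows it.
    return data_category in ACCESS_POLICY.get(role, set())

assert may_access("dpo", "consent_records")
assert not may_access("analyst", "contact_details")
```

Writing the policy down in one auditable place, rather than scattering checks through the codebase, is itself a step towards the internal clarity the text calls for.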
But data responsibility does not only entail new obligations and an increasing number of standards. Above all, it also creates opportunities, by recognising the business and innovation potential opened up by the data economy as it continues to grow in importance.
In the future, data responsibility will increasingly cut across the entire enterprise structure. Successful companies are already adopting the changes brought about by the fair data economy as part of profitable and responsible business.