Social media algorithms shape the information environment of young Europeans in ways they cannot see or control. As a result, they are exposed to distorted, emotionally charged and sensationalist political information, which can undermine democratic participation.

These findings are from the Finnish Innovation Fund Sitra’s new study Algorithms and democracy: How social media shapes young Europeans’ worldviews, conducted in collaboration with the Behavioural Insights Team (BIT) and Bondata. 

The study also shows that half of young European adults feel disappointment, fear, anger, or sadness when encountering political and societal discussions on social media. The result was similar in all the countries studied: Finland, France and Romania. 

Social media platforms have rapidly become central sources of information and key arenas for civic discourse in the digital age. 

“However, platforms are not neutral intermediaries of information. Through opaque algorithms, they steer public debate, people’s behaviour and emotions,” says Kristo Lehtonen, Director of International Programmes at Sitra. 

Sitra seeks to protect and renew European democracy. To this end, it sought to examine how platform algorithms serve political content to young Europeans and to propose solutions to make the digital public sphere safer for democracy and for users.

Understanding digital power

The Algorithms and democracy study continues Sitra’s earlier work on understanding digital power – that is, forms of power based on data and digital technologies. Sitra’s Digipower investigation, published in 2022, showed that the resilience of the European economy and European democracy is threatened by the lack of transparency in the data economy and the concentration of digital power in the hands of a few.


The Algorithms and democracy study also builds on Sitra’s earlier work to advance democratic innovations.

Political bias and sensationalism 

The study consists of two complementary research components. 

  • The first component was a platform audit conducted by the global research consultancy the Behavioural Insights Team (BIT), widely recognised for its expertise in behavioural research. For the study, BIT created 18–24-year-old avatars, or virtual personas, on TikTok, Instagram and X, and examined what kinds of political content the algorithms recommended to them in Finland, Romania and France. During the tests, the avatars encountered a total of 1,719 political posts on social media, which the researchers then classified. 
  • The second component was a survey conducted by the Finnish research company Bondata among 18–29-year-olds living in the same countries. The survey examined, among other things, what kinds of emotions social media content evokes in young European adults. 

During the research period, BIT’s avatars encountered, on average, substantially more right-wing content than left-wing or centrist content on social media platforms. This pattern persisted even when the avatars expressed interest in left-wing politics. Romanian feeds were an exception: they were largely dominated by centrist content, particularly government communications. Of all 1,719 political posts encountered by the avatars, 58 per cent were right-wing, 26 per cent were left-wing and 16 per cent centrist. 
 
The results also point to the ongoing deterioration of social media quality, sometimes referred to as ‘enshittification’, as platforms shift from prioritising user experience to maximising engagement and monetisation. As much as 67 per cent of all political content encountered by the avatars was opinion-based, entertainment-oriented or unverifiable in nature. Much of the content was sensationalist, polarising and often promoted extremist views. Examples included AI-generated videos of gorillas telling misogynistic and xenophobic jokes, as well as memes expressing support for Nazi ideology. 

“Such content does not violate platform rules and cannot be fact-checked. However, when this type of political content becomes dominant on social media, it creates an environment in which constructive civic discussion is difficult,” says Ilkka Räsänen, Project Lead of the Algorithms and Democracy project and Head of EU Affairs at Sitra. 

Image: One in three young adults in Finland, France and Romania report regularly or repeatedly encountering misinformation, hate speech, hostile speech or conspiracy theories on social media. Source: Bondata 2025


In Bondata’s survey, more than one third of young adults in Finland, France and Romania reported encountering misinformation, conspiracy theories, hate speech or hostile speech regularly or repeatedly on social media. Half of the respondents said they feel frustration, anger, fear, or sadness when following political discussions on social media. 

Towards a safer digital environment – 7 recommendations

Sitra’s Algorithms and democracy study offers seven recommendations for policymakers, authorities, educators and social media platforms to make the digital public sphere healthier and safer for democracy and users. 

  1. In line with the requirements of the Digital Services Act (DSA), platforms should disclose the main ranking parameters in plain language, offer adjustable settings, and provide a non-profiling feed option. 
  2. The EU should ensure independent, long-term systemic risk auditing. In particular, sustained monitoring is needed to track ideological amplification, exposure to problematic content, and the longer-term emotional and behavioural impacts on users.  
  3. The EU should require very large online platforms (VLOPs) to adopt protective defaults, such as reduced autoplay, clear content controls, and simple tools to adjust recommendation settings.  
  4. Democratic resilience must be strengthened through digital information literacy and the use of civic tech platforms. The EU should also incorporate epistemic rights into digital governance frameworks, ensuring citizens have access to accurate information and can understand how AI systems affecting public life are developed and used.
  5. The EU and Member States should coordinate DSA and AI Act enforcement through strong cross-border cooperation, require clear labelling and traceability of AI-generated political content, and build the technical capacity to assess recommender systems, audit algorithmic risks, and verify platform compliance. 
  6. The EU should strengthen user mobility and digital self-determination by expanding data portability beyond personal data, developing privacy-preserving standards for optional reputation portability, and explicitly recognising protection from manipulative design as a democratic right.  
  7. The EU and Member States should consider raising and effectively enforcing minimum age limits for full-feature social media access, preferably through coordinated action at the EU level. 

Read the study: Algorithms and democracy – how social media shapes young Europeans’ worldviews

Further information

Ilkka Räsänen

Head of EU Affairs, Sitra International Programmes
