Profiling: how algorithms predict and influence our needs
- The data we leave on websites is shared and sold, in particular to influence our online purchasing behaviour through targeted advertising.
- To target our needs more effectively, online platforms use high-performance algorithms to understand and predict our behaviour.
- The MOMENTOUS program aims to understand whether algorithms can exploit individuals’ psychological and cognitive traits to influence their behaviour.
- We lack data on advertising targeting YouTube channels aimed at children, leaving them more exposed to risk.
- More transparent access to data from online platforms is essential for effective regulatory action.
When discussing upcoming vacations or purchases with friends, have you ever wondered why highly targeted ads suddenly appear on your Facebook wall or Instagram feed? We sometimes feel like our electronic devices are watching us. This concern is not unfounded: the traces we leave online reveal valuable information about our lives, often without us being fully aware of it.
Indeed, on 14th February 2025, the Human Rights League filed a complaint in France against Apple for violation of privacy through the unauthorised collection of user data via the Siri voice assistant. This case raises the issue of the protection of our personal data, which has become a valuable resource coveted by companies.
Take cookies, for example, those elements that we are asked to accept before accessing websites. Behind their appetising name lie opportunities for companies to access our personal data. As Philippe Huneman (CNRS research director at the Institute for the History and Philosophy of Science and Technology) shows in his book Les sociétés du profilage (Profiling Societies), it is important to distinguish between “necessary” cookies, which ensure the proper functioning of a website, and “optional” cookies, which are intended to improve the user’s browsing experience or personalise “more relevant” advertisements for the user [1]. By accepting these cookies on a particular website, we consent to some of our online behaviour being observed. The data collected in this way often ends up with data brokers, companies that buy, collect and aggregate data from multiple websites and ultimately resell it. Among the best known is Acxiom in the United States.
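The cross-site tracking described above can be sketched in a few lines of Python. This is a toy model, not any broker’s actual system: the class, the site names and the identifiers are all invented for illustration.

```python
import uuid
from collections import defaultdict

class Broker:
    """Toy data broker: one cookie ID, many sites, one merged profile."""

    def __init__(self):
        self.profiles = defaultdict(list)  # cookie_id -> list of visited pages

    def issue_cookie(self):
        # Set once, e.g. the first time the user accepts "optional" cookies
        return str(uuid.uuid4())

    def record_visit(self, cookie_id, site, page):
        # The same embedded tracker reports back from every partner site
        self.profiles[cookie_id].append((site, page))

broker = Broker()
cookie = broker.issue_cookie()
broker.record_visit(cookie, "shoes.example", "/trail-runners")
broker.record_visit(cookie, "news.example", "/travel/rome")
broker.record_visit(cookie, "forum.example", "/threads/marathon-training")

# A single identifier is enough to aggregate an interest profile across
# otherwise unrelated websites
print(broker.profiles[cookie])
```

The point of the sketch is that no single site needs to know much: the shared identifier does the aggregation.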
Predicting and influencing our behaviour
But why share and sell our personal data? One of the main objectives is to influence our online purchasing behaviour through targeted advertising. As Oana Goga (INRIA research director at Ecole Polytechnique, IP Paris) points out: “In the field of online advertising, tracking [Editor’s note: monitoring users’ online behaviour on the web] is the basis of two targeting methods: the first is retargeting, a method that involves targeting Internet users who have already visited a website by displaying advertisements on other sites they visit. The other technique is profiling-based targeting, which involves creating a user profile.”
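The two targeting methods Oana Goga distinguishes can be contrasted in a minimal sketch; the site names and interest labels below are invented for illustration, not drawn from any real advertising system.

```python
def retarget(user, advertiser_site):
    """Retargeting: show the ad only if this user already visited the site."""
    return advertiser_site in user["visited_sites"]

def profile_target(user, campaign_interests):
    """Profiling-based targeting: match an inferred profile to a campaign."""
    return bool(user["interests"] & campaign_interests)

user = {
    "visited_sites": {"shoes.example"},
    "interests": {"running", "travel"},  # inferred from tracked behaviour
}

print(retarget(user, "shoes.example"))              # True: they visited the site
print(profile_target(user, {"travel", "cooking"}))  # True: interests overlap
```

Retargeting only needs a record of past visits; profiling-based targeting needs the richer inferred profile described in the next paragraph.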
The traces we leave on the web can therefore be collected to build a profile. This practice, known as “profiling”, is defined by the GDPR as “the automated processing of personal data to evaluate certain personal aspects relating to a natural person [2]”. It is used to analyse, predict or influence individuals’ behaviour, including through the use of algorithms. To illustrate this concept, let us take Oana Goga’s example of Facebook’s profiling-based targeting and how it has evolved: “In 2018, users were classified on Facebook into 250,000 categories by algorithms, based on their preferences on the platform. Today, this classification is no longer explicit. Algorithms no longer place users in categories so that advertisers can choose who they want to target but instead decide for advertisers who to send advertisements to.”

The algorithms used today to predict and influence our actions are extremely effective. They are said to be better than humans at understanding our behaviour and could even influence it. For example, research shows that computer models are much more effective and accurate than humans at performing an essential socio-cognitive task: personality assessment [3].
But this effectiveness raises several questions: to what extent can these algorithms know and predict our behaviour? And how do they work? To date, the answers remain unclear. Oana Goga says: “One of the big problems with recommendation algorithms is that they are difficult to audit because the data is private and belongs to companies.” Philippe Huneman adds: “Right now, we don’t know how algorithms use our data to predict our behaviour, but their models are getting better and better. Just as with generative AI, we don’t know how the data is put together. We have to choose: do we want a world where this software is effective, or ethical?”
The ethical issues surrounding profiling and algorithms
The ethical implications of these algorithms are fundamental. In 2016, the Cambridge Analytica scandal [4] highlighted the possibility of exploiting user data without consent for political purposes, in particular by developing software capable of targeting specific user profiles and influencing their votes, as in the case of Brexit and the election of Donald Trump. However, proving that these manoeuvres actually influenced the outcomes of these events remains difficult. A more recent case is the annulment of the Romanian presidential election by the country’s Constitutional Court in December 2024, following suspicions of an illegal support campaign on TikTok [5]. The algorithms used by platforms such as Facebook and X could also be more likely than others to reinforce echo chambers, i.e. to limit exposure to diverse perspectives and encourage the formation of groups of like-minded users, thereby reinforcing certain shared narratives [6].
In this context, Oana Goga and her team have been running the MOMENTOUS programme since 2022, funded by a European Research Council (ERC) grant. Its aim is to understand how algorithms can exploit psychological and cognitive traits to influence people’s preferences and behaviour. The programme offers a new measurement methodology based on randomised controlled trials on social media. As Oana Goga points out: “It is important to distinguish between algorithmic biases, such as algorithms that discriminate against certain populations, and cognitive biases, which are biases held by humans. With MOMENTOUS, we are looking at whether algorithms can exploit cognitive biases.” On a similar note, Philippe Huneman also mentions the concept of nudge in profiling: “Profiling conveys the idea of soft paternalism, or libertarian paternalism, which aims to influence an individual’s behaviour by acting on the biases that govern them. Advertisements and website interfaces exploit these biases to influence users’ decisions; this is nudging.”
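The randomised-controlled-trial logic behind a study of this kind can be sketched as follows. The data here are simulated and the click rates invented, purely to show how random assignment lets researchers estimate a causal effect; this is not the MOMENTOUS programme’s actual code.

```python
import random

random.seed(0)

def run_trial(n_users, base_rate, treatment_lift):
    """Randomly assign users, expose one group to a treatment (e.g. an ad
    designed to exploit a cognitive bias), and return the difference in
    observed click rates between the two groups."""
    treated, control = [], []
    for _ in range(n_users):
        group = random.choice(["treatment", "control"])  # random assignment
        rate = base_rate + (treatment_lift if group == "treatment" else 0.0)
        clicked = random.random() < rate
        (treated if group == "treatment" else control).append(clicked)
    return sum(treated) / len(treated) - sum(control) / len(control)

# With enough users, the estimated effect approaches the true lift (here 0.05)
print(round(run_trial(100_000, base_rate=0.10, treatment_lift=0.05), 3))
```

Because assignment is random, any systematic difference between the groups can be attributed to the treatment rather than to pre-existing differences between users.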

In addition, among the ethical issues raised by targeting, the foremost concerns children: “From a legal standpoint, we don’t have the right to target children based on profiling. However, our studies on YouTube have revealed that it is possible to target them contextually, for example by displaying advertisements on Peppa Pig videos or on influencer channels,” explains Oana Goga. “Although platforms prohibit targeting children under the age of 18, advertisers can still target the content children watch. The problem is that digital regulators are focusing on banning profiling-based targeting of children, but not contextual targeting, which takes into account content specifically aimed at them, even though these strategies are well known to advertisers. There is a lack of data on advertising that targets children’s channels, which raises the issue of the risks of radicalisation among young people,” adds the researcher.
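The regulatory gap the researcher describes can be summarised in a toy decision function. The channel names and policy rules below are illustrative assumptions, not any platform’s actual ad-serving logic.

```python
# Invented channel list for a hypothetical contextual campaign
CAMPAIGN_CHANNELS = {"peppa-pig-official", "kids-cartoons"}

def can_serve(ad_method, user_is_child, channel):
    if ad_method == "profiling":
        # The profiling route is blocked for minors, as regulation requires
        return not user_is_child
    if ad_method == "contextual":
        # The contextual route checks only the content, never the viewer
        return channel in CAMPAIGN_CHANNELS
    return False

# A child watching a children's channel is protected from profiling-based
# targeting, yet still receives the contextually targeted ad
print(can_serve("profiling", user_is_child=True, channel="peppa-pig-official"))
print(can_serve("contextual", user_is_child=True, channel="peppa-pig-official"))
```

The asymmetry between the two branches is the loophole: banning one targeting route leaves the other untouched.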
For more transparent access to online platform data
How can we move things forward? From a regulatory perspective, Oana Goga believes that one of the most pressing issues is ensuring more transparent access to online platform data: “Concrete measures must be taken to enable better access to data so that effective action can be taken. This could be done in two ways: i) through legislation; and ii) through citizen participation. It is essential to be able to collect data in an ethical manner that complies with the GDPR.”
With this in mind, Oana Goga has for several years been developing tools such as AdAnalyst and CheckMyNews for Meta and YouTube. Their goal is to collect data from volunteer users in order to study the content and sources of information those users receive on these networks, while respecting their privacy as much as possible: the tools do not collect people’s email addresses, the studies go through ethics committees, and everything complies with the GDPR. “It would also be interesting to have a panel of users at the European level. A platform observatory, with 1,000 to 2,000 users in France, Germany, etc., could provide access to data independently of the platforms,” she adds.
These questions go to the heart of our society and should be central to discussions on democracy in the coming years.