Social media: a new paradigm for public opinion

Are online recommendation algorithms polarising users’ views?

with Giordano De Marzo, Researcher in the Physics Department at Sapienza University
On January 24th, 2024 | 3 min reading time
Key takeaways
  • With the advent of online platforms, opinions are becoming polarised on many subjects.
  • Recommendation algorithms, which surface specific content likely to appeal to each user, are one of the main causes of this.
  • Using analytical and numerical techniques, researchers have simulated the evolution of user preferences according to algorithmic recommendations.
  • By identifying strategies that avoid filter bubbles, the study could help to develop less-polarising algorithms in the future while still maintaining user engagement.
  • This is another step towards creating a more balanced and inclusive online information ecosystem.

The number of people holding extreme views, on subjects such as politics, religion or climate change – to cite just three examples – has increased in recent years [1–3]. This "polarisation", as it is called, is dangerous, as it could potentially weaken democracy itself if allowed to spread unhindered. Online platforms such as social media play an important role in this context, but the mechanisms by which they foster polarisation are not yet fully understood.

"Recommendation algorithms profoundly shape our digital experience today, determining the films we watch or the songs we listen to," explains Giordano De Marzo. These algorithms are widely used by most of the websites we visit every day, the best-known examples being the "suggested for you" messages on Facebook, the "recommended items" on Amazon or Google's PageRank system. They are designed to give us easy access to the content most likely to interest us, and to maximise our engagement with the platform.


A team of researchers led by Giordano De Marzo, from the Department of Physics at Sapienza University in Rome, Italy, has studied how a collaborative user-to-user filtering algorithm affects the behaviour of a group of people repeatedly exposed to it. This type of recommendation algorithm is routinely used by online retail giants such as Amazon to identify new content, based on past activity, that will be of most interest to users. Using analytical and numerical techniques, the researchers were able to simulate how the users' content preferences change in response to algorithmic recommendations. Their analyses revealed three distinct regimes or "phases" in the user base's state, one of which traps people in so-called "filter bubbles".

These states depend on key factors such as the "strength" with which the algorithm recommends items that are liked by similar users, or that are popular overall. The study also identified strategies that allow an algorithm to provide personalised recommendations without creating filter bubbles. This could contribute to the development of less-polarising algorithms in the future.

Collaborative filtering

Collaborative filtering [4,5] is one of the best-known and most widely used recommendation algorithms. It relies on the principle that the past behaviour of users can be exploited to identify new content that they will enjoy the most. The downside is that these algorithms can lead to a feedback loop. This loop naturally tends to bias future choices, reducing the diversity of content available. It is a loop of this kind that leads to filter bubble effects, where users are not exposed to new or differing perspectives, but simply to news and content aligned with their existing beliefs. In short, these loops contribute to "polarisation". They are similar to "echo chambers", which have been more widely studied [6–8]. However, the difference is that bubbles are produced by algorithmically-biased recommendations on online platforms, rather than from interaction between like-minded users.
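As an illustration of the underlying principle (not the authors' code), a minimal user-to-user collaborative filter can be sketched in a few lines: score the items a user has not yet rated by the ratings of other users, weighted by how similar those users are. The rating matrix and all values below are invented for illustration.

```python
import numpy as np

# Toy user-item rating matrix (rows: users, columns: items); 0 = not rated.
# All values here are illustrative, not taken from the study.
R = np.array([
    [5, 4, 0, 1, 0],
    [4, 5, 1, 0, 0],
    [1, 0, 5, 4, 5],
    [0, 1, 4, 5, 4],
], dtype=float)

def cosine_similarity(u, v):
    """Cosine similarity between two rating vectors."""
    den = np.linalg.norm(u) * np.linalg.norm(v)
    return np.dot(u, v) / den if den else 0.0

def recommend(R, user, k=2):
    """Score unrated items for `user` by similarity-weighted votes of other users."""
    sims = np.array([cosine_similarity(R[user], R[other]) if other != user else 0.0
                     for other in range(R.shape[0])])
    scores = sims @ R                    # weight each user's ratings by similarity
    scores[R[user] > 0] = -np.inf        # never re-recommend already-rated items
    return np.argsort(scores)[::-1][:k]  # indices of the top-k items

print(recommend(R, user=0))  # items liked by the users most similar to user 0
```

The feedback loop described above arises when these recommendations in turn shape what the user rates next, so the filter keeps drawing on an ever-narrower pool of content.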

In this new study, published in Physical Review E, Giordano De Marzo and his colleagues found that, depending on two parameters, the strength of the similarity bias (a) and the strength of the popularity bias (b), a collaborative filtering system can exist in three different phases: disorder, consensus and polarisation. Furthermore, when both biases are sufficiently strong, the system forms polarised groups, leading to the "filter bubble" effect. Fortunately, this disadvantage can be avoided at the boundary between disorder and polarisation. Indeed, an algorithm at this boundary can provide meaningful recommendations without inducing opinion polarisation or trapping users in filter bubbles.
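The exact model is defined in the paper; purely as a toy illustration of how a tunable popularity bias can drive such a system from disorder towards consensus, one can run a minimal Monte Carlo simulation in which users repeatedly re-pick items with probability proportional to popularity raised to a bias exponent b (the similarity bias a is omitted here for brevity):

```python
import random
from collections import Counter

# Minimal sketch (not the paper's exact model): n_users repeatedly re-pick
# one of n_items with probability proportional to popularity**b.
# Weak bias keeps choices spread out ("disorder"); a strong bias lets one
# item take over the whole population ("consensus").
def simulate(n_users=200, n_items=10, b=1.0, steps=20000, seed=0):
    rng = random.Random(seed)
    choices = [rng.randrange(n_items) for _ in range(n_users)]
    for _ in range(steps):
        counts = Counter(choices)
        weights = [counts.get(i, 0) ** b + 1e-9 for i in range(n_items)]
        user = rng.randrange(n_users)
        choices[user] = rng.choices(range(n_items), weights=weights)[0]
    return max(Counter(choices).values()) / n_users  # share of the top item

print(simulate(b=0.5), simulate(b=3.0))
```

In this sketch the top item's final share serves as a crude order parameter: it stays small for weak bias and approaches one for strong bias, qualitatively mirroring the disorder-to-consensus transition the study characterises.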

"Our research provides a systematic approach to quantifying and analysing the impact of collaborative user-user filtering," explains Giordano De Marzo. "By employing a statistical physics approach, we were able to simulate and analyse how users' content preferences change in response to algorithmic recommendations."

The new method relies on a combination of mathematical modelling and computer simulations. "In particular, we have exploited techniques such as stochastic process theory, probability theory and Pólya urn models (a family of urn models that can be used to interpret many commonly-employed statistical models). On the computer side, we leveraged Monte Carlo simulations," explains Giordano De Marzo.
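The classic Pólya urn makes the rich-get-richer mechanism behind such feedback loops concrete: each time a ball is drawn it is returned together with an extra ball of the same colour, so early random fluctuations get reinforced, much as early clicks bias later recommendations. A minimal sketch (illustrative only, not the authors' model):

```python
import random

# Classic Pólya urn: start with one red and one blue ball; at every draw,
# put the ball back along with one more of the same colour.
def polya_urn(draws, seed):
    rng = random.Random(seed)
    red, blue = 1, 1
    for _ in range(draws):
        if rng.random() < red / (red + blue):
            red += 1
        else:
            blue += 1
    return red / (red + blue)  # final fraction of red balls

# Each run settles near a run-dependent limiting fraction; for this 1+1
# starting condition, that limit is uniformly distributed on (0, 1).
fractions = [polya_urn(1000, seed=s) for s in range(5)]
print([round(f, 2) for f in fractions])
```

The run-to-run variability is the point: which colour "wins" is decided by early chance events that the reinforcement then locks in, exactly the lock-in behaviour that, in a recommender system, produces a filter bubble.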

Towards more effective recommendation algorithms

These analyses could contribute to the development of new methodologies for designing effective recommendation algorithms, he adds. "By understanding the mechanisms that lead to 'filter bubbles', we can develop systems that favour a wide range of content, thereby mitigating the risks of polarisation while enhancing user engagement and content diversity. This is a significant step forward in creating a more balanced and inclusive online information ecosystem."

The researchers will now study the impact of interactions between users (as commonly observed in social networks) on recommendation algorithms. "Adding this parameter could considerably enrich our understanding of the interplay between social dynamics and algorithm-driven content distribution. This will provide a more holistic view of digital environments," explains Giordano De Marzo.

They will also study the role of link recommendation algorithms, that is, those that suggest people for us to connect with. Finally, they are currently exploring Large Language Models as a way to power more realistic simulations. "These simulations will be the ideal starting point for a more detailed understanding of online dynamics and recommendation algorithms," he concludes.

Isabelle Dumé
[1] "The partisan divide on political values grows even wider," https://www.pewresearch.org/politics/2017/10/05/the-partisan-divide-on-political-values-grows-even-wider/ (2017).
[2] Uthsav Chitra and Christopher Musco, "Analyzing the impact of filter bubbles on social network polarization," in Proceedings of the 13th International Conference on Web Search and Data Mining, WSDM '20 (Association for Computing Machinery, New York, NY, USA, 2020), pp. 115–123.
[3] Michael Maes and Lukas Bischofberger, "Will the personalization of online social networks foster opinion polarization?" Available at SSRN 2553436 (2015).
[4] Jonathan L. Herlocker, Joseph A. Konstan, and John Riedl, "Explaining collaborative filtering recommendations," in Proceedings of the 2000 ACM Conference on Computer Supported Cooperative Work (2000), pp. 241–250.
[5] Xiaoyuan Su and Taghi M. Khoshgoftaar, "A survey of collaborative filtering techniques," Advances in Artificial Intelligence 2009 (2009), 10.1155/2009/421425.
[6] Matteo Cinelli, Gianmarco De Francisci Morales, Alessandro Galeazzi, Walter Quattrociocchi, and Michele Starnini, "The echo chamber effect on social media," Proceedings of the National Academy of Sciences 118 (2021), 10.1073/pnas.2023301118.
[7] Wesley Cota, Silvio C. Ferreira, Romualdo Pastor-Satorras, and Michele Starnini, "Quantifying echo chamber effects in information spreading over political communication networks," EPJ Data Science 8, 35 (2019).
[8] Pablo Barberá, John T. Jost, Jonathan Nagler, Joshua A. Tucker, and Richard Bonneau, "Tweeting from left to right: Is online political communication more than an echo chamber?" Psychological Science 26, 1531–1542 (2015), PMID: 26297377.
