The digital revolution: at humanity's expense?

“Digital platforms have poor control over their manipulation of emotions”

On June 8th, 2021 | 4 min reading time
Camille Alloing
Professor of Public Relations at the Université du Québec à Montréal
Key takeaways
  • The Cambridge Analytica scandal and other cases have recently alerted citizens to the possibility of social networks manipulating votes by playing on their emotions.
  • But for researcher Camille Alloing, social networks knowingly overestimate their capacity for manipulation in order to sell advertising space.
  • Platforms such as Facebook have likewise conducted psychological experiments on the emotions of hundreds of thousands of their users...
  • ...without their knowledge, and based on a caricatured and unreliable conception of emotion, with the sole aim of lending credibility to their hypothetical capacity for manipulation.

It is said that one of the best ways to manipulate social media users is to make them feel scared or empathetic. The alleged role of Cambridge Analytica and Facebook in the election of Donald Trump seems to prove it. However, UQÀM communications and information science researcher Camille Alloing suggests that the power social media holds over our emotions needs to be taken with a pinch of salt.

Could you explain what “affective capitalism” is?

Simply put, it is the part of capitalism that exploits our ability to be moved (and to move others) to generate value; something we particularly see on social media. However, it’s worth looking a little closer at the word “affect”. It is a term that can be used to refer to any emotion, but the important part is how it “sets us in motion,” meaning what causes us to take action.

When I “like” a post, I am affected. Unlike emotions (which remain difficult to analyse due to their subjective and unconscious nature), affective consequences can be identified (you can know that a video affected me because I pressed the “like” button). So, although we cannot ascertain whether digital platforms actually succeed in provoking emotions in users, we can analyse how users behave.

Given that most social media revenue comes from selling ad space, the goal of these platforms is to increase the time users spend on them, and therefore the number of ads viewed. To that end, affect is undeniably extremely useful – by generating empathy, more reactions are prompted, and content is shared more.

I have found that individuals are now part of structures that can affect them (and therefore make them feel emotions and make them act) although they cannot affect those structures in return. If I post something and I’m expecting a response from my friends, I am alienated, because I will only get that response if Facebook chooses (for reasons beyond my control) to share my post in my friends’ feeds.

You say that “affect is a powerful tool”. Is it the power to manipulate people through their emotions?

If I said that affecting a person meant you could successfully manipulate them, I would be in agreement with the arguments the platforms are putting out. Facebook, for example, has every reason to let people think that its algorithm is able to control users, because it helps them to sell ad space. In this way, the Cambridge Analytica scandal [in which this company attempted to manipulate Facebook users to influence American swing voters in the 2016 US presidential election in favour of Donald Trump] provided incredible publicity for Facebook among its advertisers, who saw it as an opportunity to drastically increase their sales by manipulating users!

However, the role of social media in Trump’s election must be put in perspective, and we should be careful not to trust oversimplified explanations. Even though Facebook boasted that its targeted advertising was 89% accurate, in 2019 employees revealed that average accuracy in the US was in fact only half that (41%, and as low as 9% in some categories) [1]. Sure, these platforms’ algorithms and functionalities have tangible effects... but they are much smaller than you might think.
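
To get a feel for what those accuracy figures mean in practice, here is a minimal worked sketch; the campaign size is an arbitrary assumption, only the percentages come from the interview:

```python
# Hypothetical worked example: how many of N targeted ads actually land on
# the intended audience at the claimed vs. internally reported accuracy.
# ads_served is an assumed figure, not data from Facebook.

ads_served = 10_000  # assumed campaign size

for label, accuracy in [
    ("claimed (89%)", 0.89),
    ("reported US average (41%)", 0.41),
    ("worst categories (9%)", 0.09),
]:
    on_target = round(ads_served * accuracy)
    print(f"{label}: {on_target:,} of {ads_served:,} ads on target")
```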

The research is there to facilitate well-balanced debates, and scientific studies [2][3] have shown that, contrary to what we might hear, social media platforms cannot actually manipulate us. That doesn’t mean they don’t try, but they cannot control who they affect nor what the consequences of their initiatives are. What’s more, these attempts can quickly become dangerous, all the more so given that the platforms’ concept of human psychology leaves much to be desired. Believing that people are blindly subject to their emotions and cognitive biases is a form of class contempt.

In 2014, Facebook hired researchers to perform psychological tests that aimed to manipulate the emotions of 700,000 users, without their consent [4]. This “scientific” study was meant to demonstrate the platform’s ability to control the mood of its users and involved modifying people’s news feeds to show them more negative (or positive) content. As a result, they claimed that they could cause “emotional contagion,” as people would publish content that was more negative (or positive, depending on what they had been shown). However, on top of the obvious ethical issues, the experiment was statistically flawed, and the conclusions do not hold up. But I think it’s fair to say that scientific rigour was probably not their priority! Above all, the objective was to create good publicity among advertisers – Facebook uses research as a PR tool.
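
One way to see why such conclusions are fragile: with a sample of this size, even a vanishingly small behavioural shift clears the usual significance threshold. The sketch below uses purely illustrative numbers, not the study’s data, to contrast a p-value with the effect size:

```python
# Illustrative sketch (assumed numbers, not the study's data): with
# ~700,000 users, a statistically "significant" result can correspond
# to a practically negligible effect.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 350_000  # users per condition, a hypothetical 50/50 split

# Share of positive words per post (%), control vs. manipulated feed,
# differing by an assumed, tiny amount.
control = rng.normal(loc=5.00, scale=2.0, size=n)
treated = rng.normal(loc=5.03, scale=2.0, size=n)

t_stat, p_value = stats.ttest_ind(treated, control)
pooled_sd = np.sqrt((treated.var(ddof=1) + control.var(ddof=1)) / 2)
cohens_d = (treated.mean() - control.mean()) / pooled_sd

print(f"p-value   = {p_value:.2e}")    # far below 0.05: "significant"
print(f"Cohen's d = {cohens_d:.3f}")   # ~0.015: negligible in practice
```

With samples this large, a p-value alone says almost nothing about whether the manipulation mattered; the effect size is what the headlines left out.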

Yet it is important to remember that affecting someone is not necessarily negative – it all depends on our intentions. We are constantly affecting each other, and when we are feeling down, we need to be affected in a positive way. We simply need to carefully consider who we are allowing to affect us. Should private companies have this power? Should the government?

Should we be concerned by biometric emotion detection?

Yes. We are currently seeing the widespread dissemination of biometric tools that measure emotions. In our book [5], we mention a comedy club in Barcelona, the Teatreneu, where the price of your ticket is calculated from the number of times you laugh (30 cents per laugh). This example is fairly anecdotal but, less amusingly, biometric technologies (which until recently were nothing but basic experiments for commercial ends) are now being used to monitor citizens. The NYPD has spent more than $3 billion since 2016 on its algorithms, which use targeted ads to measure the attitudes towards police of 250,000 residents [6].

The problem is also that this biometric emotion detection technology is very bad at doing its job. This is because it is based on the work of American psychologist Paul Ekman and his Facial Action Coding System [a method of analysing facial expressions that aims to associate certain facial movements with emotions], which does not actually work in practice.

Despite their ineffectiveness, these biometric tools are spreading at a rapid pace – yet technology is much more dangerous when it works badly than when it doesn’t work at all! If a tool is 80% reliable and you fall within the 20% margin of error, it will be up to you to prove it. I find it very concerning that poorly functioning tools are becoming instruments of governance and surveillance, implemented without the consent of the main parties involved.
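
As a back-of-the-envelope illustration of that 80% figure (the population size and base rate below are assumptions, not data from any real deployment), applying such a tool at scale misclassifies enormous numbers of people, and when the flagged state is rare, most flags are wrong:

```python
# Assumed numbers for illustration only: an "80% reliable" emotion
# detector applied to a large population.

population = 1_000_000   # people scanned (assumption)
accuracy = 0.80          # the tool's claimed reliability

print(f"Misclassified: {population * (1 - accuracy):,.0f} people")  # 200,000

# Base-rate effect: suppose only 1% of people genuinely display the
# targeted emotional state, with 80% true-positive and true-negative rates.
base_rate = 0.01
true_pos = population * base_rate * accuracy               # correctly flagged
false_pos = population * (1 - base_rate) * (1 - accuracy)  # wrongly flagged

precision = true_pos / (true_pos + false_pos)
print(f"Share of flags that are correct: {precision:.1%}")  # about 3.9%
```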

Interview by Juliette Parmentier
1. http://www.wohlfruchter.com/cases/facebook-inc
2. https://towardsdatascience.com/effect-of-cambridge-analyticas-facebook-ads-on-the-2016-us-presidential-election-dacb5462155d?gi=ed06971b06a5
3. https://web.stanford.edu/~gentzkow/research/fakenews.pdf
