Digital Society
The digital revolution: at humanity's expense?

“Digital platforms have poor control over their manipulation of emotions”

June 8th, 2021 | 4 min reading time
Camille Alloing
Professor of Public Relations at the Université du Québec à Montréal
Key takeaways
  • The Cambridge Analytica scandal and other cases have recently alerted citizens to the possibility of social networks manipulating votes by playing on their emotions.
  • But for researcher Camille Alloing, social networks knowingly overstate their capacity for manipulation in order to sell advertising space.
  • In the same way, platforms such as Facebook have conducted psychological experiments on the emotions of hundreds of thousands of their users...
  • ...and did so without their knowledge, based on a caricatured and unreliable conception of emotion, with the sole aim of lending credibility to their hypothetical capacity for manipulation.

It is said that one of the best ways to manipulate social media users is to make them feel scared or empathetic. The alleged role of Cambridge Analytica and Facebook in the election of Donald Trump seems to prove it. However, UQÀM communications and information science researcher Camille Alloing suggests that the power social media holds over our emotions needs to be taken with a pinch of salt.

Could you explain what “affective capitalism” is?

Simply put, it is the part of capitalism that exploits our ability to be moved (and to move others) to generate value; something we particularly see on social media. However, it’s worth looking a little closer at the word “affect”. It is a term that can be used to refer to any emotion, but the important part is how it “sets us in motion,” meaning what causes us to take action.

When I “like” a post, I am affected. Unlike emotions (which remain difficult to analyse due to their subjective and unconscious nature), affective consequences can be identified (you can know that a video affected me because I pressed the “like” button). So, although we cannot ascertain whether digital platforms actually succeed in provoking emotions in users, we can analyse how users behave.

Given that most social media revenue comes from selling ad space, the goal of these platforms is to increase the time users spend on them, and therefore the number of ads viewed. To that end, affect is undeniably extremely useful – by generating empathy, more reactions are prompted, and content is shared more.

I have found that individuals are now part of structures that can affect them (and therefore make them feel emotions and make them act) although they cannot affect back. If I post something, and I’m expecting a response from my friends, I am alienated because I will only get that response if Facebook chooses (for reasons beyond my control) to share my post in my friends’ feeds.

You say that “affect is a powerful tool”. Does that mean the power to manipulate people through their emotions?

If I said that affecting a person meant you could successfully manipulate them, I would be in agreement with the arguments the platforms are putting out. Facebook, for example, has every reason to let people think that their algorithm is able to control users because it helps them to sell ad space. In this way, the Cambridge Analytica scandal [in which this company attempted to manipulate Facebook users to influence American swing voters in the 2016 US presidential election in favour of Donald Trump] provided incredible publicity for Facebook among their advertisers, who saw it as an opportunity to drastically increase their sales by manipulating users!

However, the role of social media in Trump’s election must be put in perspective, and we should be careful not to trust oversimplified explanations. Even though Facebook boasted that its targeted advertising was 89% accurate, in 2019 employees revealed that average accuracy in the US was in fact only half that (41%, and as low as 9% in some categories) [1]. Sure, these platforms’ algorithms and functionalities have tangible effects... but they are much smaller than you might think.

The research is there to facilitate well-balanced debates, and scientific studies [2,3] have shown that, contrary to what we might hear, social media platforms cannot actually manipulate us. That doesn’t mean they don’t try, but they cannot control who they affect nor what the consequences of their initiatives are. What’s more, these attempts can quickly become dangerous, even more so given that the platforms’ concept of human psychology leaves much to be desired. Believing that people are blindly subject to their emotions and cognitive biases is a form of class contempt.

In 2014, Facebook hired researchers to perform psychological tests that aimed to manipulate the emotions of 700,000 users, without their consent [4]. This “scientific” study was meant to demonstrate the platform’s ability to control the mood of its users and involved modifying people’s news feeds to show them more negative (or positive) content. As a result, they claimed that they could cause “emotional contagion,” as people would publish content that was more negative (or positive, depending on what they had been shown). However, on top of the obvious ethical issues, the experiment was statistically flawed, and the conclusions do not hold up. But I think it’s fair to say that scientific rigour was probably not their priority! Above all, the objective was to create good publicity among advertisers – Facebook uses research as a PR tool.

Yet it is important to remember that affecting someone is not necessarily negative – it all depends on our intentions. We are constantly affecting each other, and when we are feeling down, we need to be affected in a positive way. We simply need to carefully consider who we are allowing to affect us. Should private companies have this power? Should the government?

Should we be concerned by biometric emotion detection?

Yes. We are currently seeing the widespread dissemination of biometric tools that measure emotions. In our book [5], we mention a comedy club in Barcelona, the Teatreneu, where the price of your ticket is calculated by the number of times you laugh (30 cents per laugh). This example is pretty anecdotal but, less amusingly, biometric technologies (which until recently were nothing but basic experiments for commercial ends) are now being used to monitor citizens. The NYPD has spent more than $3 billion since 2016 on its algorithms, which use targeted ads to measure the attitudes towards police of 250,000 residents [6].

The problem is also that this biometric emotion detection technology is very bad at doing its job. This is because it is based on the work of American psychologist Paul Ekman and his Facial Action Coding System [a method of analysing facial expressions that aims to associate certain facial movements with emotions], which does not actually work in practice.

Despite their ineffectiveness, these biometric tools are spreading at a rapid pace – yet technology is much more dangerous when it works badly than when it doesn’t work at all! If it’s 80% reliable, and you are part of the 20% margin of error, it will be up to you to prove it. I find it very concerning that poorly functioning tools are becoming tools of governance and surveillance, implemented without the consent of the main parties involved.
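To make the scale of that margin of error concrete, here is a minimal back-of-the-envelope sketch in Python. All figures are hypothetical assumptions chosen for illustration (a population of one million, a 5% base rate for the targeted emotional state, and the 80% reliability mentioned above): when the state being detected is rare, most of the people a “mostly working” detector flags are in fact misclassified.

```python
# Hypothetical illustration of the "80% reliable" problem described above.
# Every number here is an assumption for the sake of the example, not real data.

population = 1_000_000  # people scanned by the detector (assumed)
base_rate = 0.05        # assume 5% actually show the targeted emotional state
accuracy = 0.80         # detector is "80% reliable" (assumed for both classes)

# People correctly flagged: those in the target state whom the detector catches.
true_positives = population * base_rate * accuracy
# People wrongly flagged: those NOT in the target state caught by the 20% error.
false_positives = population * (1 - base_rate) * (1 - accuracy)

flagged = true_positives + false_positives
precision = true_positives / flagged

print(f"People flagged: {flagged:,.0f}")
print(f"Correctly flagged: {true_positives:,.0f}")
print(f"Wrongly flagged: {false_positives:,.0f}")
print(f"Share of flagged people actually in the target state: {precision:.0%}")
```

With these assumed numbers, the detector flags 230,000 people, of whom 190,000 (roughly 83%) are false positives, each of whom, as Alloing notes, would bear the burden of proving the tool wrong.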

Interview by Juliette Parmentier
1. http://www.wohlfruchter.com/cases/facebook-inc
2. https://towardsdatascience.com/effect-of-cambridge-analyticas-facebook-ads-on-the-2016-us-presidential-election-dacb5462155d?gi=ed06971b06a5
3. https://web.stanford.edu/~gentzkow/research/fakenews.pdf
