Facebook conducted a psychological experiment on over 6 lakh users, who had no idea that their news feeds were being manipulated to study their emotional reactions.
The study, published in the latest issue of the US journal Proceedings of the National Academy of Sciences (PNAS), claims that the experiment was "consistent with Facebook's Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research".
The study claims to demonstrate "emotional contagion" through social networks: a user's own posts come to reflect the emotional tone of whatever he or she sees more of in the news feed.
The decision to carry out such a study without telling users is ethically problematic, says lawyer Apar Gupta. Though Facebook is legally in the clear in using data this way, he says, ethically it is in murky waters.
"I do think it is unethical. The language of the terms and conditions is broad enough to cover something like this, but it is not wi-thin the contemplation of a user that this sort of manipulation would take pl-ace," says Gupta.
The study, conducted over one week in January 2012, was carried out on 689,003 randomly selected users who posted at least one status update during the experiment and viewed the service in English.
Posts were identified as "negative" or "positive" through pre-identified words. After tweaking the prominence of positive or negative stories in the newsfeeds of selected users, they studied the words in the subsequent status updates of these users. The conclusion was that the manipulation of the newsfeed reflected on their own posts and, by extension, their emotional state.
"When positive posts were reduced in the news feed, the percentage of positive words in people's status updates decreased by 0.1 per cent compared with to that used earlier, whereas the percentage of words that were negative increased by 0.04 per cent. Conversely, when negative posts were reduced, the percentage of words that were negative decreased by 0.07 per cent and the percentage of words that were positive, conversely, increased by 0.06 per cent," says the paper, which was released earlier this month.
Mishi Choudhary, legal director at the Software Freedom Law Center, flags concerns about how easy it is to manipulate information and behaviour. "Social media is such an integral part of our online lives. It would be so easy to manipulate someone's political opinions or even buying behaviour to get more clicks on ads," says Choudhary.
Facebook had over 1.23 billion users worldwide as of December 2013, and crossed 100 million active users in India in April.
'Study consistent with policy'
Study published in latest issue of US journal, claims experiment 'consistent with Facebook's Data Use Policy to which all users agree before opening account'
Posts identified as 'positive', 'negative' through pre-identified words
Facebook tweaked prominence of positive or negative stories in newsfeeds, studied words in subsequent updates
Source: Times of India