Web Wanders: Facing the truth


  • TECH
  • Monday, 07 Jul 2014

Tricky issue: Lines need to be drawn on what's permissible when it comes to research conducted on social media.


DON’T believe everything you see on Facebook. 

Most of us probably already know this, yet it’s pretty easy to let ourselves be swayed by the information we come across on our News Feed on any given day. 

Well, I don’t know about you, but I can definitely recall many occasions in the past when something I read on Facebook had a profound effect on my emotions and sometimes even my self-esteem. 

For instance, during the early stages of my career, I remember feeling really small when I learned that quite a few of my friends had successfully landed jobs overseas. 

I also admit to weathering occasional bouts of depression whenever I browsed through the many happy photos that friends who were madly in love would post on their Timeline (side note: I spent most of my youth being single and unattached). 

On my better days, I was usually able to reason that it really wasn’t any fault of my online friends if I ended up being negatively affected just because of something they had posted on Facebook. After all, they were merely sharing their life experiences with those around them. 

However, I’d certainly feel differently about all of this had I discovered that the contents of my News Feed were in fact being doctored. 

Manipulating emotion 
The truth is that such things really do happen. In fact, recent news has informed us that Facebook itself has been guilty of manipulating the News Feed content of 689,003 users for a week in January 2012. All in the name of research. 

Here are the brief facts: the purpose of this research was to study how emotions expressed in Facebook posts would affect the people who read them. More specifically, the researchers wanted to find out whether reading such posts would alter users’ own posting behaviour. 

Users were selected randomly for this study, and the experiment was conducted solely on those who viewed Facebook in English (which means that it’s highly possible that any of us could have easily been on that list).

What was actually done was to limit the positive emotional content in the News Feed for some of the users, while for others, the researchers reduced the negative emotional content that appeared. Each time a person participating in the research loaded their News Feed, posts that appeared would be filtered accordingly (either more positive or more negative posts would be shown). Whenever their News Feed was refreshed, the content itself would change, but the same criteria would be upheld for a particular person. 

At the same time, all filtered content could still be viewed if a user were to directly visit their friends’ profile. Hence, this experiment only applied to their News Feed, and not other parts of the Facebook website such as a person’s private messages. 

On the whole, the findings of this research were quite interesting, actually. What the results showed was that the concept of emotional contagion indeed existed. In other words, the moods you express on Facebook can affect others. They found that people tended to post more negatively when positive posts were reduced in their Feed, and vice versa. 

But what’s not so cool about this entire exercise is the method the researchers used to arrive at their findings. 

Ethical issues 
Many have voiced their criticisms about the fact that the researchers had not actually gotten informed consent from those who were selected to participate in this experiment. 

Furthermore, because users had no knowledge that this research was being carried out, participants never had the chance to opt out even if they had wanted to. 

This goes against what is known as the Common Rule in the United States (US). It is a principle that is usually adhered to in academic research, especially where funding is provided by the US government. It is meant to protect research participants.

However, in this case there was no such external sponsorship, but that should not be an excuse. Indeed, many Facebook users were clearly unhappy with the approach the company had taken. 

Well, it could also be argued that all users consent to Facebook’s Terms of Service when they sign up for an account on Facebook. Nevertheless, in a lengthy document where there is scant mention of the word “research”, would it be considered sufficient notice to users of such activities being carried out? 

Furthermore, it brings to mind an unsettling thought: what other experiments might Facebook be trying to run quietly in the background? 

Setting boundaries 
Like many others, I am not in favour of Facebook conducting research secretly in this manner. Granted, nothing’s ever free in this life and we should expect a catch somewhere whenever we enjoy free services like Facebook, but someone needs to draw the line. 

Other news sites like Forbes have even gone so far as to quote sources who called this the “corporate rape culture”. 

While I think it’s a little drastic to close down your Facebook account just on the basis of this incident alone, I’d still say it’s worth reviewing your online presence every now and then to think about what risks you expose yourself to. 

Also, it’s worthwhile being aware of what you’re actually agreeing to when you sign up for something online. (Because I’m pretty sure Facebook isn’t the only tech company out there that does such things.)

Wherever possible, we as users ought to protest against the wilful misuse of our online data to the extent that regulators and the companies themselves actually sit up and take notice. 

Yes, the tech landscape may be a constantly changing environment, but good ethics should be upheld in this industry because at the end of the day, integrity is still good for business. And tech giants would do well to remember that, lest they someday lose their competitive advantage through such negligence. 

Susanna Khoo hopes successful tech companies will use their powers responsibly, taking care to safeguard the personal rights of their users. Let her know your thoughts on this matter by emailing her at susanna@thestar.com.my.
