
16 Dec 2014

The Great Facebook Debate

Over the years, Facebook has grown to be the largest and longest-lasting social networking platform to grace our screens. Constantly adapting and providing new and relevant functionality while remaining true to its core purpose and what we know and love - Facebook has earned its place in our hearts, homes, laptops and phones.
With over 829 million daily active users, Facebook is a largely untapped resource for gathering consumer research and information. That is, until news broke recently about various "emotional experiments" conducted without the knowledge of many of the participating Facebook users.
Facebook (image licensed under Attribution)
The Wall Street Journal reports that in 2012, 700,000 users were part of a psychological experiment measuring the effects of positive versus negative stories in newsfeeds, and how those stories affected the kinds of things the viewing user posted. While many people have voiced concern that these tests were run without participants being informed beforehand - or even afterwards - the real point of contention seems to lie in the ethics of this type of testing.
On one hand, for a user who's chosen to use Facebook's services as they're presented, the problem lies in not being informed that you're a participant in any kind of study, as that was not what was initially agreed upon.
It’s one thing to be observed in your use of a particular product or service, especially one that’s provided for free, but it dives into new territory when that service begins to manipulate the things you interact with and that make your experience unique. 
In gauging the positive and negative psychological effects of different posts in a newsfeed, little to no consideration appears to have been given to the mental conditions those users may be dealing with in real life, beyond their time online.
While it may seem unlikely that someone could have a drastic reaction to news in their feed, you can never be sure what triggers may set someone off, consciously or subconsciously. I'd say it's wholly irresponsible to operate with so little respect for the fragility of someone's mental state. Even when the chances of adverse reactions are low, the presence of Murphy's Law is inarguable.
On the other hand, how fruitful and effective would science be if it didn't take necessary risks in the name of understanding and solving problems? The kind of information gathered about newsfeed posts could prove invaluable to numerous organizations, from political campaigns to groups evaluating the mental health of veterans or providing resources to people who suffer from loneliness and depression.
Taking the same argument in the inverse, these tests do appear to be low-risk assessments: no information was fabricated; already existing topics and headlines were merely rearranged.

There are strong arguments about the ethical implications on either side of the fence, but the fact remains that this is something people would have liked to be informed of. Giving people the opportunity to opt in or out of a trial would certainly affect the data pool, but there are proven ways to work with voluntary participants and still gather quality information.
It’s also worth noting that some of Facebook’s contemporaries, such as Twitter, Yahoo and Google, also test and observe their users. So what are your thoughts on the existence of these trials?
Mark Stoller

About the Author:

Ohad Mark Stoller is a writer for Fueled, an award-winning mobile app design and development house based in New York, Chicago and London. At Fueled, we don't just build apps; with teams of designers, developers and strategists, we create visually stunning products that redefine the technical boundaries of today's mobile development standards.