'To Facebook, we are all lab rats' - are you outraged?
Research Triangle Park, N.C. — All of you Facebook and social media users who can't wait to share your story as well as your "likes" and "tweets" had best read a New York Times story about how Facebook manipulated news feeds as part of an experiment.
So you think you can trust any of these social tools? Our privacy is threatened daily. (Note Facebook's own recent change.) Now an Internet giant has acknowledged manipulating data. What's next?
Don't be naive, folks.
In other words, get some sense. Don't be what the Urban Dictionary defines as naive ("Generally speaking, to be naive means you do not think enough. People who are 'naive' tend to believe in whatever they are told, without question ...") when it comes to Facebook or Twitter or anything else.
The more you share, the more you rely on these tools, the more you risk.
Vindu Goel of The New York Times opened his report on the big Facebook experiment with a sentence that should blow up in the social media giant's face:
"To Facebook, we are all lab rats."
The headline is alarming, to say the least. Make sure you read it.
This manipulation, for which Facebook apologized online Sunday, could have been dangerous.
As The Times noted, one Twitter post put it bluntly (ironic, isn't it?):

"I wonder if Facebook KILLED anyone with their emotion manipulation stunt. At their scale and with depressed people out there, it's ..."
Wrote Goel in his story:
"Facebook revealed that it had manipulated the news feeds of over half a million randomly selected users to change the number of positive and negative posts they saw. It was part of a psychological study to examine how emotions can be spread on social media."
Would you even know you had been manipulated?
This is Big Brother territory - and it's downright dangerous, not merely outrageous.
The apology published Sunday is interesting reading in itself.
"The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out," the post from Adam Kramer reads. "At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook. We didn't clearly state our motivations in the paper.
"Regarding methodology, our research sought to investigate the above claim by very minimally deprioritizing a small percentage of content in News Feed (based on whether there was an emotional word in the post) for a group of people (about 0.04% of users, or 1 in 2500) for a short period (one week, in early 2012). Nobody's posts were 'hidden,' they just didn't show up on some loads of Feed. Those posts were always visible on friends' timelines, and could have shown up on subsequent News Feed loads. And we found the exact opposite to what was then the conventional wisdom: Seeing a certain kind of emotion (positive) encourages it rather than suppresses it.
"And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it -- the result was that people produced an average of one fewer emotional word, per thousand words, over the following week."
So how do you define "hidden"?
Back to the apology ...
"The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.
"While we’ve always considered what research we do carefully, we (not just me, several other researchers at Facebook) have been working on improving our internal review practices. The experiment in question was run in early 2012, and we have come a long way since then. Those review practices will also incorporate what we’ve learned from the reaction to this paper."
Well, let's hope so.
As for you Facebook users, don't ever be naive again. Who knows what will come next - all justified by the terms of use.
WRAL TechWire Publisher and Editor Rick Smith dishes out tidbits from the local technology sector.