Facebook’s News Feed Algorithms
Facebook is at it again, violating users’ privacy in new and bizarre ways. This time it’s a psychological experiment, performed on some 700,000 users, to see if positive statuses can invoke negative feelings and vice versa. (The experiment was performed in early 2012, but its existence was only recently revealed to the public.) Facebook has rightfully received a lot of condemnation for so harshly manipulating its users’ emotions, but I want to look at it from another point of view: the censorship of the News Feed.
During this experiment, Facebook modified its News Feed algorithms for certain users. Some people got more negative stories and some got more positive ones, based on keywords within each individual post. So if you were in the negative test group, you might have missed out on the news of a friend’s engagement. If you were in the positive group, you might not have learned about a relative being sick. It’s hard to say for sure, because we don’t know which 700,000 users were experimented upon. All we know is that Facebook used its software to mess around with its users, rather than to provide a useful service.
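The mechanics are simple enough to sketch. Here’s a minimal illustration of keyword-based feed filtering in Python; the `POSITIVE` and `NEGATIVE` word sets and the function names are my own hypothetical stand-ins for whatever lexicon and pipeline Facebook actually used:

```python
# Hypothetical word lists standing in for a real sentiment lexicon.
POSITIVE = {"happy", "great", "engaged", "love", "wonderful"}
NEGATIVE = {"sad", "sick", "terrible", "hate", "awful"}

def classify(post: str) -> str:
    """Label a post positive, negative, or neutral by counting keywords."""
    words = [w.strip(".,!?") for w in post.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

def filter_feed(posts, suppress):
    """Drop posts whose sentiment label matches the suppressed one."""
    return [p for p in posts if classify(p) != suppress]

feed = [
    "We got engaged, so happy!",
    "Grandma is sick again.",
    "Lunch was fine.",
]
# A user in the "more negative" test group has positive stories suppressed,
# so the engagement announcement silently disappears:
print(filter_feed(feed, suppress="positive"))
# → ['Grandma is sick again.', 'Lunch was fine.']
```

The unsettling part isn’t the code, which is trivial, but that a post can vanish from your feed based on nothing more than which words it happens to contain.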
What I want in a News Feed
Here’s what I want. I want to see everything my friends post when they click the “post” button. That’s it. I don’t want to see that someone “likes” a brand. I don’t want to see when someone listens to a song on Spotify. And I most definitely don’t want to see when a friend-of-a-friend tags someone in a post. (Most of the time it’s one of my friends’ great-aunts tagging everyone she knows in some chain letter crap.)
I just can’t fathom why Facebook can’t get this right. I understand that they want to serve me ads and promotions, but their product placement is really clumsy compared to the Promoted Tweets on Twitter. The truth of the matter is, I don’t mind handing over some of my personal information to communicate with my friends and family. I just wish Facebook would do a better job of fulfilling their end of the bargain.