(Editor’s Note: Facebook originally declined to respond to our reporter’s request for an interview. After publication of the story, Facebook contacted The Open Standard with concerns, and we’ve entered responses and clarifications in the story, in italic. Additionally, responding to a reader comment, we’ve clarified at the end of the story how the news feed algorithm is turned on and off.)
If you’ve been wondering why your uncle or high school crush dropped off your Facebook news feed, brace yourself for a rude awakening — or, as one subject in a recent study put it, “what the hell, Facebook?”
Facebook’s so-called “emotional contagion” experiment may have grabbed the summer headlines, drawing a blizzard of public condemnation when researchers claimed to have successfully manipulated the moods of unsuspecting users. But others say most users may be more shocked to learn that their news feeds are routinely manipulated by the social media giant.
The news feed is the constantly updating list of stories in the middle of your Facebook homepage. A little-known algorithm decides which friends’ updates you do and don’t receive.
“The majority of people that we interviewed didn’t realize there was a Facebook algorithm,” Harvard researcher Christian Sandvig told The Open Standard. So, if your jaw just hit the floor, take heart. You can always “like” more of your friends’ posts to counter the algorithm to a degree.
But that won’t solve the problem. Some posts, such as unpleasant news, don’t lend themselves to a “like.” How could anyone “like” the U.S.-led bombing of ISIS, the Ferguson shooting death of Michael Brown or the suicide of comedian Robin Williams?
Interests not served
“It’s not big brother exactly,” acknowledged Sandvig, who is an associate professor at the University of Michigan in addition to serving as a faculty associate of Harvard’s Berkman Center for Internet & Society. “The news feed algorithm is basically not serving our interests.”
Content recommendations and ad placements based on algorithms sometimes have unintended consequences – and not just on Facebook, but in the search results returned by Google and Amazon. The difference is that these sites don’t shape our online social experience as much as Facebook does.
“Let’s say you had a baby and you really wanted people to know that you had a baby. You definitely would get more play if you had a link in there to say Bud Light, because Anheuser-Busch is one of Facebook’s advertisers,” Sandvig asserted.
Responding after the story’s publication, Facebook spokesperson Michael Kirkland said via email, “That is not true. Organic News Feed ranking is not impacted at all by ads. We try to show people the things they will find the most interesting based on what and who they interact with, not who spends money on Facebook.”
Casual users also don’t always know when their posts are being used to sell a product or service.
“Most people have a hard time seeing the difference between the sponsored posts, especially the ones at the top of the feed, unless they’re really looking carefully,” Sandvig observed. “When it’s pointed out to them, users don’t like it at all, but the way Facebook is designed it’s pretty hard to notice because it happens on your friend’s feed, not your feed.”
Facebook’s Kirkland said, “your posts are no longer used in ads.”
‘What the hell, Facebook?’
Karrie G. Karahalios, an associate professor of computer science at the University of Illinois, said only 37.5 percent of the more than 40 diverse participants in the recent study she conducted with Sandvig understood that posts were being filtered in — or out — of news feeds by Facebook’s proprietary news feed algorithm.
Some Facebook users took the news harder than others. As study participants viewed side-by-side comparisons of their curated and uncurated news feeds, one opined, “What the hell, Facebook?” while another canceled their account on the spot.
In all, 62.5 percent of study participants were not aware that Facebook was picking and choosing the posts they would see in their news feeds, in some cases based on advertising.
Karahalios told The Open Standard that even computer science majors were dumbfounded, a result that troubled her more than Facebook’s contagion study did.
Karahalios was referring to the study conducted by a Facebook executive and two Cornell University researchers, who acknowledged manipulating the “emotional content” of 689,003 Facebook news feeds to see if they could change the mood of some segment of unwitting subjects.
In layman’s terms, they wanted to see if they could arbitrarily make people happy or sad. After analyzing more than three million posts containing more than 122 million words, researchers concluded that it was possible to change someone’s mood.
Of course, Facebook is no stranger to privacy concerns. “You might have heard the rumors going around about the Messenger app,” the company acknowledged recently. “Some have claimed that the app is always using your phone’s camera and microphone to see and hear what you’re doing. These reports aren’t true, and many have been corrected.”
Facebook did not respond to a query from The Open Standard.
App distribution planned
Researchers Sandvig and Karahalios plan to distribute an app that will allow a much wider swath of users to receive an unfiltered Facebook news feed. Even without the app, most users have the ability to toggle between “top stories” and “most recent.” All users receive advertisements.
Advice for casual users
So should casual Facebook users “like” everything they come across?
Not necessarily, said Sandvig. The solution may involve a combination of better education about algorithms, government intervention in some cases, or even the emergence of information clearinghouses along the lines of a Consumer Reports for the Internet.
After the study, participants overwhelmingly reported changing their Facebook habits, according to Sandvig, who added that some are now more assertive in teaching Facebook what they “like.” Other subjects, the researchers said, experimented with switching their news feeds from “top stories” (algorithm on) to “most recent” (algorithm off).