Facebook just gave the automated system that tossed my status back into my News Feed a bad name

Facebook’s automated system for restoring posts gets really creative. According to a leaked document obtained by Gizmodo, the social network restores “a weird percentage” of suggested posts that don’t really live up to that description. But your feed won’t be struck out entirely: if you see your status, photo or Like being restored, the chances are 50-50.

Previously, the company would only strike out posts from your friends: similar posts, photos of cute animals or pop music, and articles and videos that a certain percentage of the people on your friends list had liked. Over time, that select group of friends would gradually shrink down to one. The document says that by 2012, at the outset of the financial crisis and the Great Recession, the algorithm could determine that only a “dubious” fraction of you liked anything (and only the Facebook community, not anybody else, kept track of exactly what that meant), so Facebook would keep throwing you a bone if the moment really begged for it. Eventually, the figure was in the six-to-eight-percent range.
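
To make that arithmetic concrete, here is a minimal sketch of a cut-off of that kind, assuming a toy model in which a post survives only if the share of friends who liked it clears a threshold drawn from the six-to-eight-percent band the document describes. The function name, fields and the randomly drawn threshold are illustrative assumptions, not anything taken from the leaked document.

```python
# Toy model (not Facebook's code): a post is "struck out" unless the share of
# a user's friends who liked it clears a threshold in the six-to-eight-percent
# band the leaked document reportedly describes.
import random
from typing import Optional

def keep_post(friend_likes: int, friend_count: int,
              threshold: Optional[float] = None) -> bool:
    """Return True if the post survives the cut under this toy model."""
    if friend_count == 0:
        return False
    if threshold is None:
        # Pick an illustrative value inside the 6-8% range.
        threshold = random.uniform(0.06, 0.08)
    return friend_likes / friend_count >= threshold

# 9 likes out of 120 friends is a 7.5% share, which clears most thresholds in the band.
print(keep_post(friend_likes=9, friend_count=120))
```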

Over time, though, that adjustment was starting to get too brutal, resulting in “if anything else is relevant to the content, then it goes. It never goes past a five-percent share.” That not only made the system less fair, the Gizmodo piece suggested, but also made it harder for Facebook’s human editors to shape what the social network wanted to display to its users. As a result, Facebook may have decided to create its own response algorithm, built around a real number.

“The actual number is equally random and ever-changing,” Facebook SVP Justin Osofsky told Gizmodo.

First, Facebook’s secret system does not apply to Pages, outside partners or groups. It’s theoretically possible the algorithm could be tweaked as you visit different Pages and subscribe to different groups, but that’s something Facebook would be willing to talk about, Osofsky said.

As far as how it works, Facebook tells Gizmodo that it looks at about six factors: language, location, gender, status of the original post, specific interest and time of post. In one example from Australia, a woman liked a post that was later flagged as offensive to her. According to Gizmodo, Facebook served up a post (originally an article) from the same article about violence against women, but when that woman visits the same Web page she liked, the page is always unblocked.
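
For a sense of how those six factors could feed a restore-or-not decision, here is a minimal sketch. The dataclass carries all six factors but the scoring only weighs a few of them; the weights, field names and the final 50-50 coin flip are illustrative assumptions, not Facebook’s actual implementation.

```python
# Hypothetical sketch of a restore-or-not decision informed by the six factors
# the article lists. Weights, field names and the 50-50 coin flip are invented.
import random
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    language: str          # e.g. "en"
    location: str          # e.g. "Australia"
    gender: str
    original_status: str   # e.g. "active" or "flagged"
    interest: str          # e.g. "violence against women"
    posted_at: datetime

def should_restore(post: Post, viewer_interest: str) -> bool:
    score = 0
    if post.language == "en":
        score += 1
    if post.original_status != "flagged":
        score += 1
    if post.interest == viewer_interest:
        score += 1
    if (datetime.now() - post.posted_at).days < 1:  # recency bonus in this toy model
        score += 1
    # Even a well-scoring candidate only gets restored half the time,
    # matching the 50-50 odds mentioned earlier in the piece.
    return score >= 2 and random.random() < 0.5
```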

Other numbers are obviously much more intense: on a Friday afternoon, Facebook would strike out a request from someone who says they live in Pennsylvania and let through another from someone who says they live in Canada. Those rates could last for a few hours, or a few days. On Saturday, Gizmodo says, one woman reported that both her friend and her News Feed were suspended for five hours.
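
The Pennsylvania-versus-Canada behavior reads like a location-keyed rule with an expiry window measured in hours or days. A hypothetical sketch of that pattern, with invented rule entries, actions and durations, might look like this:

```python
# Hypothetical illustration of a location-keyed rule with an expiry window,
# matching the Pennsylvania/Canada example above. Rule entries, actions and
# durations are all invented for the sketch.
from datetime import datetime, timedelta

rules = {
    "Pennsylvania": ("strike", datetime.now() + timedelta(hours=5)),
    "Canada": ("allow", datetime.now() + timedelta(days=2)),
}

def apply_rule(self_reported_location: str) -> str:
    action, expires_at = rules.get(self_reported_location, ("allow", datetime.max))
    if datetime.now() > expires_at:
        return "allow"  # expired rules fall back to letting the request through
    return action

print(apply_rule("Pennsylvania"))  # "strike" while the rule is active
print(apply_rule("Canada"))        # "allow"
```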

So, will this spill over into how Facebook thinks about its ads, too? Osofsky confirmed that was a possibility.

“For years, a single person has been reviewing and policing all of our ads,” he told Gizmodo. “With the expansion of the number of ads being run this year, that person is taking on more of a passive role.” He said that hiring three new ad reviewers would reduce the number of human-reviewed ads by half.
