How Facebook killed (most) spam

A Facebook automated system was set up to block apps that users were hiding or marking as spam.
  • Facebook says app "spam" down 95 percent in the past year
  • The site used automated tools to detect which apps users were hiding or marking as spam
  • If too many users marked something as spam, Facebook blocked it

(Fast Company) -- Do you "like" receiving Facebook messages about, say, your buddy Rich's new row of corn in FarmVille?

If not, you're in luck: Facebook CTO Bret Taylor told Fast Company earlier this week that it was just those kinds of messages the company focused on while looking to cut down on spam in the system -- way down.

Mission accomplished. Such spam was down 95% in 2010. That's an impressive achievement. But the backstory to how the company accomplished that feat reveals some of the internal thinking that could be key to Facebook's ability to continue to grow efficiently -- and become an all-around ever-stronger product -- in the years to come.

The spam Facebook was targeting was those annoying messages from the likes of FarmVille or Mafia Wars (about Rich and his corn or whatnot) that used to pollute users' News Feeds.

Game companies liked them, because they raised awareness and helped recruit new users. But, said Taylor at the Inside Social Apps conference on Tuesday, the company soon realized that that was just a bad experience for many Facebook citizens. (Indeed, search Google for "stop Farmville notifications," and you'll get over 50,000 results.)

"Our focus coming into 2010 was really around user experience," Taylor said. "The reactions to some of the decisions we'd made as a product team [and] a platform team were not universally positive ... . So going into the year, we were trying to build a scalable way of dealing with this problem."

It was also a bad experience for both Facebook itself and for application developers. When Facebook started noticing that a particular kind of message was getting a negative response among users, employees would contact developers and suggest design changes.

That was bad for Facebook -- it sucked up employee bandwidth -- and it was bad for developers, who felt like Facebook was micromanaging them.

Interestingly, Facebook decided not to go the law and order route. They didn't start writing out long lists of rules about what kinds of messages would be allowed and what kinds wouldn't.

Instead, Taylor told Fast Company in an interview following the talk, they built an automated system that monitored each individual message -- and then took action, again, automated, against the specific messages that seemed to be bothering users.

Specifically, the system tracks whether recipients hide certain messages or mark them as spam, or whether they click "Like" on the message or comment on it, or whether they actually click through to see the application itself.

"Using a bunch of signals like that, we're able to infer the likelihood that something is a high-quality message," Taylor said. Or, conversely, that it is low quality.

If too many recipients hide a message or mark it as spam, Facebook automatically starts blocking it.

The application developer is notified -- also through an automated system -- and has to go back to the drawing board to develop something recipients respond to more favorably. And if an application sends nothing but low-quality messages, Taylor said the system simply turns the application off altogether.
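The mechanics described above -- tally negative signals (hides, spam reports) against positive ones (likes, comments, click-throughs), block messages whose negative rate crosses a threshold, and disable apps whose messages are uniformly low quality -- can be sketched in a few lines. This is a minimal illustration of the general technique, not Facebook's actual system; every name and threshold below is an assumption.

```python
# Illustrative sketch of a signal-based message-quality heuristic as the
# article describes it. Thresholds are invented for the example; Facebook's
# real values and formula are not public.

BLOCK_THRESHOLD = 0.25  # assumed: block when >25% of reactions are negative
MIN_REACTIONS = 100     # assumed: ignore messages with too little data

def negative_rate(hides, spam_reports, likes, comments, clickthroughs):
    """Fraction of user reactions that are negative, or None if data is thin."""
    negative = hides + spam_reports
    positive = likes + comments + clickthroughs
    total = negative + positive
    if total < MIN_REACTIONS:
        return None  # not enough signal to judge yet
    return negative / total

def should_block(stats):
    """Block a single message once its negative rate crosses the threshold."""
    rate = negative_rate(**stats)
    return rate is not None and rate > BLOCK_THRESHOLD

def should_disable_app(message_stats_list):
    """If every judged message from an app is low quality, turn the app off."""
    rates = [negative_rate(**s) for s in message_stats_list]
    rates = [r for r in rates if r is not None]
    return bool(rates) and all(r > BLOCK_THRESHOLD for r in rates)
```

For example, a message hidden 80 times and reported as spam 40 times against only 20 positive reactions would be blocked, while one with mostly likes and click-throughs would pass; an app producing only the former would be switched off entirely.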

"What's great about this is that we no longer need to micromanage every interaction," Taylor said. "We just measure the output."

On top of that, Taylor said Facebook has been able to cut its policies -- the law-and-order rules developers must follow -- by half. It doesn't need as many. Facebook simply evaluates applications based on their real-world performance, rather than by anticipating every possible eventuality and developing legislation around it.

The upshot has been good for all parties involved: Facebook users find the place more enjoyable, now that they're no longer being updated on buddy Rich's farming ventures. Facebook needs to devote fewer resources to managing developers, at least in this sphere.

Developers no longer feel nitpicked. And while some application developers may feel like they can't grow as fast as they used to, the experience of CityVille (which reached an astounding 100 million users in a mere 43 days) demonstrates that there are still ways to enable viral growth without being overly annoying.

All of which bodes well for Facebook. The better experiences it can create while minimizing the demand on its own resources, the faster it will be able to grow, and the more loyalty it will get from its users and developers.

Copyright © 2010, a unit of Mansueto Ventures, LLC. All rights reserved.