During the SMX Conference, representatives from both Google and Bing revealed the existence of “whitelists”: exemption-based lists of domains that are shielded from being penalized or otherwise affected by certain segments of algorithm changes. To address any misinterpretations, both search giants clarified that these whitelists were not used to boost the ranking of any sites in results listings, nor were they used to completely exempt any sites from all algorithm changes.
To illustrate how these whitelists function, let’s say Google releases a new algorithm update. The update affects 99 percent of sites on the web exactly as intended, but the remaining 1 percent gets hit with a penalty it didn’t deserve. This could be because those sites occupy a unique position, or because the algorithm can’t be refined enough to evaluate every domain accurately, but in any case, whitelists exist to prevent and correct those undue evaluations.
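The exemption mechanism described above can be sketched as a simple lookup before a penalty is applied. This is purely illustrative: the domain names, scores, and penalty values below are invented, and this bears no resemblance to how Google actually implements its systems.

```python
# Hypothetical sketch of a whitelist exemption check.
# All domains and numbers are invented for illustration only.

ALGORITHM_WHITELIST = {"harmless-animal-site.example"}  # domains exempt from this one update

def apply_update(domain: str, base_score: float, penalty: float) -> float:
    """Apply an update's penalty unless the domain is exempted for this update."""
    if domain in ALGORITHM_WHITELIST:
        # The site keeps its score for this update only; it is not
        # exempt from other updates or boosted in any way.
        return base_score
    return base_score - penalty

print(apply_update("harmless-animal-site.example", 80.0, 15.0))  # 80.0
print(apply_update("typical-site.example", 80.0, 15.0))          # 65.0
```

The key point the sketch captures is that a whitelist is scoped to a single update’s penalty; it neither boosts a site nor exempts it from anything else.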
Whitelists for Penguin and Panda
The revelation of the existence of whitelists has been a relief to some business owners concerned about how their businesses might be affected by algorithm updates. Many websites hit by Google’s biggest updates, Panda and Penguin, have claimed that the resulting penalties were unwarranted. The idea of being “whitelisted” is a comforting assurance to those afraid of getting hit with future updates.
However, according to Google’s John Mueller, a Webmaster Trends Analyst in Zurich, Google doesn’t have any whitelists for its Penguin or Panda algorithms. Mueller illustrated an example of a whitelist in action: the Safe Search algorithm is designed to flag sites with adult content, but a site about animals could be flagged as adult despite not featuring any offending material. Here, the site would be whitelisted and removed from that algorithm’s flagging system.
He went on to explain that there’s no whitelist for Penguin or Panda because these updates were designed to affect all sites. Whitelists are intended to be a stopgap, a temporary measure to correct fundamental flaws in an algorithm change. For example, Safe Search could eventually be fixed to avoid the false positive that flagged the animal site, at which point the exemption would no longer be needed. Panda and Penguin require no such stopgaps.
There’s another problem with the notion of being “protected” by whitelists: even if whitelists did exist for Penguin or Panda, they would exist to serve the exceptions to the master rule. The odds of being the one site out of a hundred that isn’t evaluated the way Google intended are, as the name suggests, one in a hundred. Whitelists are short, and it’s highly unlikely that you’d be on any of them in the first place.
The motivation behind keeping Panda and Penguin whitelist-free is the same motivation that drove the deployment of those updates in the first place. Google wants to give users the best possible online experience, and that means giving them the best possible results for their queries. While Google wants webmasters to have a fighting chance at visibility, it isn’t going to worry if a handful of businesses are hit particularly hard by a new update. Adding a whitelist to a major algorithm change undermines it, especially when the algorithm is focused on quality.
Let’s compare Google Safe Search to Google Panda (or Penguin; it doesn’t matter in this case). Safe Search is unique because it scouts for very specific material, in this case adult content, though similar algorithms could scout for other specific types of content. Sites can be hit with undue penalties when they’re miscategorized as having content they don’t actually have. Panda and Penguin, by contrast, are quality-focused: their entire purpose is to assess the overall quality of a site’s onsite content and offsite links. A handful of signals may give Google erroneous feedback, but on the whole, their judgments of websites are fairly accurate. Exempting sites from this quality check would be like saying certain sites don’t need high-quality content or relevant links.
Steps You Can Take
There’s no whitelist for Penguin or Panda, so you don’t need to worry about one. All you should do is continue refining your strategy to give users the best possible experience and put the best possible content on the web. For the Panda algorithm, that means updating your site with fresh, relevant, interesting content on a frequent and consistent basis. For the Penguin algorithm, that means building valuable, highly authoritative links on relevant sources.
If you’re worried about whether you’ll ever need to get on a whitelist in the future, don’t be. Google’s niche updates aren’t nearly as impactful as its major updates, and the chances of getting hit with an unfair ranking change are extremely low. In the event that you do fall in rank, you can make a case to Google, and they’ll probably be more than willing to exempt you manually. Still, this is a rare occurrence, and you shouldn’t expect to ever encounter it.
As for larger updates in the future, it’s likely that Google will continue its approach of avoiding the whitelist entirely. Should Google unleash another Penguin- or Panda-like update on the world, you’re going to be subject to its evaluations, just like everyone else. And while you might experience a bit of unwanted volatility, the web will likely end up a better place because of it.