Category Archive: Panda

  1. Why There’s No Whitelist for Penguin or Panda

    During the SMX Conference, representatives from both Google and Bing revealed the existence of “whitelists,” which are essentially exemption-based lists that compile domains and prevent those domains from being penalized or affected by certain segments of algorithm changes. To address any misinterpretations, both search giants elaborated that these whitelists were not used to boost the ranking of any sites in results listings, nor were they used to completely exempt any sites from any and all algorithm changes.

    To illustrate how these whitelists function, let’s say Google releases a new algorithm update. The update is going to affect 99 percent of sites on the web in exactly the way they intended, but that remaining 1 percent is going to be hit with a penalty they didn’t deserve. This could be because the sites have a unique position, or because the algorithm cannot be perfected to accurately target each domain appropriately, but in any case, whitelists exist to prevent and correct those undue evaluations.

    Whitelists for Penguin and Panda

    The revelation of the existence of whitelists has been a relief to some business owners concerned about how their businesses might be affected by algorithm updates. Many websites hit by Google’s biggest updates, Panda and Penguin, have claimed that the resulting penalties were unwarranted. The idea of being “whitelisted” is a comforting assurance to those afraid of getting hit with future updates.

    However, according to Google’s John Mueller, a Webmaster Trends Analyst in Zurich, Google doesn’t have any whitelists for its Penguin or Panda algorithms. Mueller illustrated an example of a whitelist in action: the Safe Search algorithm is designed to flag sites with adult content, but a site about animals could be flagged as adult despite not featuring any offending material. Here, the site would be whitelisted and removed from that algorithm’s flagging system.

    He went on to explain that there’s no whitelist for Penguin or Panda because these updates were designed to affect all sites. Whitelists are intended to be a stopgap—a temporary measure to correct some fundamental flaws with algorithm changes. For example, Safe Search could be fixed in the future to avoid flagging material that it flagged in the false positive animal site. Panda and Penguin would require no such stopgaps.

    There’s another problem with the notion of being “protected” by whitelists: even if whitelists did exist for Penguin or Panda, they would exist only to serve the exceptions to the master rule. The odds of being the one site out of a hundred that isn’t served by the algorithm the way Google intended are—you guessed it—one in a hundred. Whitelists are short, and it’s highly unlikely that you’d be on any of them in the first place.

    Google’s Motivation

    The motivation behind keeping Panda and Penguin whitelist-free is the same motivation that drove the deployment of those updates in the first place. Google wants to give users the best possible online experience, and that means giving them the best possible results for their queries. While they want webmasters to have a fighting chance at getting visibility, they also aren’t going to worry if a handful of businesses are hit particularly hard by a new update. Adding a whitelist to a major algorithm change is a way of undermining it, especially when the algorithm is focused on quality.

    Let’s compare Google Safe Search to Google Panda (or Penguin, it doesn’t matter in this case). Safe Search is unique because it scouts for very specific pieces of material—in this case, adult material, but similar algorithm changes could scout for other forms of specific content. Sites can be hit with undue penalties because they are categorized as having content that they do not have. However, Panda and Penguin are quality-focused, meaning their entire purpose is to generalize the quality of a site’s onsite content and offsite links. There may be a handful of signals that provide erroneous feedback to Google, but overall, their judgments on websites are fairly accurate. Exempting sites from this quality check is like saying that certain sites don’t need high-quality content or relevant links.

    Steps You Can Take

    There’s no whitelist for Penguin and Panda, so you don’t need to worry about it. All you should do is continue refining your strategy to give your users the best possible experience and put the best possible content on the web. For the Panda algorithm, that means updating your site with fresh, relevant, interesting content on a frequent and consistent basis. For the Penguin algorithm, that means earning valuable, highly authoritative links from relevant, related sources.

    If you’re worried about ever needing to get on a whitelist in the future, don’t be. Google’s niche updates aren’t nearly as impactful as its major updates, and the chances of getting hit with an unfair ranking change are extremely low. In the event that you do fall in rank, you can make a case to Google, and they’ll probably be more than willing to manually exempt you. Still, this is a rare occurrence, and you shouldn’t expect to ever encounter it.

    As for larger updates in the future, it’s likely that Google will continue its approach of avoiding the whitelist entirely. Should Google unleash another Penguin- or Panda-like update on the world, you’re going to be subject to its evaluations, just like everyone else. And while you might experience a bit of unwanted volatility, the web will likely end up a better place because of it.

  2. Are Google Updates a Thing of the Past?

    For more than a decade now, Google updates have been keeping search marketers on their toes. Every time you got used to one common SEO custom—such as the most effective way to build backlinks—Google seemed to respond by pushing a major algorithm change that altered how it took those factors into consideration. In the SEO community, industry professionals were constantly either looking for ways to take advantage of the most recent update or trying to anticipate what changes were coming with the next one.

    Now, as we enter a new era of search, Google’s update patterns appear to have shifted. For the past several years, rather than introducing new algorithm changes, the search giant has only been tweaking previously existing ones and making minor adjustments to account for new technologies. Rank disruption is still occurring, but on a much smaller scale, leaving search marketers to wonder—are Google updates a thing of the past?

    The Major Overhauls

    Google updates have earned a reputation for being large, disruptive, and sometimes annoying pushes that can raise your site to the top of the SERPs or drop you off into online oblivion. That’s because most of Google’s major updates so far have been massive game changers, either completely overhauling Google’s search engine algorithm or adding some new set of qualifications that turned the ranking system on its head.

    Take, for instance, the Panda update of 2011, which affected nearly 12 percent of all queries, massively disrupting the search game by introducing a new concept of content-based evaluation. Sites with high-quality content were rewarded, while sites with spammy content were penalized.

    It was a fair system, and searchers of the world were happy to start seeing more relevant results and fewer obvious attempts to climb ranks by whatever means necessary. But it was still a major headache for search marketers who had invested serious time and money into the previous iteration of Google’s search algorithm. For a time, updates like these were common, and search marketers were constantly on the run, waiting for more changes like the Penguin update, or Panda 2.0, which carried another massive update to Google’s content evaluation system.

    Modern Panda and Penguin

    Panda and Penguin, two of Google’s biggest landmark algorithm updates, have seen multiple iterations over the past five years. Panda 2.0 was followed by small iterations leading to 3.0, and Penguin 2.0 came out only a year after the initial round of Penguin. These algorithm changes were substantial, and search marketers attempted to predict the cycle based on existing patterns, projecting when the next major Panda- and Penguin-based algorithm changes would come.

    But something changed in 2014. Rather than unloading the Panda update in a major package, Google started rolling out data refreshes and minor tweaks to the algorithm on a steady basis. Instead of hitting the search world with a massive burst, it introduced a regular, unobtrusive pulse. Similarly, with the Penguin update, major iterations were virtually done away with. Marketers named an algorithm update “Penguin 3.0” in late 2014, but search volatility was limited compared to Penguin updates in the past.

    This, combined with the fact that Google hasn’t released a major overhaul to its search function since the Hummingbird update of 2013, seems to indicate that instead of rolling out massive, disruptive updates, Google is more interested in rolling out very small, gradual changes.

    Niche Algorithm Updates

    Other than extensions for its previous updates, Google has also released a handful of other changes. However, most of these are focused on niche functions—for example, the unofficially nicknamed “Pigeon update” of 2014 overhauled the way Google processes and displays local search results, taking local reviews from directory sites into account. Similarly, Google has been making changes to its Knowledge Graph and how it displays on SERPs.

    These niche updates don’t radically change Google’s core algorithm, nor do they interfere with any major updates of the past. They do have an impact on how search works and what strategies are the most rewarding, but they haven’t done anything to change the fundamental elements of a great SEO strategy.

    The Case for Micro-Updates

    There are a lot of reasons why Google would want to abandon large-scale updates in favor of smaller, less noticeable ones, and the evidence supports that transition:

    • Major updates have slowed to a stop. Instead of large batches of changes, Google is rolling out Penguin and Panda changes gradually and almost imperceptibly.
    • Google is no longer officially naming its updates. Penguin 3.0, Panda 4.1, and the Pigeon update are all unofficial nicknames—Google has abandoned the practice of naming its updates, a sign that it’s moving away from major, branded releases.
    • Search volatility is decreasing. Since Panda’s 12 percent disruption, nothing has come close to that level of volatility.
    • Google is finally at a stable point. The search algorithm is now complex enough to evaluate the quality of sites and the intention behind user queries, leaving little reason to rapidly accelerate through growth spurts.

    Of course, it’s possible that Google has a few more aces up its sleeves, but for now it looks as though major updates are dead, in favor of smaller, less momentous rollouts.

    What Search Marketers Can Learn

    There’s no reason to fear anymore. It’s likely that Google will no longer be pushing out the updates that have disrupted so many business rankings for so long. Instead, search marketers should understand that the fundamental best practices for SEO—improving user experience and building your authority—aren’t going to change anytime soon. The tweaks here and there might fine-tune some small details, but for the most part, the sites and brands that offer the best overall experience are going to be rewarded.

  3. Why You Don’t Need to Worry About Future Google Updates

    Google seems like it’s on a warpath, releasing updates and data refreshes on a near-monthly basis, and throwing the world of search marketing for a loop with game-changing features every few months. Search engine optimization (SEO) is always on the move, never resting in one place for too long, and search marketers are desperate to stay ahead of the curve.

    There’s a lingering fear among many search marketers that their efforts are one day going to be useless—after all, practices like keyword stuffing and backlink spamming were once the breadwinners of the SEO world, and now they’re long obsolete. However, despite the fact that Google will inevitably continue rolling out game-changing updates, it’s unlikely that you have a real reason to worry—as long as you’re implementing your strategy correctly.

    Penguins and Pandas and Pigeons, Oh My!

    Google’s been making steady updates since it first came onto the scene in 1999. The first few years were a matter of getting its footing, but for the next several years after that, things remained relatively stable. Search marketers engaged in straightforward, mathematical processes to increase their domain authority and rise through the keyword ranks. Then, in 2011, the Panda update was released and search marketers were forced to reevaluate their entire onsite strategy. Panda started weeding out shady content practices, such as content duplication, keyword stuffing, and spamming content for the sake of increasing content volume at the sacrifice of content quality. On the other hand, the Panda update rewarded sites with a focus on improving user experience, rather than just advancing rank.

    Then a year later, Google released the Penguin update, an offsite counterpart to the Panda update. Where Panda eliminated black hat onsite practices, Penguin eliminated black hat offsite practices, penalizing sites with an inordinate number of repetitive links, or links based on irrelevant external sources, or low quality sources. Much like Panda, Penguin shook up the world of SEO and forced search marketers to completely reevaluate a portion of their strategy.

    Google’s next major update, Hummingbird, in 2013 struck a serious blow against the relevance of keyword-based optimization by introducing semantic search—an algorithm feature that analyzes user intent rather than focusing on keyword phrases to populate results. It didn’t affect the sheer number of queries that Panda and Penguin did, but it did radically alter the way Google populated results.

    The Pigeon update in 2014 affected local search results by incorporating offsite user reviews into result relevance. All the while, new updates for Panda and Penguin have been rolling out gradually, refining each of them and keeping search marketers on their toes.

    The Lasting Fear

    Search marketers are consistently afraid that yet another shakeup in the search world is going to force them to change their entire strategy—or worse, make their jobs obsolete. Google has been rolling these updates out to fight against practices designed solely to influence rank, so it isn’t unthinkable to imagine the company trying to eliminate SEO practices altogether. However, the fundamental motivation behind Google’s updates isn’t based on getting revenge on search optimizers. It’s actually much simpler than that.

    The Reason Behind Google Updates

    Google wants one thing: to remain the world’s foremost, dominant search engine (and overall web presence, but that’s another story). To do that, they have to keep their users happy, and they can keep users happy by giving them the best possible experience.

    That’s it. There’s no ulterior motive. Google just wants to give online users the best possible online experience, and that means giving them the best results. That means every update they make, from Panda back in 2011 to some unknown update in 2025, is going to be based around the idea of improving results, and therefore, user experience. You don’t need to worry about the updates that are to come down the pipeline because you already know what they’re going to be focused on, and you know you can prepare for them by proactively giving Google what it wants to see.

    What It Means for You

    Google’s updates are arbitrary, to some degree. If you’re only worried about finding and exploiting the rare holes in Google’s algorithm in order to increase your rank, you should probably be concerned about the next updates in line. However, if you keep your focus in line with Google’s focus by consistently refining and improving user experience, you’ll never need to worry about getting blindsided. All of Google’s updates are designed to make users happy, and if you’re making users happy with your strategy, you’ll make Google happy in turn.

    There are several ways you can do this.

    High-Quality Content

    First and foremost, you need to ensure that all your onsite content is high quality and relevant to your field. That includes all your headlines, body copy, blog posts, and page-based meta information. Instead of keyword stuffing, focus on topics that your users will want to read about. Instead of focusing on the volume and quantity of your material, focus on the quality. Be consistent and as detailed as possible in your individual posts, and stay up-to-date with the latest information in your industry to stay relevant.

    Respectful Offsite Practices

    If you want to get the most out of your SEO campaign, you’ll need to get involved on external sites. That means building helpful, relevant links and submitting guest posts and press releases to outside authorities. However, the best offsite optimization practice (and the only one that’s update-proof) is one that is natural. That means only posting content and links on sites that are directly related to your industry, or those with relevant content to your business.

    Creating a Memorable User Experience

    Don’t underestimate the power of giving your users a memorable onsite experience. Sleek designs, responsive layouts that function on every browser and every device, fast site load times, and enhanced security are some of the features that can earn you a ranking boost today. However, if you want to stay ahead of future updates, you need to pull out all the stops for your users. Don’t make upgrades because Google tells you to; make them because they’ll ultimately benefit your users.

    Building Your Brand’s Reputation

    Finally, you’ll need to build and consistently refine your brand’s reputation using social media channels and local influence. Claim as many social profiles and local directory profiles as you can, and tend to them regularly. Post comments and content whenever you can, and engage with your users when they ask questions or make comments. Similarly, whenever someone posts a review on a local directory, do what you can to learn from it—try to resolve any problems that lead to negative reviews, and focus on the elements of good reviews that you can continue to emphasize and improve. Getting social and involved in the community is a surefire way to increase your brand’s reputation and improve your domain authority simultaneously.

    As you start planning your SEO strategy for 2015 and beyond, remember that your users come first. Search engine optimization isn’t ever going to die, but it has already transformed. It’s no longer a strategy designed to build rank through a predictable, mathematical series of steps. Instead, it’s about crafting your site, your content, and your branding strategy in a way that cultivates the greatest possible user experience. Put that at the core of every strategy you implement, and you’ll never have to worry about facing a penalty when the next new Google update is released.

  4. What’s Next After Panda, Penguin, and Pigeon?

    Google likes to keep search marketers on their toes. Its search engine algorithm, kept top secret, has evolved gradually over the course of more than 15 years, but its biggest changes have come in the form of occasional spikes. Google releases major updates to its algorithm in big packages, which roll out over the course of a few days and have traditionally caused great volatility in the search rankings of countless businesses. Google also releases tiny updates, fixes, and data refreshes as follow-ups to these massive updates, but they don’t make nearly as many waves.

    The big players of the past decade have been the Panda update of 2011, the Penguin update of 2012, and the Pigeon update from earlier this year. These updates all fundamentally disrupted certain ranking principles we had all taken for granted, and their impact has dictated the shape of search marketing today.

    Today, it’s easy to understand why Google released each of these updates, but when they first rolled out, they were surprising to everyone. While there is a certain predictable calm in the current search marketing world, it’s only a matter of time before Google changes the game again with another revolutionary new update.

    So what will the nature of the next update be? And what can we do to prepare for it?

    Panda and Penguin: Two Sides of the Same Coin

    In order to understand the possibilities for the future, we have to understand the context of the past. The Panda and Penguin updates served as complementary rollouts, targeting the negative practices of onsite SEO and offsite SEO, respectively.

    The Panda update came first in 2011, shaking up the results of almost 12 percent of all search queries. The update came as a surprise, but it was only a natural response to some of the practices that were rampant at the time. The update’s primary targets were onsite content and the culprits who used low-quality content solely as a mechanism to drive rank. Accordingly, it penalized those sites and rewarded sites that maintained a focus on providing valuable, enjoyable content. Low-quality, spam-like practices, such as stuffing content with keywords and copying content from other sites, were virtually eradicated.

    The Penguin update came out as a counterpoint to Panda in 2012, doing for offsite link building what Panda did for onsite copywriting. Penguin 1.0 affected just over three percent of search queries, giving it a narrower range than Panda, but the sites it did affect were affected enormously. Penguin targeted sites that paid for external links, built external links on irrelevant sites, or spammed links in irrelevant conversations. Conversely, it rewarded sites that built more natural links in a diversified strategy.

    Enter the Pigeon Update

    The Pigeon update was slightly different from its cousins. Like them, it was a major update that fundamentally changed an element of SEO, but it was never officially named by Google. It was released in the early summer of 2014.

    The Pigeon update was designed to change results for local searches. Rather than attempting a global change, as with Panda and Penguin, Pigeon focused only on redefining searches for local businesses. Through Pigeon, local directory sites like Yelp and UrbanSpoon got a significant boost in authority, and businesses with consistently high ratings on those sites also received a boost. Now, local businesses can get as much visibility by increasing the number of positive reviews posted about them as they can by pursuing traditional content marketing strategies.

    The Bottom Line

    While these updates all surprised people when they came out, and their specific changes are still being analyzed and debated, they all share one fundamental quality: they were rolled out to improve user experience.

    Panda was rolled out because too many webmasters were posting spammy, low-quality, keyword-stuffed content. The update sought to improve user experience by promoting sites with more relevant, valuable content.

    Penguin was rolled out because the web was filling up with keyword-stuffed, random backlinks. The update sought to improve user experience by penalizing the culprits behind such spammy practices.

    Pigeon was rolled out because the scope of local businesses online was getting more diverse, and users needed a more intuitive way to find the ones that best met their needs. Pigeon sought to improve user experience by adding sophistication to Google’s local business ranking process.

    User experience is the name of the game, and it’s the sole motivation behind each of Google’s landmark updates.

    Building Off of Old Structures

    Since their release, Panda and Penguin have been subject to countless new iterations. Data refreshes and updates tend to occur on an almost monthly basis, while major updates have been rolled out annually—Panda 4.0 and Penguin 3.0 both rolled out in the past few months. Pigeon is still relatively new, but chances are it will see some expansion as well.

    For now, it seems that Google is trying to build off of the structures that already exist within the confines of its greater algorithm. Rather than trying to introduce new categories of search ranking factors, Google is refining the categories it’s already introduced: onsite, offsite, and now local. It’s likely that Google will continue this trend for as long as it continues to improve user experience, gently refining its quality criteria and targeting emerging black hat tactics as they arise.

    However, it’s only a matter of time before Google discovers a new category of refinement. When it does, the update will likely be just as surprising as the big three, and will warrant its own series of updates and refinements.

    What the Next Overhaul Could Bring

    If we’re going to predict the nature of the next update, we need to understand two things: the emergence of new technology and the fundamental focus Google maintains on improving user experience. The next major Google update will probably have something to do with significantly improving the way users interact with one or more rising technologies.

    The Knowledge Graph

    One option is a radical expansion of the Google Knowledge Graph. The Knowledge Graph, that box of helpful information that appears to the side when you search for a specific person, place, or thing, is changing the way people search—instead of clicking on one of the highest-ranking links, they’re consulting the information displayed in the box. The next Google update could change how prominently this box appears, and how it draws and presents information from other sites.

    Third Party Apps

    Google has already shown its commitment to improving user experience through the integration of third-party apps—it’s favoring third-party sites like Yelp and UrbanSpoon in search results, and is integrating services like OpenTable and Uber in its Maps application. The next search algorithm update could start drawing more information in from these independent applications, rather than web pages themselves, or it could use app integrations as a new basis for establishing authority.

    The Rise of Mobile

    Smartphones are ubiquitous at this point, but wearable technology is still on the rise. The swell of user acceptance for smart watches could trigger a new update based around proximity searches, voice searches, or some other facet of smart watch technology. Since smart watches are in their infancy, it’s difficult to tell exactly what impact they will have on search.

    No matter what kind of update Google has in store for us next, it’s bound to take us by surprise at least slightly. We can study its past updates and the new technologies on the horizon all we want, but Google will always be a step ahead of us because it’s the one in control of the search results. The only things we know for sure at this juncture are that Google will eventually release another massive update, and that its goal will be improving user experience.

  5. 8 Changes You Need to Make After Panda 4.1

    After four months of silence on the Google Panda front following May’s Panda 4.0 update, the next iteration of Panda is here. Referred to as Panda 4.1, the update isn’t big enough to warrant the title of “5.0,” but it is significant enough to have search marketers scrambling.

    Building on the intentions of its predecessors, Panda 4.1 continues Google’s tradition of gradually weeding out low-quality content in favor of well-written, informative, engaging content. Sites with aggregated or copied content, such as lyric databases and medical content hubs, seem to have been hit the hardest by this iteration of Panda, suggesting that Google’s duplicate content detection is becoming more sophisticated. On the flip side, small- to medium-sized businesses with diverse original content are seeing a boost.

    The update started rolling out officially on September 25, 2014, and became active in gradual updates that spanned through the first week of October. Most companies have already seen the gains or losses from this update, so if you haven’t noticed your rankings change in the past few weeks, don’t worry—Panda 4.1 probably didn’t affect you.

    Still, Panda 4.1 has changed the world of search yet again, and if you want to take advantage of it and prepare for the next phases of Google’s evolution, there are several strategic changes you’ll need to make:

    1. Scour your site for duplicate content—and get rid of it.

    Sites with volumes of duplicated content are the ones that have been hit hardest by Panda 4.1. Now is your chance to get rid of the dead weight. Look throughout your site and your blog to find any articles that might be partly replicated from an outside source. Just because you don’t plagiarize work doesn’t mean you’re not at risk—extended quotes, attributed work from outside authors, and paraphrased sections could all register as duplicated material, and could hurt your overall ranks. If you find any content that could be seen as a duplicate of another source, get rid of it.
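
    If your site has more than a handful of pages, a script can make the first pass of this search for you. Below is a minimal sketch using Python’s difflib; the page texts and the 0.85 similarity threshold are placeholder assumptions on our part, not values Google publishes.

    ```python
    # Compare every pair of pages and flag suspiciously similar ones.
    # The page list and the 0.85 threshold are illustrative placeholders.
    from difflib import SequenceMatcher
    from itertools import combinations

    pages = {
        "/blog/post-a": "Full extracted text of post A...",
        "/blog/post-b": "Full extracted text of post B...",
        "/about": "Full extracted text of the about page...",
    }

    for (url_a, text_a), (url_b, text_b) in combinations(pages.items(), 2):
        ratio = SequenceMatcher(None, text_a, text_b).ratio()
        if ratio > 0.85:  # tune this against your own content
            print(f"Possible duplicate: {url_a} vs {url_b} ({ratio:.0%} similar)")
    ```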

    2. Do a content audit and remove or improve “thin” content on your site.

    “Thin” content is a vague term, referring to content that is densely packed with keywords, light on value or specificity, or shoddily written. We’ve all seen content like this, so it should stick out like a sore thumb—especially in comparison to a longer, more detailed piece. Go through your previously published material and review the pieces of content that look like they’ve been scraped together. You have two options for these pieces: either delete them, or take the time to revise them and turn them into a similar but more valuable piece.
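
    There’s no official definition of “thin,” but a rough heuristic can shortlist candidates for review. Here’s a sketch in Python; the 300-word floor and the 5 percent keyword-density ceiling are arbitrary starting points of ours, not Google’s numbers.

    ```python
    # Flag posts that are very short or that lean too heavily on one keyword.
    import re
    from collections import Counter

    def audit_post(url, text, min_words=300, max_density=0.05):
        words = re.findall(r"[a-z']+", text.lower())
        if len(words) < min_words:
            print(f"{url}: only {len(words)} words -- expand or remove")
            return
        word, count = Counter(words).most_common(1)[0]
        if count / len(words) > max_density:
            print(f"{url}: '{word}' makes up {count / len(words):.0%} of the text")

    # A contrived thin page: 80 words of repeated filler.
    audit_post("/blog/thin-example", "buy widgets " * 40)
    ```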

    3. Adjust your content strategy to include only the highest quality material.

    Depending on the current level of your content marketing strategy, this change could be enormous or barely noticeable. Moving forward, all your content needs to be of the highest quality—that means based on an original idea, written by an expert, and highly detailed. Don’t worry as much about the frequency of your posts; if a piece of content isn’t as high quality as you’d like it to be, do not publish it. It’s better to have a smaller number of better-quality posts than a greater number of lesser entries. You may be doing this already, but it’s still a good idea to revisit your strategy and see what positive changes you can make.

    4. Add more outbound authoritative links to your content.

    Google wants to see high-quality, authoritative content. If you want to be seen as authoritative, you need to back up your facts and provide references to support your claims. The best way to do that is to provide in-text links pointing to outside, authoritative sites. It’s a way of leveraging the current status of well-established sites to bolster your own authority. As you continue writing new content, experiment with posting more outbound links to build your own credibility. Make sure to use a diverse range of sources to avoid spamming any one source with an excessive number of backlinks.
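
    One way to keep your sources diverse is to tally the external domains you link to, as in this sketch. It assumes the beautifulsoup4 library is installed, and the sample HTML stands in for a real page.

    ```python
    # Count outbound links per external domain so no single source
    # collects an excessive share of your references.
    from collections import Counter
    from urllib.parse import urlparse
    from bs4 import BeautifulSoup

    html = """
    <p>See <a href="https://example.edu/study">this study</a>,
    <a href="https://example.org/report">this report</a>, and
    <a href="https://example.org/followup">its follow-up</a>.</p>
    """

    soup = BeautifulSoup(html, "html.parser")
    domains = Counter(
        urlparse(a["href"]).netloc
        for a in soup.find_all("a", href=True)
        if urlparse(a["href"]).netloc  # skip relative (internal) links
    )
    for domain, count in domains.most_common():
        print(f"{domain}: {count} outbound link(s)")
    ```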

    5. Include more images in your posts.

    Embedded images in your blog posts do two things: first, they look more enticing to your readership, giving you greater reader retention and more reader satisfaction. Second, they give your content the appearance of detail, and make your content seem more valuable according to Google. Include infographics in the body of your blog posts to illustrate a point with information; if they are original, they’ll naturally attract backlinks and assist your strategy in multiple ways. Otherwise, include any relevant images you can find (as long as they’re legal to use) to complement the text on your page.

    6. Publish author credentials to establish author expertise.

    According to the recent leak of Google’s Quality Rater Guidelines, author expertise is an important factor in evaluating the authoritativeness of a piece of content. Instead of trying to make your content seem like it was written by an expert, have your content actually written by an expert. Include author credentials at the bottom of each published article, identifying the author’s name, title, and area of expertise. If you do this consistently, and offsite content also features this author’s name, you’ll effectively build that author’s authority, and your content will be seen as higher quality. It’s a small change that could add up to major results.
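
    The leaked guidelines don’t prescribe any particular markup, but one common way to expose author credentials to machines as well as readers is schema.org JSON-LD. The author details below are invented for illustration.

    ```python
    # Emit a schema.org Article block naming the author and credentials.
    # All names and URLs here are made-up examples.
    import json

    author_block = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": "Example Post Title",
        "author": {
            "@type": "Person",
            "name": "Jane Doe",
            "jobTitle": "Senior Analyst",
            "url": "https://example.com/authors/jane-doe",
        },
    }

    print(f'<script type="application/ld+json">{json.dumps(author_block, indent=2)}</script>')
    ```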

    7. Build more high-quality links to your content.

    Despite all the changes that the Penguin updates have made to the world of backlink building, backlinks are still tremendously important for building a site’s authority. This change is essentially the strategy I covered in point 4, but in reverse. If a high-quality site, such as an information database or a .edu site, links to one of your articles, that article will be seen as much more credible, giving you a Panda-proof boost in authority. If you can incorporate more of these high-authority backlinks into your link building campaign, your domain’s overall authority will significantly increase.

    8. Perform content audits regularly.

    The best ongoing new strategy you can adopt in response to Panda 4.1 is a regular content audit. On a monthly or bi-monthly basis, take an hour to review all the new onsite content that’s been published since your last audit. Carefully review each piece to determine its quality; check for originality, authoritativeness, and level of detail. If any of these pieces does not meet your quality standards, either get rid of it or revise it to make it comply. Doing this regularly keeps you vigilant, and keeps your content quality from ever declining or putting you at risk for another Panda-related drop in rank.

    Google is notorious for keeping online marketers on their toes, and it has continued that reputation with this latest update. With Panda 4.0 coming in May and 4.1 rolling out in September, Google could be back on a quarterly (or near-quarterly) updating pattern, like it was for previous iterations of Panda. If that’s the case, it could mean another major update is on the horizon for December or January.

    Stay sharp and keep your strategy up-to-date, and you’ll likely ride past the next Panda update with no mysterious drops in rank. You might even get a boost!

  6. Here’s Why Retailmenot Got Hit by Google Panda

    Leave a Comment

    Retailmenot.com, a growing coupon-based website backed by Google Ventures, experienced a stunning drop in both search engine rankings and organic traffic back in May of this year. Google Panda 4.0, the name given to May’s major Panda update, supposedly affected only about 7.5 percent of all search queries, but Retailmenot.com took a much bigger hit than expected.

    The incident has raised a lot of questions in the search engine marketing community, particularly focused on how Google Panda 4.0 works, and why Retailmenot.com was hit so hard despite being backed by Google’s own venture capital investment. The problem is multifaceted, but by understanding exactly what happened, you can protect your own web properties against a similar potential drop in the future.

    What Happened?

    First, let’s take a look at exactly what happened. According to SearchMetrics, by May 21, Retailmenot.com experienced an approximate drop in organic search visibility of 33 percent. This drop does not measure the amount of organic traffic a website receives, but there is a correlation between organic visibility and organic traffic. Essentially, this metric illustrates a cumulative drop in rankings over several keywords that adds up to a third less visibility.

    In a broader context, 33 percent doesn’t seem so bad. After all, sites like spoonful.com and songkick.com experienced an approximate drop of 75 percent. But Retailmenot.com’s popularity and support from Google Ventures make it an interesting representative of the Panda update. Other sites, such as medterms.com, experienced as much as a 500 percent increase in visibility—so we know the update wasn’t only giving out penalties. So why, exactly, was Retailmenot.com penalized?

    The Mechanics Behind the Drop

    The drop was first acknowledged around May 21, just one day after the rollout of Panda 4.0. There is no question that this update is primarily responsible for the significant drop in Retailmenot.com’s rankings. The Panda Update, which started back in 2011, has a clearly defined purpose: to improve user online experience by eliminating spam and pages with low-quality or minimal content from search results. Since 2011, several new iterations of the Panda update, along with occasional “data refreshes” have been applied to Google’s search algorithms in an ongoing attempt to improve search results.

    Panda 4.0, in May, was the latest “major” update. While the update surely introduced some new ranking signals and algorithm mechanics, the baseline goal of the update is in line with its Panda predecessors: to reward sites with high-quality content and punish those without it. Google is notorious for keeping its algorithms private, so it’s impossible to say exactly which factors were responsible for shaking Retailmenot.com’s visibility, but it seems like an inferior content program was at the heart of it.

    Why It Matters

    First, let’s take a look at why this hit was such a big deal for Retailmenot.com. A drop of 33 percent in search visibility doesn’t seem like that much on the surface; it could be the result of a handful of dropped ranks. The world of search is volatile at best, so occasional drops aren’t that big of a deal for most companies (especially big ones like Retailmenot.com). But this particular drop did have a significant impact on Retailmenot.com’s bottom line.

    CEO Cotter Cunningham reported in a conference call for Retailmenot.com’s Q2 earnings that organic search traffic represented approximately 64 percent of their total visitors—which is a big deal. Cunningham did report that Retailmenot.com has steadily recovered from the initial drop in rankings, but when 64 percent of your customers are affected by a change in Google’s algorithms, you take notice. Their stock price (SALE) closed at $31.04 at the end of trading on May 21, but by May 23, it had dropped to a low of $23.87. Their stock still has not returned to its original levels, but of course this is likely due to several factors.

    Why does this matter to you? Retailmenot.com is just one example of how significant an algorithm change can be. Preparing yourself for possible future changes, and responding immediately to any ranking drops, can help prevent or mitigate the effects of lost organic search visibility. Retailmenot.com wasn’t necessarily engaging in black hat practices; if they were spamming backlinks and posting total junk content, they would have experienced a much larger drop than they did. Instead, it appears as though the volume and quality of their content was just outside of Google’s latest standards, and as we know, those standards are constantly increasing.

    So you know it’s important to protect yourself against major search engine drops like these by committing yourself to providing your users with the best possible online experience. But the Retailmenot.com drop is also significant because it shows us that recovery is possible. CEO Cotter Cunningham also reported in their Q2 earnings conference call that some organic search visibility had been restored, and their revenue was still close to their original target.

    It’s also interesting to consider why Retailmenot.com was hit by Panda despite being backed by Google Ventures. While Google does seem biased in many of its actions (such as adjusting their search engine algorithms to favor content on their own social media platform, Google+), the fact that a GV-backed site was hit by a major update is evidence that Google has unflinching, equal standards for web quality. Let’s hope this unprejudiced stance remains as they continue to roll out more and more changes.

    How to Safeguard Your Site

    Google Panda 4.0 is over. If you were going to get hit by it, you’ve already seen the damage, so if you haven’t noticed any significant outliers in your search ranking data, you’ve successfully avoided getting hit by Panda 4.0. If you have experienced a penalty from any stage of the Panda update thus far, it’s time to remove those penalties and start making a recovery.

    However, as evidenced by Retailmenot.com’s recent debacle, just because you escaped from a few penalties unscathed doesn’t mean you’ll avoid getting hit by future updates. If you want to make sure your organic search visibility stays where it is and continues to grow, you need to double check your strategy to make sure you’re complying with Google’s standards for user experience:

    • Write high-quality, original content on a regular basis and make it easy for your users to find, read, and share. Explore a diverse range of topics, avoid keyword stuffing, and make sure your subjects are valuable to your readership. Multiple types of content, including writing, images, and videos, are encouraged.
    • Encourage organic backlinking, but don’t spam links to your site. Keep any links you post as natural and beneficial as possible, and if you guest post often, consider using nofollow links to keep Google at bay.
    • Promote your content with social media, and encourage your followers to share and comment.
    • Keep your site as user-friendly as possible with an easy-to-follow navigation, ample opportunities to contact you, fast loading times, and minimal interference.

    Google is somewhat unpredictable, and because updates always come without warning, it’s impossible to completely prevent any possible drop in organic search visibility. Still, if you adhere to best practices consistently and do everything you can to give your users a great experience, you should be able to avoid a hit like the one experienced by Retailmenot.com.

  7. How to Prepare for Penguin 2.0: Take Off that Black Hat!

    What do Penguins, Pandas, and black hats have in common? Lots! Penguin is the most recent set of guidelines published by Google designed to clean up abuses in the field of SEO, and a new version is due out soon, according to Google’s Web Spam Czar, Matt Cutts. The impending event has marketers, reputation managers, and webmasters scurrying for cover.

    SEO – A Concept Recap

    SEO (search engine optimization) is a relatively young public relations discipline that tries to increase the visibility of websites through the strategic placement of keywords, content, and social media interaction, and the industry has grown rapidly in a little over a decade.

    Carried to extremes, as such things always are, black-hat SEO is a subdivision within the field that tries to achieve money-making results in an unsustainable way (i.e., against Google’s webmaster guidelines). It frustrates the very purpose of a search engine, which is to help users find the information they need. Instead, rampant SEO gone amok serves only the needs of online marketers wishing to increase sales for themselves or their clients.

    To readjust the proper balance, Mr. Cutts and his team of “penguin” police have attempted to establish guidelines that will rule out the most abusive practices of black hat SEO.

    Black Hat SEO – Are You Doing It?

    The predecessor to Penguin was Panda, with much the same purpose. Panda included a series of algorithm updates, begun in early 2011. These were aimed at downgrading websites that did not provide positive user experiences.

    Panda updates of the algorithm were largely directed at website quality. The term “above the fold” is sometimes used to refer to the section of a website that a user sees before scrolling down. The term comes from newspapers, which are delivered folded in two. The section that is “above the fold” is the section one sees before opening the paper, or unfolding it.

    Typically, marketers wish to cram as much eye-catching, commercial material as possible into this section, while responsible journalists wish to pack it with the most relevant and useful information.

    Penguin, on the other hand, is targeted more specifically at keyword stuffing and manipulative link building techniques.

    One targeted abuse, keyword stuffing, is not a tasty Thanksgiving delicacy, but the practice of loading the meta tag section of a site, and the site itself, with useless repetition of certain words. Sites can lose their ranking altogether as a result of such stuffing.

    Abusive practitioners of keyword stuffing are not above using keywords that are rendered invisible because their font color is identical to the background color. The user doesn’t see them, but the search engine spider does. This practice was soon discovered, however, and dealt with by the search engines.

    Meta tags are sometimes placed behind images, or in “alternative text” fields, so that the spiders pick them up while they remain invisible to users. Popular or profitable search keywords are sometimes included in ways invisible to humans but visible to the search crawlers. Very clever, but also soon discovered and dealt with. With Penguin, Google now analyzes the relevance and subject matter of a page much more effectively, without being tricked by keyword-stuffing schemes.
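
    As a toy illustration of why this trick became easy to catch, here’s a sketch that flags inline styles whose text color matches the page background. It assumes the beautifulsoup4 library and a deliberately contrived page; real detection is far more involved than this.

    ```python
    # Flag text whose inline color matches the body background color.
    from bs4 import BeautifulSoup

    html = '<body bgcolor="#ffffff"><span style="color:#ffffff">cheap widgets</span></body>'
    soup = BeautifulSoup(html, "html.parser")
    bg = soup.body.get("bgcolor", "").lower()

    for tag in soup.find_all(style=True):
        style = tag["style"].replace(" ", "").lower()
        if bg and f"color:{bg}" in style:
            print("Hidden text candidate:", tag.get_text())
    ```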

    “Cloaking” is another tactic that was used for a while to present a different version of a site to the search engine’s crawler than to the user. While a legitimate tactic when it tells the crawler about content embedded in a video or Flash component, it became abused as a Black Hat SEO technique, and is now rendered obsolete by the technique of “progressive enhancement,” which tailors a site’s visibility to the capabilities of the user or crawler. Pornographic sites have often been “cloaked” in non-pornographic form as a way of avoiding being labeled as such.

    The first set of Penguin guidelines and algorithms went live in April 2012, and the second main wave is due out any day now (though Penguin has gone through several periodic updates since its initial release). It’s designed to combat an excessive use of exact-match anchor text. It will also be directed against links from sources of dubious quality and links that are seen as unnatural or manipulative.

    The trading or buying of links will be targeted as well. The value of links from directories and bookmarking sites will be further downgraded, as will links from content that’s thin or poor-quality. Basically, the revision in the algorithms will be designed to rule out content that serves the marketer’s ends rather than the users’.

    Advice For SEO Marketers To Stay Clean

    If you are a professional SEO, the questions to ask yourself are:

    • Is this keyword being added in order to serve the customer’s potential needs, or is it designed merely to increase the number of hits? If the latter, the additional users brought to the site by the keyword probably don’t represent high-quality conversion potential.
    • Is the added SEO material being hidden from the user or the search engine crawler? If so, with what purpose? If that purpose amounts to dishonest marketing practices, the material runs the risk of getting you in trouble with Penguin.
    • What’s the overall purpose of your SEO strategy? If it’s anything other than increasing sales by enhancing user experience, then you may expect an unwelcome visit from Penguin.

    If you’re a user, you’ll very likely not be as conscious of these changes, except inasmuch as they will alter the look of your search results page when you perform a search in Google. Will the new Penguin algorithms cut down on those ubiquitous “sponsored links” or “featured links”? Probably not. But savvy users know how to ignore those links by now, except of course when they turn out to be useful.

    Will the new algorithms enhance the overall usefulness of the search engine experience? Probably, at least marginally, and perhaps even in a major way. The whole field of internet marketing and e-Commerce is changing so rapidly and radically that it’s hard to keep track of the terminology, especially the proliferation of acronyms. But the ultimate goal will be an enhanced user experience.

  8. Why Duplicate Content is Bad for SEO and What to Do About It

    With the rollout of Google Panda, we have heard sad stories of sites that have been either devalued or removed from Google’s index entirely.

    One of the reasons for the huge drop in some sites’ rankings has been duplicate content — one of the problems that Panda was released to control.

    Most of the sites that have experienced a drastic decrease in rankings were content farms and article directories; that is, sites loaded with thousands of duplicate articles.

    While it had been made clear that duplicate content was one of the primary things Panda frowns on, some content authors breathed a sigh of relief after Google appeared to say that “there’s no such thing as a duplicate content penalty” in a blog post several years ago.

    But duplicate content remains a controversial issue. It has kept bloggers and webmasters nervous about publishing content that could hurt their rankings. Like many other things, there are two sides to the issue. There’s duplicate content that Google allows and there’s the type that hurts your website’s rankings.

    Let’s try to clear up the difference.

    What type of duplicate content can hurt your rankings?
    To determine whether a sample of duplicate content is going to pull down your rankings, first you have to determine why you are going to publish such content in the first place.

    It all boils down to your purpose.

    If your goal is to game the system by using a piece of content that has been published elsewhere, you’re bound to get penalized. The purpose is clearly deceptive and intended to manipulate search results.

    This is what Google has to say about this sort of behavior:

    “Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results.”

    If Google has clear evidence that you are trying to manipulate your search rankings, or that you are practicing spammy strategies to try to improve rankings and drive traffic to your website, it could result in your site being removed from Google’s index.

    The effects on users
    Publishing duplicate content could also hurt your reputation in the eyes of the users.

    The ultimate goal of the search engines is to provide users with the most valuable, most useful, and most relevant information. If you publish a bit of content that has been previously published elsewhere, your site may not show up for the same search, because search engines tend to show results only from the main content sources.

    This explains why the search engines omit duplicate results to deliver only those that the users need.

    When users read content on your site that they have already seen previously on a site that they trust more, chances are their trust in your site will diminish.

    But is there ever a case where duplicate content is acceptable?

    When duplicate content can be acceptable
    There may be instances when duplicate content is accidental, and therefore should not lead to any penalties.

    One such instance is when a search engine’s index identifies separate URLs within a domain that all point to a single piece of content. An example is the following trio of URLs: http://abc.com, http://www.abc.com, and http://www.abc.com/index.htm. There’s clearly no indication of manipulation or intent to spam in this case.
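
    You can check how your own URL variants behave with a few lines of Python and the requests library. The abc.com addresses are just the placeholders from the example above; substitute your own domain.

    ```python
    # Follow redirects for each URL variant and report the final address.
    import requests

    variants = [
        "http://abc.com",
        "http://www.abc.com",
        "http://www.abc.com/index.htm",
    ]

    finals = set()
    for url in variants:
        r = requests.get(url, allow_redirects=True, timeout=10)
        finals.add(r.url)
        print(f"{url} -> {r.url} ({r.status_code})")

    if len(finals) > 1:
        print("Variants end at different URLs -- consider 301s or rel=canonical")
    ```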

    Another case of legitimate duplication occurs when a piece of content is published in several different formats to cater to specific users. With the explosion of mobile web browsing, content is now published to suit desktop, tablet, and mobile phone web users. Publishing a single piece of content in several formats is not subject to any penalties for duplicate content.

    Also, keep in mind that there are instances when publishing a copy of a piece of content, in part or in whole, is needed for reference, such as when citing a news source. If the purpose is to reference or to add value to users, such content duplication is not subject to penalties.

    Avoiding duplicate content that provokes the Panda’s wrath
    Simply put, avoiding duplicate content is your best defense against any duplicate content penalties administered by Google Panda. Remember that Google and other search engines strive to provide search results that are unique and of high quality.

    Your goal must therefore be to publish unique and original content at all times.

    However, if duplication cannot be avoided, below are recommended fixes that you can employ to avert penalties:

    Boilerplates. Long boilerplates or copyright notices should be removed from various pages and placed on a single page instead. In cases where you would have to call your readers’ attention to boilerplate or copyright at the bottom of each of your pages or posts, insert a link to the single special page instead.

    Similar pages. There are cases when similar pages must be published, such as separate pages on SEO for small businesses and SEO for big businesses. Avoid publishing the same or similar information. Instead, expand on both services and make the information very specific to each business segment.

    Noindex. People could be syndicating your content. If there’s no way to avoid this, include a note at the bottom of each page of your content that asks syndicators to add a “noindex” metatag to the syndicated copy, preventing the duplicate content from being indexed by the search engines.

    301 redirects. Let the search engine spiders know that a page has permanently moved by using 301 redirects. This also alerts the search engines to remove the old URL from their index and replace it with the new address. (A minimal sketch of this fix appears after this list.)

    Choosing only one URL. There might be several URLs you could use to point to your homepage, but you should choose only one. When choosing the best URL for your page, be sure to keep the users in mind. Make the URL user-friendly. This makes it easier not only for your users to find your page, but also for the search engines to index your site.

    Always create unique content. Affiliates almost always fall victim to the convenience of ready-made content provided by merchants. If you are an affiliate, be sure to create unique content for the merchant products you are promoting. Don’t just copy and paste.
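
    As promised above, here’s a minimal sketch of the 301-redirect fix using Flask—an assumed framework choice on our part; an .htaccess rule or any server configuration accomplishes the same thing. For the noindex fix, the syndicated copy would simply carry <meta name="robots" content="noindex"> in its head.

    ```python
    # The old URL answers with a permanent (301) redirect to the new one,
    # telling crawlers to drop the old address and index the new page.
    from flask import Flask, redirect

    app = Flask(__name__)

    @app.route("/old-page.htm")
    def old_page():
        return redirect("/new-page", code=301)

    @app.route("/new-page")
    def new_page():
        return "<h1>The content now lives here</h1>"

    if __name__ == "__main__":
        app.run()
    ```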

    Conclusion
    Whatever your intent is, the best way to avoid getting penalized by Google Panda is to avoid creating duplicate content in the first place. Keep in mind that quality is now at the top of the search engines’ agenda.

    It should be yours too.

  9. SEO Mistakes the Panda and the Penguin Forbid You to Commit

    Leave a Comment

    For Google and other major search engines, quality and reliability of information are key to user satisfaction. These elements also empower the search engines to thrive as they seek to provide better data to users in terms of quality, relevance, and authority.

    And who is king of the SEO hill?

    Google, of course. And Google shows no sign of loosening its stranglehold on the universe of SEO.

    Via one algorithmic update after another, Google is wreaking havoc on people who found loopholes in the system to advance their personal interests. For years, these smooth operators devised tricks to manipulate their way to the top of search engine results pages.

    With the Panda and Penguin updates, Google may have finally patched the holes that allowed spammers to litter search engine results with garbage.

    More recently, Google rolled out newer versions of the Panda and Penguin updates. In the hope of making the Internet a better place to host and find high-quality and extremely useful information, Google supplied webmasters and business owners with guidelines to help them play the SEO game in a fairer manner.

    So let’s talk about some of the mistakes that every webmaster and online business owner should avoid so as not to get slapped by Google. We’ll also discuss some recommendations on how to optimize your site properly for Panda and Penguin.

    But first, a brief review of what the two major Google updates are all about.

    Google Panda
    The Panda was the first of the two major overhauls that Google rolled out in less than two years. It offered an initial glimpse of how the mighty Google intended to provide better search engine results.

    The main goal of Panda was to sniff out sites that carry low-quality content, or in Panda-speak, "thin" content. What Google Panda generally looked for were sites with obviously spammy elements such as keyword stuffing, duplicate content, and, in some cases, a high bounce rate.

    Google Penguin
    Although at first it might have sounded cute and cuddly to Internet users, the Penguin quickly showed them otherwise. This update zeroed in on sites that were over-optimized in terms of backlinking.

    One of the most widely practiced link-building tactics prior to Penguin’s appearance was to use exact-match keywords for anchor texts. The problem with this tactic is that Penguin reads it as an unnatural linking practice.

    To promote natural backlinking, Penguin set out to penalize sites that routinely used exact-match keywords for anchor texts, and rewarded those smart enough to employ variations in their keywords.

    The top SEO mistakes you should avoid at all times
    Now that you have been reminded of what Panda and Penguin want and how they’d like us to play the SEO game, keep the following pitfalls in mind to avoid seeing your site take the deep plunge down the search results pages.

    1. Using mostly exact-match keywords for backlinks
    This used to be one of the most effective ways to get a site to rank higher in search results. These days, the tactic can be used only sparingly and with caution: now that Penguin is policing the info highway, relying mostly on exact-match keywords is a sure way to get your site devalued.

    To gain or maintain favorable ranking, observe natural link-building best practices. Post-Penguin SEO calls for you to vary your keywords by using related terms. If you are optimizing for “baby clothing,” for example, use keyphrases such as “kids’ clothing,” “clothing for babies,” etc. It’s also a good idea to use your brand’s name as anchor text.

    The primary thing to remember is to link naturally. Don't be too concerned about failing to corner exact-match keywords that you think could hugely benefit your business. After all, Google is moving toward latent semantic indexing (LSI), which takes related keyphrases into consideration for smarter indexing.
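    As a quick self-audit, and purely as an illustration (the anchor list and the 50 percent threshold are assumptions, not a Google rule), you could tally the anchor texts in an exported backlink report and flag over-reliance on a single exact-match phrase:

    ```python
    from collections import Counter

    # Hypothetical anchor texts pulled from a backlink report export.
    anchors = [
        "baby clothing", "baby clothing", "baby clothing",
        "clothing for babies", "kids' clothing", "YourBrand",
    ]

    counts = Counter(a.lower() for a in anchors)
    top_anchor, top_count = counts.most_common(1)[0]
    share = top_count / len(anchors)

    # Assumed threshold: if one phrase dominates the profile, the
    # linking pattern may look unnatural to Penguin-style filters.
    if share > 0.5:
        print(f"Warning: '{top_anchor}' makes up {share:.0%} of anchors")
    else:
        print(f"Top anchor '{top_anchor}' at {share:.0%} looks diversified")
    ```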

    2. Generating most of your traffic via only one marketing channel
    Many marketers, especially new ones, tend to assume the only way to gain a huge amount of highly targeted traffic is by focusing time and energy on a single marketing channel. Some only use SEO, while others concentrate their efforts on social media marketing.

    Focusing your attention on one channel could bring success in terms of gaining some very targeted traffic, but with regard to ranking, it could actually hurt you, especially since the Panda and Penguin rollouts.

    Again, diversify not only your keywords and keyphrases, but also the channels that drive traffic to your site. Apart from SEO, a smart traffic strategy will involve the following tactics:

    • Article marketing
    • Social media pages for your business
    • Guest posting
    • Social bookmarking
    • Forum posting and blog comments
    • Press releases

     

    By diversifying your traffic sources, you will create a natural way for your audience to find your business at different online locations — a signal that will get you favorable rankings in search.

    3. Failing to take advantage of internal linking
    Failing to take full advantage of internal linking is a missed opportunity, and doing no internal linking at all is even worse. Internal linking not only improves the user experience; it's also good for onsite SEO.

    With strategic and meaningful internal linking, you will make it easy for your users to find their way around your site and locate the information they want. Your users will also have more good reasons to linger on your site as they consume more information related to what they are looking for.

    Proper internal linking also enables search engine spiders to determine which content is related to other content.

    Proper internal linking can be executed by including the following:

    • Standard navigation above the fold — more specifically, above the content
    • Category section on sidebar
    • Related posts on the sidebar or below each post (see the sketch after this list)
    • Links within each post that point users/readers to related content
    • Sitemap
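    To illustrate one of these elements, here's a toy Python sketch (the posts and tags are made up) of the kind of logic a related-posts widget might use, ranking other posts by how many tags they share:

    ```python
    # Toy example: pick related posts by counting shared tags.
    posts = {
        "panda-recovery-guide": {"panda", "content", "recovery"},
        "penguin-link-audit": {"penguin", "links", "audit"},
        "writing-quality-content": {"content", "writing", "panda"},
    }

    def related_posts(slug, n=2):
        tags = posts[slug]
        scored = ((len(tags & t), s) for s, t in posts.items() if s != slug)
        ranked = sorted(scored, reverse=True)
        return [s for overlap, s in ranked if overlap > 0][:n]

    print(related_posts("panda-recovery-guide"))
    # -> ['writing-quality-content'] (shares the "panda" and "content" tags)
    ```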

     

    4. Publishing content with very little value
    In Google Panda-speak, this is known as "thin" content. Panda rolled out to hammer sites that carry duplicate information and promote content that offers very little value to users. Such sites are often stuffed with keywords and overly promotional.

    To avoid getting smacked by the Panda’s giant paw, think critically about the value your users are likely to get from your content: Is it unique? Will it help them solve their most pressing concerns? Will the content fulfill its promise?
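    As a rough self-check only, here's a small Python sketch that flags two of the red flags mentioned above, very low word count and heavy keyword repetition. The thresholds are arbitrary assumptions for illustration, certainly not Panda's actual formula:

    ```python
    import re

    def thin_content_flags(text, keyword, min_words=300, max_density=0.03):
        """Flag a draft for low word count or keyword stuffing.

        Assumes a single-word keyword; thresholds are illustrative only.
        """
        words = re.findall(r"[a-z']+", text.lower())
        density = words.count(keyword.lower()) / max(len(words), 1)
        flags = []
        if len(words) < min_words:
            flags.append(f"only {len(words)} words (min {min_words})")
        if density > max_density:
            flags.append(f"keyword density {density:.1%} (max {max_density:.0%})")
        return flags

    draft = "Buy baby clothing here. Our clothing shop sells clothing at low prices."
    print(thin_content_flags(draft, "clothing"))
    ```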

    Conclusion
    Are we seeing the beginning of the end of SEO manipulation? Let’s hope so.

    As Google shoves spammers way down its search results, the hope is that Google’s first few pages will feature nothing but extremely valuable and useful information that really meets users’ expectations. And as a site owner and online entrepreneur, you can depend on Google Panda and Penguin to improve your standards as you strive to deliver what your audience is looking for.

    For more information on properly optimizing your site, contact us and we’ll show you your options to make your site Google compliant.

     

  10. The Google Panda 20 Update: Some Key Information

    Leave a Comment

    Google is making everyone aware of the company’s relentless drive to supply better and more useful information.

    Following a series of Panda and Penguin updates, Google Panda #20 was released on September __, 2012. This was the first major update since July 24, 2012. More Panda updates are expected to be released within the next few months, and future updates will be more consistent.

    Unlike other recent releases, Panda #20 was a fairly major update, one that ran for almost a week. In fact, some 2.4% of English queries were affected, along with about 0.5% of queries in other languages.

    Another interesting thing about Panda #20 is that it overlapped with another algorithmic update dubbed the EMD update, which Google rolled out to target sites with low-quality, exact-match domain names.

    This made it tricky for affected SEOs and site owners to determine which update had hit them. Was their site hit for failing to comply with Google Panda standards, or for having a poor exact-match domain name?

    Panda was released to devalue sites with “thin content,” or content that offers minimal value. Since its release last year, tons of sites have seen a dramatic drop in rankings. Some, especially notorious content farms, have been removed from Google’s index altogether.

    Panda also targeted sites that contained duplicate content. As a result of Panda's release, black-hat SEO practices took a significant beating. Sites that churned out hundreds of pages of duplicate content were obliterated.

    The release of Panda, along with its equally ferocious sibling Penguin, also met with complaints. Years of hard work and substantial marketing dollars spent pushing a site to the top of Google's rankings were effectively tossed out the window. SEOs, publishers, and site owners who believed they had been following recommended SEO best practices cried foul.

    The hard lesson we can all learn in the aftermath of Google’s algorithmic changes is that, while it’s true that quality is subjective, standards have been laid out.

    The stress on quality and authority

    Quality and relevance of information are at the heart of every substantial change Google rolls out. To ensure that every site is on the same page with Google, the company has laid out guidelines for site owners to follow.

    As far as Panda is concerned, as long as your site’s content is original and offers quality and useful information, you should be fine.

    As long as the content strongly relates to its topic and offers great value to your audience, there shouldn't be any reason for Google to slap you.

    Do your link-building activities follow the prescribed or accepted methods? Have you been linking to authority sites, and are authority sites linking back to yours? Do you make a point of regularly checking your link profiles for any potentially damaging links?

    There’s no way of telling how many more of Google’s Panda updates are coming in the future, but Matt Cutts has made it clear that Google Panda will be updated and refreshed on a continual basis. This shows how committed Google is to making the Internet a better and more reliable avenue for gleaning valuable information.

    Conclusion

    It’s crucial to keep abreast of the periodic algorithmic changes that Google rolls out. Keeping yourself on the same page with the search engines is vital to the success of your online business.

    If you need help keeping your site compliant with current SEO best practices, contact us. You can also subscribe to our feed to keep yourself in the SEO loop.
