
Category Archive: Panda

  1. Why There’s No Whitelist for Penguin or Panda


    During the SMX Conference, representatives from both Google and Bing revealed the existence of “whitelists,” which are essentially exemption-based lists that compile domains and prevent those domains from being penalized or affected by certain segments of algorithm changes. To address any misinterpretations, both search giants elaborated that these whitelists were not used to boost the ranking of any sites in results listings, nor were they used to completely exempt any sites from any and all algorithm changes.

    To illustrate how these whitelists function, let’s say Google releases a new algorithm update. The update is going to affect 99 percent of sites on the web in exactly the way Google intended, but that remaining 1 percent is going to be hit with a penalty it didn’t deserve. This could be because those sites occupy a unique position, or because the algorithm cannot be perfected to accurately target every domain appropriately, but in any case, whitelists exist to prevent and correct those undue evaluations.

    Whitelists for Penguin and Panda


    The revelation of the existence of whitelists has been a relief to some business owners concerned about how their businesses might be affected by algorithm updates. Many websites hit by Google’s biggest updates, Panda and Penguin, have claimed that the resulting penalties were unwarranted. The idea of being “whitelisted” is a comforting assurance to those afraid of getting hit with future updates.

    However, according to Google’s John Mueller, a Webmaster Trends Analyst in Zurich, Google doesn’t have any whitelists for its Penguin or Panda algorithms. Mueller illustrated an example of a whitelist in action: the Safe Search algorithm is designed to flag sites with adult content, but a site about animals could be flagged as adult despite not featuring any offending material. Here, the site would be whitelisted and removed from that algorithm’s flagging system.

    He went on to explain that there’s no whitelist for Penguin or Panda because these updates were designed to affect all sites. Whitelists are intended to be a stopgap—a temporary measure to correct some fundamental flaws with algorithm changes. For example, Safe Search could be fixed in the future to avoid the kind of false positive that snagged the animal site. Panda and Penguin would require no such stopgaps.

    There’s another problem with the notion of being “protected” by whitelists: even if whitelists did exist for Penguin or Panda, they would exist to serve the exceptions to the master rule. The odds of being the one site out of a hundred that isn’t served by the algorithm the way Google intended are—you guessed it—one in a hundred. Whitelists are short, and it’s highly unlikely that you’d be on any of them in the first place.

    Google’s Motivation


    The motivation behind keeping Panda and Penguin whitelist-free is the same motivation that drove the deployment of those updates in the first place. Google wants to give users the best possible online experience, and that means giving them the best possible results for their queries. While they want webmasters to have a fighting chance at getting visibility, they also aren’t going to worry if a handful of businesses are hit particularly hard by a new update. Adding a whitelist to a major algorithm change is a way of undermining it, especially when the algorithm is focused on quality.

    Let’s compare Google Safe Search to Google Panda (or Penguin, it doesn’t matter in this case). Safe Search is unique because it scouts for very specific pieces of material—in this case, adult material, but similar algorithm changes could scout for other forms of specific content. Sites can be hit with undue penalties because they are categorized as having content that they do not have. However, Panda and Penguin are quality-focused, meaning their entire purpose is to generalize the quality of a site’s onsite content and offsite links. There may be a handful of signals that provide erroneous feedback to Google, but overall, their judgments on websites are fairly accurate. Exempting sites from this quality check is like saying that certain sites don’t need high-quality content or relevant links.

    Steps You Can Take


    There’s no whitelist for Penguin and Panda, so you don’t need to worry about it. All you should do is continue refining your strategy to give your users the best possible experience and put the best possible content on the web. For the Panda algorithm, that means updating your site with fresh, relevant, interesting content on a frequent and consistent basis. For the Penguin algorithm, that means earning valuable, highly authoritative links from relevant sources.

    If you’re worried about needing a spot on a whitelist in the future, don’t be. Google’s niche updates aren’t nearly as impactful as its major updates, and the chances of getting hit with an unfair ranking change are extremely low. In the event that you do fall in rank, you can make a case to Google, and they’ll probably be more than willing to manually exempt you. Still, this is a rare occurrence, and you shouldn’t expect to ever encounter it.

    As for larger updates in the future, it’s likely that Google will continue its approach of avoiding the whitelist entirely. Should Google unleash another Penguin- or Panda-like update on the world, you’re going to be subject to its evaluations, just like everyone else. And while you might experience a bit of unwanted volatility, the web will likely end up a better place because of it.

  2. Are Google Updates a Thing of the Past?


    For more than a decade now, Google updates have been keeping search marketers on their toes. Every time you got used to one common SEO custom—such as the most effective way to build backlinks—Google seemed to respond by pushing a major algorithm change that altered how it took those factors into consideration. In the SEO community, industry professionals were constantly either looking for ways to take advantage of the most recent update or trying to anticipate what changes were coming with the next one.

    Now, as we enter a new era of search, Google’s update patterns appear to have shifted. For the past several years, rather than introducing new algorithm changes, the search giant is only making tweaks to previously existing ones and making minor changes to account for new technologies. Rank disruption is still occurring, but on a much smaller scale, leaving search marketers to wonder—are Google updates a thing of the past?

    The Major Overhauls


    Google updates have earned a reputation for being large, disruptive, and sometimes annoying pushes that can raise your site to the top of the SERPs or drop you off into online oblivion. That’s because most of Google’s major updates so far have been massive game changers, either completely overhauling Google’s search engine algorithm or adding some new set of qualifications that turned the ranking system on its head.

    Take, for instance, the Panda update of 2011, which affected nearly 12 percent of all queries and massively disrupted the search game by introducing a new concept of content-based evaluation. Sites with high-quality content were rewarded while sites with spammy content were penalized.

    It was a fair system, and searchers of the world were happy to start seeing more relevant results and fewer obvious attempts to climb ranks by whatever means necessary. But it was still a major headache for search marketers who had invested serious time and money into the previous iteration of Google’s search algorithm. For a time, updates like these were common, and search marketers were constantly on the run, waiting for more changes like the Penguin update, or Panda 2.0, which carried another massive update to Google’s content evaluation system.

    Modern Panda and Penguin


    Panda and Penguin, two of Google’s biggest landmark algorithm updates, have seen multiple iterations over the past five years. Panda 2.0 was followed by small iterations leading to 3.0, and Penguin 2.0 came out only a year after the initial round of Penguin. These algorithm changes were substantial, and search marketers attempted to predict the cycle based on existing patterns, projecting when the next major Panda- and Penguin-based algorithm changes would come.

    But something changed in 2014. Rather than unloading the Panda update in a major package, Google started rolling out data refreshes and minor tweaks to the algorithm on a steady basis. Instead of hitting the search world with a massive burst, it introduced a regular, unobtrusive pulse. Similarly, with the Penguin update, major iterations were virtually done away with. Marketers named an algorithm update “Penguin 3.0” in late 2014, but search volatility was limited compared to Penguin updates in the past.

    This, combined with the fact that Google hasn’t released a major overhaul to its search function since the Hummingbird update of 2013, seems to indicate that instead of rolling out massive, disruptive updates, Google is more interested in rolling out very small, gradual changes.

    Niche Algorithm Updates


    Other than extensions for its previous updates, Google has also released a handful of other changes. However, most of these are focused on niche functions—for example, the unofficially nicknamed “Pigeon update” of 2014 overhauled the way Google processes and displays local search results, taking local reviews from directory sites into account. Similarly, Google has been making changes to its Knowledge Graph and how it displays on SERPs.

    These niche updates don’t radically change Google’s core algorithm, nor do they interfere with any major updates of the past. They do have an impact on how search works and what strategies are the most rewarding, but they haven’t done anything to change the fundamental elements of a great SEO strategy.

    The Case for Micro-Updates

    There are a lot of reasons why Google would want to abandon large-scale updates in favor of smaller, less noticeable ones, and the evidence supports that transition:

    • Major updates have slowed to a stop. Instead of large batches of changes, Google is rolling out Penguin and Panda changes gradually and almost imperceptibly.
    • Google is no longer officially naming its updates. Penguin 3.0, Panda 4.1, and the Pigeon update are all unofficial nicknames—Google has stopped naming updates itself, a sign that it’s moving away from major, discrete releases.
    • Search volatility is decreasing. Since Panda’s 12 percent disruption, nothing has come close to that level of volatility.
    • Google is finally at a stable point. The search algorithm is now complex enough to evaluate the quality of sites and the intention behind user queries, leaving little reason to rapidly accelerate through growth spurts.

    Of course, it’s possible that Google has a few more aces up its sleeves, but for now it looks as though major updates are dead, in favor of smaller, less momentous rollouts.

    What Search Marketers Can Learn

    There’s no reason to fear anymore. It’s likely that Google will no longer be pushing out the updates that have disrupted so many business rankings for so long. Instead, search marketers should understand that the fundamental best practices for SEO—improving user experience and building your authority—aren’t going to change anytime soon. The tweaks here and there might fine-tune some small details, but for the most part, the sites and brands that offer the best overall experience are going to be rewarded.

  3. Pre-Panda SEO Strategies to Never Use Again


    The Panda update first started rolling out back in 2011, and when its algorithm changes took effect, it turned the world of SEO on its head. Over the course of the next three years, the Panda update kept pushing for more advancements and more changes, from major algorithm change iterations to minor data refreshes to keep the system up-to-date. Now, with Panda 4.1 behind us and Panda’s overall influence cemented into Google’s main search algorithm, it’s time to audit your SEO strategy and make sure you aren’t using any tactics that could warrant a Panda-related penalty.

    Panda’s main goal, like the goal of every Google update, is to improve user experience on the web by providing more accurate, more relevant, more pleasant results. Specifically, the Panda update was developed to punish sites with low-quality onsite content or sites that practiced keyword stuffing, and to reward sites with ample quantities of fresh, high-quality content. A number of strategies that were once effective have now fallen by the wayside as potential penalty bait.

    Make sure your SEO campaign steers clear of these strategies:

    Using Keywords a Set Number of Times in the Body of a Post


    Back in the days before Panda, keywords meant everything. Rankings for given keyword phrases were based almost entirely on which sites used those keywords the most, so search marketers could stuff their content with their target keywords and be set. As Google caught on to these keyword-stuffing schemes, it started penalizing sites that practiced them, and in response, search marketers got sneakier with their stuffing techniques, keeping keyword phrases to two to three percent of an article’s total word count.

    Today, any variation of this keyword stuffing strategy is obsolete. Google’s algorithm analyzes user intent with semantic search capabilities, and produces the most relevant results, regardless of keywords involved. Instead of writing content based on keywords or including specific keyword phrases in the body of your content, focus on writing detailed content based on topics you know your audience wants to read.
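
    If you’re auditing older posts for leftover stuffing, a quick density check can flag candidates for rewriting. Below is a minimal sketch in Python (standard library only); the sample text and the 2 percent threshold are illustrative assumptions, not Google-defined limits.

    ```python
    import re

    def keyword_density(text, phrase):
        """Share of the words in `text` accounted for by repetitions of `phrase`."""
        words = re.findall(r"[a-z0-9']+", text.lower())
        if not words:
            return 0.0
        occurrences = len(re.findall(re.escape(phrase.lower()), text.lower()))
        return occurrences * len(phrase.split()) / len(words)

    body = ("Buy cheap widgets today. Our cheap widgets beat all other "
            "cheap widgets, because cheap widgets are what we do.")
    density = keyword_density(body, "cheap widgets")
    print(f"{density:.1%}")  # roughly 40% here: obvious stuffing
    if density >= 0.02:  # the old 2-3 percent stuffing range (illustrative cutoff)
        print("Rework this copy around the topic, not the phrase.")
    ```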

    Duplicating or Reinventing Old Content


    This was a popular strategy for the overworked search marketers looking for a way to include large volumes of new content or new content based on old keywords without much effort. Search marketers would take old posts and either repost them verbatim or reconstitute them with just enough changes to make them distinct.

    Today, the Panda update is advanced enough to detect when a piece of content has been directly taken from a preexisting piece, even if there are enough wording changes to make them legally distinct. That means unless you’re writing new content entirely from scratch, Google will notice your duplication and will penalize you as a result. Keep your content original and fresh.

    Outsourcing Work to Non-Native Speakers


    Before Panda, search marketers would scramble to get as much content written and posted as possible, usually within a short time frame. At the time, it was possible and easy to outsource this writing to non-native speakers from other countries, who would be able to produce large quantities of content for extremely low wages. It was once a cost-efficient way to stuff your blog full of content that, while poorly written, could easily rocket you to the top of SERPs.

    Today’s Panda update is sophisticated enough to detect the non-native use of language. So even if your articles are passable, Google will notice that your work is not fluently written, and will take action as a result. While it’s still a cheap way to get lots of content, it’s only going to work against you.

    Pushing Out Press Releases for the Sake of Pushing Out Press Releases

    First, let me say that press releases are still a highly valuable SEO strategy. Getting your name out there with a solid link from a highly reputable external source does wonders for your domain authority, and producing high-quality content can increase your authority and relevance even further. But in order to get picked up and see those SEO benefits, you need to produce the highest quality work, and only submit press releases when you have something truly newsworthy to report.

    Before Panda, it was possible to push out as many press releases as you wanted, for questionable topics or reiterations of accomplishments you’ve already reported. Today, it will hinder your efforts more than it will help them.

    Focusing on Quantity Instead of Quality

    Since each article you post factors into how Google sees your authority level and relevance, many webmasters make the false assumption that more content is always better. They’ll set an arbitrary minimum, such as two posts every day, and do everything they can to make sure that minimum amount of content is produced.

    However, post-Panda, this strategy is ineffective. You can post as much content as you like—if all of it reads as low-quality, it’s going to hurt you instead of help you. Quality needs to be your priority. Only focus on quantity once your quality is locked down.

    Stuffing Keywords Into Meta Tags and Descriptions

    Just like with your general content strategy, it’s a bad idea to stuff keywords into the meta tags and descriptions of your site. Before Panda, it was a good idea to take one or two core keyword phrases and use them throughout the meta fields of your site. However, today it’s better to write naturally—the semantic search capabilities of Google’s algorithm make it more advantageous to simply describe your business and the pages of your site rather than try to strike a set of specific keywords.

    Guest Posting Everywhere You Can

    Just like with press releases, guest posting used to be a strategy that people would follow blindly. They would post content on every possible external source they could find, earning links and gaining relevance on irrelevant platforms just as frequently as relevant ones. Today, this strategy is ineffective for two reasons. First, Google takes the subject matter of your content and the placement of your content into consideration, so you can be docked for posting irrelevant content on a niche blog. Second, if you post too often or post the same content over and over, you could be considered a spammer.

    Writing General Content

    The subject matter of your content marks the relevance and niche of your site. The quality matters because Google favors sites with easy-to-read content. But the specificity of your content is also taken into consideration. Post-Panda, Google favors sites that post highly detailed content, with how-to guides or illustrations that substantiate its material. Writing general content, like an overall description of a topic instead of an in-depth examination, is no longer a viable content strategy.

    While there are likely more iterations of Panda on the way, both as major updates and as minor data refreshes, the general course of the Panda update will remain fixated on rewarding sites with great content and punishing those without. The key to staying on Panda’s good side is relatively simple: write consistent, new content that your users would want to read. If you make your users happy, you’ll make Google happy, and you’ll climb to the top of the ranks as a result.

  4. What’s Next After Panda, Penguin, and Pigeon?


    Google likes to keep search marketers on their toes. Its search engine algorithm, kept top secret, has evolved gradually over the course of more than 15 years, but its biggest changes have come in the form of occasional spikes. Google releases major updates to its algorithm in big packages, which roll out over the course of a few days and have traditionally caused great volatility in the search rankings of countless businesses. Google also releases tiny updates, fixes, and data refreshes as follow-ups to these massive updates, but they don’t make nearly as many waves.

    The big players of the past decade have been the Panda update of 2011, the Penguin update of 2012, and the Pigeon update from earlier this year. These updates all fundamentally disrupted certain ranking principles we had all taken for granted, and their impact has dictated the shape of search marketing today.

    Today, it’s easy to understand why Google released each of these updates, but when they first rolled out, they were surprising to everyone. While there is a certain predictable calm in the current search marketing world, it’s only a matter of time before Google changes the game again with another revolutionary new update.

    So what will the nature of the next update be? And what can we do to prepare for it?

    Panda and Penguin: Two Sides of the Same Coin


    In order to understand the possibilities for the future, we have to understand the context of the past. The Panda and Penguin updates served as complementary rollouts, targeting the negative practices of onsite SEO and offsite SEO, respectively.

    The Panda update came first in 2011, shaking up the results of almost 12 percent of all search queries. The update came as a surprise, but it was only a natural response to some of the practices that were rampant at the time. The update’s primary target was onsite content, and culprits who used low-quality content solely as a mechanism to drive rank. Accordingly, it penalized those sites and rewarded sites that maintained a focus on providing valuable, enjoyable content. Low-quality, spam-like practices, such as stuffing content with keywords and copying content from other sites, were virtually eradicated.

    The Penguin update came out as a counterpoint to Panda in 2012, doing for offsite link building what Panda did for onsite copywriting. Penguin 1.0 affected just over three percent of search queries, giving it a narrower range than Panda, but the sites it did affect were affected enormously. Penguin targeted sites that paid for external links, built external links on irrelevant sites, or spammed links in irrelevant conversations. Conversely, it rewarded sites that built more natural links in a diversified strategy.

    Enter the Pigeon Update


    The Pigeon update was slightly different from its cousins. Like them, it was a major update that fundamentally changed an element of SEO, but it was never officially named by Google. It was released in the early summer of 2014.

    The Pigeon update was designed to change results for local searches. Rather than attempting a global change, like Panda and Penguin did, Pigeon focuses only on redefining searches for local businesses. Through Pigeon, local directory sites like Yelp and UrbanSpoon got a significant boost in authority, and businesses with consistently high ratings on those sites also received a boost. Now, local businesses can get as much visibility by increasing the number of positive reviews posted about them as they can by pursuing traditional content marketing strategies.

    The Bottom Line

    While these updates all surprised people when they came out, and their specific changes are still being analyzed and debated, they all share one fundamental quality: they were rolled out to improve user experience.

    Panda was rolled out because too many webmasters were posting spammy, low-quality, keyword-stuffed content. The update sought to improve user experience by promoting sites with more relevant, valuable content.

    Penguin was rolled out because the web was filling up with keyword-stuffed, random backlinks. The update sought to improve user experience by penalizing the culprits behind such spammy practices.

    Pigeon was rolled out because the scope of local businesses online was getting more diverse, and users needed a more intuitive way to find the ones that best met their needs. Pigeon sought to improve user experience by adding sophistication to Google’s local business ranking process.

    User experience is the name of the game, and it’s the sole motivation behind each of Google’s landmark updates.

    Building Off of Old Structures

    Since their release, Panda and Penguin have been subject to countless new iterations. Data refreshes and updates tend to occur on an almost monthly basis, while major updates have been rolled out annually—Panda 4.0 and Penguin 3.0 both rolled out in the past few months. Pigeon is still relatively new, but chances are it will see some expansion as well.

    For now, it seems that Google is trying to build off of the structures that already exist within the confines of its greater algorithm. Rather than trying to introduce new categories of search ranking factors, Google is refining the categories it’s already introduced: onsite, offsite, and now local. It’s likely that Google will continue this trend for as long as it continues to improve user experience, gently refining their quality criteria and targeting emerging black hat tactics as they arise.

    However, it’s only a matter of time before Google discovers a new category of refinement. When it does, the update will likely be just as surprising as the big three, and will warrant its own series of updates and refinements.

    What the Next Overhaul Could Bring


    If we’re going to predict the nature of the next update, we need to understand two things: the emergence of new technology and the fundamental focus Google maintains on improving user experience. The next major Google update will probably have something to do with significantly improving the way users interact with one or more rising technologies.

    The Knowledge Graph

    One option is a radical expansion of the Google Knowledge Graph. The Knowledge Graph, that box of helpful information that appears to the side when you search for a specific person, place, or thing, is changing the way people search—instead of clicking on one of the highest ranking links, they’re consulting the information displayed in the box. The next Google update could change how prominently this box appears, and how it draws and presents information from other sites.

    Third Party Apps

    Google has already shown its commitment to improving user experience through the integration of third party apps—it’s favoring third party sites like Yelp and UrbanSpoon in search results, and is integrating services like OpenTable and Uber in its Maps application. The next search algorithm update could start drawing more information in from these independent applications, rather than web pages themselves, or it could use app integrations as a new basis for establishing authority.

    The Rise of Mobile

    Smartphones are ubiquitous at this point, but wearable technology is still on the rise. The swell of user acceptance for smart watches could trigger a new update based around proximity searches, voice searches, or some other facet of smart watch technology. Since smart watches are in their infancy, it’s difficult to tell exactly what impact they will have on search.

    No matter what kind of update Google has in store for us next, it’s bound to take us by surprise at least slightly. We can study its past updates and the new technologies on the horizon all we want, but Google will always be a step ahead of us because it’s the one in control of the search results. The only things we know for sure at this juncture are that Google will eventually release another massive update, and that its goal will be improving user experience.

  5. 8 Changes You Need to Make After Panda 4.1


    After four months of silence following May’s Panda 4.0 update, the next iteration of Panda is here. Referred to as Panda 4.1, the update isn’t big enough to warrant the title of “5.0,” but is significant enough to have search marketers scrambling.

    Building on the intentions of its predecessors, Panda 4.1 continues Google’s tradition of gradually weeding out low-quality content in favor of well-written, informative, engaging content. Sites with aggregated or copied content, such as lyric databases and medical content hubs, seem to have been hit the hardest by this iteration of Panda, suggesting that Google’s duplicate content detection is becoming more sophisticated. On the flip side, small- to medium-sized businesses with diverse original content are seeing a boost.

    The update started rolling out officially on September 25, 2014, and became active in gradual updates that spanned through the first week of October. Most companies have already seen the gains or losses from this update, so if you haven’t noticed your rankings change in the past few weeks, don’t worry—Panda 4.1 probably didn’t affect you.

    Still, Panda 4.1 has changed the world of search yet again, and if you want to take advantage of it and prepare for the next phases of Google’s evolution, there are several strategic changes you’ll need to make:

    1. Scour your site for duplicate content—and get rid of it.


    Sites with volumes of duplicated content are the ones that have been hit hardest by Panda 4.1. Now is your chance to get rid of the dead weight. Look throughout your site and your blog to find any articles that might be partly replicated from an outside source. Just because you don’t plagiarize work doesn’t mean you’re not at risk—extended quotes, attributed work from outside authors, and paraphrased sections could all register as duplicated material and hurt your overall ranks. If you find any content that could be seen as a duplicate from another source, get rid of it. A rough pairwise comparison, as sketched below, is one way to start the scan.
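
    The sketch below compares page bodies pairwise and flags suspiciously similar pairs, using only Python’s standard library. The page texts and the 0.8 cutoff are placeholders, and the pairwise loop is O(n²), so a large site would call for a more scalable technique such as shingling.

    ```python
    from difflib import SequenceMatcher
    from itertools import combinations

    # Plain-text bodies of your pages, keyed by URL (assumed already extracted).
    pages = {
        "/blog/post-a": "Full text of post A goes here...",
        "/blog/post-b": "A completely different article...",
        "/blog/post-c": "Full text of post A goes here too...",
    }

    # Compare every pair of pages and report high-similarity matches.
    for (url_a, text_a), (url_b, text_b) in combinations(pages.items(), 2):
        ratio = SequenceMatcher(None, text_a, text_b).ratio()
        if ratio > 0.8:  # arbitrary cutoff; tune it against known duplicates
            print(f"Possible duplicate: {url_a} vs {url_b} ({ratio:.0%} similar)")
    ```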

    2. Do a content audit and remove or improve “thin” content on your site.


    “Thin” content is a vague term, referring to content that is densely packed with keywords, light on value or specificity, or shoddily written. We’ve all seen content like this, so it should stick out like a sore thumb—especially in comparison to a longer, more detailed piece. Go through your previously published material and review the pieces of content that look like they’ve been scraped together. You have two options for these pieces: either delete them, or take the time to revise them and turn them into similar, but more valuable pieces. A quick word-count pass, as in the sketch below, can help you build the review list.
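
    A word count is a crude but fast first pass for spotting thin pages; the 300-word floor below is an arbitrary assumption, and anything it flags still deserves a human read.

    ```python
    def flag_thin_pages(pages, min_words=300):
        """Return URLs whose body text falls below `min_words` (arbitrary floor)."""
        return [url for url, text in pages.items() if len(text.split()) < min_words]

    # Reusing the `pages` dict from the previous sketch:
    for url in flag_thin_pages(pages):
        print("Review, revise, or remove:", url)
    ```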

    3. Adjust your content strategy to include only the highest quality material.


    Depending on the current level of your content marketing strategy, this change could be enormous or barely noticeable. Moving forward, all your content needs to be of the highest quality—that means based on an original idea, written by an expert, and highly detailed. Don’t worry as much about the frequency of your posts; if a piece of content isn’t as high quality as you’d like it to be, do not publish it. It’s better to have a smaller number of better-quality posts than a greater number of lesser entries. You may be doing this already, but it’s still a good idea to revisit your strategy and see what positive changes you can make.

    4. Add more outbound authoritative links to your content.

    Google wants to see high-quality, authoritative content. If you want to be seen as authoritative, you need to back up your facts and provide references to support your claims. The best way to do that is to provide in-text links pointing to outside, authoritative sites. It’s a way of leveraging the current status of well-established sites to bolster your own authority. As you continue writing new content, experiment with posting more outbound links to build your own credibility. Make sure to use a diverse range of sources to avoid spamming any one source with an excessive number of backlinks; a quick tally of your outbound hosts, as sketched below, makes skew easy to spot.
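
    One way to keep an eye on source diversity is to tally the hosts you link out to. This sketch uses only Python’s standard library; the sample HTML is a stand-in for one of your rendered posts.

    ```python
    from collections import Counter
    from html.parser import HTMLParser
    from urllib.parse import urlparse

    class LinkCollector(HTMLParser):
        """Collect the host of every absolute link in an HTML document."""
        def __init__(self):
            super().__init__()
            self.hosts = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                host = urlparse(dict(attrs).get("href") or "").netloc
                if host:
                    self.hosts.append(host)

    html = ('<p>See <a href="https://stats.example.edu/report">the report</a> and '
            '<a href="https://stats.example.edu/data">the data</a>.</p>')
    collector = LinkCollector()
    collector.feed(html)

    # A distribution dominated by one host suggests over-reliance on one source.
    for host, count in Counter(collector.hosts).most_common():
        print(host, count)
    ```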

    5. Include more images in your posts.

    Embedded images in your blog posts do two things: first, they make your posts more enticing to your readership, giving you greater reader retention and more reader satisfaction. Second, they give your content the appearance of detail, and make your content seem more valuable according to Google. Include infographics in the body of your blog posts to illustrate a point with information; if they are original, they’ll naturally attract backlinks and assist your strategy in multiple ways. Otherwise, include any relevant images you can find (as long as they’re legal to use) to complement the text on your page.

    6. Publish author credentials to establish author expertise.

    According to the recent leak of Google’s Quality Rater Guidelines, author expertise is an important factor in evaluating the authoritativeness of a piece of content. Instead of trying to make your content seem like it was written by an expert, have your content actually written by an expert. Include author credentials at the bottom of each published article, identifying the author’s name, title, and area of expertise. If you do this consistently, and offsite content also features this author’s name, you’ll effectively build that author’s authority, and your content will be seen as higher quality. It’s a small change that could add up to major results.

    7. Build more high-quality links to your content.

    Despite all the changes that the Penguin updates have made to the world of backlink building, backlinks are still tremendously important for building a site’s authority. This change is essentially the strategy I covered in point 4, but in reverse. If a high-quality site, such as an information database or a .edu site, links to one of your articles, that article will be seen as much more credible, giving you a Panda-proof boost in authority. If you can incorporate more of these high-authority backlinks into your link building campaign, your domain’s overall authority will significantly increase.

    8. Perform content audits regularly.

    The best ongoing new strategy you can adopt in response to Panda 4.1 is a regular content audit. On a monthly or bi-monthly basis, take an hour to review all the new onsite content that’s been published since your last audit. Carefully review each piece to determine its quality; check for originality, authoritativeness, and level of detail. If any of these pieces does not meet your quality standards, either get rid of it or revise it to make it comply. Doing this regularly keeps you vigilant, and keeps your content quality from ever declining or putting you at risk for another Panda-related drop in rank.

    Google is notorious for keeping online marketers on their toes, and it has continued that reputation with this latest update. With Panda 4.0 coming in May and 4.1 rolling out in September, Google could be back on a quarterly (or near-quarterly) updating pattern, like it was for previous iterations of Panda. If that’s the case, it could mean another major update is on the horizon for December or January.

    Stay sharp and keep your strategy up-to-date, and you’ll likely ride past the next Panda update with no mysterious drops in rank. You might even get a boost!

  6. Here’s Why Retailmenot Got Hit by Google Panda


    Retailmenot.com, a growing coupon-based website backed by Google Ventures, experienced a stunning drop in both search engine rankings and organic traffic back in May of this year. Google Panda 4.0, the name given to May’s major Panda update, supposedly affected only about 7.5 percent of all search queries, but Retailmenot.com took a much bigger hit than expected.

    The incident has raised a lot of questions in the search engine marketing community, particularly focused on how Google Panda 4.0 works, and why Retailmenot.com was hit so hard despite being backed by Google’s own venture capital investment. The problem is multifaceted, but by understanding exactly what happened, you can protect your own web properties against a similar potential drop in the future.

    What Happened?


    First, let’s take a look at exactly what happened. According to SearchMetrics, by May 21, Retailmenot.com experienced an approximate drop in organic search visibility of 33 percent. This drop does not measure the amount of organic traffic a website receives, but there is a correlation between organic visibility and organic traffic. Essentially, this metric illustrates a cumulative drop in rankings over several keywords that adds up to a third less visibility.

    In a broader context, 33 percent doesn’t seem so bad. After all, sites like spoonful.com and songkick.com experienced an approximate drop of 75 percent. But Retailmenot.com’s popularity and support from Google Ventures make it an interesting representative of the Panda update. Other sites, such as medterms.com, experienced as much as a 500 percent increase in visibility—so we know the update wasn’t only giving out penalties. So why, exactly, was Retailmenot.com penalized?

    The Mechanics Behind the Drop


    The drop was first acknowledged around May 21, just one day after the rollout of Panda 4.0. There is no question that this update is primarily responsible for the significant drop in Retailmenot.com’s rankings. The Panda update, which started back in 2011, has a clearly defined purpose: to improve users’ online experience by eliminating spam and pages with low-quality or minimal content from search results. Since 2011, several new iterations of the Panda update, along with occasional “data refreshes,” have been applied to Google’s search algorithms in an ongoing attempt to improve search results.

    Panda 4.0, in May, was the latest “major” update. While the update surely introduced some new ranking signals and algorithm mechanics, the baseline goal of the update is in line with its Panda predecessors: to reward sites with high-quality content and punish those without it. Google is notorious for keeping its algorithms private, so it’s impossible to say exactly which factors were responsible for shaking Retailmenot.com’s visibility, but it seems like an inferior content program was at the heart of it.

    Why It Matters

    First, let’s take a look at why this hit was such a big deal for Retailmenot.com. A drop of 33 percent in search visibility doesn’t seem like that much on the surface; it could be the result of a handful of dropped ranks. The world of search is volatile at best, so occasional drops aren’t that big of a deal for most companies (especially big ones like Retailmenot.com). But this particular drop did have a significant impact on Retailmenot.com’s bottom line.

    CEO Cotter Cunningham reported in a conference call for Retailmenot.com’s Q2 earnings that organic search traffic represented approximately 64 percent of their total visitors—which is a big deal. Cunningham did report that Retailmenot.com has steadily recovered from the initial drop in rankings, but when 64 percent of your customers are affected by a change in Google’s algorithms, you take notice. Their stock price (SALE) closed at $31.04 at the end of trading on May 21, but by May 23, it had dropped to a low of $23.87. Their stock still has not returned to its original levels, but of course this is likely due to several factors.

    Why does this matter to you? Retailmenot.com is just one example of how significant an algorithm change can be. Preparing yourself for possible future changes, and responding immediately to any ranking drops, can help prevent or mitigate the effects of lost organic search visibility. Retailmenot.com wasn’t necessarily engaging in black hat practices; if they were spamming backlinks and posting total junk content, they would have experienced a much larger drop than they did. Instead, it appears as though the volume and quality of their content was just outside of Google’s latest standards, and as we know, those standards are constantly increasing.

    So you know it’s important to protect yourself against major search engine drops like these by committing yourself to providing your users with the best possible online experience. But the Retailmenot.com drop is also significant because it shows us that recovery is possible. CEO Cotter Cunningham also reported in their Q2 earnings conference call that some organic search visibility had been restored, and their revenue was still close to their original target.

    It’s also interesting to consider why Retailmenot.com was hit by Panda despite being backed by Google Ventures. While Google does seem biased in many of its actions (such as adjusting their search engine algorithms to favor content on their own social media platform, Google+), the fact that a GV-backed site was hit by a major update is evidence that Google has unflinching, equal standards for web quality. Let’s hope this unprejudiced stance remains as they continue to roll out more and more changes.

    How to Safeguard Your Site


    Google Panda 4.0 is over. If you were going to get hit by it, you’ve already seen the damage, so if you haven’t noticed any significant outliers in your search ranking data, you’ve successfully avoided getting hit by Panda 4.0. If you have experienced a penalty from any stage of the Panda update thus far, it’s time to remove those penalties and start making a recovery.

    However, as evidenced by Retailmenot.com’s recent debacle, just because you escaped from a few penalties unscathed doesn’t mean you’ll avoid getting hit by future updates. If you want to make sure your organic search visibility stays where it is and continues to grow, you need to double check your strategy to make sure you’re complying with Google’s standards for user experience:

    • Write high-quality, original content on a regular basis and make it easy for your users to find, read, and share. Explore a diverse range of topics, avoid keyword stuffing, and make sure your subjects are valuable to your readership. Multiple types of content, including writing, images, and videos, are encouraged.
    • Encourage organic backlinking, but don’t spam links to your site. Keep any links you post as natural and beneficial as possible, and if you guest post often, consider using nofollow links to keep Google at bay.
    • Promote your content with social media, and encourage your followers to share and comment.
    • Keep your site as user-friendly as possible with an easy-to-follow navigation, ample opportunities to contact you, fast loading times, and minimal interference.

    Google is somewhat unpredictable, and because updates always come without warning, it’s impossible to completely prevent any possible drop in organic search visibility. Still, if you adhere to best practices consistently and do everything you can to give your users a great experience, you should be able to avoid a hit like the one experienced by Retailmenot.com.

  7. How to Prepare for the Imminent Google Panda Refresh


    Google recently announced on Twitter, days beforehand, that it would be refreshing its Panda update, which gave SEO professionals and companies considerable time to prepare for the actual launch. It’s hard to imagine how anyone could NOT be prepared for Google Panda updates by now, since this is the 24th revision or refresh to occur after the initial launch in February 2011.

    The majority of users know by now why Panda was created and what role it plays in conjunction with Google search. If sites have continued to produce poor-quality content and have managed to get away with it until now, it is highly unlikely that their getting-by-on-a-wing-and-a-prayer streak will survive the updates released in 2014.

    In the same vein, if a site has already been penalized for producing low-quality content and has worked hard to make the necessary corrections, it will need to be even more conscientious about creating high-quality content that stays within Google’s guidelines to perform well in SERPs. The following tips can help companies and SEO professionals carefully strategize how to strengthen their online presence using a dynamic plan of white hat tactics.

    Maintain Routine and Take it To The Next Level


    Companies and SEO professionals can use the Panda updates released in 2013 as a rule of thumb for what to expect in 2014. For example, websites that used black hat techniques to reach the top of search engine results pages but provided no valuable content for their visitors were a main target for penalization. Since online presence has become even more competitive than it was only a year ago, it makes sense that micro updates (from Panda and Penguin) will continue to target the quality of a site’s links and content in 2014 as well.

    Therefore, savvy marketers need to perform periodic audits of their link profiles: look at anchor phrases, the pace of link acquisition over the previous six months, and the percentage of linking domains on .com top-level domains (TLDs). These checks identify inbound links that have been manipulated so they can be removed, which is crucial to keep dozens of unnatural links from dragging down otherwise high-quality content. The sketch below shows one starting point.
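
    As a concrete starting point, this sketch summarizes two of those signals from an exported backlink list: the anchor-text distribution and the share of .com linking domains. The data format and sample entries are assumptions about a typical export, not any standard.

    ```python
    from collections import Counter

    # (anchor text, linking domain) pairs exported from your backlink tool.
    backlinks = [
        ("best seo agency", "blog.example.com"),
        ("best seo agency", "forum.example.net"),
        ("best seo agency", "links.example.com"),
        ("AudienceBloom",   "news.example.org"),
    ]

    total = len(backlinks)
    anchors = Counter(anchor.lower() for anchor, _ in backlinks)

    # A single exact-match anchor dominating the profile is a classic red flag.
    for anchor, count in anchors.most_common(3):
        print(f"{anchor!r}: {count}/{total} anchors ({count / total:.0%})")

    # Share of linking domains on the .com TLD, per the audit criteria above.
    com_share = sum(domain.endswith(".com") for _, domain in backlinks) / total
    print(f".com linking domains: {com_share:.0%}")
    ```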

    SEO Has Changed, So Change With It

    Vigilant content marketing efforts are one of the primary things that Google looks for when assessing whether it wants to support a business in the rankings. The best way to execute a good content strategy is to consistently provide fresh, relevant, and informative content to the targeted audience. Businesses also need to measure social signals to guide future marketing campaigns both on and off social media sites, and include social buttons such as “Like,” “Share,” and “Send” to encourage interaction and sharing, which can in turn increase their site’s authority.

    Properly Posting Guest Blogs


    Because guest blogging appears to be one of the most effective means of building traffic, quality inbound links, and branding exposure in 2014, any posting should be done with high ethical standards when selecting target sites to maximize SEO efforts. Companies need to make sure that blog campaigns are released in moderation and stay on topic (both in the content itself and on the website where the blog is posted), and include a brief explanation of who the guest blogger is and why they were invited. These measures make a guest blog appear organic instead of looking like a large paid-link scheme in the eyes of Google.

    Social Media Plays an Increasingly Visible Role

    Diversification used within content marketing means that companies need to gain exposure using several social networks (at least seven) to stay competitive. Aside from Facebook and Twitter, companies should consider attaching themselves to networks like Instagram, Pinterest and various micro-video services which Google’s updates are likely to increasingly rely on to validate high quality content produced by a site.

    Invest in Google+

    In addition to strengthening and investing in an overall social media marketing position, businesses need to establish their presence on Google+. The most recent study of ranking factors confirms that Google+ plays an extremely significant role in maintaining a solid SEO ranking. Since a component of Google’s algorithm depends on “social signals,” it is a smart move for businesses to establish Google Authorship (which ties an author’s body of content together) and link it to their Google+ account. The more +1’s a business receives, the higher its site is likely to rank on SERPs, which also serves as a key factor in strengthening its Author Rank.

    How Hummingbird Will Affect SEO in 2014


    The SEO preparations needed to make an active content marketing strategy a seamless process largely depend on how a company connects with users. This is particularly essential in view of the dominant trend of competitors regularly advertising their products and services via mobile devices. Another factor in how content’s quality is determined is its length. Ordinarily, the length of content is an indication of the depth of expertise and credibility delivered to the reader, both of which are valued by Google. The desirable length for text-based articles was once a minimum of roughly 550 words, but has since climbed to 1,000 words or more. However, posting lengthy content for mobile devices needs to be seriously considered, since most readers will probably not want to keep scrolling on a smartphone or tablet to read a 1,000-plus-word article.

    Hummingbird is just one piece of a rapidly shifting digital environment in which at least 33 percent of all Americans own tablets and 50 percent own smartphones, numbers that matter immensely to a company’s SEO rankings. A few underlying changes that came with Hummingbird include the growing importance of the Knowledge Graph (an interconnected network of entities and their referenced properties) and semantic search (used to improve search accuracy by identifying named entities, e.g., the movie “Divergent”), both of which are key components in preparing for the emergence of voice search on mobile. It is crucial, then, that companies make it a top priority to invest in and produce a mobile-optimized website for 2014 to accommodate the vast number of mobile device users, and then upgrade their web properties for bigger screens.

    The Changing Relationship Between SEO, Advertising, and PPC

    Although Google has decided to encrypt the vast majority of its searches, advertisers using PPC on Google’s platform can still access keyword data. As a result, future SEO budgets should include PPC, because access to keyword data may otherwise be out of reach.

    Google has taken humongous steps to prevent SEO professionals and many others from boosting their online presence using “black hat” and “gray hat” tactics. To avoid a long string of penalties on their sites, companies and SEO professionals should strive to use only solid “white hat” tactics and focus on rapidly evolving priority areas to rank high on SERPs in preparation for Google’s newest Panda update!

  8. Why Duplicate Content is Bad for SEO and What to Do About It


    With the rollout of Google Panda, we have heard sad stories of sites that have been either devalued or removed from Google’s index entirely.

    One of the reasons for the huge drop in some sites’ rankings has been duplicate content — one of the problems that Panda was released to control.

    Most of the sites that have experienced a drastic decrease in rankings were content farms and article directories; that is, sites loaded with thousands of duplicate articles.

    While it had been made clear that duplicate content was one of the primary things Panda frowns on, some content authors breathed a sigh of relief after Google appeared to say that “There’s no such thing as a ‘duplicate content penalty’ ” in a blog post several years ago.

    But duplicate content remains a controversial issue. It has kept bloggers and webmasters nervous about publishing content that could hurt their rankings. Like many other things, there are two sides to the issue. There’s duplicate content that Google allows and there’s the type that hurts your website’s rankings.

    Let’s try to clear up the difference.

    What type of duplicate content can hurt your rankings?
    To determine whether a sample of duplicate content is going to pull down your rankings, first you have to determine why you are going to publish such content in the first place.

    It all boils down to your purpose.

    If your goal is to try to game the system by using a piece of content that has been published elsewhere, you’re bound to get penalized. The purpose is clearly deceptive and intended to manipulate search results.

    This is what Google has to say about this sort of behavior:

    “Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results.”

    If Google has clear evidence that you are trying to manipulate your search rankings, or that you are practicing spammy strategies to try to improve rankings and drive traffic to your website, it could result in your site being removed from Google’s index.

    The effects on users
    Publishing duplicate content could also hurt your reputation in the eyes of the users.

    The ultimate goal of the search engines is to provide users with the most valuable, most useful, and most relevant information. If you publish a piece of content that has been previously published elsewhere, your site may not show up for the relevant search, because search engines tend to show results only from the main content sources.

    This explains why the search engines omit duplicate results to deliver only those that the users need.

    When users read content on your site that they have already seen previously on a site that they trust more, chances are their trust in your site will diminish.

    But is there ever a case where duplicate content is acceptable?

    When duplicate content can be acceptable
    There may be instances when duplicate content is accidental, and therefore should not lead to any penalties.

    One such instance is when search engines index separate URLs within a domain that all point to the same content. An example is the following trio of URLs: http://abc.com, http://www.abc.com, and http://www.abc.com/index.htm. There’s clearly no indication of manipulation or intent to spam in this case.
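
    To see how such variants collapse to one address, here is a small normalization sketch in Python; the rules (one scheme, no “www.”, no “index.htm”) are illustrative choices, and in practice you would declare your preferred version with a 301 redirect or canonical tag, as discussed below.

    ```python
    from urllib.parse import urlparse

    def canonical_form(url):
        """Collapse common accidental URL variants into one form (illustrative rules)."""
        parts = urlparse(url)
        host = parts.netloc.lower()
        if host.startswith("www."):
            host = host[len("www."):]
        path = parts.path
        if path.endswith(("/index.htm", "/index.html")):
            path = path[:path.rfind("/") + 1]
        return "http://" + host + (path or "/")

    variants = ["http://abc.com", "http://www.abc.com", "http://www.abc.com/index.htm"]
    print({canonical_form(u) for u in variants})  # all three collapse to one URL
    ```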

    Another case of legitimate duplication occurs when a piece of content is published in several different formats to cater to specific users. With the explosion of mobile web browsing, content is now published to suit desktop, tablet, and mobile phone web users. Publishing a single piece of content in several formats is not subject to any penalties for duplicate content.

    Also, keep in mind that there are instances when publishing a copy of a piece of content, in part or in whole, is needed for reference, such as when citing a news source. If the purpose is to reference or to add value to users, such content duplication is not subject to penalties.

    Avoiding duplicate content that provokes the Panda’s wrath
    Simply put, avoiding duplicate content is your best defense against any duplicate content penalties administered by Google Panda. Remember that Google and other search engines strive to provide search results that are unique and of high quality.

    Your goal must therefore be to publish unique and original content at all times.

    However, if duplication cannot be avoided, below are recommended fixes that you can employ to avert penalties:

    Boilerplates. Long boilerplates or copyright notices should be removed from various pages and placed on a single page instead. In cases where you would have to call your readers’ attention to boilerplate or copyright at the bottom of each of your pages or posts, insert a link to the single special page instead.

    Similar pages. There are cases when similar pages must be published, such as separate pages describing SEO for small businesses and for big businesses. Avoid publishing the same or similar information on both. Instead, expand on both services and make the information specific to each business segment.

    Noindex. People could be syndicating your content. If there’s no way to avoid this, include a note at the bottom of each page of your content that asks syndicators to include a “noindex” metatag on your syndicated content to prevent the duplicate copy from being indexed by the search engines. (A checker for this tag is sketched after this list.)

    301 redirects. Let the search engine spiders know that a page has permanently moved by using 301 redirects. This also alerts the search engines to remove the old URL from their index and replace it with the new address. (The sketch after this list also verifies redirect status.)

    Choosing only one URL. There might be several URLs you could use to point to your homepage, but you should choose only one. When choosing the best URL for your page, be sure to keep the users in mind. Make the URL user-friendly. This makes it easier not only for your users to find your page, but also for the search engines to index your site.

    Always create unique content. Affiliates almost always fall victim to the convenience of ready-made content provided by merchants. If you are an affiliate, be sure to create unique content for the merchant products you are promoting. Don’t just copy and paste.

    Conclusion
    Whatever your intent is, the best way to avoid getting penalized by Google Panda is to avoid creating duplicate content in the first place. Keep in mind that quality is now at the top of the search engines’ agenda.

    It should be yours too.

  9. SEO Mistakes the Panda and the Penguin Forbid You to Commit

    Leave a Comment

    For Google and the other major search engines, the quality and reliability of information are key to user satisfaction. These qualities are also what allow the search engines themselves to thrive, since their business depends on serving users the best data in terms of quality, relevance, and authority.

    And who is king of the SEO hill?

    Google, of course. And Google shows no sign of loosening its stranglehold on the universe of SEO.

    Via one algorithmic update after another, Google is wreaking havoc on people who found loopholes in the system to advance their personal interests. For years, these smooth operators devised tricks to manipulate their way to the top of search engine results pages.

    With the Panda and Penguin updates, Google may have finally patched the holes that allowed spammers to litter search engine results with garbage.

    More recently, Google rolled out newer versions of the Panda and Penguin updates. In the hope of making the Internet a better place to host and find high-quality and extremely useful information, Google supplied webmasters and business owners with guidelines to help them play the SEO game in a fairer manner.

    So let’s talk about some of the mistakes that every webmaster and online business owner should avoid so as not to get slapped by Google. We’ll also discuss some recommendations on how to optimize your site properly for Panda and Penguin.

    But first, a brief review of what the two major Google updates are all about.

    Google Panda
    The Panda was the first of the two major overhauls that Google rolled out in less than two years. It offered an initial glimpse of how the mighty Google intended to provide better search engine results.

    The main goal of Panda was to sniff out sites that carry low-quality content — or in Panda-speak, "thin" content. What Google Panda generally looked for were sites that had obviously spammy elements such as keyword stuffing, duplicate content, and, in some cases, a high bounce rate.

    Google Penguin
    Although at first it might have sounded cute and cuddly to Internet users, the Penguin quickly showed them otherwise. This update zeroed in on sites that were over-optimized in terms of backlinking.

    One of the most widely practiced link-building tactics prior to Penguin's appearance was to use exact-match keywords as anchor text. The problem with this tactic is that Penguin reads it as an unnatural linking practice.

    To promote natural backlinking, Penguin set out to penalize sites that routinely used exact-match keywords as anchor text, and rewarded those smart enough to employ variation in their keywords.

    The top SEO mistakes you should avoid at all times
    Now that you have been reminded of what Panda and Penguin want and how they’d like us to play the SEO game, keep the following pitfalls in mind to avoid seeing your site take the deep plunge down the search results pages.

    1. Using mostly exact-match keywords for backlinks
    This used to be one of the most effective ways of getting a site to rank higher in search results. These days, exact-match anchors can still be used sparingly, but with caution: now that Penguin is policing the info highway, building mostly exact-match keyword links is a sure way to get your site devalued.

    To gain or maintain favorable ranking, observe natural link-building best practices. Post-Penguin SEO calls for you to vary your keywords by using related terms. If you are optimizing for “baby clothing,” for example, use keyphrases such as “kids’ clothing,” “clothing for babies,” etc. It’s also a good idea to use your brand’s name as anchor text.

    The primary thing to remember is to link naturally. Don't be too concerned about failing to corner exact-match keywords that you think could hugely benefit your business. After all, Google is moving toward latent semantic indexing (LSI), which takes related keyphrases into consideration for smarter indexing.

    2. Generating most of your traffic via only one marketing channel
    Many marketers, especially new ones, tend to assume that the only way to gain a huge amount of highly targeted traffic is to focus their time and energy on a single marketing channel. Some only use SEO, while others concentrate their efforts on social media marketing.

    Focusing your attention on one channel could bring success in terms of gaining some very targeted traffic, but with regard to ranking, it could actually hurt you, especially since the Panda and Penguin rollouts.

    Again, diversity matters not only in your keywords and keyphrases, but also in how you drive traffic to your site. Apart from SEO, a smart traffic strategy will involve tactics such as the following:

    • Article marketing
    • Social media pages for your business
    • Guest posting
    • Social bookmarking
    • Forum posting and blog comments
    • Press releases
    By diversifying your traffic sources, you will create a natural way for your audience to find your business at different online locations — a signal that will get you favorable rankings in search.

    3. Failing to take advantage of internal linking
    Failing to link internally in a strategic way is a missed opportunity; not doing any internal linking at all is even worse. Internal linking not only improves the user experience; it's also good for onsite SEO.

    With strategic and meaningful internal linking, you will make it easy for your users to find their way around your site and locate the information they want. Your users will also have more good reasons to linger on your site as they consume more information related to what they are looking for.

    Proper internal linking also enables search engine spiders to determine which content is related to other content.

    Proper internal linking can be executed by including the following (a markup sketch follows the list):

    • Standard navigation above the fold — more specifically, above the content
    • Category section on sidebar
    • Related posts on sidebar or below each post
    • Links within each post that point users/readers to related content
    • Sitemap
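
    To make these concrete, here's a minimal markup sketch of a related-posts block and an in-post contextual link (the post titles and URLs are invented for illustration):

        <!-- Sidebar or below-post block pointing readers to related content. -->
        <aside>
          <h3>Related posts</h3>
          <ul>
            <li><a href="/blog/what-is-thin-content">What Is "Thin" Content?</a></li>
            <li><a href="/blog/recovering-from-panda">Recovering from a Panda Penalty</a></li>
          </ul>
        </aside>

        <!-- Contextual link inside the body of a post. -->
        <p>Duplicate content is another common trigger; see our guide to
        <a href="/blog/avoiding-duplicate-content">avoiding duplicate content</a>.</p>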
    4. Publishing content with very little value
    In Google Panda-speak, this is known as “thin” content. Panda rolled out to hammer sites that carry duplicate information and that promote content which offers very little value or information to users. Such sites are often stuffed with keywords and overly promotional.

    To avoid getting smacked by the Panda’s giant paw, think critically about the value your users are likely to get from your content: Is it unique? Will it help them solve their most pressing concerns? Will the content fulfill its promise?

    Conclusion
    Are we seeing the beginning of the end of SEO manipulation? Let’s hope so.

    As Google shoves spammers way down its search results, the hope is that Google's first few pages will feature nothing but extremely valuable and useful information that really meets users' expectations. And as a site owner and online entrepreneur, you can treat Google Panda and Penguin as a prod to raise your own standards as you strive to deliver what your audience is looking for.

    For more information on properly optimizing your site, contact us and we’ll show you your options to make your site Google compliant.
  10. The Google Panda 20 Update: Some Key Information

    Leave a Comment

    Google is making everyone aware of the company’s relentless drive to supply better and more useful information.

    Following a series of Panda and Penguin updates, Google Panda #20 was released on September __, 2012. It was the first major update since July 24, 2012. More Panda updates are expected within the next few months, and they should arrive on a more consistent schedule.

    Unlike other recent releases, Panda #20 was a fairly major update, one that rolled out over almost a week. Some 2.4 percent of English queries were affected, along with roughly 0.5 percent of queries in other languages.

    Another interesting thing about Panda #20 is that it overlapped with another algorithmic update dubbed the EMD update, which Google rolled out to target sites with low-quality, exact-match domain names.

    This made it tricky for affected SEOs and site owners to determine which update had hit them. Was their site hit for failing to comply with Google Panda standards, or for having a poor exact-match domain name?

    Panda was released to devalue sites with “thin content,” or content that offers minimal value. Since its release last year, tons of sites have seen a dramatic drop in rankings. Some, especially notorious content farms, have been removed from Google’s index altogether.

    Panda also targeted sites that contained duplicate content. As a result of Panda's release, black-hat SEO practices have taken a significant hit, and sites that churned out hundreds of pages of duplicate content were obliterated.

    The release of Panda, as well as its equally ferocious sibling Penguin, met with a few complaints as well. Years of hard work and substantial amounts of marketing dollars to push a site to the top of Google rankings were effectively tossed out the window. SEOs, publishers, and site owners, believing they had been following recommended SEO best practices, cried foul.

    The hard lesson we can all learn in the aftermath of Google's algorithmic changes is that, while quality may be subjective, Google has now laid out concrete standards for it.

    The stress on quality and authority

    The quality and relevance of information are at the heart of every substantial change Google rolls out. To ensure that every site is on the same page, Google has laid out guidelines for site owners to follow.

    As far as Panda is concerned, as long as your site’s content is original and offers quality and useful information, you should be fine.

    As long as the content strongly relates to its topic and offers great value to your audience, there shouldn't be any reason for Google to slap you.

    Do your link-building activities follow the prescribed or accepted methods? Have you been linking to authority sites, and are authority sites linking back to yours? Do you make a point of regularly checking your link profiles for any potentially damaging links?

    There’s no way of telling how many more of Google’s Panda updates are coming in the future, but Matt Cutts has made it clear that Google Panda will be updated and refreshed on a continual basis. This shows how committed Google is to making the Internet a better and more reliable avenue for gleaning valuable information.

    Conclusion

    It’s crucial to keep abreast of the periodic algorithmic changes that Google rolls out. Keeping yourself on the same page with the search engines is vital to the success of your online business.

    If you need help keeping your site compliant with current SEO best practices, contact us. You can also subscribe to our feed to keep yourself in the SEO loop.
