AudienceBloom


Tag Archive: google panda

  1. Are We on the Verge of the Next Great Search Disruption?


    Okay, so we all know that the search world is constantly evolving. It’s changed, radically, in many different ways since its general inception in the mid-1990s. Most of these changes, however, have been slow and gradual improvements to the core, original search engine algorithm. Search experts and marketers were quick to note when these things happened; for example, when Panda was released, 11 percent of queries were affected, and marketers couldn’t help noticing this extreme volatility because they were watching their ranks closely.

    (Image: Panda Effect. Source: Search Engine Land)

    But users didn’t really notice this volatility—to the average user, the changes and improvements in search are so gradual they’re barely noticeable, the same way it’s hard to tell when a child is growing when you see him/her every day.

    What Constitutes a Disruption?

    Because of this incremental phenomenon, it’s tough to categorize what might count as a search engine “disruption.” Usually, a tech disruption happens all at once—when a new product is released, a new trend takes off, or a new company emerges to challenge the norm. Now that all the norms of search are pretty much in place, the minor “disruptions” we’ve had so far (usually in the form of Google updates) can’t really claim to have that much impact. User search behavior has changed a great deal in the past 20 years, but again, it’s done so incrementally.

    Still, knowing that, the search world may be on the verge of a major disruption in the truest sense—a new set of phenomena that may turn the nature of online search on its head. And it’s already starting to take place.

    Artificial Intelligence on Two Fronts

    That disruption is artificial intelligence (AI), and in two distinct modes of operation, it’s already here:

    • AI is powering diverse new types of virtual assistants. These include programs like Siri, Cortana, and Google Now, which are becoming popular modes of search at an astounding rate.
    • AI is beginning to handle more search engine updates. Machine learning algorithms like RankBrain are finally starting to emerge as the future of search engine updating.

    So on one hand, you have AI interfering with the way users are searching, and on the other, you have AI taking over the updating process for search engines.

    Let’s take a look at each of these in turn, and how they could be considered disruptive.

    Virtual Assistants

    Chances are, you’ve used a virtual assistant at least once in your life, and in the near future, you’ll find yourself using them even more. Consider how these programs could cause the next major search disruption:

    • Voice search popularity. First, it’s important to address the rising popularity of voice search in general. By some estimates, voice-based searches have gone from virtually zero to more than 50 billion per month. That’s a huge jump, and it’s only going to get bigger. It means more people are using colloquial phrases and forgoing traditional search engines entirely.

    (Image Source: LSA Insider)

    • Cross-realm search. It’s also important to realize that most virtual assistants aren’t limited to one realm of search. For example, Cortana and Siri will search the Internet, your online accounts, and even the files on your local device to answer your queries. Search is no longer exclusively online, and the lines between online and offline are starting to blur.
    • User intent and semantic capabilities. Virtual assistants are also becoming more adept at recognizing natural language and user intent, which means it’s going to be harder than ever to “optimize” anything in specific ways, and users will have hyper-focused intentions when looking for solutions or content.
    • On-the-go searching. Virtual assistants are also driving more mobile and on-the-go searches, which is changing the way people form queries. Users need immediate, location-based answers rather than the premeditated, keyword-based research queries of old.

    Machine Learning in Search

    On the other front of AI development, you have new machine learning algorithms working to replace the previously manual job of improving search engines. This has started out small, with a modification to Hummingbird known as RankBrain, but we can expect to see bigger, better versions of these machine learning algorithms in place in the near future. There are three key ways it could be a disruptor:

    • Micro-updates. RankBrain doesn’t come up with major changes and then push them to a live environment. It runs through tons of micro-updates on a constant basis, meaning that incremental improvement is going to happen at an even more granular, continuous level.
    • Unpredictable paths of development. Since human beings won’t be in control of algorithm updates forever, machine learning algorithms could take searches down new, unfamiliar paths of development—some of which may look very different to today’s average user. Entire constructs and norms may be fundamentally overwritten.
    • Rate of change. Perhaps what’s most scary about the idea of machine learning is the sheer pace at which it can develop. With algorithms perfecting themselves and perfecting how to perfect themselves, the pace of development may skyrocket, leaving us marketers in the dust.

    Key Takeaways

    Since these technologies are still being developed, it’s hard to estimate to what degree they’ll be able to redefine the norms of user searches. However, early indications show these two forms of AI to be powerful, popular, and for lack of a less clichéd phrase, game-changing. As a marketer, you can’t prepare for the future in any concrete way, since even the technology developers aren’t sure where it’s going to go from here, but you can prepare yourself by remaining flexible. Hedge your bets with lots of long-term strategies, try to jump on new trends before your competitors can, and always be willing to adapt.


  2. Why Duplicate Content is Bad for SEO and What to Do About It


    With the rollout of Google Panda, we have heard sad stories of sites that have been either devalued or removed from Google’s index entirely.

    One of the reasons for the huge drop in some sites’ rankings has been duplicate content — one of the problems that Panda was released to control.

    Most of the sites that have experienced a drastic decrease in rankings were content farms and article directories; that is, sites loaded with thousands of duplicate articles.

    While it had been made clear that duplicate content was one of the primary things Panda frowns on, some content authors breathed a sigh of relief after Google appeared to say that “There’s no such thing as a ‘duplicate content penalty’ ” in a blog post several years ago.

    But duplicate content remains a controversial issue. It has kept bloggers and webmasters nervous about publishing content that could hurt their rankings. Like many other things, there are two sides to the issue. There’s duplicate content that Google allows and there’s the type that hurts your website’s rankings.

    Let’s try to clear up the difference.

    What type of duplicate content can hurt your rankings?
    To determine whether a sample of duplicate content is going to pull down your rankings, first you have to determine why you are going to publish such content in the first place.

    It all boils down to your purpose.

    If your goal is to game the system by reusing a piece of content that has been published elsewhere, you’re bound to get penalized. The purpose is clearly deceptive and intended to manipulate search results.

    This is what Google has to say about this sort of behavior:

    “Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results.”

    If Google has clear evidence that you are trying to manipulate your search rankings, or that you are practicing spammy strategies to try to improve rankings and drive traffic to your website, it could result in your site being removed from Google’s index.

    The effects on users
    Publishing duplicate content could also hurt your reputation in the eyes of the users.

    The ultimate goal of the search engines is to provide users with the most valuable, most useful, and most relevant information. If you publish a piece of content that has been previously published elsewhere, your site may not show up for the same search, because search engines tend to show results only from the main content sources.

    This explains why the search engines omit duplicate results to deliver only those that the users need.

    When users read content on your site that they have already seen previously on a site that they trust more, chances are their trust in your site will diminish.

    But is there ever a case where duplicate content is acceptable?

    When duplicate content can be acceptable
    There may be instances when duplicate content is accidental, and therefore should not lead to any penalties.

    One such instance is when a search engine’s index identifies separate URLs within a domain that all point to the same content. An example is the following trio of URLs: http://abc.com, http://www.abc.com, and http://www.abc.com/index.htm. There’s clearly no indication of manipulation or intent to spam in this case.
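
    To check your own site for this kind of accidental duplication, you can test how each URL variant actually responds. Below is a minimal sketch in Python (assuming the third-party requests library is installed); the abc.com URLs mirror the hypothetical example above. Ideally, all but one variant should answer with a 301 redirect pointing at your chosen canonical address.

    ```python
    # Minimal sketch: see how common homepage URL variants respond, so you can
    # confirm they all consolidate onto one canonical address. The URLs are
    # placeholders mirroring the hypothetical example above.
    import requests

    VARIANTS = [
        "http://abc.com",
        "http://www.abc.com",
        "http://www.abc.com/index.htm",
    ]

    for url in VARIANTS:
        # allow_redirects=False exposes the raw status code and Location
        # header, so a 301 redirect is distinguishable from a direct 200.
        response = requests.get(url, allow_redirects=False, timeout=10)
        target = response.headers.get("Location", "(served directly)")
        print(f"{url} -> {response.status_code} {target}")
    ```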

    Another case of legitimate duplication occurs when a piece of content is published in several different formats to cater to specific users. With the explosion of mobile web browsing, content is now published to suit desktop, tablet, and mobile phone web users. Publishing a single piece of content in several formats is not subject to any penalties for duplicate content.

    Also, keep in mind that there are instances when publishing a copy of a piece of content, in part or in whole, is needed for reference, such as when citing a news source. If the purpose is to reference or to add value to users, such content duplication is not subject to penalties.

    Avoiding duplicate content that provokes the Panda’s wrath
    Simply put, avoiding duplicate content is your best defense against any duplicate content penalties administered by Google Panda. Remember that Google and other search engines strive to provide search results that are unique and of high quality.

    Your goal must therefore be to publish unique and original content at all times.
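
    If you want a quick sanity check before you publish, you can estimate how much a draft overlaps with an already-published article. Below is a rough sketch using word shingles and Jaccard similarity; it is purely illustrative and is not how Google measures duplication.

    ```python
    # Rough sketch: estimate textual overlap between two articles using
    # five-word "shingles" and Jaccard similarity. Illustrative only; this
    # is not how Google detects duplicate content.
    def shingles(text: str, size: int = 5) -> set:
        words = text.lower().split()
        return {" ".join(words[i:i + size]) for i in range(max(0, len(words) - size + 1))}

    def jaccard_similarity(a: str, b: str) -> float:
        sa, sb = shingles(a), shingles(b)
        if not sa or not sb:
            return 0.0
        return len(sa & sb) / len(sa | sb)

    draft = "there is no such thing as a duplicate content penalty according to google"
    existing = "google said there is no such thing as a duplicate content penalty"
    print(f"Estimated overlap: {jaccard_similarity(draft, existing):.0%}")
    ```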

    However, if duplication cannot be avoided, below are recommended fixes that you can employ to avert penalties:

    Boilerplates. Long boilerplates or copyright notices should be removed from various pages and placed on a single page instead. In cases where you would have to call your readers’ attention to boilerplate or copyright at the bottom of each of your pages or posts, insert a link to the single special page instead.

    Similar pages. There are cases when similar pages must be published, such as separate pages for SEO services for small businesses and for big businesses. Avoid publishing the same or similar information on each. Instead, expand on both services and make the information very specific to each business segment.

    Noindex. People could be syndicating your content. If there’s no way to avoid this, include a note at the bottom of each page of your content that asks users to include a “noindex” metatag on your syndicated content to prevent the duplicate content from being indexed by the search engines.
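
    If your syndication partners do add the tag, you can spot-check their copies programmatically. Here is a small sketch using only Python's standard library; the URL is a placeholder standing in for a partner's copy of your article.

    ```python
    # Small sketch: fetch a syndicated copy of an article and check whether
    # it carries a robots "noindex" metatag. The URL below is a placeholder.
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class RobotsMetaFinder(HTMLParser):
        def __init__(self):
            super().__init__()
            self.noindex = False

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
                if "noindex" in (attrs.get("content") or "").lower():
                    self.noindex = True

    url = "http://syndication-partner.example/your-article"  # hypothetical
    html = urlopen(url).read().decode("utf-8", "replace")
    finder = RobotsMetaFinder()
    finder.feed(html)
    print("noindex present" if finder.noindex else "WARNING: copy is indexable")
    ```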

    301 redirects. Let the search engine spiders know that a page has permanently moved by using 301 redirects. This also alerts the search engines to remove the old URL from their index and replace it with the new address.
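
    How you issue a 301 depends on your server, but the idea is the same everywhere: the old URL answers with a permanent-redirect status pointing at the new one. As a minimal sketch, here is what that looks like in Python with the Flask microframework (an assumed stack; the paths are placeholders).

    ```python
    # Minimal sketch of a permanent (301) redirect using Flask. A 301 tells
    # search engines to drop the old URL from their index and replace it
    # with the new address. The routes are placeholders.
    from flask import Flask, redirect

    app = Flask(__name__)

    @app.route("/old-page")
    def old_page():
        return redirect("/new-page", code=301)

    if __name__ == "__main__":
        app.run()
    ```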

    Choosing only one URL. There might be several URLs you could use to point to your homepage, but you should choose only one. When choosing the best URL for your page, be sure to keep the users in mind. Make the URL user-friendly. This makes it easier not only for your users to find your page, but also for the search engines to index your site.

    Always create unique content. Affiliates almost always fall victim to the convenience of ready-made content provided by merchants. If you are an affiliate, be sure to create unique content for the merchant products you are promoting. Don’t just copy and paste.

    Conclusion
    Whatever your intent is, the best way to avoid getting penalized by Google Panda is to avoid creating duplicate content in the first place. Keep in mind that quality is now at the top of the search engines’ agenda.

    It should be yours too.


  3. SEO Mistakes the Panda and the Penguin Forbid You to Commit


    For Google and other major search engines, quality and reliability of information are key to user satisfaction. These elements also empower the search engines to thrive as they seek to provide better data to users in terms of quality, relevance, and authority.

    And who is king of the SEO hill?

    Google, of course. And Google shows no sign of loosening its stranglehold on the universe of SEO.

    Via one algorithmic update after another, Google is wreaking havoc on people who found loopholes in the system to advance their personal interests. For years, these smooth operators devised tricks to manipulate their way to the top of search engine results pages.

    With the Panda and Penguin updates, Google may have finally patched the holes that allowed spammers to litter search engine results with garbage.

    More recently, Google rolled out newer versions of the Panda and Penguin updates. In the hope of making the Internet a better place to host and find high-quality and extremely useful information, Google supplied webmasters and business owners with guidelines to help them play the SEO game in a fairer manner.

    So let’s talk about some of the mistakes that every webmaster and online business owner should avoid so as not to get slapped by Google. We’ll also discuss some recommendations on how to optimize your site properly for Panda and Penguin.

    But first, a brief review of what the two major Google updates are all about.

    Google Panda
    The Panda was the first of the two major overhauls that Google rolled out in less than two years. It offered an initial glimpse of how the mighty Google intended to provide better search engine results.

    The main goal of Panda was to sniff out sites that carry low-quality content — or in Panda-speak, “thin” content. What Google Panda generally looked for were sites that had obviously spammy elements such as keyword stuffing, duplicate content, and in some cases, high bounce rate.

    Google Penguin
    Although at first it might have sounded cute and cuddly to Internet users, the Penguin quickly showed them otherwise. This update zeroed in on sites that were over-optimized in terms of backlinking.

    One of the most widely practiced link-building tactics prior to Penguin’s appearance was to use exact-match keywords for anchor texts. The problem with this tactic is that Penguin reads it as an unnatural linking practice.

    To promote natural backlinking, Penguin set out to penalize sites that routinely used exact-match keywords for anchor texts, and rewarded those smart enough to employ variations in their keywords.

    The top SEO mistakes you should avoid at all times
    Now that you have been reminded of what Panda and Penguin want and how they’d like us to play the SEO game, keep the following pitfalls in mind to avoid seeing your site take the deep plunge down the search results pages.

    1. Using mostly exact-match keywords for backlinks
    This used to be one of the most effective ways of getting a site to rank higher in search results. These days, this strategy can still be recommended, but with caution. Now that Penguin is policing the info highway, using mostly exact-match keywords is a sure way to get your site devalued.

    To gain or maintain favorable ranking, observe natural link-building best practices. Post-Penguin SEO calls for you to vary your keywords by using related terms. If you are optimizing for “baby clothing,” for example, use keyphrases such as “kids’ clothing,” “clothing for babies,” etc. It’s also a good idea to use your brand’s name as anchor text.

    The primary thing to remember is to link naturally. Don’t be too concerned about failing to corner exact-match keywords that you think could hugely benefit your business. After all, Google is moving toward latent semantic indexing (LSI), which takes related keyphrases into consideration for smarter indexing.

    2. Generating most of your traffic via only one marketing channel
    Many marketers, especially new ones, tend to assume the only way to gain a huge amount of highly targeted traffic is by focusing time and energy on a single marketing channel. Some only use SEO, while others concentrate their efforts on social media marketing.

    Focusing your attention on one channel could bring success in terms of gaining some very targeted traffic, but with regard to ranking, it could actually hurt you, especially since the Panda and Penguin rollouts.

    Again, diversity matters not only for your keywords and keyphrases, but also for how you drive traffic to your site. Apart from SEO, a smart traffic strategy will involve the following tactics:

    • Article marketing
    • Social media pages for your business
    • Guest posting
    • Social bookmarking
    • Forum posting and blog comments
    • Press releases


    By diversifying your traffic sources, you will create a natural way for your audience to find your business at different online locations — a signal that will get you favorable rankings in search.

    3. Failing to take advantage of internal linking
    Even worse is not doing any internal linking at all. Internal linking not only improves the users’ experience; it’s also good for onsite SEO.

    With strategic and meaningful internal linking, you will make it easy for your users to find their way around your site and locate the information they want. Your users will also have more good reasons to linger on your site as they consume more information related to what they are looking for.

    Proper internal linking also enables search engine spiders to determine which content is related to other content.

    Proper internal linking can be executed by including the following:

    • Standard navigation above the fold — more specifically, above the content
    • Category section on sidebar
    • Related posts on sidebar or below each post
    • Links within each post that point users/readers to related content
    • Sitemap
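
    These elements are also easy to audit programmatically. Here is a rough sketch that fetches a page and counts how many of its links stay on the same host, using only Python's standard library; the URL is a placeholder.

    ```python
    # Rough sketch: fetch a page and count internal links (same host) versus
    # all links found. The page URL is a placeholder.
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    from urllib.request import urlopen

    PAGE = "http://www.example.com/blog/some-post"  # hypothetical

    class LinkCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                href = dict(attrs).get("href")
                if href:
                    # Resolve relative links against the page URL.
                    self.links.append(urljoin(PAGE, href))

    html = urlopen(PAGE).read().decode("utf-8", "replace")
    collector = LinkCollector()
    collector.feed(html)

    host = urlparse(PAGE).netloc
    internal = [link for link in collector.links if urlparse(link).netloc == host]
    print(f"{len(internal)} internal links out of {len(collector.links)} total")
    ```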


    4. Publishing content with very little value
    In Google Panda-speak, this is known as “thin” content. Panda rolled out to hammer sites that carry duplicate information and that promote content which offers very little value or information to users. Such sites are often stuffed with keywords and overly promotional.

    To avoid getting smacked by the Panda’s giant paw, think critically about the value your users are likely to get from your content: Is it unique? Will it help them solve their most pressing concerns? Will the content fulfill its promise?

    Conclusion
    Are we seeing the beginning of the end of SEO manipulation? Let’s hope so.

    As Google shoves spammers way down its search results, the hope is that Google’s first few pages will feature nothing but extremely valuable and useful information that really meets users’ expectations. And as a site owner and online entrepreneur, you can depend on Google Panda and Penguin to improve your standards as you strive to deliver what your audience is looking for.

    For more information on properly optimizing your site, contact us and we’ll show you your options to make your site Google compliant.



  4. Panda 3.5 (*Update* – “Penguin”) – What Changed from 3.4, and How to Recover


    4/26 Edit – Google has confirmed that this update is going to be called “Penguin.” Matt Cutts even tweeted a photo confirming it.

    Brace for impact. As I write this, Google is rolling out the latest rendition of its Panda algorithm — Panda 3.5 (Editor’s note: Now being called “Penguin”). Here’s Google’s official announcement:

    In the next few days, we’re launching an important algorithm change targeted at webspam. The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines. We’ve always targeted webspam in our rankings, and this algorithm represents another improvement in our efforts to reduce webspam and promote high quality content.

    It seems to be up for debate right now as to whether this is an update of Google’s Panda algorithm (i.e., Panda 3.5) or whether it’s a standalone update. Either way, in this post my goal is to provide an in-depth analysis of the following:

    • What Google’s saying in its announcement
    • How Panda 3.5 is affecting search results
    • How to recover from Panda 3.5
    • Other possible repercussions of Google’s latest algorithm change
    • My analysis of Google’s real purpose for rolling out this algorithm change

    What Google’s Saying

    Let’s break down Google’s announcement and dive into the details of what they’re trying to accomplish with Panda 3.5.
    Google:
    The opposite of “white hat” SEO is something called “black hat webspam” (we say “webspam” to distinguish it from email spam). In the pursuit of higher rankings or traffic, a few sites use techniques that don’t benefit users, where the intent is to look for shortcuts or loopholes that would rank pages higher than they deserve to be ranked. We see all sorts of webspam techniques every day, from keyword stuffing to link schemes that attempt to propel sites higher in rankings.
    Analysis: 
    Google is aware that it’s easy to increase rankings by amassing lots of inbound links and loading up your website with keyword terms and LSI (related) terms for desired keyword rankings.
    Google:
    The goal of many of our ranking changes is to help searchers find sites that provide a great user experience and fulfill their information needs. We also want the “good guys” making great sites for users, not just algorithms, to see their effort rewarded.
    Analysis:
    Google wants high-quality, information-rich, user-friendly websites to appear in its search results. It doesn’t consider keyword-stuffed websites to be a quality source of information for its users. Additionally, Google wants to stop rewarding sites with high rankings that got there by manipulating its algorithm with crappy inbound links.
    Google gives the following screenshot as an example of keyword stuffing:
    (Image: Google’s example of keyword stuffing)

    I’ve seen this type of thing many times, and I’m disgusted whenever I see it. People use software to spit out this garbage and then either publish it on another website with a link back to their money site, or they put it below the fold of the page they want to get ranked in the search engines. The goal is to get as many keywords and related keywords (i.e., LSI keywords) on the page as possible in order to prove to Google that the page is relevant and should rank well for the target keyword.
    Google follows up with this screenshot:
    (Image: Google’s example of link spam)

    This example is clearly a page from a blog network. Blog networks are a popular and effective link building tactic, but Google doesn’t like them. In this example, the content isn’t even well-written — it’s clearly spun by computer software.
    Google:
    The change will go live for all languages at the same time. For context, the initial Panda change affected about 12% of queries to a significant degree; this algorithm affects about 3.1% of queries in English to a degree that a regular user might notice.
    Analysis:
    Google said that Panda 3.4 would affect 1.2% of queries, but it clearly affected way more than that; it rocked the SEO industry. If they say Panda 3.5 affects 3.1% of queries, then this update could have a much bigger impact than Panda 3.4 did. This would be the biggest update since Panda 1.0 itself.

    How Panda 3.5 is Affecting Search Results

    Search results for various queries appear to have changed, but they don’t appear to be better. In fact, they appear to be much worse. Let’s take a look at a few examples:

    Search term: “new shoes”

    • Rank 1: A YouTube video for the song “New Shoes” by Paolo Nutini. – Who? I don’t know who Paolo Nutini is, and props if you do. Furthermore, I’m looking for new shoes and Google gave me a video as the top search result? C’mon, man.
    • Rank 2: An intro to marketing class. – WTF? What the hell does this have to do with new shoes?
    Search term: “make money online”
    • Rank 1: makemoneyforbeginners.blogspot.com – Seriously? This site is blank. As in, zero (0) posts. It’s ranking #1 for a search term with 110,000 global exact searches/month, and it’s blank.
    • Rank 2-5: Nothing useful, littered with Adsense ads.
    • Rank 6: zzzprofits.com – What? This is a forum directory with barely any posts. Nothing related to making money online or even remotely useful here.
    • Rank 7: gurucreation.com – This is a list builder site. The owner is just trying to get folks to give up their email address so he can build his email list.
    Search term: “raw dog food”
    • Rank 2: A book on Amazon. – A book about raw dog food? Is Google just getting into bed with Amazon here or does Google really think I’m looking for a book?
    • Rank 5: mudbay.us – I can’t find anything about raw dog food on this site. It’s not even mentioned on the homepage. I’m clueless as to why Google is ranking it #5 for this term.

    How to Recover from Google Panda 3.5

    As I discussed in a previous blog post, Google is targeting inbound link profiles with all their recent Panda updates (3.3, 3.4 and 3.5). If you’ve been victimized by this latest algorithm change, it’s due to one of the following factors:

    1. Too many inbound links with exact-match anchor text.
    2. Too many inbound links from “webspam” content.
    3. Not enough “trust” links, such as links from Facebook, Twitter, and social bookmarking sites. These are also known as social signals.
    You have two recovery options: Delete or dilute.
    1. Delete most or all of your inbound links with exact match anchor text.
    2. Dilute your existing link profile with a new link building campaign aimed at building plenty of LSI keywords, naked URLs, brand anchors and junk/universal anchors. (For more information on what each of these is, please read my previous post, in which I go into detail about each one.)
    ***Shameless plug alert*** If you’d like, we offer link building packages aimed at diluting your existing inbound link profile in order to help you recover from Panda 3.3, 3.4 or 3.5. Whether you’re a small business or an agency with clients of your own, we can help you out.

    Other Possible Repercussions of Panda 3.5

    Google has made it clear that it doesn’t like “webspam” and it doesn’t like the sites that host it (the publishers) or the ones that use it to their benefit (the advertisers). Does this mean that it’s now possible to “tank” your competitors by throwing lots of crappy, spun content up at various blog networks that link to your competitor’s website? Is Google making Negative SEO a reality?

    I’ve already seen various reports that Negative SEO is working. I really hope Google hasn’t made it possible to tank competitors with nasty links. If so, I expect SEO companies to morph into SEO mercenaries, torpedoing their clients’ competitors, one by one.

    What’s the Real Purpose Behind Panda 3.3, 3.4 and 3.5?

    Google’s real purpose behind Panda 3.3 through 3.5 is simple: to make money. Small businesses and webmasters that have long held solid, page 1 rankings for their money keywords are suddenly and abruptly seeing their rankings decline, which is leading to decreased sales and hard-hit bottom lines. Many of these are businesses that enjoyed high-quality, high-converting organic search traffic that they were able to procure by paying a small fee to an SEO company to keep them ranked highly.

    Google realized an opportunity: If they could make it more difficult for small businesses to rank well, while at the same time smacking down hundreds of thousands of businesses in the rankings, they could incite a panic-induced stampede to Google’s Adwords pay-per-click auction in an attempt to compensate for lost organic search traffic. This is exactly the effect that Google has had on the industry. Small SEO companies are closing up shop. Small businesses are panicking and fleeing to Google Adwords. At the same time, the influx of new bidders in Adwords is increasing the average cost per click for keywords across every niche, putting more money in Google’s pockets and stripping away profit margins from bidders (small companies).

    This is a smart business move by Google, but it’s a far cry from making the search world a better place, as they claim to be doing. Search results are worse, or just plain different; not better. Small businesses that long enjoyed prosperity are begging to give Google money to get their brand back at the top of search results (albeit in the “Sponsored” section). Google is flexing its control over the search industry in a way that’s going to suck more money out of small, private businesses and put more money in its own coffers.

    Conclusion

    I hope this guide has been helpful for you, whether you’re just trying to learn more about Google Panda 3.5 or whether you’ve been negatively affected by it. Feel free to reach out or leave a comment!


  5. Google Panda 3.3: Why Your Rankings Dropped and How to Recover


    Google Panda 3.3 rolled out between February 18, 2012 and February 27, 2012, and the SEO world, along with millions of website owners, small business owners, and webmasters, has been scrambling to figure out the answers to these two questions:

    • Why did my site lose rankings?
    • What do I need to do to recover?

    Now that the dust has settled, we have some answers as to what happened with your rankings, as well as how to recover.

    What happened:

    Your site was hit by Panda 3.3, the latest iteration of Google’s Panda algorithm. Panda 3.3 was rolled out between 2/18 and 2/27.

    Who was affected by Panda 3.3?

    • Larger sites that had long held rankings in their niche for years
    • Strong authority sites with active or historic link building

    What is Panda 3.3?

    Panda 3.3 is the latest iteration of Google’s Panda algorithm. It specifically targeted unnatural link profiles. It’s not a ranking penalty; rather it’s a loss of rankings due to decreased value of the inbound links pointing to your site. The inbound links were devalued due to a change in the way Google assesses inbound links. Here’s Google’s statement on what changed:

    “Link evaluation. We often use characteristics of links to help us figure out the topic of a linked page. We have changed the way in which we evaluate links; in particular, we are turning off a method of link analysis that we used for several years. We often rearchitect or turn off parts of our scoring in order to keep our system maintainable, clean and understandable.”

    Here’s the link:
    http://insidesearch.blogspot.com/2012/02/search-quality-highlights-40-changes.html

    Additionally, Panda 3.3:

    • Placed more emphasis on social media, and inbound links from social media sites like Facebook, Twitter, and Google+.
    • Placed more emphasis on onsite content matching up with the source page content of the link
    • Placed more emphasis on the content of the source page in order to determine the relevance of the outbound link

    Why your site was hit:

    Google looks for patterns that can be programmatically identified and enforced. You were hit because your site fit that pattern; your site wasn’t hand selected or “penalized.” Too many of your inbound links had “exact match” anchor text, meaning that the anchor text of the links was exactly the keywords you were trying to rank for. SEOs and link builders have historically built lots of exact-match anchor text links because exact-match anchor text has always carried a heavy weight in the ranking algorithm. Now, after Panda 3.3, that weight has been significantly reduced. As such, the value of these links was reduced.

    Additionally, Google appears to have implemented a threshold for “too many” exact match anchor text links. When Google deems there to be too many exact-match anchor texts for a particular keyword, it will significantly reduce the value of all of those links. This is worrisome because it opens up the doors to “Negative SEO” or attack-tactics. It’s now presumably possible to “tank” your competitors by building links to their websites with over-optimized exact match anchor text. I hope Google will realize this and try to prevent it from happening, because as of right now, it appears to be possible. If Google doesn’t correct this, I expect to see mercenary “Negative SEO” companies start to sprout up and offer services of tanking competitors out of the rankings.

    Here are two things to specifically look for when reviewing your inbound link profile:

    1) Your link profile might not have enough brand-name anchors. Anchor text that is your brand name, or a variation of it, is one signal to Google of a natural link profile.

    2) Another element of a natural-looking link profile is what’s called “junk” anchor text, along with LSI anchor text and naked URLs. These are anchors that say “click here”, “here”, “Website”, “yoururl.com”, “www.yoururl.com”, etc. LSI anchor texts are terms related to your target keywords. Google looks for these anchors in backlink profiles as a signal of a natural link profile, since most folks link to other sites that way.
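
    To see where your own profile stands, you can bucket your anchor texts and eyeball the ratios. Below is a simple sketch; the brand name, target keyword, and sample anchors are all hypothetical, and the buckets follow the categories described above.

    ```python
    # Simple sketch: classify inbound-link anchor texts into rough buckets
    # (exact match, brand, naked URL, junk, other/LSI) and print the ratios.
    # The brand, keyword, and sample data are hypothetical.
    from collections import Counter

    BRAND = "audiencebloom"
    TARGET_KEYWORD = "link building services"
    JUNK = {"click here", "here", "website", "read more", "source"}

    anchors = [
        "link building services", "AudienceBloom", "click here",
        "www.audiencebloom.com", "SEO content help", "link building services",
    ]

    def classify(anchor: str) -> str:
        a = anchor.lower().strip()
        if a == TARGET_KEYWORD:
            return "exact match"
        if BRAND in a.replace(" ", ""):
            return "brand"  # includes naked URLs that contain the brand name
        if a in JUNK:
            return "junk"
        if a.startswith(("http", "www.")):
            return "naked URL"
        return "other / LSI"

    counts = Counter(classify(a) for a in anchors)
    for bucket, n in counts.items():
        print(f"{bucket}: {n} ({n / len(anchors):.0%})")
    ```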

    What you need to do to recover from Panda 3.3:

    • To recover from Panda 3.3, you need a link building campaign that includes anchor text with lots of variation, including lots of brand-name anchors, LSIs, junk anchors, and naked URL anchors. You need to continue this until the scales are “tipped back” in your favor and the ratios of exact-match anchors are brought down. When the ratios drop below the threshold (whatever that threshold may be), you will regain some of the value of your links.
    • You need to revisit the textual content on your website and ensure that it uses the keywords for which you’d like to rank well, in addition to variations of those keywords.
    • Make the following changes to your link building campaign to counter and adapt to this algorithm change:
      • Vastly increase the amount and variation of anchor text that you use to build inbound links to your site, and increase the usage of brand name anchors, naked URL anchors, junk anchors and LSI anchors.
      • Add in a social component so that your site will receive inbound links from social sites such as Facebook, Twitter, Google+, Delicious, and more.

    Have I permanently lost all value from the links that were built before Panda 3.3 hit?

    Probably not. Once the scales are tipped back in your favor with a strategic link building campaign that consists of plenty of the anchor types I outlined above, the value from your previously-built links should be restored.
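
    As a back-of-the-envelope illustration of that “tipped back” arithmetic: suppose 80 of your 100 inbound links use exact-match anchors, and assume, purely for illustration, a tolerable ratio of 30 percent (Google has never published the actual threshold). The sketch below solves for how many varied-anchor links you would need to build.

    ```python
    # Back-of-the-envelope sketch of the "tipping back" arithmetic. The 30%
    # threshold is an assumption for illustration only; Google has never
    # published a real number.
    import math

    exact_match_links = 80   # existing links with exact-match anchors
    total_links = 100        # all existing inbound links
    threshold = 0.30         # assumed maximum tolerable exact-match ratio

    # Solve exact / (total + n) <= threshold for n, the diluting links needed.
    needed = max(0, math.ceil(exact_match_links / threshold - total_links))
    print(f"Build roughly {needed} varied-anchor links to get below {threshold:.0%}")
    ```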

    How long will it take?

    If the scales were only slightly tipped, it could take as few as 2-3 weeks. I would expect it to take 2-4 months on average, with extreme cases requiring 6-8 months of link building work.

    I hope this guide helps you conquer and reverse the hit from Google Panda 3.3! Was your site hit by Panda? Let us know in the comments!

