AudienceBloom


Category Archive: Panda

  1. 8 Changes You Need to Make After Panda 4.1


    After four months of silence on the Google Panda front following May’s Panda 4.0 update, the next iteration of Panda is here. Referred to as Panda 4.1, the update isn’t big enough to warrant the title of “5.0,” but it is significant enough to have search marketers scrambling.

    Building on the intentions of its predecessors, Panda 4.1 continues Google’s tradition of gradually weeding out low-quality content in favor of well-written, informative, engaging content. Sites with aggregated or copied content, such as lyric databases and medical content hubs, seem to have been hit the hardest by this iteration of Panda, suggesting that Google’s duplicate content detection is becoming more sophisticated. On the flip side, small- to medium-sized businesses with diverse original content are seeing a boost.

    The update started rolling out officially on September 25, 2014, and became active in gradual updates that spanned through the first week of October. Most companies have already seen the gains or losses from this update, so if you haven’t noticed your rankings change in the past few weeks, don’t worry—Panda 4.1 probably didn’t affect you.

    Still, Panda 4.1 has changed the world of search yet again, and if you want to take advantage of it and prepare for the next phases of Google’s evolution, there are several strategic changes you’ll need to make:

    1. Scour your site for duplicate content—and get rid of it.


    Sites with volumes of duplicated content are the ones that have been hit hardest by Panda 4.1. Now is your chance to get rid of the dead weight. Look throughout your site and your blog to find any articles that might be partly replicated from an outside source. Just because you don’t plagiarize work doesn’t mean you’re not at risk—extended quotes, attributed work from outside authors, and paraphrased sections could all register as duplicated material, and could hurt your overall rankings. If you find any content that could be seen as a duplicate of another source, get rid of it.
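    If you want to automate the first pass of that audit, here’s a minimal Python sketch using the standard library’s difflib; the filenames and the 50% threshold are assumptions you’d tune to your own site:

    ```python
    import difflib

    def overlap_ratio(text_a: str, text_b: str) -> float:
        """Rough ratio of matching text between two documents (0.0 to 1.0)."""
        return difflib.SequenceMatcher(None, text_a, text_b).ratio()

    # Hypothetical files: one of your articles and a suspected outside source.
    my_article = open("my_article.txt").read()
    outside_source = open("outside_source.txt").read()

    if overlap_ratio(my_article, outside_source) > 0.5:  # threshold is a judgment call
        print("Substantial overlap found; review or remove this article.")
    ```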

    2. Do a content audit and remove or improve “thin” content on your site.


    “Thin” content is a vague term, referring to content that is densely packed with keywords, light on value or specificity, or shoddily written. We’ve all seen content like this, so it should stick out like a sore thumb—especially in comparison to a longer, more detailed piece. Go through your previously published material and review the pieces of content that look like they’ve been slapped together. You have two options for these pieces: either delete them, or take the time to revise them into similar, but more valuable, pieces.
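    A word count makes a crude but useful first filter for finding thin pages. Here’s a minimal sketch, assuming you’ve exported your posts as plain text files into a posts folder; the 300-word floor is an assumed rule of thumb, not an official Google number:

    ```python
    import os

    MIN_WORDS = 300  # assumed rule-of-thumb floor, not an official Google threshold

    for name in os.listdir("posts"):  # hypothetical folder of exported posts
        with open(os.path.join("posts", name)) as f:
            word_count = len(f.read().split())
        if word_count < MIN_WORDS:
            print(f"{name}: only {word_count} words; revise or delete")
    ```

    Anything the script flags still needs a human read; length alone doesn’t prove a page is thin.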

    3. Adjust your content strategy to include only the highest quality material.


    Depending on the current level of your content marketing strategy, this change could be enormous or barely noticeable. Moving forward, all your content needs to be of the highest quality—that means based on an original idea, written by an expert, and highly detailed. Don’t worry as much about the frequency of your posts; if a piece of content isn’t as high quality as you’d like it to be, do not publish it. It’s better to have a smaller number of better-quality posts than a greater number of lesser entries. You may be doing this already, but it’s still a good idea to revisit your strategy and see what positive changes you can make.

    4. Add more outbound authoritative links to your content.

    Google wants to see high-quality, authoritative content. If you want to be seen as authoritative, you need to back up your facts and provide references to support your claims. The best way to do that is to provide in-text links pointing to outside, authoritative sites. It’s a way of leveraging the current status of well-established sites to bolster your own authority. As you continue writing new content, experiment with including more outbound links to build your own credibility. Make sure to use a diverse range of sources so you don’t point an excessive number of links at any one site.

    5. Include more images in your posts.

    Embedded images in your blog posts do two things: first, they make your posts more enticing to your readership, giving you greater reader retention and more reader satisfaction. Second, they give your content the appearance of detail, making it seem more valuable to Google. Include infographics in the body of your blog posts to illustrate a point with information; if they are original, they’ll naturally attract backlinks and assist your strategy in multiple ways. Otherwise, include any relevant images you can find (as long as they’re legal to use) to complement the text on your page.

    6. Publish author credentials to establish author expertise.

    According to the recent leak of Google’s Quality Rater Guidelines, author expertise is an important factor in evaluating the authoritativeness of a piece of content. Instead of trying to make your content seem like it was written by an expert, have your content actually written by an expert. Include author credentials at the bottom of each published article, identifying the author’s name, title, and area of expertise. If you do this consistently, and offsite content also features this author’s name, you’ll effectively build that author’s authority, and your content will be seen as higher quality. It’s a small change that could add up to major results.

    7. Build more high-quality links to your content.

    Despite all the changes that the Penguin updates have made to the world of backlink building, backlinks are still tremendously important for building a site’s authority. This change is essentially the strategy I covered in point 4, but in reverse. If a high-quality site, such as an information database or a .edu site, links to one of your articles, that article will be seen as much more credible, giving you a Panda-proof boost in authority. If you can incorporate more of these high-authority backlinks into your link building campaign, your domain’s overall authority will significantly increase.

    8. Perform content audits regularly.

    The best ongoing new strategy you can adopt in response to Panda 4.1 is a regular content audit. On a monthly or bi-monthly basis, take an hour to review all the new onsite content that’s been published since your last audit. Carefully review each piece to determine its quality; check for originality, authoritativeness, and level of detail. If any of these pieces does not meet your quality standards, either get rid of it or revise it to make it comply. Doing this regularly keeps you vigilant, and keeps your content quality from ever declining or putting you at risk for another Panda-related drop in rank.

    Google is notorious for keeping online marketers on their toes, and it has continued that reputation with this latest update. With Panda 4.0 coming in May and 4.1 rolling out in September, Google could be back on a quarterly (or near-quarterly) updating pattern, like it was for previous iterations of Panda. If that’s the case, it could mean another major update is on the horizon for December or January.

    Stay sharp and keep your strategy up-to-date, and you’ll likely ride past the next Panda update with no mysterious drops in rank. You might even get a boost!

  2. Here’s Why Retailmenot Got Hit by Google Panda


    Retailmenot.com, a growing coupon-based website backed by Google Ventures, experienced a stunning drop in both search engine rankings and organic traffic back in May of this year. Google Panda 4.0, the name given to May’s major Panda update, supposedly affected only about 7.5 percent of English search queries, but Retailmenot.com took a much bigger hit than expected.

    The incident has raised a lot of questions in the search engine marketing community, particularly focused on how Google Panda 4.0 works, and why Retailmenot.com was hit so hard despite being backed by Google’s own venture capital investment. The problem is multifaceted, but by understanding exactly what happened, you can protect your own web properties against a similar potential drop in the future.

    What Happened?


    First, let’s take a look at exactly what happened. According to SearchMetrics, by May 21, Retailmenot.com experienced an approximate drop in organic search visibility of 33 percent. This drop does not measure the amount of organic traffic a website receives, but there is a correlation between organic visibility and organic traffic. Essentially, this metric illustrates a cumulative drop in rankings over several keywords that adds up to a third less visibility.

    In a broader context, 33 percent doesn’t seem so bad. After all, sites like spoonful.com and songkick.com experienced an approximate drop of 75 percent. But Retailmenot.com’s popularity and support from Google Ventures make it an interesting representative of the Panda update. Other sites, such as medterms.com, experienced as much as a 500 percent increase in visibility—so we know the update wasn’t only giving out penalties. So why, exactly, was Retailmenot.com penalized?

    The Mechanics Behind the Drop


    The drop was first acknowledged around May 21, just one day after the rollout of Panda 4.0. There is no question that this update is primarily responsible for the significant drop in Retailmenot.com’s rankings. The Panda update, which started back in 2011, has a clearly defined purpose: to improve users’ online experience by eliminating spam and pages with low-quality or minimal content from search results. Since 2011, several new iterations of the Panda update, along with occasional “data refreshes,” have been applied to Google’s search algorithms in an ongoing attempt to improve search results.

    Panda 4.0, in May, was the latest “major” update. While the update surely introduced some new ranking signals and algorithm mechanics, the baseline goal of the update is in line with its Panda predecessors: to reward sites with high-quality content and punish those without it. Google is notorious for keeping its algorithms private, so it’s impossible to say exactly which factors were responsible for shaking Retailmenot.com’s visibility, but it seems like an inferior content program was at the heart of it.

    Why It Matters

    First, let’s take a look at why this hit was such a big deal for Retailmenot.com. A drop of 33 percent in search visibility doesn’t seem like that much on the surface; it could be the result of a handful of dropped ranks. The world of search is volatile at best, so occasional drops aren’t that big of a deal for most companies (especially big ones like Retailmenot.com). But this particular drop did have a significant impact on Retailmenot.com’s bottom line.

    CEO Cotter Cunningham reported in a conference call for Retailmenot.com’s Q2 earnings that organic search traffic represented approximately 64 percent of their total visitors—which is a big deal. Cunningham did report that Retailmenot.com has steadily recovered from the initial drop in rankings, but when 64 percent of your customers are affected by a change in Google’s algorithms, you take notice. Their stock price (SALE) closed at $31.04 at the end of trading on May 21, but by May 23, it had dropped to a low of $23.87. Their stock still has not returned to its original levels, but of course this is likely due to several factors.

    Why does this matter to you? Retailmenot.com is just one example of how significant an algorithm change can be. Preparing yourself for possible future changes, and responding immediately to any ranking drops, can help prevent or mitigate the effects of lost organic search visibility. Retailmenot.com wasn’t necessarily engaging in black hat practices; if they were spamming backlinks and posting total junk content, they would have experienced a much larger drop than they did. Instead, it appears as though the volume and quality of their content was just outside of Google’s latest standards, and as we know, those standards are constantly increasing.

    So you know it’s important to protect yourself against major search engine drops like these by committing yourself to providing your users with the best possible online experience. But the Retailmenot.com drop is also significant because it shows us that recovery is possible. CEO Cotter Cunningham also reported in their Q2 earnings conference call that some organic search visibility had been restored, and their revenue was still close to their original target.

    It’s also interesting to consider why Retailmenot.com was hit by Panda despite being backed by Google Ventures. While Google does seem biased in many of its actions (such as adjusting their search engine algorithms to favor content on their own social media platform, Google+), the fact that a GV-backed site was hit by a major update is evidence that Google has unflinching, equal standards for web quality. Let’s hope this unprejudiced stance remains as they continue to roll out more and more changes.

    How to Safeguard Your Site


    Google Panda 4.0 is over. If you were going to get hit by it, you’ve already seen the damage, so if you haven’t noticed any significant outliers in your search ranking data, you’ve successfully avoided getting hit by Panda 4.0. If you have experienced a penalty from any stage of the Panda update thus far, it’s time to remove those penalties and start making a recovery.

    However, as evidenced by Retailmenot.com’s recent debacle, just because you escaped a few penalties unscathed doesn’t mean you’ll avoid getting hit by future updates. If you want to make sure your organic search visibility stays where it is and continues to grow, you need to double-check your strategy to make sure you’re complying with Google’s standards for user experience:

    • Write high-quality, original content on a regular basis and make it easy for your users to find, read, and share. Explore a diverse range of topics, avoid keyword stuffing, and make sure your subjects are valuable to your readership. Multiple types of content, including writing, images, and videos, are encouraged.
    • Encourage organic backlinking, but don’t spam links to your site. Keep any links you post as natural and beneficial as possible, and if you guest post often, consider using nofollow links to keep Google at bay (see the sketch after this list).
    • Promote your content with social media, and encourage your followers to share and comment.
    • Keep your site as user-friendly as possible with an easy-to-follow navigation, ample opportunities to contact you, fast loading times, and minimal interference.
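    On the nofollow point above, here’s a minimal sketch of how you might nofollow the self-promotional links in a guest post draft before submitting it. It assumes the beautifulsoup4 package, and the filenames and domain are hypothetical:

    ```python
    from bs4 import BeautifulSoup

    html = open("guest_post.html").read()  # hypothetical guest post draft
    soup = BeautifulSoup(html, "html.parser")

    for link in soup.find_all("a", href=True):
        if "mysite.com" in link["href"]:  # hypothetical: links pointing back to your site
            link["rel"] = ["nofollow"]    # asks Google not to count the link for ranking

    open("guest_post_nofollow.html", "w").write(str(soup))
    ```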

    Google is somewhat unpredictable, and because updates always come without warning, it’s impossible to completely prevent any possible drop in organic search visibility. Still, if you adhere to best practices consistently and do everything you can to give your users a great experience, you should be able to avoid a hit like the one experienced by Retailmenot.com.

  3. How to Prepare for Penguin 2.0: Take Off that Black Hat!


    What do Penguins, Pandas, and black hats have in common? Lots! Penguin is the most recent of Google’s algorithm updates designed to clean up abuses in the field of SEO, and a new version is due out soon, according to Google’s web spam czar, Matt Cutts. The impending event has marketers, reputation managers, and webmasters scurrying for cover.

    SEO – A Concept Recap

    SEO (search engine optimization) is a relatively young public relations field that tries to increase the visibility of websites through the strategic placement of keywords, content, and social media interaction. The industry has grown rapidly in a little over a decade.

    Carried to extremes, as such things always are, the practice shades into black-hat SEO: a subdivision within the field that chases money-making results in unsustainable ways (i.e., against Google’s webmaster guidelines). It frustrates the very purpose of a search engine, which is to help users find the information they need. Instead, SEO run amok serves only the needs of online marketers wishing to increase sales for themselves or their clients.

    To readjust the proper balance, Mr. Cutts and his team of “penguin” police have attempted to establish guidelines that will rule out the most abusive practices of black hat SEO.

    Black Hat SEO – Are You Doing It?

    The predecessor to Penguin was Panda, with much the same purpose. Panda included a series of algorithm updates, begun in early 2011. These were aimed at downgrading websites that did not provide positive user experiences.

    Panda updates of the algorithm were largely directed at website quality. The term “above the fold” is sometimes used to refer to the section of a website that a user sees before scrolling down. The term comes from newspapers, which are delivered folded in two; the section “above the fold” is the one a reader sees before opening the paper, or unfolding it.

    Typically, marketers wish to cram as much eye-catching, commercial material as possible into this section, while responsible journalists wish to pack it with the most relevant and useful information.

    Penguin, on the other hand, is targeted more specifically at keyword stuffing and manipulative link building techniques.

    One targeted abuse, keyword stuffing, is not a tasty Thanksgiving delicacy, but the practice of loading the meta tag section of a site, and the site itself, with useless repetition of certain words. Sites can lose their ranking altogether as a result of such stuffing.

    Abusive practitioners of keyword stuffing are not above using keywords that are rendered invisible because their font color is identical with the background color. The user doesn’t see them, but the search engine spider does. This practice was soon discovered, however, and dealt with by the search engines.

    Meta tags are sometimes placed behind images, or in “alternative text” fields, so that the spiders pick them up while they remain invisible to users. Popular or profitable search keywords are sometimes included in ways invisible to humans but visible to the search crawlers. Very clever, but also soon discovered and dealt with. With Penguin, Google now analyzes the relevance and subject matter of a page much more effectively, without being tricked by keyword-stuffing schemes.

    “Cloaking” is another tactic that was used for a while to present a different version of a site to the search engine’s crawler than to the user. While a legitimate tactic when it tells the crawler about content embedded in a video or Flash component, it became abused as a Black Hat SEO technique, and is now rendered obsolete by the technique of “progressive enhancement,” which tailors a site’s visibility to the capabilities of the user or crawler. Pornographic sites have often been “cloaked” in non-pornographic form as a way of avoiding being labeled as such.

    The first set of Penguin guidelines and algorithms went live in April 2012, and the second main wave is due out any day now (though Penguin has gone through several periodic updates since its initial release). It’s designed to combat an excessive use of exact-match anchor text. It will also be directed against links from sources of dubious quality and links that are seen as unnatural or manipulative.

    The trading or buying of links will be targeted as well. The value of links from directories and bookmarking sites will be further downgraded, as will links from content that’s thin or poor-quality. Basically, the revision in the algorithms will be designed to rule out content that serves the marketer’s ends rather than the users’.

    Advice For SEO Marketers To Stay Clean

    If you are a professional SEO, the questions to ask yourself are:

    • Is this keyword being added in order to serve the customer’s potential needs, or merely to increase the number of hits? If the latter, the additional users the keyword brings to the site probably offer little conversion potential.
    • Is the added SEO material being hidden from the user or the search engine crawler? If so, with what purpose? If that purpose amounts to dishonest marketing practices, the material runs the risk of getting you in trouble with Penguin.
    • What’s the overall purpose of your SEO strategy? If it’s anything other than increasing sales by enhancing user experience, then you may expect an unwelcome visit from Penguin.

    If you’re a user, you’ll very likely not be as conscious of these changes, except inasmuch as they will alter the look of your search results page when you perform a search in Google. Will the new Penguin algorithms cut down on those ubiquitous “sponsored links” or “featured links”? Probably not. But savvy users know how to ignore those links by now, except of course when they turn out to be useful.

    Will the new algorithms enhance the overall usefulness of the search engine experience? Probably, at least marginally, and perhaps even in a major way. The whole field of internet marketing and e-Commerce is changing so rapidly and radically that it’s hard to keep track of the terminology, especially the proliferation of acronyms. But the ultimate goal will be an enhanced user experience.

  4. Why Duplicate Content is Bad for SEO and What to Do About It


    With the rollout of Google Panda, we have heard sad stories of sites that have been either devalued or removed from Google’s index entirely.

    One of the reasons for the huge drop in some sites’ rankings has been duplicate content — one of the problems that Panda was released to control.

    Most of the sites that have experienced a drastic decrease in rankings were content farms and article directories; that is, sites loaded with thousands of duplicate articles.

    While it had been made clear that duplicate content was one of the primary things Panda frowns on, some content authors breathed a sigh of relief after Google appeared to say that “There’s no such thing as a ‘duplicate content penalty’ ” in a blog post several years ago.

    But duplicate content remains a controversial issue. It has kept bloggers and webmasters nervous about publishing content that could hurt their rankings. Like many other things, there are two sides to the issue. There’s duplicate content that Google allows and there’s the type that hurts your website’s rankings.

    Let’s try to clear up the difference.

    What type of duplicate content can hurt your rankings?
    To determine whether a sample of duplicate content is going to pull down your rankings, first you have to determine why you are going to publish such content in the first place.

    It all boils down to your purpose.

    If your goal is to try to game the system by reusing a piece of content that has been published elsewhere, you’re bound to get penalized. The purpose is clearly deceptive and intended to manipulate search results.

    This is what Google has to say about this sort of behavior:

    “Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results.”

    If Google has clear evidence that you are trying to manipulate your search rankings, or that you are practicing spammy strategies to try to improve rankings and drive traffic to your website, it could result in your site being removed from Google’s index.

    The effects on users
    Publishing duplicate content could also hurt your reputation in the eyes of the users.

    The ultimate goal of the search engines is to provide users with the most valuable, most useful, and most relevant information. If you publish a piece of content that has been previously published elsewhere, your site may not show up for the same search, because search engines tend to show results only from the main content sources.

    This explains why the search engines omit duplicate results to deliver only those that the users need.

    When users read content on your site that they have already seen previously on a site that they trust more, chances are their trust in your site will diminish.

    But is there ever a case where duplicate content is acceptable?

    When duplicate content can be acceptable
    There may be instances when duplicate content is accidental, and therefore should not lead to any penalties.

    One such instance is when a search engine’s index identifies separate URLs within a domain that point to the same content. An example is the following trio of URLs: http://abc.com, http://www.abc.com, and http://www.abc.com/index.htm. There’s clearly no indication of manipulation or intent to spam in this case.
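    To make the point concrete, here’s a minimal sketch of how those variants collapse onto one address once you pick a canonical host (the choice of the www version here is an assumption):

    ```python
    from urllib.parse import urlparse

    CANONICAL_HOST = "www.abc.com"  # assumption: you've settled on the www version

    def canonicalize(url: str) -> str:
        """Collapse common duplicate-URL variants onto one canonical form."""
        path = urlparse(url).path
        if path in ("", "/index.htm", "/index.html"):
            path = "/"
        return f"http://{CANONICAL_HOST}{path}"

    for url in ("http://abc.com", "http://www.abc.com", "http://www.abc.com/index.htm"):
        print(canonicalize(url))  # all three print http://www.abc.com/
    ```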

    Another case of legitimate duplication occurs when a content sample is published in several different formats to cater to specific users. With the explosion of mobile web browsing, content is now published to suit desktop, tablet, and mobile phone web users. Publishing a single piece of content in several formats is not subject to any penalties for duplicate content.

    Also, keep in mind that there are instances when publishing a copy of a piece of content, in part or in whole, is needed for reference, such as when citing a news source. If the purpose is to reference or to add value to users, such content duplication is not subject to penalties.

    Avoiding duplicate content that provokes the Panda’s wrath
    Simply put, avoiding duplicate content is your best defense against any duplicate content penalties administered by Google Panda. Remember that Google and other search engines strive to provide search results that are unique and of high quality.

    Your goal must therefore be to publish unique and original content at all times.

    However, if duplication cannot be avoided, below are recommended fixes that you can employ to avert penalties:

    Boilerplates. Long boilerplates or copyright notices should be removed from various pages and placed on a single page instead. In cases where you would have to call your readers’ attention to boilerplate or copyright at the bottom of each of your pages or posts, insert a link to the single special page instead.

    Similar pages. There are cases when similar pages must be published, such as separate pages on SEO for small businesses and for big businesses. Avoid publishing the same or nearly identical information on both. Instead, expand on each service and make the information very specific to each business segment.

    Noindex. People could be syndicating your content. If there’s no way to avoid this, include a note at the bottom of each page of your content that asks syndicators to add a “noindex” meta tag to the syndicated copy, to prevent the duplicate content from being indexed by the search engines.
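    If you want to verify that a syndication partner actually applied the tag, here’s a minimal checker, assuming the requests and beautifulsoup4 packages; the URL is hypothetical:

    ```python
    import requests
    from bs4 import BeautifulSoup

    def has_noindex(url: str) -> bool:
        """True if the page carries a robots meta tag containing "noindex"."""
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        for tag in soup.find_all("meta", attrs={"name": "robots"}):
            if "noindex" in tag.get("content", "").lower():
                return True
        return False

    # Hypothetical URL of a syndicated copy of your article:
    print(has_noindex("http://partner-site.example.com/your-article"))
    ```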

    301 redirects. Let the search engine spiders know that a page has permanently moved by using 301 redirects. This also alerts the search engines to remove the old URL from their index and replace it with the new address.
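    How you issue a 301 depends on your server; as one illustration, here’s a minimal sketch in Flask (the routes are hypothetical):

    ```python
    from flask import Flask, redirect

    app = Flask(__name__)

    @app.route("/old-page")
    def old_page():
        # 301 = moved permanently: search engines drop /old-page from their
        # index and credit /new-page instead.
        return redirect("/new-page", code=301)

    if __name__ == "__main__":
        app.run()
    ```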

    Choosing only one URL. There might be several URLs you could use to point to your homepage, but you should choose only one. When choosing the best URL for your page, be sure to keep the users in mind. Make the URL user-friendly. This makes it easier not only for your users to find your page, but also for the search engines to index your site.

    Always create unique content. Affiliates almost always fall victim to the convenience of ready-made content provided by merchants. If you are an affiliate, be sure to create unique content for the merchant products you are promoting. Don’t just copy and paste.

    Conclusion
    Whatever your intent is, the best way to avoid getting penalized by Google Panda is to avoid creating duplicate content in the first place. Keep in mind that quality is now at the top of the search engines’ agenda.

    It should be yours too.

  5. SEO Mistakes the Panda and the Penguin Forbid You to Commit


    For Google and other major search engines, quality and reliability of information are key to user satisfaction. These elements also empower the search engines to thrive as they seek to provide better data to users in terms of quality, relevance, and authority.

    And who is king of the SEO hill?

    Google, of course. And Google shows no sign of loosening its stranglehold on the universe of SEO.

    Via one algorithmic update after another, Google is wreaking havoc on people who found loopholes in the system to advance their personal interests. For years, these smooth operators devised tricks to manipulate their way to the top of search engine results pages.

    With the Panda and Penguin updates, Google may have finally patched the holes that allowed spammers to litter search engine results with garbage.

    More recently, Google rolled out newer versions of the Panda and Penguin updates. In the hope of making the Internet a better place to host and find high-quality and extremely useful information, Google supplied webmasters and business owners with guidelines to help them play the SEO game in a fairer manner.

    So let’s talk about some of the mistakes that every webmaster and online business owner should avoid so as not to get slapped by Google. We’ll also discuss some recommendations on how to optimize your site properly for Panda and Penguin.

    But first, a brief review of what the two major Google updates are all about.

    Google Panda
    The Panda was the first of the two major overhauls that Google rolled out in less than two years. It offered an initial glimpse of how the mighty Google intended to provide better search engine results.

    The main goal of Panda was to sniff out sites that carry low-quality content — or in Panda-speak, “thin” content. What Google Panda generally looked for were sites with obviously spammy elements such as keyword stuffing, duplicate content, and in some cases, high bounce rates.

    Google Penguin
    Although at first it might have sounded cute and cuddly to Internet users, the Penguin quickly showed them otherwise. This update zeroed in on sites that were over-optimized in terms of backlinking.

    One of the most widely practiced link-building tactics prior to Penguin’s appearance was to use exact-match keywords as anchor text. The problem with this tactic is that Penguin reads it as an unnatural linking practice.

    To promote natural backlinking, Penguin set out to penalize sites that routinely used exact-match keywords as anchor text, and rewarded those smart enough to employ variations in their keywords.

    The top SEO mistakes you should avoid at all times
    Now that you have been reminded of what Panda and Penguin want and how they’d like us to play the SEO game, keep the following pitfalls in mind to avoid seeing your site take the deep plunge down the search results pages.

    1. Using mostly exact-match keywords for backlinks
    This used to be one of the most effective ways of getting a site to rank higher in search results. These days, this strategy can still be recommended, but with caution. Now that Penguin is policing the info highway, using mostly exact-match keywords is a sure way to get your site devalued.

    To gain or maintain favorable ranking, observe natural link-building best practices. Post-Penguin SEO calls for you to vary your keywords by using related terms. If you are optimizing for “baby clothing,” for example, use keyphrases such as “kids’ clothing,” “clothing for babies,” etc. It’s also a good idea to use your brand’s name as anchor text.

    The primary thing to remember is to link naturally. Don’t be too concerned about failing to corner exact-match keywords that you think could hugely benefit your business. After all, Google is moving toward latent semantic indexing (LSI), which takes related keyphrases into consideration for smarter indexing.

    2. Generating most of your traffic via only one marketing channel
    Many marketers, especially new ones, tend to assume the only way to gain a huge amount of highly targeted traffic is by focusing time and energy on a single marketing channel. Some only use SEO, while others concentrate their efforts on social media marketing.

    Focusing your attention on one channel could bring success in terms of gaining some very targeted traffic, but with regard to ranking, it could actually hurt you, especially since the Panda and Penguin rollouts.

    Again, diversity matters not only in your keywords and keyphrases, but also in how you drive traffic to your site. Apart from SEO, a smart traffic strategy will involve the following tactics:

    • Article marketing
    • Social media pages for your business
    • Guest posting
    • Social bookmarking
    • Forum posting and blog comments
    • Press releases


    By diversifying your traffic sources, you will create a natural way for your audience to find your business at different online locations — a signal that will get you favorable rankings in search.

    3. Failing to take advantage of internal linking
    Even worse is not doing any internal linking at all. Internal linking not only improves the users’ experience; it’s also good for onsite SEO.

    With strategic and meaningful internal linking, you will make it easy for your users to find their way around your site and locate the information they want. Your users will also have more good reasons to linger on your site as they consume more information related to what they are looking for.

    Proper internal linking also enables search engine spiders to determine which content is related to other content.

    Proper internal linking can be executed by including the following:

    • Standard navigation above the fold — more specifically, above the content
    • Category section on sidebar
    • Related posts on sidebar or below each post
    • Links within each post that point users/readers to related content
    • Sitemap
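    As one minimal sketch of the “related posts” idea from the list above, you could rank posts by how many topic tags they share; the post slugs and tags here are hypothetical:

    ```python
    # Hypothetical posts, each with a set of topic tags.
    posts = {
        "what-panda-changed": {"panda", "google", "content"},
        "recovering-from-penguin": {"penguin", "google", "links"},
        "writing-link-worthy-content": {"content", "links"},
    }

    def related(slug, k=2):
        """Rank the other posts by how many tags they share with this one."""
        tags = posts[slug]
        scored = [(len(tags & other_tags), other)
                  for other, other_tags in posts.items() if other != slug]
        return [name for score, name in sorted(scored, reverse=True) if score > 0][:k]

    print(related("what-panda-changed"))
    ```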


    4. Publishing content with very little value
    In Google Panda-speak, this is known as “thin” content. Panda rolled out to hammer sites that carry duplicate information and that promote content which offers very little value or information to users. Such sites are often stuffed with keywords and overly promotional.

    To avoid getting smacked by the Panda’s giant paw, think critically about the value your users are likely to get from your content: Is it unique? Will it help them solve their most pressing concerns? Will the content fulfill its promise?

    Conclusion
    Are we seeing the beginning of the end of SEO manipulation? Let’s hope so.

    As Google shoves spammers way down its search results, the hope is that Google’s first few pages will feature nothing but extremely valuable and useful information that really meets users’ expectations. And as a site owner and online entrepreneur, you can depend on Google Panda and Penguin to improve your standards as you strive to deliver what your audience is looking for.

    For more information on properly optimizing your site, contact us and we’ll show you your options to make your site Google compliant.


  6. The Google Panda 20 Update: Some Key Information


    Google is making everyone aware of the company’s relentless drive to supply better and more useful information.

    Following a series of Panda and Penguin updates, Google Panda #20 was released on September 27, 2012. This was the first major update since July 24, 2012. More Panda updates are expected to be released within the next few months, and future updates are expected to arrive on a more regular schedule.

    Unlike other recent releases, Panda #20 was a fairly major update – one that rolled out over almost a week. In fact, some 2.4% of English queries were affected, along with about 0.5% of queries in other languages.

    Another interesting thing about Panda #20 is that it overlapped with another algorithmic update dubbed the EMD update, which Google rolled out to target sites with low-quality, exact-match domain names.

    This made it tricky for affected SEOs and site owners to determine which update had hit them. Was their site hit for failing to comply with Google Panda standards, or for having a poor exact-match domain name?

    Panda was released to devalue sites with “thin content,” or content that offers minimal value. Since its release last year, tons of sites have seen a dramatic drop in rankings. Some, especially notorious content farms, have been removed from Google’s index altogether.

    Panda also targeted sites that contained duplicate content. As a result of Panda’s release, black-hat SEO practices have been significantly thrashed. Sites that churn out hundreds of pages with duplicate content were obliterated.

    The release of Panda, as well as its equally ferocious sibling Penguin, met with a few complaints as well. Years of hard work and substantial amounts of marketing dollars to push a site to the top of Google rankings were effectively tossed out the window. SEOs, publishers, and site owners, believing they had been following recommended SEO best practices, cried foul.

    The hard lesson we can all learn in the aftermath of Google’s algorithmic changes is that, while it’s true that quality is subjective, standards have been laid out.

    The stress on quality and authority

    Quality and relevance of information are at the heart of every substantial change rolled out by Google. To ensure that every site is on the same page with Google, the company has laid out guidelines for site owners to follow.

    As far as Panda is concerned, as long as your site’s content is original and offers quality and useful information, you should be fine.

    As long as the content strongly relates to the topic and offers great value to your audience, there wouldn’t be any reason for Google to slap you.

    Do your link-building activities follow the prescribed or accepted methods? Have you been linking to authority sites, and are authority sites linking back to yours? Do you make a point of regularly checking your link profiles for any potentially damaging links?

    There’s no way of telling how many more of Google’s Panda updates are coming in the future, but Matt Cutts has made it clear that Google Panda will be updated and refreshed on a continual basis. This shows how committed Google is to making the Internet a better and more reliable avenue for gleaning valuable information.

    Conclusion

    It’s crucial to keep abreast of the periodic algorithmic changes that Google rolls out. Keeping yourself on the same page with the search engines is vital to the success of your online business.

    If you need help keeping your site compliant with current SEO best practices, contact us. You can also subscribe to our feed to keep yourself in the SEO loop.

  7. Google’s Panda Hidden by EMD Update


    Six months ago the SEO world was turned upside down when Google released the Panda update. Since then, it’s basically been all about upping your game with the content you provide on your site. Just last week, on September 27th, a new Panda update took place. Didn’t notice? Don’t feel bad.

    Google wasn’t exactly transparent about it happening. In fact, it was pretty much shrouded by the EMD update. Yep, the EMD update and the new Panda update seemingly overlapped, leaving a lot of people confused. People who didn’t have an EMD site watched their sites tank in search results and were scratching their heads.

    So if you have a site that has dropped in rank (or possibly jumped up) then it very well could have been the new Panda. Obviously that’s probably the case if it isn’t an EMD. Were you hit? If so, there’s at least a silver lining you can reach for…

    Recovering from Panda

    Luckily, recovering from Panda doesn’t seem to be as difficult as recovering from some of the other things you could get hit with. With Panda, Google’s sole aim is to give users better quality sites in the search results. If you know that it’s Panda you got hit with, then focus your efforts on increasing the quality of your site.

    What’s kind of funny, though, is that if you were hit with the EMD update, a lot of your focus should also be on quality. If your quality doesn’t live up to the standards and your domain name is an EMD, you’re playing with fire. But theoretically, if you have an EMD and amazing content that gives the user a great experience, you likely won’t fall into the EMD hole.

    Content

    Provide good quality content that is helpful for your users. What this content is specifically is likely to vary according to your industry or niche. Any textual content should be written so that it’s easily understood by your readers.

    • Using shorter sentences helps.
    • Using words that you don’t need a degree to understand helps.
    • Using subheadings to break up the content into sections helps.
    • Using bullet lists (like this one) helps.

    The quality of your content is vital. Even if just a page or two on your site is thin (not much content), poorly written, or lacking any real value, your whole site can suffer. Panda won’t just penalize that page and make sure it doesn’t rank highly; it can take it out on your entire site.

    Get your fine-tooth comb out. Go through every single page on your site and make sure it has plenty of helpful content. If it doesn’t, then revise it or add to it; if need be, you can merge several pages into one page full of awesome content your readers will actually get help from.

    I’ve even seen some speculate that Google is using the Flesch Reading Score, which is one of several ways to “score” the readability of your content. If you’re interested, you can paste your content in and get your Flesch score (and others) here.
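    For the curious, the Flesch Reading Ease formula itself is simple: 206.835, minus 1.015 times the average sentence length, minus 84.6 times the average syllables per word. Here’s a rough sketch; the syllable count is a vowel-group approximation, so treat the output as a ballpark figure:

    ```python
    import re

    def flesch_reading_ease(text: str) -> float:
        """Flesch Reading Ease: roughly 0-100; higher means easier to read."""
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        word_count = max(1, len(words))
        # Approximate syllables by counting vowel groups in each word.
        syllables = sum(max(1, len(re.findall(r"[aeiouy]+", w.lower()))) for w in words)
        return 206.835 - 1.015 * (word_count / sentences) - 84.6 * (syllables / word_count)

    print(round(flesch_reading_ease("Short sentences help. Plain words help too."), 1))
    ```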

    Navigation

    Your site needs to be easy to use for both the user and the search engines. This means that users should easily be able to find what they’re looking for, and depending on your site, you may need different sitemaps or Schemas.

    If your sitemap becomes a monster page with tons of links then consider breaking it up into several of them. Remember it doesn’t have to link to every single page on your site, though. It should be an easy starting point to get to all the major sections on your site.
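    A sitemap index file is how you tie several smaller sitemaps together. Here’s a minimal sketch that generates one; the domain and child sitemap filenames are hypothetical:

    ```python
    # Hypothetical child sitemaps, each under the protocol's 50,000-URL limit.
    child_sitemaps = ["sitemap-posts.xml", "sitemap-products.xml", "sitemap-pages.xml"]

    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for name in child_sitemaps:
        lines.append(f"  <sitemap><loc>http://www.example.com/{name}</loc></sitemap>")
    lines.append("</sitemapindex>")

    with open("sitemap_index.xml", "w") as f:
        f.write("\n".join(lines))
    ```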

    If your business sells photography, then your site is probably very image- or video-rich. If that’s the case, then you definitely need to be using Schema.org markup and video sitemaps. In the guidelines (linked below) you will find links to video- and image-specific information that will help.

    Navigating your site should be super simple for any user. You should link to other relevant and helpful pages on your site on each page of your site, but don’t go crazy with them. You don’t want to have 20 links on one single page. But 2 links that go to further information or related information to help the user is perfectly fine (and encouraged).

    OK, finally… whatever you do, don’t simply rely on others to figure out what to do with your site. Even if you hire someone to do it for you, you should at least read Google’s design and content guidelines. They aren’t detailed, but they will give you an idea, and at the end of the day – it’s your business. It’s your responsibility, right?

    Conclusion

    Unlike with other updates, there are many stories of sites that have bounced back to where they were after tweaking their site and sometimes came back even stronger. Exactly what needs to be done will depend on your industry and what you have/haven’t done on the site. If you feel you got chewed up and spit out by Panda, contact us and we can help.

  8. Google Penguin: 5 Recovery Facts You Need to Know


    Now that Google Penguin has had some time to sink in, we have an opportunity to reflect on what exactly Penguin changed, the aftermath of Penguin, and (most importantly) what steps should be taken to recover from a Penguin penalty.

    As I previously wrote, Penguin targeted inbound link profiles. While Panda 3.3 and 3.4 devalued certain elements of anchor text (most notably, exact-match anchor text), Penguin actually slapped a penalty on it. With access to client data, as well as data from dozens of folks who’ve reached out and asked for link profile audits, I’ve seen some specific trends that are undeniable. In this post, I’ll discuss the trends that I see in every Penguin-affected link profile, while pointing out supporting evidence from other SEO gurus and webmasters who have noted similar trends. My goal is to definitively outline what Google Penguin changed, and what you (SEOs and webmasters) should do about it.

    Fact #1: Reconsideration Requests Won’t Help You

    Penguin is an algorithmic penalty, and Matt Cutts has stated that reconsideration requests won’t help with algorithmic penalties. Matt Cutts explains the difference between algorithmic and manual penalties in the video below. Right around the 2:00 mark, he explains that reconsideration requests won’t do you any good if you have an algorithmic penalty.

    Search Engine Land confirms this as well, here:

    However, Google says this is an algorithmic change — i.e., it’s a penalty that’s applied automatically, rather than a human at Google spotting some spam and applying what’s called a manual penalty. Because of that, Google said that reconsideration requests won’t help with Penguin.

    Fact #2: Your Inbound Link Profile is Probably What’s Hurting You

    Assuming you’re not engaging in some obviously shady onsite keyword stuffing, your inbound link profile is what caused your rankings to drop if your site took a dive on or around April 24th. Multiple sources have backed this up, including Search Engine Watch in this article:

    The Penguin algo seems to be looking at three major factors:

    • If the majority of a website’s backlinks are low quality or spammy looking (e.g., sponsored links, links in footers, links from directories, links from link exchange pages, links from low-quality blog networks).
    • If the majority of a website’s backlinks are from unrelated websites.
    • If too many links are pointing back to a website with exact-match keywords in the anchor text.

    Google changed the way its algorithm calculates value based on inbound links. With the rollout of Panda 3.3, anchor text was severely devalued. Penguin actually added a penalty for over-optimized inbound anchor text. Google did this to make it harder for SEOs to get their clients ranked well in the search engines, in the hopes that those clients would turn to Google Adwords instead.

    Fact #3: Removing Bad Links Will Help

    Since Penguin is an algorithmic penalty, and the spam flag in the algorithm is related to an unnatural link profile, it makes sense to remove the links that could be triggering this algorithmic flag. In the video above, at the 0:46 mark, Matt Cutts says:

    “So, if your site is affected by an algorithm, for the most part, if you change your site, whatever the characteristics are that’s flagging, triggering, or causing us to think you might have keyword stuffing, or whatever, if you change your site, then after we’ve re-crawled and re-indexed the page, and some period after that when we’ve re-processed that in our algorithms, for the most part your site should be able to pop back up or increase in its rankings.”

    So, I think it’s safe to assume that removing bad links will help.

    Note: At AudienceBloom, we offer inbound link profile audits. We can comb through your inbound link profile, diagnose why your website fell in the rankings, suggest specific links to remove, and outline a plan of action moving forward. Call or Contact Us for a quote!

    Fact #4: Diluting Your Inbound Link Profile with New Links Will Help

    Multiple case studies have been published (and I have verified them with my own clients’ data) concluding that the primary problem plaguing sites affected by Penguin is over-optimized anchor text for their primary target keywords. One of the most popular studies published was the one from MicroSite Masters here, stating:

    What anchor text should you be using? From the data we’ve evaluated, “MySiteDomain.com”, “MySiteDomain”, “http://mysitedomain.com”, “http://www.mysitedomain.com”, “The Title of My Website”, “here”, and “the title of one of my H1’s (that isn’t a keyword I’m trying to rank for)”, were generally used as anchors on sites that were not affected by the most recent Google update and are probably a good starting point to consider using moving forward.

    If you’ve read my other blogs, you’ll see that on March 9th (nearly two months before the MicroSite Masters case study) I published this article stating:

    Another element of a natural-looking link profile is what’s called “junk” anchor text, LSI anchor text, and naked URLs. These are anchors that say “click here”, “here”, “Website”, “yoururl.com”, “www.yoururl.com”, etc.

    SEONitro published another case study concluding the following:

    [Anchor text] is probably the biggest post-Panda/Penguin disqualifier, as most sites we researched did not diversify their link anchor density and were hit hard this go-around with the “exact match” dial-down. As we go up to Case Study #2 we find that our affected sites have VERY LITTLE brand incoming links.

    Jonathan Leger published a case study concluding that exact-match anchor text for sites that are ranking well post-Penguin is about 10% on average:

    It probably won’t come as much of a surprise for me to tell you that the average EMA for a site is pretty low — just 10% across all of the markets.
    Also, if you’re not using EMDs, it’s important to diversify your anchor text a lot. How much is “a lot” really depends on your market. So do the research. Check out the link profiles of other ranking sites in your market and see what their anchor text looks like.

    TopMarketingStrategies.com published a lengthy observation packed with valuable insight as well:

    Another option is that you can simply try to “dilute” the anchor text optimization by adding more links to that page with very diverse anchor text. This is the more likely option for most and it is what most are doing because it’s easier and cheaper (most of the time).

    UniqueArticleWizard.com published an article with their observations and tips for recovering from Penguin in this article:

    One of the most notable updates in Penguin was an “Over Optimization” algorithm. This update specifically targeted Keyword Anchor Text as it relates to the underlying link.

    What are some things that can help you recover from Penguin?

    1) Get new additional links with generic anchor text for your site
    2) Build no more than 50% of your backlinks with targeted anchor text

    Neil Patel wrote a great article explaining his findings over at QuickSprout:

    One of the unnatural link building signals that Panda 3.4 aims at is too many exact anchor text links. Standard practice used to be you’d aim for about 30% to 50% matches…now that number has dropped drastically. So test the waters out and start with 5% or so and increase slowly.

    On June 8th, Search Engine Land published its latest thoughts on Penguin:

    If you think that you were hit by Penguin, I recommend building a few links to your site with diversified anchor text. Stay away from exact match anchor text.

    On June 21st, ArticleRanks sent out an email to its list with Penguin recovery advice:

    Start diversifying your anchors to the point that you are totally diluting the anchors you already have in place, then on future penguin refreshes the filter will be lifted from your site. We have already recovered a couple of sites like this.

    Time for a shameless plug! At AudienceBloom, we offer SEO link building packages aimed at diluting your inbound link profile. If you’re suffering from a post-Panda 3.3 or Penguin penalty, we can help!

    Fact #5: Take Action, Have Patience, and Everything Will be Just Fine

    SEO is a rapidly changing industry. Since its inception, SEOs have been doing their best to analyze and adapt to search engine algorithms. When a major update like Penguin comes along and tanks your website’s traffic, the urge to give up may be strong; it may feel like you’ve just lost everything you’ve worked for. But in reality, it’s not about how many times you get knocked down. It’s about how many times you get back up.

    But it’s not just about getting back up and holding your ground. Without taking action, your website won’t see any recovery, as we learned from Matt Cutts’ video above.

    Conclusion:

    So, what can you do right now to take action and recover from Google Penguin?

    1. Audit your link profile and diagnose why your website lost its rankings

    2. Remove bad links identified in the audit

    3. Engage in a new SEO link building campaign to dilute your current inbound link profile and get a healthy mix of diverse anchor text links.

    I hope the information I’ve outlined here helps you recover from Google Penguin. If you found it helpful, leave a comment!

  9. How to Recover Your Rankings After Panda 3.3, 3.4, 3.5, or Penguin – Delete or Dilute


    Since I wrote my last post about Google’s Penguin update (the updated name for Google’s webspam algorithm change), I’ve been asked how to diagnose the reason for a site losing its rankings (Pandalization, if you will — or is it ‘Penguinization’ now?). So in this post, I’m going to outline a specific, step-by-step method, via case study, for how to do the following:

    • Diagnose why your site dropped in the rankings, and
    • Recover your site’s rankings in Google

    Diagnosing why your site’s rankings dropped

    If your site’s rankings dropped around February 19th, March 23rd, or April 24th, then you’re likely a victim of Panda 3.3 (February 19th), Panda 3.4 (March 23rd), or Panda 3.5/Penguin (April 24th). All of these algorithm changes specifically targeted inbound link profiles. Google’s goal with each of these was to make it more difficult to get a website ranked well using inbound links. For years, inbound linking tactics have dominated the SEO industry for one simple reason: they work really, really well.

    However, after Panda 3.3, the link building game changed. What used to work is no longer working. Google swiftly smacked hundreds of thousands of sites out of top ranking spots they had long enjoyed, in the hopes of creating a panic-induced mass migration to Google Adwords, thereby driving up bid prices and putting more money in Google’s pockets.

    Anyway, since we know that the recent Panda/Penguin algorithm updates were related to your backlink profile, we know where to start with our analysis. Let’s go over some of the pre-Panda 3.3 link building best practices:

    Old (Pre-Panda 3.3) Link Building Best Practices:

    • Exact-match anchor text: 30-45% of overall inbound link profile
    • Link Quantity: The more, the merrier
    • Link Quality: Higher PR pages and root domains are better
    • Link Velocity: Steady or increasing, month over month
    • Source anchor text matches destination content: Unnecessary
    • Source URL content matches destination URL content: Unnecessary
    • LSI anchor text: 5-10% of overall inbound link profile
    • Junk/universal anchors, Naked URLs, and Branded Anchors: 5-10% of overall inbound link profile
    • Nofollow anchors: Unnecessary


    New (Post-Panda 3.3) Link Building Best Practices:

    • Exact-match anchor text: 1-5% of overall inbound link profile
    • Link Quantity: The more, the merrier
    • Link Quality: Higher PR pages and root domains are better
    • Link Velocity: Steady or increasing, month over month
    • Source anchor text matches destination content: Important
    • Source URL content matches destination URL content: Important
    • LSI anchor text: 20-30% of overall inbound link profile
    • Junk/universal anchors, Naked URLs, and Branded Anchors: 70+% of overall inbound link profile
    • Nofollow anchors: 10-20%
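
If you want to sanity-check a link profile against these new targets, here’s a minimal Python sketch. The category keys and example percentages are my own, and the target ranges simply restate the list above; you’d still need to classify each anchor yourself:

    # Post-Panda 3.3 target ranges, restated from the list above.
    TARGETS = {
        "exact-match anchors": (0.01, 0.05),
        "LSI anchors": (0.20, 0.30),
        "junk/naked/branded anchors": (0.70, 1.00),
        "nofollow anchors": (0.10, 0.20),
    }

    def check_profile(ratios):
        """ratios: category -> measured share of the overall inbound link profile."""
        for category, (low, high) in TARGETS.items():
            measured = ratios.get(category, 0.0)
            status = "OK" if low <= measured <= high else "OUT OF RANGE"
            print(f"{category}: {measured:.0%} (target {low:.0%}-{high:.0%}) -> {status}")

    # Example: a hypothetical pre-Panda 3.3 profile, heavy on exact-match anchors.
    check_profile({
        "exact-match anchors": 0.38,
        "LSI anchors": 0.08,
        "junk/naked/branded anchors": 0.07,
        "nofollow anchors": 0.0,
    })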

     

Clearly, Google has turned the link building industry on its head. Hundreds of thousands of webmasters and SEO companies that followed pre-Panda 3.3 link building best practices were hit with an “unnatural links” warning from Google Webmaster Tools and terrorized with abrupt ranking losses, leading to huge declines in traffic, sales, and profits.

Of particular importance was Google’s change to the exact-match anchor ratio. Whereas a high ratio of exact-match anchors was previously a golden ticket to the top of Google’s search results, that golden ticket became a warrant for your arrest after Panda 3.3. Websites whose backlink profiles included over-optimized anchor text (as a ratio of the overall inbound link profile) were smacked into oblivion by Google.

    Clearly, after Panda 3.3, Google implemented some sort of threshold for what they feel is an appropriate ratio of anchor text that any website should have. Websites with a higher ratio than this threshold saw the value of all of those links completely discounted — it was as if they didn’t exist anymore.

This is also what opened the door for Negative SEO — the practice of tanking your competitors out of the rankings. Simply build thousands of crappy links with the same anchor text to your competitor’s site, and BAM; they’re out. Prior to Panda 3.3, this tactic would have either helped your competitor slightly or done nothing at all, because Google’s algorithm was smart enough to simply discount such links. Unfortunately, that no longer appears to be the case.

Anyway, now that you know what anchor ratios you should be looking for in your site’s backlink profile, it’s time to get a list of your site’s backlinks. The top two tools right now are the following:

1. Majestic SEO (free to use on your own site; monthly subscriptions available to analyze your competitors’ sites)
2. Open Site Explorer (monthly subscription required)

     

For the purpose of this tutorial, let’s use Majestic SEO.

Step 1: Create an account

    Step 2: Follow the instructions to link your website to Majestic SEO (you’ll need FTP access to your site)

    Step 3: Type in your website’s URL & Click “Explore”

Majestic SEO’s main search screen

    Step 4: Review your site’s foundation stats

Majestic SEO’s report dashboard

    What’s important to look for here is a good ratio of referring domains to external backlinks. For example, as of the time of this post, AudienceBloom.com has 1,436 backlinks from 566 referring domains, giving us a unique linking domain ratio of 39.41%. Generally, you want to aim for 30% or higher.
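
If you’d rather compute the ratio yourself, it’s just referring domains divided by total backlinks. A quick sketch using the figures above:

    # Unique linking domain ratio = referring domains / total backlinks.
    referring_domains = 566
    total_backlinks = 1436

    ratio = referring_domains / total_backlinks
    print(f"{ratio:.1%}")  # -> 39.4%, comfortably above the ~30% guideline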

    Step 5: Dig in — click “Create Report”

    Be sure to use the “fresh index” and select a domain-level report. It should complete immediately, after which you can navigate to the “reports” section, click the report, and then click “More detailed anchor text report here.” On the next page, click “Export report CSV” and download the CSV report so we can dig into the data.

    Step 6: Visualize the data

Open your CSV file, delete every column except “AnchorText” and “TotalBackLinks”, then sort the rows by “TotalBackLinks” from largest to smallest.
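
If you’d rather script this step than click through Excel, here’s a minimal pandas sketch. The column names follow the Majestic SEO export described above; the filename is a placeholder:

    import pandas as pd

    # Load the anchor text export, keep only the two columns we care about,
    # and sort by backlink count from largest to smallest.
    df = pd.read_csv("anchor_text_report.csv")  # placeholder filename
    df = df[["AnchorText", "TotalBackLinks"]]
    df = df.sort_values("TotalBackLinks", ascending=False)
    print(df.head(20))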

    Note: from here on out, I will be using data from a website that got Pandalized recently — this is not AudienceBloom.com’s link profile.

Anchor text counts sorted from largest to smallest

    Next, use the pie graph option (on Excel’s “insert” menu) to create a visual pie graph of your data.
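
The same chart can be drawn with matplotlib if you’re staying in Python. This sketch reuses the df from the previous step and lumps everything outside the top ten anchors into one slice to keep the chart readable:

    import matplotlib.pyplot as plt

    # Top 10 anchors individually; everything else gets a single slice.
    top = df.head(10).set_index("AnchorText")["TotalBackLinks"]
    rest = df["TotalBackLinks"].iloc[10:].sum()
    if rest > 0:
        top["(everything else)"] = rest

    plt.pie(top, labels=top.index, autopct="%.1f%%")
    plt.title("Anchor text distribution")
    plt.show()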

     

Anchor text visualized on a pie graph

Ahem. Well, there’s your problem. More than half of this site’s link profile is made up of just three keywords (clearly the keywords they were targeting). It should now be clear why this site fell out of the rankings for these keywords in Google. This is definitive evidence that the site lost its rankings because of an over-optimization flag on its backlink profile.
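
You can turn that eyeball test into a quick check. The 50% threshold below simply mirrors this example; Google has never published an actual cutoff:

    # Share of the profile held by the three biggest anchors (df from above).
    top3_share = df["TotalBackLinks"].head(3).sum() / df["TotalBackLinks"].sum()
    print(f"Top 3 anchors account for {top3_share:.0%} of all backlinks")
    if top3_share > 0.50:
        print("Likely over-optimization flag -- time to delete or dilute.")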

    Recovering your site’s rankings in Google: Delete or Dilute

    Now that you’ve confirmed that you have an over-optimization flag on your site’s backlink profile, you have two options for fixing it:

1. Delete or remove all or most of your inbound links containing the over-optimized anchor text, or
2. Dilute your existing inbound link profile with a new link building campaign that focuses on building brand anchors, junk/universal anchors, LSI anchors, and naked URLs.

     

    It’s often extremely difficult, and sometimes impossible, to delete or remove existing inbound links. Most of the time, you’ll have no control over them. You can send emails to webmasters in a futile attempt to get them to care about you and your site’s rankings, but this often is too time-consuming, too tedious, and yields too few successes. A more feasible alternative is to dilute your existing link profile.

All other variables equal, Google values more recent links more highly. This is because a more recent link is a more timely “vote” of confidence. And that’s exactly the way Google views links — as votes. Because Google has more respect for newer links, it’s possible to quickly dilute your existing link profile with new links. The idea is to increase the size of your overall inbound link profile in order to reduce the ratio of your exact-match anchor text.
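
The arithmetic behind dilution is worth spelling out: if E of your T total inbound links carry the over-optimized anchor and you want that ratio at or below a target t, you need at least E/t - T new links. A quick sketch with made-up numbers:

    # Hypothetical profile: 600 exact-match links out of 1,000 total (60%),
    # aiming for the 1-5% exact-match range from the new best practices.
    exact_match_links = 600
    total_links = 1000
    target_ratio = 0.05

    # Solve exact_match_links / (total_links + new_links) <= target_ratio:
    new_links_needed = exact_match_links / target_ratio - total_links
    print(f"New diluting links needed: {new_links_needed:,.0f}")  # -> 11,000

As the example shows, the more over-optimized the profile, the larger the dilution campaign needs to be.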

    ***Shameless self plug alert*** – We at AudienceBloom offer link building packages designed to dilute your existing inbound link profile, for the purpose of recovering from Panda/Penguin over-optimization penalties. Whether you’re looking for link building for an individual company or you’re an agency with your own clients, we have solutions for you, and we’d love to work with you!

If you do decide to go the “delete” route, then you’re going to need a list of domains on which your links currently reside, as well as an anchor text count for each domain (so you know where to target your efforts). Luckily, you can also obtain this information from Majestic SEO.

    Step 1: Log back into your Majestic SEO account

    Step 2: Return to your report and click “More detailed report on referring domains here.”

    Step 3: Dig in

    Go ahead and click “Export Report CSV” and open up the CSV file so we can manipulate the data. Delete every column except “RefDomain”, “TotalBackLinks” and “AnchorText”. Next, sort by “TotalBackLinks” from largest to smallest.
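
Here’s the same cleanup scripted in pandas, plus a filter down to the over-optimized anchors you identified earlier. The filename and the anchor list are placeholders:

    import pandas as pd

    # Load the referring domains export and sort by backlink count.
    domains = pd.read_csv("referring_domains_report.csv")  # placeholder filename
    domains = domains[["RefDomain", "TotalBackLinks", "AnchorText"]]
    domains = domains.sort_values("TotalBackLinks", ascending=False)

    # Narrow the list to domains carrying your over-optimized anchors.
    offending_anchors = ["money keyword 1", "money keyword 2"]  # placeholders
    targets = domains[domains["AnchorText"].isin(offending_anchors)]
    print(targets.head(20))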

Domains sorted by anchor text

    You should now have a clear picture of who you need to reach out to in an attempt to get your links removed.

    Step 4: Start contacting webmasters

    Now that you know what domains are harboring the majority of your offending links, you need to reach out to them, one by one, and politely ask them to remove the links they have to your site. Often, if there are hundreds or thousands of links to your site from a single domain, it means your website is in a footer, sidebar, blogroll, or other site-wide link. This type of link is easily removed by the webmaster, provided they actually heed your request.
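
If you’re facing hundreds of domains, a script can at least draft the requests for you. This sketch builds on the targets DataFrame from the step above; the template wording (and finding each webmaster’s contact address) is left to you:

    # Draft one removal request per referring domain.
    TEMPLATE = """Hi,

    I noticed {count} links pointing to my site from {domain}. My inbound
    link profile has been flagged by Google, so I'd be grateful if you
    could remove them.

    Thanks!"""

    for domain, group in targets.groupby("RefDomain"):
        count = group["TotalBackLinks"].sum()
        print(TEMPLATE.format(count=count, domain=domain))
        print("-" * 40)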

    Conclusion

    I hope this guide on recovering your rankings from the recent Panda/Penguin/webspam algorithm updates has been helpful. If you find it useful, leave a message in the comments!

  10. Panda 3.5 (*Update* – “Penguin”) – What Changed from 3.4, and How to Recover

    13 Comments

4/26 Edit – Google has confirmed that this update is going to be called “Penguin.” Matt Cutts even tweeted a photo.

    Brace for impact. As I write this, Google is rolling out the latest rendition of its Panda algorithm — Panda 3.5 (Editor’s note: Now being called “Penguin”). Here’s Google’s official announcement:

    In the next few days, we’re launching an important algorithm change targeted at webspam. The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines. We’ve always targeted webspam in our rankings, and this algorithm represents another improvement in our efforts to reduce webspam and promote high quality content.

It seems to be up for debate right now whether this is an update of Google’s Panda algorithm (ie, Panda 3.5) or a standalone update. Either way, in this post my goal is to provide an in-depth analysis of the following:

    • What Google’s saying in its announcement
    • How Panda 3.5 is affecting search results
    • How to recover from Panda 3.5
    • Other possible repercussions of Google’s latest algorithm change
• My analysis of Google’s real purpose for rolling out this algorithm change

    What Google’s Saying

Let’s break down Google’s announcement and dive into the details of what they’re trying to accomplish with Panda 3.5.

Google:

The opposite of “white hat” SEO is something called “black hat webspam” (we say “webspam” to distinguish it from email spam). In the pursuit of higher rankings or traffic, a few sites use techniques that don’t benefit users, where the intent is to look for shortcuts or loopholes that would rank pages higher than they deserve to be ranked. We see all sorts of webspam techniques every day, from keyword stuffing to link schemes that attempt to propel sites higher in rankings.

Analysis:

Google is aware that it’s easy to increase rankings by amassing lots of inbound links and loading up your website with keyword terms and LSI (related) terms for your desired keyword rankings.
Google:

The goal of many of our ranking changes is to help searchers find sites that provide a great user experience and fulfill their information needs. We also want the “good guys” making great sites for users, not just algorithms, to see their effort rewarded.

Analysis:

Google wants high-quality, information-rich, user-friendly websites to appear in its search results. It doesn’t consider keyword-stuffed websites to be a quality source of information for its users. Additionally, Google wants to stop rewarding sites with high rankings that got there by manipulating its algorithm with crappy inbound links.

Google gives the following screenshot as an example of keyword stuffing:

Google's example of keyword stuffing

    I’ve seen this type of thing many times, and I’m disgusted whenever I see it. People use software to spit out this garbage and then either publish it on another website with a link back to their money site, or they put it below the fold of the page they want to get ranked in the search engines. The goal is to get as many keywords and related keywords (ie, LSI keywords) on the page as possible in order to prove to Google that the page is relevant and should rank well for the target keyword.
Google follows up with this screenshot:

Google's example of link spam

This example is clearly a page from a blog network. Blog networks are a popular and effective link building tactic, but Google doesn’t like them. In this example, the content isn’t even well written — it’s clearly been spun by computer software.

Google:

The change will go live for all languages at the same time. For context, the initial Panda change affected about 12% of queries to a significant degree; this algorithm affects about 3.1% of queries in English to a degree that a regular user might notice.

Analysis:

Google said that Panda 3.4 would affect 1.2% of queries, but it clearly affected way more than that; it rocked the SEO industry. If they say Panda 3.5 affects 3.1% of queries, then this update could have a much bigger impact than Panda 3.4 did. This would be the biggest update since Panda 1.0 itself.

    How Panda 3.5 is Affecting Search Results

    Search results for various queries appear to have changed, but they don’t appear to be better. In fact, they appear to be much worse. Let’s take a look at a few examples:

Search term: “new shoes”

• Rank 1: A YouTube video for the song “New Shoes” by Paolo Nutini. – Who? I don’t know who Paolo Nutini is, and props if you do. Furthermore, I’m looking for new shoes and Google gave me a video as the top search result? C’mon, man.
• Rank 2: An intro to marketing class. – WTF? What the hell does this have to do with new shoes?

Search term: “make money online”

• Rank 1: makemoneyforbeginners.blogspot.com – Seriously? This site is blank. As in, zero (0) posts. It’s ranking #1 for a search term with 110,000 global exact searches/month, and it’s blank.
• Ranks 2-5: Nothing useful, littered with Adsense ads.
• Rank 6: zzzprofits.com – What? This is a forum directory with barely any posts. Nothing related to making money online or even remotely useful here.
• Rank 7: gurucreation.com – This is a list builder site. The owner is just trying to get folks to give up their email address so he can build his email list.

Search term: “raw dog food”

• Rank 2: A book on Amazon. – A book about raw dog food? Is Google just getting into bed with Amazon here, or does Google really think I’m looking for a book?
• Rank 5: mudbay.us – I can’t find anything about raw dog food on this site. It’s not even mentioned on the homepage. I’m clueless as to why Google is ranking it #5 for this term.

    How to Recover from Google Panda 3.5

    As I discussed in a previous blog post, Google is targeting inbound link profiles with all their recent Panda updates (3.3, 3.4 and 3.5). If you’ve been victimized by this latest algorithm change, it’s due to one of the following factors:

1. Too many inbound links with exact-match anchor text.
2. Too many inbound links from “webspam” content.
3. Not enough “trust” links, such as links from Facebook, Twitter, and social bookmarking sites (also known as social signals).

You have two recovery options: delete or dilute.

1. Delete most or all of your inbound links with exact-match anchor text.
2. Dilute your existing link profile with a new link building campaign aimed at building plenty of LSI anchors, naked URLs, brand anchors, and junk/universal anchors. (For more information on each of these, please read my previous post, in which I go into detail about each one.)

***Shameless plug alert*** If you’d like, we offer link building packages aimed at diluting your existing inbound link profile in order to help you recover from Panda 3.3, 3.4, or 3.5. Whether you’re a small business or an agency with clients of your own, we can help you out.

    Other Possible Repercussions of Panda 3.5

    Google has made it clear that it doesn’t like “webspam” and it doesn’t like the sites that host it (the publishers) or the ones that use it to their benefit (the advertisers). Does this mean that it’s now possible to “tank” your competitors by throwing lots of crappy, spun content up at various blog networks that link to your competitor’s website? Is Google making Negative SEO a reality?

I’ve already seen various reports that Negative SEO is working. I really hope Google hasn’t made it possible to tank competitors with nasty links. If it has, I expect SEO companies to morph into SEO mercenaries, torpedoing their clients’ competitors one by one.

    What’s the Real Purpose Behind Panda 3.3, 3.4 and 3.5?

Google’s real purpose behind Panda 3.3 through 3.5 is simple: to make money. Small businesses and webmasters that have long held solid, page 1 rankings for their money keywords are suddenly seeing their rankings decline, which is leading to decreased sales and hard-hit bottom lines. Many of these are businesses that enjoyed high-quality, high-converting organic search traffic that they were able to procure by paying a small fee to an SEO company to keep them ranked highly.

    Google realized an opportunity: If they could make it more difficult for small businesses to rank well, while at the same time smacking down hundreds of thousands of businesses in the rankings, they could incite a panic-induced stampede to Google’s Adwords pay-per-click auction in an attempt to compensate for lost organic search traffic. This is exactly the effect that Google has had on the industry. Small SEO companies are closing up shop. Small businesses are panicking and fleeing to Google Adwords. At the same time, the influx of new bidders in Adwords is increasing the average cost per click for keywords across every niche, putting more money in Google’s pockets and stripping away profit margins from bidders (small companies).

This is a smart business move by Google, but it’s a far cry from making the search world a better place, as the company claims to be doing. Search results are worse, or just plain different; they are not better. Small businesses that long enjoyed prosperity are begging to give Google money to get their brands back at the top of search results (albeit in the “Sponsored” section). Google is flexing its control over the search industry in a way that will suck more money out of small, private businesses and put more money in its own coffers.

    Conclusion

    I hope this guide has been helpful for you, whether you’re just trying to learn more about Google Panda 3.5 or whether you’ve been negatively affected by it. Feel free to reach out or leave a comment!
