
Category Archive: Penguin

  1. How Guest Blogging Is Affected after Penguin 2.0 and 2.1



    Guest blogging is popular because it creates a win-win situation for everyone. It works well to drive traffic back to a blog. It works well to give a new blogger exposure on other, more established blogs. It works well to provide blog owners fresh content. And it works well to give readers interesting content from a new perspective.

    There are other benefits, too.

    For one thing, guest blogging is a great vehicle to allow good writers to quickly become popular bloggers. It’s a way to build your reputation even if you have only recently started blogging and are not getting many visitors to your blog. If you write on a specific niche well—in a clear and authoritative way—then you are attracting positive interest with your work. You are also winning the silent endorsement of the blogger who shares your content with his or her audience.

    For another, guest blogging is a way of building backlinks to a site. Consequently, it provides excellent search engine optimization. But that’s not all. Guest blogging, when done right, gives readers high-quality, relevant content.

    In fact, guest blogging has become so popular that many top blogs now are inundated with requests for guest posting.

    Why, then, if guest blogging is so beneficial, providing a wonderful service for writers, publishers, readers, and even search engines, would Google try to discourage it?

    Actually, Google is not targeting guest blogging. If guest blogging is in some way hampered by Penguin 2.0 or 2.1, then it is just an accidental casualty of a larger issue Google is trying to tackle with its change in algorithms.

    New Algorithms Will Not Wipe Out Guest Blogging

    After the dust from Penguin 2.0 and 2.1 settles, the updates may put a dent in this unique win-win strategy for driving traffic back to a website. But all things considered, while they may dampen the enthusiasm shown for guest blogging, they will not be devastating enough to stop it.

    It’s unlikely that we are going to see articles and reports on the demise of guest blogging.

    While Google is not exactly targeting guest blogs, some devaluing may occur to backlinks and author ranking as a result of the new algorithms.

    It’s still too early to say what has changed, but four trends are slowly showing up.

    How Life As We Know It May Change


    Here are four probable ways that Penguin 2.0 and 2.1 will affect life as we know it on the Internet.

    1. Google will be more discriminating about rewarding social signals.

    In the past, social signals were a sign that a guest post was popular. After all, if people liked something they might be inclined to share it with friends and followers on Facebook, Twitter, and the other popular social media platforms.

    Why, then, would Google want to discourage this level of sharing and enthusiasm?

    It’s because Google’s unstated prime directive is relevancy. With many people gaming the social signal system by buying hundreds, even thousands, of tweets or likes on cheap outsourcing websites like Fiverr and its clones, the relevancy test fails.

    Google’s battle is not actually against people sharing content that they like. It’s really a battle against bots and spam purveyors who are mimicking real Tweets and real Likes.

    One sign that an author is gaming the system is the author’s reluctance to mention their guest posts on their own social media platforms. Google reasons that an author who wrote a great piece is inclined to share it with his friends and followers on the major social media websites, while an author who had already purchased tweets and likes would probably not bother to share it on his or her own social profile.

    Still, although things may look grim, one way to resolve Google’s loss of confidence in social signals is to write great posts, share them on your own social media profiles, and ask influencers to do the same. At some point, Google’s algorithms will probably be able to distinguish between real human approval and the footprints left by bots.

    2. Google will be looking at author rank.

    A new benchmark is author rank. This will show Google where you are publishing your content and how well people like it on the Internet. While author rank can work in your favor, you also run the risk of having your rank devalued.

    Here are two ways to resolve this potential risk of getting your author rank devalued:

    • First, write great content that will naturally attract social signals and comments. Avoid short, hackneyed, uninteresting or even spun content. In other words, stuff that people will not want to read if they can find something better.
    • Second, get your backlinks from multiple content creation strategies rather than from guest blogging alone.

    It’s important to note that with the new algorithms, publishing on an authority website alone is no guarantee that your author rank will be safe.

    3. Google will give less weight to links.

    Before the change in algorithms, a link was a link. Now, Google questions links, deciding which to value and which to devalue.

    If you write a long piece, full of valuable information, then the link will probably be valued, but if you post an infographic, the link will probably be devalued.

    What this change basically means is that the rapid link building methods deployed by SEO experts will now be under suspicion.

    4. Lack of link relevance could be penalized.


    The worst penalty Google can mete out is to devalue your site. This may happen if you create irrelevant links.

    What is an irrelevant link? Suppose you wrote a guest post on the best places to invest in 2014. Relevant links might be those going out to the Huffington Post, Forbes, and the Wall Street Journal. Relevant links might also include references to pertinent dictionaries and encyclopedias like Investopedia or Wikipedia. An irrelevant link would be linking your article to a car website, a plastic surgeon’s website, or a dog grooming website.

    While you could probably get away with a few irrelevant links, if you create them often enough, your site itself might be devalued. Based on the latent semantic indexing of your guest post, Google could rightly question the relationship between the world of investments and the world of car dealerships, cosmetic surgery, and pet care.

    Don’t Panic, Write Well

    Ultimately, what Google is trying to do is in line with what it has always tried to do—encourage quality content and discourage poor content on the Internet. Quality content informs and educates. Great content wins attention and approval in the form of social media sharing, blog comments, and other marks of popularity. Alternatively, if content is written just for the sake of keywords, backlinks, and other SEO favors, then under the new algorithms it will be more quickly detected and devalued. Guest blogs that show signs of poor content will suffer. But guest blogs that give circumstantial evidence of good content will actually rise to the surface. The best strategy, then, is to continue to write well and avoid appearing to game the system, either accidentally or intentionally.

  2. Penguin 2.0: What Happened, and How to Recover


    If you’ve spent any time recently in the world of SEO, you’ve probably heard about Penguin 2.0 — Google’s search engine algorithm change that launched on May 22nd, 2013. By the way some SEOs were talking, you’d think it was the Zombie Apocalypse. Whatever it is, you can be sure it will dramatically change the web landscape. Here are answers to some important questions about Penguin 2.0.

    What is Penguin 2.0?

    To understand the 2.0 of anything, you need to understand the 1.0. The original Penguin is the moniker for Google’s algorithm update of April 24, 2012. When Google tweaked the algorithm in a big way, 3.1% of all English-language queries were affected by the update. Penguin was carefully designed to penalize certain types of webspam. Here are some of the main factors that Penguin targeted:

    1.  Lots of exact-match anchor texts (30% or more of a link profile)

    2.  Low quality site linkbacks, including directories and blogs

    3.  Keyword intensive anchors
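
    The 30% exact-match figure above lends itself to a quick sanity check. The following is a minimal Python sketch (the link profile data and helper name are hypothetical, not from any particular backlink tool) that flags a profile whose exact-match anchor share crosses that threshold:

    ```python
    # Hypothetical link profile: (anchor_text, source_url) pairs,
    # as you might export from a backlink report.
    link_profile = [
        ("dallas dui attorney", "http://blog-a.example.com/post1"),
        ("dallas dui attorney", "http://blog-b.example.com/post2"),
        ("his website here", "http://forum.example.org/thread9"),
        ("dallas dui attorney", "http://dir.example.net/listing"),
    ]

    def exact_match_ratio(profile, target_keyword):
        """Fraction of backlinks whose anchor text exactly matches the keyword."""
        anchors = [anchor.lower().strip() for anchor, _ in profile]
        matches = sum(1 for a in anchors if a == target_keyword.lower())
        return matches / len(anchors) if anchors else 0.0

    ratio = exact_match_ratio(link_profile, "Dallas DUI attorney")
    if ratio >= 0.30:  # the 30% figure cited for Penguin 1.0
        print(f"Warning: {ratio:.0%} exact-match anchors -- profile looks unnatural")
    ```

    The threshold is only a rule of thumb from the list above, not a documented cutoff Google publishes.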

    The aftershocks of Penguin continued long after April 24. Several mini Penguins have been released since then, which is why some SEOs prefer to call the coming change “Penguin 4.” The new Penguin is predicted to do the following:

    • Penalize paid links, especially those without “nofollow”
    • Penalize spamdexing more effectively
    • Penalize advertorial spam
    • Tighten penalties on link spamming and directory listings
    • Remove hacked sites from search engine results
    • Boost rankings for sites that have proven authority within a niche

    How much different is it from Penguin 1.0?

    Calling this Penguin 2.0 is slightly misleading. We shouldn’t think of algorithm changes in the same way we think of software updates — better features, faster architecture, whatever. Penguin is not a software update. It’s a change in the way that a search engine delivers results to users.

    Here is a brief explanation of search engines, and how they change. Search engines are designed to give people the most accurate, trustworthy, and relevant results for a specific search query. So, if you type in “how to cook lima beans,” the search engine attempts to find the very best site on the Internet to help you cook your lima beans. Obviously, every lima bean recipe site wants to have the top dog spot on the search engine results page (SERP).

    Some webmasters will cook up clever tricks to do so. Thus, a site with hordes of banner ads, hordes of affiliate links, and barely a word about cooking lima beans could, with a few black hat techniques, climb in the rankings. The search engine doesn’t want that. They want people to have their lima bean recipe — great content — not just a bunch of ads.

    Thus, they change things deep within the algorithm to prevent those unscrupulous tricks from working. But the slithery lima bean site figures out a new way to slip by the algorithm. And the algorithm figures out another way to block them. And so on, and so forth.

    As all of this is happening, several key points emerge:

    1.  Search engine algorithms become more sophisticated and intelligent.

    2.  It becomes less likely for sites to game the system.

    At AudienceBloom, we follow white-hat SEO principles. We understand that there are a few tricks that we could use that might bump your site higher in the short term. However, we don’t engage in those practices. We want our clients to be successful for the long haul, which is why we engage in SEO techniques that are truly legitimate.

    What’s going to happen? 

    Now that Penguin 2.0 is rolling out, one of two things will happen to your site (as Google’s data centers propagate with the algorithm rollout and your rankings are adjusted accordingly):

    1. Nothing.

    2. Your rankings will drop, organic traffic will tank, and your site will begin to flounder.

    If, unfortunately, number 2 strikes, you may not realize it for a few days unless you run a big site with 10,000+ visits a day, 30% or more of them organic. In order to answer “what’s going to happen” for your site, you need to understand whether or not your site is in violation of any Penguin 2.0 targets. That question is better answered with an entire article of its own, but here are a few warning signs that your site could be targeted by Penguin 2.0.

    • You’ve had unscrupulous link building efforts conducted on your site.
    • You purchased paid links from another site (e.g., advertorials).
    • You rely on spam-like search queries (for example, “pay day loans,” “cheap computers,” “free gambling site,” etc.).
    • You have aggressively pursued link networks and listings on unreliable directories.

    Each of the above four points is a common SEO tactic. Some SEOs have become sneakier than the algorithm, which is why Google is making these important changes.

    What should I do to prepare or recover?

    The most important thing you can do right now is to follow Matt Cutts’s advice in his recent video:

    “If you’re doing high quality content whenever you’re doing SEO, this (the Penguin update) should not be a big surprise. You shouldn’t have to worry about a lot of changes. If you have been hanging out in a lot of blackhat forums, trading different types of spamming package tips and that sort of stuff, then this might be a more eventful summer for you.”

    Content is the most important thing, of course, but that’s more of a proactive preparation than an actual defense. Is there a way to actually defend yourself from the onslaught of Penguin 2.0? What if you’ve already been affected by it?

    One important thing you can do right now is to analyze your site’s link profile to ensure that it is free of harmful links. Then, remove the bad links where you can, and file disavow requests for the rest to keep your site’s inbound link profile clean. This is the equivalent of major surgery on your site, and it could take a long time to recover. Here’s what Matt Cutts said about it on May 13:

    Cutts on Penguin 2.0

    Here are the steps you need to take to recover from Penguin 2.0:

    Step 1. Identify which inbound links are “unclean” or could be hurting your rankings (i.e., causing you to be affected by Penguin 2.0). To do this, you’ll need to perform an inbound link profile audit (or have us do that for you).

    Step 2. Perform major surgery on your site’s link profile in order to make it as clean as possible. This includes removing links identified in the link profile audit, and then disavowing them as well.

    Step 3. Build new inbound links using white-hat tactics like guest blogging, while abiding by proper anchor text rules with your new inbound links.

    Step 4. Establish a content calendar to keep pushing out high-quality content, engage in social media, and avoid spammy techniques of any kind.
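
    Part of Step 2 can be mechanized. Google’s disavow file is plain text: one `domain:example.com` entry or full URL per line, with `#` lines as comments. Here is a rough Python sketch (the flagged URLs are hypothetical placeholders, not real audit output) that turns a list of bad links into such a file:

    ```python
    from urllib.parse import urlparse

    # Hypothetical links flagged by a link profile audit.
    flagged_links = [
        "http://cheap-links.example.com/page1",
        "http://cheap-links.example.com/page2",
        "http://spammy-directory.example.net/listing/42",
    ]

    def build_disavow(urls, disavow_whole_domains=True):
        """Return the text of a disavow file for the given bad links."""
        lines = ["# Links identified by link profile audit"]
        if disavow_whole_domains:
            # Disavowing at the domain level covers every page on the bad site.
            domains = sorted({urlparse(u).netloc for u in urls})
            lines += [f"domain:{d}" for d in domains]
        else:
            lines += urls
        return "\n".join(lines) + "\n"

    print(build_disavow(flagged_links))
    ```

    You would still upload the resulting file manually through Google’s disavow tool; the sketch only formats it.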

    If you’re looking for SEO help, AudienceBloom is prepared to help. One of our major efforts in the wake of Penguin 1.0 was helping sites to recover their rankings and clean up from their past. If you’ve been hit by Penguin 2.0, now is the time to take action to recover your rankings and search traffic. Contact us for a complimentary assessment and action plan.

  3. How to Prepare for Penguin 2.0: Take Off that Black Hat!


    What do Penguins, Pandas, and black hats have in common? Lots! Penguin is the most recent set of guidelines published by Google designed to clean up abuses in the field of SEO, and a new version is due out soon, according to Google’s Web Spam Czar, Matt Cutts. The impending event has marketers, reputation managers, and webmasters scurrying for cover.

    SEO – A Concept Recap

    SEO (search engine optimization) is the relatively young public relations field that tries to increase the visibility of websites through the strategic placement of keywords, content, and social media interaction, and the industry has grown rapidly in a little over a decade.

    Carried to extremes, as such things always are, black-hat SEO is a subdivision within the field that tries to achieve money-making results in an unsustainable way (i.e., against Google’s webmaster guidelines). It frustrates the very purpose of a search engine, which is to help users find the information they need. Instead, rampant SEO gone amok serves only the needs of online marketers wishing to increase sales for themselves or their clients.

    To readjust the proper balance, Mr. Cutts and his team of “penguin” police have attempted to establish guidelines that will rule out the most abusive practices of black hat SEO.

    Black-Hat SEO – Are You Doing It?

    The predecessor to Penguin was Panda, with much the same purpose. Panda included a series of algorithm updates, begun in early 2011. These were aimed at downgrading websites that did not provide positive user experiences.

    Panda updates of the algorithm were largely directed at website quality. The term “above the fold” is sometimes used to refer to the section of a website that a user sees before one begins to scroll down. The term comes from newspapers, which are delivered folded in two. The section that is “above the fold” is the section one sees before opening the paper, or unfolding it.

    Typically, marketers wish to cram as much eye-catching, commercial material as possible into this section, while responsible journalists wish to pack it with the most relevant and useful information.

    Penguin, on the other hand, is targeted more specifically at keyword stuffing and manipulative link building techniques.

    One targeted abuse, keyword stuffing, is not a tasty Thanksgiving delicacy, but the practice of loading the meta tag section of a site, and the site itself, with useless repetition of certain words. Sites can lose their ranking altogether as a result of such stuffing.

    Abusive practitioners of keyword stuffing are not above using keywords that are rendered invisible because their font color is identical with the background color. The user doesn’t see them, but the search engine spider does. This practice was soon discovered, however, and dealt with by the search engines.

    Meta tags are sometimes placed behind images, or in “alternative text” fields, so that the spiders pick them up while they remain invisible to users. Popular or profitable search keywords are sometimes included in forms invisible to humans but visible to the search crawlers. Very clever, but also soon discovered and dealt with. With Penguin, Google now analyzes the relevance and subject matter of a page much more effectively, without being tricked by keyword-stuffing schemes.

    “Cloaking” is another tactic that was used for a while to present a different version of a site to the search engine’s crawler than to the user. While a legitimate tactic when it tells the crawler about content embedded in a video or Flash component, it became abused as a Black Hat SEO technique, and is now rendered obsolete by the technique of “progressive enhancement,” which tailors a site’s visibility to the capabilities of the user or crawler. Pornographic sites have often been “cloaked” in non-pornographic form as a way of avoiding being labeled as such.

    The first set of Penguin guidelines and algorithms went live in April 2012, and the second main wave is due out any day now (though Penguin has gone through several periodic updates since its initial release). It’s designed to combat an excessive use of exact-match anchor text. It will also be directed against links from sources of dubious quality and links that are seen as unnatural or manipulative.

    The trading or buying of links will be targeted as well. The value of links from directories and bookmarking sites will be further downgraded, as will links from content that’s thin or poor-quality. Basically, the revision in the algorithms will be designed to rule out content that serves the marketer’s ends rather than the users’.

    Advice For SEO Marketers To Stay Clean

    If you are a professional SEO, the questions to ask yourself are:

    • Is this keyword being added in order to serve the customer’s potential needs, or is it designed merely to increase the number of hits? If the latter, then the additional users that would be brought to the site by the keyword are probably not high-quality conversion potential.
    • Is the added SEO material being hidden from the user or the search engine crawler? If so, with what purpose? If that purpose amounts to dishonest marketing practices, the material runs the risk of getting you in trouble with Penguin.
    • What’s the overall purpose of your SEO strategy? If it’s anything other than increasing sales by enhancing user experience, then you may expect an unwelcome visit from Penguin.

    If you’re a user, you’ll very likely not be as conscious of these changes, except inasmuch as they will alter the look of your search results page when you perform a search in Google. Will the new Penguin algorithms cut down on those ubiquitous “sponsored links” or “featured links”? Probably not. But savvy users know how to ignore those links by now, except of course when they turn out to be useful.

    Will the new algorithms enhance the overall usefulness of the search engine experience? Probably, at least marginally, and perhaps even in a major way. The whole field of internet marketing and e-Commerce is changing so rapidly and radically that it’s hard to keep track of the terminology, especially the proliferation of acronyms. But the ultimate goal will be an enhanced user experience.

  4. Caution: Don’t Overreact to Penguin (or Any Other Google Change)


    If you got Penguin-slapped, the last thing you wanna do is rush in to try to fix things without first making sure of what you’re doing. As soon as the big Google updates started to hit, people panicked.

    Even some so-called professional SEO service providers freaked out. Then there were the SEO specialists who decided this would be a great time to take advantage of people: they rushed out and promised near-instant recovery from these updates. They assured people “I can fix it for you!” — when in reality, they were only chasing $$$ signs.

    Don’t jump on board and take everything you read to heart. Check out some of the big mistakes people made (and are still making).

    Calling a Complete Halt to Backlink Building

    No! No! No! Don’t believe it when someone tells you that backlinks don’t matter anymore. Think about this: If Google didn’t place any value on backlinks as a ranking factor, then why did they get so strict about which links they’re willing to count? If anything, that seems to emphasize the importance of backlinks, to my eyes. How about you?

    Google is getting stricter about which links they place value on because links do matter. Google wants everyone to produce over-the-top, priceless content each and every time; content that will bring mass attention and leave everyone wanting to share your material with everyone they know.

    Really? Yeah, like that’s gonna happen. It just isn’t that easy to generate those “natural” backlinks they wanna see.

    So yes, be aware of what Google is cracking down on. Read the Google Webmaster Guidelines updates (as rare as they are) and stay away from those links that hover in the dark, beckoning you to buy them. They may be enticingly cheap, but they’re that cheap for a reason. In sum, don’t-don’t-don’t stop working to gain valuable quality links.

    Racing as Fast as Possible to Remove Backlinks

    There’s an underlying, irrational fear that’s driving people to remove lots of backlinks. True, there are cases when that’s perfectly justifiable. But only if you have shady, cheap backlinks you’ve paid for or clearly identified as part of a negative SEO campaign out to hurt you. (The latter is rare; and even more rare is for one of those campaigns to achieve its intended effect — but it’s been known to happen).

    It is heart-wrenching to watch a small business that relies on its website do further damage by using the disavow tool or actually paying people to remove links that weren’t really doing it any harm. Not all backlinks are bad! Just because you didn’t create, initiate, or generate the link doesn’t mean it’s hurting you. Remember, Google wants you to have links that appear naturally, without any involvement from your side.

    So before you start erasing parts of your backlink profile or disavowing links, please — please! — make sure you know what you’re doing. Don’t just take one person’s opinion as gospel truth.

    Ditching Any Kind of Anchor Link Usage

    Anchor text. It was kind of a buzzword of 2012, wasn’t it? Everywhere you turned after the Penguin arrived, people were talking about how using keyword anchor text for links got them buried in the search results. Truth is, using a multitude of anchor texts has always been the smart way to do things. If 99% of your backlinks say “Dallas DUI attorney,” it doesn’t look natural at all.

    Instead, a chunk of your backlinks should say “Dallas DUI attorney,” but another chunk could say “Dallas attorney for DUI charges”; another could say “this attorney,” another could say “his website here,” etc.
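
    A quick way to check that kind of anchor mix is to compute the share of each distinct anchor text in your profile and see whether any single phrase dominates. A minimal sketch (the sample anchors below are hypothetical):

    ```python
    from collections import Counter

    # Hypothetical anchor texts gathered from a backlink report.
    anchors = [
        "Dallas DUI attorney",
        "Dallas attorney for DUI charges",
        "this attorney",
        "his website here",
        "Dallas DUI attorney",
        "click here",
    ]

    def anchor_distribution(anchor_list):
        """Share of each distinct anchor text in the profile."""
        counts = Counter(a.lower() for a in anchor_list)
        total = sum(counts.values())
        return {a: c / total for a, c in counts.items()}

    dist = anchor_distribution(anchors)
    top_anchor, top_share = max(dist.items(), key=lambda kv: kv[1])
    print(f"Most common anchor: {top_anchor!r} at {top_share:.0%}")
    ```

    A profile where the top anchor accounts for nearly all links looks manufactured; a spread across branded, generic, and keyword-variant anchors looks natural.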

    If your links are coming from related pages on other sites, Google’s pretty good at picking up on that and knowing what your site is about. Even with anchor text that has nothing to do with your keywords, like “click here,” if it’s surrounded by text that talks about DUI attorneys or something similar, then, yeah, they’ll know.

    Note: One of the newest “hot things” that’s getting a lot of talk is co-occurrences — sometimes called co-citations. There’s evidence that you don’t have to get a true HTML link. If an article talks about what to do when you get a DUI and says to visit (your site) just like that, with no actual link, that helps your ranking just as much as a real link would. Google has a pending patent that deals with identifying phrases “used to retrieve and rank documents.” Whether it pans out to be related to this or not, we’ll just have to wait and see. What do you think it really means?


    So these are just a few of the rash moves that site owners have been making. We hope that you’ll take heed. If you aren’t sure about your backlink profile or what you should be doing with it, we can help you. Visit our contact page and get in touch.

  5. SEO Mistakes the Panda and the Penguin Forbid You to Commit


    For Google and other major search engines, quality and reliability of information are key to user satisfaction. These elements also empower the search engines to thrive as they seek to provide better data to users in terms of quality, relevance, and authority.

    And who is king of the SEO hill?

    Google, of course. And Google shows no sign of loosening its stranglehold on the universe of SEO.

    Via one algorithmic update after another, Google is wreaking havoc on people who found loopholes in the system to advance their personal interests. For years, these smooth operators devised tricks to manipulate their way to the top of search engine results pages.

    With the Panda and Penguin updates, Google may have finally patched the holes that allowed spammers to litter search engine results with garbage.

    More recently, Google rolled out newer versions of the Panda and Penguin updates. In the hope of making the Internet a better place to host and find high-quality and extremely useful information, Google supplied webmasters and business owners with guidelines to help them play the SEO game in a fairer manner.

    So let’s talk about some of the mistakes that every webmaster and online business owner should avoid so as not to get slapped by Google. We’ll also discuss some recommendations on how to optimize your site properly for Panda and Penguin.

    But first, a brief review of what the two major Google updates are all about.

    Google Panda
    The Panda was the first of the two major overhauls that Google rolled out in less than two years. It offered an initial glimpse of how the mighty Google intended to provide better search engine results.

    The main goal of Panda was to sniff out sites that carry low-quality content — or in Panda-speak, “thin” content. What Google Panda generally looked for were sites that had obviously spammy elements such as keyword stuffing, duplicate content, and in some cases, high bounce rate.

    Google Penguin
    Although at first it might have sounded cute and cuddly to Internet users, the Penguin quickly showed them otherwise. This update zeroed in on sites that were over-optimized in terms of backlinking.

    One of the most widely practiced link-building tactics prior to Penguin’s appearance was to use exact-match keywords for anchor texts. The problem with this tactic is that Penguin reads it as an unnatural linking practice.

    To promote natural backlinking, Penguin set out to penalize sites that routinely used exact-match keywords for anchor texts, and rewarded those smart enough to employ variations in their keywords.

    The top SEO mistakes you should avoid at all times
    Now that you have been reminded of what Panda and Penguin want and how they’d like us to play the SEO game, keep the following pitfalls in mind to avoid seeing your site take the deep plunge down the search results pages.

    1. Using mostly exact-match keywords for backlinks
    This used to be one of the most effective ways of getting a site to rank higher in search results. These days, it has to be handled with great caution: now that Penguin is policing the info highway, using mostly exact-match keywords is a sure way to get your site devalued.

    To gain or maintain favorable ranking, observe natural link-building best practices. Post-Penguin SEO calls for you to vary your keywords by using related terms. If you are optimizing for “baby clothing,” for example, use keyphrases such as “kids’ clothing,” “clothing for babies,” etc. It’s also a good idea to use your brand’s name as anchor text.

    The primary thing to remember is to link naturally. Don’t be too concerned about failing to corner exact-match keywords that you think could hugely benefit your business. After all, Google is moving toward latent semantic indexing (LSI), which takes related keyphrases into consideration for smarter indexing.

    2. Generating most of your traffic via only one marketing channel
    Many marketers, especially new ones, tend to assume the only way to gain a huge amount of highly targeted traffic is by focusing time and energy on a single marketing channel. Some only use SEO, while others concentrate their efforts on social media marketing.

    Focusing your attention on one channel could bring success in terms of gaining some very targeted traffic, but with regard to ranking, it could actually hurt you, especially since the Panda and Penguin rollouts.

    Again, diversity applies not only to keywords and keyphrases, but also to how you drive traffic to your site. Apart from SEO, the smart way to drive traffic will involve the following tactics:

    • Article marketing
    • Social media pages for your business
    • Guest posting
    • Social bookmarking
    • Forum posting and blog comments
    • Press releases


    By diversifying your traffic sources, you will create a natural way for your audience to find your business at different online locations — a signal that will get you favorable rankings in search.

    3. Failing to take advantage of internal linking
    Internal linking is easy to underuse, and even worse is not doing any internal linking at all. It not only improves the user experience; it’s also good for on-site SEO.

    With strategic and meaningful internal linking, you will make it easy for your users to find their way around your site and locate the information they want. Your users will also have more good reasons to linger on your site as they consume more information related to what they are looking for.

    Proper internal linking also enables search engine spiders to determine which content is related to other content.

    Proper internal linking can be executed by including the following:

    • Standard navigation above the fold — more specifically, above the content
    • Category section on sidebar
    • Related posts on sidebar or below each post
    • Links within each post that point users/readers to related content
    • Sitemap
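
    The “related posts” item above can be approximated very simply, for example by ranking other posts by how many tags they share with the current one. A toy Python sketch with hypothetical post data (real blog platforms usually do this via a plugin):

    ```python
    # Hypothetical posts, each mapped to its set of tags.
    posts = {
        "how-to-recover-from-penguin": {"seo", "penguin", "recovery"},
        "guest-blogging-tips": {"seo", "guest-blogging"},
        "anchor-text-best-practices": {"seo", "penguin", "links"},
        "lima-bean-recipes": {"cooking"},
    }

    def related_posts(current_slug, all_posts, limit=2):
        """Rank other posts by number of shared tags with the current one."""
        current_tags = all_posts[current_slug]
        scored = [
            (len(current_tags & tags), slug)
            for slug, tags in all_posts.items()
            if slug != current_slug
        ]
        scored.sort(reverse=True)  # most shared tags first
        return [slug for score, slug in scored[:limit] if score > 0]

    print(related_posts("how-to-recover-from-penguin", posts))
    ```

    Posts with no tag overlap are excluded, so unrelated content never shows up in the “related” box.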


    4. Publishing content with very little value
    In Google Panda-speak, this is known as “thin” content. Panda rolled out to hammer sites that carry duplicate information and that promote content which offers very little value or information to users. Such sites are often stuffed with keywords and overly promotional.

    To avoid getting smacked by the Panda’s giant paw, think critically about the value your users are likely to get from your content: Is it unique? Will it help them solve their most pressing concerns? Will the content fulfill its promise?

    Are we seeing the beginning of the end of SEO manipulation? Let’s hope so.

    As Google shoves spammers way down its search results, the hope is that Google’s first few pages will feature nothing but extremely valuable and useful information that really meets users’ expectations. And as a site owner and online entrepreneur, you can depend on Google Panda and Penguin to improve your standards as you strive to deliver what your audience is looking for.

    For more information on properly optimizing your site, contact us and we’ll show you your options to make your site Google compliant.


  6. Google Penguin Update 3 Unleashed


    We are seeing drastic changes in how search engines deliver information. Google in particular is striving to deliver not just the most relevant information, but the most useful and high-quality content.

    Google has released a series of algorithmic updates aimed at placing sites that promote high-quality content ahead of the pack in search results. The most notable of these updates are Google Panda and Google Penguin.

    On October 5, 2012, Google unleashed the latest version of its anti-webspam algorithm, Penguin. Widely known as Penguin 3, the update was announced by Matt Cutts via Twitter as a data refresh affecting queries in several languages:

    “Weather report: Penguin data refresh coming today. 0.3% of English queries noticeably affected. Details:”

    0.3% may sound like a tiny figure, but it’s a significant signal that should put everyone on alert that the Penguin update is here to stay, and it’s hell-bent on providing search results that are more relevant and more useful.

    A brief timeline of Google Penguin updates

    Among Google’s algorithmic updates, perhaps none was more striking than Google Penguin. Below is a brief timeline of the Penguin update:

    • April 24, 2012 – The first Google Penguin was released. Almost instantly, many sites saw an incredible drop in rankings.
    • May 25, 2012 – Penguin 1.1 was released, affecting less than 0.1% of English queries.
    • October 5, 2012 – Penguin 3 was announced and released, with about 0.3% of English queries affected by the latest refresh.

    How does Penguin 3 fit into the series of Penguin updates?

    Penguin is all about ridding the SERPs of spammy websites. Sites that were over-optimized with low-quality content and links were heavily penalized by the first two versions of Google Penguin.

    Penguin 3 has had the same effect on search queries as Penguins 1 and 2, but Cutts was explicit this time about the effect of the latest refresh.

    In his tweets, Cutts revealed the size of Penguin 3’s impact on queries in English and several other languages. Below are some of the data on how much Penguin 3 affected searches in various languages:

    • English – 0.3%
    • Spanish – 0.4%
    • Italian – 0.3%
    • French – 0.4%

    What’s noticeably different?

    SEOs and webmasters are probably wondering what’s different in the SERPs as a result of Penguin 3’s release.

    With the release, Matt Cutts gave us a clear idea of where we should look for the results of the latest algorithmic refresh: the changes will be “above the fold.”

    This means that while the previous two Penguin updates affected the entire first page of the SERPs, Penguin 3 made observable changes within the top 5 search results.

    Should you be worried?

    If your site has not been severely affected by the previous two Penguin releases, you shouldn’t have any worries. You have probably been doing SEO right as far as Google’s recent updates are concerned.

    If you’ve been hit by the previous releases but you made changes to your link-building activities in accordance with current SEO best practices, you should be fine.

    Just remember that the key to surviving any update Google throws at you is to make sure your site promotes high-quality, relevant, and extremely useful content; maintains natural link-building practices; and stays informed of developments in the search engine optimization industry.

    In other words, stay on top of things!


    I hope this post has provided you with useful information on the Google Penguin 3 update. If you need help making sure your site stays compliant with current SEO standards, contact us. We would love to chat with you about how we can help.

  7. The Penguin Rises Again


    From the depths of Google’s secret society, the Penguin has just risen yet again. When the Penguin was originally released, Google told us that it was not a one-time thing. They told us that it would evolve into being part of their ongoing algorithm and would refresh now and then.

    What is the Penguin Update?

    You probably already know all about the zoo Google has been growing… pandas, penguins and more. But just to make sure we don’t leave anyone behind, the Penguin update originally came about back in April. Matt Cutts first presented it as an update that would penalize ‘over optimization’. This was confusing to a lot of people.

    Later he clarified and said that was a bad way to describe it. The Penguin update was meant to penalize sites that use webspam and “blackhat” SEO to manipulate search results. When it hit, the internet buzzed with confused site owners. They were ticked off and didn’t know what to do.

    Since this was an algorithm change and not something where Google was manually reviewing sites, site owners were not given the opportunity to submit reconsideration requests. Instead, if they felt their site had been wrongly penalized they could either talk about it in the Google webmaster forum or use this form.

    What’s worse is that no one really knew what the ‘guidelines’ were, or what specifically about their sites they should be concerned about. And because this was an algorithm change that would be refreshed at some point (no one knew when), you couldn’t make changes on your site and immediately see how they affected your rankings in order to make Google happy.

    Instead, you were forced to guess: make changes and hope for the best when the refresh eventually came around, which, as we now see, took about six months after the original Penguin. That’s a long time to wait for recovery for a small business.

    The New Penguin Refresh

    The Penguin refresh has hit and now site owners have a little better idea of what the Penguin wants to see on a site. Apparently, it could be both on-site and off-site factors.

    Some people claim that they’ve done nothing about existing backlinks, but instead focused on the site itself. They’ve had some success.

    On the other hand, Marie Haynes believes backlinks are solely to blame in the case of a small business website whose owner she has been advising. She talks about it here. The site had tons of bad backlinks that an ‘SEO company’ had convinced the owner he needed, and she believes his only solution is to start fresh with a new site (which many have suggested is the best way to recover from Penguin). That’s very sad and frustrating if it proves to be true.

    Speaking of Bad Backlinks…

    What about negative SEO? What if he hadn’t hired the SEO company to build all those links, but a competitor did it to tank his site? (Referring to Marie’s post)

    This is the main reason that Google is considering adding the ability to disavow backlinks within Google Webmaster Tools. Bing already offers this ability. However, there are potential downsides, and even a disavow feature could be manipulated, as discussed here.

    If there were a way to remove the downsides, would disavowing even be worthwhile? Google already devalues a good chunk of bad backlinks, so having them removed may not make any difference if they’ve already been discounted. There are a few reports of removing bad backlinks and seeing a recovery from Penguin, but far more say they’ve seen no change regardless of the links they removed.

    Mass Confusion & Conclusion

    While all the what-ifs surrounding the Penguin update and refreshes are confusing enough, there’s even more. Many site owners are in a state of absolute hysteria. Within one week, we’ve seen a Penguin refresh, a Panda update, and the EMD update… so the very first step to recovery is figuring out just what you got hit with. Was it one of these updates, or was it a site-specific penalty? If one of the mass updates, which one and why?

    If you’ve suffered from a drop in search rankings, we can help. Contact us and we’ll get you on the path to recovery.

    Photo via Cnystrom @ Flickr

  8. 6 On-Page Optimization Best Practices For the Post-Penguin SEO World


    It’s still all about Penguin, isn’t it?

    Yes, but that’s because I’d like to arm you with as much information as possible, so instead of battling Pandas and Penguins, you can cuddle with these cute animals; after all, I don’t know about you, but I’d rather make love than war.

    As you may already know, Google Penguin is just getting started. Additionally, we’re in the midst of a new era of SEO where traditional SEO is becoming more seamlessly intertwined with social media.

    And who knows how much more of Google Penguin we’ll see in coming days, weeks, or months.

    We’ve covered the basics as far as recovering from Google Penguin is concerned. We know that these days, more than ever, the only legitimate way to attain rankings is to provide quality and relevant content to users, in order to obtain links naturally.

    No more tricks, says the Penguin.

    In this post, I’ll share with you six on-page optimization best practices that conform to Google Penguin’s guidelines. I’ll focus on key optimization considerations that will help you create a more reputable image for your site both in the eyes of your audience and of the search engines. Let’s get started.

    Keyword density (keyword what?)

    Not long ago, SEOs were concerned about keyword density, or the number of keyword occurrences as a ratio of the overall number of words on the page. The acceptable keyword density used to be somewhere between 2% and 4%, which meant that for every 100 words, a specific keyword (note that I use “keyword” interchangeably with “keyphrase”) should occur two to four times.
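    As a rough illustration of that ratio (a back-of-the-envelope check, not anything a search engine actually computes), here is one way you might calculate keyword density in Python. The sample copy and keyword are made up for the example:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return keyword occurrences as a percentage of total words.

    A multi-word keyphrase counts as one occurrence each time it
    appears; the word count is a simple whitespace split, so treat
    the result as a rough estimate only.
    """
    words = text.split()
    if not words:
        return 0.0
    occurrences = len(re.findall(re.escape(keyword), text, flags=re.IGNORECASE))
    return 100.0 * occurrences / len(words)

copy = "Best gadgets of 2012: our gadgets guide ranks gadgets by price."
print(round(keyword_density(copy, "gadgets"), 1))  # 27.3 -- far too dense
```

    Run something like this against your own page copy; anything far above the old 2% to 4% range suggests the text reads as stuffed.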

    Today, however, keyword density is no longer a ranking factor (was it ever?). Search technology has evolved tremendously over the years to recognize the relevance of content to a topic.

    SEOs now advocate the use of Latent Semantic Indexing (LSI). Simply put, LSI refers to the use of terms related to the content’s target keywords. So if you’re gunning for the keyword “consumer electronics”, you can use that exact keyword within the first few sentences of the content and then use related terms such as “gadgets”, “electronics”, etc. throughout the rest of the content.

    Tip: You can find LSI terms for any given keyword by using Google’s Keyword Tool. Alternatively, you can perform a search for your keyword in Google and then scroll to the bottom of the first page, where you’ll see some other suggested search terms. These suggested search terms are LSI terms for the keyword you queried.

    While I still recommend using your target keywords exactly within the post title, meta description tags, and in the first and last paragraph of your text body copy, I highly recommend using as many varying LSI terms within the content’s body as you can muster.

    However, if your target keyword is tricky to use in a grammatically correct way, such as “Roofing L.A.”, then don’t force the issue; just settle on using each word within the keyphrase as closely together as possible.

    Don’t forget internal linking

    Internal linking is still an important aspect of on-page optimization. There are several key benefits to internal linking:

    • Reduces bounce rate, as it promotes relevant internal content to your audience
    • Helps search engines determine the importance and relevance of your pages within your domain
    • Helps Google and other search engine spiders crawl and index your pages more easily and effectively
    • Helps users easily find their way around your site, contributing to a more positive overall user experience and better time-on-site metrics
    • Allows you to control anchor text to each individual page, helping search engines understand what keywords you believe the destination page is relevant for

    Generally, websites with good internal linking strategies rank better in search results.
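    If you want a rough picture of your own internal linking, a short script can tally internal versus external links on a page. This is a minimal sketch using only Python’s standard library; the sample markup and domain are hypothetical:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkAudit(HTMLParser):
    """Collect href values from <a> tags and split them into
    internal and external links for a given domain."""

    def __init__(self, site_domain: str):
        super().__init__()
        self.site_domain = site_domain
        self.internal, self.external = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        host = urlparse(href).netloc
        # Relative URLs ("/about") and same-host URLs count as internal.
        if not host or host == self.site_domain:
            self.internal.append(href)
        else:
            self.external.append(href)

page = (
    '<a href="/blog/related-post">Related post</a>'
    '<a href="http://example.com/sitemap">Sitemap</a>'
    '<a href="http://other.org/source">Outside source</a>'
)
audit = LinkAudit("example.com")
audit.feed(page)
print(len(audit.internal), len(audit.external))  # 2 1
```

    A page with plenty of content but only a handful of internal links is usually a missed opportunity.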

    Link to relevant information outside your site

    Whenever possible, link to sites that offer relevant information to your content. You may have already noticed that I’ve done so in this very post.

    This makes your link structure more natural and it provides value to your audience. Don’t worry about linking to your competitors occasionally, either. Linking to related websites helps Google understand what circle of relevance your website falls under. Plus, giving props to a competitor with a link shows a lot of confidence in your product, and can speak volumes about your business.

    Keep it fresh and useful

    Google Panda and Penguin take into account the freshness of content. That’s why setting up a blog for your site is so crucial these days. With a blog, you can post new and useful information as often as you want. This helps in many ways:

    • Supplies new content to your existing audience, keeping your brand top-of-mind (and thus, makes your audience more likely to convert)
    • Helps grow your audience by drawing in new readers
    • Establishes niche authority/credibility
    • Increases traffic via social channels (due to shares, mentions, tweets, etc.)
    • Increases organic search traffic because it adds more content that can be turned up in the search results
    • Gets you more opportunities to receive natural inbound links when other authors reference your existing content

    Ideally, you should update your blog at least once per business day.

    You also want to post information that is extremely useful and relevant to your audience. How-to posts and posts on trending topics are preferred by most readers. If you constantly post useful information you will give your audience plenty of reasons to visit your site regularly. Lame content that nobody cares about won’t help you at all; if it doesn’t provide some sort of value to your readers, don’t even bother posting it.

    Be original

    Remember how sites with duplicate content were killed early in 2011? Google’s stern stance against duplicate content still stands.

    Sites with internal duplicate content are also at risk. If you’re not sure whether your site has internal duplicate content, you can use Google Webmaster Tools to check.
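    If you’d rather run a first pass yourself, one simple approach is to hash each page’s normalized text and group pages whose text is identical. This sketch (the URLs and copy are invented for the example) only catches exact duplicates; real duplicate detection is fuzzier than this:

```python
import hashlib
from collections import defaultdict

def duplicate_groups(pages: dict) -> list:
    """Group page URLs whose normalized body text is identical.

    Normalization here is just case-folding and whitespace collapsing,
    so near-duplicates with reworded sentences will not be caught.
    """
    by_hash = defaultdict(list)
    for url, text in pages.items():
        normalized = " ".join(text.lower().split())
        digest = hashlib.sha1(normalized.encode("utf-8")).hexdigest()
        by_hash[digest].append(url)
    return [urls for urls in by_hash.values() if len(urls) > 1]

pages = {
    "/widgets": "Our blue widgets are the best.",
    "/widgets?ref=nav": "Our blue   widgets are the BEST.",
    "/about": "We have been in business since 1999.",
}
print(duplicate_groups(pages))  # [['/widgets', '/widgets?ref=nav']]
```

    Duplicate groups like the one above often come from URL parameters or print-friendly versions of the same page.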

    Keep ads to a minimum

    For many users, ads are simply annoying. But from a search engine perspective, peppering a site with ads can actually hurt your rankings.

    But how much is too much?

    Keep ads to a maximum of two per page, and be especially careful about placing them above the fold. And if you are going to serve ads within your pages, serve only those that are extremely relevant and valuable to your users.


    There you have it, quick and easy tips for proper on-page optimization. If you have questions or if you need help with your on-page optimization initiative, contact us and we’ll be happy to offer a free consultation.


  9. When Will I Recover My Rankings after Google Penguin?


    The dust has now settled after Google’s Penguin update, offering us a clearer view of the damage sustained by affected sites. We can now clearly see the multi-faceted effects of, and reasons for, the update.

    If your website was affected, the questions you’re probably asking are, “When will I recover my rankings?” and “What do I need to do to recover my rankings?”

    Algorithmic Penalties vs. Manual Penalties

    The good news is that Google Penguin is algorithmic, and algorithmic penalties are not permanent. Take a look at the video below, in which Matt Cutts discusses algorithmic penalties and how they work.

    At the 0:46 mark, Matt Cutts says:

    So, if your site is affected by an algorithm, for the most part, if you change your site, whatever the characteristics are that’s flagging, triggering, or causing us to think you might have keyword stuffing, or whatever, if you change your site, then after we’ve re-crawled and re-indexed the page, and some period after that when we’ve re-processed that in our algorithms, for the most part your site should be able to pop back up or increase in its rankings.

    Starting at the 1:10 mark, Matt Cutts discusses manual penalties:

    Now, on the manual side, as far as I can think of, the vast majority of the time, what we try to do is we try to have, essentially a time-out. So, if it’s hidden text, you might have a penalty for having hidden text, and then after, say, 30 days, that would expire. And then if you’re doing something more severe, if you’re doing some cloaking or some really malicious stuff, that will last for a longer period of time, but eventually that will also expire. So we try to write things such that if you improve your site, if it’s affected by an algorithm, or even if you’ve done something within your site, eventually that would normally time out.

    If you received an unnatural link warning from Google, you may have a manual penalty. Here’s an excerpt of Matt Cutts’ interview during SMX Advanced on June 5th, 2012, in which he discusses the unnatural link warnings that were sent out to webmasters:

    Danny Sullivan: If you receive a warning for unnatural links, do you submit a reconsideration request?

    Matt Cutts: Yes, because it was a manual penalty.

    Matt Cutts: We want to see a real effort in that you remove those links. We want to see effort. We look at a random sample to see if those links are removed or not. If you remove 90% or so, you are in better shape. We understand it is difficult, and we are talking to the Webmaster Tools team about adding a disavow link feature.

    Danny Sullivan: If you were hit by Penguin and Panda, should you just give up?

    Matt Cutts: Sometimes, but both are algorithmic and if you change the site and your signals, then you can come back.

    The Road to Recovery

    If your site was hit, there are a number of activities you can engage in to help unwind the effects of Penguin. But before you begin, you first need to know why Google Penguin hit you. The two core activities that violate Google Penguin’s guidelines are:

    • Unnatural linking (onsite and offsite)
    • Keyword stuffing (over-use of exact-match keywords in your onsite copy)

    Affected site owners who were quick to identify these problems and implement Penguin-compliant changes have seen recovery times vary widely. Some recovered within a month; others are still on the road to recovery, with no end in sight.

    The answer to our question on how long it will take to recover rankings post-Penguin depends on the following factors:

    • How frequently Google crawls your site
    • The level of access you have to your site
    • How quickly you can identify and fix the problems

    Webmasters with sites that have years of manipulated linking relationships with other sites may find themselves entangled in an especially tricky mess. The only options are to either clean up or start over from scratch (ouch!). The problem is that on April 24th (the day Penguin was released), Google abruptly “changed its mind” on over a decade of previously established best practices. Literally overnight, anchor text became a dangerous weapon rather than a strategic tool for savvy SEOs. This overnight shift in policy left millions of webmasters in the dust; websites that had long held top rankings for competitive keywords saw their rankings fall into oblivion, wiping out traffic and sales. The longer webmasters had been engaging in manipulative linking practices, the more severely their sites were hit, and the more difficult the penalty is to undo.

    But if your site is relatively new and you got hit, you may be fortunate, depending on how far you are into your link building strategy.

    Let’s take a look at the factors I mentioned above that could determine how long before you see your site or pages return to their previous rankings.

    How frequently Google crawls your site

    Google’s crawl rate is an algorithmic process, meaning it’s not determined by any individual at Google. A lot of factors are at play to alert crawlers on how often they should visit a site. These factors include, but are not limited to:

    • Number of parameters in a URL
    • Number, source, and recency of links to a page
    • Site’s PageRank
    • Frequency of content updates
    • Date of last page update

    However, you can “train” crawlers on how often your site should be visited. Google crawlers frequent sites that are updated with fresh content. News sites, for example, get visited more often than other sites, while sites that offer real-time information, such as live-score sites for sports broadcasts, are crawled almost constantly.

    If your site is set up in Google Webmaster Tools, take a look at the Crawl Stats section for a good picture of how often Google visits your site. If you’ve kept your site updated with fresh content daily for at least several months, there’s a good chance that Google visits your site on a daily basis.

    The level of access you have to your site

    Your role on your site plays a vital part in its road to recovery after Google Penguin. If your site is a blog and you personally take care of on-site optimization, you can easily clean things up. But then again, that depends on the number of pages and posts your blog has.

    For off-site linking activities, you may need to check your Google Analytics (if set up for your site), Google Webmaster Tools inbound links, or a 3rd party link data provider such as Majestic SEO or Open Site Explorer to see which sites link to you.

    Identify websites linking to you with exact-match anchor text and reach out to the webmaster, asking to remove or change the link. Here’s the exact email template I have developed, which works well for this purpose:

    Subject line: Link Removal Request



    My name is Jayson, and I represent [your website URL]. I wanted to thank you for linking to our site from [linking page URL]. However, it has come to our attention that this link may have been acquired against Google’s Webmaster Guidelines. It is important for us to bring our site into compliance. Additionally, the link points to a website which Google has penalized, which could cause harm to your website’s rankings. Could you please remove our link from this page and any other page on your site?

    Thank You,


    If you’re operating a large website with many pages, hire an SEO professional to do a total site audit for you. Also conduct a backlink profile audit to identify any external links that may be bringing you down.

    How quickly you can identify and fix the problems

    The sooner you can identify and fix problems that may be affecting your rankings, the sooner your recovery will begin. It’s currently unknown whether websites affected by Google Penguin will need to wait until the next Penguin refresh in order to recover their rankings. Unfortunately, if that’s the case, then it may be a while until you recover, because Google has only pushed out two known Penguin updates: the original (on April 24th) and Penguin 1.1 (on May 25th).

    As soon as it becomes clearer whether Penguin recovery can happen between Penguin refreshes, I’ll update this blog post. My intuition says it can, but the jury’s still out for now.

    Obviously, the time it takes to recover from Google Penguin is not set in stone. Your rankings could be back in a matter of weeks or months, depending on the level of commitment and work you put in.


    I hope you find this post useful in understanding how long it should take for your site’s rankings to recover if you got hit by Penguin, as well as what steps you can do to speed up recovery.

    If you need help in making your site Google Penguin compliant, please leave a comment or contact us to set up a consultation.

  10. Google Penguin: 5 Recovery Facts You Need to Know


    Now that Google Penguin has had some time to sink in, we have an opportunity to reflect on what exactly Penguin changed, the aftermath of Penguin, and (most importantly) what steps should be taken to recover from a Penguin penalty.

    As I previously wrote, Penguin targeted inbound link profiles. While Panda 3.3 and 3.4 devalued certain elements of anchor text (most notably, exact-match anchor text), Penguin actually slapped a penalty on it. With access to client data as well as dozens of folks who’ve reached out and asked for link profile audits, I’ve seen some specific trends that are undeniable. In this post, I’ll discuss the trends that I see in every Penguin-affected link profile, while pointing out supporting evidence from other SEO gurus and webmasters that have noted similar trends. My goal is to definitively outline what Google Penguin changed, and what you (SEOs and webmasters) should do about it.

    Fact #1: Reconsideration Requests Won’t Help You

    Penguin is an algorithmic penalty, and Matt Cutts has stated that reconsideration requests won’t help with algorithmic penalties. Matt Cutts explains the difference between algorithmic and manual penalties in the video below. Right around the 2:00 mark, he explains that reconsideration requests won’t do you any good if you have an algorithmic penalty.

    Search Engine Land confirms this as well, here:

    However, Google says this is an algorithmic change — i.e., it’s a penalty that’s applied automatically, rather than a human at Google spotting some spam and applying what’s called a manual penalty. Because of that, Google said that reconsideration requests won’t help with Penguin.

    Fact #2: Your Inbound Link Profile is Probably What’s Hurting You

    Assuming you’re not engaging in some obviously shady onsite keyword stuffing, your inbound link profile is what caused your rankings to drop if your site took a dive on or around April 24th. Multiple sources have backed this up, including Search Engine Watch in this article:

    The Penguin algo seems to be looking at three major factors:

    • If the majority of a website’s backlinks are low quality or spammy looking (e.g., sponsored links, links in the footers, links from directories, links from link exchange pages, links from low quality blog networks).
    • If the majority of a website’s backlinks are from unrelated websites.
    • If too many links are pointing back to a website with exact-match keywords in the anchor text.

    Google changed the way its algorithm calculates value based on inbound links. With the rollout of Panda 3.3, anchor text was severely devalued. Penguin then added a penalty for over-optimized inbound anchor text. Google did this to make it harder for SEOs to get their clients ranked well in the search engines, in the hopes that those clients would turn to Google AdWords instead.

    Fact #3: Removing Bad Links Will Help

    Since Penguin is an algorithmic penalty, and the spam flag in the algorithm is related to an unnatural link profile, it makes sense to remove the links that could be causing this algorithmic flag. In the video above, at the 0:46 mark, Matt Cutts says:

    “So, if your site is affected by an algorithm, for the most part, if you change your site, whatever the characteristics are that’s flagging, triggering, or causing us to think you might have keyword stuffing, or whatever, if you change your site, then after we’ve re-crawled and re-indexed the page, and some period after that when we’ve re-processed that in our algorithms, for the most part your site should be able to pop back up or increase in its rankings.”

    So, I think it’s safe to assume that removing bad links will help.

    Note: At AudienceBloom, we offer inbound link profile audits. We can comb through your inbound link profile, diagnose why your website fell in the rankings, suggest specific links to remove, and outline a plan of action moving forward. Call or Contact Us for a quote!

    Fact #4: Diluting Your Inbound Link Profile with New Links Will Help

    Multiple case studies have been published (and I have verified them against my own clients’ data) concluding that the primary problem plaguing sites affected by Penguin is over-optimized anchor text for their primary target keywords. One of the most popular studies published was the one from MicroSite Masters here, stating:

    What anchor text should you be using? From the data we’ve evaluated, “”, “MySiteDomain”, “”, “”, “The Title of My Website”, “here”, and “the title of one of my H1′s (that isn’t a keyword I’m trying to rank for)”, were generally used as anchors on sites that were not affected by the most recent Google update and are probably a good starting point to consider using moving forward.

    If you’ve read my other blogs, you’ll see that on March 9th (nearly 2 months before the MicroSite Masters case study) I published this article stating:

    Another element of a natural-looking link profile is what’s called “junk” anchor text, LSI anchor text, and naked URLs. These are anchors that say “click here”, “here”, “Website”, “”, “”, etc.

    SEONitro published another case study concluding the following:

    [Anchor text] is probably the biggest post-Panda/Penguin disqualifier, as most of the sites we researched did not diversify their link anchor density and were hit hard this go-around with the “exact match” dial-down. As we go up to Case Study #2 we find that our affected sites have VERY LITTLE brand incoming links.

    Jonathan Leger published a case study concluding that exact-match anchor text for sites that are ranking well post-Penguin is about 10% on average:

    It probably won’t come as much of a surprise for me to tell you that the average EMA for a site is pretty low — just 10% across all of the markets.
    Also, if you’re not using EMDs, it’s important to diversify your anchor text a lot. How much is “a lot” really depends on your market. So do the research. Check out the link profiles of other ranking sites in your market and see what their anchor text looks like.

    Another source published a lengthy observation packed with valuable insight as well:

    Another option is that you can simply try to “dilute” the anchor text optimization by adding more links to that page with very diverse anchor text. This is the more likely option for most, and it is what most are doing because it’s easier and cheaper (most of the time).

    Another site published an article with observations and tips for recovering from Penguin:

    One of the most notable updates in Penguin was an “Over Optimization” algorithm. This update specifically targeted Keyword Anchor Text as it relates to the underlying link.

    What are some things that can help you recover from Penguin?

    1) Get new additional links with generic anchor text for your site
    2) Build no more than 50% of your backlinks with targeted anchor text

    Neil Patel wrote a great article explaining his findings over at QuickSprout:

    One of the unnatural link building signals that Panda 3.4 aims at is too many exact anchor text links. Standard practice used to be that you’d aim for about 30% to 50% matches…now that number has dropped drastically. So test the waters and start with 5% or so, then increase slowly.

    On June 8th, Search Engine Land published its latest thoughts on Penguin:

    If you think that you were hit by Penguin, I recommend building a few links to your site with diversified anchor text. Stay away from exact match anchor text.

    On June 21st, ArticleRanks sent out an email to its list with Penguin recovery advice:

    Start diversifying your anchors to the point that you are totally diluting the anchors you already have in place, then on future penguin refreshes the filter will be lifted from your site. We have already recovered a couple of sites like this.
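    The dilution advice above reduces to simple arithmetic: if E of your N inbound links use an over-optimized anchor, bringing that share down to a target fraction t requires adding at least E/t − N new links with diverse anchors. Here is a hedged Python sketch of that calculation; the 10% target mirrors the average Leger reported above, but your market may call for a different number:

    ```python
    import math

    def links_needed_to_dilute(exact_match: int, total: int, target_share: float) -> int:
        """New diverse-anchor links needed so the exact-match share <= target_share.

        Solves exact_match / (total + k) <= target_share for the smallest
        non-negative integer k.
        """
        if total <= 0 or not (0 < target_share < 1):
            raise ValueError("need total > 0 and 0 < target_share < 1")
        k = math.ceil(exact_match / target_share - total)
        return max(0, k)

    # Example: 60 of 100 links use exact-match anchors; aim for a 10% share.
    print(links_needed_to_dilute(60, 100, 0.10))  # 500
    ```

    The example makes the scale of the problem concrete: a heavily over-optimized profile may need several times its current link count in diverse-anchor links before the exact-match share falls to a natural-looking level.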

    Time for a shameless plug! At AudienceBloom, we offer SEO link building packages aimed at diluting your inbound link profile. If you’re suffering from a post-Panda 3.3 or Penguin penalty, we can help!

    Fact #5: Take Action, Have Patience, and Everything Will be Just Fine

    SEO is a rapidly changing industry. Since its inception, SEOs have been doing their best to analyze and adapt to search engine algorithms. When a major update like Penguin comes along and tanks your website’s traffic, the urge to give up may be strong; it may feel like you’ve just lost everything you’ve worked for. But in reality, it’s not about how many times you get knocked down. It’s about how many times you get back up.

    But it’s not just about getting back up and holding your ground. Without taking action, your website won’t see any recovery, as we learned from Matt Cutts’ video above.


    So, what can you do right now to take action and recover from Google Penguin?

    1. Audit your link profile and diagnose why your website lost its rankings.

    2. Remove the bad links identified in the audit.

    3. Engage in a new SEO link building campaign to dilute your current inbound link profile with a healthy mix of diverse anchor text links.
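    Step 1, the link-profile audit, can begin with nothing more than a spreadsheet export of your backlinks’ anchor texts. Below is a minimal, illustrative Python sketch that tallies the anchor-text distribution and flags any anchor exceeding the roughly-10% share discussed earlier; the threshold and the sample profile are my own assumptions, not figures from any of the case studies quoted above:

    ```python
    from collections import Counter

    def anchor_report(anchors, flag_share=0.10):
        """Return (share_by_anchor, flagged) for a list of backlink anchor texts.

        Any anchor whose share of the profile exceeds flag_share is flagged
        as a candidate for dilution or removal.
        """
        counts = Counter(a.strip().lower() for a in anchors)
        total = sum(counts.values())
        shares = {anchor: n / total for anchor, n in counts.most_common()}
        flagged = [a for a, s in shares.items() if s > flag_share]
        return shares, flagged

    # Illustrative profile: heavily skewed toward one exact-match anchor.
    profile = ["buy widgets"] * 6 + ["MySiteDomain"] * 2 + ["click here", "here"]
    shares, flagged = anchor_report(profile)
    print(flagged)  # ['buy widgets', 'mysitedomain']
    ```

    Note that a share-based flag is deliberately crude: branded anchors can safely hold a large share of a natural profile, so flagged brand anchors usually warrant a manual look rather than removal.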

    I hope the information I’ve outlined here helps you recover from Google Penguin. If you found it helpful, leave a comment!



-The AudienceBloom Team