AudienceBloom


Tag Archive: google

  1. Why Did Google Update Search Quality Raters Guidelines?


    Google’s “Search Quality Raters” guidelines (henceforth shortened to SQR guidelines) are something of a holy document in the SEO community. Google is the dominant force in search, dictating the trends we optimize our sites for (and influencing any competitors who peek their heads into the search space), yet it doesn’t tell us the specifics of how its search algorithm works. Instead, we get hints and suggestions that indirectly tell us how to rank higher, but that mostly just keep us from relying on black hat tactics to rank.

    The Original SQR Guidelines

    There have been a handful of “leaked” versions of the SQR document, and one official abridged version released by Google, but it wasn’t until last year that Google released the full SQR guidelines, in all their 160-page glory. Search marketers, once they got over being intimidated by its length, delved into the document to see what new insights they could uncover about how Google interprets the authoritative strength and relevance of websites.

    [Image: Google Search Quality Rating Program (Source: Google)]

    The original document didn’t exactly revolutionize the search world, but it did bring up some important considerations and strategic takeaways we wouldn’t have gotten otherwise. Much of the document did, admittedly, tread old ground by covering things like the importance of high-quality content and how Google views relevance to search queries from a semantic angle. However, there were some noticeable new insights:

    • Google treats “Your Money or Your Life” (YMYL) pages, those that can affect a user’s finances, health, or safety, with more scrutiny than other pages.
    • Expertise, authoritativeness, and trustworthiness (EAT) are the three factors that Google uses to determine a site’s domain strength.
    • Content positioning and design come into play. Google evaluates content differently based on how it’s placed.
    • “Know” queries and “Know Simple” queries. These are designations for different types of queries based on how they can be answered: succinctly, or with more elaboration necessary.

    Now, it appears Google has made some major modifications to the SQR document.

    The Changes

    Only four months after the document was originally published, Google has made significant changes. You might think Google added even more content to the 160-page behemoth, but the document actually shrank, to 146 pages.

    The most important changes include:

    • A decreased emphasis on supplementary content. Supplementary content refers to any on-page content other than the “main” source of information. For example, if your Contact page has a few paragraphs of text explaining who you are and what you do, you might have supplementary content in the form of notes in the footer, or testimonials. Supplementary content can help or harm you, and it was a major point of emphasis in the previous version. Now that Google has downplayed it, it might be a sign that it’s not as important to your rank as it used to be.
    • An increased attention to local search, now called “Visit-in-Person.” Google spends more time talking about the importance of local ranks and how to achieve them. It has also adopted new terminology, “visit-in-person,” which may explain how it perceives these types of user queries. Rather than simply relegating these entries, which run on an algorithm separate from the national results, to a geographic sub-category, Google now treats them as a means of driving foot traffic. That makes sense, as most local searches happen on mobile and relate to some semi-immediate need.
    • Increased descriptions of YMYL and EAT concepts. I described both the YMYL and EAT concepts in the section above. The concepts themselves haven’t changed, but Google has increased its emphasis on them. This means the concepts may be becoming more important to your overall rank, or it may mean that there was some initial confusion surrounding them, and Google has worked to clarify those points.
    • More examples of mobile marketing in effect. It’s no surprise that Google is doing more to play up mobile, especially with another Mobilegeddon-style update in the works. Mobile is a topic that still confuses a lot of webmasters, but it’s becoming increasingly important as a way to reach modern audiences. Mobile isn’t going away anytime soon, so this is a vital area (and Google recognizes that).

    If you’re interested in a much, much more thorough analysis of the changes, there’s a great post about it here.

    Google’s Main Priorities

    By examining Google’s motivations, we can better understand where the search platform hopes to be in the next few years, and get a jumpstart on preparing our SEO strategies for the future. For starters, Google is extremely fixated on the mobile user experience. With an expanded section on mobile compliance and a new frame of reference for local searches, it’s clear that Google wants mobile searchers to have an integrated, interactive, and seamless experience finding websites. The YMYL and EAT systems of rating content quality and significance are standbys, but the fact that Google is actively expanding these concepts is evidence that they’ll be around for the long haul.

    It’s uncertain exactly how often Google will update their SQR guidelines document, or what other changes might be in store for our future as search marketers. Certainly, there may be major new additions for new technologies like apps and interactive content, but in the meantime, keep your focus on producing expert, authoritative, trustworthy content, and optimize your site for mobile user experiences.


  2. How Far Will Google’s “Right to Be Forgotten” Plan Go?


    Google has been under a ton of pressure, both in the European Union and in the United States, to do more for user privacy. One of the biggest changes pushed by the EU back in 2014, the “Right to Be Forgotten” act, basically mandated that Google remove any old or irrelevant links that individual users request for removal (pending approval). Now, it appears that the Right to Be Forgotten rules are growing larger and more complex, and Google is taking an active role in their development.

    What does this mean for Google? What effects will it have for the future of search?

    The Right to Be Forgotten Act

    The specifics of the Right to Be Forgotten Act are more complicated than my initial, brief explanation, and it’s important to understand them before I dig any deeper into more recent developments. The original guidelines in the EU give all EU citizens the right to request that a link be removed from Google’s massive web index. This functions similarly to Google’s DMCA requests, which allow copyright holders to request the removal of infringing material.

    [Image: right to be forgotten (Source: SearchEngineLand)]

    However, there is no guarantee that Google will honor this request if the link is deemed relevant to search user interests. The only links required to be removed are ones that are out of date or no longer relevant to the public interest, which is a painfully ambiguous term. As a quick example, let’s say when you were a kid, you started up a personal blog in which you complained about school and expressed your angst with reckless abandon. Somehow, you found yourself unable to take this site down, and now, 20 years later, that blog is still showing for anybody who searches your name. Since this information is old, and isn’t inherently tied to the public interest, it would seem fair for Google to remove the link from its index to reduce its visibility.

    If Google rejects the request, it can be appealed. If the link is removed, the content is still available online—it’s just significantly more difficult to find. It may also appear on another site, complicating the process. Right to Be Forgotten isn’t a perfect system, but it’s what we have.

    Various alternative forms of “right to be forgotten” are starting to emerge in the United States as well. For example, California recently passed a law known as the “eraser button,” which demands similar functionality (going a step further to demand the full removal of certain content from the web), and Illinois and New Jersey are working on similar laws. A federal version of the legislation is also underway.

    This falls in line with the general attitude of the times: a demand for greater responsibility from tech giants. Google is also under heavy scrutiny for alleged antitrust violations, originally in the EU exclusively, and now in the United States as well.

    The Latest Developments

    Back in 2014, Google was resistant to Right to Be Forgotten legislation, claiming it to be a form of censoring the Internet and a violation of user rights. Now, Google is inching closer to a more comprehensive application of those laws.

    Under the old policy, “forgotten” links would only be removed from versions of the search engine for other countries—Google.co.uk, for example. A user with the right incentive could simply access the United States’ version of Google to find the links that had been removed. Now, Google has implemented functionality to prevent this breach; all link removal requests are based on the origin of search, so any user in the UK will not be able to find forgotten links, no matter which version of the search engine they use. Speculation suggests that Google made this change under pressure from European authorities.

    Where Does It Go From Here?

    Frankly, the speculation rings true to me. I suspect that Google won’t make any moves to remove content from users’ reach until it is forced (or pressured) to by international government bodies. This move, while small, is a concession the search giant is willing to make in order to remain in good standing on the international scene.

    The company isn’t known for buckling in response to requests; for example, when Spain passed a law that would require the company to pay a kind of tax to writers of articles that showed up in Google News, Google responded by pulling Google News from Spain entirely.

    [Image: Google News (Source: Ars Technica)]

    With Google more or less tacitly accepting these new demands for indexation rules, does this mean that Google is liable to respond passively to such requests in the future? This remains to be seen. It depends on how much pressure is put on the company by international organizations, and how important the issue is to Google. For example, removing a link to a half-assed personal blog from 20 years ago doesn’t carry the same consequences as censoring information available to an entire country about that country’s government.

    My guess is that lawmakers in the United States and overseas will gradually introduce new, better regulations to encourage user privacy, and that Google, as long as these demands are reasonable, will comply. Overall, this will have a minimal effect on the way we use search engines, but it shows that we’re entering an era of greater responsibility and accountability for tech giants.


  3. How Google’s Candidate Cards Turned Into a Travesty


    Google is never short on ideas for improving its search system and breaking ground in new areas of user satisfaction. Sometimes these ideas are large and revolutionary, like embedding maps into local searches. Its Knowledge Graph, a system of direct information provision (forgoing the need to peruse traditional search entries), is one of the most robust and useful additions of recent years, and it keeps evolving in new, unique ways.

    Rather than solely providing word definitions, or numerical unit conversions, or even filmography information, the Knowledge Graph can provide unique insights on news and upcoming events. Take its entry on the Super Bowl, for example (keep in mind this was written just before the actual Super Bowl):

    [Image: Super Bowl keyword search results]

    Presumably, this entry will self-update as teams score throughout the evening, and in the next week, will instead offer retrospective analysis of what is currently a forthcoming event. As a user, this doesn’t leave much to be desired; I can even scroll down to find additional results.

    But a recent feature of the Google Knowledge Graph has made a much bigger impact, and reveals one of the biggest current flaws of the direct-information model. It’s all about Google’s portrayal of candidates in what has undoubtedly been one of the most eventful, peculiar election seasons of the past few decades.

    Candidate Cards

    Google’s politics-centric new feature, candidate cards, has begun the same way all its features begin: as an experiment. Accordingly, let’s refrain from judging this system too harshly.

    The idea was to give the American public more transparency on their leading presidential candidates—which sounds great in theory. Google’s idea was to give each significant candidate a designated position in their search results for certain queries. These “candidate” cards would appear in a carousel to potential voters, giving them a direct line of insight into the candidates’ actions and positions. This feature was rolled out as a trial for the recent “undercard” Republican debate, along with YouTube integration and live tracking via Google Trends.

    [Image: Google candidate cards, mobile view (Source: Google blog)]

    Here’s the issue: if you followed along with this experiment during the actual debate, you wouldn’t have seen multiple candidates’ positions. You would only have seen one, at least for the bulk of the time and for most queries.

    As SearchEngineLand’s Danny Sullivan noted in a blog post on the issue, the carousel of cards that appeared, for practically any search, only showed posts and positions by one candidate: Carly Fiorina.

    [Image: “gop debate” search results carousel]

    A handful of general searches like “gop debate” or even just “debate” returned the same carousel. Likewise for any undercard candidate-specific searches, such as “Huckabee” or “Santorum.” At first glance, you would assume that this is some type of error with Google’s system, that somehow these posts were “stuck” as the top results for any query that tapped into the feature. Could this mean that Google was unfairly favoring one candidate over the others?

    Google would later confirm that nothing was wrong with the feature. Each candidate had the same ability to upload information to this system; Fiorina was the only candidate who made use of the system, and therefore had substantial ground to gain.

    Main Candidate Cards

    Candidate cards for the main GOP candidates appeared not long after the undercard debate ended, including Donald Trump, who was absent from the “main” debate. Take a quick look at these and take note of anything peculiar that stands out:

    [Image: GOP debate candidate cards (Source: SearchEngineLand)]

    Look at the center post, which features a link to donate $5 to Marco Rubio’s campaign, and consider the nature of the query: “2016 Republican debate.” If you’re like me, this raises some questions about the card system and whether it goes “too far” for search results.

    Three Major Concerns

    I don’t care who you support, which party you belong to, or what you think about this election. For the purpose of this article, I’m assuming every candidate on both ends of the political spectrum is equally unqualified to lead the country, and so should you. Remove your biases and consider the following dilemmas this system presents, for any candidate in the running:

    1. Free Advertising. There are strict rules about political advertising, which go into exhaustive detail that I won’t attempt to reproduce here. It seems that Google’s card system can be taken advantage of as a free, unrestricted place to advertise, whether it’s through the request for campaign donations or an attack on another candidate.
    2. SEO as a Political Skill. Take Fiorina’s case; should she be rewarded with extra publicity because of what basically comes down to SEO skills? This seems strange at first, until you realize this is mostly the case anyway—you can bet each candidate has a dedicated contact responsible for making sure they rank highly for public searches (not to mention the presence and effects of social media marketing in political elections).
    3. Biased Media Control. Last, and perhaps most importantly, should Google be allowed to control the parameters through which we view candidate information? The possibility that one candidate’s cards could be filtered out is concerning; then again, it’s nothing entirely new—Google’s stranglehold on search results is currently being investigated as a violation of antitrust laws in Europe.

    What does the candidate card system say about Google? What does it mean for the political system? Is it a useful tool that needs refinement or a total travesty that should be scrapped? I’m not quite sure myself, but you can be sure this experiment didn’t quite go the way Google originally intended. Keep your eyes peeled for how this feature develops—it could have a massive impact on how this and even future elections pan out. In the meantime, you better hope your favorite candidate is as skilled at SEO as you are.


  4. Your Guide to Google’s New Stance on Unnatural Links


    Recently, Google quietly released an update to its link schemes/unnatural links document in Webmaster Tools. For something that happened so quietly, it generated significant noise across industry media outlets. So, what changes were made and what do SEO professionals, business owners and webmasters need to do differently as a result?

    Building Links to Manipulate PageRank


    Here’s what Google’s document now says about manipulating PageRank:

    “Any links intended to manipulate PageRank or a site’s ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines. This includes any behavior that manipulates links to your site or outgoing links from your site.”

    What does Google mean by “any links intended to manipulate PageRank”? According to Google, any links you (or someone on your behalf) create with the sole intention of improving your PageRank or Google rankings are considered unnatural.

    The quantity and quality of inbound links have always been a crucial part of how Google’s algorithm determines PageRank. However, this fact gave rise to manipulative link building schemes that created nothing but spam across the Web, something Google has been working feverishly to eliminate since it launched its original Penguin algorithm in April 2012.

    Now, Google is much better at differentiating true editorial (i.e., natural) links from manipulative (unnatural) ones. In fact, Google now penalizes websites in the search rankings that display an exceptionally manipulative link profile or history of links.

    What about Buying or Selling Links?


    Google says, “Buying or selling links that pass PageRank. This includes exchanging money for links, or posts that contain links; exchanging goods or services for links; or sending someone a “free” product in exchange for them writing about it and including a link.”

    If people found out that their favorite politician had in some way purchased a majority of his or her votes, how would they feel about it? When we purchase (or sell) links for a website, we are essentially doing the same thing.

    Google has made it clear that purchasing links violates its quality guidelines. Many companies continue to do so anyway, and some have suffered severe losses in search rankings and visibility as a result.

    Google is getting better at understanding which links are purchased in a wide variety of ways. They also have a team devoted to investigating web spam, including purchased links.

    Excessive Link Exchanges


    Google says, “Excessive link exchanges (“Link to me and I’ll link to you”) or partner pages exclusively for the sake of cross-linking.”

    A few years ago, it was common for webmasters to exchange links. The method worked, so it began to be abused at large scale, and Google responded by discounting such links. Now, Google has officially added the tactic to its examples of unnatural link building.

    Large-scale Article Marketing or Guest Posting Campaigns

    Google says, “Large-scale article marketing or guest posting campaigns with keyword-rich anchor text links.”

    This, in particular, has a lot of people wondering, “Can you still engage in guest posting as a way to get inbound links?” The answer depends on how you’re doing it.

    A few years ago, it was common for SEOs to engage in large-scale article marketing in an attempt to quickly get tons of inbound links. Many were using low-quality, often “spun” content (mixed and mashed, sometimes computer-generated nonsense) to reduce time and content production costs. The result was a surge in nonsensical articles being published around the Web for the sole purpose of creating inbound links. It was a true “throw spaghetti at the wall and see what sticks” approach to online marketing; some publications rejected these submissions, and others approved them without any editorial review. It was all a numbers game, with the hope that some of the content would get indexed and thus count for the link.

    Google responded by launching its Penguin and Panda algorithms to penalize businesses that were creating this mess; Penguin targeted websites with many inbound links that were obviously unnatural, while Panda targeted the publishers that published the content without any editorial review. As a result, most of the large-scale article marketing links became worthless.

    After people started to realize that large-scale article marketing campaigns were no longer working, they turned to guest posting as an alternative. Unfortunately, what many considered “guest posting” was simply an ugly reincarnation of article marketing; the only difference was the publishers and the extra steps of finding websites open to publishing guest contributions. Many continue to use low-quality content in mass quantities, and wonder why they still get penalized by Penguin.

    Does guest posting still work for building inbound links? Yes, but only if you publish high quality content on relevant, authoritative sites. High-quality guest posts are a popular and tremendously effective way to acquire editorial links for your site, and they have many other benefits as well. For more information on how to use guest posting as a safe, effective link building tactic, see my article “The Ultimate, Step-by-Step Guide to Building Your Business by Guest Blogging.”

    Automated Link Building Programs

    Google says, “Using automated programs or services to create links to your site.”

    A few years ago, during the same period that article marketing and spinning were all the rage, a market developed for programs and services that would automate the steps involved in these processes. These tools and services became popular because they were an easy way to get huge numbers of links to your site quickly. Most importantly, they worked. Unfortunately, they only accelerated the spread of the low-quality nonsense that pervaded the industry at the time.

    Google now hunts down sites that have these sorts of inbound links, denying them any benefit.

    Conclusion

    So, why is Google waging a war on unnatural links? For years, many SEOs effectively manipulated their rankings using the methods described above, along with others. However, the links and content that people created as a result provided no value to people, only clutter. They caused search results to display nonsensical, confusing content, which makes Google look bad to its users. Furthermore, they cost Google money, as its bots spend time scraping and indexing nonsense rather than good, quality content.

    Since April 2012, with the release of the Penguin algorithm, Google has been trying to keep low-quality content out of its index as well as the search results. Now, the company is becoming more transparent about its goals as it refines and clarifies its webmaster guidelines.

    Although these changes created quite a stir across the industry, it’s really just the same message that Google has been trying to convey for years. Create quality content that people want to read and share; the inbound links will come as a result, and you won’t need to worry about unnatural ones bringing down your website in the rankings.

    Want more information on link building? Head over to our comprehensive guide on link building here: SEO Link Building: The Ultimate Step-by-Step Guide


  5. How to Find LSI (Long-Tail) Keywords Once You’ve Identified Your Primary Keywords


    For many SEOs, keyword research is all about finding keywords with a high number of monthly searches and low competition. Some of the more advanced will move on to long-tail keywords or keyword phrases, or look to local keywords to help lower the competition and leap to the top of the search engine results page.

    These are all great strategies, but to truly show your skills as a keyword ninja, and find those untapped gold nuggets, you have to know how to identify long-tail, LSI keywords.

    What are LSI Keywords?

    If you were to search for a definition of LSI (latent semantic indexing) keywords, you would find answers all over the map. Most people will tell you that LSI keywords are simply synonyms for your keywords. The belief is that by finding similar terms for your primary keywords, you can make your content look a bit more natural while adding more possible search terms into the mix.

    However, this rudimentary explanation of the term doesn’t do enough to serve our purposes. If we want to master the LSI keyword, we have to get elbow-deep in what it means. I wrote an article specifically for that purpose: “Latent Semantic Indexing (LSI): What Is It, and Why Should I Care?”

    Wikipedia describes LSI as having the “ability to correlate semantically related terms that are latent in a collection of text,” a practice first put into use by Bellcore in the late 1980s. In short, it looks for words (keywords or key phrases, in our case) that have similar meanings, and for words that have more than one meaning.

    Take the term “professional trainer,” for example. This could mean a professional fitness trainer, a professional dog trainer, or even a professional corporate trainer. Thanks to LSI, the search engines can actually use the rest of the text in the surrounding content to make an educated guess as to which type of professional trainer is actually being discussed.

    If the rest of your content discusses Labrador Retrievers, collars, and treats, then the search engine will understand that the “professional trainer” being referenced is likely a dog trainer. As a result, the content will be more likely to appear in search results for dog trainers.

    Another case is where multiple synonymous terms exist in the same piece of content. Take the word “text,” for example. If this were a keyword for which you were trying to optimize your page, words like “content,” “wording,” and “vocabulary” would all likely appear within the content, because they are synonyms and/or closely related terms.
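
    Incidentally, you can experiment with the mechanics behind this yourself. Here’s a minimal sketch of latent semantic analysis in Python, assuming scikit-learn is available; the four toy documents are invented purely for illustration. It builds a TF-IDF term-document matrix, applies a truncated SVD (the core of LSI), and lists the terms whose latent vectors sit closest to “trainer”:

        # Minimal LSI/LSA sketch: TF-IDF term-document matrix + truncated SVD.
        # Requires scikit-learn; the toy corpus below is invented for illustration.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.decomposition import TruncatedSVD
        from sklearn.metrics.pairwise import cosine_similarity

        docs = [
            "professional dog trainer teaches labrador retrievers with treats",
            "dog training collars and treats for labrador retrievers",
            "professional fitness trainer builds gym workout plans",
            "corporate trainer runs workplace training workshops",
        ]

        tfidf = TfidfVectorizer()
        X = tfidf.fit_transform(docs)        # rows: documents, columns: terms
        svd = TruncatedSVD(n_components=2, random_state=0)
        svd.fit(X)
        term_vectors = svd.components_.T     # each row: one term in latent space

        terms = list(tfidf.get_feature_names_out())
        idx = terms.index("trainer")
        sims = cosine_similarity(term_vectors[idx:idx + 1], term_vectors)[0]

        # Terms whose latent vectors point the same way as "trainer"
        for term, score in sorted(zip(terms, sims), key=lambda p: -p[1])[:8]:
            print(f"{term}: {score:.2f}")

    The highest-scoring terms are the ones that co-occur with “trainer” across contexts, which is exactly the signal a search engine can use to disambiguate a query.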

    The benefits of LSI keywords

    The most obvious benefit to LSI keywords is that your keyword reach becomes broader by using synonyms. As I wrote in my article “The Rise of the Longtail Keyword for SEO,” “they are playing an increasingly essential role in SEO.”

    In addition to the broader reach, your content will rank higher in search engines because of the supporting effect of the LSI keywords. Repeating your keywords over and over throughout the text in an attempt to achieve the perfect keyword density (which, by the way, is a completely outdated SEO concept and tactic) makes the content read awfully funny, and the search engines are smart enough to detect that sort of manipulation, too. Using synonymous keywords helps make your content a richer experience for the reader, and more legitimate (and thus, higher-ranking) to search engines.

    Finally, LSI keywords help keep you competitive for your primary keywords in the right context. If you are optimizing for the term “professional dog trainer,” you’re less likely to be competing against the other types of professional trainers in search results.

    Great, how do I find LSI keywords?

    The search for LSI keywords starts with your primary keywords. They are the foundation of your SEO efforts, so if you haven’t identified them yet, go back and do that first. Once you have them, you can get started with LSI keywords. How do you find primary keywords? See my article, “The Definitive Guide to Using Google’s Keyword Planner Tool for Keyword Research.”

    Contrary to what you learned in high school, the thesaurus is not your first stop to find synonyms for your LSI keywords.

    The easiest way to find out which terms the search engines think are related to your keyword is to use the search engines themselves. Go over to Google and start typing your primary keyword into the search box. Note all of the suggestions provided, and you will have not only a list of related keywords, but a list of keywords that Google knows are related.

    Once you’ve made your list, hit Enter to perform a search for your keyword. Scroll to the bottom of the results page and look at the “Searches related to <your keyword>” section. This will also give you some good ideas for your LSI search terms.
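
    If you want to collect these suggestions in bulk, it’s easy to script. The sketch below queries Google’s autocomplete endpoint; note that suggestqueries.google.com is unofficial and undocumented, so its response format and availability are assumptions here:

        # Hedged sketch: pull Google autocomplete suggestions for a seed keyword.
        # The suggestqueries.google.com endpoint is unofficial and may change.
        import json
        import urllib.parse
        import urllib.request

        def google_suggestions(seed):
            url = ("https://suggestqueries.google.com/complete/search"
                   "?client=firefox&q=" + urllib.parse.quote(seed))
            with urllib.request.urlopen(url) as resp:
                payload = json.loads(resp.read().decode("utf-8", "replace"))
            return payload[1]  # expected shape: ["seed", [suggestion, ...]]

        for suggestion in google_suggestions("professional dog trainer"):
            print(suggestion)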

    Google’s Keyword Planner

    There have been a few changes, other than the name, when it comes to Google’s new Keyword Planner, but anyone familiar with the old Keyword Tool should be able to navigate through it with no problems.

    You can use it to find LSI keywords, and the process is simple. First, click on “Search for keyword and ad group ideas,” enter your primary keyword in the “Enter one or more of the following” box, and click the “Get ideas” button at the bottom. On the following page, click the “Keyword ideas” tab to see not only a list of recommended LSI keywords, but also their monthly searches, competition, and other metrics that can help you decide which ones to target.

    Paid keyword tools

    Like anything else in SEO, there are plenty of software packages and services you can buy that will help you in your search for LSI keywords. The downside to these is that you are paying for something that you can get for free. The upside is that the training and support that comes along with most of these purchases will help you learn how to find these keywords more easily.

    The secret operator

    Actually, this is no real secret, but if you place a tilde (the squiggly line ~) before your primary keyword in the Google search engine, it will provide you with the results for synonyms to your search term; for example, ~professional dog trainer.

    Reading over the titles and descriptions of the results, you’ll be able to find some good LSI keywords. If you want to leave a term out of the results, add that phrase to the query with a minus sign in front of it. For example: ~professional dog training -dog grooming.
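
    If you run operator queries like these often, a small helper can assemble the URLs for you. This is just a convenience sketch; the tilde operator’s behavior is entirely up to Google and has changed over time:

        # Build a Google search URL using the tilde (synonym) operator
        # plus an excluded phrase, per the query syntax described above.
        from urllib.parse import urlencode

        def synonym_search_url(keyword, exclude=None):
            query = "~" + keyword
            if exclude:
                query += ' -"' + exclude + '"'
            return "https://www.google.com/search?" + urlencode({"q": query})

        print(synonym_search_url("professional dog training", exclude="dog grooming"))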

    As with your primary keywords, you need to make sure you don’t overdo it when it comes to LSI keywords. A few closely related terms will be sufficient to help your SEO efforts. And as with your primary keywords, don’t try to insert LSI keywords into the text where they don’t fit.

    Remember, latent semantic indexing will only help you if you are writing good content for your readers. LSI keywords will give the search engines the information and evidence they need to understand what your content is saying and reward you accordingly.

    Want more information on content marketing? Head over to our comprehensive guide on content marketing here: The All-in-One Guide to Planning and Launching a Content Marketing Strategy.


  6. Penguin 2.0: What Happened, and How to Recover


    If you’ve spent any time recently in the world of SEO, you’ve probably heard about Penguin 2.0 — Google’s search engine algorithm change that launched on May 22nd, 2013. By the way some SEOs were talking, you’d think it was the Zombie Apocalypse. Whatever it is, you can be sure it will have a dramatic effect on the web landscape. Here are some important questions and answers about Penguin 2.0.

    What is Penguin 2.0?

    To understand the 2.0 of anything, you need to understand the 1.0. The original Penguin is the moniker for Google’s algorithm update of April 24, 2012. When Google tweaked the algorithm in a big way, 3.1% of all English-language queries were affected by the update. Penguin was carefully designed to penalize certain types of webspam. Here are some of the main factors that Penguin targeted:

    1.  Lots of exact-match anchor text (30% or more of a link profile; a quick way to estimate this share is sketched after this list)

    2.  Low quality site linkbacks, including directories and blogs

    3.  Keyword intensive anchors
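
    To put a rough number on that first factor, all you need is a list of your backlink anchor texts. Here’s a toy sketch; the backlink data is invented for illustration, and a real audit would pull it from a backlink tool:

        # Toy sketch: estimate the exact-match share of a link profile.
        backlinks = [
            "best dog trainer seattle",
            "best dog trainer seattle",
            "this helpful guide",
            "example.com",
            "best dog trainer seattle",
        ]
        target_keyword = "best dog trainer seattle"

        exact = sum(1 for anchor in backlinks if anchor.lower() == target_keyword)
        share = exact / len(backlinks)
        print(f"exact-match anchors: {share:.0%}")  # 60% here, well above the 30% line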

    The aftershocks of Penguin continued long after April 24. Several mini-Penguins have been released since then, which is why some SEOs prefer to call this change “Penguin 4.” The new Penguin is predicted to do the following:

    • Penalize paid links, especially those without “nofollow”
    • Penalize spamdexing more effectively
    • Penalize advertorial spam
    • Tighten penalties on link spamming/directory listings
    • Remove hacked sites from search engine results
    • Boost rankings for sites with proven authority within a niche

    How different is it from Penguin 1.0?

    Calling this Penguin 2.0 is slightly misleading. We shouldn’t think of algorithm changes in the same way we think of software updates — better features, faster architecture, whatever. Penguin is not a software update. It’s a change in the way that a search engine delivers results to users.

    Here is a brief explanation of search engines, and how they change. Search engines are designed to give people the most accurate, trustworthy, and relevant results for a specific search query. So, if you type in “how to cook lima beans,” the search engine attempts to find the very best site on the Internet to help you cook your lima beans. Obviously, every lima bean recipe site wants to have the top dog spot on the search engine results page (SERP).

    Some webmasters will cook up clever tricks to do so. Thus, a site with hordes of banner ads and affiliate links, and barely a word about cooking lima beans, could climb the rankings with a few black hat techniques. The search engine doesn’t want that. It wants people to have their lima bean recipe — great content — not just a bunch of ads.

    Thus, they change things deep within the algorithm to prevent those unscrupulous tricks from working. But the slithery lima bean site figures out a new way to slip by the algorithm. And the algorithm figures out another way to block them. And so on, and so forth.

    As all of this is happening, several key points emerge:

    1.  Search engine algorithms become more sophisticated and intelligent.

    2.  It becomes less likely for sites to game the system.

    At AudienceBloom, we follow white-hat SEO principles. We understand that there are a few tricks that we could use that might bump your site higher in the short term. However, we don’t engage in those practices. We want our clients to be successful for the long haul, which is why we engage in SEO techniques that are truly legitimate.

    What’s going to happen? 

    Now that Penguin 2.0 is rolling out, one of two things will happen to your site (as Google’s data centers propagate with the algorithm rollout and your rankings are adjusted accordingly):

    1. Nothing.

    2. Your rankings will drop, organic traffic will tank, and your site will begin to flounder.

    If, unfortunately, number 2 strikes, you may not realize it for a few days unless you run a big site with 10k+ daily visits, 30%+ of them organic. In order to answer “what’s going to happen” for your site, you need to understand whether or not your site is in violation of any Penguin 2.0 targets. That question deserves an entire article of its own, but here are a few warning signs that your site could be targeted by Penguin 2.0.

    • You’ve had unscrupulous link building efforts conducted on your site.
    • You purchased paid links from another site (e.g., advertorials).
    • You rely on spam-like search queries (for example, “pay day loans,” “cheap computers,” “free gambling site,” etc.).
    • You have aggressively pursued link networks or listings on unreliable directories.

    Each of the above four points is a common SEO tactic. Some SEOs have become sneakier than the algorithm, which is why Google is making these important changes.

    What should I do to prepare or recover?

    The most important thing you can do right now is follow Matt Cutts’ advice in his recent video:

    “If you’re doing high quality content whenever you’re doing SEO, this (the Penguin update) should not be a big surprise. You shouldn’t have to worry about a lot of changes. If you have been hanging out in a lot of blackhat forums, trading different types of spamming package tips and that sort of stuff, then this might be a more eventful summer for you.”

    Content is the most important thing, of course, but that’s more of a proactive preparation than an actual defense. Is there a way to actually defend yourself from the onslaught of Penguin 2.0? What if you’ve already been affected by it?

    One important thing you can do right now is analyze your site’s link profile to ensure that it is free of harmful links. Then, you should remove the bad links and file disavow requests for any that remain, to keep your site’s inbound link profile clean. This is the equivalent of major surgery on your site, and it could take a long time to recover. Here’s what Matt Cutts said about it on May 13:

    [Image: Matt Cutts on Penguin 2.0]

    Here are the steps you need to take to recover from Penguin 2.0:

    Step 1. Identify which inbound links are “unclean” or could be hurting your rankings (i.e., causing you to be affected by Penguin 2.0). To do this, you’ll need to perform an inbound link profile audit (or have us do that for you).

    Step 2. Perform major surgery on your site’s link profile to make it as clean as possible. This includes removing links identified in the link profile audit, and then disavowing them as well (a minimal disavow-file sketch follows these steps).

    Step 3. Build new inbound links using white-hat tactics like guest blogging, while abiding by proper anchor text rules with your new inbound links.

    Step 4. Establish a content calendar to keep pushing out high-quality content, engage in social media, and avoid spammy techniques of any kind.
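
    For reference, Google’s disavow file is plain text: one URL per line, a “domain:example.com” entry to disavow an entire host, and “#” for comments. Here’s a minimal sketch of how Step 2’s audit output might be turned into that format; the input file name and the domain-level cutoff are assumptions for illustration:

        # Sketch: build a disavow.txt from an audited list of bad backlink URLs.
        # "bad_links.csv" (one URL per row) and the cutoff of 3 are assumptions.
        import csv
        from urllib.parse import urlparse

        bad_urls = []
        with open("bad_links.csv", newline="") as f:
            for row in csv.reader(f):
                if row:
                    bad_urls.append(row[0])

        # Group by host: disavow the whole domain when several bad URLs share it.
        by_host = {}
        for url in bad_urls:
            by_host.setdefault(urlparse(url).netloc, []).append(url)

        with open("disavow.txt", "w") as out:
            out.write("# Links identified in the inbound link profile audit\n")
            for host, urls in sorted(by_host.items()):
                if len(urls) >= 3:   # arbitrary cutoff: treat the host as spammy
                    out.write(f"domain:{host}\n")
                else:
                    for url in urls:
                        out.write(url + "\n")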

    If you’re looking for SEO help, AudienceBloom is prepared to help. One of our major efforts in the wake of Penguin 1.0 was helping sites to recover their rankings and clean up from their past. If you’ve been hit by Penguin 2.0, now is the time to take action to recover your rankings and search traffic. Contact us for a complimentary assessment and action plan.


  7. How to Prepare for Penguin 2.0: Take Off that Black Hat!


    What do Penguins, Pandas, and black hats have in common? Lots! Penguin is the most recent algorithm update from Google designed to clean up abuses in the field of SEO, and a new version is due out soon, according to Google’s web spam czar, Matt Cutts. The impending event has marketers, reputation managers, and webmasters scurrying for cover.

    SEO – A Concept Recap

    SEO (search engine optimization) is a relatively young public relations field that tries to increase the visibility of websites through the strategic placement of keywords, content, and social media interaction, and the industry has grown rapidly in a little over a decade.

    Carried to extremes, as such things always are, black-hat SEO is a subdivision of the field that tries to achieve money-making results in an unsustainable way (i.e., against Google’s webmaster guidelines). It frustrates the very purpose of a search engine, which is to help users find the information they need. Instead, SEO gone amok serves only the needs of online marketers wishing to increase sales for themselves or their clients.

    To readjust the proper balance, Mr. Cutts and his team of “penguin” police have attempted to establish guidelines that will rule out the most abusive practices of black hat SEO.

    Black Hat SEO – Are You Doing It?

    The predecessor to Penguin was Panda, with much the same purpose. Panda included a series of algorithm updates, begun in early 2011. These were aimed at downgrading websites that did not provide positive user experiences.

    Panda updates of the algorithm were largely directed at website quality. The term “above the fold” is sometimes used to refer to the section of a website that a user sees before one begins to scroll down. The term comes from newspapers, which are delivered folded in two. The section that is “above the fold” is the section one sees before opening the paper, or unfolding it.

    Typically, marketers wish to cram as much eye-catching, commercial material as possible into this section, while responsible journalists wish to pack it with the most relevant and useful information.

    Penguin, on the other hand, is targeted more specifically at keyword stuffing and manipulative link building techniques.

    One targeted abuse, keyword stuffing, is not a tasty Thanksgiving delicacy, but the practice of loading the meta tag section of a site, and the site itself, with useless repetition of certain words. Sites can lose their ranking altogether as a result of such stuffing.

    Abusive practitioners of keyword stuffing are not above using keywords that are rendered invisible because their font color is identical with the background color. The user doesn’t see them, but the search engine spider does. This practice was soon discovered, however, and dealt with by the search engines.

    Meta tags are sometimes placed behind images, or in “alternative text” fields, so that the spiders pick them up while they remain invisible to users. Popular or profitable search keywords are sometimes included in ways invisible to humans but visible to the search crawlers. Very clever, but also soon discovered and dealt with. With Penguin, Google now analyzes the relevance and subject matter of a page much more effectively, without being tricked by keyword-stuffing schemes.
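
    Detecting the crudest version of the invisible-text trick is straightforward. The sketch below scans inline styles for text whose color matches its background color; it’s illustrative only, since real pages also set colors in external stylesheets:

        # Rough sketch: flag inline-styled elements whose text color equals the
        # background color (the classic "invisible keywords" trick).
        import re

        html = '<p style="color:#ffffff; background-color:#ffffff">cheap deals cheap deals</p>'

        for style in re.findall(r'style="([^"]*)"', html):
            declarations = dict(
                (prop.strip().lower(), value.strip().lower())
                for prop, value in
                (decl.split(":", 1) for decl in style.split(";") if ":" in decl)
            )
            color = declarations.get("color")
            if color and color == declarations.get("background-color"):
                print("possible hidden text, color =", color)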

    “Cloaking” is another tactic that was used for a while to present a different version of a site to the search engine’s crawler than to the user. While cloaking is legitimate when it tells the crawler about content embedded in a video or Flash component, it became abused as a black hat SEO technique, and it has largely been rendered obsolete by “progressive enhancement,” which tailors a site’s presentation to the capabilities of the user or crawler. Pornographic sites have often been “cloaked” in non-pornographic form as a way of avoiding being labeled as such.
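
    One crude way to spot user-agent-based cloaking is to fetch the same page as a browser and as a crawler and compare the responses. The sketch below illustrates the idea; it won’t catch IP-based cloaking (Google verifies crawlers by IP, and sophisticated cloakers check it too), and the URL is a placeholder:

        # Hedged sketch: fetch a page with two User-Agent headers and compare.
        # IP-based cloaking will not be caught this way.
        import urllib.request

        def fetch(url, user_agent):
            req = urllib.request.Request(url, headers={"User-Agent": user_agent})
            with urllib.request.urlopen(req) as resp:
                return resp.read()

        BROWSER = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
        CRAWLER = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                   "+http://www.google.com/bot.html)")

        url = "https://example.com/"  # placeholder URL
        if fetch(url, BROWSER) == fetch(url, CRAWLER):
            print("same content for both user agents")
        else:
            print("responses differ -- possible cloaking")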

    The first set of Penguin guidelines and algorithms went live in April 2012, and the second main wave is due out any day now (though Penguin has gone through several periodic updates since its initial release). It’s designed to combat an excessive use of exact-match anchor text. It will also be directed against links from sources of dubious quality and links that are seen as unnatural or manipulative.

    The trading or buying of links will be targeted as well. The value of links from directories and bookmarking sites will be further downgraded, as will links from content that’s thin or poor-quality. Basically, the revision in the algorithms will be designed to rule out content that serves the marketer’s ends rather than the users’.

    Advice For SEO Marketers To Stay Clean

    If you are a professional SEO, the questions to ask yourself are:

    • Is this keyword being added to serve the customer’s potential needs, or merely to increase the number of hits? If the latter, the additional users the keyword brings to the site probably aren’t high-quality conversion prospects.
    • Is the added SEO material being hidden from the user or the search engine crawler? If so, with what purpose? If that purpose amounts to dishonest marketing practices, the material runs the risk of getting you in trouble with Penguin.
    • What’s the overall purpose of your SEO strategy? If it’s anything other than increasing sales by enhancing user experience, then you may expect an unwelcome visit from Penguin.

    If you’re a user, you’ll very likely not be as conscious of these changes, except inasmuch as they will alter the look of your search results page when you perform a search in Google. Will the new Penguin algorithms cut down on those ubiquitous “sponsored links” or “featured links”? Probably not. But savvy users know how to ignore those links by now, except of course when they turn out to be useful.

    Will the new algorithms enhance the overall usefulness of the search engine experience? Probably, at least marginally, and perhaps even in a major way. The whole field of internet marketing and e-Commerce is changing so rapidly and radically that it’s hard to keep track of the terminology, especially the proliferation of acronyms. But the ultimate goal will be an enhanced user experience.


  8. 5 Ways to Outrank the Big Brands


    One persistent rumor about Google is that the search giant favors big brands—“authority websites” and reputed sites of long standing—when it comes to search engine rankings. Prior to Google’s Penguin algorithm update, it was easy for small websites to rank ahead of big brands by using spammy SEO tactics that were specifically targeted by Penguin. This not only caused the drop of countless small websites battling for top rankings, but also fueled the flames of suspicion that Google favors big brands.

    Many webmasters now suspect—and probably with some justification—that authority is everything in the world of search. Opinions are floating around that Google has bestowed its favor solely upon big brands and websites that have been around a long time. I see that in my line of work, too. Brand websites (or authority websites) often get away with posting what almost anyone would agree is “thin” content.

    Does this mean Google’s algorithm is skewed to favor big brands? While it isn’t possible to prove this point of view, several new websites have arisen and ranked well for highly competitive keywords where brand/authority websites already were operating. So, it can be done. But what’s the secret?

    The solution is to move beyond the general perception of unfairness and work toward becoming a brand yourself. And I would argue that whether Google favors the big guys or not really doesn’t matter. Here’s why.

    1. Identify Opportunities with Long-tail Keywords

    One of the easiest things to observe is that big brands—the seemingly invincible market leaders—rank for only a particular set of keywords. Not every keyword that relates to a niche can be targeted by the top brands, so there’s a considerable array of keywords—mostly long-tail—lying around for the taking. While ranking for core keywords is probably not going to be possible, that’s OK; long-tail keywords are your opportunity to shine, and they often yield better conversion rates, bounce rates, and time-on-site metrics than core terms.

    Regardless of how big a website is (in terms of brand value) and how long it’s been established, it can’t possibly account for every possible keyword combination in its content strategy. The authority websites are strong because they built a solid content strategy around a few keywords and stuck to it, with plenty of dollars and hours invested in that strategy.

    Key Takeaway: Figure out the keywords for which the big-brand websites don’t rank well, then take on the competition with those keywords. There are many ways to find long-tail keywords, but the most basic is to use Google’s AdWords Keyword Tool to perform keyword research.

    2. Authority Isn’t Everything; Content is King

    A key feature of Google’s algorithm is the concept of authority; Author Rank seems to be the mantra of every SEO effort. But, in truth, how many websites with correct author information have you observed still failing to secure the top spot? Some of these, you’ll notice, are from websites that are in the big-brand league.

    For instance, you won’t see Mashable or BuzzFeed ranking right at the top for every “tech-related” keyword. But honestly, they’ve got some really awesome content.

    Key Takeaway: The ranking algorithm is a summation of factors that consist largely of:

    • Inbound links at the page level
    • Social shares at the page level
    • Comments at the page level

    So, while big brands easily get inbound links, social shares, and comments at the page level, this isn’t an algorithmic favor from Google; it’s simply the result of big brands investing time and money into developing and nurturing their reader base.

    Counter this advantage by creating better content around the keywords for which you want to rank, properly optimizing that content from an on-site perspective, and strategically marketing it. Big brands may get links and social shares more easily, but great content will always win out over time.

    3. Social Signals Don’t Play Favorites; Use Them to Your Advantage

    This is the age of social signals. Google and Bing are actively seeking out social signals to evaluate the websites they rank, and this offers a huge benefit for new websites looking to compete against authority websites. What’s interesting is that the notion of authority itself is often deciphered through the kind of social shares and signals your website/page sends (besides the other, usual factors of Author Rank and links, of course).

    When it comes to social signals, brand/authority doesn’t really matter. If you provide value, you win. If you provide the most useful and unique content, you win.

    Key Takeaway: Encourage social shares and maintain your social media presence. It’s the perfect intersection between SEO and user engagement that offers you an opening to beat the big guys.

    4. Focus on People, Not Search Engines

    Pause a moment and think about this: What actually comprises an authority website? Most of the big brands have taken years to build a strong and credible following, a fan base, or active user-engagement. That boils down to just two things: 1) value and 2) people. When you provide value through your content strategies and your products, you attract people. As such, your focus should be on people.

    Key Takeaway: Treat SEO as a tool and not as the means to achieving the goals of your website/brand. The real means involves pushing value outward and enabling it to be shared across a wide platform—ideally, multiple platforms. This will draw in customers over the long run, and establish you as a brand and authority in your own right.

    5. If You Can’t Beat ‘em, Join ‘em

    If you’re already doing the things that are recommended by content strategists, user-engagement experts, and community managers, you’re already on your way to building your own brand. Why, then, should you worry about whatever bias Google might harbor with regard to the big brands?

    If the bias isn’t really as strong as we think, then there’s nothing to worry about. If it is a fact of life, then it may favor you over time as you build your own brand. Building a business requires long-term investments of time and money, and that’s often what it requires to outrank the big brands. But as you grow your business strategically and have patience, eventually you’ll be playing in that same ballpark with the big guys.

    In the age of “authority,” the challenge for new websites is that, by the time you get halfway toward becoming a recognized authority on Google’s radar, many of your competitors may have established a firmer brand, better positioning, and a stronger level of authority.

    That’s why you can’t waste time by competing with them on every keyword they rank for. Instead, build your brand, gradually, by targeting the areas the bigger brands and authority websites haven’t included in their net. From then on, it’s merely a matter of repeating the efforts, using social channels and strong content strategies, while expanding your territory.


  9. Google vs. Facebook: Who’s Got the Upper Hand?


    We’re lining up Google and Facebook against each other. It looks like the match of our lifetime.

    Facebook recently joined the ranks of high-profile tech companies that have gone public. While Facebook’s IPO failed to impress, the company is still a huge force to be reckoned with. Facebook is valued at nearly $105 billion, and it has a solid 900 million-plus members.

    With its membership nearing the 1 billion mark, Facebook appears poised to take on the whole world. Or is it? Is Facebook really invincible? Can it topple the current king of the tech industry — Google?

    Not likely . . . at least not in the foreseeable future.

    Some web experts believe that Facebook lacks the appeal of Twitter, which has been instrumental in breaking some of the biggest news stories in recent history.

    Some even argue that Facebook is going to end up like MySpace, which has entirely lost its appeal. You may recall that MySpace was once the king of the social media hill, after it stole the thunder from rival Friendster. Now MySpace has been reduced to nothing more than a virtual ghost town.

    Lately, we have seen a growing number of people who complain about Facebook selling its members out. Some have sought refuge in Google+, now viewed by many as a worthy successor to Facebook.

    Google is hell-bent on making social interactivity work on Google+. Although at one point the company denied that Google+ is a social network, for users it works, looks, and feels like one.

    Regardless, Google’s command of the online advertising space is staggering, and Facebook wants a piece of that. Who will be the last man standing in this battle?

    The cutthroat rivalry of social networks

    Facebook has maintained a solid reputation with its status as the largest social network in the world. On the other hand, Google has amassed a huge follower base that is increasingly drawn to Google+ for its new approach to social networking. Quite simply, Google+ offers great features that its users aren’t willing to give up.

    Google+ is now home to nearly half a billion members, of whom 100 million were active users as of September 2012. It took Google+ just over a year to grow its user base to that substantial number. By comparison, it took Facebook many years to grow its user base to 900 million.

    At this rate, Google+ could snatch the helm from Facebook in a matter of months.

    While users can do a lot of things on Facebook, such as share and recommend content, upload and watch videos, post and share photos, etc., Google+ has some huge advantages over Facebook.

    One advantage Google+ has over Facebook is its seamless integration of products and services that many of us had already been using fairly heavily, such as YouTube, Google Calendar, and Google Drive (formerly Google Docs).

    Since Google+ integrates many of the tools that members already use, it’s much easier to aggregate content and collaborate with other people.

    What I especially like about Google+ is Hangouts, a cool video-conferencing feature that allows as many as 10 people to chat via video all at once. Hangouts also lets users watch a YouTube video together, simultaneously. How cool is that? Very cool indeed.

    Right now, for those who desire some rest from the noise that’s plaguing Facebook, Google+ is an excellent alternative. Google+ is full of interesting images, videos, and content you won’t find on Facebook or other social networking sites.

    The advantages of Google’s product lineup

    Google did run into some trouble when it declared that user data would be shared across its products, but the company has done a great job of providing a seamless user experience.

    Whether you are using an Android tablet or an Android phone, you can tap into Google’s wide range of products seamlessly, from YouTube, GTalk, Calendar, Google Play, and Maps to a host of other product lines. You can also sync all your favorite Google apps to a single device.

    Personally, I don’t think I’d be able to do the things I do on a regular basis as productively without the apps from Google to which I’ve grown so accustomed.

    But that doesn’t mean I’m not active on Facebook. I still visit the competition to this day, but only when I want to get in touch with friends and keep myself in the loop about what’s going on with them.

    The competitive world of online advertising

    Facebook is still enjoying steady growth in its number of users, so the odds are good that it will surpass the one billion mark fairly soon. And after its recent IPO, the company has a pile of cash at its disposal to fund more innovation.

    Facebook’s got a great deal of resources that may enable it to beat Google in the mobile advertising arena, an area where Google is already playing well.

    With regard to advertising, Facebook could one-up Google in several key areas, such as display ads. However, Google has already gotten a good head start in mobile ads, forcing Facebook to struggle to make a decent foray into this area.

    Google is also keen on flexing its muscles to extend its might into social, an area where it has repeatedly failed in the past — with Google Wave and Google Buzz, for example.

    But with Google+, Google seems to be doing things right this time around.

    So Facebook is still the undisputed king of the social media hill, but the Google+ threat is real and looming. With Google’s vast resources in terms of cash gained from its extremely profitable advertising model, it could be only a matter of time before we see Google+ take the lead from Facebook.

    Conclusion

    What do you think? Who will be left standing, and receive the crown as the true social media giant in 2013 and beyond? Will we still see Facebook dominating the social space next year, or will Google+ finally stake its claim to unchallenged dominance of the Internet?

    We would love to know what you have to say about this. Post your comments below!


  10. Google Penguin Update 3 Unleashed


    We are seeing drastic changes in how search engines deliver information. Google in particular is striving to deliver not just the most relevant information, but the most useful and high-quality content.

    Google has released a series of algorithmic updates aimed at placing sites that promote high-quality content ahead of the pack in search results. The most notable of these updates are Google Panda and Google Penguin.

    On October 5, 2012, Google unleashed the latest version of its anti-webspam algorithm, Penguin. Widely known as Penguin 3, it was announced by Matt Cutts via Twitter as a data refresh affecting queries in several languages:

    “Weather report: Penguin data refresh coming today. 0.3% of English queries noticeably affected. Details: http://t.co/Esbi2ilX.”

    0.3% may sound like a tiny figure, but it’s a significant signal that should put everyone on alert: the Penguin update is here to stay, and it’s hell-bent on providing search results that are more relevant and more useful.

    A brief timeline of Google Penguin updates

    Among Google’s algorithmic updates, perhaps none was more striking than Google Penguin. Below is a brief timeline of the Penguin update:

    • April 24, 2012 – The first Google Penguin was released. Almost instantly, many sites saw an incredible drop in rankings.
    • May 26, 2012 – Penguin 1.1 was released, affecting less than 0.1% of English queries.
    • October 5, 2012 – Penguin 3 was announced and released, with about 0.3% of English queries affected by the latest refresh.

    How does Penguin 3 fit into the series of Penguin updates?

    Penguin is all about ridding the SERPs of spammy websites. Sites that were overly optimized, with low-quality content and links, were heavily penalized by the first two versions of Google Penguin.

    Penguin 3 has had the same effect on search queries as Penguins 1 and 2, but Cutts was explicit this time about the effect of the latest refresh.

    In his tweets, Cutts revealed the size of Penguin 3’s impact on queries in English and several other languages. Below are some of the data on how much Penguin 3 affected searches in various languages:

    • English – 0.3%
    • Spanish – 0.4%
    • Italian – 0.3%
    • French – 0.4%

    What’s noticeably different?

    SEOs and webmasters are probably wondering what’s different in the SERPs as a result of Penguin 3’s release.

    With the release, Matt Cutts gave us a clear idea of where we should look for the results of the latest algorithmic refresh: the changes will be “above the fold.”

    This means that while the previous two Penguin updates affected the entire first page of the SERPs, Penguin 3 made observable changes within the top 5 search results.

    Should you be worried?

    If your site has not been severely affected by the previous two Penguin releases, you shouldn’t have any worries. You have probably been doing SEO right as far as Google’s recent updates are concerned.

    If you’ve been hit by the previous releases but you made changes to your link-building activities in accordance with current SEO best practices, you should be fine.

    Just remember that the key to surviving any update Google throws at you is to make sure your site promotes high-quality, relevant, and extremely useful content and maintains natural link-building practices, and that you stay informed of all timely developments in the search engine optimization industry.

    In other words, stay on top of things!

    Conclusion

    I hope this post has provided you with useful information on the Google Penguin 3 update. If you need help making sure your site stays compliant with current SEO standards, contact us. We would love to chat with you about how we can help.

