
Tag Archive: google

  1. Your Guide to Google’s New Stance on Unnatural Links


    Recently, Google quietly released an update to its link schemes/unnatural links document in Webmaster Tools. For something that happened so quietly, it generated significant noise across industry media outlets. So, what changes were made, and what do SEO professionals, business owners, and webmasters need to do differently as a result?

    Building Links to Manipulate PageRank

    Here’s what Google’s document now says about manipulating PageRank:

    “Any links intended to manipulate PageRank or a site’s ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines. This includes any behavior that manipulates links to your site or outgoing links from your site.”

    What does Google mean by “any links intended to manipulate PageRank”? According to Google, any links you (or someone on your behalf) create with the sole intention of improving your PageRank or Google rankings are considered unnatural.

    The quantity and quality of inbound links have always been a crucial part of how Google’s algorithm determines PageRank. However, this fact spawned manipulative link-building schemes that created nothing but spam across the Web, something Google has been working feverishly to eliminate since it launched its original Penguin algorithm in April 2012.

    Now, Google is much better at differentiating true editorial (i.e., natural) links from manipulative (unnatural) ones. In fact, Google now penalizes websites in the search rankings that display an exceptionally manipulative link profile or history of links.

    What about Buying or Selling Links?

    Google says, “Buying or selling links that pass PageRank. This includes exchanging money for links, or posts that contain links; exchanging goods or services for links; or sending someone a “free” product in exchange for them writing about it and including a link.”

    If people found out that their favorite politician had in some way purchased a majority of his or her votes, how would they feel about it? When we purchase (or sell) links for a website, we are essentially doing the same thing.

    Google has made it clear that purchasing links violates its quality guidelines. However, many companies continue to do so, and some have lost significant search rankings and visibility as a result.

    Google is getting better at understanding which links are purchased in a wide variety of ways. They also have a team devoted to investigating web spam, including purchased links.

    Excessive Link Exchanges

    Google says, “Excessive link exchanges (“Link to me and I’ll link to you”) or partner pages exclusively for the sake of cross-linking.”

    A few years ago, it was common for webmasters to exchange links. The method worked, so it soon became abused at large scale; as a result, Google started discounting such links. Now, Google has officially added this to its examples of unnatural link-building tactics.

    Large-scale Article Marketing or Guest Posting Campaigns

    Google says, “Large-scale article marketing or guest posting campaigns with keyword-rich anchor text links.”

    This, in particular, has a lot of people wondering, “Can you still engage in guest posting as a way to get inbound links?” The answer depends on how you’re doing it.

    A few years ago, it was common for SEOs to engage in large-scale article marketing in an attempt to quickly get tons of inbound links. Many were using low-quality, often “spun” content (mixed and mashed, sometimes computer-generated nonsense) to reduce time and content production costs. The result was a surge in nonsensical articles being published around the Web for the sole purpose of creating inbound links. It was a true “throw spaghetti at the wall and see what sticks” approach to online marketing; some publications rejected these submissions, and others approved them without any editorial review. It was all a numbers game with the hopes that some of the content would get indexed, and thus, count for the link.
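
    To make “spinning” concrete, here’s a minimal sketch of how a typical spinner expands so-called spintax, the {option-a|option-b} syntax most spinning tools used; the sample sentence and function name are my own illustration:

        import random
        import re

        def spin(text):
            """Expand one random variant from spintax like '{Hello|Hi} {world|there}'."""
            pattern = re.compile(r"\{([^{}]*)\}")
            while True:
                match = pattern.search(text)
                if match is None:
                    return text
                choice = random.choice(match.group(1).split("|"))
                text = text[:match.start()] + choice + text[match.end():]

        # Each call emits one shallow variant of the same low-value sentence.
        print(spin("{Great|Top|Easy} tips for {building|getting} {links|backlinks} fast."))

    Multiply that by thousands of automated submissions and you get the flood of near-duplicate articles described above.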

    Google responded by launching its Penguin and Panda algorithms to penalize businesses that were creating this mess; Penguin targeted websites with many inbound links that were obviously unnatural, while Panda targeted the publishers that published the content without any editorial review. As a result, most of the large-scale article marketing links became worthless.

    After people started to realize that large-scale article marketing campaigns were no longer working, they turned to guest posting as an alternative. Unfortunately, what many considered “guest posting” was simply an ugly reincarnation of article marketing; the only difference was the publishers and the extra steps of finding websites open to publishing guest contributions. Many continue to use low-quality content in mass quantities, and wonder why they still get penalized by Penguin.

    Does guest posting still work for building inbound links? Yes, but only if you publish high-quality content on relevant, authoritative sites. High-quality guest posts are a popular and tremendously effective way to acquire editorial links for your site, and they have many other benefits as well. For more information on how to use guest posting as a safe, effective link building tactic, see my article “The Ultimate, Step-by-Step Guide to Building Your Business by Guest Blogging.”

    Automated Link Building Programs

    Google says, “Using automated programs or services to create links to your site.”

    A few years ago, during the same period that article marketing and spinning were all the rage, a market developed for programs and services that automated the steps involved in these processes. These tools and services became popular because they were an easy way to get huge numbers of links to your site quickly. Most importantly, they worked. Unfortunately, they only accelerated the spread of the low-quality nonsense that pervaded the industry at that time.

    Google now hunts down sites that have these sorts of inbound links, denying them any benefit.

    Conclusion

    So, why is Google waging a war on unnatural links? For years, many SEOs effectively manipulated their rankings using the methods described above, along with others. However, the links and content that people created as a result provided no value to people, only clutter. They caused search results to display nonsensical, confusing content, which made Google look bad to its users. Furthermore, they cost Google money, as its bots spent time scraping and indexing nonsense rather than good, quality content.

    Since the release of the Penguin algorithm in April 2012, Google has been trying to keep low-quality content out of its index as well as its search results. Now, it’s becoming more transparent about its goals as it refines and clarifies its webmaster guidelines.

    Although these changes created quite a stir across the industry, it’s really just the same message that Google has been trying to convey for years. Create quality content that people want to read and share; the inbound links will come as a result, and you won’t need to worry about unnatural ones bringing down your website in the rankings.

  2. How to Find LSI (Long-Tail) Keywords Once You’ve Identified Your Primary Keywords


    For many SEOs, keyword research is all about finding keywords with a high number of monthly searches and low competition. Some of the more advanced will move on to long-tail keywords, or keyword phrases, or look to local keywords to help lower the competition and leap to the top of the search engine results page.

    These are all great strategies, but to truly show your skills as a keyword ninja, and find those untapped gold nuggets, you have to know how to identify long-tail, LSI keywords.

    What are LSI Keywords?

    If you were to search for a definition of LSI (latent semantic indexing) keywords, you would find answers all over the map. Most people will tell you that LSI keywords are simply synonyms for your keywords. The belief is that by finding similar terms for your primary keywords, you can make your content look a bit more natural while adding more possible search terms into the mix.

    However, this rudimentary explanation of the term doesn’t do enough to serve our purposes. If you want to master LSI keywords, you have to get elbow-deep in what the term means. I wrote an article specifically for that purpose: “Latent Semantic Indexing (LSI): What is it, and Why Should I Care?”

    Wikipedia describes LSI as having the “ability to correlate semantically related terms that are latent in a collection of text,” a practice first put into use by Bell Labs in the late 1980s. So it looks for words (keywords or key phrases, in our case) that have similar meanings, and for words that have more than one meaning.

    Take the term “professional trainer,” for example. This could mean a professional fitness trainer, a professional dog trainer, or even a professional corporate trainer. Thanks to LSI, the search engines can actually use the rest of the text in the surrounding content to make an educated guess as to which type of professional trainer is actually being discussed.

    If the rest of your content discusses Labrador Retrievers, collars, and treats, then the search engine will understand that the “professional trainer” being referenced is likely a dog trainer. As a result, the content will be more likely to appear in search results for dog trainers.

    Another case would be where multiple synonymous terms exist in the same piece of content. Take the word “text” for example. If this were a keyword for which you were trying to optimize your page, words like “content,” “wording,” and “vocabulary,” would all likely appear within the content because they are synonyms and/or closely related terms.
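
    Under the hood, latent semantic indexing boils down to a singular value decomposition of a term-document matrix: documents that share a latent topic end up close together even when their wording differs. Here’s a minimal sketch using scikit-learn, with a toy corpus invented for illustration:

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.decomposition import TruncatedSVD
        from sklearn.metrics.pairwise import cosine_similarity

        # Toy corpus: two senses of "trainer," disambiguated by surrounding words.
        docs = [
            "professional trainer for labrador retrievers collars and treats",
            "professional trainer for corporate leadership workshops",
            "dog obedience classes with treats collars and clickers",
        ]

        tfidf = TfidfVectorizer().fit_transform(docs)            # term-document matrix
        lsa = TruncatedSVD(n_components=2).fit_transform(tfidf)  # the latent semantic space

        # Documents 0 and 2 score as similar (both about dogs) despite different wording.
        print(cosine_similarity(lsa).round(2))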

    The benefits of LSI keywords

    The most obvious benefit to LSI keywords is that your keyword reach becomes broader by using synonyms. As I wrote in my article “The Rise of the Longtail Keyword for SEO,” “they are playing an increasingly essential role in SEO.”

    In addition to the broader reach, your content will rank higher in search engines because of the supporting effect of the LSI keywords. Repeating your keywords over and over throughout the text in an attempt to achieve the perfect keyword density (which, by the way, is a completely outdated SEO concept and tactic) makes the content read awkwardly, and the search engines are smart enough to detect this sort of manipulation, too. Using synonymous keywords helps make your content a richer experience for the reader, and more legitimate (and thus, higher ranking) to search engines.

    Finally, LSI keywords help keep you competitive for your primary keywords in the right context. If you are optimizing for the term “professional dog trainer,” you’re less likely to be competing against the other types of professional trainers in search results.

    Great, how do I find LSI keywords?

    The search for LSI keywords starts with your primary keywords. They are the foundation of your SEO efforts, so if you don’t have these identified yet, then go back and find these first. Once you have them you can get started with LSI keywords. How do you find primary keywords? See my article, “The Definitive Guide to Using Google’s Keyword Planner Tool for Keyword Research.”

    Contrary to what you learned in high school, the thesaurus is not your first stop to find synonyms for your LSI keywords.

    The easiest way to find out which terms the search engines consider related to your keyword is to use the search engines themselves. Go over to Google, and start typing your primary keyword into the search box. Note all of the suggestions provided, and you will have not only a list of related keywords, but a list of keywords that Google knows are related.

    Once you’ve made your list, hit Enter to perform a search for your keyword. Scroll to the bottom of the results page, and look at the “Searches related to <your keyword>” section. This will also give you some good ideas for your LSI search terms.
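
    If you’d rather collect those suggestions programmatically, Google has long exposed an unofficial autocomplete endpoint that returns them as JSON. A minimal sketch (the endpoint is undocumented, so its behavior and availability may change, and scraping may be subject to Google’s terms):

        import json
        import urllib.parse
        import urllib.request

        def google_suggestions(keyword):
            """Fetch autocomplete suggestions for a seed keyword (unofficial endpoint)."""
            url = ("https://suggestqueries.google.com/complete/search?client=firefox&q="
                   + urllib.parse.quote(keyword))
            with urllib.request.urlopen(url) as response:
                data = json.loads(response.read().decode("utf-8"))
            return data[1]  # format: [query, [suggestion, suggestion, ...]]

        print(google_suggestions("professional dog trainer"))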

    Google’s Keyword Planner

    There have been a few changes, other than the name, when it comes to Google’s new Keyword Planner, but anyone familiar with the old Keyword Tool should be able to navigate through it with no problems.

    You can use it to find LSI keywords, and the process is simple. First, click on “Search for keyword and ad group ideas,” enter your primary keyword in the “Enter one or more of the following” box, and click the “Get ideas” button at the bottom. On the following page, click the “Keyword ideas” tab to see not just a list of recommended LSI keywords, but their monthly searches, competition, and other metrics that can help you decide which ones to target.

    Paid keyword tools

    Like anything else in SEO, there are plenty of software packages and services you can buy that will help you in your search for LSI keywords. The downside to these is that you are paying for something that you can get for free. The upside is that the training and support that comes along with most of these purchases will help you learn how to find these keywords more easily.

    The secret operator

    Actually, this is no real secret, but if you place a tilde (the squiggly line: ~) before your primary keyword in Google search, it will include synonyms of your search term in the results; for example, ~professional dog trainer.

    Reading over the titles and descriptions of the results, you’ll be able to find some good LSI keywords. If you want to leave a term out of the results, add that phrase to the query with a minus sign in front of it. For example: ~professional dog training -dog grooming.

    Like your primary keywords, you need to make sure that you don’t overdo it when it comes to LSI keywords. A few closely related terms will be sufficient to help your SEO efforts. And as with your primary keywords, don’t try to insert LSI keywords into the text where they don’t fit.

    Remember, latent semantic indexing will only help you if you are writing good content for your readers. LSI keywords will give the search engines the information and evidence they need to understand what your content is saying and reward you accordingly.

  3. Penguin 2.0: What Happened, and How to Recover


    If you’ve spent any time recently in the world of SEO, you’ve probably heard about Penguin 2.0 — Google’s search engine algorithm change that launched on May 22nd, 2013. By the way some SEOs were talking, you’d think it was the Zombie Apocalypse. Whatever it is, you can be sure that it will have a dramatic effect on the web landscape. Here are five important questions and answers about Penguin 2.0.

    What is Penguin 2.0?

    To understand the 2.0 of anything, you need to understand the 1.0. The original Penguin is the moniker for Google’s algorithm update of April 24, 2012. When Google tweaked the algorithm in a big way, 3.1% of all English-language queries were affected by the update. Penguin was carefully designed to penalize certain types of webspam. Here are some of the main factors that Penguin targeted:

    1.  Lots of exact-match anchor text (30% or more of a link profile; a quick self-check for this factor appears just below this list)

    2.  Low-quality site linkbacks, including directories and blogs

    3.  Keyword-intensive anchors
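
    Here’s that self-check: a minimal sketch that tallies the anchor-text column from a backlink export and flags anything dominating the profile. The sample data is invented for illustration:

        from collections import Counter

        # Hypothetical anchor-text column exported from a backlink tool.
        anchors = ["cheap laptop cases", "cheap laptop cases", "Acme Store",
                   "www.acmestore.com", "click here", "cheap laptop cases"]

        counts = Counter(anchors)
        for anchor, n in counts.most_common():
            share = n / len(anchors)
            flag = "  <-- 30%+ of profile, looks manipulative" if share >= 0.30 else ""
            print(f"{anchor!r}: {share:.0%}{flag}")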

    The aftershocks of Penguin continued long after April 24. Several mini-Penguins have been released since then, which is why some SEOs prefer to call the coming change “Penguin 4.” The new Penguin is predicted to do the following:

    • Penalize paid links, especially those without “nofollow”
    • Penalize spamdexing more effectively
    • Penalize advertorial spam
    • Tighten penalties on link spamming and directory listings
    • Remove hacked sites from search engine results
    • Boost rankings for sites that have proven authority within a niche

    How different is it from Penguin 1.0?

    Calling this Penguin 2.0 is slightly misleading. We shouldn’t think of algorithm changes in the same way we think of software updates — better features, faster architecture, whatever. Penguin is not a software update. It’s a change in the way that a search engine delivers results to users.

    Here is a brief explanation of search engines, and how they change. Search engines are designed to give people the most accurate, trustworthy, and relevant results for a specific search query. So, if you type in “how to cook lima beans,” the search engine attempts to find the very best site on the Internet to help you cook your lima beans. Obviously, every lima bean recipe site wants to have the top dog spot on the search engine results page (SERP).

    Some webmasters will cook up clever tricks to do so. Thus, a site with hordes of banner ads and affiliate links, and barely a word about cooking lima beans, could, with a few black-hat techniques, climb in the rankings. The search engine doesn’t want that. It wants people to have their lima bean recipe (great content), not just a bunch of ads.

    Thus, they change things deep within the algorithm to prevent those unscrupulous tricks from working. But the slithery lima bean site figures out a new way to slip by the algorithm. And the algorithm figures out another way to block them. And so on, and so forth.

    As all of this is happening, several key points emerge:

    1.  Search engine algorithms become more sophisticated and intelligent.

    2.  It becomes less likely for sites to game the system.

    At AudienceBloom, we follow white-hat SEO principles. We understand that there are a few tricks that we could use that might bump your site higher in the short term. However, we don’t engage in those practices. We want our clients to be successful for the long haul, which is why we engage in SEO techniques that are truly legitimate.

    What’s going to happen? 

    Now that Penguin 2.0 is rolling out, one of two things will happen to your site (as the rollout propagates across Google’s data centers and your rankings are adjusted accordingly):

    1. Nothing.

    2. Your rankings will drop, organic traffic will tank, and your site will begin to flounder.

    If, unfortunately, number 2 strikes, you may not realize it for a few days unless you run a big site with 10,000+ visits a day, 30% or more of them organic. In order to answer “what’s going to happen” for your site, you need to understand whether your site is in violation of any Penguin 2.0 targets. That question is better answered with an entire article of its own, but here are a few warning signs that your site could be targeted by Penguin 2.0.

    • You’ve had unscrupulous link building efforts conducted for your site.
    • You purchased paid links from another site (e.g., advertorials).
    • You rely on spam-like search queries (for example, “pay day loans,” “cheap computers,” “free gambling site,” etc.).
    • You have aggressively pursued link networks or listings on unreliable directories.

    Each of the above four points is a common SEO tactic. Some SEOs have become sneakier than the algorithm, which is why Google is making these important changes.

    What should I do to prepare or recover?

    The most important thing you can do right now is to follow Matt Cutts’ advice in his recent video:

    “If you’re doing high quality content whenever you’re doing SEO, this (the Penguin update) should not be a big surprise. You shouldn’t have to worry about a lot of changes. If you have been hanging out in a lot of blackhat forums, trading different types of spamming package tips and that sort of stuff, then this might be a more eventful summer for you.”

    Content is the most important thing, of course, but that’s more of a proactive preparation than an actual defense. Is there a way to actually defend yourself from the onslaught of Penguin 2.0? What if you’ve already been affected by it?

    One important thing you can do right now is analyze your site’s link profile to ensure that it’s free of harmful links. Then, remove the bad links where you can, and file disavow requests for the rest, to keep your site’s inbound link profile clean. This is the equivalent of major surgery on your site, and it could take a long time to recover. Here’s what Matt Cutts said about it on May 13:

    [Image: Matt Cutts’ May 13 comments on Penguin 2.0]

    Here are the steps you need to take to recover from Penguin 2.0:

    Step 1. Identify which inbound links are “unclean” or could be hurting your rankings (i.e., causing you to be affected by Penguin 2.0). To do this, you’ll need to perform an inbound link profile audit (or have us do that for you).

    Step 2. Perform major surgery on your site’s link profile in order to make it as clean as possible. This includes removing links identified in the link profile audit, and then disavowing them as well.

    Step 3. Build new inbound links using white-hat tactics like guest blogging, while abiding by proper anchor text rules with your new inbound links.

    Step 4. Establish a content calendar to keep pushing out high-quality content, engage in social media, and avoid spammy techniques of any kind.
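
    For the disavow half of Step 2, the file Google accepts is plain text: one URL per line, “domain:” entries to disavow an entire domain, and “#” comment lines. A minimal sketch that writes one from an audit’s output; the domains and URL here are invented:

        # Hypothetical results of the link profile audit from Step 1.
        bad_urls = ["http://spun-articles.example/post123.html"]
        bad_domains = ["paid-links.example", "spammy-directory.example"]

        with open("disavow.txt", "w") as f:
            f.write("# Unnatural links identified in our link audit\n")
            for url in bad_urls:
                f.write(url + "\n")
            for domain in bad_domains:
                f.write("domain:" + domain + "\n")

        # Upload disavow.txt through Google's Disavow Links tool in Webmaster Tools.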

    If you’re looking for SEO help, AudienceBloom is prepared to help. One of our major efforts in the wake of Penguin 1.0 was helping sites to recover their rankings and clean up from their past. If you’ve been hit by Penguin 2.0, now is the time to take action to recover your rankings and search traffic. Contact us for a complimentary assessment and action plan.

  4. How to Prepare for Penguin 2.0: Take Off that Black Hat!


    What do Penguins, Pandas, and black hats have in common? Lots! Penguin is the most recent set of guidelines published by Google designed to clean up abuses in the field of SEO, and a new version is due out soon, according to Google’s web spam czar, Matt Cutts. The impending event has marketers, reputation managers, and webmasters scurrying for cover.

    SEO – A Concept Recap

    SEO (search engine optimization) is a relatively young public relations field that tries to increase the visibility of websites through the strategic placement of keywords, content, and social media interaction; the industry has grown rapidly in a little over a decade.

    Carried to extremes, as such things always are, the practice has a dark side: black-hat SEO, a subdivision of the field that tries to achieve money-making results in unsustainable ways (i.e., against Google’s webmaster guidelines). It frustrates the very purpose of a search engine, which is to help users find the information they need. Instead, rampant SEO gone amok serves only the needs of online marketers wishing to increase sales for themselves or their clients.

    To readjust the proper balance, Mr. Cutts and his team of “penguin” police have attempted to establish guidelines that will rule out the most abusive practices of black hat SEO.

    Black-Hat SEO – Are You Doing It?

    The predecessor to Penguin was Panda, with much the same purpose. Panda included a series of algorithm updates, begun in early 2011. These were aimed at downgrading websites that did not provide positive user experiences.

    Panda updates of the algorithm were largely directed at website quality. The term “above the fold” is sometimes used to refer to the section of a website that a user sees before one begins to scroll down. The term comes from newspapers, which are delivered folded in two. The section that is “above the fold” is the section one sees before opening the paper, or unfolding it.

    Typically, marketers wish to cram as much eye-catching, commercial material as possible into this section, while responsible journalists wish to pack it with the most relevant and useful information.

    Penguin, on the other hand, is targeted more specifically at keyword stuffing and manipulative link building techniques.

    One targeted abuse, keyword stuffing, is not a tasty Thanksgiving delicacy, but the practice of loading the meta tag section of a site, and the site itself, with useless repetition of certain words. Sites can lose their ranking altogether as a result of such stuffing.

    Abusive practitioners of keyword stuffing are not above using keywords that are rendered invisible because their font color is identical to the background color. The user doesn’t see them, but the search engine spider does. This practice was soon discovered, however, and dealt with by the search engines.

    Meta tags are sometimes placed behind images, or in “alternative text” fields, so that the spiders pick them up while they remain invisible to users. Popular or profitable search keywords are sometimes included in ways invisible to humans but visible to the search crawlers. Very clever, but also soon discovered and dealt with. With Penguin, Google now analyzes the relevance and subject matter of a page much more effectively, without being tricked by keyword-stuffing schemes.
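
    The color-matching trick is easy to demonstrate from the crawler’s side: compare each element’s inline text color to the page background. A minimal sketch using BeautifulSoup, with a toy page invented for illustration:

        import re
        from bs4 import BeautifulSoup

        html = """<body bgcolor="#ffffff">
        <p style="color:#ffffff">cheap pills cheap pills cheap pills</p>
        <p>Visible article text about something else entirely.</p>
        </body>"""

        soup = BeautifulSoup(html, "html.parser")
        background = soup.body.get("bgcolor", "").lower()
        for tag in soup.find_all(style=True):
            match = re.search(r"color\s*:\s*(#[0-9a-f]{6})", tag["style"].lower())
            if match and match.group(1) == background:
                print("Hidden-text candidate:", tag.get_text(strip=True))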

    “Cloaking” is another tactic that was used for a while to present a different version of a site to the search engine’s crawler than to the user. While a legitimate tactic when it tells the crawler about content embedded in a video or Flash component, it became abused as a Black Hat SEO technique, and is now rendered obsolete by the technique of “progressive enhancement,” which tailors a site’s visibility to the capabilities of the user or crawler. Pornographic sites have often been “cloaked” in non-pornographic form as a way of avoiding being labeled as such.

    The first set of Penguin guidelines and algorithms went live in April 2012, and the second main wave is due out any day now (though Penguin has gone through several periodic updates since its initial release). It’s designed to combat an excessive use of exact-match anchor text. It will also be directed against links from sources of dubious quality and links that are seen as unnatural or manipulative.

    The trading or buying of links will be targeted as well. The value of links from directories and bookmarking sites will be further downgraded, as will links from content that’s thin or poor-quality. Basically, the revision in the algorithms will be designed to rule out content that serves the marketer’s ends rather than the users’.

    Advice For SEO Marketers To Stay Clean

    If you are a professional SEO, the questions to ask yourself are:

    • Is this keyword being added in order to serve the customer’s potential needs, or is it designed merely to increase the number of hits? If the latter, then the additional users that would be brought to the site by the keyword are probably not high-quality conversion potential.
    • Is the added SEO material being hidden from the user or the search engine crawler? If so, with what purpose? If that purpose amounts to dishonest marketing practices, the material runs the risk of getting you in trouble with Penguin.
    • What’s the overall purpose of your SEO strategy? If it’s anything other than increasing sales by enhancing user experience, then you may expect an unwelcome visit from Penguin.

    If you’re a user, you’ll very likely not be as conscious of these changes, except inasmuch as they will alter the look of your search results page when you perform a search in Google. Will the new Penguin algorithms cut down on those ubiquitous “sponsored links” or “featured links”? Probably not. But savvy users know how to ignore those links by now, except of course when they turn out to be useful.

    Will the new algorithms enhance the overall usefulness of the search engine experience? Probably, at least marginally, and perhaps even in a major way. The whole field of internet marketing and e-Commerce is changing so rapidly and radically that it’s hard to keep track of the terminology, especially the proliferation of acronyms. But the ultimate goal will be an enhanced user experience.

  5. 5 Ways to Outrank the Big Brands


    One persistent rumor about Google is that the search giant favors big brands—“authority websites” and reputed sites of long standing—when it comes to search engine rankings. Prior to Google’s Penguin algorithm update, it was easy for small websites to rank ahead of big brands by using spammy SEO tactics that were specifically targeted by Penguin. This not only caused countless small websites battling for top rankings to drop, but also fanned the flames of suspicion that Google favors big brands.

    Many webmasters now suspect—and probably with some justification—that authority is everything in the world of search. Opinions are floating around that Google has bestowed its favor solely upon big brands and websites that have been around a long time. I see that in my line of work, too. Brand websites (or authority websites) often get away with posting what almost anyone would agree is “thin” content.

    Does this mean Google’s algorithm is skewed to favor big brands? While it isn’t possible to prove this point of view, several new websites have arisen and ranked well for highly competitive keywords where brand/authority websites already were operating. So, it can be done. But what’s the secret?

    The solution is to move beyond the general perception of unfairness and work toward becoming a brand yourself. And I would argue that whether Google favors the big guys or not really doesn’t matter. Here’s why.

    1. Identify Opportunities with Long-tail Keywords

    One of the easiest things to observe is that big brands—the seemingly invincible market leaders—rank for only a particular set of keywords. Not every keyword that relates to a niche can be targeted by the top brands, so there’s a considerable array of keywords—mostly long-tail—left lying around. While ranking for core keywords is probably not going to be possible, that’s OK; long-tail keywords are your opportunity to shine, and they often yield better conversion rates, bounce rates, and time-on-site metrics than core terms.

    Regardless of how big a website is (in terms of brand value) and how long it’s been established, it can’t possibly account for every possible keyword combination in its content strategy. The authority websites are strong because they built a solid content strategy around a few keywords and stuck to it, with plenty of dollars and hours invested in that strategy.

    Key Takeaway: Figure out the keywords for which your big-brand website doesn’t rank well. Then take on the competition with those keywords. There are many ways to find long-tail keywords, but the most basic is to use Google’s Adwords Keyword Tool to perform keyword research.

    2. Authority Isn’t Everything; Content is King

    A key feature of Google’s algorithm is the concept of authority; Author Rank seems to be the mantra of every SEO effort. But, in truth, how many websites with correct author information have you observed still failing to secure the top spot? Some of these, you’ll notice, are from websites that are in the big-brand league.

    For instance, you won’t see Mashable or BuzzFeed ranking right at the top for every “tech-related” keyword. But honestly, they’ve got some really awesome content.

    Key Takeaway: The ranking algorithm is a summation of factors that consist largely of:

    • Inbound links at the page level
    • Social shares at the page level
    • Comments at the page level

    So, while big brands easily get inbound links, social shares, and comments at the page level, this isn’t an algorithmic favor from Google; it’s simply the result of big brands investing time and money into developing and nurturing their reader base.

    Counter this advantage by creating better content around the keywords for which you want to rank, properly optimizing that content from an on-site perspective, and strategically marketing it. Big brands may get links and social shares more easily, but great content will always win out over time.
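
    To make “a summation of factors” concrete, here’s a deliberately toy scoring function. The weights and numbers are invented, and Google’s real algorithm is neither public nor this simple; the point is only that strong page-level engagement can outweigh a raw link count:

        def toy_page_score(inbound_links, social_shares, comments,
                           w_links=0.6, w_shares=0.3, w_comments=0.1):
            """Illustrative only: a weighted sum of page-level signals (weights made up)."""
            return w_links * inbound_links + w_shares * social_shares + w_comments * comments

        # A well-marketed page on a small site can out-score a big brand's thin page.
        print(toy_page_score(inbound_links=40, social_shares=120, comments=15))  # 61.5
        print(toy_page_score(inbound_links=50, social_shares=20, comments=2))    # 36.2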

    3. Social Signals Don’t Play Favorites; Use Them to Your Advantage

    This is the age of social signals. Google and Bing are actively seeking out social signals to help evaluate the websites they rank, and this offers a huge benefit for new websites looking to compete against authority websites. Interestingly, the notion of authority itself is often deciphered through the kinds of social shares and signals your website or page sends (besides the other, usual factors of Author Rank and links, of course).

    When it comes to social signals, brand/authority doesn’t really matter. If you provide value, you win. If you provide the most useful and unique content, you win.

    Key Takeaway: Encourage social shares and maintain your social media presence. It’s the perfect intersection between SEO and user engagement that offers you an opening to beat the big guys.

    4. Focus on People, Not Search Engines

    Pause a moment and think about this: What actually comprises an authority website? Most of the big brands have taken years to build a strong and credible following, a fan base, or active user-engagement. That boils down to just two things: 1) value and 2) people. When you provide value through your content strategies and your products, you attract people. As such, your focus should be on people.

    Key Takeaway: Treat SEO as a tool and not as the means to achieving the goals of your website/brand. The real means involves pushing value outward and enabling it to be shared across a wide platform—ideally, multiple platforms. This will draw in customers over the long run, and establish you as a brand and authority in your own right.

    5. If You Can’t Beat ‘em, Join ‘em

    If you’re already doing the things that are recommended by content strategists, user-engagement experts, and community managers, you’re already on your way to building your own brand. Why, then, should you worry about whatever bias Google might harbor with regard to the big brands?

    If the bias isn’t really as strong as we think, then there’s nothing to worry about. If it is a fact of life, then it may favor you over time as you build your own brand. Building a business requires long-term investments of time and money, and that’s often what it requires to outrank the big brands. But as you grow your business strategically and have patience, eventually you’ll be playing in that same ballpark with the big guys.

    In the age of “authority,” the challenge for new websites is that, by the time you get halfway toward becoming a recognized authority on Google’s radar, many of your competitors may have established a firmer brand, better positioning, and a stronger level of authority.

    That’s why you can’t waste time by competing with them on every keyword they rank for. Instead, build your brand, gradually, by targeting the areas the bigger brands and authority websites haven’t included in their net. From then on, it’s merely a matter of repeating the efforts, using social channels and strong content strategies, while expanding your territory.

  6. Google vs. Facebook: Who’s Got the Upper Hand?


    We’re lining up Google and Facebook against each other. It looks like the match of our lifetime.

    Facebook recently joined the ranks of high-profile tech companies that have gone public. While Facebook’s IPO failed to impress, the company is still a huge force to be reckoned with. Facebook is valued at nearly $105 billion, and it has a solid 900 million-plus members.

    With its membership nearing the 1 billion mark, Facebook appears poised to take on the whole world. Or is it? Is Facebook really invincible? Can it topple the current king of the tech industry — Google?

    Not likely . . . at least not in the foreseeable future.

    Some web experts believe that Facebook lacks the appeal of Twitter, which has been instrumental in breaking some of the biggest news stories in recent history.

    Some even argue that Facebook is going to end up like MySpace, which has entirely lost its appeal. You may recall that MySpace was once the king of the social media hill, after it stole the thunder from rival Friendster. Now MySpace has been reduced to nothing more than a virtual ghost town.

    Lately, we have seen a growing number of people who complain about Facebook selling its members out. Some have sought refuge in Google+, now viewed by many as a worthy successor to Facebook.

    Google is hell-bent on making social interactivity work on Google+. Although at one point the company denied that Google+ is a social network, for users it works, looks, and feels like one.

    Regardless, Google’s command of the online advertising space is staggering, and Facebook wants a piece of that. Who will be the last man standing in this battle?

    The cutthroat rivalry of social networks
    Facebook has maintained a solid reputation with its status as the largest social network in the world. On the other hand, Google has amassed a huge follower base that is increasingly drawn to Google+ for its new approach to social networking. Quite simply, Google+ offers great features that its users aren’t willing to give up.

    Google+ is now home to nearly half a billion members, of whom 100 million were active users as of September 2012. It took Google+ just over a year to grow its user base to that substantial number. By comparison, it took Facebook many years to grow its user base to 900 million.

    At this rate, Google+ could snatch the helm from Facebook in a matter of months.

    While users can do a lot of things on Facebook, such as share and recommend content, upload and watch videos, post and share photos, etc., Google+ has some huge advantages over Facebook.

    One advantage Google+ has over Facebook is its seamless integration of products and services that many of us had already been using fairly heavily, such as YouTube, Google Calendar, and Google Drive (formerly Google Docs).

    Since Google+ integrates many of the tools that members already use, it’s much easier to aggregate content and collaborate with other people.

    What I especially like about Google+ is Hangout, a cool video-conferencing feature that allows as many as 10 people to chat via video all at once. Hangout also allows users to watch a YouTube video simultaneously. How cool is that? Very cool indeed.

    Right now, for those who desire some rest from the noise that’s plaguing Facebook, Google+ is an excellent alternative. Google+ is full of interesting images, videos, and content you won’t find on Facebook or other social networking sites.

    The advantages of Google’s product lineup
    Google did run into some trouble when it declared that user data would be shared across its products, but the company has done a great job of providing a seamless user experience.

    Whether you are using an Android tablet or an Android phone, you can tap into Google’s wide range of products seamlessly, from YouTube, GTalk, Calendar, Google Play, and Maps to a host of other product lines. You can also sync all your favorite Google apps to a single device.

    Personally, I don’t think I’d be able to do the things I do on a regular basis as productively without the apps from Google to which I’ve grown so accustomed.

    But that doesn’t mean I’m not active on Facebook. I still visit the competition to this day, but only when I want to get in touch with friends and keep myself in the loop about what’s going on with them.

    The competitive world of online advertising
    Facebook is still enjoying steady growth in its number of users, so the odds are good it will surpass the one-billion mark fairly soon. And after its recent IPO, the company has a pile of cash at its disposal to fund more innovation.

    Facebook has a great deal of resources that may enable it to beat Google in the mobile advertising arena, an area where Google is already playing well.

    With regard to advertising, Facebook could one-up Google in several key areas, such as display ads. However, Google has already gotten a good head start in mobile ads, forcing Facebook to struggle to make a decent foray into this area.

    Google is also keen on flexing its muscles to extend its might into social, an area where it has repeatedly failed in the past — with Google Wave and Google Buzz, for example.

    But with Google+, Google seems to be doing things right this time around.

    So Facebook is still the undisputed king of the social media hill, but the Google+ threat is real and looming. With Google’s vast resources in terms of cash gained from its extremely profitable advertising model, it could be only a matter of time before we see Google+ take the lead from Facebook.

    Conclusion
    What do you think? Who will be left standing, and receive the crown as the true social media giant in 2013 and beyond? Will we still see Facebook dominating the social space next year, or will Google+ finally stake its claim to unchallenged dominance of the Internet?

    We would love to know what you have to say about this. Post your comments below!

  7. Google Penguin Update 3 Unleashed


    We are seeing drastic changes in how search engines deliver information. Google in particular is striving to deliver not just the most relevant information, but the most useful and high-quality content.

    Google has released a series of algorithmic updates aimed at placing sites that promote high-quality content ahead of the pack in search results. The most notable of these updates are Google Panda and Google Penguin.

    On October 5, 2012, Google unleashed the latest version of its anti-webspam algorithm, Penguin. Widely known as Penguin 3, the update was announced by Matt Cutts via Twitter as a data refresh affecting sites in several languages:

    “Weather report: Penguin data refresh coming today. 0.3% of English queries noticeably affected. Details: http://t.co/Esbi2ilX.”

    0.3% may sound like a tiny figure, but it’s a significant signal that should put everyone on alert that the Penguin update is here to stay, and it’s hell-bent on providing search results that are more relevant and more useful.

    A brief timeline of Google Penguin updates

    Among Google’s algorithmic updates, perhaps none was more striking than Google Penguin. Below is a brief timeline of the Penguin update:

    • April 24, 2012 – The first Google Penguin was released. Almost instantly, many sites saw an incredible drop in rankings.
    • May 26, 2012 – Penguin 1.1 was released, affecting less than 0.1% of English sites.
    • October 5, 2012 – Penguin 3 was announced and released, with about 0.3% of English sites affected by the latest refresh.

     
    How does Penguin 3 fit into the series of Penguin updates?

    Penguin is all about ridding the SERPs of spammy websites. Sites that were over-optimized with low-quality content and links were heavily penalized by the first two versions of Google Penguin.

    Penguin 3 has had the same effect on search queries as Penguins 1 and 2, but Cutts was explicit this time about the effect of the latest refresh.

    In his tweets, Cutts revealed the size of Penguin 3’s impact on queries in English and several other languages. Below are some of the data on how much Penguin 3 affected searches in various languages:

    • English – 0.3%
    • Spanish – 0.4%
    • Italian – 0.3%
    • French – 0.4%

     
    What’s noticeably different?

    SEOs and webmasters are probably wondering what’s different in the SERPs as a result of Penguin 3’s release.

    With the release, Matt Cutts gave us a clear idea of where we should look for the results of the latest algorithmic refresh: the changes will be “above the fold.”

    This means that while the previous two Penguin updates affected the entire first page of the SERPs, Penguin 3 made observable changes within the top 5 search results.

    Should you be worried?

    If your site has not been severely affected by the previous two Penguin releases, you shouldn’t have any worries. You have probably been doing SEO right as far as Google’s recent updates are concerned.

    If you’ve been hit by the previous releases but you made changes to your link-building activities in accordance with current SEO best practices, you should be fine.

    Just remember that the key to surviving any update Google throws at you is to make sure your site promotes high-quality, relevant, and extremely useful content and maintains natural link-building practices, and that you stay informed of all timely developments in the search engine optimization industry.

    In other words, stay on top of things!

    Conclusion

    I hope this post has provided you with useful information on the Google Penguin 3 update. If you need help making sure your site stays compliant with current SEO standards, contact us. We would love to chat with you about how we can help.

  8. The Google Panda 20 Update: Some Key Information


    Google is making everyone aware of the company’s relentless drive to supply better and more useful information.

    Following a series of Panda and Penguin updates, Google Panda #20 was released on September __, 2012. This was the first major update since July 24, 2012. More Panda updates are expected to be released within the next few months, and future updates will be more consistent.

    Unlike other recent releases, Panda #20 was a fairly major update – one that ran for almost a week. In fact, some 2.4% of English queries were affected, along with about 0.5% of queries in other languages.

    Another interesting thing about Panda #20 is that it overlapped with another algorithmic update dubbed the EMD update, which Google rolled out to target sites with low-quality, exact-match domain names.

    This made it tricky for affected SEOs and site owners to determine which update had hit them. Was their site hit for failing to comply with Google Panda standards, or for having a poor exact-match domain name?

    Panda was released to devalue sites with “thin content,” or content that offers minimal value. Since its release last year, tons of sites have seen a dramatic drop in rankings. Some, especially notorious content farms, have been removed from Google’s index altogether.

    Panda also targeted sites that contained duplicate content. As a result of Panda’s release, black-hat SEO practices took a significant hit. Sites that churn out hundreds of pages of duplicate content were obliterated.

    The release of Panda, as well as its equally ferocious sibling Penguin, met with a few complaints as well. Years of hard work and substantial amounts of marketing dollars to push a site to the top of Google rankings were effectively tossed out the window. SEOs, publishers, and site owners, believing they had been following recommended SEO best practices, cried foul.

    The hard lesson we can all learn in the aftermath of Google’s algorithmic changes is that, while it’s true that quality is subjective, standards have been laid out.

    The stress on quality and authority

    Quality and relevance of information are at the heart of every substantial change rolled out by Google. To ensure that every site is on the same page with Google, the company has laid out guidelines for site owners to follow.

    As far as Panda is concerned, as long as your site’s content is original and offers quality and useful information, you should be fine.

    As long as the content strongly relates to the topic and offers great value to your audience, there shouldn’t be any reason for Google to slap you.

    Do your link-building activities follow the prescribed or accepted methods? Have you been linking to authority sites, and are authority sites linking back to yours? Do you make a point of regularly checking your link profiles for any potentially damaging links?

    There’s no way of telling how many more of Google’s Panda updates are coming in the future, but Matt Cutts has made it clear that Google Panda will be updated and refreshed on a continual basis. This shows how committed Google is to making the Internet a better and more reliable avenue for gleaning valuable information.

    Conclusion

    It’s crucial to keep abreast of the periodic algorithmic changes that Google rolls out. Keeping yourself on the same page with the search engines is vital to the success of your online business.

    If you need help keeping your site compliant with current SEO best practices, contact us. You can also subscribe to our feed to keep yourself in the SEO loop.

  9. Google’s EMD Update: What Really Happened?


    As with most things in life, the only thing certain about Google and SEO is that there will always be changes. If you’re going to try to keep your site optimized for good rankings, you either have to learn to go with the flow and adjust, or bring someone in to do it for you.

    What’s so frustrating for many people, though, is that no matter how good your intentions and no matter how much you strive to play by the rules, you can still end up getting penalized. Your site drops or disappears from the search results, leading to a loss of income.

    The EMD Update

    One of the newest changes that Google has implemented is the EMD algorithm update. Originally rolled out toward the end of September 2012, it continues to be applied as Google reevaluates sites in what it calls a data refresh.

    EMD stands for “exact match domain,” and this update was supposedly going to target only EMD sites.

    What exactly is an EMD? An EMD has the keyword or keyword phrase you’re targeting in the domain name itself. For example, if your company sells laptop cases then an EMD might be something like laptopcasesstore.com.
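
    Spotting an exact-match domain is mostly string normalization: strip the TLD, dots, and hyphens, and compare what’s left to the keyword. A minimal sketch (the helper is my own illustration and ignores subdomains):

        import re

        def is_exact_match_domain(domain, keyword):
            """True if the domain name (minus TLD, dots, hyphens) equals the keyword."""
            name = re.sub(r"[^a-z0-9]", "", domain.lower().split(".")[0])
            target = re.sub(r"[^a-z0-9]", "", keyword.lower())
            return name == target

        print(is_exact_match_domain("laptopcasesstore.com", "laptop cases store"))  # True
        print(is_exact_match_domain("acme.com", "laptop cases store"))              # False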

    Why Target EMD Sites?

    Many niche site builders and marketers have been using EMD sites for years. This single element, having their main keyword in the domain, provided a huge ranking boost, often enough to help them easily rank their sites.

    What’s wrong with that?

    Many of these sites have very thin content, built solely for the purpose of promoting affiliate advertisements, AdSense ads, or other methods of making money. Google had been hinting about this change for quite a while, but the hints mostly fell on deaf ears; the majority of these online marketers continued to pump out thousands of these low-quality sites and take their chances.

    Sure, most websites are built for the purpose of making money, including legit company websites. However, the EMD sites we’re talking about here rarely have any helpful content and provide no value for the user. That is why Google finally did this, but the problem is it just isn’t that easy.

    The Problems with the EMD Update

    There are probably hundreds of thousands of sites that would technically fall under the EMD category but are nevertheless amazing sites with incredible value. These websites are often authority sites that specialize in a particular micro-niche. So do they really deserve to be punished? No.

    Some of them haven’t been. For example, consider CheapTickets.com; it doesn’t seem to have been affected.

    Google says that sites will supposedly be evaluated by the Panda and Penguin guidelines, as well as the EMD update. This basically means that if your site has low-quality content or not much content at all, is hammered to death with advertisements and other no-nos, and has a domain name that could be considered an EMD, then you’d better watch out. You’ve probably already lost rankings, or likely soon will.

    Sounds great in theory, doesn’t it? Unfortunately, things don’t always work out quite so easily. Let’s take a look at something I found literally a few minutes ago. If all the things above were true and carried out to perfection, then someone please tell me how this site is holding on to amazing search engine rankings.

    *Note: I have no idea who owns the following site and am in no way affiliated with it. It’s just a random search term I plugged into Google to see what I’d find, and right before me appeared a perfect example of what’s not so perfect about these updates…

    I decided to search for green tea reviews. I don’t even drink green tea, but it’s a hot topic in the health niche. Why reviews? Many affiliate marketers and online marketers target keyword phrases like this – with “reviews” or something similar in the term. Here’s a screenshot of the search results…

    [Screenshot: Google search results for “green tea reviews”]

    Do you see what I see? This EMD site, so obviously the kind of EMD that Google was supposed to be targeting, ranks #3. Not only that, but it ranks higher than the official government website sitting at #5. Just to be sure (though I pretty much already knew), I clicked through to see if it was really a site that deserved to outrank the official .gov site…

    [Screenshot: the EMD site’s homepage]

     

    Bam. Right smack at the top. It’s a site built to promote products from Amazon. This is strictly an affiliate site with very little content. You can tell this is not a go-to source, an authority site, or anything even close. So how is it sitting at #3?

    On the flip side, I was reading a forum where the owner of a furniture store is baffled (and should be). His site, which did contain the words ‘his area’ and ‘furniture’ in the domain, was hit hard. It’s gone from the search results. This was actually his company’s site, a real site made for customers who couldn’t make it to the store or wanted to browse before walking into the store.

    So tell me, how does this make the Google search experience better for the user? Of course, many of the thin EMD sites that should have been hit were hit. Of course, many of the real sites providing value have sustained their rankings. I’m just saying that no update is going to be perfect and you have to be prepared to fight for your position if it comes down to it.

    If your domain could be considered an EMD, then follow these guidelines:

    • Don’t go crazy with the keyword. That means not putting the exact main keyword in every single image tag, the title, the keyword tag, the site description, every single header tag, etc. Don’t overuse the term within the body content of your site, either.
    • Link to authority sites (for example, if you have a health site, link to WebMD here and there).
    • Have plenty of pages of helpful content (ideally more than 10 pages).
    • Use well-planned internal linking.
    • Don’t go over roughly 2% keyword density (a quick way to check this is sketched after this list).
    • Make it easy for people to share your content socially.
    • Set up Facebook or YouTube channels and use them: interact and link to your site.
    • For any backlinks you build, use a wide variety of anchor text, including anchors that don’t contain your keywords: build some that use your plain URL as the link, and some that say “click here.”
    • Get active with social media so your site gets lots of mentions, likes, and links from social sites (social signals), but don’t “buy” them. They should happen naturally.
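
    Here’s that keyword-density check: a minimal sketch of one common way to compute it, counting the words consumed by each keyword occurrence against the total word count (tools vary in their exact formula, and the sample text is invented):

        import re

        def keyword_density(text, keyword):
            """Share of words in `text` accounted for by occurrences of `keyword`."""
            words = re.findall(r"[a-z0-9']+", text.lower())
            kw = keyword.lower().split()
            hits = sum(1 for i in range(len(words) - len(kw) + 1)
                       if words[i:i + len(kw)] == kw)
            return hits * len(kw) / len(words) if words else 0.0

        sample = "Green tea reviews: our green tea reviews cover taste, price, and brands."
        print(f"{keyword_density(sample, 'green tea reviews'):.1%}")  # 50%, far over 2%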

     
    Conclusion

    Clearly, Google’s EMD update has affected many sites, but not all. I expect there will be more iterations of the algorithm before they get it completely right, as can be expected of any algorithm update. If you’ve been hit by the EMD update, we can help you recover! Contact us for more information.

  10. Disavowing Links in Google – Pros and Cons


    Bing scored a head start by being the first search engine to provide webmasters with a tool to disavow links. It’s welcome news for SEOs and webmasters, but players in the SEO arena continue to look for the same tool from Google.

    SEOs and site owners are becoming concerned about negative SEO, because manually removing spammy links (sometimes thousands of them) is time-consuming and exhausting. And because Google is still the reigning king of the SEO world, the community is clamoring for solutions.

    Matt Cutts acknowledges that “negative SEO is not impossible, but it is difficult.” He further assures us that Google is working on enabling webmasters to disavow links, and that it’s going to be rolled out possibly within a few months.

    But while many of us in the SEO community are drooling for what might be Google’s tool to disavow links, there are many who are not so sure about it.

    At first glance, disavowing links in Google is tempting. It’s easy to come up with a lot of benefits that would result.

    However, there are possible negative consequences, as well.

    That’s why it’s worth investigating all sides of this development to grasp how disavowing links in Google might affect your site’s rankings.

    Arming yourself against negative SEO

    After Google unleashed Penguin and wreaked havoc on sites that give and receive spammy and unnatural links, webmasters and SEOs became concerned about negative SEO.

    Penguin frowns upon sites that are riddled with poor quality and spammy links. This is where scheming individuals found an opportunity to take any site down by bombarding it with malicious links.

    Imagine your horror if one day you woke up to discover thousands of malicious links pointing to your site. Just the thought of having to clean your site of all those links manually is frustrating.

    This is perhaps the biggest benefit that disavowing links has to offer. It effectively helps webmasters and SEOs battle negative SEO by informing Google which sites they feel shouldn’t be associated with their own and which sites are probably linking to them for malicious purposes.

    Cleaning up your act

    And then there’s the fact that you might have inflicted harm upon yourself by implementing back-linking tactics that now violate Google webmaster guidelines.

    Over-optimization (using mostly exact-match keywords as anchor text spread across hundreds of pages, directories, and social sites) used to be extremely beneficial for your rankings, prior to Panda 3.3. Post-Penguin, however, such practices either have no effect or actively harm your site’s rankings.

    As a responsible webmaster or SEO, the most sensible thing to do is either to cut your ties with sites to which you’ve created massive links or change the anchor text used to link to your site from each offending link.

    But with thousands of links pointing to your site, you are again faced with a grueling task that could take hundreds of hours and a substantial amount of other resources, just to fix things up.

    While it’s an enormous task to wean your link building off the tactics that used to work so well, it’s most important to first determine which inbound links are harming your rankings. Happily, that’s the easy part. If your site is set up in Google Webmaster Tools, Open Site Explorer, or Majestic SEO, you can easily see who links to you.

    When Google finally rolls out a feature that enables disavowing links, you won’t have to worry about cutting the cord that links you to offending sites one after another.

    The Case for Why Google Shouldn’t Implement Disavowing Links

    Ryan Jones suggests that disavowing links might make spamming easier. He contends it could open an opportunity for others to manipulate this feature and better understand the algorithm.

    Among other things, he also points out that some SEOs could be doing more harm than good to their sites by disavowing links that aren’t actually hurting them.

    Also, there’s the danger of accidentally outing other webmasters (a frowned-upon practice) with the disavowing links feature, which, according to Ryan, is essentially an automated outing form.

    The points he raises are valid. In my opinion, SEOs should applaud Ryan for raising these points even before Google rolls out its own disavowing links feature.

    But it’s probably safe to assume that, because Google is thinking about implementing this much-anticipated feature in the coming months, the search giant is studying both the negative and positive effects it may have on the future fortunes of SEOs. After all, SEO is an ever-evolving process. It’s a never-ending effort of trying to figure out what works and what doesn’t.

    I’m not the type who sits and waits. I’d rather raise points that people or organizations should take a look at. If Google does plan on implementing disavowing links in the future, it needs to take the time to hear out all concerns raised by SEOs and webmasters.

    Conclusion

    I hope this post has helped raise your awareness of the debate over a Google “disavow links” tool. What do you think about disavowing links? Should Google implement it or not?
