
Category Archive: Google

  1. Why Did Google Update Search Quality Raters Guidelines?


    Google’s “Search Quality Raters” guidelines (henceforth shortened to SQR guidelines) are something of a holy document in the SEO community. Google is the dominant force in search, dictating the trends and tropes we optimize our sites for (and influencing any competitor that carves out search space of its own), yet it never tells us specifically how its search algorithm works. Instead, we get hints and suggestions that indirectly tell us how to rank higher, but mostly just discourage us from relying on black hat tactics.

    The Original SQR Guidelines

    There have been a handful of “leaked” versions of the SQR document, and one official abridged version released by Google, but it wasn’t until last year that Google released the full SQR guidelines, in all their 160-page glory. Search marketers, once they got over being intimidated by its length, delved into the document to see what new insights they could uncover about how Google interprets the authoritative strength and relevance of websites.

    Google Search Quality Rating Program

    (Image Source: Google)

    The original document didn’t exactly revolutionize the search world, but it did bring up some important considerations and strategic takeaways we wouldn’t have gotten otherwise.  Much of the document did, admittedly, tread old ground by covering things like the importance of high-quality content and how Google views relevance to search queries from a semantic angle. However, there were some noticeable new insights:

    • Google treats pages that deal with “Your Money or Your Life” (YMYL) topics more seriously than other pages.
    • Expertise, authoritativeness, and trustworthiness (EAT) are the three factors that Google uses to determine a site’s domain strength.
    • Content positioning and design come into play. Google evaluates content differently based on how it’s placed.
    • “Know” queries and “Know Simple” queries. These are designations for different types of queries based on how they can be answered: either succinctly, or only with more elaboration.

    Now, it appears Google has made some major modifications to the SQR document.

    The Changes

    Only four months after the document was originally published, Google has made significant changes. You might think Google added even more content to the 160-page behemoth, but actually, the document shrank, specifically to 146 pages.

    The most important changes include:

    • A decreased emphasis on supplementary content. Supplementary content refers to any on-page content other than the “main” source of information. For example, if your Contact page has a few paragraphs of text explaining who you are and what you do, you might have supplementary content in the form of notes in the footer, or testimonials. Supplementary content can help or harm you, and it was a major point of emphasis in the previous version. Now that Google has downplayed it, it might be a sign that it’s not as important to your rank as it used to be.
    • Increased attention to local search, now called “Visit-in-Person.” Google spends more time discussing the importance of local rankings and how to achieve them. It has also adopted new terminology, “visit-in-person,” which hints at how it perceives these queries. Rather than simply relegating these entries, which run on an algorithm separate from the national results, to a geographic sub-category, Google now treats them as drivers of foot traffic. That makes sense, as most local searches happen on mobile and relate to some semi-immediate need.
    • Increased descriptions of YMYL and EAT concepts. I described both the YMYL and EAT concepts in the section above. The concepts themselves haven’t changed, but Google has increased its emphasis on them. This means the concepts may be becoming more important to your overall rank, or it may mean that there was some initial confusion surrounding them, and Google has worked to clarify those points.
    • More examples of mobile marketing in effect. It’s no surprise that Google is doing more to play up mobile, especially with another Mobilegeddon-style update in the works. Mobile is a topic that still confuses a lot of webmasters, but it’s still becoming increasingly important as a way to reach modern audiences. Mobile isn’t going away anytime soon, so this is a vital area (and Google recognizes that).

    If you’re interested in a much, much more thorough analysis of the changes, there’s a great post about it here.

    Google’s Main Priorities

    By examining Google’s motivations, we can better understand where the search platform hopes to be in the next few years, and get a jumpstart on preparing our SEO strategies for the future. For starters, Google is extremely fixated on the mobile user experience. With an expanded section on mobile compliance and a new frame of reference for local searches, it’s clear that Google wants mobile searchers to have an integrated, interactive, and seamless experience finding websites. The YMYL and EAT systems of rating content quality and significance are standbys, but the fact that Google is actively expanding these concepts is evidence that they’ll be around for the long haul.

    It’s uncertain exactly how often Google will update their SQR guidelines document, or what other changes might be in store for our future as search marketers. Certainly, there may be major new additions for new technologies like apps and interactive content, but in the meantime, keep your focus on producing expert, authoritative, trustworthy content, and optimize your site for mobile user experiences.

  2. What’s Google’s Plan for the Future of Online Reviews?


    Google is, inarguably, the most powerful influencer in the tech world, handling more than 3.5 billion searches per day and offering dozens of products that serve the various digital needs of users everywhere. With all this power, Google is attempting to shape digital experiences that give us the greatest satisfaction and connect us more simply to the information and services we need.

    One of the most powerful forces in the online world, at least when it comes to purchasing decisions, has to be online reviews.

    Online Review Performance

    (Image Source: Moz)

    customer reviews

    (Image Source: SearchEngineLand)

    Over two-thirds of consumers are influenced by online reviews when making a purchasing decision. Even more impressive, 88 percent of consumers take online reviews as seriously as they would a personal recommendation.

    As you can see, how reviews are handled and displayed could have a massive effect on businesses everywhere. A blip in review displays could skew reviews negatively and cripple your marketing campaign, or skew them positively and give you an influx of new buyers.

    Google holds much power over this process, and it may have big things in store for how it’s handled in the future.

    Types of Online Reviews

    The term “online review” in itself is vague, as there are many types of reviews that could be written. For starters, reviews can be for a product, for a service, or for a company overall. They could also be hosted on branded sites, or on external sources.

    Let’s take a look at some of the most common types of online reviews, and how Google currently takes them into consideration.

    • Branded product reviews. First are branded product reviews. Typically left by users who have purchased the product before, these are featured on individual product pages of a website. eCommerce sites are the most popular for this, but they also may be featured on sites with a single digital good or intangible services. Typically, users will rate the product and leave a descriptive review containing their opinions. This text can be indexed by Google, which may help you rank for more relevant keywords. You can also use microformatting to properly categorize the review and possibly get your product’s average rating (or a review itself) featured as an entry in the SERPs.

    ratings and reviews

    • Branded testimonials and company reviews. These work similarly to product reviews, but are applied to an entire company. These are often featured on dedicated testimonials or “success stories” pages, but are held at lower value than product reviews because companies often have more control over what gets posted here.
    • Company reviews on third-party sources. Third-party sources, such as Yelp and TripAdvisor, are seen as much more authoritative and reliable in Google’s eyes. These sources have verified methods of collecting and processing reviews, and are popular enough to collect many reviews for any listed business. The quantity and quality of reviews comes into heavy play when factoring the local rank for a business.
    • Google reviews. Google reviews are also taken into heavy consideration for local rank, and the weighted average of Google reviews will often appear directly in SERPs for local results:

    ratings in search results

    These reviews are especially important because they can easily form a user’s first impression; they’re usually the first reviews a user sees. However, Google reviews don’t yet enjoy the same popularity as platforms like Yelp, which holds the process back.
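    The “microformatting” mentioned above refers to structured data markup; one format Google accepts for review and rating snippets is schema.org JSON-LD. Here’s a minimal sketch for a product page (the product name, rating values, and reviewer are all made up for illustration):

    ```html
    <!-- Hypothetical product page markup: a schema.org Product with an
         AggregateRating and one Review, embedded as JSON-LD. All names
         and numbers here are illustrative placeholders. -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Example Widget",
      "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "87"
      },
      "review": {
        "@type": "Review",
        "author": { "@type": "Person", "name": "Jane Doe" },
        "reviewRating": { "@type": "Rating", "ratingValue": "5" },
        "reviewBody": "Sturdy, easy to set up, and worth the price."
      }
    }
    </script>
    ```

    Markup like this is what makes those star ratings eligible to appear beside your entry in the SERPs, though Google decides case by case whether to actually display them.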

    Recent Changes

    Over the course of the past few years, Google has implemented some massive changes in how it views, weighs, and considers these different types of reviews for business.

    The biggest change came back in 2014 with the so-called Pigeon update, which changed how Google’s local search algorithm functioned. After the search update, features and entries in third-party directories began to factor in more heavily to Google’s evaluative process; now, third-party reviews are extremely important in calculating a business’s local rank, and directory pages (such as a business’s Yelp page) are more likely to show up in search results as standalone entries.

    More recently, Google announced that Google reviews can now be left without signing into a Google+ account. Reviews still can’t be left anonymously, which addresses a major concern about review abuse, but the move makes it easier for users to leave reviews. A higher quantity of reviews benefits brands, consumers, and Google itself: it means greater diversity, a “truer” picture of the companies and products in question, and hopefully better differentiation in search results.

    Goals for the Future

    Based on what we currently know about the online review world and some of the recent moves Google has made, it’s reasonable to predict a handful of ways Google may hope to change the interplay of online reviews in the future:

    • More, more, more. More reviews means better results for everyone involved, and Google is working hard to encourage more users to leave reliable reviews.
    • Better integrations of reviews in SERPs. Expect to see reviews integrated in more visible, interactive ways in SERPs.
    • Faster consumer decisions. If Google had its way, consumers would never have to leave SERPs. Online reviews are just one outlet to encourage faster consumer decisions.

    With these future developments in mind, it becomes obvious that online reviews are about to get even more important. Optimize your strategy now to attract the best possible reviews for your business and products, and stay one step ahead of the competition.

  3. Are We on the Verge of the Next Great Search Disruption?


    Okay, so we all know that the search world is constantly evolving. It’s changed, radically, in many different ways since its general inception in the mid-1990s. Most of these changes, however, have been slow and gradual improvements to the core, original search engine algorithm. Search experts and marketers were quick to note when these things happened; for example, when Panda was released, 11 percent of queries were affected, and marketers couldn’t help noticing this extreme volatility because they were watching their ranks closely.

    Panda Effect

    (Image Source: Search Engine Land)

    But users didn’t really notice this volatility—to the average user, the changes and improvements in search are so gradual they’re barely noticeable, the same way it’s hard to tell when a child is growing when you see him/her every day.

    What Constitutes a Disruption?

    Because of this incremental phenomenon, it’s tough to categorize what might count as a search engine “disruption.” Usually, a tech disruption happens all at once—when a new product is released, a new trend takes off, or a new company emerges to challenge the norm. Now that all the norms of search are pretty much in place, the minor “disruptions” we’ve had so far (usually in the form of Google updates) can’t really claim to have that much impact. User search behavior has changed much in the past 20 years, but again, it’s done so incrementally.

    Still, knowing that, the search world may be on the verge of a major disruption in the truest sense—a new set of phenomena that may turn the nature of online search on its head. And it’s already starting to take place.

    Artificial Intelligence on Two Fronts

    Disruption is coming in the form of artificial intelligence (AI), and in two distinct modes of operation it’s already here:

    • AI is powering diverse new types of virtual assistants. These include programs like Siri, Cortana, and Google Now, and are becoming more popular modes of search at an astounding rate.
    • AI is beginning to handle more search engine updates. Machine learning algorithms like RankBrain are finally starting to emerge as the future of search engine updating.

    So on one hand, you have AI interfering with the way users are searching, and on the other, you have AI taking over the updating process for search engines.

    Let’s take a look at each of these in turn, and how they could be considered disruptive.

    Virtual Assistants

    Chances are, you’ve used a virtual assistant at least once in your life, and in the near future, you’ll find yourself using them even more. Consider how these programs could cause the next major search disruption:

    • Voice search popularity. First, it’s important to address the rising popularity of voice search in general. By some estimates, voice-based searches have gone from zero to over 50 billion searches per month. That’s a huge jump, and it’s only going to get bigger. That means more people are using colloquial phrases and forgoing traditional search engines entirely.

    (Image Source: LSA Insider)

    • Cross-realm search. It’s also important to realize that most virtual assistants aren’t limited to one realm of search. For example, Cortana and Siri will search the Internet, your local device, your online accounts, and even files within your local device for your search queries. Search is no longer exclusively online, and the lines between online and offline are starting to blur.
    • User intent and semantic capabilities. Virtual assistants are also becoming more adept at recognizing natural language and user intent, which means it’s going to be harder than ever to “optimize” anything in specific ways, and users will have hyper-focused intentions when looking for solutions or content.
    • On-the-go searching. Virtual assistants are also driving more mobile and on-the-go searches, which is changing the way people form queries. They need more immediate, location-based answers, rather than the products of premeditated keyword-based research queries of old.

    Machine Learning in Search

    On the other front of AI development, you have new machine learning algorithms working to replace the previously manual job of improving search engines. This has started out small, with a modification to Hummingbird known as RankBrain, but we can expect to see bigger, better versions of these machine learning algorithms in place in the near future. There are three key ways it could be a disruptor:

    • Micro-updates. RankBrain doesn’t come up with major changes and then push them to a live environment. It runs through tons of micro-updates on a constant basis, meaning that incremental improvement is going to happen on an even more transformative level.
    • Unpredictable paths of development. Since human beings won’t be in control of algorithm updates forever, machine learning algorithms could take searches down new, unfamiliar paths of development—some of which may look very different to today’s average user. Entire constructs and norms may be fundamentally overwritten.
    • Rate of change. Perhaps what’s most scary about the idea of machine learning is the sheer pace at which it can develop. With algorithms perfecting themselves and perfecting how to perfect themselves, the pace of development may skyrocket, leaving us marketers in the dust.

    Key Takeaways

    Since these technologies are still being developed, it’s hard to estimate to what degree they’ll be able to redefine the norms of user searches. However, early indications show these two forms of AI to be powerful, popular, and for lack of a less clichéd phrase, game-changing. As a marketer, you can’t prepare for the future in any concrete way, since even the technology developers aren’t sure where it’s going to go from here, but you can prepare yourself by remaining flexible. Hedge your bets with lots of long-term strategies, try to jump on new trends before your competitors can, and always be willing to adapt.

  4. Is Google Prepping for Another Mobilegeddon?


    Last year, Google made headlines when it released a so-called “mobilegeddon” update, which sent the procrastinators of the web world scrambling to get their sites updated for mobile friendliness. Now, nearly a year after the original mobile update, Google looks like it’s gearing up for another mobile-oriented refresh. Could this be another mobilegeddon-level release? And if so, what should you be doing about it?

    The Original Mobilegeddon

    Let’s start by clarifying exactly what the original mobile update was and wasn’t. Though most of Google’s game-changing updates have arrived quickly and without warning, mobilegeddon stood apart from this norm. More than two months in advance, Google proactively warned webmasters that a massive update would arrive on April 21st, refining ranking criteria for sites based on their level of mobile compliance.

    Up to this point, mobile optimization had been a ranking factor for both mobile and desktop search results, with more significance to the former for obvious reasons. If your site wasn’t optimized for mobile (e.g., your content wouldn’t load properly on mobile devices, or users would be forced to scroll and zoom to read your main content), it would suffer a mild ranking penalty. Mobilegeddon (which is not the official name for the update, by the way) was created to increase the severity of the ranking differentials here, making it far less likely that non-mobile-compliant sites would make it to the top of either mobile or desktop search results.

    The search community went into mild panic, amplifying the warning beyond its simplistic message. Many neglected to point out that Google gave clear instructions and ample time for webmasters to get their sites in order—it even launched a free mobile-friendly tool that tells you exactly what’s wrong with your site (and supplementary help guides to show you how to fix it).

    mobile friendly tool

    Nifty, right?

    Still, despite ample warning and accompanying community discussion, it took some webmasters by surprise when the update actually did roll out.

    The Effects of Mobilegeddon

    Mobilegeddon was a bit overblown by the community—the very fact that its name signifies some apocalyptic event should have been a dead giveaway that we were exaggerating Google’s intentions. However, there were some major effects from the update, especially after a few weeks of implementation.

    mobile vs non mobile friendly pages

    (Image Source: Wall Street Journal)

    Cumulatively, non-mobile-compliant sites ended up losing more than 12 percent of their total traffic. The majority of non-mobile-compliant sites dropped significantly in rank, while correspondingly, mobile-friendly sites gained in rank… almost as if Google had planned this out.

    The New Version

    Google recently announced that it would be rolling out an update to the original Mobilegeddon algorithm. Due out in May of this year, the update will serve as an expansion and refinement of the original package, much as its Panda and Penguin updates saw multiple versions released in subsequent months and years.

    Currently, we don’t have many details about this new update, but we do know it’s going to “boost” the effects of the original mobile ranking signal. According to the official announcement, if your site is already optimized for mobile devices, there’s no need to make any further changes. This is only going to affect you negatively if, for some reason, you still haven’t optimized your site for mobile devices.

    It’s hard to say exactly how severe this increase will be. The update probably won’t be on the same scale as the original mobile-friendly update, but the effects from the two updates compounded together will be intense for any site still non-mobile-compliant. Since these sites are being penalized currently anyway, it’s unlikely that mobile-compliant sites will see any meaningful gains in rank.

    Optimizing Your Site for Mobile

    If your site isn’t currently optimized for mobile, you’re a few years behind the times, and you’re already suffering a stiff mobile ranking “penalty.” However, there’s still time to get your site in tip-top shape before this new update rolls out. In fact, even if your site is theoretically mobile-compliant, it’s not a bad idea to run a quick audit to see if there’s anything you can clean up.

    Google has a great help section for developers concerned about how to implement mobile-friendly designs and functionality.

    mobile website optimization

    (Image Source: Google)

    As a quick reminder checklist, make sure your site is optimized for mobile in the following ways for all pages of your website:

    • Use a responsive design or host a mobile-only version of your site.
    • Ensure users can read all text without having to scroll or zoom.
    • Ensure all content loads properly on mobile devices.
    • Check for any mobile-specific 404 errors and neutralize them.
    • Check and optimize your speed to ensure the best possible mobile experience.
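    The first two checklist items usually start with the same couple of lines of markup. A minimal sketch (the class names and breakpoint here are arbitrary examples, not anything Google prescribes):

    ```html
    <!-- Tell mobile browsers to render at device width instead of a
         zoomed-out desktop canvas; without this, users must pinch-zoom. -->
    <meta name="viewport" content="width=device-width, initial-scale=1">

    <style>
      /* Fluid base layout: content reflows instead of forcing
         horizontal scrolling on small screens */
      .content { max-width: 100%; padding: 0 1em; }

      /* Example breakpoint: stack the sidebar below the main
         content on narrow screens */
      @media (max-width: 600px) {
        .sidebar { float: none; width: auto; }
      }
    </style>
    ```

    Responsive designs built this way are what Google’s mobile-friendly tool checks for, and they spare you from maintaining a separate mobile-only version of the site.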

    Beyond that, most of your mobile optimization choices are aesthetic or oriented toward user experience, such as further decreasing page loading times or making the design more appealing for the average mobile user.

    Google’s Ongoing Commitment to Mobile

    This obviously isn’t the first mobile update Google’s released, so don’t expect it to be the last. Currently, Google prioritizes great content over even the mobile-friendliness of a site, but as desktop traffic declines further and further, don’t be surprised if non-mobile-compliant sites eventually sink to the absolute bottom of the ranks. In the meantime, Google is stepping up its efforts in supporting app content in its search results, so consider hedging your search visibility bets by investing in a multifaceted mobile presence.

  5. 7 Strategies to Leverage Hummingbird and Related Topics


    Let’s not kid ourselves; Hummingbird is amazing. It’s an algorithm that took Google’s basic keyword-based structure and turned it into something intuitive and more capable of linguistic understanding than most people you’ll ever meet. Now, Google can, for lack of a better phrase, guess what you’re thinking and give you the content that matches your intentions—even if none of your keywords are an exact match for the most relevant results.

    Similarly, RankBrain and other additions have allowed Google to come up with “related questions” and an advanced network of related topics to discern user intent from ambiguous queries, and provide links to helpful related information that similar searchers have required in the past.

    related questions google results

    (Image Source: Moz)

    So how can you take advantage of Hummingbird and related topics in your own content marketing campaign?

    1. Get specific. General topics aren’t going to cut it anymore. The more specific you get with your material, the more likely you’ll be to show up. If a user is searching for general information on a general subject, with a query like “maple trees,” they’re either going to get an immediate Knowledge Graph entry that gives them a breakdown of the subject, or they’ll get referred to a Wikipedia article. On the other hand, extremely specific queries with specific intents will have almost no competition, giving you the advantage when it comes to ranking. Aim for specific topics, and write for specific audiences while you’re at it.
    2. Publish interrelated content features. Don’t post single instances of the topics you’re exploring; instead, develop them into a series of related features. For example, instead of just writing about “How to clean an air conditioner,” write that article and follow it up with, “how to repair an air conditioner that won’t run,” or “how to improve the lifespan of an air conditioner.” All of these questions are related topics, so you’ll stand to gain in two key ways. First, you’ll be seen as a greater authority in this space, and second, you’ll have a higher likelihood of showing up in “related questions” for users interested in these subjects.
    3. Go deeper with your content. This is an easy strategy, but it’s one you should have been doing a long time ago. When taking advantage of Hummingbird, thin content isn’t going to cut it. Hummingbird does a thorough evaluation of the phrases and details within the entire body of your content—the more details you include, and the more subtopics and related ideas you cover, the better the algorithm will be able to “understand” your work. It’s also a best practice for content in general—it makes you stand out from the crowd, gives people more information to peruse, and shows that you’ve done your research thoroughly.
    4. Check out Related Questions. Where better to learn how Google categorizes different topics than on Google itself? Run a sample search for a query related to some of your recent content, and see what pops up in the “related questions” section. Who’s covering those topics now? How are they covering them? Look for any opportunity to cover one of these related topics with your own work in the future, and try to capitalize on any weaknesses you see in the work that currently shows up for these queries.
    5. Forget about keywords (mostly). Keywords aren’t dead—at least not entirely. Even though Google isn’t using keywords on a strict, one-to-one basis, they can be good contextual clues for the subjects of your work. Keep keyword research as an element of your SEO campaign—take a look to see what keywords have the highest volume and the lowest competition rating, and include the most promising candidates throughout your work. However, stay away from picking content topics based solely on your keyword research, and as always, never stuff keywords into your content.
    6. Diversify your vocabulary. With more users relying on casual queries and vocal search, the range of vocabulary in user queries has expanded and become much more conversational. If you want your content to be indexed thoroughly, and for subjects peripherally related to your main targets, you’ll do well to diversify the type of vocabulary you use. Part of that means having a bigger list of potential keywords to target, and part of that means avoiding using the same phrases or terminologies over and over again. Shake things up!
    7. See what your competitors are up to. This is another strategy that’s good to adopt in general, but especially useful in the context of Hummingbird and semantic search. Take a look to see what types of content your competitors are publishing, and which pieces seem to be getting the best results. Are there any related topics that they aren’t taking advantage of, such as follow-up opportunities, alternative positions, or expansions? These could be a good way to get a competitive edge, especially since you already know the root subject has been popular with your shared demographics.

    Google’s search algorithm is now too sophisticated for any kind of measurable, predictable, one-to-one gain. That is to say, you’ll never be able to calculate, on paper, the potential visibility for one of your content ideas. However, by employing these tactics (in addition to standard content and SEO best practices), you’ll stand to benefit more from Google’s semantic understanding and desire to provide users with comprehensive information.

  6. Will Google Start Penalizing Bloggers Who Link to Gifted Products?


    Companies are always looking for legitimate, natural ways to earn more links pointing back to their domains. One backlink from a qualified external source can be a significant boost to your domain authority, helping you rank higher for keywords relevant to your brand, not to mention its potential to send referral traffic your way. As Google cracks down on low-quality and unnatural links, we’ve been left with only a handful of legitimate methods to get the job done.

    One of these reliable methods, sending complimentary or trial products and services, has come under fire recently as Google has made a major change to its stance on the subject.

    The Method

    The method itself is innocent and fairly straightforward. You have a product, or a service, that you want to get more publicity for. You know there are tons of bloggers out there who make a living by reviewing said products and services. As an example, head to any tech site and you’ll see dozens of articles reviewing products:

    review articles on blog

    (Image Source: CNET)

    There’s significant opportunity here. The process goes like this: you make a request to a well-known blogger (the bigger, the better) and offer a complimentary product or a trial of your service in exchange for a write-up on it. Naturally, they’ll post a link pointing back to your domain. The link is important to the review, natural for readers, and valuable for both the blogger and the person receiving the link. Theoretically, it’s a perfect relationship. So what’s the problem?

    Google’s Latest Reaction

    Earlier in March, Google made reference to this practice, identifying it as an opportunity for unnatural links to develop. Google warns that such an exchange is not conducive to a healthy or valuable network of online resources for users, and cautions bloggers to engage in the following best practices:

    • Use nofollow links. Google cautions bloggers to use “nofollow” tags for any link pointing to a company’s website, social media accounts, or apps—pretty much anything that could pass any kind of domain authority. Nofollow tags immediately remove these links from Google’s consideration, rendering them completely ineffective for SEO purposes (despite retaining the value for referral traffic).
    • Disclose the relationship. This one makes more logical sense, and bloggers should have been doing it from the beginning. Whenever a company has given you a product for free, or has otherwise compensated or encouraged you to post a review, it’s a journalistic expectation that you disclose the relationship. You’ll naturally be more biased in your writing, and readers deserve to know about any preexisting relationship before they weigh your review.
    • Provide unique, compelling content. This one should be obvious, but Google wants to clarify that any product review should be a piece that’s wholly original (if not exclusive), and actually important to your users. If it’s just a duplication of something 100 other bloggers have published, it won’t be considered a “good” piece of content.
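    To make the first point concrete, here is a minimal sketch of how a blogger might automatically nofollow only the sponsor’s links in a post’s HTML. The domain, markup, and regex-based approach are all illustrative assumptions; for anything beyond a quick script, a real HTML parser is a safer choice than regular expressions.

```python
import re

def nofollow_sponsor_links(html: str, sponsor_domain: str) -> str:
    """Add rel="nofollow" to any <a> tag whose href points at the
    sponsor's domain, leaving all other links untouched."""
    pattern = re.compile(
        # Match <a ...> tags that have no rel attribute yet and whose
        # href points at the sponsor's domain (www. optional).
        r'<a\s+(?![^>]*\brel=)([^>]*href="https?://(?:www\.)?'
        + re.escape(sponsor_domain)
        + r'[^"]*"[^>]*)>',
        re.IGNORECASE,
    )
    return pattern.sub(r'<a rel="nofollow" \1>', html)

# Hypothetical review snippet: one sponsor link, one editorial link.
post = '<p>Thanks to <a href="https://example-sponsor.com">Acme</a> '
post += 'for the review unit. See also <a href="https://cnet.com">CNET</a>.</p>'
print(nofollow_sponsor_links(post, "example-sponsor.com"))
```

    Only the sponsor’s link picks up the nofollow attribute; the editorial link to CNET keeps passing authority as usual.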

    Are These Links Unnatural?

    Taking a look at the first piece of Google’s advice, we can infer that Google views these product review links as unnatural, much like a stuffed link in a blog comment or forum post. In my opinion, comparing these two links is a little strange. Google’s argument is that the link wouldn’t exist if the company weren’t bribing the product reviewer with a free product; however, this doesn’t seem to hold in cases where reviewers review paid-for products. Imagine a scenario where a tech reviewer was planning on purchasing a new phone to review, but the producer comped the device. Is that link unnatural? Since it would exist in either case, the answer is no.

    Of course, I get what Google is driving at—if a company uses free things as a bribe to get a free link, that link definitely is unnatural. But the line is blurry, and to instantly mandate that all product review links be nofollow links seems a little extreme.

    What Are the Risks?

    As for the second two points of Google’s advice—disclosing your relationship with the company and creating unique, compelling content—you should be doing these, no matter what. They’re easy to accomplish and can only reward you. Don’t worry about the consequences of not doing them, and instead worry about the benefits of actually doing them.

    As for the first point, and my point of contention, it seems unlikely that Google’s algorithm is sophisticated enough to discern when a blogger’s review is the product of a free gift, and then pick out which links are and aren’t tagged with nofollow. Accordingly, I must conclude that it’s highly unlikely that any bloggers will be formally penalized for neglecting these nofollow tags (unless they’re engaging in egregiously spammy behavior). I’m not saying to ignore Google’s advice here, but I don’t think there will be stiff penalties for continuing to pursue and post backlinks in product reviews.

    Even though Google’s warning comes without a significant threat of penalty, it may be wise to heed its advice at this juncture. Remember, even nofollow links are inherently valuable—they’ll earn you referral traffic proportional to your audience size—and brand visibility and reputation are always good areas to improve. In short, even if you’re only getting nofollow links out of the deal, it’s probably still a valuable investment to distribute free samples and trials for the extra visibility—as long as you’re working with the right bloggers.

  7. Why Are Local Business Cards Showing Up in Search Results?

    Leave a Comment

    Google never stops coming up with new improvements, but its latest addition to local search is somewhat surprising. Just last year, Google revolutionized local search by streamlining the “3-pack” format across both desktop and mobile devices, and now it’s tinkering with new functions, such as purchasing plane tickets and discovering tourist attractions in new cities.

    So what’s the latest and greatest that Google has to offer?

    Local Business Cards

    Try not to take the title too literally; Google isn’t pulling in business card information from local realtors and professional networkers. Instead, there’s a new kind of image- and information-based carousel that provides searchers with a more in-depth look at local businesses.

    Local Business Cards

    (Image Source: SearchEngineLand)

    Notice the carousel of small information cards just below the traditional link, and to the left of the Knowledge Graph box. Each of these cards is filled with company-submitted information, including still images, animations/videos, copy, and links. The carousel appears to update over time, much the way Twitter feeds once did, and it’s been confirmed that Google is not drawing this information from Google My Business, social feeds, or anywhere else. The feature works on both mobile and desktop devices, though it was designed to appeal specifically to mobile users. In the mobile format, these “business cards” appear under the Knowledge Graph box, rather than to the left of it. Users may also “share” the content available in these cards.

    It should be noted that this functionality is currently a test, possibly stemming from a similar feature, Candidate Cards, which rolled out just a few weeks ago. However, if the test goes well, it’s likely that Google will implement this on a full scale.

    Unanswered Questions

    The big unanswered question for this feature is whether or not Google will decide to roll it out to all local businesses. But let’s assume for a minute that it will.

    • Will this be available to national or large enterprises? This will likely apply to the independently owned local coffee shop, but what about the local Starbucks across the street from it? Both companies will have Knowledge Graph boxes and will appear somewhat equally in other areas of search visibility, but it seems unfair and maybe even a little inappropriate to give national franchises or large corporations access to even more visibility.
    • What limitations will there be on provided content? Google’s candidate cards struck up some major controversy because of the way candidates were using them. While some candidates used them as intended, as simple messages or arguments during a debate, others used them as an advertising platform, requesting direct donations from users. This is significantly more complex in the political realm, but what would happen if local businesses were given free search advertising features like this? Would our search results start becoming littered with these ads? This leads me to my next question.
    • Could this be a paid feature? Though candidate cards were rolled out for free and for users’ best interests, it’s not unthinkable to imagine these business cards as a paid advertising feature. They’re displayed prominently, there’s a lot of room for creativity and flexibility, and it gives brands a guaranteed number of impressions. My gut feeling on this is that this won’t turn into a paid feature—somehow I can’t imagine Google striving to give users more information like this, only to squeeze a profit out of local business owners to get the job done.
    • How can this integrate socially? Right now, the share button offers integration with Facebook, Twitter, Google+, and email, but what other forms of social integration might Google be able to offer? Will brands be able to tie their own social media activity into business cards in a reverse of the process?
    • Will this only apply to branded searches? Currently, business cards are only known to appear for branded searches of the companies they’re appearing for. As this is a highly limited test, it’s uncertain whether this limitation would remain in the future. If these business cards can be used for direct advertisements, it seems unlikely that Google would allow them for non-branded searches (at least not for free), but Google has surprised us many times before.

    Clearly, there’s a lot of potential for a feature like this, but a lot of difficult problems to work out. The big test will be to see how users respond to this initial rollout; as long as reactions are favorable, the rest of the problems may work themselves out naturally.

    Predictions and Conclusions

    It’s too early to say definitively whether Google will roll out this feature to all local businesses (or which businesses will “count” as local), but if I were a betting man, I’d say it’s fairly likely. Given that candidate cards have been deemed successful already, and this is basically just an extension of that, combined with the fact that Google has been consistently tinkering with local search results for years, it seems reasonable to suspect that this update will make the final cut.

    When it does, I encourage you to be ready to take full advantage of it. Have images and videos ready for your pop-out business cards, and ideas for both commercial (such as product ads) and non-commercial (such as informative content) applications. Don’t be surprised if this turns into a paid feature, but given Google’s long history of supporting local businesses, don’t expect it.

  8. How Far Will Google’s “Right to Be Forgotten” Plan Go?

    Leave a Comment

    Google has been under a ton of pressure, both in the European Union and in the United States, to do more for user privacy. One of the biggest changes pushed by the EU back in 2014, the “Right to Be Forgotten” act, basically mandated that Google remove any old or irrelevant links that individual users request for removal (pending approval). Now, it appears that the Right to Be Forgotten rules are growing larger and more complex, and Google is taking an active role in their development.

    What does this mean for Google? What effects will it have for the future of search?

    The Right to Be Forgotten Act

    The specifics of the Right to Be Forgotten Act are more complicated than my initial, brief explanation, and it’s important to understand them before I dig any deeper into more recent developments. The original EU guidelines give all EU citizens the right to request the removal of a link from Google’s massive web index. This functions similarly to Google’s DMCA requests, which copyright holders use to request the removal of infringing material.

    right to be forgotten

    (Image Source: SearchEngineLand)

    However, there is no guarantee that Google will honor this request if the link is deemed relevant to search user interests. The only links required to be removed are ones that are out of date or no longer relevant to the public interest, which is a painfully ambiguous term. As a quick example, let’s say when you were a kid, you started up a personal blog in which you complained about school and expressed your angst with reckless abandon. Somehow, you found yourself unable to take this site down, and now, 20 years later, that blog is still showing for anybody who searches your name. Since this information is old, and isn’t inherently tied to the public interest, it would seem fair for Google to remove the link from its index to reduce its visibility.

    If Google rejects the request, it can be appealed. If the link is removed, the content is still available online—it’s just significantly more difficult to find. It may also appear on another site, complicating the process. Right to Be Forgotten isn’t a perfect system, but it’s what we have.

    Various alternative forms of “right to be forgotten” are starting to emerge in the United States, as well. For example, California recently passed a law known as the “eraser button,” which demands similar functionality (going a step further to demand the full removal of certain content from the web), and Illinois and New Jersey are working on similar laws. A federal version of the legislation is also underway.

    This falls in line with the general attitude of the times: a demand for greater responsibility from tech giants. Google is also under heavy scrutiny for alleged antitrust violations, originally in the EU exclusively, and now in the United States as well.

    The Latest Developments

    Back in 2014, Google was resistant to Right to Be Forgotten legislation, claiming it to be a form of censoring the Internet and a violation of user rights. Now, Google is inching closer to a more comprehensive application of those laws.

    Under the old policy, “forgotten” links would only be removed from country-specific versions of the search engine. A user with the right incentive could simply access the United States’ version of Google to find the links that had been removed. Now, Google has implemented functionality to close this loophole; all link removal requests are applied based on the searcher’s location, so a user in the UK will not be able to find forgotten links, no matter which version of the search engine they use. Speculation suggests that Google made this change under pressure from European authorities.

    Where Does It Go From Here?

    Frankly, the speculation rings of truth to me. I suspect that Google won’t take any moves to remove content from users’ reach until it is forced to (or pressured to) by international government bodies. This move, while small, is a concession the search giant is willing to give in order to remain in good standing on the international scene.

    The company isn’t known for buckling in response to requests; for example, when Spain passed a law that would require the company to pay a kind of tax to writers of articles that showed up in Google News, Google responded by pulling Google News from Spain entirely.

    Google News

    (Image Source: Ars Technica)

    With Google more or less tacitly accepting these new demands for indexation rules, does this mean that Google is liable to respond passively to such requests in the future? This remains to be seen. It depends on how much pressure is put on the company by international organizations, and how important the issue is to Google. For example, removing a link to a half-assed personal blog from 20 years ago doesn’t carry the same consequences as censoring information available to an entire country about that country’s government.

    My guess is that lawmakers in the United States and overseas will gradually introduce new, better regulations to encourage user privacy, and Google, as long as these demands are reasonable, will comply. Overall, this will have a minimal effect on the way we use search engines, but it shows that we’re entering an era of greater responsibility and accountability for tech giants.

  9. How Google’s Candidate Cards Turned Into a Travesty

    Leave a Comment

    Google is never short on ideas for how to improve its search system and break ground in new areas of user satisfaction. Sometimes, these ideas are large and revolutionary, like its embedding of maps into local searches. Its Knowledge Graph, a system of direct information provision (forgoing the need to peruse traditional search entries), is one of the most robust and useful additions in recent years, and it keeps evolving in new, unique ways.

    Rather than solely providing word definitions, or numerical unit conversions, or even filmography information, the Knowledge Graph can provide unique insights on news and upcoming events. Take its entry on the Super Bowl, for example (keep in mind this was written just before the actual Super Bowl):

    Super Bowl Keyword Search Results

    Presumably, this entry will self-update as teams score throughout the evening, and in the next week, will instead offer retrospective analysis of what is currently a forthcoming event. As a user, this doesn’t leave much to be desired; I can even scroll down to find additional results.

    But a recent feature of the Google Knowledge Graph has made a much bigger impact, and reveals one of the biggest current flaws of the direct-information model. It’s all about Google’s portrayal of candidates in what has undoubtedly been one of the most eventful, peculiar election seasons of the past few decades.

    Candidate Cards

    Google’s politics-centric new feature, candidate cards, has begun the same way all its features begin: as an experiment. Accordingly, let’s refrain from judging this system too harshly.

    The idea was to give the American public more transparency on their leading presidential candidates—which sounds great in theory. Google’s idea was to give each significant candidate a designated position in their search results for certain queries. These “candidate” cards would appear in a carousel to potential voters, giving them a direct line of insight into the candidates’ actions and positions. This feature was rolled out as a trial for the recent “undercard” Republican debate, along with YouTube integration and live tracking via Google Trends.

    Google Candidate Cards Mobile View

    (Image Source: Google blog)

    Here’s the issue: if you followed along with this experiment during the actual debate, you wouldn’t have seen multiple candidates’ positions. You would only have seen one, at least for the bulk of the time and for most queries.

    As SearchEngineLand’s Danny Sullivan noted in a blog post on the issue, the carousel of cards that appeared, for practically any search, only showed posts and positions by one candidate: Carly Fiorina.

    gop debate serp

    A handful of general searches like “gop debate” or even just “debate” returned the same carousel. Likewise for any undercard candidate-specific searches, such as “Huckabee” or “Santorum.” At first glance, you would assume that this is some type of error with Google’s system, that somehow these posts were “stuck” as the top results for any query that tapped into the feature. Could this mean that Google was unfairly favoring one candidate over the others?

    Google would later confirm that nothing was wrong with the feature. Each candidate had the same ability to upload information to this system; Fiorina was the only candidate who made use of the system, and therefore had substantial ground to gain.

    Main Candidate Cards

    Candidate cards for the main GOP candidates appeared not long after the undercard debate ended, including Donald Trump, who was absent from the “main” debate. Take a quick look at these and take note of anything peculiar that stands out:

    GOP Debate

    (Image Source: SearchEngineLand)

    Look at the center post, which features a link to donate $5 to Marco Rubio’s campaign, and consider the nature of the query: 2016 Republican debate. If you’re like me, this raises some questions about the card system and whether it goes “too far” for search results.

    Three Major Concerns

    I don’t care who you support, which party you belong to, or what you think about this election. For the purpose of this article, I’m assuming every candidate on both ends of the political spectrum is equally unqualified to lead the country, and so should you. Remove your biases and consider the following dilemmas this system presents, for any candidate in the running:

    1. Free Advertising. There are strict rules about political advertising, which go into exhaustive detail that I won’t attempt to reproduce here. It seems that Google’s card system can be taken advantage of as a free, unrestricted place to advertise, whether it’s through the request for campaign donations or an attack on another candidate.
    2. SEO as a Political Skill. Take Fiorina’s case; should she be rewarded with extra publicity because of what basically comes down to SEO skills? This seems strange at first, until you realize this is mostly the case anyway—you can bet each candidate has a dedicated contact responsible for making sure they rank highly for public searches (not to mention the presence and effects of social media marketing in political elections).
    3. Biased Media Control. Last, and perhaps most importantly, should Google be allowed to control the parameters by which we view candidate information? The possibility of Google filtering out one candidate’s cards is concerning; then again, it’s nothing entirely new—Google’s stranglehold on search results is currently being investigated as a violation of antitrust laws in Europe.

    What does the candidate card system say about Google? What does it mean for the political system? Is it a useful tool that needs refinement or a total travesty that should be scrapped? I’m not quite sure myself, but you can be sure this experiment didn’t quite go the way Google originally intended. Keep your eyes peeled for how this feature develops—it could have a massive impact on how this and even future elections pan out. In the meantime, you’d better hope your favorite candidate is as skilled at SEO as you are.

  10. Is Google Using Templates to Rank Sites?

    Leave a Comment

    Google is a machine. It’s an incredibly sophisticated machine, with some of the most advanced artificial intelligence and learning algorithms available to a public audience, but it’s a machine nonetheless. When we talk about the advanced semantic understanding of Hummingbird, Google’s goal of understanding user intent, and the iterative learning and improvement processes of an update like RankBrain, it’s easy to think of Google as almost human, judging sites qualitatively the way a college professor might grade a paper. But at the end of the day, it’s still using analytical structures to rank sites for various queries.

    According to recent evidence reported by Aaron Friedman on the Moz blog, Google may actually be using pre-constructed templates, or models for various industries, to determine where and how your company ranks for branded searches.

    The Problem With Templates

    Over the years, Google’s algorithm has evolved from being very mechanical (ranking sites so predictably that spammers could easily manipulate the system) to being more qualitative (subverting the attempts of rank manipulators). For the noble, modern-day optimizer, SEO works like a job interview or an election: if you put the work in, improve yourself to the fullest, and adhere to all the basic expectations, you should be considered the best candidate for the job. It would seem unfair for other candidates to be selected over you because they had a certain height, or stature, or socioeconomic background.

    Google using templates seems like an unfair breach of the search world we’ve been conditioned to anticipate (and let’s face it, we’re spoiled with this system anyway). Rather than purely judging sites based on quality or relevance, Google occasionally calls in pre-programmed patterns to map out the results. The questions you must be asking at this point are why? And how can we avoid it?

    The Problem With User Intent

    Let’s start with the fundamental problem that leads to templates seeming “wrong” or “unfair.” It all comes down to user intent. If a user searches for something like “emergency pet care near me,” it’s easy for Google to figure out that this user needs an animal hospital nearby, presumably for an injured or sick pet. Google doesn’t need a template to fall back on here, since the intent is decipherable and the relevance of possible pages can be easily evaluated.

    Instead, let’s look at a more ambiguous, and much more common, type of query, like “Starbucks.” What is the user intent here? Does this user want to find a nearby Starbucks? Invest in Starbucks? Uncover Starbucks’s corporate history? Read news about the Starbucks CEO? There are too many potential meanings here to guess, so Google must have a safety net to deal with such queries—templates.

    Industry-Focused Templates

    According to Friedman’s data, there are some clear patterns established for ambiguous branded searches like these—and they seem to be segregated along industry lines. For example, among hedge funds, the vast majority—74 percent—have their company homepage listed as a top result, 72 percent get Knowledge Graph entries, and most of those with Wikipedia entries see their wiki page ranking around position 4.5 on average.

    On the other hand, among pharmaceutical companies, ambiguous branded searches return page one results that are only 20 percent corporate, compared to 37 percent for telecommunications companies. Engineering companies have far fewer media results than comparable industries, and food/drug stores rarely return stock quotes.

    All of this seems like random, peculiar bits of information, but it’s important to note the underlying commonality here: Google uses specific ranking templates to help guide its allocation of results for intent-ambiguous queries.  
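    As a purely hypothetical illustration (Google has never published such a mechanism), an industry template can be pictured as an ordered layout of result types that fills in when a branded query’s intent is ambiguous. The industry names and slot orders below are invented for the sketch, loosely echoing the kinds of patterns Friedman observed:

```python
# Hypothetical sketch: a "template" as an ordered layout of result
# types per industry. Names, slots, and orderings are illustrative only.
INDUSTRY_TEMPLATES = {
    "hedge_fund": ["homepage", "knowledge_graph", "news", "wikipedia", "social"],
    "pharmaceutical": ["news", "media", "homepage", "wikipedia", "stock_quote"],
    "food_drug_store": ["homepage", "local_listing", "wikipedia", "news", "social"],
}

GENERIC_LAYOUT = ["homepage", "news", "social", "wikipedia", "media"]

def layout_for_query(brand_industry: str) -> list:
    """Return the result-type slots a template would reserve for an
    ambiguous branded query, falling back to a generic layout."""
    return INDUSTRY_TEMPLATES.get(brand_industry, GENERIC_LAYOUT)

print(layout_for_query("hedge_fund"))
```

    Under this toy model, a brand’s actual pages would then compete on authority to fill each reserved slot, which is why the percentages in Friedman’s data fall short of 100.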

    The Balance of Authority and Relevance

    You may be asking yourself, does this mean I don’t have to worry about optimizing my home page, since it’s going to rank no matter what? The short answer is no. Remember, these results are aggregated from 100 or more different companies in each industry; they aren’t a guarantee for every company involved.

    Google must always strike a balance between the relevance of a page (how appropriate it is for a user’s intent) and its authority (how valuable a source it is, in general). These templates are a way of discovering relevance when no other contextual clues are given. The authority of a page must still be taken into consideration, which means if your homepage isn’t optimized, or if you don’t have a homepage at all, you could end up with no corporate page one rankings at all.
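    One way to picture that balance is a toy score in which the template supplies a relevance prior for each result type and every candidate page contributes its own authority. Both the numbers and the multiplicative form are assumptions for illustration, not Google’s actual formula:

```python
# Toy model (an assumption, not Google's real scoring): template
# relevance prior weighted by the page's own authority.
def score(template_prior: float, authority: float) -> float:
    return template_prior * authority

pages = {
    # High template prior, but a weak, unoptimized page...
    "unoptimized_homepage": score(template_prior=0.9, authority=0.2),
    # ...can lose to a lower-priority slot filled by a strong page.
    "strong_wikipedia_page": score(template_prior=0.6, authority=0.8),
}

print(max(pages, key=pages.get))  # prints "strong_wikipedia_page"
```

    The takeaway matches the paragraph above: the template reserves the slot, but authority decides whether your page actually claims it.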

    What Does This Mean?

    When it comes to optimizing pages for specific keyword phrases, long-tail phrases, and specific user intents, nothing should change. This is only about ambiguous branded queries and how Google tends to map them differently for different industries. Knowing what you now know, you can spin this to your advantage by optimizing the company presence you know to be most valuable.

    For example, if your industry tends to have Wikipedia pages for companies ranking higher than social media content (like telecommunications companies or food/drug stores), make sure your Wikipedia page stays accurate and up-to-date. If media tends to be a popular result (as with pharmaceutical companies), work on publishing more images and videos. Think of it as a kind of reverse-optimization: instead of changing something so that it ranks higher, you’re changing something that already ranks high to give your brand a better reputation. Learn your industry well, and give your users the best information you can using the templates Google has already created.
