AudienceBloom


Category Archive: Google

  1. Is Google Using Information Accuracy to Rank Sites?


    It’s well known that Google puts its users first when it comes to calculating search ranks. Domain authority, a measure of a site’s merit or trustworthiness, is one of the most important factors for determining where a specific page within a domain will rank. For example, if a user searches for “shaving cream,” a site with a “shaving cream” page and a high domain authority will rank higher than a similar site with a low domain authority.

    The factors responsible for forming a site’s domain authority are somewhat mysterious. Search marketers have uncovered many of these factors, either through official Google releases or through trial-and-error based experiments with Google’s search directly. For example, we know that the number and quality of backlinks pointing to a domain factor into how authoritative that domain is perceived to be. But new information suggests that Google is attempting to find new, better ways to calculate a site’s authority, including using the accuracy of information found on the domain.

    What the Research Shows


    New Scientist recently revealed that the search engine giant was starting to push the idea of Knowledge-Based Trust (KBT), which is a proprietary method of calculating a page’s authority based on how accurate the information found on it is. Rather than exploring the backlink profile of a site, this algorithm would focus on “endogenous signals,” which would determine the correctness of various facts listed on the page.

    To determine the correctness of this material, Google would compare snippets of information found on the page to similar snippets of information it already has compiled on the Knowledge Graph. In case you weren’t aware, the Google Knowledge Graph is a compendium of verified information pulled from various authoritative sources on the web and reviewed manually for accuracy. You can see information from the Knowledge Graph in its current state by searching for movies, actors, politicians, or other famous subjects—it’s presented in an organized box on the right-hand side.
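The fact-comparison idea described above can be illustrated with a toy sketch: extracted (subject, attribute, value) snippets from a page are checked against a trusted reference store. The data and scoring here are entirely hypothetical — Google's actual KBT method is proprietary — but the sketch shows the general shape of the approach.

```python
# Hypothetical reference store, standing in for the Knowledge Graph.
knowledge_graph = {
    ("The Wizard of Oz", "release_year"): "1939",
    ("The Wizard of Oz", "director"): "Victor Fleming",
}

# Facts extracted from a page; the second one is deliberately wrong.
page_facts = [
    ("The Wizard of Oz", "release_year", "1939"),
    ("The Wizard of Oz", "director", "Steven Spielberg"),
]

def accuracy_score(facts, reference):
    # Score only the facts the reference store can actually verify.
    checkable = [(s, a, v) for s, a, v in facts if (s, a) in reference]
    if not checkable:
        return None  # nothing on this page can be verified
    correct = sum(1 for s, a, v in checkable if reference[(s, a)] == v)
    return correct / len(checkable)

print(accuracy_score(page_facts, knowledge_graph))  # -> 0.5
```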

    According to Google’s recent report on the matter, the KBT algorithm has already been used on 2.8 billion facts and snippets taken from the web. In what was considered to be a successful test, Google verified the accuracy of 119 million web pages and 5.6 million websites. While it’s difficult to say how accurate the algorithm was in determining that accuracy, it seems to be a good start for the technology.

    How Will This Affect Existing Websites?

Assuming the KBT algorithm one day goes live, Google researchers have suggested that it will only serve as a complement to existing authority-determining factors like backlink profile analysis, rather than a replacement for them. Should the algorithm take effect, there will likely be significant volatility in search ranks. Depending on how accurate your site’s information is compared to Google’s Knowledge Graph, you could suddenly move up or down in the rankings.

    However, because Google’s algorithm already detects the quality of your writing and the strength of your content, authority is already based on accuracy by proxy. If you’ve remained committed to posting only the best, most helpful information you can to your users, chances are the release of this new algorithm will not significantly lower your rank.

Also consider the fact that KBT is mostly based on information housed in the Google Knowledge Graph. Currently, the Knowledge Graph only contains very specific types of information; for example, novels, celebrities, cities, and historical events are all categorized and indexed in a consistent format, but more complex information like “how to change a tire” is harder to categorize, and likely will not be indexed in the early stages of Google’s more advanced information-processing products.

    When the Change Could Take Effect

Google is nothing if not meticulous. Before the company integrates its KBT algorithm into its existing search algorithm, it’s going to want to be sure of the technology’s effectiveness, and that means months of rigorous testing. Early signs appear to validate the algorithm’s effectiveness, but Google’s development team will likely want to refine its approach before debuting it to the general public.

That being said, Google is constantly pushing new updates and the best possible search functionality for its billions of global users. Since the KBT algorithm is largely based on the quality of the Knowledge Graph, and the Knowledge Graph has been in constant refinement since 2012, Google may be willing to make an early push. Google’s updates typically come as a surprise even to search marketers in the know, so KBT will probably be rolled out when we least expect it.

    How to Prepare


For now, don’t worry too much about KBT. It’s still in a testing phase, and by the time it rolls out it will probably be refined to the point where it only minimally affects the search landscape. What you can and should focus on in the meantime is the quality of your content. Start double-checking the facts and figures in all the posts you syndicate, and implement a review process that formalizes fact-checking for all new work that ends up on your site. This will help your site become more KBT-friendly, but that’s a far-off concern. Your immediate priority should be ensuring that your users get the most accurate, most valuable information possible. Like Google, you must always put your users before anything else.

  2. How to Get Around the Google Knowledge Graph


    The Google Knowledge Graph is an impressive and relatively new feature, but it has many search marketers fearing for the long-term relevance of their jobs. In case you weren’t aware, the Knowledge Graph refers to a collection of information that Google uses to display concise answers to users with specific queries. For example, if a user searches for a specific movie, like the Wizard of Oz, Google will display a prominent box of information off to the right of its typical link-based search results. This box will display significant information about the user’s query, in this case including the year of initial release, the director, and main actors associated with the movie.

    The Knowledge Graph isn’t limited to just movies, however, and it’s gradually expanding to consume more and more types of information. While this growth is both useful from a user perspective and fascinating from a human perspective, the ramifications it has for SEO are somewhat troubling. Fortunately, there are a handful of strategies you can start implementing to avoid losing out to Knowledge Graph traffic in the long run.

    How the Knowledge Graph Is Changing Search

The traditional format of search results is what drove the value of an SEO campaign. Results merely listed a series of links to relevant pages, and almost inevitably, a user would click on at least one of those links. If you could get your link to the top, you would receive the greatest number of those clicks.

The Knowledge Graph is changing search because it’s reducing one critical variable in that equation: the number of people clicking on search links. Let’s say a user searched for the Wizard of Oz in the old format, looking for basic information on the movie. That user would be forced to click on a link to find that information. Today, with the Knowledge Graph, that information is immediately available, eliminating the need to do any clicking.

As a result, the amount of web traffic you can theoretically get from queries that populate a Knowledge Graph entry is significantly reduced. For example, if you rank at the number one position for “Wizard of Oz,” you could see your traffic reduced by half or more because your potential visitors would no longer have a reason to click into your site.

    There are three main strategies you can use to avoid letting the Knowledge Graph throttle your traffic.

    Option 1: Write More Niche Topics


    At least for the time being, the Knowledge Graph only collects information on broad, general topics. It can’t give you detailed steps on how to install a ceiling fan, but it can tell you when President Obama was born. Theoretically, if you don’t waste any time ranking for Knowledge Graph topics, you won’t lose any value.

    Instead, focus your content and SEO strategy on more niche topics, and the more specific you can get the better. How-to and tutorial articles are some of the best options you have, so take advantage of them. Long-tail search queries looking for this type of information don’t see as much search volume as simpler, broader queries, but because the Knowledge Graph will be encroaching on that territory, they might end up seeing just as much traffic. Plus, you’ll enjoy the benefits of lower competition levels, allowing you to rank faster for relevant queries.

    Option 2: Hedge Your Bets With Other Strategies


    SEO isn’t the only inbound marketing channel around. Capitalize on some of the other communication and discovery channels that lead people to information on the web. For example, to compensate for lower levels of search traffic, you could bolster your social media strategy and increase your following.

    You could also step up your offsite presence in the form of guest posting or social bookmarking. By leaving traces of your brand or your site behind on pieces of valuable content on external sources, you can capitalize on a significant new stream of traffic.

    By using these strategies, you don’t have to abandon SEO altogether. In fact, stepping up your social and offsite posting strategies can improve your SEO position. Instead, treat them as a way of hedging your bets just in case your search traffic takes a hit.

    Option 3: Play Nice With the Knowledge Graph

    As the saying goes, if you can’t beat them, join them. Another way to beat the Knowledge Graph is to get your content featured in it. To populate its Knowledge Graph entries, Google scours the web for information, looking for microformatting on sites and pages with extremely high authority, like Wikipedia articles. If you want your content to be seen and found by the Knowledge Graph, mark up your content using Schema microformatting and consider creating Wikipedia and similar entries on topics important to your brand.
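To make the markup step concrete, here is a minimal sketch of Schema.org structured data expressed as JSON-LD, built in Python for clarity. The brand name, URLs, and social profile are hypothetical placeholders; the output would be embedded in a page inside a `<script type="application/ld+json">` tag.

```python
import json

def build_org_markup(name, url, logo_url, social_profiles):
    # Construct a Schema.org Organization entity as a JSON-LD dictionary.
    return {
        "@context": "https://schema.org",
        "@type": "Organization",
        "name": name,
        "url": url,
        "logo": logo_url,
        # "sameAs" links tie your brand to known profiles (e.g. Wikipedia,
        # social accounts), which helps connect entities across the web.
        "sameAs": social_profiles,
    }

markup = build_org_markup(
    "Example Brand",
    "https://www.example.com",
    "https://www.example.com/logo.png",
    ["https://twitter.com/examplebrand"],
)

print(json.dumps(markup, indent=2))
```

The same pattern applies to other Schema.org types (Article, Product, Event); the point is simply to present your key facts in a machine-readable structure Google can parse.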

    While the Knowledge Graph is certainly changing how the world views and uses search, companies generally don’t have to be overly worried about losing significant traffic—at least not yet. In the future, Google could theoretically work to consolidate all the web’s information, completely eliminating the need for individual sites and the possibility for onsite conversions. That’s an extremist view, but it is likely that the Knowledge Graph will continue to rise in prominence. In the meantime, find some alternative strategies to prevent yourself from losing traffic to the information repository, and remember that your users should be your main priority.

  3. Is Google Easing up on the War Against Link Building?


    The war against link building has been going on for years now. Starting in the mid-to-late 2000s, Google began an initiative, cracking down on shady link building schemes around the web in any way that it could. Back then, backlink building required no tact—it was just a matter of quantity, and the more links you had pointing back to you the better. People would readily buy up backlinks or build them using questionable practices like article directories or link farms, and user experience suffered.

    Google started clearing up the spam by de-indexing or manually penalizing sites that existed solely to help people build backlinks. A few years later, in 2012, it released the Penguin update, a massive algorithm change that rewarded high-quality links and penalized any that appeared to be built solely for the purpose of passing page rank. Since then, Google has continued to make it abundantly clear that anyone caught buying or selling links, along with anyone who built links using questionable practices, would face the search engine giant’s wrath.

    Now, that storm appears to be calming, and it could mean that Google’s war against link building is starting to subside.

    The Slowing Momentum of Penguin

First, we have to look at the Penguin update as it exists today. When it first debuted in 2012, it was a massive game-changer, sending webmasters scrambling to pick up the pieces of their lost campaigns. Anyone hit by the update had to remove the offending links in short order, and anyone not hit had to revise their strategies to ensure they remained in compliance.

    A new version of the update, Penguin 2.0, came a year later, but had a significantly lessened impact. It refined a few processes and added some more criteria to how Google evaluated the quality of links, but beyond that it was a straightforward data refresh.

    In 2014, the SEO community expected a similar update, informally known as Penguin 3.0, but the update was delayed until much later in the year. When the update finally did arrive, it appeared as though the changes were even less significant, making little to no waves in the SEO community at large.

    It could be argued that the slowed momentum of Penguin is due to the fact that Penguin is still doing its job; bad link builders are punished and good link builders are rewarded. However, it could also be an indication that Google is starting to lighten up when it comes to penalizing link builders. It recognizes that millions of sites rely on link building to gain authority, and furthering the struggle against them isn’t worth the effort.

    The Emergence of Link Buying Ads


    According to a recent post by Rand Fishkin of Moz, Google AdWords has apparently removed its ban on advertising from link building companies. Previously, any advertisements that explicitly mentioned the buying or selling of backlinks for the purposes of increasing Google rank were explicitly banned in Google AdWords. Now, a quick search for “link building” or “buy links” reveals several top ads for link building companies.

Keep in mind that Google’s official policy still forbids the buying and selling of links for the purpose of passing page rank. Allowing advertisements for companies that shamelessly violate that policy is a seemingly contradictory decision. It could be a further indication that Google is starting to realize that no matter how hard it tries, it will never be able to win the war against link builders. If they’re going to exist anyway, Google might as well stand to make a little money off of them through advertising in the process. But does this mean that Google is implicitly agreeing that link building is a necessary strategy to increase rank, or that it accepts the process?

    “I’d Avoid Link Building in General”


Google’s own John Mueller recently gave his opinion on the matter. During a live Hangout, a user asked the question “is link building in any way good?”, directly calling the matter to Mueller’s attention. Mueller responded, “in general, I’d try to avoid that,” then elaborated that pursuing a link building strategy would ultimately cause more problems for your site than it would solve.

    Mueller reinforced the accepted truth that links are still an important part of the Google ranking algorithm, but there are so many other factors that link building should never be your top priority. If you try too hard to build links, you’ll ultimately end up hurting your domain authority, instantly ruining any of the benefits you may have picked up along the way.

    This little discussion makes it clear that even though Penguin is losing momentum and Google AdWords now allows link buying companies to place advertisements, Google is firm on its position that buying or artificially building links is a bad idea.

    The Bottom Line

    Google isn’t fighting the war against link building as hard as it used to. However, that doesn’t mean that excessive link building is suddenly okay. The search engine’s policy on buying or improperly building links is still intact, and some of its highest ranking officials are explicitly warning against it. If you want to build links to increase the authority of your site, the best way to do it is by writing or posting great content, and making it easy for your users to share that content and link back to you. This strategy generates hundreds to thousands of links, but doesn’t carry the risk of a penalty since it constitutes a form of natural link building. Otherwise, build your links on relevant sources in context with the conversation, and never resort to buying links directly or spamming users.

  4. Why Very Long Articles Can Hurt Rankings and Engagement


You’ve heard it before: content is king, and the more content you have, the better.

    Be careful how you interpret this. More content is almost certainly better—having more content means you’ll have more indexed pages in Google, more eyes on your work, and more opportunities to convert your readers into customers. But that “more” word is ambiguous and tricky. More content can only be better for your business if the level of quality remains the same, and if the quality drops, more content can actually be a bad thing.

Take, for instance, the very long article—several thousand words or more. It’s true that such articles have ample body, giving Google lots of text to scan and your users lots of information, but they can actually hurt your search rankings and user engagement strategy.

    The Page Problem


The first problem with long articles is what they take away from a short- to medium-length article strategy: page space. When Google scours the web for information about websites, it looks at site maps and page structures for the bulk of its information. It scans your entire blog, looking for clues to who you are, what you do, and what you like to write about, and prioritizes post titles when drawing conclusions. Because titles are critically important, very long articles can cause your relevance to suffer; if you only have one title for every 5,000 words, you’re artificially throttling what Google takes into consideration. If you cut your articles down to 1,000 words, you’ll instantly quintuple the number of titles it scans.

    Furthermore, Google loves to see new content. It rewards sites that offer brand-new articles on a regular basis, and tends to decrease the rank of those with fewer updates—even if the total volume of content is high. Publishing one very long article a week instead of five shorter articles is a bad move that can make your site look inferior in Google’s eyes.

    Tired Users


    Google’s eyes aren’t the only ones that matter—you also have to keep your users in mind. The average user has a low attention span, and a high demand for fast, immediate information. Web users are accustomed to 140-character tweets and short, punchy news articles. They don’t have the time, patience, or desire to trudge through a massive article.

    Writing a very long article might seem necessary for longer, more demanding topics, but if nobody wants to read the full material, you’ll be doing yourself a disservice. If you’re having trouble finding a way to shorten your articles, consider breaking them up; instead of writing one massive article, split it into a five-part mini-series that your users can more easily digest. Or, simply hit the high points and make your title more general in turn. Users do want to see detail, but that doesn’t mean you have to go over the top with your explanations.

    Fewer Conversion Opportunities

    Longer articles also take away some of the conversion opportunities you’ll find in other, more concise articles. Generally, if you’re writing for conversions, the rule of thumb is to end with a lead-in to a conversion opportunity, or else have one major leverage point for conversion. If you have one possible conversion per article, you’ll have more conversion chances with five smaller articles than you will with one extremely long one.

    Additionally, because long articles tend to alienate readers, you’ll find that your conversion attempts will often go neglected in your longer features. That means your total number of conversions will suffer if you continuously churn out very long posts.

    Higher Cost to Engagement Ratio


It’s also worth mentioning that the cost of producing very long articles, assuming they have the same density of detail as your shorter articles, is much higher than that of their shorter counterparts. You might spend the same total effort on one 5,000-word post as on five 1,000-word posts, but the single post earns far fewer engagement opportunities, so its cost-to-engagement ratio is much higher.

Each post you publish is an opportunity. It’s a new link you can syndicate to your audience through social media, a new chance to attract referral traffic, and a new chance to get featured on an RSS feed or similar blog aggregator. Writing longer posts means writing fewer posts, so the total number of those engagement opportunities goes down. Since you’re spending the same amount of money for a much lower level of visibility, you’ll be getting less value for your investment.

    A Note on Minimum Length

    Many of these explanations indicate that multiple short articles are superior to fewer long articles. However, this should not imply that shorter is always better. Your articles need to be of sufficient length to interest and inform your readers, usually 500 words at least.

    It’s true that extremely long articles can be damaging to your customer engagement and SEO strategies, but don’t ever let word count become your top priority. Your first goal should be providing the type of content your users want to see and read, and that means keeping things as concise and detailed as possible, regardless of length. It’s also important to diversify the types of posts you publish—not just long articles and not just short articles, but articles of varying lengths and formats. As long as you’re working with your readers in mind, you won’t have to worry too much about how long your articles get.

  5. Best Practices for Goals in Google Analytics


    Google Analytics is the Swiss Army Knife of the online entrepreneur. It’s full of detailed insights and information you can use to analyze your online traffic and perfect your approach to earn the most new customers and the greatest amount of recurring revenue. But many business owners fail to use Google Analytics to its full potential, relying solely on inbound traffic figures and never venturing further into the platform.

    The “Goals” section of Analytics is one of the most useful tools you’ll find. If set up properly, you’ll be able to track conversions throughout your site, and run an analysis to determine the overall value of your campaign, giving you a perfect gateway to uncover the ROI of your inbound efforts.

    Initial Setup


    Setting up a goal is relatively easy. All you have to do is find the Admin section for your target site, click on “Goals,” and then “Create a Goal.” Google Analytics offers a step-by-step process that allows you to set up any goal you’d like.

    For most users, you’ll be setting up a template goal. Some of the common goals you can choose from include “destination” goals, which are completed when a user reaches a specific page, or “event” goals, which are completed when a user takes a specific action, like playing a video. Once you’ve selected a type, you’ll be able to customize your goals and fill in the necessary information—like the URL for your destination goal.

    Once you’ve got your initial goals set up, make sure to run a handful of tests to make sure they are functioning properly.

    Designating a Value


    You’ll also have the opportunity to designate a value to the completion of each of your goals. Take advantage of this; it’s going to provide you with a major opportunity to objectively analyze your online marketing results later on.

    For some goals, coming up with this value is easy. For example, if you’re selling an ebook for $5 and you set up a goal for the completion of a single order, the value of the goal would be $5. However, if you’re selling multiple items in varying groups, you’ll have to come up with the average value of a customer order and use that as the assigned value of a goal. The process is further complicated by non-monetary goals, such as those assigned to the completion of a contact form. Here, you’ll have to determine the ratio of inquiries to sales, and then the average sale to determine the average goal completion value.
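The contact-form case above works out to a simple calculation. Here is the arithmetic as a short sketch, with hypothetical figures: 1 in 10 inquiries becomes a sale, and the average sale is worth $500.

```python
# Hypothetical figures for a non-monetary "contact form" goal.
inquiries_to_sales_ratio = 0.10   # 1 in 10 inquiries converts to a sale
average_sale_value = 500.00       # average revenue per closed sale

# The value to assign to each completed contact-form goal.
goal_value = inquiries_to_sales_ratio * average_sale_value
print(f"Assign ${goal_value:.2f} as the value of each completed contact form")
# -> Assign $50.00 as the value of each completed contact form
```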

    This may take a few extra steps, but coming up with an accurate value is essential to determining the objective results of your campaign later down the road.

    Determining Which Goals to Set Up

If a specific action on your website corresponds to revenue, or the strong possibility of revenue, you should set it up as a goal. Only then will you be able to concisely and accurately project how much revenue your inbound marketing strategies are earning. Goals don’t take much time to set up, and once they’re configured correctly, you can run with them for as long as you need. Nobody has ever complained about having too much data available.

Still, if you have multiple transaction points and multiple points of contact, it may be overwhelming to try to set up a goal for each one. Start with the goals that are most critical to your business, and once those are complete, gradually flesh out the others.

    The Funnel

    Setting up a funnel is an optional part of the goal setup process, but I’ve found it extremely valuable for determining where your customers are coming from and why. With the funnel option, you’ll be able to outline the typical process your visitor goes through before completing a goal; for example, a customer may arrive at your homepage, travel to the blog section, and eventually land on the contact page, where they complete your “contact” goal.

    Setting up a funnel is advantageous because once you have some data flowing, you can easily visualize your customer’s path. Analytics will map out the ideal customer flow you outlined, and give you data for each step of the process. You’ll be able to see what percentage of your customers move on to each step, which will allow you to pinpoint any holdups to your ultimate goal.
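The step-by-step percentages Analytics reports can be sketched with a few lines of arithmetic. The funnel steps and visitor counts below are hypothetical.

```python
# A hypothetical funnel: (step name, number of visitors reaching that step).
funnel = [
    ("Homepage", 1000),
    ("Blog", 400),
    ("Contact page", 120),
    ("Goal: contact form submitted", 30),
]

# For each adjacent pair of steps, compute the continuation rate —
# the percentage of visitors who moved on to the next step.
for (step, count), (next_step, next_count) in zip(funnel, funnel[1:]):
    rate = next_count / count * 100
    print(f"{step} -> {next_step}: {rate:.0f}% continue")
```

A sharp drop between two steps (here, only 25% of contact-page visitors submit the form) pinpoints exactly where your holdup is.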

    Measuring Your ROI


The most important function of goals is getting the opportunity to objectively measure your return on investment (ROI). With goals in place for all your major transaction and conversion points, you can estimate exactly how much revenue your site has brought in over a given period of time. Determining how much you spent to get that level of traffic is usually the tricky part, since you’ll have incoming traffic from searches, referrals, direct entries, and social media. Still, if you can estimate how much you spend on marketing and compare it to how much you’re making through your goals, you’ll be able to determine the effectiveness of your current strategy.
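The ROI comparison itself is one line of arithmetic. The revenue and spend figures below are hypothetical placeholders for the numbers you would pull from your goal values and marketing budget.

```python
# Hypothetical figures for one reporting period.
goal_revenue = 12000.00    # summed value of completed goals in Analytics
marketing_spend = 4000.00  # estimated cost of acquiring that traffic

# Standard ROI: net return divided by cost.
roi = (goal_revenue - marketing_spend) / marketing_spend
print(f"ROI: {roi:.0%}")  # -> ROI: 200%
```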

    If you aren’t already using goals in Analytics, it’s a good idea to get started. Even if you don’t plan on running an analysis in the near future, you may find yourself wishing you had the data. The sooner you set up goals, the more information you’ll have access to, and the better you’ll be able to project the real results of your online marketing strategy.

  6. Are Google Updates a Thing of the Past?


    For more than a decade now, Google updates have been keeping search marketers on their toes. Every time you got used to one common SEO custom—such as the most effective way to build backlinks—Google seemed to respond by pushing a major algorithm change that altered how it took those factors into consideration. In the SEO community, industry professionals were constantly either looking for ways to take advantage of the most recent update or trying to anticipate what changes were coming with the next one.

Now, as we enter a new era of search, Google’s update patterns appear to have shifted. For the past several years, rather than introducing new algorithm changes, the search giant has only made tweaks to previously existing ones and minor changes to account for new technologies. Rank disruption is still occurring, but on a much smaller scale, leaving search marketers to wonder—are Google updates a thing of the past?

    The Major Overhauls


    Google updates have earned a reputation for being large, disruptive, and sometimes annoying pushes that can raise your site to the top of the SERPs or drop you off into online oblivion. That’s because most of Google’s major updates so far have been massive game changers, either completely overhauling Google’s search engine algorithm or adding some new set of qualifications that turned the ranking system on its head.

Take, for instance, the Panda update of 2011, which affected nearly 12 percent of all queries, massively disrupting the search game by introducing a new concept of content-based evaluation. Sites with high-quality content were rewarded, while sites with spammy content were penalized.

    It was a fair system, and searchers of the world were happy to start seeing more relevant results and fewer obvious attempts to climb ranks by whatever means necessary. But it was still a major headache for search marketers who had invested serious time and money into the previous iteration of Google’s search algorithm. For a time, updates like these were common, and search marketers were constantly on the run, waiting for more changes like the Penguin update, or Panda 2.0, which carried another massive update to Google’s content evaluation system.

    Modern Panda and Penguin


    Panda and Penguin, two of Google’s biggest landmark algorithm updates, have seen multiple iterations over the past five years. Panda 2.0 was followed by small iterations leading to 3.0, and Penguin 2.0 came out only a year after the initial round of Penguin. These algorithm changes were substantial, and search marketers attempted to predict the cycle based on existing patterns, projecting when the next major Panda- and Penguin-based algorithm changes would come.

    But something changed in 2014. Rather than unloading the Panda update in a major package, Google started rolling out data refreshes and minor tweaks to the algorithm on a steady basis. Instead of hitting the search world with a massive burst, it introduced a regular, unobtrusive pulse. Similarly, with the Penguin update, major iterations were virtually done away with. Marketers named an algorithm update “Penguin 3.0” in late 2014, but search volatility was limited compared to Penguin updates in the past.

    This, combined with the fact that Google hasn’t released a major overhaul to its search function since the Hummingbird update of 2013, seems to indicate that instead of rolling out massive, disruptive updates, Google is more interested in rolling out very small, gradual changes.

    Niche Algorithm Updates

    Other than extensions for its previous updates, Google has also released a handful of other changes. However, most of these are focused on niche functions—for example, the unofficially nicknamed “Pigeon update” of 2014 overhauled the way Google processes and displays local search results, taking local reviews from directory sites into account. Similarly, Google has been making changes to its Knowledge Graph and how it displays on SERPs.

    These niche updates don’t radically change Google’s core algorithm, nor do they interfere with any major updates of the past. They do have an impact on how search works and what strategies are the most rewarding, but they haven’t done anything to change the fundamental elements of a great SEO strategy.

    The Case for Micro-Updates

    There are a lot of reasons why Google would want to abandon large-scale updates in favor of smaller, less noticeable ones, and the evidence supports that transition:

    • Major updates have slowed to a stop. Instead of large batches of changes, Google is rolling out Penguin and Panda changes gradually and almost imperceptibly.
    • Google is no longer officially naming its updates. Penguin 3.0, Panda 4.1, and the Pigeon update are all unofficial nicknames—Google has stopped branding its updates, a sign that it's moving away from major, named releases.
    • Search volatility is decreasing. Since Panda’s 12 percent disruption, nothing has come close to that level of volatility.
    • Google is finally at a stable point. The search algorithm is now complex enough to evaluate the quality of sites and the intention behind user queries, leaving little reason to rapidly accelerate through growth spurts.

    Of course, it's possible that Google has a few more aces up its sleeve, but for now it looks as though major updates are dead, in favor of smaller, less momentous rollouts.

    What Search Marketers Can Learn

    There’s no reason to fear anymore. It’s likely that Google will no longer be pushing out the updates that have disrupted so many business rankings for so long. Instead, search marketers should understand that the fundamental best practices for SEO—improving user experience and building your authorityaren’t going to change anytime soon. The tweaks here and there might fine-tune some small details, but for the most part, the sites and brands that offer the best overall experience are going to be rewarded.

  7. What’s the Best Way to Optimize a Site for Mobile?

    Leave a Comment

    Mobile is taking over the search world. More people are using their mobile devices to perform searches, mobile searches are gaining popularity over desktop searches, and search engines like Google are stepping up their efforts to provide the best possible mobile experience to the greatest number of users. Mobile devices, like smartphones, tablets, and the upcoming plethora of smart watches, are starting to take over the realm of online experience, and if you want to survive in the business world, you’ll need to stay ahead of the curve.

    Why It’s Important to Have a Mobile Optimized Site

    In the early days of smartphones, there were critics who claimed that smartphones were just a fad, or that people wouldn’t rely on them to perform searches due to small screen sizes and difficult interfaces. But the trends of the past several years have proven the naysayers wrong: smartphones are here to stay, and creating a mobile experience that suits those mobile devices is imperative:

    • First and most importantly, your site exists to give your users a high quality experience. You want your users to find what they’re looking for on your site easily, and with a design that’s easy on the eyes. If your site isn’t optimized for mobile, you’ll be compromising that experience for a significant portion of your user base, leading to poorer consumer-brand relations and fewer opportunities to attract or convert new leads.
    • Second, your site's compatibility with mobile devices is measured and interpreted by search engines. Google can tell which sites are optimized for mobile and which are not, and for mobile searches, it's highly unlikely to rank a non-optimized site on the first page. This is because Google wants mobile users to have the best possible experience, so if you aren't optimized for mobile, you'll be missing out on all that search traffic. Plus, mobile-optimized sites get a ranking boost even for desktop searches, so you really don't want to miss the opportunity.
    • Finally, the competitive factor is critical. Mobile optimization is a new standard in web practices, and countless businesses have already taken steps to ensure their sites comply. If you have a direct competitor whose site is mobile optimized while yours remains non-optimized, you could immediately lose a ton of recurring customers who prefer to browse the web on mobile devices.

    Three Options to Optimize for Mobile

    Optimizing for mobile isn’t complicated, but it isn’t as simple as flipping a switch. There’s no single patch of code or button you can push to magically alter your site to be compatible with mobile devices. However, you do have several options.

    Responsive Websites

    Responsive websites are optimized for mobile at a design level. They are created in such a way that allows the components of the page—such as the banners, blocks of text, headlines, and so on—to organize themselves on the page based on the size of the screen that’s accessing the webpage. These components may flex or stack to accommodate a smaller screen size, so a desktop user and a mobile user would both be able to easily navigate the site (even though the layout might be different).

    There are a number of advantages to responsive websites. Since the design is flexible enough to adjust to any screen, every type of mobile device gets a customized experience, yet the "responsive" element only needs to be built once. There is only one URL for your website, which makes it easy to develop and easier to manage over time, and it's relatively simple to implement. Loading times for responsive sites tend to be slightly slower than for the other options, but that's generally a small price to pay for a universally adaptable website.
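To illustrate, a responsive layout can be as simple as a viewport declaration plus a CSS media query that restacks the page's components. This is a minimal sketch—the class names and the 600-pixel breakpoint are purely illustrative:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  /* On wide screens, two columns sit side by side. */
  .column { float: left; width: 50%; }

  /* Below 600px, the columns stack vertically for small screens. */
  @media (max-width: 600px) {
    .column { float: none; width: 100%; }
  }
</style>
```

The same HTML serves every device; only the stylesheet decides how the components arrange themselves.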

    Mobile URLs

    Mobile URLs are exactly what they sound like—they're separate, customized URLs that exist for the mobile version of a webpage. For example, if your traditional website is www.example.com, your mobile version could live at www.mobile.example.com. Whenever a user accesses your site from a mobile device, you can automatically redirect them to the mobile version of your site (and provide a link to toggle between the versions, just in case a user wants to switch).

    Mobile URLs are starting to become antiquated, but they're still useful for some businesses. They take more time to create than a responsive design, since the mobile site must be built independently, and they require more extensive ongoing upkeep. They're also vulnerable to fault points in the redirect system—if you accidentally direct a mobile user to the desktop version, they may have a poor experience.
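If you go the mobile URL route, Google recommends annotating the relationship between the two versions so the search engine treats them as one page rather than duplicate content. A sketch, using illustrative addresses (your actual URLs will differ):

```html
<!-- On the desktop page (www.example.com/page): -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://www.mobile.example.com/page">

<!-- On the mobile page (www.mobile.example.com/page): -->
<link rel="canonical" href="http://www.example.com/page">
```

The alternate/canonical pair tells crawlers the two URLs are the same content in two formats, so ranking signals consolidate on one page.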

    Dynamic Content

    The third option for mobile optimization is closer in theory to responsive design. Like with a responsive design, dynamic content structures require a single URL to house both a mobile version and a desktop version. The difference is, in a dynamic content setting, you’ll have twin versions of your site—the desktop and mobile versions—ready to display based on the type of device and screen size trying to access them.

    This is an improvement over mobile URLs, since you’ll only need to manage one URL, and you won’t have to worry about creating and sustaining a redirect. However, there are some flaws that may prevent you from achieving the best results. Creating one mobile version can be problematic, since there are hundreds of different mobile devices that could theoretically access your site.
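One way to sketch the dynamic approach: inspect the incoming User-Agent header, pick a template, and include a Vary: User-Agent response header so caches and crawlers know the response differs by device. The keyword list below is a deliberate simplification—real device detection libraries are far more thorough:

```python
# Sketch of dynamic serving: one URL, two templates, chosen by User-Agent.
# The keyword list is illustrative, not a complete device-detection scheme.
MOBILE_KEYWORDS = ("iphone", "ipod", "android", "mobile", "windows phone")

def is_mobile(user_agent):
    ua = user_agent.lower()
    return any(keyword in ua for keyword in MOBILE_KEYWORDS)

def serve_page(user_agent):
    template = "mobile.html" if is_mobile(user_agent) else "desktop.html"
    # The Vary header tells caches (and crawlers) that the response
    # differs by User-Agent, so both versions get discovered.
    headers = {"Vary": "User-Agent", "Content-Type": "text/html"}
    return template, headers
```

The Vary header is the critical detail: without it, an intermediate cache might serve the desktop version to mobile users, recreating the redirect-fault problem that mobile URLs suffer from.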

    The Best Option

    Google doesn’t care how you optimize your site for mobile, as long as it is optimized in some way. Whether you choose responsive, mobile URLs, or dynamic content, Google will consider your site optimized for mobile, and you’ll rank accordingly. Your users likely won’t care what type of mobile-optimization strategy you use either, as long as you’re giving them the best possible experience.

    That being said, your decision should come down to your own preferences. From a technical standpoint, responsive designs are generally the cleanest; they only require one redesign to be complete, and the ongoing maintenance is virtually nonexistent, at least compared to dynamic content or mobile URL strategies. Plus, you'll eliminate the risk of misjudging the type of device accessing your site.

    Improving Your Rank in Mobile Search Results

    After optimizing your design and structure for mobile, there are a handful of ongoing strategies you can use to boost your rank in mobile searches, even beyond the strategies of a traditional SEO campaign:

    • Decrease your page loading times. Mobile devices load pages more slowly than desktop machines. Make sure your mobile design is optimized for lightning-fast loading times.
    • Keep plenty of content on your pages. Since mobile users need things quick, it may be tempting to reduce your on-page content, but keep as many words on the page as appropriate to maximize the amount of content Google can crawl.
    • Avoid the temptation to use pop-ups. Pop-up ads are seeing a resurgence, especially for companies trying to push their mobile application specifically to mobile users. Doing so can devalue user experience, increase page loading times, and decrease your domain authority in Google’s eyes.

    Aside from ongoing SEO updates and minor tweaks to the design and functionality of your site, mobile optimization really is a one-time process. With your one-time investment, you’ll instantly gain more favor with your user base, gain more visibility in search engines, and get an edge over your competition. If you haven’t yet optimized your site for mobile, now’s the time to get it done.

  8. How the Google Knowledge Graph Is Growing to Change Search As We Know It

    Leave a Comment

    The search world is always on the move, but it wasn’t until the introduction of the Knowledge Graph a couple of years ago that search marketers really started questioning whether conventional SEO practices were feasible in the long-term.

    The Knowledge Graph has been the subject of a slow evolutionary process, gradually incorporating more elements and covering more ground, and now it has reached a threshold that’s made it the center of attention for many search experts. It’s changing the way people use search, slowly but surely, and if your search campaign is going to survive its rise to prominence, you’ll have to be ready for it.

    What Is the Google Knowledge Graph?

    The Knowledge Graph is a small box of information you see off to the right-hand side of your traditional search results. Depending on what you’ve searched for, the Knowledge Graph will provide a summary of your subject, including bits of information you may be looking for. For example, searching for a movie will often prompt the Knowledge Graph to show up with a list of actors, the year of release, and any awards associated with the picture. Searching for a politician will prompt the Graph to display his/her picture and a brief rundown of his/her political history. It’s essentially Google’s way of providing immediate information to a user, rather than forcing them to wade through sites to find what they’re looking for.

    The Changing Function of Online Search

    This move shows Google’s dedication to providing the best possible experience for online users, which is a good thing. However, since many online users won’t venture to the top search results after finding their desired information within the Knowledge Graph, previous top-position holders may not see the same amount of traffic they did before the Knowledge Graph existed.

    The entire motivation behind the Knowledge Graph’s release is an indication of the future role of online search. Rather than being a tool to find online sites, it’s becoming a tool to find direct information, and as a result, the scope of SEO and online business marketing is bound to change.

    The Latest Developments

    As you might imagine, since it is a Google product, the Knowledge Graph is not some stagnant, one-time development. It is a living, growing mechanism that continues to become more advanced on an almost daily basis. Even in the short history of 2015, the Knowledge Graph has been subject to updates and advancements.

    2015 Oscar Nominations

    In an unexpected move, Google began showing information on the 2015 Oscar nominations in the Knowledge Graph shortly after they were announced. Any search for "Oscar nominee" or "Oscar nominations" will lead to a list of the eight films nominated for the award for Best Picture. In addition, Google is offering detailed information about the Academy Awards in general, as well as the ceremony date for 2015. It's a sign of Google's commitment to providing quick-reference information accurately, but also in a timely manner.

    Social Profiles

    Back in November of 2014, Google stepped out of its Google+ shell and started openly providing links to other social media profiles in its Knowledge Graph box. However, these links were restricted for use by major personalities, such as politicians, actors, and musicians.

    Starting in January of 2015, Google is providing links to social profiles of major brands. There’s even a specific markup Google released so you can accurately provide the details to your corporate social profiles to the search engine.

    Increasing Frequency

    Partly due to an increasing breadth of topics covered by the Knowledge Graph and partly due to an increasing number of companies using proper markup formats on their sites, the Knowledge Graph is showing up for an ever-increasing number of queries. According to a recent post by Steven Levy, the Google team estimates that the Knowledge Graph now appears for 25 percent of all queries. One out of every four searches now leads to a Knowledge Graph box, and that number is likely to grow.

    How to Prepare for the Knowledge Graph’s Expansion

    Already, the Knowledge Graph is making waves in the search world. But as most search marketers have learned the painful way, the best way to respond to a new search function is to proactively prepare for it, rather than reacting to it after the fact. As the Knowledge Graph begins to grow in influence, take measures to protect your SEO strategy.

    Avoid Writing General Information

    This is good advice for any content marketing campaign, regardless of the encroaching Knowledge Graph. Rather than writing general information articles about topics related to your industry, focus on writing in a very specific niche—the more specific the better.

    This is going to hold several benefits for your campaign. First, and most relevantly, writing niche topics will prevent the Knowledge Graph from stepping into your territory. For now, the Knowledge Graph only projects common information about the most general subjects, so the more specific the topics you cover, the less likely it is that the Knowledge Graph will show up for your target queries. Second, the more specific you get with your topics, the less competition you’re going to face. That means you’re going to rank much higher for slightly lower-traffic keywords. It’s a shortcut to greater search traffic.

    Use Schema.org Markups

    Google is open about the fact that the Knowledge Graph relies on microformatting to draw in information, so if you want to make sure the Knowledge Graph has the most accurate and most complete information about your company and everything you offer, use every markup you can. Schema.org is a great (and free) resource you can use to mark up the information on your website, and it also provides detailed information on how to incorporate them onto your site.

    As Google starts rolling out expanded coverage of the Knowledge Graph, like it recently did with social profiles, be ready to grab new microformatting requirements and implement them as needed.
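As a concrete example, the social-profile markup mentioned above is typically expressed as a structured-data block in your page's HTML. This sketch uses JSON-LD formatting; the company name and profile URLs are placeholders you would replace with your own:

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "http://www.example.com",
  "sameAs": [
    "https://www.facebook.com/examplecompany",
    "https://twitter.com/examplecompany"
  ]
}
</script>
```

The `sameAs` array is what tells Google which social profiles officially belong to your brand, so they can appear alongside your Knowledge Graph entry.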

    Find Alternative Means of Improving Online Visibility

    It’s unlikely that the Knowledge Graph is going to end SEO as we know it—even in years to come, when the function has expanded in accuracy and coverage, a number of people will still rely on Google to find actual sites with the information they seek. Even so, it’s important to hedge your bets.

    Gain online visibility through non-search related channels, such as RSS feeds and social media. Get involved with other sites, exchanging guest posts and starting threads and discussions leading back to your site. You’ll also want to get involved with as many third-party apps and services as possible, such as local review sites and new applications for your industry (such as Open Table reservations for restaurants). As smartphones and smart watches become more mainstream, app-based discovery will come to rival traditional searches, and getting involved with those apps early on will keep you ahead of the trend.

    The Knowledge Graph is a game-changer, and while it’s improving the search experience for millions of users, it’s also complicating the lives of search marketers. Still, suggesting that the Knowledge Graph is going to “kill” SEO or otherwise destroy the foundation of your inbound strategy is an overstatement. Like with any development, you’ll need to work to understand it, cover your bases wherever you can, and proactively prepare for its next iterations.

  9. How Local Search and Mobile Search Are Becoming One

    Leave a Comment

    There was once a time when searches were all practically the same. They were performed in the same way, on a desktop, and two identical queries from two very different people would still yield identical results. Obviously, this isn’t the case anymore, but the degrees of personalization added to search have been relatively well-defined. Mobile searches generate slightly different responses than desktop searches, when a device can detect the location of a user, local-specific searches become possible even without local keywords, and when a user is logged into Google, his/her search history can have an influence on results.

    Now, as we move into 2015 with new technologies, more advanced search algorithms, and a more demanding audience, these lines are beginning to blur, and customized searches are starting to become normalized and ubiquitous.

    The Tenets of Mobile Search

    Mobile search wasn’t always taken seriously, but the rising trend of searches being performed on mobile devices have prompted Google and popular webmasters everywhere to take action. Google offers a more concise search results page, lending an easier search experience for its mobile user base, but more importantly, it has implemented a reward system for sites optimized for a mobile experience. Essentially, when a search is performed on a mobile device, Google increases the rank of sites that are optimized for mobile—either in a responsive format or in its mobile-specific form.

    The idea makes sense. Mobile users demand a mobile experience—otherwise, it’s hard to read, see the full content of the page, and even click buttons. Rewarding the sites with mobile features means that more mobile users will get an ideal experience.

    How Local Search Is Growing

    Local searches once relied on local-specific keywords. For example, in order to find a veterinarian in Raleigh, you would need to search specifically for “veterinarian in Raleigh” or something similar. Today, as long as your device can detect your location, the local portion of the search is applied automatically, and a search of “veterinarian” is sufficient to generate local-specific veterinary results. It has made things dramatically easier for local searchers looking for restaurants, hotels, or other services.

    Last year, Google also unveiled the unofficially nicknamed “Pigeon” Update, a major algorithm overhaul made to improve local search results. Now, the presence and reputation of businesses in local directories and local rating sites like Yelp and TripAdvisor are factored into search rankings. Essentially, this means that better-reviewed local businesses are more likely to rank high for a local search. More people are making local searches due to the expanding presence of local businesses online, and users are far more satisfied with the results.

    The Overlap Between Local and Mobile

    Two major factors are responsible for the merging of local and mobile searches. First, mobile devices commonly have location services enabled, which inform the device of the user's location. This information enables local searches to happen automatically, whereas desktop browsers rarely enable this option by default and not everybody logs into an account before searching. The bottom line is that searches performed on mobile devices are more likely to return local results by default.

    Second, the number of local searches performed on mobile is staggering, partially due to a rising trend of mobile searches and the nature of mobile devices. According to a recent survey, 56 percent of all mobile searches have local intent behind them. People are more willing to use their mobile devices when searching for something local, in large part because many local searches are needed when en route or otherwise away from home.

    As a result, the majority of local searches are now being performed on mobile devices, and the majority of searches performed on mobile devices have some kind of local intent. Those two pieces of data, both trends still growing, are reason enough to declare that the worlds of mobile and local search are starting to merge into one.

    What This Means for Search Marketers

    If you’re currently wondering how to juggle all the facets of a successful optimization strategy, take a deep breath. Despite the increasing complexity and sophistication of Google’s ranking algorithm and emerging technologies, SEO has actually gotten simpler over the years. The combination of local and mobile search is just another step in that simplification process, and as a search marketer, you stand to benefit.

    First, you’re going to need to make sure your site is optimized for mobile. By now, that should be a given, and regardless of whether or not mobile and local searches are interrelated, it’s going to continue being critically important. If your site isn’t optimized for mobile, you aren’t going to show up in mobile searches, so optimize your site if you haven’t already—fortunately, this is a one-time process.

    Next, your local SEO efforts need to be increased. Even if you have multiple locations, or if you don’t consider yourself tied to your specific location, it’s important to engage in a local SEO strategy. The potential rewards are enormous and increasing, with less competition and higher search volume than ever before. Claim your profiles on any relevant local directories that you can, publish local-specific content whenever you can, and actively cultivate positive reviews online when possible. Increasing your local relevance will greatly increase your local web traffic, and will set you up nicely for search developments down the road.

    That’s it. For the time being, the phenomenon of increased mobile and local searches isn’t going to have much of an impact on your strategy other than an increased need for ongoing, local-specific updates. However, as we look to the future, this trend could evolve, and new means of searching could completely revolutionize the industry.

    The Next Age: Wearable Technology

    Wearable technology is on the horizon. Augmented reality with Google Glass and seamless interfaces with the Apple Watch are going to spark a new trend in compact, flexible, hardly noticeable technology. That’s going to make waves in the world of mobile search, as hands-free, on-the-go searches become a critical space for online visibility and integration from the real world to the search world is going to define your success or failure.

    It’s hard to say what specific changes are in the pipeline, especially since Google keeps its algorithm secrets under lock and key, but it’s reasonable to speculate a handful of potential developments. Local searches will start to become hyper-local, focused on detail down to a city block rather than a broader city or region. Mobile searchers will need results instantly, with less of a need for content and more of a need for immediate answers like directions, hours, and ratings. And local businesses who reward mobile searches, perhaps with integrated onsite functionality, will earn the greater share of mobile search traffic—and potentially foot traffic as a result.

    The separate evolutions of mobile and local searches are starting to align, especially as the next wave of mobile technology begins to enter the scene. Stay ahead of the curve by maximizing your local search visibility, and taking advantage of the latest mobile devices when they start to emerge in the marketplace. As always, the bottom line is user experience, so give your users what they want and what they expect, and you’re going to be rewarded as a result.

  10. How an Outdated Sitemap Can Seriously Throttle Your Rankings

    1 Comment

    Sitemaps are a critical and often overlooked element of your site structure, and they play a crucial role for search engines looking to index information about your site. If your sitemap falls out of date and you don’t take measures to correct it, you could pay the price with reduced search visibility and therefore, less traffic to your site.

    Why Sitemaps Are Beneficial

    An onsite sitemap is a page on your main site that lays out all your existing pages in a structured list. It's crawled by Google and informs the search engine about the rest of your onsite pages.

    XML Sitemaps are files that you can build and submit to Google directly. Essentially, these files are a condensed map that lays out the structure and hierarchy of your site. By submitting a sitemap to Google, you’re telling the search engine to review and index the pages of your site—and this is a critical first step when you’re launching something new.

    Sitemaps are basically instructions that allow search engines to find your pages faster and more accurately. Keeping them updated ensures that Google has the best understanding of your overall website, and the greatest number of your pages are showing up for the appropriate searches.
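For reference, an XML sitemap follows the sitemaps.org protocol; only the `<loc>` element is required per URL. The address, date, and frequency values below are illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2015-02-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Each additional page on your site gets its own `<url>` entry, and the optional `<lastmod>`, `<changefreq>`, and `<priority>` hints help crawlers decide how often to revisit.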

    The Dangers of an Outdated Sitemap

    If your sitemap isn’t up-to-date, you could be providing inaccurate data to search engines. Depending on the severity of your inaccuracies, this could have major consequences or minimal impact. For example, if one of your hundred product pages drops off, you won’t see much of an impact. However, if you’ve restructured your entire navigation, search engines could be confused when they attempt to crawl your site, and you may lose indexed pages as a result. In addition to having a smaller number of indexed, searchable pages, your domain authority could even take a hit.

    The bottom line here is that an outdated sitemap will send outdated information to Google—and while Google, in some cases, is smart enough to make sense of these discrepancies on its own, the safer play is to ensure your sitemaps are always up-to-date.

    How Your Sitemap Can Become Outdated

    Sitemaps don’t become obsolete on their own. Only through a deliberate change in your site, usually an increase or decrease in the number of pages, can make your previously submitted sitemap outdated. Keep a close eye on the changes you make to your site, and if you do make a significant change, take efforts to keep your sitemap updated accordingly.

    Adding and Removing Pages

    By far the most common reason for a sitemap becoming outdated is the addition or removal of a core page. Even traditional, static websites experience the need for change from time to time—whether that’s the addition of a new service page or the removal of a special offers page that’s no longer relevant. While some regularly updated sections of your website (such as a blog or press page) will be routinely scanned by Google, any major page changes will need to be reflected in an updated sitemap.

    Redesigning the Site or Navigation

    Restructuring your site will also require an update to your sitemap. In addition to simply listing out the pages of your site, the sitemap is responsible for showcasing the hierarchy of your web presence, outlining the most important pages in a very specific order. If you make major changes to your navigation or restructure your page-based priorities, you’ll need to update your sitemap.

    Adding or Removing Products or Listings

    E-commerce sites and sites with classified-style postings (like job opportunities) tend to be the most vulnerable to sitemaps falling into obsolescence. Since most of these sites have large volumes of products and listings, sometimes numbering in the thousands, it’s common for new postings to be added and old postings to be taken down. Fortunately, a dynamic sitemap can spare you the pain of manually updating a sitemap every time you make a minor change, but you will have to routinely check to ensure your sitemap is accurate and up-to-date.
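A dynamic sitemap can be as simple as regenerating the XML from your current product list whenever it changes, instead of editing a static file by hand. A minimal sketch in Python—the domain and product paths are placeholders:

```python
# Sketch of a dynamic sitemap generator: rebuilds the sitemap XML from
# the current list of product paths, so additions and removals are
# reflected automatically on the next rebuild.
from xml.sax.saxutils import escape

def build_sitemap(base_url, paths):
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for path in paths:
        # escape() guards against characters like & in URLs breaking the XML.
        lines.append("  <url><loc>%s</loc></url>" % escape(base_url + path))
    lines.append("</urlset>")
    return "\n".join(lines)
```

In practice, you'd feed this from your product database and serve the result at your sitemap URL, so a delisted product disappears from the sitemap as soon as it disappears from the catalog.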

    Determining Whether Your Sitemap Is Outdated

    The easiest way to test whether your sitemap is current or outdated is to check it using Google Webmaster Tools. If you haven’t yet uploaded a sitemap here, you can start from scratch. If you need help creating your sitemap from scratch, be sure to read up on Google’s guidelines for building a sitemap.

    Once submitted, you might encounter errors during the upload process:

    • If you see a Compression Error, Empty Sitemap, HTTP Error (specific code), Incorrect Namespace, or Incorrect Sitemap Index Format, there is likely a problem with the format of the sitemap you submitted. These problems are generally easily fixed, and do not necessarily indicate a problem with the links and structure included in your map.
    • If you see Invalid or Missing errors, a Parsing Error, or a Path mismatch, it generally means there is a formatting error in the body of your sitemap that needs to be corrected.

    And once the sitemap is accepted, you may find errors with your sitemap. Perform a test by clicking on your intended sitemap, and clicking Test Sitemap in the top right corner. From there, you’ll be able to Open Test Results and view the results of the test.

    The test results show what type of content was submitted, in a data table that includes the number of web pages and videos in the sitemap. Any errors Google encountered that prevented it from indexing a submitted page will also be displayed. Some errors arise when no page exists where the sitemap says one should; others stem from outside factors, such as server problems or a robots.txt file blocking Google’s crawlers from reaching the page.

    If you notice any of these errors preventing your sitemap from being accurate or up-to-date, take a closer look at the breakdown in Google Webmaster Tools, make a list of any corrections you need to make, and start making them.
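    One class of error—sitemap entries pointing at pages that no longer exist—can be caught before Google flags it. As a rough sketch (not Webmaster Tools itself, just a local check you could run against your own sitemap file), the idea is to parse out every listed URL and request each one:

```python
import urllib.error
import urllib.request
from xml.etree import ElementTree as ET

SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_xml):
    """Extract every <loc> entry from a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text for loc in root.findall("sm:url/sm:loc", SITEMAP_NS)]

def broken_urls(sitemap_xml, timeout=10):
    """Return the sitemap URLs that fail to respond with HTTP 200."""
    bad = []
    for url in sitemap_urls(sitemap_xml):
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                if resp.status != 200:
                    bad.append(url)
        except (urllib.error.URLError, OSError):
            bad.append(url)
    return bad
```

    Running a check like this before resubmitting saves a round trip through Google’s own error reporting.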

    Submitting a New Sitemap

    Once your new sitemap is ready, head to the Webmaster Tools homepage and select the site you wish to submit the sitemap for. Under the Crawl header, click on Sitemaps, select the sitemap you wish to resubmit, and click the Resubmit Sitemap button. Once resubmitted successfully, you’ll be able to re-run the test you used to find the errors in the first place. Hopefully, all of those errors have been corrected in your revision. If not, you’ll have another opportunity to make corrections and resubmit.

    If you’re using static XML sitemaps and you run an e-commerce site or another type of site where pages come and go regularly, you’re in for a lot of work. With a static XML sitemap, you’ll have to manually edit and resubmit the file every time something changes. Instead, you can build a dynamic sitemap and set up automated “pings” to notify the search engines whenever there is a major change.
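    A “ping” is just an HTTP GET request against an endpoint the search engine exposes, with your sitemap’s location passed as a query parameter. A minimal sketch (the example.com sitemap location is a hypothetical placeholder; the Google endpoint shown is the one Google has historically documented for sitemap pings):

```python
import urllib.parse

# Hypothetical sitemap location; substitute your own.
SITEMAP_URL = "https://example.com/sitemap.xml"

# Ping endpoint Google has historically exposed for sitemap notifications.
GOOGLE_PING = "https://www.google.com/ping"

def ping_url(engine_endpoint, sitemap_url):
    """Build the GET request URL that notifies a search engine of a sitemap change."""
    return engine_endpoint + "?sitemap=" + urllib.parse.quote(sitemap_url, safe="")

# Issuing the actual notification would then look like:
#   urllib.request.urlopen(ping_url(GOOGLE_PING, SITEMAP_URL))
```

    Hooking a call like this into whatever process adds or removes listings means the search engines hear about changes without any manual resubmission.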

    No matter how you look at it, sitemaps are an important piece of the puzzle when it comes to making sure Google has the right information about your website. Take measures to ensure your sitemap is up-to-date at all times, and you’ll be rewarded with more indexed pages, and more search traffic as a result.
