Category Archive: Google

  1. How an Outdated Sitemap Can Seriously Throttle Your Rankings

    Sitemaps are a critical but often overlooked element of your site structure: they tell search engines how to find and index the pages of your site. If your sitemap falls out of date and you don’t take measures to correct it, you could pay the price with reduced search visibility and, in turn, less traffic to your site.

    Why Sitemaps Are Beneficial

    An onsite sitemap is a page on your main site that contains a structured layout of all your existing pages. Google crawls it to discover the rest of your onsite pages.

    XML Sitemaps are files that you can build and submit to Google directly. Essentially, these files are a condensed map that lays out the structure and hierarchy of your site. By submitting a sitemap to Google, you’re telling the search engine to review and index the pages of your site—and this is a critical first step when you’re launching something new.
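    To make that concrete, here is what a bare-bones XML sitemap might look like (the URLs and date are placeholders; only the `loc` element is required, while `lastmod` is optional):

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2014-12-01</lastmod>
      </url>
      <url>
        <loc>http://www.example.com/services</loc>
      </url>
    </urlset>
    ```

    Each `url` entry points Google at one page, and the file as a whole is what you submit through Webmaster Tools.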

    Sitemaps are essentially instructions that allow search engines to find your pages faster and more accurately. Keeping them updated ensures that Google has the best possible understanding of your website, and that as many of your pages as possible show up for the appropriate searches.

    The Dangers of an Outdated Sitemap

    If your sitemap isn’t up-to-date, you could be providing inaccurate data to search engines. Depending on the severity of your inaccuracies, this could have major consequences or minimal impact. For example, if one of your hundred product pages drops off, you won’t see much of an impact. However, if you’ve restructured your entire navigation, search engines could be confused when they attempt to crawl your site, and you may lose indexed pages as a result. In addition to having a smaller number of indexed, searchable pages, your domain authority could even take a hit.

    The bottom line here is that an outdated sitemap will send outdated information to Google—and while Google, in some cases, is smart enough to make sense of these discrepancies on its own, the safer play is to ensure your sitemaps are always up-to-date.

    How Your Sitemap Can Become Outdated

    Sitemaps don’t become obsolete on their own. Only a deliberate change to your site, usually an increase or decrease in the number of pages, can make your previously submitted sitemap outdated. Keep a close eye on the changes you make to your site, and if you do make a significant change, update your sitemap accordingly.

    Adding and Removing Pages

    By far the most common reason for a sitemap becoming outdated is the addition or removal of a core page. Even traditional, static websites experience the need for change from time to time—whether that’s the addition of a new service page or the removal of a special offers page that’s no longer relevant. While some regularly updated sections of your website (such as a blog or press page) will be routinely scanned by Google, any major page changes will need to be reflected in an updated sitemap.

    Redesigning the Site or Navigation

    Restructuring your site will also require an update to your sitemap. In addition to simply listing out the pages of your site, the sitemap is responsible for showcasing the hierarchy of your web presence, outlining the most important pages in a very specific order. If you make major changes to your navigation or restructure your page-based priorities, you’ll need to update your sitemap.

    Adding or Removing Products or Listings

    E-commerce sites and sites with classified-style postings (like job opportunities) tend to be the most vulnerable to sitemaps falling into obsolescence. Since most of these sites have large volumes of products and listings, sometimes numbering in the thousands, it’s common for new postings to be added and old postings to be taken down. Fortunately, a dynamic sitemap can spare you the pain of manually updating a sitemap every time you make a minor change, but you will have to routinely check to ensure your sitemap is accurate and up-to-date.

    Determining Whether Your Sitemap Is Outdated

    The easiest way to test whether your sitemap is current is to check it in Google Webmaster Tools. If you haven’t yet uploaded a sitemap there, you can start from scratch; if you need help creating one, be sure to read up on Google’s guidelines for building a sitemap.

    When you submit your sitemap, you may encounter errors during the upload process:

    • If you see a Compression Error, Empty Sitemap, HTTP Error (specific code), Incorrect Namespace, or Incorrect Sitemap Index Format, there is likely a problem with the format of the sitemap you submitted. These problems are generally easily fixed, and do not necessarily indicate a problem with the links and structure included in your map.
    • If you see Invalid or Missing errors, a Parsing Error, or a Path mismatch, it generally means there is a formatting error in the body of your sitemap that needs to be corrected.

    Even once the sitemap is accepted, it may still contain errors. To run a test, click on the sitemap in question, then click Test Sitemap in the top right corner. From there, click Open Test Results to view the outcome.

    The test results show what type of content was submitted, in a data table that includes the number of web pages and videos in the sitemap. Any errors that prevented Google from indexing a submitted page will be displayed. Some errors arise when a page the sitemap lists isn’t actually there. Others stem from outside factors, such as server problems, or a robots.txt file blocking Google’s crawlers from reaching the page.
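    As an illustration of the robots.txt case (the path here is a placeholder), a rule like the following blocks every crawler from the /old-products/ directory, so any URL under that path still listed in your sitemap would come back as an error in the test results:

    ```
    User-agent: *
    Disallow: /old-products/
    ```

    Either remove the rule or remove the blocked URLs from your sitemap so the two files agree.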

    If you notice any of these errors preventing your sitemap from being accurate or up-to-date, take a closer look at the breakdown in Google Webmaster Tools, make a list of any corrections you need to make, and start making them.

    Submitting a New Sitemap

    Once your new sitemap is ready, head to the Webmaster Tools homepage and select the site you wish to submit it for. Under the Crawl header, click on Sitemaps, select the sitemap you wish to resubmit, and click the Resubmit Sitemap button. Once it’s resubmitted successfully, you can re-run the test you used to find the errors in the first place. Hopefully, all of those errors have been corrected in your revision; if not, you’ll have another opportunity to make corrections and resubmit.

    If you’re using static XML sitemaps and you run an e-commerce site or another type of site where pages come and go regularly, you’re in for a lot of work. With a static XML sitemap, you’ll have to manually edit and resubmit your file with every change. Instead, you can build a dynamic sitemap and set up automated “pings” to notify the search engine whenever there is a major change.
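    As a rough sketch of how a dynamic setup can work, the script below regenerates a minimal sitemap from whatever list of live URLs your site currently has, and builds the “ping” URL that tells Google the sitemap has changed. The domain and URLs are hypothetical placeholders, not a prescribed implementation:

    ```python
    from urllib.parse import urlencode

    def build_sitemap(urls):
        """Render a minimal XML sitemap from the list of currently live URLs."""
        entries = "\n".join(
            "  <url><loc>{}</loc></url>".format(u) for u in urls
        )
        return (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            + entries + "\n</urlset>"
        )

    def ping_url(sitemap_url):
        """Build the URL that 'pings' Google to say the sitemap has changed."""
        return "http://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

    # Regenerate the sitemap whenever pages are added or removed, then ping.
    live_pages = [
        "http://www.example.com/",
        "http://www.example.com/products/widget-a",
    ]
    xml = build_sitemap(live_pages)
    # An HTTP GET to this URL asks Google to re-fetch the sitemap:
    print(ping_url("http://www.example.com/sitemap.xml"))
    ```

    In practice, you would write `xml` out to your sitemap file and issue an HTTP GET to the ping URL after each batch of page changes, rather than editing the file by hand.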

    No matter how you look at it, sitemaps are an important piece of the puzzle when it comes to making sure Google has the right information about your website. Take measures to ensure your sitemap is up-to-date at all times, and you’ll be rewarded with more indexed pages, and more search traffic as a result.

  2. When and How to Use Google’s Disavowal Tool

    While Google’s penalty-based algorithm updates sometimes make it seem like the search engine giant is actively trying to disrupt the ranking efforts of webmasters, the truth is Google cares about the web, and wants to do everything it can to help webmasters succeed in their own endeavors. Google Webmaster Tools, if you couldn’t tell by the name, was created for that very purpose. It contains a number of research- and analysis-based tools to improve your visibility to Google robots, improve the functionality of your site, and ultimately provide a better experience to your web visitors.

    One of these tools is the Disavow Links tool, which exists to help webmasters remove harmful or unwanted backlinks that may be interfering with their campaigns. Understanding why this tool exists, as well as when and how to use it, can be highly beneficial for your SEO campaign.

    Why Would I Need to Disavow Links?

    Quality external links are a good thing. The more high-quality links you have on external sites pointing back to your domain, the more authoritative your site will be according to Google, and the higher you’ll rank for relevant keywords. However, if you have low-quality links, Google will work against you by making it more difficult to rank for relevant keywords. Just one bad backlink could easily set you back weeks or months in your ranking efforts.

    Types of Links That Can Harm Your Site

    Google wants to clean the web of spam and irrelevant information, so the best links you can build are the ones that fit naturally into a forum, site, or conversation. If your link appears unnatural, or if it appears like it was built for the sole purpose of trying to boost your rank, it may be considered a bad backlink, and can actively work against you.

    Some of the most common types of bad backlinks include:

    • Links on low-quality sources, such as article directories, link scheming sites, or other hosts for spam and throwaway content.
    • Repetitive backlinks, such as multiple instances of the same URL being used across the web with no variety.
    • Backlinks anchored heavily in keywords, or other attempts to optimize links for keywords unnecessarily.
    • An excessive number of links found on one specific source.
    • Links on irrelevant sites, such as an industry directory for an industry that’s irrelevant to your business.

    If you avoid building links on these types of sources, you should be in the clear, at least for the most part. There’s still a small chance that you could be the victim of negative SEO, which occurs when a competitor or other third party builds a negative backlink without your permission. These instances are rare, but they do occur, so it pays to stay apprised of your backlink profile.

    When to Use the Disavow Tool

    The Disavow Tool wasn’t created to be an easy out for these instances. Google takes negative backlinks very seriously, and allowing webmasters to instantly and permanently remove a ranking penalty would defeat the purpose of ranking penalties altogether.

    Instead, it exists as a last-ditch effort for webmasters to remove a particularly harmful link, or set of links. After all other options have been exhausted, it can be used as a request for Google to overlook those specific links when it scans the web. Requests are reviewed manually and judiciously; the majority are ultimately ignored.

    Keep this in mind as you audit your backlink profile or attempt to improve your positions in Google. If you notice your rankings or your organic traffic taking a sharp drop, use a link monitoring tool like Open Site Explorer to determine whether a bad backlink could be the root of your problem. If it is, and you cannot remove the link in any other way, you’ll need to refer to the help of the Disavow Tool.

    Step One: Remove the Links You Can


    Before you log into Google Webmaster Tools, remove whatever links you can on your own. For example, if you’ve posted a comment on a forum with a backlink pointing back to your site, try logging into that site and deleting your comment.

    Of course, this strategy isn’t always going to work. In many cases, you’ll need the assistance of the presiding webmaster to get your link taken down. Most sites list the webmaster contact on their contact page, so send a polite email requesting that your link be formally removed. If the contact isn’t listed, a quick Google search of the site’s domain will often turn one up. In the vast majority of cases, webmasters will be more than happy to help you out. However, if the webmaster in question is unresponsive after multiple follow-ups, or flat-out refuses to help, you may need to move on to the next step.

    Step Two: Verify Webmaster Tools (if You Haven’t Already)


    Before you gain access to the Disavow Tool, you’ll first need to verify your Webmaster Tools account with your domain. The easiest way to do this is to upload an HTML file (which Webmaster Tools provides) to your site, but you can also verify your account through Google Analytics or your domain registrar. Once your domain is verified within Webmaster Tools, you’ll be able to log in and complete the next steps of the disavowal process.

    Step Three: Download Your Links


    First, you’ll need to download a full list of backlinks pointing to your site. Log into Webmaster Tools, select your domain, and once in your dashboard, click on “Search Traffic.” From there, click on “Links to Your Site,” and under “Who links the most,” click “More.” Once there, click on “Download more sample links”—as an alternative, you can click “Download latest links” and get the dates associated with your links as well. This will allow you to download a file that contains all pages and links pointing back to your domain.

    Step Four: Upload Links to Disavow

    Create a text file (.txt) that contains all the links you wish to disavow, using the link list from the file you downloaded. Be sure to only include the problematic links that you were not able to manually remove. Once you’ve got that all set, you can head to the Disavow Tool itself, click “Disavow links,” follow the prompts, and upload your text file. Uploading a text file will automatically replace any previously uploaded text files, so make sure your file is up-to-date. It will take at least a few days, probably a few weeks, before Google reviews your requests.
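    For reference, the disavow file is plain text with one entry per line: a full URL disavows a single page, while the `domain:` prefix disavows every link from an entire site. Lines beginning with `#` are comments. The domains below are placeholders:

    ```
    # Webmaster did not respond to two removal requests
    http://spam-directory.example.com/links/page1.html
    # Disavow every link from this site
    domain:shadyseo.example.net
    ```

    Prefer the `domain:` form when an entire site is problematic, so new links from it are covered automatically.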

    Remember, the Disavow Tool isn’t a magic solution to get rid of all your linking problems. You’ll need to carefully restructure your backlink building strategy, and actively work to remove any negative links long before you even consider using the Disavow Tool. Even when you do make a submission, it’s not uncommon for Google to reject your request. Your best strategy moving forward is to make sure you only build the highest quality links for your backlink profile, and catch problematic links proactively.

  3. Why Is Google Pulling Out of So Many Countries?

    Google has ceased operations (or a portion of its operations) in a number of countries over the past month, including China, Spain, and Russia. The pullouts, though individual and motivated by separate pressures, form an intriguing pattern of international relations for the company.

    Pulling Out of China


    Google’s China domain, once google.cn, now officially redirects to google.com.hk, which calls itself the “new home” of Google China. For a number of years, Google complied with Chinese requests to censor and alter its search results for the Chinese population (such as restricting information about the 1989 Tiananmen Square protests), but this recent switch means that Google is no longer willing to comply with those stipulations. Uncensored search results can now be found on google.com.hk, though whether the Chinese population can access that domain is still controlled by the Chinese government.

    Renegotiations stretch as far back as January of 2014, when Google and the Chinese government began discussing the state of Chinese censorship and Google’s future in the country. Google’s official statement on the matter revealed that a cyber-attack from China was the first domino in a chain of incidents that led the search engine giant to discontinue services such as Google Search, News, and Images in China. Services are still being provided on Hong Kong servers.

    Google is refusing to further compromise its search results in response to foreign governmental requests. The decision was not motivated by external pressure or intended as a punitive move; instead, it was a calculated choice that Google believed would work out best for the greatest number of people.

    Pulling Out of Spain


    In a somewhat differently motivated move, Google is officially putting a halt to its News service in Spain. Earlier in 2014, Spain passed a new amendment to its copyright law, set to take effect in 2015. Though frequently defended as a move against piracy, the amendment earned the colloquial name “the Google tax.” Under it, the writers of journalistic articles are entitled to monetary compensation for segment-based views of their work on news aggregation sites, such as Google News. Further complicating matters, this right is inalienable, meaning that writers and publishers cannot waive it under any circumstances.

    In response to this upcoming legislation, Google formally announced that it is ceasing Google News provision in Spain entirely. Pulling an Internet-based service out of one specific country is, of course, challenging: while users will no longer be able to access Google News at news.google.es as they have in the past, they’ll easily be able to access the Google News pages of other countries. Since Spanish users will still be able to access this “snippet” content, Google may still be held liable under the new legislation, so 2015 will be an interesting year for the future of Google in Spain (and Europe as a whole, considering other European countries may soon pass similar legislation).

    Rather than making an independent decision in this case, Google is simply responding to an external threat. While Spain’s amendment is not specifically targeted toward Google, Google stands to lose much from the new proposal, and felt the need to mitigate or prevent those losses.

    Pulling Out of Russia


    Following its decision to pull Google News out of Spain, Google reported that as of January 1, it will be closing its engineering offices in Russia. This comes as a result of the Russian Parliament passing a new bill that mandates all foreign Internet companies store the data of Russian citizens within Russian borders, beginning in 2016. While Parliament claims the bill’s intent is to protect Russian users’ information and prevent foreign spies from gaining access to it, most critics believe it is simply a move designed to restrict the availability and flow of information. Though other Google services will still be available to Russian citizens, Google will not comply with the Russian law.

    The motivation behind this decision echoes those driving the pullouts from Spain and China: the Spain pullout was a response to new legislation, the China pullout was driven by a desire to protect the availability of information, and the Russia pullout was prompted by new legislation designed to restrict information.

    The “Right to Be Forgotten” Controversy

    While these pullouts have all occurred in the month of December, Google’s international woes stretch back to May 2014, when the “right to be forgotten” privacy ruling by the European Union first started making headlines. Under the new EU decision, private citizens are able to make requests to major Internet-based providers like Google that certain links to information about themselves be removed from databases. Unless a compelling case can be made that the information is necessary for public interest, the links must be removed.

    This new decision put significant pressure on Google, which has removed over 169,000 links. Though some have argued the decision is a positive step for privacy rights, it also forces some factual, information-based links out of public availability. Since Google is a major proponent of open information availability, the decision came as a harsh blow.

    Google to Be Broken Up?

    Furthering the tensions between Google and the EU, the European Parliament has also been pushing an antitrust investigation to determine whether Google, as a company, constitutes a monopoly in the Internet search industry. While there are clearly other search competitors around, such as Bing and Yahoo!, Google is the dominant player. Should the EU get its way, it will demand that Google be broken up into separate, distinct companies.

    Such a decision would put enormous pressure on Google, though Google could likely find a way around such an eventuality.

    Fines From Spain, France, and the Netherlands

    Around mid-December, the Netherlands sent Google a warning that the company needs to update its privacy policy to provide “clear and consistent” information about how users’ personal information is used. Dutch regulators cite a number of ambiguities and inconsistencies in Google’s privacy policy, and after such intense attention and pressure from the European Union, they are cracking down harder than ever. Should Google fail to update its privacy policy by February 2015, the Netherlands is threatening to issue an $18.7 million fine.

    This fine would be an additional blow to Google, after similar (though less expensive) fines handed down from Spain and France for committing the same privacy policy offenses.

    The Future of Google’s International Operations

    It’s unclear what the future holds in store for Google, but there are two major threats working against Google’s traditional international operations. First, secretive countries like Russia and China are making harsh demands about information availability to their respective populations, and Google wants to maintain open channels. Second, European countries are on a crusade to protect private user information and weaken the perceived monopolistic Google, and Google wants to maintain its independence and continue making information indiscriminately publicly available.

    Aside from Google’s desire to remain a singular, powerful business entity, the root of all these pullouts and tensions is Google’s wish to keep information publicly and widely available. The search engine giant will undoubtedly face increased pressure in 2015 on all fronts, and it will be interesting to see how it responds.

  4. How Facebook Could Be Stifling Google’s Growth

    For years, we’ve collectively recognized Google as the biggest name on the Internet. They’re responsible for refining, and some would say perfecting, the course of online search. Every year, they come out with new products and features that enhance and streamline online user experiences. Even their culture is impressive, rewarding their employees as well as their users with fun, seamless experiences.

    Over the course of the past 15 years, Google has grown exponentially, but that momentum is now starting to wane. There are several factors responsible for the slowdown, as with any change in business, but one of the biggest might be coming from an unlikely contender: an indirect but massive competitor in the social media world.

    Google’s Slow Decline

    Google is still the powerhouse to beat in the search world, but as a company, it’s starting to lose some momentum. Its recent quarterly financial report, despite still showing growth, contained significant points of concern for investors: growth was projected at 20 percent, but came in at only 17 percent.

    Certainly, Google’s explosive growth couldn’t last forever. There is a finite number of potential users of the search engine, and Google has become so ubiquitous that there are simply fewer potential new users to go after.

    While the financial report did not give any explicit reasoning behind the missed projections, it’s reasonable to assume that increased competition could have been a factor. After all, Bing and Yahoo! are starting to see increased popularity as online search options. But it’s Facebook, the social network powerhouse, that might be giving Google a run for its money.

    Users and Content


    Already, people are starting to turn to sources other than search engines to get their content. Many users scroll through their Facebook news feeds to catch up on the stories of the day, and most log into Facebook daily. With 1.5 billion active profiles, Facebook is a force to be reckoned with, and Google will likely continue to see declining figures as more users rely on alternative sources to consume content and get information.

    Facebook Advertising


    Facebook advertising may be the single biggest throttle on Google’s stream of revenue and growth potential. Since Google searches and most Google products are completely free to use, Google makes the majority of its substantial income through paid advertising, like the PPC ads you see at the top and side of your search results. For years, businesses have paid top dollar to be featured in one of these slots and gain traction by attracting new clicks.

    Today, those ads are harder to get. Visibility has increased, but so has demand from companies competing for those coveted spaces. As a result, the cost of paid advertising with Google has skyrocketed, sending small businesses running to competitors.

    While Bing has been trying its best to launch a cohesive and efficient paid ad engine, Facebook has emerged as the most valuable alternative for businesses. Through Facebook, businesses can spend as little as $5 a day for as long as they want, and narrow the scope of their target audience by age, sex, geographic location, and even personal interests. That’s far more specific targeting than Google offers with its keyword-based platform, and since many businesses are marketing through Facebook anyway, the integrated ad platform is capturing their revenue.

    Facebook Atlas and User Knowledge

    The trend of usage and practicality for Facebook advertising is only going to grow. For the past several months, Facebook has been teasing the announcement of Atlas, and the platform is now live. Atlas is described by Facebook as “people-based marketing,” a marketing and advertising solution for businesses who need something a little deeper and more effective than the Facebook ads of yesteryear.

    Atlas is built on an entirely new code base and offers more integrated campaign management. The usual audience targeting and cost management features are present, but with more detailed metric reporting and tie-ins to offsite sales, so businesses can accurately project the ROI of their campaigns.

    But the greatest feature of Atlas, and the biggest threat to Google, is the inherent understanding of individual users that Facebook has accumulated over the years. Facebook has slowly and diligently been collecting incredibly detailed pieces of data on all its users, just waiting for the perfect opportunity to monetize it. Facebook knows much more about its users than Google does, and with Atlas, it is prepared to turn that advantage into cash.

    Furthermore, Atlas is looking to become an ad-serving platform that extends all across the web, not operating within the confines of any one platform, making it a direct competitor to Google and challenging Google’s own product, DoubleClick. Atlas is also ignoring cookies entirely, entering the fray with a new form of user tracking based on the unique URL of Facebook users’ profiles.

    The bottom line here is that Facebook has a greater pool of knowledge than Google, and with Atlas, it might have greater functionality. It’s certainly a major player in the world of digital paid advertising now, and Google has a right to be concerned. In the coming years, don’t be surprised if Google PPC campaigns start to wane in price and significance as Atlas makes improvements and starts to leech more of Google’s revenue.

    Looking to the Future

    Google is still one of the biggest, most profitable companies in the world, and it isn’t going away anytime soon. Even if Facebook is radically successful with its latest product, and even if it continues to refine its strategies and grow, Google will still have the power and the authority to continue its pattern of growth. That growth might be limited, but it will still exist.

    Over the course of the next several years, you can expect that Facebook’s Atlas will be followed up by a series of related products and services, taking advantage of Facebook’s ever-growing database of user information to bring in more revenue and attract more potential customers. Right now, Facebook is barely causing a dent in Google’s overall strategy, but after five or ten years of aggressive pushing, they could force Google into a direct competitive situation. What would happen as a result of such a showdown remains to be seen, but it’s likely that the two entities will learn from each other and continually try to one-up each other in an effort to achieve dominance.

    What It Means for Marketers

    Since Google isn’t dying, you don’t need to worry about the future health of your SEO or PPC campaign. However, it might be worth your time to invest some money in a new Facebook Atlas campaign and see how well the platform works for your business. As the platform grows, you’ll likely tap into some impressive new functionality, and your ad campaign will likely benefit.

    The emergence of a new competitor to Google can only mean good things for the individual marketer. Increased competition means you’ll see better functionality rolled out at a faster pace, you’ll see prices drop in an effort to generate more revenue, and you’ll be in the middle, taking advantage of each platform to maximize your reach and impact.

  5. What’s Next After Panda, Penguin, and Pigeon?

    Google likes to keep search marketers on their toes. Its search engine algorithm, kept top secret, has evolved gradually over the course of more than 15 years, but its biggest changes have come in sudden spikes. Google releases major updates to its algorithm in big packages, which roll out over the course of a few days and have traditionally caused great volatility in the search rankings of countless businesses. Google also releases tiny updates, fixes, and data refreshes as follow-ups to these massive updates, but they don’t make nearly as many waves.

    The big players of recent years have been the Panda update of 2011, the Penguin update of 2012, and the Pigeon update from earlier this year. These updates all fundamentally disrupted ranking principles we had taken for granted, and their impact has dictated the shape of search marketing today.

    Today, it’s easy to understand why Google released each of these updates, but when they first rolled out, they were surprising to everyone. While there is a certain predictable calm in the current search marketing world, it’s only a matter of time before Google changes the game again with another revolutionary new update.

    So what will the nature of the next update be? And what can we do to prepare for it?

    Panda and Penguin: Two Sides of the Same Coin

    In order to understand the possibilities for the future, we have to understand the context of the past. The Panda and Penguin updates served as complementary rollouts, targeting the negative practices of onsite SEO and offsite SEO, respectively.

    The Panda update came first in 2011, shaking up the results of almost 12 percent of all search queries. The update came as a surprise, but it was a natural response to some of the practices rampant at the time. Its primary target was onsite content, particularly sites that used low-quality content solely as a mechanism to drive rankings. Accordingly, it penalized those sites and rewarded sites that maintained a focus on providing valuable, enjoyable content. Low-quality, spam-like practices, such as stuffing content with keywords and copying content from other sites, were virtually eradicated.

    The Penguin update came out as a counterpoint to Panda in 2012, doing for offsite link building what Panda did for onsite copywriting. Penguin 1.0 affected just over three percent of search queries, giving it a narrower range than Panda, but the sites it did affect were affected enormously. Penguin targeted sites that paid for external links, built external links on irrelevant sites, or spammed links in irrelevant conversations. Conversely, it rewarded sites that built more natural links in a diversified strategy.

    Enter the Pigeon Update


    The Pigeon update was slightly different from its cousins. Like them, it was a major update that fundamentally changed an element of SEO, but it was never officially named by Google. It was released in the early summer of 2014.

    The Pigeon update was designed to change results for local searches. Rather than attempting a global change, like Panda and Penguin, Pigeon focused only on redefining searches for local businesses. Through Pigeon, local directory sites like Yelp and UrbanSpoon got a significant boost in authority, and businesses with consistently high ratings on those sites also received a boost. Now, local businesses can get as much visibility by increasing the number of positive reviews posted about them as they can by pursuing traditional content marketing strategies.

    The Bottom Line

    While these updates all surprised people when they came out, and their specific changes are still being analyzed and debated, they all share one fundamental quality: they were rolled out to improve user experience.

    Panda was rolled out because too many webmasters were posting spammy, low-quality, keyword-stuffed content. The update sought to improve user experience by promoting sites with more relevant, valuable content.

    Penguin was rolled out because the web was filling up with random, keyword-stuffed backlinks. The update sought to improve user experience by penalizing the culprits behind such spammy practices.

    Pigeon was rolled out because the scope of local businesses online was getting more diverse, and users needed a more intuitive way to find the ones that best met their needs. Pigeon sought to improve user experience by adding sophistication to its local business ranking process.

    User experience is the name of the game, and it’s the sole motivation behind each of Google’s landmark updates.

    Building Off of Old Structures

    Since their release, Panda and Penguin have been subject to countless new iterations. Data refreshes and updates tend to occur on an almost monthly basis, while major updates have been rolled out annually—Panda 4.0 and Penguin 3.0 both rolled out in the past few months. Pigeon is still relatively new, but chances are it will see some expansion as well.

    For now, it seems that Google is trying to build off of the structures that already exist within the confines of its greater algorithm. Rather than trying to introduce new categories of search ranking factors, Google is refining the categories it’s already introduced: onsite, offsite, and now local. It’s likely that Google will continue this trend for as long as it continues to improve user experience, gently refining its quality criteria and targeting emerging black hat tactics as they arise.

    However, it’s only a matter of time before Google discovers a new category of refinement. When it does, the update will likely be just as surprising as the big three, and will warrant its own series of updates and refinements.

    What the Next Overhaul Could Bring


    If we’re going to predict the nature of the next update, we need to understand two things: the emergence of new technology and the fundamental focus Google maintains on improving user experience. The next major Google update will probably have something to do with significantly improving the way users interact with one or more rising technologies.

    The Knowledge Graph

    One option is a radical expansion of the Google Knowledge Graph. The Knowledge Graph, that box of helpful information that appears to the side when you search for a specific person, place, or thing, is changing the way people search—instead of clicking on one of the highest-ranking links, they’re consulting the information displayed in the box. The next Google update could change how prominently this box is displayed, and how it draws and presents information from other sites.

    Third Party Apps

    Google has already shown its commitment to improving user experience through the integration of third party apps—it’s favoring third party sites like Yelp and UrbanSpoon in search results, and is integrating services like OpenTable and Uber in its Maps application. The next search algorithm update could start drawing more information in from these independent applications, rather than web pages themselves, or it could use app integrations as a new basis for establishing authority.

    The Rise of Mobile

    Smart phones are ubiquitous at this point, but wearable technology is still on the rise. The swell of user acceptance for smart watches could trigger a new update based around proximity searches, voice searches, or some other facet of smart watch technology. Since smart watches are in their infancy, it’s difficult to tell exactly what impact they will have on search.

    No matter what kind of update Google has in store for us next, it’s bound to take us at least slightly by surprise. We can study its past updates and the new technologies on the horizon all we want, but Google will always be a step ahead of us because it’s the one in control of the search results. The only things we know for sure at this juncture are that Google will eventually release another massive update, and that its goal will be improving user experience.

  6. 5 Ways Smart Watches Could Impact Local Search

    Leave a Comment

    Smart watches have been on the horizon for years, sometimes seeming like a joke and other times seeming like the next big thing. Now that Apple is on board with the Apple Watch and tech companies at every level are looking to get on board with the technology, it appears inevitable that the age of smart watches will soon be upon us.

    At first glance, the change may not seem significant; early prototypes of smart watches appear to function just like smart phones, except attached at the wrist and with a smaller screen. But the age of technology that smart watches are influencing will soon grow to disrupt traditional search marketing strategies, and if you want to avoid getting left behind, you’ll have to start adjusting your campaigns accordingly.

    Local search appears to be the area of search most susceptible to changes from the smart watch trend. Since users will start wearing technology on the go, they will demand more efficient, more relevant, and easier-to-interpret results for their local queries.

    As you start to refine your strategy, consider these five potential ways that smart watches could revolutionize local search:

    1. Proximity Will Become a Factor.


    Proximity already matters. When a user logs onto a laptop and starts a search, Google will recognize the general location of the user and generate results accordingly. For example, the search engine may detect that a user searching for “great burgers” is in Dallas, and populate some of the most well-reviewed burger restaurants in the city.

    Smart watch users will demand more specific results, and search engines will be happy to give them. By tracking a user’s exact location (and storing the exact locations of known local establishments), smart watches would conceivably give more accurate proximity-based results, giving users the closest burger restaurants to them with up-to-the-minute adjustments for moving targets.

    Proximity would also be a factor for local businesses looking to take advantage of smart watch technology. For example, a local coffee house could feasibly send out a discount coupon to smart watch users who enter the perimeter of the restaurant at a specific time, essentially producing a form of proximity-based promotion.

    Companies that take advantage of these proximity-based features will likely be rewarded in two ways: first, they’ll be more likely to show up in relevant searches because they’re optimized for location, and second, they’ll generate more foot traffic from early adopters looking forward to cash in their location-based coupons.
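    To make the proximity-promotion idea concrete, here is a minimal sketch of how such a geofence trigger might work, using the standard haversine formula for great-circle distance. The shop coordinates, promotion radius, and function names below are hypothetical, not taken from any real product.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    R = 6371000  # mean Earth radius in meters
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

# Hypothetical coffee house and promotion radius (illustrative only).
SHOP = (32.7767, -96.7970)  # downtown Dallas
RADIUS_M = 100

def should_send_coupon(user_lat, user_lon):
    """Trigger the coupon only when the wearer is inside the geofence."""
    return haversine_m(user_lat, user_lon, *SHOP) <= RADIUS_M
```

    A real implementation would sit behind a location-permission prompt and rate-limit the alerts, but the core check is just a distance comparison like this one.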

    2. Users Will Rely on Voice Search.


    Voice search is a technology already in use, but for a number of reasons, it has yet to catch on. Users are still accustomed to typing in their search queries, and many users don’t even know voice search exists on Google. Siri, Apple’s virtual assistant, has helped to popularize the possibility of implementing computer functions with vocal prompts, but the inefficiency of the system has led to many people avoiding it entirely.

    However, virtual assistants and semantic voice recognition have evolved over the course of several years. The technology is capable of giving users much more relevant results, dissecting the intent behind the spoken message and fetching results accordingly.

    Smart phone screens are already small and difficult for some users to type on, and smart watches will only make those screens smaller and more difficult. Users will be almost forced to rely on voice search to execute their queries.

    This shift in user adoption will force a change for search marketers in two ways. First, search marketers will need to include more phrase-based messaging on their web pages, including more colloquial and conversational language. People speak differently than they type, and search marketers will need to adapt to a new common input. Second, search marketers will have to contend with multiple search engines—the voice search functions of major search engines like Google as well as personal assistants like Siri.

    3. Alerts and Shorter Messages Will Become the Norm.


    Smart watch screens will be smaller, and since the technology will be attached to a user’s wrist, it will be more difficult to play with. As a result, the technology will demand shorter, more immediate forms of communication with its accompanying user. Messages will need to be shorter, and concise, immediate alerts will take precedence over any other medium of communication.

    As a result, search engines will begin to show preference toward businesses with short streams of message content instead of long-form, detailed content. Users themselves will also prefer to follow and engage with companies that offer concise, valuable alerts and content instead of longwinded or cumbersome messaging. Tech giants will start to favor apps and integrations that offer convenient user alerts, and businesses that adapt to those changes will earn more visibility.

    The proximity-based offers I mentioned above are a part of this potential system; businesses can give special offers to customers who visit locations in-person, or design an alert system to let users know of recent changes.

    4. Wearable-Specific Content Will Become Relevant.

    Some companies might attempt to optimize their content to be visible on any format, including desktop, mobile, and wearable technologies, but the next step of content evolution is personalized content, which seamlessly integrates real-world and digital-world experiences. Wearable technology will start to serve as the gateway that allows such a bridge between worlds to form.

    For example, when users are eating at a restaurant, wearable technology could theoretically alert them to the various stages of preparation their meals go through, integrating a digital experience into a traditional one. Pizza chains already offer a form of this technology online, and QR codes have already attempted to start a trend of using real-world establishments to spark digital experiences, but wearable devices will serve as the first generation of technology to solidify that bridge.

    As a result, companies will need to begin offering wearable-specific content and wearable-specific experiences. Search engines will favor establishments that have taken the steps necessary to push that trend, and users will gravitate toward the businesses that offer the best overall experience.

    5. Web Pages Will Wane in Significance.

    Already, users are starting to abandon the old formats of online experience. Instead of relying on a browser window and a URL bar, users are relying on individual apps and integrated experiences to accomplish their goals and work. Google is starting to promote this trend by integrating third party applications into its broader network—for example, it recently integrated OpenTable and Uber functionality into its Maps application, and it increased the page rank for third party local directories like Yelp and TripAdvisor with the Pigeon update earlier this year.

    If you want to stay relevant for search engines, you’ll need to find an alternative way to get your business online. It’s unlikely that traditional web pages will disappear overnight, but gradually, they will decline in significance. As a search marketer, you need to start hedging your bets and increase your visibility in as many ways as possible.

    These paradigm shifts will be gradual, especially considering only a small portion of the population will be early adopters of smart watch technology. However, the local businesses that adapt the fastest will earn the fastest, most significant rewards. Stay ahead of your competition by refining your strategies early, and preparing for the inevitable shakeups that smart watch technology will cause.

  7. 7 Ways the Knowledge Graph Could Change SEO Forever

    Leave a Comment

    The Google Knowledge Graph is a feature that started rolling out back in 2012 in order to improve the amount of information available online and the speed at which users could find it. It sounds like an amazing initiative—after all, the faster users can find relevant information, the better online experience they’ll have. However, the future of the Knowledge Graph could completely disrupt the world of search engine optimization, and decrease the value of the strategy altogether.

    Today, the Knowledge Graph exists in a relatively straightforward form. When a user sends a search query for a specific entity, Google will scour the web to pull and analyze properly formatted information about that entity, and display it in an organized fashion on the right side of the screen. For example, if a user searches for “Barack Obama,” the Knowledge Graph will display important biographical information, such as his birthday, full name, and of course, the fact that he’s the 44th president of the United States.

    Google gathers this information by dissecting and interpreting content found on external authoritative sites. That content is most efficiently readable when it’s marked up with a structured microformatting vocabulary, such as the schema.org templates that exist for various categories of entities. Currently, the Knowledge Graph only covers a handful of categories of information, but as it expands, it could offer more information on more topics.
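    To illustrate the kind of markup involved, here is a sketch that serializes a schema.org Person record as JSON-LD, one of the structured-data formats Google can parse. The schema.org Person type is real; the Python wrapper and the specific field values are illustrative only.

```python
import json

# A sketch of schema.org "Person" markup, serialized as JSON-LD.
# Embedding this in a <script type="application/ld+json"> tag helps
# crawlers extract the kind of entity facts the Knowledge Graph displays.
person = {
    "@context": "http://schema.org",
    "@type": "Person",
    "name": "Barack Obama",
    "birthDate": "1961-08-04",
    "jobTitle": "44th President of the United States",
}

markup = '<script type="application/ld+json">%s</script>' % json.dumps(person)
print(markup)
```

    The same pattern applies to other categories the Knowledge Graph covers, such as places and organizations, by swapping the `@type` and its fields.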

    The Knowledge Graph doesn’t have much impact on search as it stands today, but as it grows in both sophistication and user acceptance, it could have significant consequences for search marketers:

    1. Fewer Visitors Will Find You When Looking for Information.


    Google is trying to simplify the process of obtaining information. In the old way of searching, if you wanted information on a subject, you would type the query into Google, then sort through the results until you found what you were looking for. The Knowledge Graph immediately cuts out the last step of that process by providing such information directly to web searchers.

    That’s a good thing for most web users because it ultimately saves time, but many companies have fought hard to earn the top ranks for those search results, and they depend on the information-seeking traffic as a huge component of their overall web traffic. Their content strategies are based on providing information and positioning themselves as an authority, and as a result, they get thousands of visitors seeking information. Theoretically, the Knowledge Graph could dramatically reduce that traffic.

    2. You’ll Have More Targeted Traffic.


    There is a positive side to that dramatic traffic reduction. Let’s say a user is intentionally searching for information on a specific movie, and your site provides that information. If the user reaches your site and finds the information he/she is looking for, he/she will likely leave immediately afterward. You may be getting a thousand hits from people looking for information, but those thousand hits are leaving after they get what they wanted out of you.

    The Knowledge Graph will filter out that traffic by providing them with that information right away. You’ll be left with more specific, targeted traffic—the people who want to visit your site for reasons other than basic information. The Knowledge Graph will also force users to type more specific queries, bypassing that initial wave of information in order to dig deeper and get more specific results. That means as long as you provide niche content to meet those queries, your conversions could actually increase.

    3. There Will Be Greater Demand for Contextual Clues.

    In order to attract more targeted traffic, your blogs and web pages will need to become more specific. It’s no longer enough to write posts that cater to specific keywords—like “Barack Obama.” Instead, you need to do more to ensure that the specific topic of your post is easily understandable to Google. For example, if your blog post is specifically about Barack Obama’s greatest accomplishments, you should spend less time covering background information on the president and more time showcasing the specifics.

    Doing so will help you avoid the problem of overcrowding in search results pages and rank for the hyper-specific pages your users will soon demand. It’s a less predictable strategy, but if you’re consistent, you’ll be rewarded with a greater, more relevant flow of users.

    4. Information Will Be in Less Demand.


    I covered this partially in point two, but the demand for information will rapidly decrease once people get used to the Knowledge Graph. If information is immediately available after briefly typing the topic into a search bar, why would users need to rely on the authority of a specific blog to get their information?

    All information-based content strategies will require a major overhaul. While it’s fine to provide some baseline information about these topics, your users will demand something more from you, and if you want to stay relevant in search results as well as with your audience, you’ll have to step up your game. More interactive, personalized content with walkthroughs, guides, case studies, examples, and engaging collateral features are all going to become more important, especially as Google starts adding more categories to their already-impressive Knowledge Graph repertoire.

    5. Knowledge Graph Ads Will Become a Viable Strategy.

    The Knowledge Graph will likely attract considerably more attention than the remainder of the SERPs, and Google realizes this. While the Knowledge Graph is currently dedicated only to providing accurate, relevant information to the searcher, don’t be surprised if Knowledge Graph ads emerge as a Google product offering in the near future.

    It’s not certain how much these will cost in comparison to traditional PPC ads, but their visibility and click-through rates will probably be superior. If you want to guarantee yourself some search visibility, consider investing in the strategy when it starts to emerge.

    6. The Gap Between Authoritative and Non-Authoritative Sites Will Widen.

    It’s difficult to make yourself stand out as an authority on anything, but Google has already made its mind up on the authoritative influencers for most Knowledge Graph categories (such as people and places). Most of its entries are heavily based on information found on Wikipedia and Freebase. At this point, it’s highly unlikely that the most authoritative sites on the web will ever be overthrown, meaning it will eventually become nearly impossible to emerge as an informational authority. Experience will matter more than information, but the inability to cultivate authority from information is a serious blow to some strategies.

    7. Users Will Demand More Immediate Experiences.

    The Knowledge Graph will likely tie into wearable technologies like Google Glass and smart watches to give users immediate results and information. As a result, users will start to demand even more immediate experiences, losing patience for any system that requires hunting and analyzing to find something they need. As a result, the winners of the search war will eventually be the ones who can provide that immediate experience, whether that’s in the form of an incredibly specific and helpful website, an integrated app, or an affiliate partnership with Google. Eventually, people may no longer rely on searching for traditional websites.

    It might be a little too soon to start worrying that the Knowledge Graph will destroy SEO as a strategy, but it is important to be wary of its potential impact. For now, implement microformatting throughout your site, work on providing the most relevant, accurate information for your customers, and hedge your bets by investing in inbound strategies other than basic SEO, such as social media marketing.

  8. How to Audit Your Local SEO Strategy

    Leave a Comment

    Local SEO is growing in importance, and too many companies are neglecting it in favor of a national strategy. Local SEO doesn’t take much more effort than a national campaign, and it rewards participants with a greater visibility in an environment with less competition. Ranking on page five for a national term isn’t worth nearly as much as holding a number one position for that same term on a local level, even if the potential audience is somewhat smaller.

    After Google’s recent Pigeon update, the scope and environment for local SEO has changed dramatically. There are now dozens of new ranking factors, stemming from third party sites and user reviews, which can affect your overall ranking for a local search term. The good news is that you don’t have to spend as much effort stuffing keywords into your content, but in exchange, you have to rely more on the actions of your customers and audience to fuel your authority.

    If you want to get a read for the health of your local SEO campaign and find direction for any changes you’ll need to make, it’s a good idea to perform a high-level audit.

    Here’s how:

    Get a Campaign Snapshot


    Before you start looking at the individual factors that are affecting your authority and rank, you’ll want to get a relative measurement of how your campaign is performing. For this, you’ll want to look at some of the same metrics you’d use in a national campaign, with extra attention to your user demographics.

    Organic Visits

    Log into Google Analytics and check out the Acquisition tab, whose Overview will show you a breakdown of how many site visitors you had, and where those visitors came from. Pay special attention to the Organic Traffic number—this is the number of people who came to your site from searching for a term. Social Traffic is also important, especially if you have an active social media presence as a part of your overall campaign.

    Your Organic Traffic figure should grow from month to month fairly consistently. If you notice the numbers growing stagnant, it could be an indication of something wrong with the campaign.


    Local Demographics

    While still in Analytics, head over to the Audience section and take a look at the Overview. Depending on the operating range of your company and which local markets you’re targeting, you can look at the country/territory of your users, or the city, under the “Demographics” tab on the left. Analytics will break down your user visits as a total number of visitors and as a percentage of your total traffic. A high percentage of local visitors is generally an indication of a high-quality local optimization campaign.

    Once you know where you stand with organic visits and demographics, you can look at the individual components of your campaign and analyze how they are influencing the broader numbers.

    Check Your Offsite Presence


    One of the most important elements of a post-Pigeon local optimization campaign is your business’s presence on as many third party and local directory sites as possible. The go-to example is Yelp, an aggregator of local business information and customer reviews, but there are several other sites with a niche focus, such as UrbanSpoon or TripAdvisor.

    Claim your company’s account on as many of these platforms as possible. You’ll want to do this for two reasons: first, you’ll be able to verify your information’s consistency across the web, especially your name, address, phone number, and business hours. Second, you’ll have more opportunities to cultivate reviews, but we’ll get more into that in the next section.

    Claiming your profile and verifying your information on these sites is usually a one-time process, but you’ll want to check back every so often to make sure your information is still up-to-date. You’ll also want to do a quick check to see if there are any new, relevant directories that have emerged and claim your account early.
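    As a rough illustration of what verifying consistency can look like in practice, the sketch below compares the name, address, and phone number (often called "NAP" data) in each directory listing against a canonical record after light normalization. All of the business data, directory records, and function names here are made up.

```python
import re

# Canonical record for a hypothetical business.
CANONICAL = {
    "name": "Joe's Coffee House",
    "address": "123 Main St, Dallas, TX 75201",
    "phone": "1-877-545-4769",
}

# Hypothetical listings pulled from each directory.
listings = {
    "Yelp":       {"name": "Joe's Coffee House", "address": "123 Main St, Dallas, TX 75201", "phone": "1-877-545-4769"},
    "UrbanSpoon": {"name": "Joes Coffee House",  "address": "125 Main St, Dallas, TX 75201", "phone": "(877) 545-4769"},
}

def normalize(field, value):
    """Lowercase, strip punctuation; phones compare on their last 10 digits."""
    value = value.lower().strip()
    if field == "phone":
        return re.sub(r"\D", "", value)[-10:]
    return re.sub(r"[^a-z0-9 ]", "", value)

def inconsistencies(listings, canonical):
    """Return (site, field) pairs where a listing disagrees with the record."""
    issues = []
    for site, record in listings.items():
        for field, expected in canonical.items():
            if normalize(field, record[field]) != normalize(field, expected):
                issues.append((site, field))
    return issues
```

    Note how the normalization forgives cosmetic differences (a missing apostrophe, phone formatting) while still catching the genuinely wrong street number in the UrbanSpoon listing.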

    Analyze and Nurture Your Reviews


    Checking your business information is only the first half of the local directory audit. The second part is more intensive, and arguably more important for your customer relationships. These sites all share one core feature: the ability for customers to post public reviews. The more high-quality reviews you have, the better—they look good to other customers and even send an authoritative ranking signal to Google.

    Take a look at the number of new reviews you’ve gotten, and how positive those reviews are. If you’re getting a high number of negative reviews, read them carefully and try to figure out what you can change to encourage more positive reviews. If you aren’t getting many reviews at all, you need to do more to encourage your in-person customers to leave feedback. (Remember, it’s a violation of policy to directly ask for reviews. Instead, simply direct your customers to the review site itself and leave the decision to review up to them).

    You’ll also have the opportunity to reply to reviews. This is a good chance to reinforce positive experiences, and make up for any negative ones.

    Scrutinize Your Link Profile

    Like with any SEO campaign, you’ll want to take a look at your link profile, especially if you notice your organic traffic numbers dropping. You’ll need a lot of links to gain authority, but you also want to make sure those links come from quality sources. Use a tool like Moz’s Open Site Explorer to search for instances of your links on external sites. If any of them look suspicious or unfamiliar, take a closer look. If you don’t remember building a link, or if you suspect it may be harming your domain authority in any way, reach out to the webmaster and ask for it to be taken down.
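    As a sketch of how a first pass over an exported link list might work, the snippet below flags linking domains that aren’t on a known-good list so you can review them manually. The domain lists and URLs are hypothetical stand-ins for a real backlink export.

```python
from urllib.parse import urlparse

# Hypothetical allowlist of linking domains you recognize and trust.
KNOWN_GOOD = {"yelp.com", "tripadvisor.com", "dallasnews.com"}

# Hypothetical backlink export (e.g., from a link research tool's CSV).
backlinks = [
    "http://www.yelp.com/biz/joes-coffee-house-dallas",
    "http://cheap-links-4u.example/page/81",
    "http://www.dallasnews.com/business/local-roasters",
]

def domain(url):
    """Extract the host, dropping a leading 'www.' for comparison."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

# Anything not on the allowlist is worth a closer manual look.
suspicious = [url for url in backlinks if domain(url) not in KNOWN_GOOD]
print(suspicious)
```

    A triage like this doesn’t judge link quality on its own; it simply shrinks a long export down to the unfamiliar entries that merit human review.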

    Dissect Your Content Strategy

    Finally, you’ll want to take an objective look at your content strategy. Like with a national strategy, you’ll want to ensure your content answers customer questions, covers topics related to your industry, is detailed, and is well written. But local optimization campaigns need to go a step further, with content that frequently mentions your city or region, and occasional pieces that are relevant to the local community.

    For example, you could write about a local event or local news story and feature it on your blog, or you could submit a press release about your company’s attendance at a local celebration. The goal here is to produce enough content to objectively tie your company to the city or region in question. Don’t go overboard—keyword stuffing is a danger here—but if you aren’t producing enough locally optimized content, it could interfere with your demographic makeup and local visibility.

    After performing a local SEO audit, you should have a good idea of where you stand, and what areas you’ll need to improve upon as you move forward. Take some time to outline a plan moving forward, including objective goals related to traffic changes and new initiatives. Set milestones for accomplishing each of these goals, and follow up when appropriate to re-audit your campaign and see whether you hit the mark. Just don’t expect immediate results—auditing your campaign once a month is enough for most businesses.

  9. A New Google Maps: What’s Ahead for Local Business?

    Leave a Comment


    Search marketers usually scramble to analyze and predict the course of Google search algorithm updates, but it’s an application update that has us puzzled this week. Google updates its applications regularly, just as it updates its search algorithm, to keep things fresh and add new features—it’s a standard practice for app development and management, but with an air of Google’s classic unpredictability.

    If you’ve driven anywhere unfamiliar in the last ten years, you’ve probably used Google Maps at some point. Almost a billion people use it as their maps application of choice on a monthly basis, making it the most popular app in the world, and Google wants to maintain that popularity the same way they’ve maintained the popularity of their search function—by improving user experience.

    That user experience improvement most recently came in the form of a full-scale redesign of the app, along with the addition of a few new features. For average app users, the update appears to be a simple facelift, making it easier on the eyes and more “modern” looking. But the fundamental experience of the Maps application is beginning to change altogether, and it could forecast new challenges and new opportunities for local businesses in the future.

    An Analysis of the Changes


    Let’s take a look at the changes that Google rolled out. The fundamental functionality of the Maps application is the same—users can search for locations, and plan routes from one point to another. The biggest changes are visual, barely affecting user experience, but the more significant changes came in the form of added features.

    First, Google overhauled the graphic design elements of the app, changing the interface users have gotten used to since the last major update. It now sports the features of Material Design, a set of styles that Google has begun to adopt for most of its applications. For example, Google Play, Google Newsstand, and Android operating systems have all gotten facelifts based on this design. Google Maps is just the latest to fall in line. Some users have found the new design less easy to use when navigating, but this is subjective; for the most part, navigation is identical to what it used to be.

    The bigger changes are the functional ones—Google has now added both OpenTable reservation options and Uber estimates into its interface.

    If you aren’t familiar with OpenTable, it’s an independent platform that works with restaurants to allow users to schedule reservations quickly and easily. Google now allows users to take advantage of the scheduling service without ever leaving the core Google Maps app—a strange move for the company. When a Maps user finds a restaurant that lists OpenTable as a reservation scheduling option, Google Maps shows a “Find a Table” icon under the typical information section. From there, you’ll be able to select your information (such as the number of people in your party, time, and date), and schedule a table without leaving Maps.

    Similarly, Uber has gotten the in-app treatment from Google with this latest update. You’ve probably heard of Uber by now, but if you haven’t, it’s a popular ride-sharing service, similar to a taxi service, operating in several major cities across the country. Now, whenever you chart a trip using the Google Maps navigation function, as long as you already have the Uber app installed on your phone, you’ll see an “Uber” section, which estimates the length of your trip and the associated fare should you choose to go with the service. From there, users can click a button to open the Uber application.

    A Deeper Look Into Google’s Motivation


    The superficial design changes of Google’s latest Maps update require little explanation. Google is playing the role it always has, trying to give its users the latest, sleekest, most visually pleasing experience possible. It’s the new functionality that has search marketers and tech enthusiasts guessing the search giant’s ulterior motivation.

    By now, it’s pretty clear that Google wants to take over the world—or at least the online world. And so far, it’s done a pretty good job. Google rose to the top of the food chain almost immediately after its release, becoming the go-to search engine for billions of users, and it has remained untouchable as the king of search despite increasing attempts from competitors like Bing to make up ground.

    Google isn’t losing ground in search, but it is starting to lose ground in some specialized, niche markets. For example, Yelp has become a force to be reckoned with online, aggregating information on local businesses and collecting reviews from consumers in order to provide a straightforward and informative database—one that gives users more information than a simple Google search. Similar local directories, like TripAdvisor and UrbanSpoon, have also popped up to serve special verticals. Google was initially reluctant to aid these companies in any way, but it recently released the Pigeon update, which gave a ranking boost to business entries on these directories, particularly those with a large number of positive reviews.

    It shows that Google is willing to acknowledge when another company upstages it—but it’s not willing to let them take its users. Google found a way to make the directories happy, make their shared users happy, but still keep people relying on Google for their needs.

    The principle is the same here. Google is starting to recognize that other applications and companies are more convenient for certain functions, so rather than trying to compete with them or surrendering and sending users away, Google is keeping all its users within the confines of its own application while simultaneously taking advantage of these outside services. Everybody wins.

    Strategic Changes for Local Business

    OpenTable and Uber are the big names here, so if you aren’t affiliated with either of them, this update won’t affect you much. You might see a few more customers using Uber to get to your location, and a few more people sending reservations your way through OpenTable (provided you’re listed with them), but other than that, this particular update shouldn’t send you scrambling.

    Instead, this update is an indication of the shape of things to come, and as a result, you can use it as a guide to adjust your strategy for the future. Google is starting to favor highly authoritative, niche, third-party applications, and that trend will likely accelerate, especially over the next few years.

    Local businesses can take advantage of this by listing themselves on as many of these third party services as possible. Claim your profile on every local directory that pertains to you, and keep watch for new applications emerging in popularity. Cultivate positive reviews and user activity on these sites as much as possible, and respond quickly to any negative reviews or comments.

    This strategy is already useful for local SEO as well as overall user engagement, but it’s only going to grow in significance as Google spends more time trying to provide an integrated, seamless experience for its users. You’ll get more attention, Google will keep its users, independent applications will get more credit, and ultimately, your shared users will have a more enjoyable, more reliable online experience.

  10. How to Diagnose Your Low Conversion Problem

    Leave a Comment

    Conversions are the key to online sales success, serving as the gateway between an interested party and one just passing by. Conversion rates are the filter between your web traffic and your active customer base, so if your conversion rates begin to dwindle, your revenue will take a corresponding nosedive.

    A low conversion rate is not a hard problem to detect. If you’re seeing ample traffic to your website or landing page but you aren’t seeing many people make a purchase or fill out your information form, you have a conversion problem. Determining the root cause of your conversion problem, on the other hand, is more complicated.

    If you’re suffering from a lack of conversions, investigate the root of your problem by focusing on these questions.

    Who Are You and What Are You Selling?


    For a moment, forget everything you know about your business and everything you’ve done with this campaign. Look only at the final destination of your customer—usually the landing page, or the specific page of your website where you want people to convert. Using only the information available to you, form an impression of your business, including your brand identity and what it is you’re selling. If you can’t answer those questions, you may have found the root of your problem.

    Visitors need to see, immediately, the personality of a brand and the core goal of the landing page. For example, a new customer would have zero motivation to fill out a form and hit “submit” if there’s no information about the business requesting such data. At the very least, you should have a link leading to more information about your business, and a clear showcase of your brand for new users. It’s also helpful to have a paragraph (or two) summarizing your business and providing information about your target products and services.

    Who Is Supposed to Be Reading This?


    If your answer to this is “everyone” or something as vague as “web visitors,” it’s time to take another look at your target demographics. The most successful online marketing efforts are the ones with a laser focus, pinpointing very specific demographics with targeted messaging. If you write the same message for a 45-year-old woman and a 16-year-old boy, you’re going to get very different results.

    Only you can determine who your target demographics truly are. Use market research, historical data, or surveys to gather information about your audience, and assess which market segments are most likely to purchase your products. If you have multiple product lines for multiple demographics, you’ll need to split your landing page up into different segments so you can appeal to each independently.

    Once you’ve successfully isolated a key demographic, you’ll need to refine your design and messaging to reflect that demographic’s interests. As a simplistic example, an older visitor might be interested in the safety of your product, while a younger visitor might be more interested in its design.

    How Much Is There to See?

    Great landing pages are minimalistic. Bombarding a user with tons of images and information is a sure way to overwhelm them. Instead, cut down anything that isn’t absolutely necessary for the landing page to function. You’ll need a goal, like a form to fill out or a product to add to a cart. You’ll also need a strong and visible showcase of your brand and business so people know who they’re buying from. And it doesn’t hurt to have a few compelling visual elements. Beyond that, anything else you include could be doing more harm than good.

    A key example of this is the bloated form. A person’s name and email should be plenty of information to allow you to follow up—don’t ask them to fill out 20 different fields. It’s too easy for users to quit halfway through the process, or bail before even attempting it.

    Take a razor blade to your landing page if this is the issue you face. Instead of listing the top 20 benefits of your product, reduce it to three, and try to minimize those three to single-word bullet points to capture more immediate attention. Instead of listing 10 different information fields for users to fill out, cut it back to four. Users’ attention spans are at all-time lows, so don’t count on your information being seen unless it’s some of the only information on the page.

    What Are Users Supposed to Do?

    Again, you’ll need to play the role of someone who’s never seen your landing page or website before. Pretend you’re a first-time visitor, and give yourself three seconds to look at the page’s design. Are you able to instantly tell what it is the site wants you to do? For example, is your form the most prominent visual item on the page, front and center? Is the “add to cart” button (or similar call to action) plainly visible, standing out from all the other elements? If not, you’ll need to make some design changes to make it even clearer to the user. Simpler is always better.

    How Are You Communicating?


    The effectiveness of your messaging is also a crucial component of successful conversion. Already in this article, I’ve written about the importance of targeting your message to a specific demographic, and about keeping things as minimal as possible. Those are important elements of the copy on your landing page, but you’ll need to take things a step further.

    You don’t have much room to work with on a landing page, and first impressions are everything. Look at the most prominent words on the page—usually your headline, first words of paragraphs, and phrasing around calls to action. Are they strong, compelling words, or filler words? Are your sentences clear, concise, and semantically appropriate? What emotions are you eliciting through your messaging?

    If all of this seems new to you, or if you aren’t satisfied with the answers to these questions, you’ll need to do a critical analysis and overhaul of your existing copy.

    What’s the Benefit?

    One more time, I’ll have you pretend to be a first-time visitor with no previous knowledge of your business. You’re seeing your landing page for the first time. Ask yourself immediately—what’s in it for you if you fill out this form or make this purchase? What is the value of taking this action, compared to the cost?

    For a simple purchase, you can make this clear by highlighting all the features of the product in question, along with any special offers—like a discounted price for web visitors. If you’re just looking for a form to be filled out, make sure the user is rewarded for the action. Offer a coupon or a free download of a piece of content. Just make sure it’s clear there’s something valuable available by taking action.

    Optimizing a landing page or website for conversions is an ongoing battle. You’ll never have a form that encourages 100 percent of your visitors to sign up, but with careful attention and responsive tweaking, you can gradually ratchet up your conversion rate and turn more of your site visitors into qualified leads or paying customers. Ultimately, that means more revenue for your business.
