Category Archive: Google

  1. Why Very Long Articles Can Hurt Rankings and Engagement

    You’ve heard it before: content is king, and the more content you have, the better.

    Be careful how you interpret this. More content can certainly be better—having more content means you’ll have more indexed pages in Google, more eyes on your work, and more opportunities to convert your readers into customers. But that “more” word is ambiguous and tricky. More content is only better for your business if the level of quality remains the same; if the quality drops, more content can actually be a bad thing.

    Take, for instance, the very long article—meaning articles of several thousand words or more. It’s true that these have ample body, giving Google lots of text to scan through and giving your users lots of information, but they can actually hurt your search rankings and your user engagement strategy.

    The Page Problem

    The first problem with long articles is what they take away from a short- to medium-size article strategy: page space. When Google scours the web for information on websites, it looks at site maps and page structures for the bulk of its information. It scans your entire blog, looking for clues to who you are, what you do, and what you like to write about, and it prioritizes the titles of those posts when drawing conclusions. Because titles are critically important, very long articles can cause your relevance to suffer: if you only have one title for every 5,000 words, you’re artificially throttling what Google takes into consideration. If you take your articles down to 1,000 words each, you’ll instantly quintuple the number of titles it scans.

    Furthermore, Google loves to see new content. It rewards sites that offer brand-new articles on a regular basis, and tends to decrease the rank of those with fewer updates—even if the total volume of content is high. Publishing one very long article a week instead of five shorter articles is a bad move that can make your site look inferior in Google’s eyes.

    Tired Users

    Google’s eyes aren’t the only ones that matter—you also have to keep your users in mind. The average user has a low attention span and a high demand for immediate information. Web users are accustomed to 140-character tweets and short, punchy news articles. They don’t have the time, patience, or desire to trudge through a massive article.

    Writing a very long article might seem necessary for longer, more demanding topics, but if nobody wants to read the full material, you’ll be doing yourself a disservice. If you’re having trouble finding a way to shorten your articles, consider breaking them up; instead of writing one massive article, split it into a five-part mini-series that your users can more easily digest. Or, simply hit the high points and make your title more general in turn. Users do want to see detail, but that doesn’t mean you have to go over the top with your explanations.

    Fewer Conversion Opportunities

    Longer articles also take away some of the conversion opportunities you’ll find in other, more concise articles. Generally, if you’re writing for conversions, the rule of thumb is to end with a lead-in to a conversion opportunity, or else have one major leverage point for conversion. If you have one possible conversion per article, you’ll have more conversion chances with five smaller articles than you will with one extremely long one.

    Additionally, because long articles tend to alienate readers, you’ll find that your conversion attempts will often go neglected in your longer features. That means your total number of conversions will suffer if you continuously churn out very long posts.

    Higher Cost to Engagement Ratio

    It’s also worth mentioning that the cost-to-engagement ratio of long articles, provided they have the same level of detail as your shorter articles, is much higher than that of their shorter counterparts. You might spend the same level of effort on one 5,000-word post as you do on five 1,000-word posts, but you get far fewer engagement opportunities in return.

    Each post you publish is an opportunity. It’s a new link that you can syndicate to your audience through social media, a new opportunity to attract some referral traffic, and a new chance to get featured on an RSS feed or similar blog aggregators. Writing longer posts means writing fewer posts, so the total number of those engagement opportunities goes down. Since you’re spending the same amount of money on a much lower level of visibility, you’ll be getting less value for your investment.

    A Note on Minimum Length

    Many of these explanations indicate that multiple short articles are superior to fewer long articles. However, this should not imply that shorter is always better. Your articles need to be of sufficient length to interest and inform your readers—usually at least 500 words.

    It’s true that extremely long articles can be damaging to your customer engagement and SEO strategies, but don’t ever let word count become your top priority. Your first goal should be providing the type of content your users want to see and read, and that means keeping things as concise and detailed as possible, regardless of length. It’s also important to diversify the types of posts you publish—not just long articles and not just short articles, but articles of varying lengths and formats. As long as you’re working with your readers in mind, you won’t have to worry too much about how long your articles get.

  2. Best Practices for Goals in Google Analytics

    Google Analytics is the Swiss Army Knife of the online entrepreneur. It’s full of detailed insights and information you can use to analyze your online traffic and perfect your approach to earn the most new customers and the greatest amount of recurring revenue. But many business owners fail to use Google Analytics to its full potential, relying solely on inbound traffic figures and never venturing further into the platform.

    The “Goals” section of Analytics is one of the most useful tools you’ll find. With goals set up properly, you’ll be able to track conversions throughout your site and run an analysis to determine the overall value of your campaign, giving you a perfect gateway to uncovering the ROI of your inbound efforts.

    Initial Setup

    Setting up a goal is relatively easy. All you have to do is find the Admin section for your target site, click on “Goals,” and then “Create a Goal.” Google Analytics offers a step-by-step process that allows you to set up any goal you’d like.

    Most users will be setting up a template goal. Common goal types include “destination” goals, which are completed when a user reaches a specific page, and “event” goals, which are completed when a user takes a specific action, like playing a video. Once you’ve selected a type, you’ll be able to customize your goal and fill in the necessary information—like the URL for a destination goal.
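
    Under the hood, a destination goal is essentially a rule matched against the page path of each visit. Here’s a rough, hypothetical sketch in Python of how the match types offered in the goal setup behave (the function and paths are illustrative, not part of Analytics):

```python
import re

def destination_goal_hit(page_path, goal_path, match_type="equals"):
    """Check whether a pageview's path would complete a destination goal.

    Match types mirror the options in the Analytics goal setup:
    "equals", "begins with", and "regular expression".
    """
    if match_type == "equals":
        return page_path == goal_path
    if match_type == "begins with":
        return page_path.startswith(goal_path)
    if match_type == "regular expression":
        return re.search(goal_path, page_path) is not None
    raise ValueError("unknown match type: " + match_type)

# A "begins with" match catches the page even with query parameters attached
print(destination_goal_hit("/thank-you?order=123", "/thank-you", "begins with"))  # True
```

    The choice of match type matters in practice: an "equals" match silently misses URLs that carry tracking parameters, which is one reason to test your goals after setup.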

    Once you’ve got your initial goals set up, run a handful of tests to confirm they’re functioning properly.

    Designating a Value

    You’ll also have the opportunity to designate a value to the completion of each of your goals. Take advantage of this; it’s going to provide you with a major opportunity to objectively analyze your online marketing results later on.

    For some goals, coming up with this value is easy. For example, if you’re selling an ebook for $5 and you set up a goal for the completion of a single order, the value of the goal would be $5. However, if you’re selling multiple items in varying groups, you’ll have to come up with the average value of a customer order and use that as the assigned value of the goal. The process is further complicated by non-monetary goals, such as those assigned to the completion of a contact form. Here, you’ll have to determine the ratio of inquiries to sales, then multiply by your average sale amount to arrive at an average goal completion value.

    This may take a few extra steps, but coming up with an accurate value is essential to determining the objective results of your campaign later down the road.
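
    To make the contact-form case concrete, here’s the arithmetic as a short Python sketch (all figures are hypothetical):

```python
# Hypothetical monthly figures for a "contact form submitted" goal
inquiries = 200            # form completions
sales_from_inquiries = 30  # inquiries that became paying customers
average_sale = 400.00      # average revenue per sale, in dollars

# The close rate times the average sale gives the value
# of a single goal completion
close_rate = sales_from_inquiries / inquiries
goal_value = close_rate * average_sale

print(close_rate, goal_value)  # 0.15 60.0
```

    In this scenario, each contact-form submission would be assigned a value of $60, even though no money changes hands at that moment.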

    Determining Which Goals to Set Up

    If a specific action on your website corresponds to revenue or the strong possibility of revenue, you should set it up as a goal. Only then will you be able to concisely and accurately project how much revenue your inbound marketing strategies are earning. Goals don’t take much time to set up, and once they’re set up correctly, you can run with them for as long as you need. Nobody has ever complained about having too much data available.

    Still, if you have multiple transaction points and multiple points of contact, it may be overwhelming to try to set up a goal for each one of them. Start with the goals that are most critical for your business, and once those are complete, gradually flesh out the others.

    The Funnel

    Setting up a funnel is an optional part of the goal setup process, but I’ve found it extremely valuable for determining where your customers are coming from and why. With the funnel option, you’ll be able to outline the typical process your visitor goes through before completing a goal; for example, a customer may arrive at your homepage, travel to the blog section, and eventually land on the contact page, where they complete your “contact” goal.

    Setting up a funnel is advantageous because once you have some data flowing, you can easily visualize your customer’s path. Analytics will map out the ideal customer flow you outlined, and give you data for each step of the process. You’ll be able to see what percentage of your customers move on to each step, which will allow you to pinpoint any holdups to your ultimate goal.

    Measuring Your ROI

    The most important function of goals is getting the opportunity to objectively measure your return on investment (ROI). With goals in place for all your major transaction and conversion points, you can estimate exactly how much revenue your site has brought in over a given period of time. Determining how much you spent to get that level of traffic is usually the tricky part, since you’ll have incoming traffic from searches, referrals, direct entries, and social media. Still, if you can estimate how much you spend on marketing and compare it to how much you’re making through your goals, you’ll be able to determine the effectiveness of your current strategy.
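
    As a quick sketch of that comparison (with hypothetical numbers):

```python
# Hypothetical monthly figures
goal_revenue = 12_000.00    # total goal value recorded in Analytics
marketing_spend = 4_000.00  # estimated spend across all channels

# Classic ROI: net return divided by cost
roi = (goal_revenue - marketing_spend) / marketing_spend
print(f"ROI: {roi:.0%}")  # ROI: 200%
```

    The precision of the result is only as good as your spend estimate, but even a rough figure tracked month over month will show whether your strategy is trending in the right direction.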

    If you aren’t already using goals in Analytics, it’s a good idea to get started. Even if you don’t plan on running an analysis in the near future, you may find yourself wishing you had the data. The sooner you set up goals, the more information you’ll have access to, and the better you’ll be able to project the real results of your online marketing strategy.

  3. Are Google Updates a Thing of the Past?

    For more than a decade now, Google updates have been keeping search marketers on their toes. Every time you got used to one common SEO custom—such as the most effective way to build backlinks—Google seemed to respond by pushing a major algorithm change that altered how it took those factors into consideration. In the SEO community, industry professionals were constantly either looking for ways to take advantage of the most recent update or trying to anticipate what changes were coming with the next one.

    Now, as we enter a new era of search, Google’s update patterns appear to have shifted. For the past several years, rather than introducing new algorithm changes, the search giant has only been tweaking previously existing ones and making minor changes to account for new technologies. Rank disruption is still occurring, but on a much smaller scale, leaving search marketers to wonder—are Google updates a thing of the past?

    The Major Overhauls

    Google updates have earned a reputation for being large, disruptive, and sometimes annoying pushes that can raise your site to the top of the SERPs or drop you off into online oblivion. That’s because most of Google’s major updates so far have been massive game changers, either completely overhauling Google’s search engine algorithm or adding some new set of qualifications that turned the ranking system on its head.

    Take, for instance, the Panda update of 2011, which affected nearly 12 percent of all queries and massively disrupted the search game by introducing a new concept of content-based evaluation. Sites with high-quality content were rewarded, while sites with spammy content were penalized.

    It was a fair system, and searchers of the world were happy to start seeing more relevant results and fewer obvious attempts to climb ranks by whatever means necessary. But it was still a major headache for search marketers who had invested serious time and money into the previous iteration of Google’s search algorithm. For a time, updates like these were common, and search marketers were constantly on the run, waiting for more changes like the Penguin update, or Panda 2.0, which carried another massive update to Google’s content evaluation system.

    Modern Panda and Penguin

    Panda and Penguin, two of Google’s biggest landmark algorithm updates, have seen multiple iterations over the past five years. Panda 2.0 was followed by small iterations leading to 3.0, and Penguin 2.0 came out only a year after the initial round of Penguin. These algorithm changes were substantial, and search marketers attempted to predict the cycle based on existing patterns, projecting when the next major Panda- and Penguin-based algorithm changes would come.

    But something changed in 2014. Rather than unloading the Panda update in a major package, Google started rolling out data refreshes and minor tweaks to the algorithm on a steady basis. Instead of hitting the search world with a massive burst, it introduced a regular, unobtrusive pulse. Similarly, with the Penguin update, major iterations were virtually done away with. Marketers named an algorithm update “Penguin 3.0” in late 2014, but search volatility was limited compared to Penguin updates in the past.

    This, combined with the fact that Google hasn’t released a major overhaul to its search function since the Hummingbird update of 2013, seems to indicate that instead of rolling out massive, disruptive updates, Google is more interested in rolling out very small, gradual changes.

    Niche Algorithm Updates


    Other than extensions for its previous updates, Google has also released a handful of other changes. However, most of these are focused on niche functions—for example, the unofficially nicknamed “Pigeon update” of 2014 overhauled the way Google processes and displays local search results, taking local reviews from directory sites into account. Similarly, Google has been making changes to its Knowledge Graph and how it displays on SERPs.

    These niche updates don’t radically change Google’s core algorithm, nor do they interfere with any major updates of the past. They do have an impact on how search works and what strategies are the most rewarding, but they haven’t done anything to change the fundamental elements of a great SEO strategy.

    The Case for Micro-Updates

    There are a lot of reasons why Google would want to abandon large-scale updates in favor of smaller, less noticeable ones, and the evidence supports that transition:

    • Major updates have slowed to a stop. Instead of large batches of changes, Google is rolling out Penguin and Panda changes gradually and almost imperceptibly.
    • Google is no longer officially naming its updates. Penguin 3.0, Panda 4.1, and the Pigeon update are all unofficial nicknames—Google has abandoned the practice of naming its updates, indicating it’s moving away from major, branded releases.
    • Search volatility is decreasing. Since Panda’s 12 percent disruption, nothing has come close to that level of volatility.
    • Google is finally at a stable point. The search algorithm is now complex enough to evaluate the quality of sites and the intention behind user queries, leaving little reason to rapidly accelerate through growth spurts.

    Of course, it’s possible that Google has a few more aces up its sleeve, but for now it looks as though major updates are dead, replaced by smaller, less momentous rollouts.

    What Search Marketers Can Learn

    There’s no reason to fear anymore. It’s likely that Google will no longer be pushing out the updates that have disrupted so many business rankings for so long. Instead, search marketers should understand that the fundamental best practices for SEO—improving user experience and building your authority—aren’t going to change anytime soon. The tweaks here and there might fine-tune some small details, but for the most part, the sites and brands that offer the best overall experience are going to be rewarded.

  4. What’s the Best Way to Optimize a Site for Mobile?

    Mobile is taking over the search world. More people are using their mobile devices to perform searches, mobile searches are gaining popularity over desktop searches, and search engines like Google are stepping up their efforts to provide the best possible mobile experience to the greatest number of users. Mobile devices, like smartphones, tablets, and the upcoming plethora of smart watches, are starting to take over the realm of online experience, and if you want to survive in the business world, you’ll need to stay ahead of the curve.

    Why It’s Important to Have a Mobile Optimized Site

    In the early days of smartphones, there were critics who claimed that smartphones were just a fad, or that people wouldn’t rely on them to perform searches due to small screen sizes and difficult interfaces. But the trends of the past several years have proven the naysayers wrong: smartphones are here to stay, and creating a mobile experience that suits those mobile devices is imperative:

    • First and most importantly, your site exists to give your users a high quality experience. You want your users to find what they’re looking for on your site easily, and with a design that’s easy on the eyes. If your site isn’t optimized for mobile, you’ll be compromising that experience for a significant portion of your user base, leading to poorer consumer-brand relations and fewer opportunities to attract or convert new leads.
    • Second, your site’s compatibility with mobile devices is measured and interpreted by search engines. Google can tell which sites are optimized for mobile and which are not, and for mobile searches, it’s highly unlikely to rank a non-optimized site in the first page. This is because Google wants mobile users to have the best possible experience, so if you aren’t optimized for mobile, you’ll be missing out on all that search traffic. Plus, mobile-optimized sites get a ranking boost even for desktop searches, so you really don’t want to miss the opportunity.
    • Finally, the competitive factor is critical. Mobile optimization is a new standard in web practices, and countless businesses have already taken steps to ensure their sites comply. If you have a direct competitor whose site is mobile optimized while yours remains non-optimized, you could immediately lose a ton of recurring customers who prefer to browse the web on mobile devices.

    Three Options to Optimize for Mobile

    Optimizing for mobile isn’t complicated, but it isn’t as simple as flipping a switch. There’s no single patch of code or button you can push to magically alter your site to be compatible with mobile devices. However, you do have several options.

    Responsive Websites

    Responsive websites are optimized for mobile at a design level. They are created in such a way that allows the components of the page—such as the banners, blocks of text, headlines, and so on—to organize themselves on the page based on the size of the screen that’s accessing the webpage. These components may flex or stack to accommodate a smaller screen size, so a desktop user and a mobile user would both be able to easily navigate the site (even though the layout might be different).

    There are a number of advantages to responsive websites. Since the design is flexible enough to adjust to any screen, every type of mobile device gets a customized experience, yet the responsive element only needs to be built once. There is only one URL for your website, which makes it easy to develop and easier to manage over time, and it’s relatively simple to implement. Loading times for responsive sites tend to be slightly slower than the other options, but that’s generally a small price to pay for a universally adaptable website.
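
    As a minimal illustration of how components “flex or stack,” a responsive stylesheet relies on CSS media queries (the class name and breakpoint here are hypothetical):

```css
/* Two columns sit side by side on wide screens */
.column {
  width: 50%;
  float: left;
}

/* On screens narrower than 600px, stack the columns instead */
@media (max-width: 600px) {
  .column {
    width: 100%;
    float: none;
  }
}
```

    The same HTML serves every device; only the presentation rules change with the screen width.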

    Mobile URLs

    Mobile URLs are exactly what they sound like—separate, customized URLs that exist for the mobile version of a webpage, typically the desktop URL with an “m.” subdomain prefix. Whenever a user accesses your site using a mobile device, you can automatically redirect them to the mobile version of your site (and provide a link to toggle between the versions, just in case a user wants to switch).

    Mobile URLs are starting to become antiquated, but they’re still useful for some businesses. They take more time to create than a responsive design, since they require an independent creation, and require more extensive ongoing upkeep. They’re also vulnerable to fault points in the redirect system—if you accidentally direct a mobile user to the desktop version, they may have a poor experience.
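
    The redirect logic itself is simple in concept. Here’s a hedged sketch in Python—the user-agent test and m-dot domain are illustrative assumptions, and real sites typically rely on a maintained device-detection library rather than a hand-rolled regex:

```python
import re

# Very rough mobile detection; not a complete scheme
MOBILE_UA = re.compile(r"Mobi|Android|iPhone|iPad", re.IGNORECASE)

def mobile_redirect_target(user_agent, path):
    """Return the m-dot URL to redirect a mobile visitor to, or None."""
    if user_agent and MOBILE_UA.search(user_agent):
        return "https://m.example.com" + path
    return None

print(mobile_redirect_target("Mozilla/5.0 (iPhone; CPU iPhone OS 8_0)", "/pricing"))
```

    The toggle link mentioned above matters here: if a mobile visitor explicitly chooses the desktop version, the redirect should respect that choice (for example, via a cookie) instead of bouncing them straight back.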

    Dynamic Content

    The third option for mobile optimization is closer in theory to responsive design. Like a responsive design, a dynamic content structure uses a single URL to house both a mobile version and a desktop version. The difference is that in a dynamic content setting, you’ll have twin versions of your site—desktop and mobile—ready to display based on the type of device and screen size trying to access them.

    This is an improvement over mobile URLs, since you’ll only need to manage one URL, and you won’t have to worry about creating and sustaining a redirect. However, there are some flaws that may prevent you from achieving the best results. Creating one mobile version can be problematic, since there are hundreds of different mobile devices that could theoretically access your site.
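
    A minimal sketch of the idea in Python (the templates and the device check are placeholders): the content is chosen per request, and a `Vary: User-Agent` header tells caches and crawlers that the same URL returns different HTML depending on the device.

```python
DESKTOP_HTML = "<html><!-- full desktop layout --></html>"
MOBILE_HTML = "<html><!-- slimmer mobile layout --></html>"

def serve_page(user_agent):
    """Serve the device-appropriate variant from a single URL."""
    is_mobile = "Mobi" in (user_agent or "")
    headers = {
        "Content-Type": "text/html",
        # Signals that the response varies by user agent
        "Vary": "User-Agent",
    }
    return headers, MOBILE_HTML if is_mobile else DESKTOP_HTML
```

    Maintaining an accurate device check is the hard part this approach inherits—hence the caveat about the hundreds of devices that could access your site.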

    The Best Option

    Google doesn’t care how you optimize your site for mobile, as long as it is optimized in some way. Whether you choose responsive, mobile URLs, or dynamic content, Google will consider your site optimized for mobile, and you’ll rank accordingly. Your users likely won’t care what type of mobile-optimization strategy you use either, as long as you’re giving them the best possible experience.

    That being said, your decision should come down to your own preferences. From a technical standpoint, responsive designs are generally the cleanest: they only require one redesign to be complete, and the ongoing maintenance is pretty much nonexistent, at least compared to dynamic content or mobile URL strategies. Plus, you’ll eliminate the risk of failing to accurately judge the type of device accessing your site.

    Improving Your Rank in Mobile Search Results

    After optimizing your design and structure for mobile, there are a handful of ongoing strategies you can use to boost your rank in mobile searches, even beyond the strategies of a traditional SEO campaign:

    • Decrease your page loading times. Mobile devices load pages more slowly than desktops, so make sure your mobile design is optimized for lightning-fast load times.
    • Keep plenty of content on your pages. Since mobile users need things quick, it may be tempting to reduce your on-page content, but keep as many words on the page as appropriate to maximize the amount of content Google can crawl.
    • Avoid the temptation to use pop-ups. Pop-up ads are seeing a resurgence, especially among companies pushing their mobile applications to mobile users, but they can degrade the user experience, increase page loading times, and hurt your authority in Google’s eyes.

    Aside from ongoing SEO updates and minor tweaks to the design and functionality of your site, mobile optimization really is a one-time process. With your one-time investment, you’ll instantly gain more favor with your user base, gain more visibility in search engines, and get an edge over your competition. If you haven’t yet optimized your site for mobile, now’s the time to get it done.

  5. How the Google Knowledge Graph Is Growing to Change Search As We Know It

    The search world is always on the move, but it wasn’t until the introduction of the Knowledge Graph a couple of years ago that search marketers really started questioning whether conventional SEO practices were feasible in the long term.

    The Knowledge Graph has been the subject of a slow evolutionary process, gradually incorporating more elements and covering more ground, and now it has reached a threshold that’s made it the center of attention for many search experts. It’s changing the way people use search, slowly but surely, and if your search campaign is going to survive its rise to prominence, you’ll have to be ready for it.

    What Is the Google Knowledge Graph?

    The Knowledge Graph is a small box of information you see off to the right-hand side of your traditional search results. Depending on what you’ve searched for, the Knowledge Graph will provide a summary of your subject, including bits of information you may be looking for. For example, searching for a movie will often prompt the Knowledge Graph to show up with a list of actors, the year of release, and any awards associated with the picture. Searching for a politician will prompt the Graph to display his/her picture and a brief rundown of his/her political history. It’s essentially Google’s way of providing immediate information to a user, rather than forcing them to wade through sites to find what they’re looking for.

    The Changing Function of Online Search

    This move shows Google’s dedication to providing the best possible experience for online users, which is a good thing. However, since many online users won’t venture to the top search results after finding their desired information within the Knowledge Graph, previous top-position holders may not see the same amount of traffic they did before the Knowledge Graph existed.

    The entire motivation behind the Knowledge Graph’s release is an indication of the future role of online search. Rather than being a tool to find online sites, it’s becoming a tool to find direct information, and as a result, the scope of SEO and online business marketing is bound to change.

    The Latest Developments

    As you might imagine, since it is a Google product, the Knowledge Graph is not some stagnant, one-time development. It is a living, growing mechanism that continues to become more advanced on an almost daily basis. Even in the short history of 2015, the Knowledge Graph has been subject to updates and advancements.

    2015 Oscar Nominations

    In an unexpected move, Google began showing information on the 2015 Oscar nominations in the Knowledge Graph shortly after they were announced. Any search for “Oscar nominee” or “Oscar nominations” will lead to a list of the eight films nominated for Best Picture. In addition, Google is offering detailed information about the Academy Awards in general, as well as the ceremony date for 2015. It’s a sign of Google’s commitment to providing quick-reference information accurately, but also in a timely manner.

    Social Profiles

    Back in November of 2014, Google stepped out of its Google+ shell and started openly providing links to other social media profiles in its Knowledge Graph box. However, these links were restricted for use by major personalities, such as politicians, actors, and musicians.

    Starting in January of 2015, Google began providing links to the social profiles of major brands as well. There’s even a specific markup Google released so you can accurately provide the details of your corporate social profiles to the search engine.
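
    That markup centers on the schema.org `sameAs` property, usually embedded as JSON-LD. Here’s a sketch in Python that builds the structure (the organization name and profile URLs are placeholders to swap for your own):

```python
import json

# Hypothetical brand; replace the name, site, and social profiles
profile_markup = {
    "@context": "http://schema.org",
    "@type": "Organization",
    "name": "Example Co",
    "url": "https://www.example.com/",
    "sameAs": [
        "https://www.facebook.com/exampleco",
        "https://twitter.com/exampleco",
    ],
}

# The resulting JSON belongs inside a <script type="application/ld+json">
# tag on the page Google associates with your brand
print(json.dumps(profile_markup, indent=2))
```

    Each entry in `sameAs` tells Google that the profile URL refers to the same entity as your website, which is how the Knowledge Graph box learns which accounts to display.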

    Increasing Frequency

    Partly due to an increasing breadth of topics covered by the Knowledge Graph and partly due to an increasing number of companies using proper markup formats on their sites, the Knowledge Graph is showing up for an ever-increasing number of queries. According to a recent post by Steven Levy, the Google team estimates that the Knowledge Graph now appears for 25 percent of all queries. One out of every four searches leads to a Knowledge Graph box, and that number is likely to grow.

    How to Prepare for the Knowledge Graph’s Expansion

    Already, the Knowledge Graph is making waves in the search world. But as most search marketers have learned the painful way, the best way to respond to a new search function is to proactively prepare for it, rather than reacting to it after the fact. As the Knowledge Graph begins to grow in influence, take measures to protect your SEO strategy.

    Avoid Writing General Information

    This is good advice for any content marketing campaign, regardless of the encroaching Knowledge Graph. Rather than writing general information articles about topics related to your industry, focus on writing in a very specific niche—the more specific the better.

    This offers several benefits for your campaign. First, and most relevantly, writing on niche topics will prevent the Knowledge Graph from stepping into your territory. For now, the Knowledge Graph only projects common information about the most general subjects, so the more specific the topics you cover, the less likely it is that the Knowledge Graph will show up for your target queries. Second, the more specific you get with your topics, the less competition you’re going to face. That means you’re going to rank much higher for slightly lower-traffic keywords. It’s a shortcut to greater search traffic.

    Use Markups

    Google is open about the fact that the Knowledge Graph relies on microformatting to draw in information, so if you want to make sure the Knowledge Graph has the most accurate and most complete information about your company and everything you offer, use every markup you can. Schema.org is a great (and free) resource you can use to mark up the information on your website, and it also provides detailed information on how to incorporate the markups onto your site.

    As Google starts rolling out expanded coverage of the Knowledge Graph, like it recently did with social profiles, be ready to grab new microformatting requirements and implement them as needed.

    Find Alternative Means of Improving Online Visibility

    It’s unlikely that the Knowledge Graph is going to end SEO as we know it—even in years to come, when the function has expanded in accuracy and coverage, a number of people will still rely on Google to find actual sites with the information they seek. Even so, it’s important to hedge your bets.

    Gain online visibility through non-search related channels, such as RSS feeds and social media. Get involved with other sites, exchanging guest posts and starting threads and discussions leading back to your site. You’ll also want to get involved with as many third-party apps and services as possible, such as local review sites and new applications for your industry (such as Open Table reservations for restaurants). As smartphones and smart watches become more mainstream, app-based discovery will come to rival traditional searches, and getting involved with those apps early on will keep you ahead of the trend.

    The Knowledge Graph is a game-changer, and while it’s improving the search experience for millions of users, it’s also complicating the lives of search marketers. Still, suggesting that the Knowledge Graph is going to “kill” SEO or otherwise destroy the foundation of your inbound strategy is an overstatement. Like with any development, you’ll need to work to understand it, cover your bases wherever you can, and proactively prepare for its next iterations.

  6. How Local Search and Mobile Search Are Becoming One

    Leave a Comment

    There was once a time when searches were all practically the same. They were performed in the same way, on a desktop, and two identical queries from two very different people would still yield identical results. Obviously, this isn’t the case anymore, but the degrees of personalization added to search have been relatively well-defined: mobile searches generate slightly different results than desktop searches; when a device can detect the location of a user, local-specific searches become possible even without local keywords; and when a user is logged into Google, his/her search history can influence results.

    Now, as we move into 2015 with new technologies, more advanced search algorithms, and a more demanding audience, these lines are beginning to blur, and customized searches are starting to become normalized and ubiquitous.

    The Tenets of Mobile Search


    Mobile search wasn’t always taken seriously, but the rising share of searches performed on mobile devices has prompted Google and popular webmasters everywhere to take action. Google offers a more concise search results page, providing an easier search experience for its mobile user base, but more importantly, it has implemented a reward system for sites optimized for a mobile experience. Essentially, when a search is performed on a mobile device, Google increases the rank of sites that are optimized for mobile—either in a responsive format or in a mobile-specific version.

    The idea makes sense. Mobile users demand a mobile experience—otherwise, it’s hard to read, see the full content of the page, and even click buttons. Rewarding the sites with mobile features means that more mobile users will get an ideal experience.

    How Local Search Is Growing


    Local searches once relied on local-specific keywords. For example, in order to find a veterinarian in Raleigh, you would need to search specifically for “veterinarian in Raleigh” or something similar. Today, as long as your device can detect your location, the local portion of the search is applied automatically, and a search of “veterinarian” is sufficient to generate local-specific veterinary results. It has made things dramatically easier for local searchers looking for restaurants, hotels, or other services.

    Last year, Google also unveiled the unofficially nicknamed “Pigeon” Update, a major algorithm overhaul made to improve local search results. Now, the presence and reputation of businesses in local directories and local rating sites like Yelp and TripAdvisor are factored into search rankings. Essentially, this means that better-reviewed local businesses are more likely to rank high for a local search. More people are making local searches due to the expanding presence of local businesses online, and users are far more satisfied with the results.

    The Overlap Between Local and Mobile


    Two major factors are responsible for the merging of local and mobile searches. First, mobile devices are commonly enabled with location services, which inform the device of the user’s location. This information enables local searches to happen automatically, whereas desktop browsers rarely have this option enabled by default and not everybody logs into an account before searching. The bottom line is that searches performed on mobile devices are more likely to return local results by default.

    Second, the number of local searches performed on mobile is staggering, partially due to a rising trend of mobile searches and the nature of mobile devices. According to a recent survey, 56 percent of all mobile searches have local intent behind them. People are more willing to use their mobile devices when searching for something local, in large part because many local searches are needed when en route or otherwise away from home.

    As a result, the majority of local searches are now being performed on mobile devices, and the majority of searches performed on mobile devices have some kind of local intent. Those two pieces of data, both trends still growing, are reason enough to declare that the worlds of mobile and local search are starting to merge into one.

    What This Means for Search Marketers

    If you’re currently wondering how to juggle all the facets of a successful optimization strategy, take a deep breath. Despite the increasing complexity and sophistication of Google’s ranking algorithm and emerging technologies, SEO has actually gotten simpler over the years. The combination of local and mobile search is just another step in that simplification process, and as a search marketer, you stand to benefit.

    First, you’re going to need to make sure your site is optimized for mobile. By now, that should be a given, and regardless of whether or not mobile and local searches are interrelated, it’s going to continue being critically important. If your site isn’t optimized for mobile, you aren’t going to show up in mobile searches, so optimize your site if you haven’t already—fortunately, this is a one-time process.

    Next, your local SEO efforts need to be increased. Even if you have multiple locations, or if you don’t consider yourself tied to your specific location, it’s important to engage in a local SEO strategy. The potential rewards are enormous and increasing, with less competition and higher search volume than ever before. Claim your profiles on any relevant local directories that you can, publish local-specific content whenever you can, and actively cultivate positive reviews online when possible. Increasing your local relevance will greatly increase your local web traffic, and will set you up nicely for search developments down the road.

    That’s it. For the time being, the phenomenon of increased mobile and local searches isn’t going to have much of an impact on your strategy other than an increased need for ongoing, local-specific updates. However, as we look to the future, this trend could evolve, and new means of searching could completely revolutionize the industry.

    The Next Age: Wearable Technology

    Wearable technology is on the horizon. Augmented reality with Google Glass and seamless interfaces with the Apple Watch are going to spark a new trend in compact, flexible, hardly noticeable technology. That’s going to make waves in the world of mobile search, as hands-free, on-the-go searches become a critical space for online visibility and integration between the real world and the search world comes to define your success or failure.

    It’s hard to say what specific changes are in the pipeline, especially since Google keeps its algorithm secrets under lock and key, but it’s reasonable to speculate a handful of potential developments. Local searches will start to become hyper-local, focused on detail down to a city block rather than a broader city or region. Mobile searchers will need results instantly, with less of a need for content and more of a need for immediate answers like directions, hours, and ratings. And local businesses who reward mobile searches, perhaps with integrated onsite functionality, will earn the greater share of mobile search traffic—and potentially foot traffic as a result.

    The separate evolutions of mobile and local searches are starting to align, especially as the next wave of mobile technology begins to enter the scene. Stay ahead of the curve by maximizing your local search visibility, and taking advantage of the latest mobile devices when they start to emerge in the marketplace. As always, the bottom line is user experience, so give your users what they want and what they expect, and you’re going to be rewarded as a result.

  7. How an Outdated Sitemap Can Seriously Throttle Your Rankings

    1 Comment

    Sitemaps are a critical and often overlooked element of your site structure, and they play a crucial role for search engines looking to index information about your site. If your sitemap falls out of date and you don’t take measures to correct it, you could pay the price with reduced search visibility and therefore, less traffic to your site.

    Why Sitemaps Are Beneficial


    Onsite sitemaps are pages on your main site that contain a structured layout of all your existing pages. They’re crawled by Google and inform the search engine about the rest of your onsite pages.

    XML Sitemaps are files that you can build and submit to Google directly. Essentially, these files are a condensed map that lays out the structure and hierarchy of your site. By submitting a sitemap to Google, you’re telling the search engine to review and index the pages of your site—and this is a critical first step when you’re launching something new.

    Sitemaps are basically instructions that allow search engines to find your pages faster and more accurately. Keeping them updated ensures that Google has the best understanding of your overall website, and the greatest number of your pages are showing up for the appropriate searches.
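Where your page structure is clear, an XML sitemap can even be generated programmatically. Here is a minimal sketch that builds one in the standard sitemaps.org format using Python’s standard library; the page URLs are placeholders, and a real sitemap would typically be generated from your CMS or database.

```python
import xml.etree.ElementTree as ET

# Placeholder page URLs -- in practice these would come from your CMS.
pages = [
    "https://www.example.com/",
    "https://www.example.com/services",
    "https://www.example.com/blog",
]

# Build the <urlset> root using the standard sitemap namespace.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page
    ET.SubElement(url, "changefreq").text = "weekly"

sitemap_xml = ET.tostring(urlset, encoding="unicode")
print(sitemap_xml)
```

The resulting file is what you would save as sitemap.xml and submit through Webmaster Tools.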

    The Dangers of an Outdated Sitemap


    If your sitemap isn’t up-to-date, you could be providing inaccurate data to search engines. Depending on the severity of your inaccuracies, this could have major consequences or minimal impact. For example, if one of your hundred product pages drops off, you won’t see much of an impact. However, if you’ve restructured your entire navigation, search engines could be confused when they attempt to crawl your site, and you may lose indexed pages as a result. In addition to having a smaller number of indexed, searchable pages, your domain authority could even take a hit.

    The bottom line here is that an outdated sitemap will send outdated information to Google—and while Google, in some cases, is smart enough to make sense of these discrepancies on its own, the safer play is to ensure your sitemaps are always up-to-date.

    How Your Sitemap Can Become Outdated


    Sitemaps don’t become obsolete on their own. Only a deliberate change to your site, usually an increase or decrease in the number of pages, can make your previously submitted sitemap outdated. Keep a close eye on the changes you make to your site, and if you do make a significant change, take steps to keep your sitemap updated accordingly.

    Adding and Removing Pages

    By far the most common reason for a sitemap becoming outdated is the addition or removal of a core page. Even traditional, static websites experience the need for change from time to time—whether that’s the addition of a new service page or the removal of a special offers page that’s no longer relevant. While some regularly updated sections of your website (such as a blog or press page) will be routinely scanned by Google, any major page changes will need to be reflected in an updated sitemap.

    Redesigning the Site or Navigation

    Restructuring your site will also require an update to your sitemap. In addition to simply listing out the pages of your site, the sitemap is responsible for showcasing the hierarchy of your web presence, outlining the most important pages in a very specific order. If you make major changes to your navigation or restructure your page-based priorities, you’ll need to update your sitemap.

    Adding or Removing Products or Listings

    E-commerce sites and sites with classified-style postings (like job opportunities) tend to be the most vulnerable to sitemaps falling into obsolescence. Since most of these sites have large volumes of products and listings, sometimes numbering in the thousands, it’s common for new postings to be added and old postings to be taken down. Fortunately, a dynamic sitemap can spare you the pain of manually updating a sitemap every time you make a minor change, but you will have to routinely check to ensure your sitemap is accurate and up-to-date.

    Determining Whether Your Sitemap Is Outdated

    The easiest way to test whether your sitemap is current or outdated is to check it using Google Webmaster Tools. If you haven’t yet uploaded a sitemap here, you can start from scratch. If you need help creating your sitemap from scratch, be sure to read up on Google’s guidelines for building a sitemap.

    Once submitted, you might encounter errors during the upload process:

    • If you see a Compression Error, Empty Sitemap, HTTP Error (specific code), Incorrect Namespace, or Incorrect Sitemap Index Format, there is likely a problem with the format of the sitemap you submitted. These problems are generally easily fixed, and do not necessarily indicate a problem with the links and structure included in your map.
    • If you see Invalid or Missing errors, a Parsing Error, or a Path mismatch, it generally means there is a formatting error in the body of your sitemap that needs to be corrected.

    And once the sitemap is accepted, you may find errors with your sitemap. Perform a test by clicking on your intended sitemap, and clicking Test Sitemap in the top right corner. From there, you’ll be able to Open Test Results and view the results of the test.

    The test will tell you what type of content was submitted, in a quantifiable data table, including the number of web pages and videos that were submitted in the test. Any errors that Google encountered, which prevented it from indexing a page that was submitted, will be displayed. Some errors arise when there isn’t a page present where one should be according to the sitemap. Others are based on outside factors, such as server-related problems, or the presence of a robots.txt file blocking Google crawlers from discovering it.

    If you notice any of these errors preventing your sitemap from being accurate or up-to-date, take a closer look at the breakdown in Google Webmaster Tools, make a list of any corrections you need to make, and start making them.

    Submitting a New Sitemap

    Once ready with your new sitemap, head to the Webmaster Tools homepage and select the site you wish to submit the sitemap for. Under the Crawl header, click on Sitemaps, select the sitemap you wish to resubmit, and click the Resubmit Sitemap button. Once it has been resubmitted successfully, you’ll be able to re-run the test you used to find the errors in the first place. Hopefully, all of those errors have been corrected in your revision. If not, you’ll have another opportunity to make corrections and resubmit a new sitemap.

    If you’re using static XML sitemaps and you run an e-commerce site or another type of site where pages come and go regularly, you’re in for a lot of work. With a static XML sitemap, you’ll have to manually update and resubmit the file with every change. Instead, you can build a dynamic sitemap and set up automated “pings” to notify the search engine whenever there is a major change.
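Such a ping is just an HTTP GET request against the search engine’s ping endpoint, with your sitemap’s address passed as a URL-encoded parameter. A minimal sketch, assuming Google’s documented ping URL and a hypothetical sitemap location:

```python
from urllib.parse import urlencode

# Hypothetical sitemap location -- replace with your own.
sitemap_url = "https://www.example.com/sitemap.xml"

# Google accepts a simple GET request to its ping endpoint with the
# sitemap's URL passed (URL-encoded) as a query parameter.
ping_url = "http://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})
print(ping_url)

# Sending the request (commented out to avoid a live network call):
# import urllib.request
# urllib.request.urlopen(ping_url)
```

A dynamic site would fire this request automatically whenever the sitemap is regenerated.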

    No matter how you look at it, sitemaps are an important piece of the puzzle when it comes to making sure Google has the right information about your website. Take measures to ensure your sitemap is up-to-date at all times, and you’ll be rewarded with more indexed pages, and more search traffic as a result.

  8. When and How to Use Google’s Disavowal Tool

    Leave a Comment

    While Google’s penalty-based algorithm updates sometimes make it seem like the search engine giant is actively trying to disrupt the ranking efforts of webmasters, the truth is Google cares about the web, and wants to do everything it can to help webmasters succeed in their own endeavors. Google Webmaster Tools, if you couldn’t tell by the name, was created for that very purpose. It contains a number of research- and analysis-based tools to improve your visibility to Google robots, improve the functionality of your site, and ultimately provide a better experience to your web visitors.

    One of these tools is known as the Disavow Links tool, which exists to help webmasters remove harmful or unwanted backlinks that may be interfering with your campaign. Understanding why this tool exists, as well as when and how to use it, can be highly beneficial for your SEO campaign.

    Why Would I Need to Disavow Links?

    Quality external links are a good thing. The more high-quality links you have on external sites pointing back to your domain, the more authoritative your site will be according to Google, and the higher you’ll rank for relevant keywords. However, if you have low-quality links, Google will work against you by making it more difficult to rank for relevant keywords. Just one bad backlink could easily set you back weeks or months in your ranking efforts.

    Types of Links That Can Harm Your Site

    Google wants to clean the web of spam and irrelevant information, so the best links you can build are the ones that fit naturally into a forum, site, or conversation. If your link appears unnatural, or if it appears like it was built for the sole purpose of trying to boost your rank, it may be considered a bad backlink, and can actively work against you.

    Some of the most common types of bad backlinks include:

    • Links on low-quality sources, such as article directories, link scheming sites, or other hosts for spam and throwaway content.
    • Repetitive backlinks, such as multiple instances of the same URL being used across the web with no variety.
    • Backlinks anchored heavily in keywords, or other attempts to optimize links for keywords unnecessarily.
    • An excessive number of links found on one specific source.
    • Links on irrelevant sites, such as an industry directory for an industry that’s irrelevant to your business.

    If you avoid building links on these types of sources, you should be in the clear, at least for the most part. There’s still a small chance that you could be the victim of negative SEO, which occurs when a competitor or other third party builds a negative backlink without your permission. These instances are rare, but they do occur, so it pays to stay apprised of your backlink profile.
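To make the “repetitive backlinks” and “excessive links from one source” patterns above concrete, here is a rough sketch of how you might flag them in an exported link list. The URLs and the threshold are hypothetical; a real audit would start from a CSV export out of your link monitoring tool.

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical exported list of backlinks -- a real export would be
# read from a file produced by your link monitoring tool.
backlinks = [
    "http://article-directory.example/post-1",
    "http://article-directory.example/post-2",
    "http://article-directory.example/post-3",
    "http://article-directory.example/post-4",
    "http://trusted-blog.example/review",
]

# Count links per linking domain and flag any domain that exceeds an
# (arbitrary) threshold, since heavy concentration on one source is a
# common sign of a problematic link profile.
THRESHOLD = 3
domain_counts = Counter(urlparse(link).netloc for link in backlinks)
suspicious = [domain for domain, n in domain_counts.items() if n > THRESHOLD]
print(suspicious)
```

Domains flagged this way are candidates for a closer manual review, not automatic proof of a bad link.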

    When to Use the Disavow Tool

    The Disavow Tool wasn’t created to be an easy out for these instances. Google takes negative backlinks very seriously, and allowing webmasters to instantly and permanently remove a ranking penalty would defeat the purpose of ranking penalties altogether.

    Instead, it exists as a last-ditch effort for webmasters to remove a particularly harmful link, or set of links. After all other options have been exhausted, it can be used as a request for Google to consider overlooking those specific links when it scans the web. Requests are manually reviewed, and are typically reviewed judiciously—the majority of requests are ultimately ignored.

    Keep this in mind as you audit your backlink profile or attempt to improve your positions in Google. If you notice your rankings or your organic traffic taking a sharp drop, use a link monitoring tool like Open Site Explorer to determine whether a bad backlink could be the root of your problem. If it is, and you cannot remove the link in any other way, you’ll need to refer to the help of the Disavow Tool.

    Step One: Remove the Links You Can


    Before you log into Google Webmaster Tools, take efforts to remove whatever links you can on your own. The first step is the most logical—try removing them yourself. For example, if you’ve posted a comment on a forum with a backlink pointing back to your site, try logging into that site and deleting your comment.

    Of course, this strategy isn’t always going to work. In many cases, you’ll need the assistance of the presiding webmaster if you want to get your link taken down. Most sites list the webmaster contact on the contact page of their site, so send them a polite request email to have your link formally removed. If the contact isn’t listed, you can use Google itself—type in “” where the quoted portion is the URL of the site whose webmaster you’re trying to find. In the vast majority of cases, webmasters will be more than happy to help you out. However, if the webmaster in question is unresponsive after multiple follow-ups, or if they flat-out refuse to help you, you may need to move on to the next step.

    Step Two: Verify Webmaster Tools (if You Haven’t Already)


    Before you gain access to the Disavow Tool, you’ll first need to verify your Webmaster Tools account with your domain. The easiest way to do this is to upload an HTML file (which Webmaster Tools will provide to you) to your site, but you can also verify your account with Google Analytics or your domain registrar. Once your domain is verified within Webmaster Tools, you’ll be able to log in and complete the next steps of the disavowal process.

    Step Three: Download Your Links


    First, you’ll need to download a full list of backlinks pointing to your site. Log into Webmaster Tools, select your domain, and once in your dashboard, click on “Search Traffic.” From there, click on “Links to Your Site,” and under “Who links the most,” click “More.” Once there, click on “Download more sample links”—as an alternative, you can click “Download latest links” and get the dates associated with your links as well. This will allow you to download a file that contains all pages and links pointing back to your domain.

    Step Four: Upload Links to Disavow

    Create a text file (.txt) that contains all the links you wish to disavow, using the link list from the file you downloaded. Be sure to only include the problematic links that you were not able to manually remove. Once you’ve got that all set, you can head to the Disavow Tool itself, click “Disavow links,” follow the prompts, and upload your text file. Uploading a text file will automatically replace any previously uploaded text files, so make sure your file is up-to-date. It will take at least a few days, probably a few weeks, before Google reviews your requests.
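The disavow file itself is plain text: one URL or domain: entry per line, with lines beginning with # treated as comments. A minimal sketch of assembling one, using hypothetical bad links:

```python
# Hypothetical bad links and domains you could not get removed manually.
bad_urls = ["http://spam-directory.example/your-listing"]
bad_domains = ["link-farm.example"]

# Build the file contents: individual URLs disavow a single page's
# links, while "domain:" entries disavow everything from that domain.
lines = ["# Links disavowed after manual removal requests failed"]
lines += bad_urls
lines += ["domain:" + d for d in bad_domains]
disavow_file = "\n".join(lines) + "\n"
print(disavow_file)
```

Save the result as a .txt file and upload it through the Disavow Tool as described above.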

    Remember, the Disavow Tool isn’t a magic solution to get rid of all your linking problems. You’ll need to carefully restructure your backlink building strategy, and actively work to remove any negative links long before you even consider using the Disavow Tool. Even when you do make a submission, it’s not uncommon for Google to reject your request. Your best strategy moving forward is to make sure you only build the highest quality links for your backlink profile, and catch problematic links proactively.

  9. Why Is Google Pulling Out of So Many Countries?

    Leave a Comment

    Google has ceased operations (or a portion of its operations) in a number of foreign countries over the course of the past month, including China, Spain, and Russia. The pullouts, though individual and motivated by separate pressures, form an intriguing pattern in the company’s international relations.

    Pulling Out of China


    Google’s China domain, once, now officially redirects to, calling itself the “new home” of Google China. For a number of years, Google complied with Chinese requests to censor and alter its search results for the Chinese population (such as restricting information about the 1989 Tiananmen Square protests), but this recent switch means that Google is no longer willing to comply with those stipulations. Uncensored results for searches can now be found on, though whether or not the Chinese population can access that domain is still controlled by the Chinese government.

    Renegotiations stretch as far back as January of 2014, when Google and the Chinese government began discussing the state of Chinese censorship and Google’s future in the country. Google’s official statement on the matter revealed that a cyber-attack from China was the first domino in a chain of incidents that led the search engine giant to discontinue services such as Google Search, News, and Images in China. Services are still being provided on Hong Kong servers.

    Google is refusing to further compromise its search results to satisfy foreign governmental requests. The decision was not made under external pressure or as a punitive move—instead, it was a principled choice that Google believed would work out best for the greatest number of people.

    Pulling Out of Spain


    In a somewhat differently motivated move, Google is officially putting a halt to its News service in Spain. Earlier in 2014, Spain implemented a new amendment to its copyright law to take effect in 2015—though frequently defended as a move against piracy, the amendment earned the colloquial name “the Google tax.” Under this amendment, the writers of journalistic articles would be entitled to monetary compensation for segment-based views of their work on news aggregation sites, such as Google News. Further complicating the amendment, this right is inalienable, meaning that writers and publishers cannot waive their rights under any circumstances.

    In response to this upcoming legislation, Google formally announced that it is ceasing Google News provision in Spain entirely. Pulling an Internet-based service out of one specific country is, of course, challenging—while users will no longer be able to access Google News at “” as they have in the past, they’ll easily be able to access the Google News pages of other countries. Since Spanish users will still be able to access this “snippet” content, Google may still be held liable under the new legislation—so 2015 will be an interesting year for the future of Google in Spain (and Europe as a whole, considering other European countries may soon pass similar legislation).

    Rather than making an independent decision in this case, Google is simply responding to an external threat. While Spain’s amendment is not specifically targeted toward Google, Google stands to lose much from the new proposal, and felt the need to mitigate or prevent those losses.

    Pulling Out of Russia


    Following its decision to pull Google News out of Spain, Google reported that as of January 1, it will be closing its engineering offices in Russia. This comes as a result of the Russian Parliament passing a new bill that mandates all foreign Internet companies store the data of Russian citizens within Russian borders—beginning in 2016. While Parliament claims the bill’s intent is to protect Russian users’ information and prevent foreign espionage efforts from gaining access to it, most critics believe it is simply a move designed to restrict the availability and flow of information. Though other Google services will still be made available to Russian citizens, Google will not comply with the Russian law.

    The motivations behind this decision are similar to those driving its decisions in Spain and China. The Spain pullout was motivated by new legislation, the China pullout was motivated by a desire to protect the availability of information, and the Russia pullout was motivated by new legislation designed to restrict information.

    The “Right to Be Forgotten” Controversy

    While these pullouts have all occurred in the month of December, Google’s international woes stretch back to May 2014, when the “right to be forgotten” privacy ruling by the European Union first started making headlines. Under the new EU decision, private citizens are able to make requests to major Internet-based providers like Google that certain links to information about themselves be removed from databases. Unless a compelling case can be made that the information is necessary for public interest, the links must be removed.

    This new decision put much pressure on Google, with over 169,000 links being removed. Though some have argued the decision is a positive move forward for privacy rights, it also forces some factual, information-based links to be removed from public availability. Since Google is a major proponent for absolute information availability, the decision came as a harsh blow.

    Google to Be Broken Up?

    Furthering the tensions between Google and the EU, European Parliament has also been pursuing an antitrust investigation to determine whether Google, as a company, constitutes a monopoly in the Internet search industry. While there are clearly other search competitors around such as Bing and Yahoo!, Google is the dominant competitor. Should the EU get its way, it will demand that Google be broken up into separate, distinct companies.

    Such a decision would put enormous pressure on Google, though Google could likely find a way around such an eventuality.

    Fines From Spain, France, and the Netherlands

    Around mid-December, the Netherlands sent a warning to Google that the company needs to update its privacy policy to provide “clear and consistent” information about how private users’ personal information is used. The Netherlands cites a number of ambiguities and inconsistencies in Google’s privacy policy, and after major attention and pressure from the European Union, it is cracking down harder than ever. Should Google fail to update its privacy policy by February of 2015, the Netherlands is threatening to issue an $18.7 million fine.

    This fine would be an additional blow to Google, after similar (though less expensive) fines handed down from Spain and France for committing the same privacy policy offenses.

    The Future of Google’s International Operations

    It’s unclear what the future holds in store for Google, but there are two major threats working against Google’s traditional international operations. First, secretive countries like Russia and China are making harsh demands about information availability to their respective populations, and Google wants to maintain open channels. Second, European countries are on a crusade to protect private user information and weaken the perceived monopolistic Google, and Google wants to maintain its independence and continue making information indiscriminately publicly available.

    Aside from Google’s desire to remain a singular, powerful business entity, the core root of all these pullouts and tensions is Google’s wish to keep information publicly and widely available. The search engine giant will undoubtedly face increased pressure in 2015 on all fronts, and it will be interesting to see how they respond.

  10. How Facebook Could Be Stifling Google’s Growth

    Leave a Comment

    For years, we’ve collectively recognized Google as the biggest name on the Internet. They’re responsible for refining—and some would say perfecting—the course of online search. Every year, they come out with new products and features that enhance and streamline online user experiences. Even their culture is impressive, rewarding their employees as well as their users with fun, seamless experiences.

    Over the course of the past 15 years, Google has grown exponentially, but that momentum is now starting to wane. As with any change in business, several factors are responsible for the decline, but one of the biggest might be coming from an unlikely contender—an indirect, but massive competitor in the social media world.

    Google’s Slow Decline

    articleimage664Google’s Slow Decline

    Google is still the powerhouse to beat in the search world, but as a company, it’s starting to lose some momentum. Its recent quarterly financial report, despite still showing growth, contained significant points of concern for investors: analysts had expected growth of 20 percent, but the company grew at a rate of only 17 percent.

    Certainly, Google’s explosive growth couldn’t last forever. There is a finite number of potential users of the search engine, and Google has become so ubiquitous that there are simply fewer potential new users to go after.

    While the financial report did not give any explicit reasoning behind the missed projections, it’s reasonable to assume that increased competition could have been a factor. After all, Bing and Yahoo! are starting to see increased popularity as online search options. But it’s Facebook, the social network powerhouse, that might be giving Google a run for its money.

    Users and Content


    Already, people are starting to turn to sources other than search engines to get their content. Many users scroll through their Facebook news feed to catch up on the stories of the day, and most users log into Facebook on a daily basis. With 1.5 billion active profiles, Facebook is a force to be reckoned with, and Google will likely continue to see declining figures as more users rely on alternative sources to consume content and get information.

    Facebook Advertising


    Facebook advertising may be the single biggest throttle on Google’s stream of revenue and growth potential. Since Google searches and most Google products are completely free to use, Google makes the majority of its substantial income through paid advertising, like the PPC ads you see at the top and side of your search results. For years, businesses have paid top dollar to be featured in one of these slots and gain traction by attracting new clicks.

    Today, those ads are harder to get. Visibility has increased, but so has demand from companies trying to earn those coveted spaces. As a result, the cost of paid advertising with Google has skyrocketed, forcing small businesses to go running to a competitor.

    While Bing has been trying its best to launch a cohesive and efficient paid ad engine, Facebook has won out as the most valuable alternative for businesses. Through Facebook, businesses can spend as little as $5 a day for as long as they want, and narrow the scope of their target audience based on age, sex, geographic location, and even personal interests—far more specific functionality than Google offers with its keyword-based platform. And since many businesses are marketing through Facebook anyway, the integrated ad platform is capturing their revenue.

    Facebook Atlas and User Knowledge

    The trend of usage and practicality for Facebook advertising is only going to grow. For the past several months, Facebook has been teasing the announcement of Atlas, and the platform is now live. Atlas is described by Facebook as “people-based marketing,” a marketing and advertising solution for businesses who need something a little deeper and more effective than the Facebook ads of yesteryear.

    Atlas is built on an entirely new codebase and offers more integrated campaign management features. The usual audience targeting and cost management features are present, but with more detailed metric reporting and tie-ins to offsite sales, so businesses can accurately project the ROI of their campaigns.

    But the greatest feature of Atlas, and the biggest threat to Google, is the inherent understanding of individual users that Facebook has accumulated over the years. Facebook has slowly and diligently been collecting incredibly detailed pieces of data on all its users, just waiting for the perfect opportunity to monetize it. Facebook knows much more about its users than Google does, and with Atlas, it is prepared to turn that advantage into cash.

    Furthermore, Atlas is looking to become an ad-serving platform that extends all across the web, not operating within the confines of any one platform, making it a direct competitor to Google and a challenger to Google’s own product, DoubleClick. Atlas is also ignoring cookies entirely, entering the fray with a new form of user tracking based on the unique URL of each Facebook user’s profile.

    The bottom line here is that Facebook has a greater pool of knowledge than Google, and with Atlas, it might have greater functionality. It’s certainly a major player in the world of digital paid advertising now, and Google has a right to be concerned. In the coming years, don’t be surprised if Google PPC campaigns start to wane in price and significance as Atlas makes improvements and starts to leech more of Google’s revenue.

    Looking to the Future

    Google is still one of the biggest, most profitable companies in the world, and it isn’t going away anytime soon. Even if Facebook is radically successful with its latest product, and even if it continues to refine its strategies and grow, Google will still have the power and the authority to continue its pattern of growth. That growth might be limited, but it will still exist.

    Over the course of the next several years, you can expect Facebook’s Atlas to be followed by a series of related products and services, taking advantage of Facebook’s ever-growing database of user information to bring in more revenue and attract more potential customers. Right now, Facebook is barely making a dent in Google’s overall strategy, but after five or ten years of aggressive pushing, it could force Google into a direct competitive situation. What would happen as a result of such a showdown remains to be seen, but it’s likely that the two entities will learn from each other and continually try to one-up each other in an effort to achieve dominance.

    What It Means for Marketers

    Since Google isn’t dying, you don’t need to worry about the future health of your SEO or PPC campaign. However, it might be worth your time to invest some money in a new Facebook Atlas campaign and see how well the platform works for your business. As the platform grows, you’ll likely tap into some impressive new functionality, and your ad campaign will likely benefit.

    The emergence of a new competitor to Google can only mean good things for the individual marketer. Increased competition means you’ll see better functionality rolled out at a faster pace, you’ll see prices drop in an effort to generate more revenue, and you’ll be in the middle, taking advantage of each platform to maximize your reach and impact.
