AudienceBloom


Category Archive: Google

  1. 3 Ways to Acquire Links from Large News Websites


    Link building has come a long way since the early days of posting links to your site like they were flyers for a lost cat. Successful offsite SEO is no longer a matter of quantity (though quantity and regularity still play a role) so much as quality. To build the authority of your own site, you must leverage the authority of existing sites and construct links that are meaningful in the eyes of users as well as search engine robots.

    So what constitutes an “authoritative” site? Two of the most valuable options, .gov government sites and .edu educational sites, are rare and hard to build links on—you can’t simply make a request or do the posting yourself. The next best thing is getting your link on a major news website, like CNN or MSNBC, but that must be just as hard, right?

    Actually, building links on major news sites isn’t as difficult as you might imagine. It’s true that it will take significantly more time and effort to build these links, and you may never be guaranteed a spot at the end of it, but these links are far more valuable than ones you build on traditional forums or blog spaces.

    Try any or all of these three significant ways you can earn valuable backlinks from major news sites:

    1. Take Advantage of Google News.


    Your first option is one of the easiest. In order to get a link on a major news site, you have to get noticed. And getting noticed isn’t just a matter of making an introduction. In order to get seen and appreciated by a major news site, you have to have information that is truly newsworthy; these outlets have a reputation because they’re committed to publishing only the most significant material.

    Google News is a publication outlet that can help you achieve that visibility and credibility. If you’re new to Google News, think of it as a gigantic, constantly updated database with news stories from around the world. Google takes this aggregated cache of news and then displays it to users using sophisticated algorithms that show content appropriate for each user’s interests, history, and geographic location.

    Google News allows almost anyone to submit credible news articles to be considered for inclusion in this database. If you have a “news” or “press” section on your website (and you actually use it to publish newsworthy information), this option is perfect for you. You can set your site up within the Publisher Center and submit content regularly for consideration. You can also submit individual articles or press releases.
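    If your news or press section lives at a predictable set of URLs, one common companion step is giving Google a news-specific sitemap to crawl. The sketch below is a minimal, hypothetical Python example of generating one; the tag set follows the publicly documented sitemap-news schema, but verify the current requirements in the Publisher Center help pages before relying on it.

```python
# Minimal sketch: build a Google News sitemap for a site's press section.
# The tags follow the sitemap-news schema as publicly documented; check
# Google's current Publisher Center guidelines before relying on this.
from xml.sax.saxutils import escape

def news_sitemap(articles, publication="Example Press Room", language="en"):
    """articles: list of dicts with 'url', 'title', and 'date' (YYYY-MM-DD)."""
    entries = []
    for a in articles:
        entries.append(
            "  <url>\n"
            f"    <loc>{escape(a['url'])}</loc>\n"
            "    <news:news>\n"
            "      <news:publication>\n"
            f"        <news:name>{escape(publication)}</news:name>\n"
            f"        <news:language>{language}</news:language>\n"
            "      </news:publication>\n"
            f"      <news:publication_date>{a['date']}</news:publication_date>\n"
            f"      <news:title>{escape(a['title'])}</news:title>\n"
            "    </news:news>\n"
            "  </url>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"\n'
        '        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">\n'
        + "\n".join(entries) + "\n</urlset>"
    )

print(news_sitemap([{
    "url": "https://www.example.com/press/new-product-launch",
    "title": "Example Co. Launches a New Product",
    "date": "2014-10-01",
}]))
```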

    If you’re lucky, another major news publication will take notice of your Google-published news entries and will either reference them with a link out to your site, or follow up with you for more information. It’s a roundabout way of getting attention from news sites, but if your content is worth their attention, they will take notice.

    The worst-case scenario here is that no major news sites pick up your link—if that’s the case, you can still enjoy the benefits of having your articles listed on Google News. It may not increase your domain authority as much as a pack of high-authority links, but it will send a significant flow of new traffic to your site.

    2. Distribute Your Own Press Releases.


    The principle behind this tactic is the same as the previous entry: in order to earn a link on a major news site, you have to get their attention with a major news article. Google News is incredibly useful for sites with regular news releases, but it only increases the visibility of content in one channel: Google search results. If you’re interested in submitting your article to major news sites directly for consideration, you can distribute your own individual press releases with a service like PRWeb.

    Through PRWeb, you’ll be able to publish your press release and syndicate it—depending on the service you use, you should be able to submit it to tens of thousands of different news outlets, filtering them by geographic location or industry niche. This distribution usually includes some of the biggest names in national news, but you will have a higher chance of getting picked up by low- to medium-authority news sites.

    If your press release is highly significant, well-written, and timely, you do stand a decent chance of getting picked up by a major news outlet, featuring a link pointing back to your site as a reward for your efforts. However, even if your article falls through the cracks of the highest-tier publications, the links you earn on lesser-known publications will still be highly valuable. This is especially true for local news sites, which will earn you backlinks anchored with a specific location, enhancing your relevance in local SEO.

    Temper your expectations by remembering that submitting a press release is no guarantee of publication on a major news channel, but distributed press releases are still one of the best shots you have. Submit newsworthy press releases regularly for the best SEO benefit.

    3. Get Involved in the Community with Comments.


    Comments are always a decent option for link building, and major news sites are no exception. You’ll certainly get more visibility and credibility if a news site publishes one of your articles, but if you’re looking to get some high-authority link juice, posting something relevant in the comments section is a great alternative.

    Your best bet is to find an article that has something to do with your industry. For example, if you work with new technology, find something in the “Technology” section that is related to a product you’ve produced. If you work in financial services, something in the “Money” section might be better. You can also use a search function to find a highly specific article, but do your best to select articles with a recent publication date in order to stay relevant.

    As with any link building exercise, take caution to ensure your link appears natural. Any indication that could give a webmaster the impression that you are only posting a link for the page rank boost will immediately get your comment flagged as spam and removed. Make sure your link is to a specific, relevant page on your site, and introduce the link by explaining why you’re posting it and why it’s relevant to the article. Get involved with the discussion, and you might earn a handful of new web visitors in addition to improving your SEO.

    Getting links from major news sources is neither stable nor easy enough to be a reliable, independent link building strategy. However, when executed as part of a broader, multifaceted offsite SEO campaign, news-related link building can be an enormous assist to your efforts. Be patient with news sites and stay consistent in your efforts even if you don’t get a bite right away. Emphasize the quality and relevance of your news items, and diversify your strategy whenever possible to cover the most ground. Eventually, you’ll find a rhythm for your news, and your SEO strategy will succeed because of it.

  2. 8 Changes You Need to Make After Panda 4.1


    After four months of silence on the Google Panda front following May’s Panda 4.0 update, the next iteration of Panda is here. Referred to as Panda 4.1, the update isn’t big enough to warrant the title of “5.0,” but it is significant enough to have search marketers scrambling.

    Building on the intentions of its predecessors, Panda 4.1 continues Google’s tradition of gradually weeding out low-quality content in favor of well-written, informative, engaging content. Sites with aggregated or copied content, such as lyric databases and medical content hubs, seem to have been hit the hardest by this iteration of Panda, suggesting that Google’s duplicate content detection is becoming more sophisticated. On the flip side, small- to medium-sized businesses with diverse original content are seeing a boost.

    The update started rolling out officially on September 25, 2014, and became active in gradual updates that spanned through the first week of October. Most companies have already seen the gains or losses from this update, so if you haven’t noticed your rankings change in the past few weeks, don’t worry—Panda 4.1 probably didn’t affect you.

    Still, Panda 4.1 has changed the world of search yet again, and if you want to take advantage of it and prepare for the next phases of Google’s evolution, there are several strategic changes you’ll need to make:

    1. Scour your site for duplicate content—and get rid of it.


    Sites with volumes of duplicated content are the ones that have been hit hardest by Panda 4.1. Now is your chance to get rid of the dead weight. Look throughout your site and your blog to find any articles that might be partly replicated from an outside source. Just because you don’t plagiarize work doesn’t mean you’re not at risk—extended quotes, attributed work from outside authors, and paraphrased sections could all register as duplicated material, and could hurt your overall ranks. If you find any content that could be seen as a duplicate from another source, get rid of it.
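    If you have more than a handful of pages to review, a rough programmatic pass can help you prioritize. Here’s a minimal sketch (in Python) that compares overlapping five-word “shingles” between one of your pages and a suspect outside source; the 30 percent threshold is our illustrative assumption, not a number Google publishes.

```python
# Minimal sketch: flag likely duplicate passages by comparing overlapping
# five-word "shingles" between your page and a suspect outside source.
# The 0.3 threshold is an illustrative starting point, not a Google limit.
import re

def shingles(text, n=5):
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {tuple(words[i:i + n]) for i in range(max(0, len(words) - n + 1))}

def similarity(text_a, text_b, n=5):
    a, b = shingles(text_a, n), shingles(text_b, n)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)   # Jaccard overlap of shingle sets

my_page = "Our audit found that duplicate content can quietly drag down rankings across an entire site."
outside_source = "The audit found that duplicate content can quietly drag down rankings across an entire site."

if similarity(my_page, outside_source) > 0.3:
    print("Possible duplicate content -- review, rewrite, or remove this page.")
```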

    2. Do a content audit and remove or improve “thin” content on your site.


    “Thin” content is a vague term, referring to content that is densely packed with keywords, light on value or specificity, or shoddily written. We’ve all seen content like this, so it should stick out like a sore thumb—especially in comparison to a longer, more detailed piece. Go through your previously published material and review the pieces of content that look like they’ve been scraped together. You have two options for these pieces: either delete them, or take the time to revise them into similar but more valuable pieces.
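    To make the audit less subjective, you can screen posts with a quick script before reviewing them by hand. The sketch below flags posts that fall under a word-count floor or over a keyword-frequency ceiling; both thresholds are assumptions for illustration, not official cutoffs.

```python
# Minimal sketch of a "thin content" screen: flag posts that are very short
# or that repeat a target phrase too often. The 300-word floor and 4% phrase
# frequency ceiling are illustrative assumptions, not official cutoffs.
import re

def audit_post(body, target_phrase):
    words = re.findall(r"[a-z0-9']+", body.lower())
    phrase_count = body.lower().count(target_phrase.lower())
    frequency = phrase_count / max(len(words), 1)   # rough phrase-per-word ratio
    flags = []
    if len(words) < 300:
        flags.append(f"only {len(words)} words")
    if frequency > 0.04:
        flags.append(f"'{target_phrase}' appears {phrase_count} times ({frequency:.1%} of words)")
    return flags

body = "Cheap widgets are the best cheap widgets. Buy cheap widgets today."
print(audit_post(body, "cheap widgets") or "Looks substantial enough to keep.")
```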

    3. Adjust your content strategy to include only the highest quality material.


    Depending on the current level of your content marketing strategy, this change could be enormous or barely noticeable. Moving forward, all your content needs to be of the highest quality—that means based on an original idea, written by an expert, and highly detailed. Don’t worry as much about the frequency of your posts; if a piece of content isn’t as high quality as you’d like it to be, do not publish it. It’s better to have a smaller number of better-quality posts than a greater number of lesser entries. You may be doing this already, but it’s still a good idea to revisit your strategy and see what positive changes you can make.

    4. Add more outbound authoritative links to your content.

    Google wants to see high-quality, authoritative content. If you want to be seen as authoritative, you need to back up your facts and provide references to support your claims. The best way to do that is to provide in-text links pointing to outside, authoritative sites. It’s a way of leveraging the current status of well-established sites to bolster your own authority. As you continue writing new content, experiment with posting more outbound links to build your own credibility. Make sure to use a diverse range of sources to avoid spamming any one source with an excessive number of backlinks.

    5. Include more images in your posts.

    Embedded images in your blog posts do two things: first, they look more enticing to your readership, giving you greater reader retention and more reader satisfaction. Second, they give your content the appearance of detail, and make your content seem more valuable according to Google. Include infographics in the body of your blog posts to illustrate a point with information; if they are original, they’ll naturally attract backlinks and assist your strategy in multiple ways. Otherwise, include any relevant images you can find (as long as they’re legal to use) to complement the text on your page.

    6. Publish author credentials to establish author expertise.

    According to the recent leak of Google’s Quality Rater Guidelines, author expertise is an important factor in evaluating the authoritativeness of a piece of content. Instead of trying to make your content seem like it was written by an expert, have your content actually written by an expert. Include author credentials at the bottom of each published article, identifying the author’s name, title, and area of expertise. If you do this consistently, and offsite content also features this author’s name, you’ll effectively build that author’s authority, and your content will be seen as higher quality. It’s a small change that could add up to major results.
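    The article’s suggestion is simply a visible byline with credentials, but if you also want search engines to parse those credentials, one optional approach is to mirror the byline in structured data. The sketch below uses schema.org’s Article and Person vocabulary expressed as JSON-LD; the author details are hypothetical placeholders.

```python
# Optional companion to a visible byline: the same credentials expressed as
# schema.org JSON-LD. The author details here are hypothetical placeholders.
import json

article_markup = {
    "@context": "http://schema.org",
    "@type": "Article",
    "headline": "8 Changes You Need to Make After Panda 4.1",
    "author": {
        "@type": "Person",
        "name": "Jane Doe",                      # hypothetical author
        "jobTitle": "Senior SEO Strategist",
        "description": "Ten years of experience in search marketing.",
    },
}

print('<script type="application/ld+json">')
print(json.dumps(article_markup, indent=2))
print("</script>")
```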

    7. Build more high-quality links to your content.

    Despite all the changes that the Penguin updates have made to the world of backlink building, backlinks are still tremendously important for building a site’s authority. This change is essentially the strategy I covered in point 4, but in reverse. If a high-quality site, such as an information database or a .edu site, links to one of your articles, that article will be seen as much more credible, giving you a Panda-proof boost in authority. If you can incorporate more of these high-authority backlinks into your link building campaign, your domain’s overall authority will significantly increase.

    8. Perform content audits regularly.

    The best ongoing new strategy you can adopt in response to Panda 4.1 is a regular content audit. On a monthly or bi-monthly basis, take an hour to review all the new onsite content that’s been published since your last audit. Carefully review each piece to determine its quality; check for originality, authoritativeness, and level of detail. If any of these pieces does not meet your quality standards, either get rid of it or revise it to make it comply. Doing this regularly keeps you vigilant, and keeps your content quality from ever declining or putting you at risk for another Panda-related drop in rank.

    Google is notorious for keeping online marketers on their toes, and it has continued that reputation with this latest update. With Panda 4.0 coming in May and 4.1 rolling out in September, Google could be back on a quarterly (or near-quarterly) updating pattern, like it was for previous iterations of Panda. If that’s the case, it could mean another major update is on the horizon for December or January.

    Stay sharp and keep your strategy up-to-date, and you’ll likely ride past the next Panda update with no mysterious drops in rank. You might even get a boost!

  3. How Google Determines Search Results [Infographic]


    Google’s algorithm is more sophisticated than ever, and its secrets have been the focus of thousands of hours of research and testing. After all, if you understand how Google determines the rankings of its search results, you can influence the way your own content ranks, gaining a significant advantage over your competitors.

    Two recent correlation studies have garnered particular interest and respect; one is from Moz, and the other is from SearchMetrics. We combined the data from both correlation studies to draw new insights, analysis, and recommendations for marketers, business owners, and webmasters looking to gain an advantage on the competition. Below is the infographic we created to illustrate our findings.

    [Infographic: How Google Determines Search Results]

  4. How to Find and Remove Bad Links Pointing to Your Site


    Link building is an essential part of any SEO campaign. Onsite strategy revolves around producing relevant, engaging content on a regular basis and ensuring your site is structured appropriately, while offsite strategy focuses on building your site’s authority through external links and brand mentions. But not all links are good links, and just a handful of bad links could compromise the integrity of your strategy and cause you to lose rank as a result.

    After the Penguin update of 2012, link building became a much more sophisticated process. Today, it’s no longer enough to post links wherever you get the chance—you have to make sure your links are natural, relevant, and beneficial to the parties who see them. Anything deemed irrelevant or spammy is marked as a “bad link,” and it will damage your SEO efforts for as long as it continues pointing to your site.

    Fortunately, tracking down and removing bad links is easier than you might think. In this article, I’ll walk you through each step of the process.

    What constitutes a “bad link”


    Bad links come in many forms. As a general rule, anything that was posted with the sole intention of increasing page rank is determined to be a bad link. This includes links posted on irrelevant sites, links that were paid for, high numbers of links in a given area, and links anchored with keyword-stuffed text. Here are some of the most common culprits:

    • Low-quality article directories
    • Link farms and other sites that try to host links for thousands of sites
    • Paid sources of link building
    • High-frequency post exchangers (two sites that bounce links off each other constantly)
    • Link wheels and other link building gimmicks
    • Spam links in forums or conversations, or links intended solely to generate traffic
    • Links in non-industry related directories
    • Links in irrelevant or fluffy content, such as non-newsworthy press releases

    Your first step is to avoid building these types of links in the first place. Instead, focus on posting links only in relevant conversations on sites related to your industry or geographical location. Don’t focus on making your links “appear” natural—focus on building natural links.

    Once you’ve integrated that into your strategy, there’s still a chance of bad links seeping through. You aren’t the only one building links on the Internet, so it pays to scout for third party sources that might be interfering with your search marketing campaign.

    How to view links pointing to your site


    If you haven’t already, set up a Google Webmaster Tools account and add your website to it. You’ll probably need to go through at least one verification step before you can access the account. Once you’re logged in, go to Search Traffic > Links to Your Site, and you’ll see a listing of the links pointing to your site. Alternatively, you can generate a more comprehensive report using Moz’s free tool Open Site Explorer, dubbed the “search engine for links.”

    Simply type your URL into the search bar and you’ll be able to see the type of links you have as well as the anchor text, link URL, site source, and various other pieces of data. Here, you should be able to determine which links are “good” and which links are “bad.”
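    Once you’ve exported that link data (both Webmaster Tools and Open Site Explorer let you download it), a quick script can surface the most obvious offenders for manual review. The column names and “spammy” patterns below are assumptions for illustration—adjust them to whatever your export actually contains.

```python
# Minimal sketch: scan an exported backlink report for obvious red flags.
# Assumes a CSV with "source_url" and "anchor_text" columns and uses a few
# illustrative spam patterns -- adjust both to match your actual export.
import csv

SPAMMY_URL_HINTS = ("articledirectory", "linkfarm", "free-links", "seo-directory")
STUFFED_ANCHORS = ("cheap", "best price", "buy now")

def flag_bad_links(path):
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            url = row["source_url"].lower()
            anchor = row["anchor_text"].lower()
            if any(hint in url for hint in SPAMMY_URL_HINTS) or \
               any(phrase in anchor for phrase in STUFFED_ANCHORS):
                yield row["source_url"], row["anchor_text"]

for url, anchor in flag_bad_links("backlinks.csv"):
    print(f"Review: {url} (anchor: {anchor!r})")
```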

    When to take action


    Of course, there are always gray areas, and not every questionable link demands an immediate action. The best long-term practice to adopt is careful monitoring of your ranks and domain authority. If you notice a significant drop with no explanation, a rogue bad backlink could be the culprit. When you notice a drop, browse through the links pointing to you and weed out any that don’t appear natural or don’t seem like a part of your regular strategy.

    Alternatively, if you don’t notice any significant drops, it’s still a good idea to peruse your link structure occasionally. In these cases, only pull the links that appear to be big red flags—the obviously terrible links, which will probably harm you sooner or later.

    How to take action

    Now that you’ve identified a link or two that needs to be removed, it’s time to take action against it. There are a series of escalating steps you can take in order to remove these links, and you may never need to use all of them.

    Step One: Try and remove it yourself

    The easiest way to remove a bad backlink is to remove it yourself. If your link exists in the form of a comment on a forum, you can flag it as spam. Or, if your account is the one that posted it, you can manually take it down.

    Unfortunately, this isn’t always an option. If you can’t remove the link yourself, move on to step two.

    Step Two: Contact the website administrator

    The next step is also simple: ask the person in charge to take it down. It really is that simple. If the link was built as a mistake, or if it was built by someone unauthorized to post it, most webmasters will be more than happy to assist you in taking it down.

    For this step, locate the source of the link—this should be easy if you’re using the Open Site Explorer Tool. Usually, the webmaster’s contact information is posted somewhere on the site, but if you can’t find it, check Whois.

    In your outreach, remain polite, and reuse the same message for each site you contact. This will give you a better chance of getting results and will save you the time of writing a new letter each time. Follow up if you don’t hear anything after a day or two.

    If you can’t find the webmaster’s contact information, or if the webmaster has some reason for refusing to take your link down, you can escalate the process to the final step.

    Step Three: Use Google’s Disavow Tool

    If there’s no other way to remove the bad links, you can ask Google to exclude them from consideration when calculating your ranks. The Disavow Tool, found in Google Webmaster Tools, allows you to make that request. You create a single file containing all the URLs and domains you wish to “disavow” from consideration, indicating which sites refused to take the links down and which sites were impossible to contact.

    Remember, the disavow tool is not a removal tool—it is a request tool. Google reserves the right to deny your requests and keep the links in consideration if it feels you are relying too heavily on it. As such, you should use the disavow tool only as a last resort. Do everything you can to remove your links manually before escalating to this level.
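    The disavow file itself is just plain text: “#” lines are comments, “domain:” entries cover a whole domain, and bare URLs cover individual pages. A minimal sketch of assembling one (with made-up domains and URLs) looks like this:

```python
# Minimal sketch: assemble a disavow file in the plain-text format the tool
# accepts -- "#" comment lines, "domain:" entries for whole domains, and full
# URLs for individual pages. The domains and URL below are made up.
bad_domains = ["spammy-linkfarm.example", "cheap-seo-directory.example"]
bad_urls = ["http://blog.example.net/old-guest-post-1234"]

lines = ["# Disavow request -- contacted these webmasters in October, no response"]
lines += [f"domain:{d}" for d in bad_domains]
lines += bad_urls

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")
```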

    Once the bad links are removed, it may take some time before your rankings return to normal. This is an expected part of the process, so be patient after removing the links in question. The authority from good links and the damage from bad ones both seem to linger for a few weeks after the links are removed.

    Invest time in your link building campaign, and don’t forget that removing bad links is just as important as building good ones. Take at least one or two days a month to review the number and type of links pointing to your site, and make adjustments accordingly. Over time, you’ll sculpt a near-perfect link profile, and you’ll keep your website positioned as a positive authority in Google’s eyes.

  5. 6 Metrics That Define Your Site’s Average User – and How to Learn From Them


    Understanding your audience is the key to creating a great user experience and building the reputation of your brand. A few decades ago, the only way to get more information about your customers was to conduct lengthy market research studies, involving in-depth surveys and qualitative analysis. While market research is still around, there is even deeper, more quantitative data available immediately to every website owner in the world. By using this data to better understand your site’s average user, you can perfect your user experience and improve your customer retention.

    Today, I’ll take a look at six key metrics that illustrate a picture of your site’s average visitor:

    1. Acquisition Data


    Acquisition data is your key to discovering how people are finding your site. You can find this information in Google Analytics under the “Acquisition” tab—to start, check out the “Overview” section. Here, you’ll see a pie chart that segments your audience into the four main channels responsible for drawing traffic to your site:

    • Direct traffic: visitors who arrive via a typed-in URL or a bookmark
    • Organic traffic: visitors who found your site through search
    • Referral traffic: visitors who found your site through external links and advertisements
    • Social traffic: visitors who came to you through social media

    What to learn: Here, you’ll be able to get a relative gauge on how effective your different inbound campaigns have been. For example, if you notice your social campaign is generating 80 percent of your visitors, you can rest assured your social campaign is doing well, but your organic search campaign could use an extra boost. You can also learn the primary motivation of your average visitor: for example, you know that most direct visitors are already familiar with your brand, while organic visitors are looking for information on your site.
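    Analytics does this math for you, but it helps to see how simple the underlying calculation is. The sketch below tallies a made-up list of sessions by channel and reports each channel’s share—the same numbers the Overview pie chart visualizes.

```python
# The arithmetic behind the Acquisition pie chart: count sessions per channel
# and report each channel's share of the total. Session data here is invented.
from collections import Counter

session_channels = ["Organic Search", "Direct", "Social", "Organic Search", "Referral",
                    "Social", "Social", "Organic Search", "Direct", "Social"]

totals = Counter(session_channels)
for channel, count in totals.most_common():
    print(f"{channel:>15}: {count / len(session_channels):.0%} of sessions")
```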

    2. Bounce Rate


    The bounce rate is a crucial measurement that lets you know how often someone leaves your site after viewing a specific page. For example, if you’re looking at your home page and it has a 60 percent bounce rate, that means 60 percent of your homepage visitors leave your site after viewing the page, while 40 percent delve deeper to learn more. You can view your bounce rate in several sections of Google Analytics, since it’s going to be different for each page and for each section of traffic.

    What to learn: Obviously, you want all your bounce rates to be as low as possible, but comparing different bounce rates on your site can give you a good idea of which pages are the most effective, and which need some work. Check out your bounce rates under Behavior > Site Content > All Pages to see which of your pages specifically have the lowest bounce rate, and check them out under Acquisition > Channels to see how each segment of your inbound traffic bounces or stays.
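    If the definition still feels slippery, here’s the calculation in miniature: a bounce is a session with exactly one pageview, and a page’s bounce rate is the share of sessions that landed on that page and bounced. The session data below is invented purely to show the math.

```python
# Per-page bounce rate in miniature: a bounce is a single-pageview session,
# and a page's bounce rate is bounced landings divided by total landings.
# The session data is invented for illustration.
from collections import defaultdict

# Each session: (landing_page, pageviews_in_session)
sessions = [("/home", 1), ("/home", 3), ("/home", 1), ("/blog/post-1", 1),
            ("/blog/post-1", 4), ("/home", 2), ("/blog/post-1", 1)]

landings = defaultdict(int)
bounces = defaultdict(int)
for page, pageviews in sessions:
    landings[page] += 1
    if pageviews == 1:
        bounces[page] += 1

for page in landings:
    print(f"{page}: {bounces[page] / landings[page]:.0%} bounce rate")
```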

    3. Behavior Flow

    Behavior flow is a new feature in Google Analytics that, truthfully, looks like a bit of a mess on first view. Don’t be intimidated, however. Behavior flow is an incredibly useful tool that can give you an accurate portrait of your average customer’s journey as he/she traverses your website from initial entry to eventual exit. The flow chart begins with a landing page, which is the first page your users come into contact with, and shows the most common next steps in each user’s interaction. At each step, you’ll be able to view information such as total number of sessions, and drop-off rates.

    What to learn: Here is the perfect place to understand the navigability of your site. Most sites start with a captivating landing page and engaging internal content pages which all eventually lead to a conversion page, such as a contact or request-a-quote form. By looking at your behavior flow chart, you can determine what portions of that traffic direction are effective, and which ones need further work.

    4. Demographics


    Your demographic information is perhaps the easiest to understand in this list, but it’s still important to get in the head of your user. Check out the Audience > Demographics > Overview section of your Google Analytics page, which will show you a report detailing the ages and genders of your average users. In this section, you can also learn the geographic location of your visitors, which can also help you get a solid image of your average site user.

    What to learn: There are two ways this can go. First, if you do not have a clear understanding of who your target demographic is, you can use this information to form that knowledge. From there, you can adjust the design and writing of your site to appeal to its most popular demographics. Second, if you do have a firm idea of your target demographics, you can use this information to adjust your strategy so you maximize the percentage of site visitors who actually belong to that demographic.

    5. Engagement

    Your engagement metrics will vary depending on the structure of your site, but they should at least include conversions and social signals. To track conversions, you’ll have to set up a goal in Google Analytics, which will track user actions that lead to an eventual “goal completion” (e.g. filling out a contact form, clicking a specific button, etc.). On a regular basis, you can measure those engagements and get an idea of who is converting and why. Similarly, if you include social sharing options on many pages throughout your site (especially on individual blog posts), you’ll be able to gain key insights about which types of users are interested in your content, which content they’re interested in, and how they prefer to share it.

    What to learn: With this behavioral information, you’ll be able to customize your site and your content to cater to the engagement preferences of your user base. These adjustments will lead to higher engagement rates and higher conversions.

    6. Access Points

    Learning how your customers access your website is also important, especially with the rise of mobile traffic popularity. Go to Audience > Technology, and you’ll be able to see the browser preferences of your average site visitors. Check out the Mobile tab, and you’ll be able to see what percentage of your visitors are accessing your site via mobile.

    What to learn: It’s always important to optimize your site for mobile, no matter what. But if you find that the majority of your site visitors are accessing your page using a mobile device, it’s critically important to make sure they have an ideal experience. Learning the browser information of your users is also important; for example, if you find that the majority of your users use Internet Explorer, you should ensure browser compatibility and optimize your site for Bing.

    Once you have a solid understanding of your site’s average user, you can analyze the factors that significantly affect their experience. When you make adjustments to your site layout or your inbound strategy, you’ll be able to measure your new data and compare it, apples-to-apples, against your previous information. Gradually, you’ll refine a near-perfect platform for your target audience and grow your brand’s reputation.

  6. Could This Be the End of Google+?


    Ever since it was first launched, Google+ has lagged in popularity, especially in comparison to its biggest rival, Facebook. As of 2014, Facebook had 1.28 billion users, but according to USA Today, the number of active monthly Google+ users was just 300 million. Despite losing the numbers game consistently, Google has shown favoritism for its product, doing everything possible to encourage more users to sign up and call more attention to the platform.

    But a handful of recent tactical moves show Google withdrawing from this aggressive strategy, possibly indicating that the decision makers at Google are no longer banking on Google+ being a success. Much of Google+’s current user base uses the product solely for the ranking benefits, so if Google is no longer favoring the platform, that incentive will disappear and enrollment could drop even further.

    Essentially, if Google pulls away its support, the entire foundation of Google+ could crumble, and the platform could fall into MySpace territory—in a pit of forgotten irrelevance. That’s the extreme perspective, but it’s realistic to think that, given the steps Google has taken, it’s losing enthusiasm for its social media platform and is ready to move on to other endeavors.

    Let’s take a look at some of the factors responsible for this apparent shift.

    The Decline of Google Authorship


    Google Authorship was first rolled out with the platform in 2011, catering to writers by displaying the author’s headshot, name, and various social information in SERPs next to the article information. Google used this as a tool to entice new writers to the platform and simultaneously give more visibility to their social network. However, in June 2014, John Mueller first announced that the profile photo and the display of various circle count information were going to be eliminated from listings. Mueller insisted that the change was to make the results pages seem less cluttered, but it also seemed to reduce the power of Google Authorship.

    Then, in August of 2014, Google removed a feature known as “Author Stats,” which previously allowed Google Authors to measure their impact by tracking impressions and click rates for their articles. While Google+ articles now once again feature profile images, it’s clear that Google is reeling back some of the Authorship benefits that once made the platform so appealing to bloggers and writers. This could be a step in a new direction for Google+, or it could be one of several signs that Google+ is no longer a priority for the tech giant.

    The Departure of Vic Gundotra


    Vic Gundotra had been working for Google for five years when Google+ officially launched in 2011. As the head of the Google+ division, Gundotra was seen as a figurehead of the social media platform. In April of 2014, however, Gundotra resigned from his position at Google. Gundotra elaborated that his departure was for personal reasons unrelated to the state of Google+, but it can still be seen as a major change for the platform. Obviously, the departure of one person cannot spell the doom of such a large system, but as one of several factors, it is a significant event.

    The Transition of Hangouts


    Google Hangouts—an interactive video conferencing platform—was once tied directly to Google+, requiring all participants to have a Google+ account in order to host or join a meeting. However, in the summer of 2014, Google took a number of steps to enhance the Hangouts experience, one of which was removing the requirement of having a Google+ profile. Now, Google Hangouts can be accessed by anyone with a Google Apps account, even if they do not have a Google+ profile. Google claims this change is simply to make the platform more professional (since personal profiles no longer need to be seen), but it’s also one less reason to get a Google+ profile.

    The Transition of Videos

    At the end of August 2014, Google secretly rolled out a new feature in YouTube—the ability to import all your Google+ uploaded videos to your YouTube account. At first glance, it seems like just a nifty feature that allows you to connect your shared videos to your YouTube profile. However, it could be the first of many steps that Google is rolling out to give users a chance to export or re-host their shared Google+ content.

    Sign in With Google

    Many web developers have already added a “Sign in with Google+” button to their sites. The integration is a nice feature for users of the Google+ platform, so developers are reluctant to remove it. However, Google appears to be rolling out a replacement button to select web developers with the option to “Sign in with Google.” This may seem like a small change—just removing the “+”—but it could be a major sign that Google is preparing to step away from the traditional Google+ brand.

    No Longer Required

    For several years, building out a Google+ profile was a requirement if you wanted any type of Google account. If you wanted a new Gmail address, or if you wanted to sign into YouTube, you had to create a Google account from scratch, and you’d automatically be enrolled in Google+, with no option to remove yourself unless you deleted the account entirely. However, as of September 2014, Google+ enrollment is no longer a requirement for new signups. If you create a Google account from scratch, you will eventually be prompted to either “Create your [Google+] profile” or turn down the offer with a “No thanks.” Again, this is a relatively minor change, but the fact that Google is no longer pushing Google+ on new signups could be an indication of its declining enthusiasm for the product.

    What Does Google Think of All This?

    Clearly, Google has heard all the speculation that it’s doing away with the Google+ platform. But for now, the company is as insistent as ever that Google+ is a quality product it will continue to support. According to a report by TechCrunch, at least one Google representative denied the decline of Google+ while addressing Vic Gundotra’s resignation: “Today’s news has no impact on our Google+ strategy—we have an incredibly talented team that will continue to build great user experiences across Google+, Hangouts, and Photos.” There have been some significant staffing changes, but that doesn’t mean that Google+ is going away; according to internal sources, it’s just an indication of shifting priorities or new team assignments. To date, Google has given no explicit indication that it’s doing away with the platform.

    So What Does It All Mean?

    There’s no clear answer yet. Google+ could be phased out over the next year, or it could stick around for decades to come. But it is clear that Google is reevaluating its priorities and attempting to isolate the “Google+” brand from some of its other products. Rather than forcing Google+ down its users’ throats, Google is taking a more relaxed approach, segmenting its features and keeping Google+ as what it is—a social network for friends, colleagues, and family members. Whether these changes lead the platform to new heights in popularity or result in its final demise remains to be seen.

  7. Why Germany Is So Desperate to Uncover Google’s Search Algorithm



    German justice minister Heiko Maas recently demanded that Google reveal its search algorithm. Unfortunately for him, and for the people of Germany, that’s simply never going to happen. Google has demonstrated a long history of keeping its search algorithms a tight secret, and it’s not about to compromise that history because of one man’s—or one country’s—request.

    However, it’s important to understand why Germany is fixated on uncovering Google’s classified search engine algorithm, as well as why Google will probably never give it away. It’s more than just a battle of transparency versus proprietary control; Google truly believes it has the best interests of the online world at heart, but Germany does as well.

    So who’s the justified party in this request and denial? And why does it matter for search marketers?

    The Request


    Let’s take a look at why justice minister Heiko Maas made the initial request. In an interview with the Financial Times, where the request was first made public, Maas explained that he is unhappy with Google’s actions in Europe as a whole, including its policies on the privacy of user data and the perceived monopoly it has on the world of online search. As a result, and in the interest of consumer protection, Maas believes that uncovering Google’s algorithm will give users more visibility and more information about the online tool they so regularly use.

    Google immediately and predictably pushed back. It’s kept its search algorithm an uncompromisingly tight secret for more than 15 years, despite countless requests—both public and private—to release that information to the public. Google officially responded that they would not comply with the request, stating that publishing their proprietary search algorithm would mean compromising its trade secrets and making the search engine more vulnerable to spammers.

    Both sides have valid concerns.

    Germany’s Case


    German justice minister Maas speaks for more than just his own country. His request came as a result of Google’s actions and presence throughout all of Europe, and by extension, the entire world. There are Internet users relying on Google as their search engine of choice in almost every corner of the globe, and the issues Maas sees are present everywhere.

    The case for user privacy

    The European Union tends to care about the privacy of its citizens a little more than the United States does. As a result, the EU is concerned about the types of information Google has on its users, as well as the information that is semi-permanently stored in its databases. For example, the EU enforces a “right to be forgotten” law, which legally mandates that private citizens have the right to permanently remove old information about themselves that exists on the web. Google has resisted this mandate, offering a compromise that has since been rejected by EU officials. Maas argues that if Google revealed its search algorithm in full, users could learn more about how their data is handled and what they can do to protect it.

    The case for transparency and user safety

    In general, consumers have a right to know about the products they buy; this is why ingredient lists are mandated to be displayed on food products. Barring proprietary secrets, this information should be made public, and Maas would argue that all German (and European) citizens have a right to know what makes Google’s search engine work. It’s a relatively weak argument, since Google can immediately counter that their search algorithm is a trade secret, and its revelation could damage its integrity as a business.

    The case against monopolization

    In the United States, Google carries about two-thirds of all web searches, making it a powerhouse but leaving a bit of room for the competition. In the European Union, however, Google is used for 90 percent of all searches. For years, the EU has been trying to make headway in the case against Google’s unrivaled power in the search world. While it is true that Google dominates the competition, there isn’t much regulators can do. Nobody is forcing users to rely on Google; there are many other alternatives, such as Yahoo! or Bing. This makes it difficult to hold Google accountable under anti-trust laws, and allows Google’s reign to continue.

    Google’s Case

    Google has a strong case as well. Rather than focusing on the individual safety and privacy of its users, Google wants to make sure the web experience for the population as a whole is as good as it can be—and at the end of the day, they want to make a profit too.

    The case for proprietary secrets

    Google is a for-profit company. They are responsible for 90 percent of searches in the EU because they’re the best search engine around, and that’s a direct result of the effort they’ve spent on improving their algorithm over the years. If you give that algorithm—and all that work—away for free, anybody could build a similar search engine, and Google’s value would instantly plummet. As a private company that employs thousands, Google wants to stay profitable and healthy.

    The case for web quality

    There’s another big reason why Google keeps its algorithm secretive. Back in the old days of SEO, search marketers would take advantage of Google’s loophole-ridden algorithm any way they could, including spamming keywords and backlinks across the web. This led to a poor web experience for online searchers, and an online world riddled with low-quality, irrelevant content. As Google’s algorithm became more refined—and less predictable—the web gradually evolved to reward sites with high-quality content and structure.

    Google would argue that making its algorithm public would instantly take us back to a darker time, when any search marketer could use shady manipulative tactics to take advantage of the holes in the algorithm’s structure. By keeping the algorithm a secret, Google is keeping the Internet on a stable path forward to even more sophisticated content and search marketers who are more interested in providing a good user experience than in manipulating their rank.

    Why Google Remains in Control

    Google is an international company with international dominance, so even if Germany passes restrictions or imposes fines on the search engine giant, Google will likely be unaffected. Nobody is forced to use Google; people use it because it’s the best search engine available. As a result, they will continue to demand it even if its top-secret algorithm is never exposed, and even if their privacy and consumer safety are at risk.

    Google is a powerhouse, people love it, and it isn’t technically breaking any rules. A public and international movement that demands the revelation of its algorithm might eventually sway the search giant, but such a movement would take years—or decades—to swell. In that amount of time, search will have evolved so far that this conflict may no longer be relevant.

    The ultimate takeaway here is that while Germany wants Google to be more transparent, Google is still in control, and for the most part, online searchers don’t care. They just want to log on, type in a search, and see the relevant results they’re used to. Unless users start caring enough to switch to an alternative search engine, Google won’t bat an eye; they’ll keep their search algorithm under wraps no matter who starts requesting it.

  8. 5 Steps to Responding to a Social Media Catastrophe


    Social media is a fickle world, and popularity isn’t always a positive quality. No matter how careful you are or how dutifully you nurture positive responses amongst your followers, eventually, you will face some form of negative backlash. It could be something as simple as an angry review, or something as catastrophic as a campaign against you. If you have a big enough audience, eventually something bad can happen, and when it does, it pays to be prepared.

    In any social media campaign, the measure of a company is not whether something negative ever arises; instead, it is how that company responds to something negative when it arises. The way you respond to a social media catastrophe can either neutralize the threat immediately and win new customers, or turn a simple situation into a complicated, far-reaching one.

    Proactive Measures

    The first and most important principle to understand in the context of social media catastrophes is that most catastrophes are preventable. Rather than waiting for a calamity to respond to, it’s better to take proactive measures to mitigate the frequency and scope of those calamities. You’ll never be able to prevent everything bad from happening in your social presence, but you can reduce the damage by taking a handful of simple precautions:

    • Check your profiles often. Disasters get worse if you leave them unattended. Establish a system that allows your company to check in on your social media profiles on a regular basis, even during off hours. Neglecting your profiles could lead to a small misunderstanding ballooning into a real catastrophe.
    • Proofread everything. Take the extra time to proofread every post you make, and carefully review your planned campaigns for any possible misunderstandings. In contests, clarify your wording. In hashtags, look for possible alternative interpretations. Do everything you can to catch mistakes and prepare for possible misunderstandings.
    • Ask for feedback. Openly invite your users to share their opinions with you on social media—about your brand and about your products and services. Doing so will allow constructive criticism in a contained format, and could prevent angrier, less predictable storms of negativity from arising in the future.
    • Have a plan. Orchestrate your social media management by delegating responsibilities with clear parties and instructions. Make plans for how to handle different types of comments, and how to escalate a situation to a higher response level.

    With these proactive steps in place, you can successfully avoid some—but not all—social media disasters. Here’s how to handle them when they do come up.         

    Step One: Understand


    Relax. When you see an inflammatory comment, a series of negative posts, or something else that compromises the reputation of your company, the first step is to take a moment to understand exactly what’s happening. Dissect each comment, post, or response, and try to analyze the root of the problem.

    Sometimes, that problem will be internal; an inappropriate post or a mishandling of information could easily turn into a public issue. These problems are usually easy to understand, and relatively easy to correct.

    Other times, the problem will be external; a customer could be openly complaining about your company, manipulating information that you’ve posted, or inciting others to lead a charge against your brand in some way. These issues tend to be more complex and harder to pinpoint—for example, if a customer makes a post on your Facebook page that says “This company is no good. Will not do business with them again,” it’s difficult to tell exactly what prompted the post in the first place.

    No matter the nature of the problem, make sure you understand it as thoroughly as possible before moving on.

    Step Two: Respond


    Responses are powerful because people always like to know they’re being heard. Ignoring a problem on social media will only make that problem worse. The individual who posted the negative material will grow restless and angrier, and others might see that you have taken no steps to resolve the issue, leaving them with a negative impression of your brand.

    Your response doesn’t have to resolve anything immediately; it can be a simple acknowledgement of the problem or a request for more information. But it does have to be sincere and personal. Simply telling a customer on social media to fill out a form on your website or call a customer service number is not enough. If you’re dealing with negative comments, let the person know that you hear his/her complaint, and that you take it seriously. Sometimes, this alone can redeem your brand and set you on the right path for resolution.

    Step Three: Apologize and/or Explain


    Next, you need to offer some level of justification or restitution. If a person has a problem, they have either been mistreated or misinformed, or they have misunderstood your company’s policies. No matter the case, it’s important to back up your response (either immediately or as part of the conversation) with either an apology or an explanation.

    Keep this level of response public, so others can see that your brand cares enough to offer a sincere level of support. An apology will let your customer know that you didn’t mean for the situation to happen. An explanation will help him/her understand your company policies and procedures—even if he/she isn’t satisfied with the explanation, other followers will see it, and you’ll have a better chance of preventing something similar from coming up in the future.

    Step Four: Offer to Make It Better

    This step is flexible, based on your company’s approach to customer issue resolution. Once you’ve apologized or explained your company’s stance, do what you can to make it up to the individual who feels angry or wronged. You can’t always redeem yourself in the offended follower’s eyes, but by making a public offer of restitution, you can earn a better public reputation and improve your standing.

    If the customer is unsatisfied with a product or service, you can offer a refund, discount, or replacement. If the customer is unhappy with your brand in general, ask them what you can do to make things better for the future. As an extension of step one, you have to understand why your customer is upset before you determine how to make it better. That level of personal engagement can only be beneficial in the long term.

    Step Five: Follow Up on Your Offer

    Finally, you have to follow up on your offer, provided the follower accepts it. If you offer a replacement product, send it immediately along with the shipping information. If your offer is less tangible, such as a change to a company policy, make a public announcement when the change is complete. If you don’t follow through, you’ll be inviting repeat complaints—and this time, they’ll probably be angrier.

    Don’t fear social media disasters. Treat them as learning opportunities. If your followers are reacting negatively to something, odds are they’re sharing their honest opinions. If you listen to them and do everything you can to make the situation better, you can either win that customer’s loyalty again or learn how to prevent the situation in the future.

     

  9. How the Pigeon Update Destroyed the Relevance of Keyword Rankings


    Keyword rankings have been on the decline for years. There once was a time when holding the top position for a target keyword was the pinnacle of success in the world of search marketing, but today, keyword rankings simply aren’t as meaningful as more important factors like organic traffic and the behavior patterns of your web visitors.

    Now that the Pigeon update has taken root, the relevance of keyword rankings has been all but destroyed.

    The Start of the Decline


    It’s difficult to pinpoint an exact moment when keyword rankings started to become less significant in the world of online search, but the Panda update of 2011 is a good place to start. Before the Panda update, SEO was a mathematical, somewhat fine-tuned process. Websites wanted traffic. In order to get that traffic, you had to have a high rank. In order to get that rank, you had to include X pieces of content with X number of keywords, then post X number of links on external sites for a period of X months until you got the rank you wanted.

    Then Panda hit. With the Panda update, factors for ranking became much more complex. Google started scrutinizing the quality of your content more than the quantity or keyword makeup of your content, and looking at other factors like bounce rates and site loading speeds to influence its search results. Ranking was still important, but keywords began to diminish in importance, since Google’s search algorithm no longer paid much attention to how many keywords were used in a given site. Instead, it started using contextual clues to qualify the significance of each site (and individual page).

    The Penguin update was next to follow. While the Panda update was intended to clean up irrelevant content on the web, the Penguin update was intended to clean up irrelevant links. External links anchored by repetitive keywords started yielding penalties instead of ranking boosts, and the source and number of external links built started to matter. As such, external link building could essentially no longer focus on specific keywords; it had to focus on topics and categories instead.

    The Hummingbird update made things even more complicated in 2013. It formally introduced a system of “semantic search,” which started looking at contextual clues in users’ search queries rather than the individual keywords and keyword phrases. For example, if a user searched for “cemeteries in Hoboken NJ,” older versions of Google’s algorithm would dissect that phrase into keywords, searching for pieces of content with the phrases “cemeteries” and “Hoboken NJ” near each other. “Hoboken NJ” is not a natural phrase in most forms of written content, so search marketers would have to wedge the phrase into their content unnaturally to rank for it.

    The post-Hummingbird search algorithm looks at the phrase “cemeteries in Hoboken NJ” and analyzes it as a conversational phrase, looking for websites that appear to represent cemeteries (not necessarily scouting for the keyword phrase), with a given location of Hoboken (again, not searching for the appearance of the keyword phrase). It was a thick nail in the coffin of keyword relevance, but there were still opportunities for keyword optimization until the Pigeon update came along and destroyed the last remnants of keyword relevance.
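    To make the contrast concrete, here’s a deliberately tiny toy example. It is nothing like Google’s actual implementation—the “parsed intent” is hand-written—but it shows the difference between hunting for a literal phrase and matching structured facts about a business.

```python
# Toy contrast, not Google's algorithm: literal phrase matching vs. matching
# a hand-written "parsed intent" against structured facts about a business.
query = "cemeteries in Hoboken NJ"
page_text = "Maple Grove is a historic cemetery serving Hoboken, New Jersey."
business = {"category": "cemetery", "city": "Hoboken", "state": "NJ"}

# Old-style check: do the literal keyword phrases appear in the content?
literal_match = "cemeteries" in page_text.lower() and "hoboken nj" in page_text.lower()

# Intent-style check: does the business satisfy what the query means?
parsed_intent = {"category": "cemetery", "location": ("Hoboken", "NJ")}  # hand-written for this toy
intent_match = (business["category"] == parsed_intent["category"]
                and (business["city"], business["state"]) == parsed_intent["location"])

print(f"Literal keyword match: {literal_match}")   # False -- neither exact phrase appears
print(f"Intent-based match:    {intent_match}")    # True  -- the structured facts line up
```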

    Enter the Pigeon Update


    The Pigeon update was not officially named by Google, but it arrived on the scene in July 2014. Targeting mostly local search results, Pigeon changed much of the way Google populates results pages for people looking for a local business or organization.

    So what, exactly, did Pigeon change?

    First, it decreased the distance used to determine what counts as “local.” Before the Pigeon update, Google might give you results based on your city or ZIP code; after the update, it might use your specific neighborhood or even your street. This is especially helpful for mobile users who share their location with Google, and it leads to more specific results and fewer irrelevant matches.

    Second, it began favoring online review sites and local directories more heavily. For example, local restaurants with a large number of highly rated reviews started ranking higher than local restaurants with few or no reviews. In some cases, individual profile pages on services like Yelp started ranking higher than the official websites of those businesses.

    Consider this change for a moment. Offsite optimization can still improve your authority, and structural onsite optimization (such as easy site navigation and properly structured metadata) helps as well. However, user reviews on local directories now matter more, and that’s something you cannot directly control. All you can do is ensure your local profiles are claimed and accurate, and encourage your users to post positive reviews whenever they can.

    Combine this idea with the current state of semantic search. Essentially, instead of looking at your search queries in terms of keywords and matching those keywords to content found on the web, Google is now analyzing the intention of your search query, and is logically finding the best answer. If your establishment is defined as a burger restaurant, Google will logically find your online presence for any queries that seem like they’re intended to find burger restaurants—regardless of keywords.

    Instead of stuffing your pages with the keywords you think your potential customers are using, you need to make sure there’s enough information on the web for Google to understand the nature of your business. When you were young, you were told the best way to make friends is simply to be yourself, rather than pretending to be whoever you thought people wanted as a friend. The same principle applies here.
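
    This article doesn’t prescribe a specific mechanism for that, but one common, well-documented way to make the nature and location of a business machine-readable is schema.org structured data embedded in your pages. The snippet below is only a sketch; the business name, address, phone number, and URL are placeholder values you would replace with your own.

    ```python
    import json

    # A minimal sketch of schema.org Restaurant markup (JSON-LD).
    # All values below are placeholders -- swap in your real details.
    business = {
        "@context": "https://schema.org",
        "@type": "Restaurant",
        "name": "Example Burger Bar",
        "servesCuisine": "Burgers",
        "address": {
            "@type": "PostalAddress",
            "streetAddress": "123 Example St",
            "addressLocality": "Hoboken",
            "addressRegion": "NJ",
            "postalCode": "07030",
        },
        "telephone": "+1-555-555-0100",
        "url": "https://www.example.com",
    }

    # Print the <script> block you would paste into the page's HTML <head>.
    print('<script type="application/ld+json">')
    print(json.dumps(business, indent=2))
    print("</script>")
    ```

    However you mark it up, the goal matches the advice above: give Google unambiguous signals about what your business is and where it operates, rather than repeating keywords.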

    How to Increase Your Visibility


    Even though keyword rankings are no longer a relevant measure of success for most companies, it’s still possible to increase your visibility. Increasing your perceived relevance and authority can still lead to generally higher rankings, but you can no longer focus on ranking for one specific keyword or keyword phrase.

    Claim your local profiles

    Make sure you’ve claimed and verified your information on all available and relevant local directories. Nab whatever social media profiles you can, and fill out your presence on local review sites like Yelp, TripAdvisor, and UrbanSpoon (if relevant). On each profile, verify all your information, taking great care that your name, address, and phone number (NAP) are present, accurate, and in the exact same format everywhere they appear on the web. This is the information Google cares about, and the information that helps Google understand who you are.
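
    If you manage more than a handful of listings, even a rough consistency check helps. The snippet below is a minimal sketch, not an integration with any directory: the listing text is hypothetical, and in practice you would paste in whatever each site actually displays for your business.

    ```python
    import re

    def normalize(nap):
        """Lowercase, collapse whitespace, and strip punctuation so trivial
        formatting differences don't count as mismatches."""
        collapsed = re.sub(r"\s+", " ", nap.lower())
        return re.sub(r"[^a-z0-9 ]", "", collapsed).strip()

    # Hypothetical listing text copied from each profile.
    listings = {
        "website":     "Example Burger Bar, 123 Example St, Hoboken, NJ 07030, (555) 555-0100",
        "yelp":        "Example Burger Bar, 123 Example Street, Hoboken, NJ 07030, (555) 555-0100",
        "tripadvisor": "Example Burger Bar, 123 Example St, Hoboken, NJ 07030, (555) 555-0100",
    }

    baseline = normalize(listings["website"])
    for source, nap in listings.items():
        status = "OK" if normalize(nap) == baseline else "MISMATCH -- fix the format"
        print(f"{source:12} {status}")
    ```

    In this hypothetical run, the Yelp entry fails because “Street” is spelled out while the other listings use “St,” which is exactly the kind of small inconsistency worth hunting down.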

    Encourage more reviews

    Engage with your customers by prompting them to post reviews about their experiences. Don’t buy reviews or pressure people into writing them (that can get you flagged and penalized), but the more genuine positive reviews you accumulate, the more Google will treat you as an authority.

    Traditional best practices

    And of course, best practices for modern SEO still apply—you just no longer have to worry about specific keywords and keyword phrases. Build high-quality backlinks on external sites that are relevant to your location or relevant to your industry, and regularly update your site with high-quality, relevant, interesting content. Complement those strategies with a strong social media presence, and stay consistent.

    Just because keyword rankings are no longer a meaningful measure of success doesn’t mean SEO is dead—far from it. Feel liberated. The tedious mechanics of keyword research and keyword frequency are all but gone, and you can instead focus on the more valuable elements of your business: building a solid brand, creating value, and engaging with your audience.

  10. 8 Search Engines Bypassing Google with New Trends in Online Search

    Leave a Comment

    Google is the king of search. At least, that’s the way it’s been for the past 15 years or so. Competitors were commonplace once upon a time, back when the Internet was still taking shape. But today, around two-thirds of all searches are performed on Google, and the few major competitors remaining have joined forces just to stay afloat. As a result, most search marketers only focus their efforts on getting praise—and ranking boosts—from Google.

    It’s a sensible and worthwhile priority, since Google is still currently in control of the world of search. They hold the largest share of searchers—by far—and they tend to set the industry trends that every other search engine follows.

    However, the Internet is becoming more open, and despite the tiny amount of attention they seem to be generating, there are new competitors trying to give Google a run for its money. Stale search functionality, predictable ad structures, and little attention to privacy are just some of the problems these micro-competitors are trying to resolve with their own search algorithms.

    As a search marketer, it’s probably not worth adjusting your strategy just to fit in with these new search engine alternatives—at least not yet. But it is worthwhile to learn what these competitors are up to, and why they’re putting up the effort. Knowing the landscape of the competition could prepare you for the rise of a new major search rival, or perhaps the absorption of their expertise into Google’s juggernaut algorithm.

    These are eight of the most popular and most attention-worthy alternative search engines around today:

    1. Bing.


    Bing is the first and most obvious major competitor to Google, capturing a little over a fourth of all search traffic on the web. Bing is important to watch if for no other reason than its commitment to improving its own product. Already, Bing has expanded by supplying the search algorithms behind Yahoo!’s search interface, and Internet Explorer users (yes, they still exist) use Bing almost exclusively. Bing also offers link data through its linkfromdomain: operator, which lists the pages a given domain links out to, and it supports a handful of special searches for specific file types and certain phrase patterns (such as two words appearing within a set distance of each other in a page’s text), all of which are handy for search marketers.
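
    To make those operators concrete, here are a couple of illustrative queries (example.com is a placeholder domain; the short Python snippet below just URL-encodes them into Bing search links and is only a sketch):

    ```python
    from urllib.parse import quote_plus

    # linkfromdomain: lists pages that the named domain links out to.
    # near:n asks for two terms to appear within n words of each other.
    queries = [
        "linkfromdomain:example.com marketing",
        '"press release" near:5 backlinks',
    ]

    for q in queries:
        print(f"https://www.bing.com/search?q={quote_plus(q)}")
    ```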

    2. DuckDuckGo.


    In a modern age marked by NSA monitoring, hackers stealing credit card information, and leaks of private photos, privacy and security have become crucially important to the average Internet user. Google keeps your information practically forever, and will disclose it to authorized outside parties, but DuckDuckGo intends to offer an alternative. DuckDuckGo doesn’t keep any private user search data, which eliminates the possibility of personalized results but protects users from having their search history scrutinized or accessed. As user privacy becomes an even greater concern, DuckDuckGo will likely grow in popularity (or serve as the inspiration for an even stronger privacy-minded competitor). It’s unlikely that Google will make major changes to its stance on privacy, so expect to see at least a handful of new privacy-based search engines pop up in the next several years.

    3. Boardreader.


    Boardreader is a simple, focused search engine built around one idea: providing an easy way to search forums and online communities. It’s a highly specialized form of search engine, and it’s worth watching because it could influence how Google (or another major competitor) handles these types of queries. Boardreader returns matches for your query only when they appear within the body of a forum thread, post, or message. If you’re looking for a conversation about a specific topic, rather than raw information or a direct answer, Boardreader can be extremely helpful.

    4. Topsy.

    What Boardreader is for online forums and message boards, Topsy is for social media sites (but especially Twitter). Google does integrate social media information into its search results, but they tend to be based on the newest relevant posts or the users behind them. Topsy, on the other hand, delves deeper into the social world, generating results based on specific times or places, and offering regular alerts or analytics information for the inquisitive minds using it. It’s another branch in the search world that opens the door to more possibilities for social inquiry. Google may improve its social search functions, but this level of specificity in results will likely remain with specialized outlets like Topsy.

    5. CC Search.

    Creative Commons (CC) Search is a targeted search tool that surfaces results you can share and reuse. While CC Search isn’t a search engine in its own right, it functions as one by aggregating the results of other services to find works published under a CC license. It’s a nice shortcut to publicly available pictures and other pieces of content, but there’s no absolute guarantee that every result is free to use, so check your sources and give proper attribution. Still, CC Search is a nice example of a specialized search engine that gives more specific, relevant results than Google can offer for a particular purpose.

    6. WolframAlpha.

    WolframAlpha started as a mathematical tool for students and professional mathematicians, and has since evolved into an integrated search engine focused on science and math. Users trying to search for specific math and science facts often rely on WolframAlpha rather than Google because it provides more direct answers in a faster interface. Its search function can even solve equations directly without simply searching for instances of that equation on the web. Google does have some calculating functionality currently, but WolframAlpha is highly specialized, and geared toward mathematicians. It could mark the beginning of a trend in catering to specific professionals, providing only the information they need rather than trying to give the most relevant results to a generalized audience.
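
    As a quick illustration of what “solving an equation directly” means (this is not WolframAlpha’s own code, just a sketch using the sympy library as a stand-in for a symbolic engine):

    ```python
    # Ask for the roots of x^2 - 5x + 6 = 0 and get the answer itself,
    # rather than web pages that merely mention the equation.
    from sympy import symbols, solve

    x = symbols("x")
    print(solve(x**2 - 5*x + 6, x))  # prints [2, 3]
    ```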

    7. Quantcast.

    Quantcast is useful for search marketers, not just because it’s developing a new trend in search, but because it can provide valuable information about web traffic and demographics. Currently somewhat limited in scope, Quantcast aims to provide detailed web visitor information about various sites to each search user, giving more transparency to the Internet as a whole. While Google tries to keep its algorithms and data as secretive as possible, Quantcast is all about open insights, and serves as an interesting foil to the search powerhouse.

    8. Crunchbase.

    Crunchbase is another “specialist” type of search engine, specifically scouting the web for people and businesses. If you’re looking for specific information about a company or an individual professional, Crunchbase will currently give you the most concise results. It gives users more personal results, weeding out any unnecessary review sites, social posts, or news results and focusing strictly on forming connections. Google may one day give greater weight to these types of results, but since they want to be an all-encompassing search solution, this type of search might always be better suited to such a specialized competitor.

    Your core strategy needs to focus on the present, and as such, it should remain on pleasing Google for the time being. But as these small competitors begin to develop more advanced algorithms and start to encroach on previously undisputed Google territory, you’ll need to keep watch for the successful new trends. The world of search is expanding, at an admittedly glacial pace, but if you can keep your focus broad and inclusive of these alternative options, you’ll be prepared for whatever awaits you on the horizon.
