Search trends depend on a number of closely interacting technologies, and if you want to stay ahead of the competition, you need to be aware of how they're changing, especially as rates of change accelerate across the board. Device technologies have given us mobile devices and more sophisticated forms of local search; web technologies have made it possible for more companies to build more creative websites; and raw search technologies, among other classes, make search faster, easier, and more relevant for users.
Of the search technologies, one of the most fascinating, and among the fastest changing, is semantic search: the ability of search engines to recognize and interpret the natural language of their users' queries. Semantic search is evolving in some astounding ways, and the sooner you start adapting to them, the better.
Before we look at how semantic search is developing today, we have to understand how it originally came into being. Back in the early 2000s, there was no such thing as "semantic search," and natural language recognition seemed like a distant dream for AI. Search engines functioned using a keyword-based mapping system: they would identify certain keywords and keyword phrases in your query, then generate a list of the places on the web where those terms were used most frequently and most prominently. Over the years, this process grew more sophisticated, weeding out unnaturally keyword-stuffed pages and mapping more complicated phrases, but it fundamentally worked the same way.
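That old keyword-based approach can be sketched in a few lines. This is an illustrative toy only, not how any real engine was implemented; the pages and query below are invented, and real engines also weighted term prominence (titles, headings) and link signals rather than raw counts alone:

```python
# Toy illustration of keyword-based ranking: score each page by how
# often the query's terms appear in its text, then sort by that score.
# Real engines weighted far more signals; this only counts frequency.

def keyword_score(query, page_text):
    terms = query.lower().split()
    words = page_text.lower().split()
    return sum(words.count(term) for term in terms)

def rank_pages(query, pages):
    # pages: dict of {url: text}; returns URLs ordered by descending score
    return sorted(pages, key=lambda url: keyword_score(query, pages[url]),
                  reverse=True)

pages = {
    "a.example": "best pizza recipe with homemade pizza dough",
    "b.example": "history of flatbread in the mediterranean",
}
print(rank_pages("pizza recipe", pages))  # a.example ranks first
```

The obvious weakness, and the reason semantic search emerged, is that a page answering the question perfectly but using different vocabulary scores zero.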
Google’s Hummingbird update changed the game when it came out in 2013. Rather than using keywords to find the most relevant results for a query, Hummingbird could interpret the intention behind a user query based on its phrasing, and find relevant entries from there. Its emergence marked a significant departure from the keyword-based strategies of search optimizers, forcing content marketers instead to work harder to answer user questions, concerns, and interests.
Late last year, Google added a new machine learning algorithm, RankBrain, to Hummingbird. The goal of the algorithm is to improve Hummingbird's semantic search capabilities by gradually learning more about the way people talk (and enter queries into search engines). Though semantic search is already impressive, it struggles when a user's query is especially wordy, complex, or ambiguous. RankBrain learns from prior experience, essentially updating itself, and eventually becomes able to break those complex, indecipherable queries down into more manageable chunks. It's a sign of Google's commitment to never-ending improvement: without the temporal and logistical wall between engineers and manual updates, this automated algorithm can develop faster than ever.
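RankBrain's internals are proprietary, but the general idea, matching queries by meaning rather than by exact words, can be illustrated with a toy vector model. The tiny hand-made "embedding" table below is invented purely for demonstration; production systems learn dense vectors from enormous corpora:

```python
# Toy illustration of meaning-based query matching. Each word gets a
# hand-made 2-D vector; a query is the average of its word vectors, and
# related queries land close together even with zero words in common.
# The vectors below are invented for demonstration only.
import math

EMBEDDINGS = {
    "cheap":     (0.90, 0.10),
    "budget":    (0.85, 0.15),
    "laptops":   (0.20, 0.90),
    "notebooks": (0.25, 0.85),
}

def query_vector(query):
    vecs = [EMBEDDINGS[w] for w in query.lower().split() if w in EMBEDDINGS]
    return tuple(sum(v[i] for v in vecs) / len(vecs) for i in range(2))

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.hypot(*a) * math.hypot(*b)
    return dot / norm

q1 = query_vector("cheap laptops")
q2 = query_vector("budget notebooks")  # shares no words with q1
print(round(cosine(q1, q2), 3))  # near 1.0 despite zero keyword overlap
```

A keyword-based engine would see these two queries as unrelated; a vector-based one sees them as nearly identical, which is the kind of generalization that lets complex, never-before-seen queries be mapped onto queries the system already understands.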
You’ve undoubtedly noticed a surge in “rich answers,” the term for the concise entries given prominence above standard results in SERPs. These can take the form of images, sentences, paragraphs, numbers, or any other format that immediately and concisely addresses your query, without requiring you to click through to a separate page. Their rise is one of the most important effects of increased semantic analysis, as it reduces reliance on external web pages to answer questions. It’s been argued that this will eventually stifle search traffic to websites across the board, but we’ll cross that bridge when we come to it.
Related questions are also on the rise, especially over the last few months. You may see these pop up about halfway down your search results, prompting you to investigate similar or frequently asked questions related to your original query. You'll notice, however, that the answers to these questions often differ from their rich-answer counterparts, implying that a separate algorithm generates them. It's unclear how it all ties together, but Google clearly has a long-term plan for query pattern recognition in addition to basic semantic understanding.
If you’ve read this article with an SEO perspective in mind, you may be wondering how all this affects you. Yes, it’s interesting to learn the inner mechanics and history of Google’s semantic search capabilities, but what practical information can you walk away with?
First, understand the key areas Google is developing (whether through manual updates or its new machine learning algorithms): voice search, semantic understanding, rich answers, and related questions. Google's main concern is getting correct, relevant information into the hands of searchers as quickly and easily as possible.
Your goal, therefore, should be to help Google get the job done. Spend more time researching common questions in your industry, and write answers to them. Explore complex, niche topics, and mark up your site with structured data (microformats or schema.org markup) so Google can scan it for answers. Become known as an authority, provide the information your users want, and you'll be rewarded with more visibility. It's as simple as that.
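One concrete way to do that marking-up is with schema.org structured data, which Google documents as a supported format for question-and-answer content. Here's a minimal sketch that uses Python to emit a JSON-LD `FAQPage` block; the question and answer text are placeholders to be replaced with your own industry content:

```python
import json

# Build a schema.org FAQPage block as JSON-LD. The Q&A text is a
# placeholder; substitute the real questions your audience asks.
faq = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "How often should I audit my SEO strategy?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "Many practitioners revisit their strategy at "
                        "least quarterly to keep pace with algorithm changes.",
            },
        }
    ],
}

# Embed the output in your page inside a
# <script type="application/ld+json"> tag.
print(json.dumps(faq, indent=2))
```

Markup like this gives Google a machine-readable map from question to answer, which is exactly the raw material rich answers are built from.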
Semantic search isn’t going away anytime soon, and your competition may already be making plans to conquer it in their own way. Keep this in mind as you audit, analyze, and shape your strategy in 2016 and beyond. Success in SEO isn’t about finding something that works and sticking with it forever; it’s about constantly refining your approach to accommodate these captivating new trends as they emerge.