The phrase “conversational search” can mean a lot of things. In some of its most popular uses, it refers to a system of semantic understanding first introduced with the Hummingbird update back in 2013; this update allowed Google’s algorithm to analyze and “understand” the intent behind a phrase-like query, rather than breaking it down in terms of keywords and keyword phrases. Now, Google is introducing a subtle, yet highly significant twist on the phrase, and it could be the spark for a new revolution in search technology.
Following the description of “conversational search” outlined above, personal digital assistants have helped usher in a new era of casual searching. Siri, OK Google, and Cortana are just a handful of examples of artificial intelligence programs that can analyze and understand natural human speech, then apply it to an actual search function to find whatever the searcher is looking for. These programs, glitchy when they were first introduced, have undergone an impressive series of evolutions to reach their current state: capable of deciphering even muddled human speech with startling accuracy.
The emergence of these apps, and our subsequent reliance on them, has helped to solidify a shift in user search behavior. Back in the early 2000s, the best way to search for something was to think up a handful of keywords that might describe what you were looking for in Google’s eyes, a form of reverse-engineering to help the tool understand your intent. Today, no such manipulation is necessary; it’s more useful to simply talk to these digital assistants (or type a query) as if you were talking to a friend.
Google’s latest experiments are taking this functionality a step further.
The source of this expansion is Google’s iOS app, which naturally must compete with Apple’s Siri digital assistant as a search engine of choice for iOS users. In what’s being called “context-aware” searching, the app now allows users to extend a chain of different, interrelated queries without losing the thread of the conversation.
For example, take two phrases: “What’s a good restaurant?” and “What’s the best time to visit New York?” Separately, these two queries would conjure different results (assuming you live somewhere other than New York). But now, if you ask the second query and then follow it with the first, Google will recognize that you might be planning a trip to New York and will list New York-based restaurants rather than restaurants in your current area.
The shift is small, but it opens the door to a new, more convenient pattern of expansive searching. After all, when we’re planning trips or doing research, one query is rarely enough to give us all the information we need. Instead, we rely on branching forms of exploration, adding new queries when we get new information and attempting to give ourselves a more thorough view of the topic. In short, Google recognizes that users usually have follow-up questions, and now its digital assistant app is prepared to handle those questions.
We’re entering an era where “search” isn’t separate from our other digital interactions. Rather than pulling up a web browser, accessing a search engine, plugging in a term, then wading through results to find what we want, we can now use a handful of spoken words to instantly conjure up information. What Google is presenting is the next logical step—enabling the “monologue” of vocal search to become more of a “dialogue,” with branching conversational paths rather than a call-and-response type system.
It’s difficult to say exactly how this will develop in the future, but keep in mind that Google is also expanding its Knowledge Graph functionality to better accommodate follow-up questions and expanded forms of information. These applications combine artificial intelligence with earlier autocomplete-style algorithms. The end result is more thorough results, and more conversational, natural interactions with machines. Soon, these digital assistants may be able to hold their own end of the conversation, asking clarifying questions and offering suggestions we didn’t even ask for. We’re still a few years away from this, and it may develop differently, but it’s still worth thinking about.
Because this upgrade in functionality is relatively limited, it’s unlikely that it will have a substantial influence on SEO results. The algorithm Google uses to calculate the best results for a given query isn’t changing; instead, Google is transforming queries to better connect logically to previous queries. In the example I listed above, asking Google for restaurants after a New York-related query is equivalent to the search query “New York restaurants.”
Still, it’s worth noting that this upgrade, and the updates likely to follow, will only increase the average consumer’s tendency to search using spoken phrases rather than conventional keywords. As such, keyword-based SEO strategies are drifting even further from modern practical application. Instead, it’s better to cover a wide range of topics related to your industry, focusing on natural, conversational language and addressing subjects you know people will want to read about.
Pay close attention to how these conversational and artificial intelligence advancements develop over the next few years. SEO is in no immediate danger of dying, but I imagine it will transform dramatically—perhaps into something almost unrecognizable—as these search functions continue to grow more advanced.