It’s well known that Google puts its users first when it comes to calculating search ranks. Domain authority, a measure of a site’s merit or trustworthiness, is one of the most important factors for determining where a specific page within a domain will rank. For example, if a user searches for “shaving cream,” a site with a “shaving cream” page and a high domain authority will rank higher than a similar site with a low domain authority.
The factors responsible for forming a site's domain authority are somewhat mysterious. Search marketers have uncovered many of these factors, either through official Google announcements or through direct trial-and-error experiments with Google search. For example, we know that the number and quality of backlinks pointing to a domain factor into how authoritative that domain is perceived to be. But new information suggests that Google is attempting to find new, better ways to calculate a site's authority, including using the accuracy of the information found on the domain.
New Scientist recently revealed that the search engine giant was starting to push the idea of Knowledge-Based Trust (KBT), a proprietary method of calculating a page's authority based on the accuracy of the information it contains. Rather than examining a site's backlink profile, this algorithm would rely on "endogenous signals," drawn from the page itself, to determine the correctness of the facts listed there.
To determine the correctness of this material, Google would compare snippets of information found on the page to similar snippets of information it already has compiled on the Knowledge Graph. In case you weren’t aware, the Google Knowledge Graph is a compendium of verified information pulled from various authoritative sources on the web and reviewed manually for accuracy. You can see information from the Knowledge Graph in its current state by searching for movies, actors, politicians, or other famous subjects—it’s presented in an organized box on the right-hand side.
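To make the idea concrete, here is a minimal, purely illustrative sketch of how a KBT-style check might work. This is not Google's actual algorithm; it assumes facts are modeled as (subject, predicate, object) triples and that a small hypothetical set of trusted facts stands in for the Knowledge Graph:

```python
# Toy KBT-style accuracy check (illustrative only, not Google's method).
# Facts are (subject, predicate, object) triples; TRUSTED_FACTS is a
# hypothetical stand-in for the Knowledge Graph.

TRUSTED_FACTS = {
    ("Barack Obama", "born_in", "Honolulu"),
    ("The Great Gatsby", "author", "F. Scott Fitzgerald"),
    ("Paris", "capital_of", "France"),
}

def kbt_score(page_facts):
    """Fraction of a page's checkable triples that match the trusted store.

    A triple is "checkable" only if its (subject, predicate) pair appears
    in the store; triples the store knows nothing about are ignored.
    """
    known_pairs = {(s, p) for s, p, _ in TRUSTED_FACTS}
    checkable = [f for f in page_facts if (f[0], f[1]) in known_pairs]
    if not checkable:
        return None  # nothing to verify against
    correct = sum(1 for f in checkable if f in TRUSTED_FACTS)
    return correct / len(checkable)

page = [
    ("Barack Obama", "born_in", "Honolulu"),   # matches the store
    ("Paris", "capital_of", "Italy"),          # contradicts the store
    ("shaving cream", "invented_in", "1919"),  # unknown, so ignored
]
print(kbt_score(page))  # 0.5
```

Note how facts the store cannot evaluate are simply skipped rather than penalized, which mirrors why early versions of such a system would mostly affect pages covering well-indexed topics.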
According to Google's recent report on the matter, the KBT algorithm has already been applied to 2.8 billion facts and snippets taken from the web. In what was considered a successful test, Google used those facts to estimate the accuracy of 119 million web pages and 5.6 million websites. While it's difficult to say how reliable those estimates were, the results seem to be a good start for the technology.
Assuming the KBT algorithm one day goes live, Google researchers have suggested that it will only serve as a complement to existing authority-determining factors like backlink profile analysis, rather than a replacement of them. Should the algorithm take effect, there will likely be significant volatility in search ranks. Depending on how accurate your site’s information is compared to Google’s Knowledge Graph, you could move up or down in rankings in a sudden motion.
However, because Google's algorithm already evaluates the quality of your writing and the strength of your content, authority is tied to accuracy by proxy. If you've remained committed to posting only the best, most helpful information you can for your users, chances are the release of this new algorithm will not significantly lower your rank.
Also consider the fact that KBT is mostly based on information housed in the Google Knowledge Graph. Currently, the Knowledge Graph only contains very specific types of information; for example, novels, celebrities, cities, and historical events are all categorized and indexed in a consistent format, but more complex information, like "how to change a tire," is more difficult to categorize and likely will not be indexed in the early stages of Google's more advanced information-processing products.
Google is nothing if not meticulous. Before the company integrates its KBT algorithm into its existing search algorithm, it’s going to want to be sure of the technology’s effectiveness, and that means months of rigorous testing. Early signs appear to validate the effectiveness of the algorithm, but Google’s development team will likely want to refine their approach before debuting it to the general public.
That being said, Google is constantly pushing for new updates and the best possible search functionality for its billions of global users. Since the KBT algorithm is largely based on the quality of the Knowledge Graph, and the Knowledge Graph has been in constant refinement since 2012, Google may be more willing to make an early push. Google's updates typically come as a surprise even to search marketers in the know, so KBT will probably be rolled out when we least expect it.
For now, don't worry too much about KBT. It's still in a testing phase, and by the time it rolls out it will probably be refined to the point where it only minimally affects the landscape of search. What you can and should focus on in the meantime is the quality of your content. Start double-checking the facts and figures in all your syndicated posts, and implement a review process that formalizes a fact-checking procedure for every new work that ends up on your site. This procedure will help your site become more KBT-friendly, but that's a far-off concern. Your immediate priority should be taking more steps to ensure that your users get the most accurate, most valuable information possible. Like Google, you must always put your users before anything else.