The 7 Most Common Ways Webmasters Botch On-site SEO
Webmasters have a ton of responsibilities, so it’s incredibly rare to find one capable of performing every part of the job perfectly. There’s no such thing as a perfect website (as you’ve undoubtedly experienced), and even if there were, only the most astute and experienced webmasters would be capable of building it. Most of us are entrepreneurs, designers, or marketers who excel in some functions but are invariably weak in others.
In some organizations, onsite SEO falls to webmasters exclusively, and even when it doesn’t, it’s usually the webmaster’s responsibility to handle the architectural structures and changes necessary to help a site rank. Onsite SEO mistakes are hard to detect, so they’re easy to miss, and if you operate under the assumption that you’re doing everything right, you could wind up sabotaging your rankings without ever knowing it.
These are some of the most common ways webmasters botch their on-site SEO:
1. Extended or Nonsensical URLs.
Take a look at the URLs of your web pages. What do they look like? At the end of the breadcrumb trail, is there a sensible, descriptive phrase like “learn-seo-today,” or a long string of incomprehensible numbers and punctuation marks? If it’s the latter, you’re in trouble. Google uses URLs as part of its understanding of the nature and intention of your pages. This is a critical opportunity to include a strong keyword and help Google understand the underlying structure of your website. Besides, those long trails of confusing numbers are unflattering from a user experience perspective. Every page of your site should be described appropriately in its URL.
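To illustrate the difference, here’s a hedged comparison (both URLs are hypothetical examples, not real addresses):

```
Hard to parse:  https://www.example.com/index.php?id=43871&cat=7
Descriptive:    https://www.example.com/blog/learn-seo-today
```

The second version tells both users and search engines what the page is about before it even loads.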
2. Forgetting Robots.txt.
The robots.txt file is an important inclusion in the back end of your site that lets Google know how it should crawl your site. Without that instruction manual, Google’s crawlers will fall back on default behavior, which isn’t usually a good thing. Robots.txt allows you to instruct Google to crawl or not crawl specific sections and pages within your site, giving you more control over how your site is seen in search engines. When creating and uploading your file, be sure to double check it—you’d be surprised how common it is to accidentally block a solid page from being crawled.
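A minimal robots.txt might look like the sketch below. The disallowed paths are hypothetical examples—substitute the sections of your own site that shouldn’t appear in search results:

```
# Applies to all crawlers
User-agent: *
# Hypothetical example paths — block private or low-value sections
Disallow: /admin/
Disallow: /cart/
# Everything else remains crawlable by default

# Optional: point crawlers at your XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```

The file lives at the root of your domain (e.g., example.com/robots.txt). Note that `Disallow` prevents crawling, not necessarily indexing, so double check that you haven’t blocked anything you actually want ranked.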
3. Neglecting the XML Sitemap.
Your XML sitemap is another file that helps Google understand your site, though in less specific terms. If robots.txt is your instruction manual for Google’s web crawlers, then your XML sitemap is, well, the map. It tells Google how your site is structured, what pages exist, and how your site is broken down. This can help Google direct users to the most relevant pages within your site, and possibly break down the hierarchy of your pages for them. Forgetting or failing to update your sitemap won’t kill you, since Google’s pretty good at drawing its own conclusions, but it’s a simple fix you shouldn’t neglect.
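For reference, a bare-bones XML sitemap follows the sitemaps.org protocol; the URL and date below are hypothetical placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want Google to know about -->
  <url>
    <loc>https://www.example.com/learn-seo-today</loc>
    <!-- Optional: tells crawlers when the page last changed -->
    <lastmod>2016-01-01</lastmod>
  </url>
</urlset>
```

Upload it to your site’s root and submit it through Google Webmaster Tools so Google knows where to find it.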
4. Allowing Thin Content.
Technically, it’s your writer’s job to come up with the content, but it’s your job to build the sitemap. Part of that job means deciding which pages are necessary and relevant for your users, and eliminating anything that isn’t. If one (or more) of your pages is stuffed with fluff content (and your writer can’t do anything else with it), that page probably doesn’t need to be on your site. Nip thin content in the bud by structuring your site as minimally and effectively as possible.
5. Duplicating Meta Titles and Descriptions.
Your meta titles and descriptions help Google understand the intent of your pages, and dictate the appearance of your content in SERPs. You’ve probably got them all filled out, and maybe you’ve even used a tool to help you create them, but are all of them unique? Duplicate meta titles and descriptions don’t save you time—they’re redundant, and they waste your potential. Be sure to write an appropriate, unique entry for each of your pages.
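In the page’s HTML, these live in the head section; the titles and descriptions below are hypothetical examples of what unique entries look like:

```html
<!-- Page A — hypothetical example values -->
<title>Learn SEO Today: A Beginner's Guide | Example Company</title>
<meta name="description" content="New to SEO? This beginner's guide walks you through keywords, links, and site structure.">

<!-- Page B — a different page gets its own unique title and description -->
<title>On-site SEO Checklist | Example Company</title>
<meta name="description" content="A step-by-step checklist for auditing your site's on-page SEO health.">
```

If two pages share the same title and description, Google has less to distinguish them by, and searchers see the same snippet twice.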
6. Duplicating Indexed Content.
Duplicate onsite content has little to do with plagiarism (in most cases). Instead, it has to do with how Google scans and indexes your website. If a page is mistakenly indexed twice (such as once with an http:// prefix and once with an https:// prefix), it could register as duplicate content. You can generate a list of such instances using Google Webmaster Tools; be sure to eliminate them with proper canonicalization or through 301 redirects.
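The canonicalization fix is a single tag in the head of each duplicate version, pointing Google at the URL you want indexed (the URL here is a hypothetical example):

```html
<!-- Placed on every variant of the page (http, https, trailing-slash, etc.) -->
<!-- so Google consolidates them under the one preferred URL -->
<link rel="canonical" href="https://www.example.com/learn-seo-today">
```

A 301 redirect accomplishes the same consolidation at the server level by sending visitors and crawlers from the duplicate URL to the preferred one; the right approach depends on whether you want the duplicate version to remain reachable.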
7. Failing to Include Microformatting.
Google loves to use rich answers—the bits of information you sometimes see immediately in search results, above and apart from traditional SERP entries. Rich answers are growing in prominence and importance, but Google relies on webmasters to supply the underlying data. Microformatting is backend markup that feeds Google this information in digestible chunks, establishing a universal language that all webmasters and bots can follow. If you aren’t using it, you’re missing out on some serious potential search visibility (and leaving your users with less information, accordingly).
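One common way to supply this markup is JSON-LD using the schema.org vocabulary, which Google supports for structured data; the values below (headline, author, date) are hypothetical placeholders:

```html
<!-- schema.org structured data in JSON-LD format — all values are examples -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Learn SEO Today",
  "author": {
    "@type": "Person",
    "name": "Jane Doe"
  },
  "datePublished": "2016-01-01"
}
</script>
```

Other formats, like microdata attributes embedded directly in your HTML tags, work too; the point is to label your content in a vocabulary search engines already understand.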
Being proactive is the greatest quality anyone in the SEO industry can have; problems are natural, often inevitable, and they’ll creep up on you even when you feel like everything else is going right. The only way to catch them is to actively seek them out and correct them before they do too much damage. Don’t lose your mind trying to make everything perfect, but don’t rest on your laurels either—run routine checks to ensure your onsite SEO starts and remains in good health.