AudienceBloom

SEO 101: A Guide for the Technically Challenged

Search engine optimization (SEO), to the outsider, is a frustrating, complicated mess. Google doesn’t publish how its algorithm works (though it does give us helpful hints), and there are hundreds of independent, technical variables that can determine how your site ranks.

If you don’t have experience with programming or website building, technical factors like meta titles, site structure, and XML sitemaps can seem intimidating and difficult to approach. And while it’s true that experience pays off—a novice won’t get the same results as someone with years of experience—the reality is that SEO is more learnable than you probably give it credit for.

I’ve put together this guide to help the technically challenged folks out there—the ones new to SEO, or those unfamiliar with coding and website structure—to illustrate the basics of SEO, and simplify some of the more complicated techniques and considerations you’ll need to get results.

Table of Contents

+ The Big Picture
+ The Technical Stuff
+ The Non-Technical Stuff
+ Conclusion

The Big Picture

First, I want to cover the “big picture” of SEO, because the “technical,” intimidating stuff is only a fraction of what’s actually involved in your search rankings. The goal of SEO is to increase your search visibility, which in turn will increase your site traffic.

Google ranks sites based on a combination of two broad categories: relevance and authority. Relevance is how closely the content of a page is going to meet a user’s needs and expectations; for example, if the user asks a question, Google wants to find a webpage that answers it. Authority is a measure of how trustworthy or authoritative the source of the content is.

Your tactics will usually involve building your authority, increasing your relevance for targeted queries, or both, across the main areas of optimization covered in the sections that follow.

The Technical Stuff

Don’t worry. I’m going to make this as painless as possible. In this section, I’m going to cover most of the “technical” SEO elements that you’ll need to consider for your campaign. These are changes you’ll need to make to your site, factors you’ll need to consider or monitor, and potential technical issues that could come up during your campaign. I’m going to cover these as simply and as thoroughly as possible—so you can understand them and use them, no matter how much technical experience you have.

Search Indexing

When you go to a library for information, librarians can probably help you by finding a book. But no matter how relevant a book may be to your interests, it won’t matter if the book isn’t currently on the shelves. Libraries must acquire books as they’re released, updating old copies and adding new copies, to keep the most recent information on the shelves.

Search indexing works similarly. To provide results, Google needs to maintain shelves of “books,” in this case, a running archive of the websites and pages available on the web. Google uses automated bots, sometimes known as “crawlers” or “spiders,” to continually search the web for new and updated pages, which it then logs in its central index.

How is this relevant for you? If you want to be listed in search engines, and be listed accurately, you need to make sure your site is indexed correctly.

There are a few main approaches you can take: you can wait for Google’s crawlers to discover your pages naturally, you can submit an XML sitemap through Google Search Console (more on sitemaps later), or you can request indexing of individual URLs directly in Search Console.

You’ll also need to consider creating a robots.txt file for your site, which is essentially an instruction manual that tells Google’s web crawlers what to look at (and what to skip) on your site. You can create this file in Notepad, or any program on your computer that can save plain .txt files, even if you have no coding experience.

On the first line, you’ll specify an agent by typing: “User-Agent: ____”, filling in the blank with a bot name (like “Googlebot”) or using an * symbol to specify all bots. Then, on each successive line, you can type “Allow:” or “Disallow:” followed by specific URLs to instruct bots which pages should or should not be indexed. There are various reasons why you wouldn’t want a bot to index a page on your site, which I’ll get into later. However, you may want bots to index all pages of your site by default. If this is the case, you do not need a robots.txt file.

For example:
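The sketch below (with hypothetical paths, not real pages on your site) tells every bot to skip two pages while leaving the rest of the site open to crawling:

User-Agent: *
Disallow: /admin/
Disallow: /thank-you-page/

If you wanted these rules to apply only to Google’s crawler, you’d replace the asterisk with “Googlebot.”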

If and when your robots.txt file is ready, you can upload it to your site’s root directory like any other file.

Site Speed

Speed has been a somewhat controversial topic in SEO, as its importance is often overblown. The loading time of your web pages won’t make or break your rankings; shaving a second off your load time won’t magically boost a low-authority site to the top rank.

However, site speed is still an important consideration, both for your domain authority and for your site’s user experience. Google rewards sites that deliver content faster, since speed is conducive to a better overall user experience, but it only penalizes about one percent of sites for having insufficient speed. On the user experience side, every one-second improvement in load time has been correlated with roughly a two percent increase in conversions.

In short, whether you’re after higher rankings or more conversions, it’s a good idea to improve your site speed.

You can check your speed with a tool like Google’s own PageSpeed Insights, and start improving your site with strategies like compressing images, enabling browser caching, and minifying your scripts and stylesheets.
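As one example, assuming your site runs on an Apache server (a common setup), a few lines in your .htaccess file can turn on gzip compression and browser caching for static files; the file types and durations below are reasonable placeholders, not one-size-fits-all recommendations:

<IfModule mod_deflate.c>
  # Compress text-based resources before sending them to the browser
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
<IfModule mod_expires.c>
  # Tell browsers how long they can cache images, stylesheets, and scripts
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>

If your site runs on a different server or a managed platform, the same ideas apply, but the settings will live elsewhere, often in a caching plugin or your host’s control panel.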

Mobile Optimization

Mobile optimization is a broad category that includes both technical and non-technical elements. Mobile searches now outnumber desktop searches, so Google has made extensive efforts in recent years to reward sites that optimize for mobile devices and penalize sites that don’t.

Put simply, if your site is “friendly” to mobile devices, capable of loading and presenting content in a way that works well for mobile users, you’re going to see an increase in authority and rankings. Incidentally, you’ll also become more appealing to your target demographics, possibly increasing customer loyalty and/or conversions.

So what is it that makes a site “optimized” for mobile devices? There are a few main criteria: pages need to load quickly over mobile connections, text needs to be readable without pinching or zooming, content needs to fit the screen without horizontal scrolling, and links and buttons need to be easy to tap.

If all this sounds complex to you, don’t worry. There are some simple ways to test your site to see if it’s counted as “mobile friendly,” and simple fixes you can make if it’s not. The easiest way to make your site mobile friendly is to make your site responsive; this means that your site will detect what device is attempting to view it, and automatically adjust based on those parameters.

This way, you can keep managing a single site and have it work for both mobile and desktop devices simultaneously. You can also create a separate mobile version of your site, but this isn’t recommended, especially now that Google is beginning to switch to mobile-first indexing.

How can you make your site responsive? The easiest way is to use a website builder and choose a responsive template. Most mainstream website builders these days have responsive templates by default, so you’ll be hard-pressed to find one that doesn’t offer what you need.

If you’re building a site from scratch, you’ll need to work with your designers and developers to ensure they’re following responsive design principles.
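To give you a rough idea of what “responsive” means under the hood, here’s a minimal sketch: a viewport meta tag in the page’s head, plus a CSS media query that reflows a hypothetical sidebar on small screens (the class name and breakpoint are placeholders):

<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  .sidebar { width: 30%; float: left; }
  /* On screens narrower than 600px, stack the sidebar at full width */
  @media (max-width: 600px) {
    .sidebar { width: 100%; float: none; }
  }
</style>

A responsive template from a website builder or CMS handles these details for you, which is why it’s the easiest route for most site owners.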

As long as your site is responsive, you should be in good shape. If you’re in doubt, you can use Google’s Mobile-Friendly Test to evaluate your domain and see if there are any mistakes interfering with your mobile optimization. All you have to do is enter your domain, and Google will tell you if any of your pages are not up to snuff, pinpointing problem areas so you can correct them if necessary.

Sitemaps

I’ve mentioned the importance of sitemaps in multiple areas of this guide so far. Now I’m going to get into the technical details of what sitemaps are, why they’re important for your site, and how to create them.

There are actually two different types of sitemaps you can build and use for your site: HTML and XML. I’ll start with HTML sitemaps, since they’re a little easier to create and understand. As I mentioned before, HTML sitemaps exist as a page on your site, visible to both human visitors and search engine crawlers.

Here, you’ll list a hierarchy of all the pages on your site, starting with the “main” pages, and splitting down into categories and subcategories. Ideally, you’ll include the name of the page along with the accurate link to it, and every page on your site will link to your HTML sitemap in the footer.
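As a rough sketch (the page names and URLs here are hypothetical), an HTML sitemap is usually little more than a nested list of links:

<ul>
  <li><a href="/services/">Services</a>
    <ul>
      <li><a href="/services/seo/">SEO</a></li>
      <li><a href="/services/content-marketing/">Content Marketing</a></li>
    </ul>
  </li>
  <li><a href="/about/">About</a></li>
  <li><a href="/blog/">Blog</a></li>
</ul>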

Google won’t be using an HTML sitemap to index your pages, so it’s not explicitly necessary to have one. However, it does give Google search crawlers a readily available guidebook of how your pages relate to one another. It can also be useful for your visitors, giving them an overall vision of your site.

XML sitemaps are far more important. Rather than existing as a page on your site, XML sitemaps are code-based files that you can “feed” to Google directly in Google Search Console. They look a little like this:
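The snippet below is a minimal, illustrative example following the standard sitemap protocol; the URL, date, and priority values are placeholders, not recommendations:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://audiencebloom.com/examplepage/</loc>
    <lastmod>2018-06-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>

Each <url> entry describes one page: its address, when it last changed, how often it tends to change, and how important it is relative to your other pages.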

As you can imagine, they’re a nightmare to produce manually, but there are lots of free and paid tools you can use to generate one.

Before I explain XML sitemap generation, you need to know what these files are used for. Again, they aren’t going to determine whether or not Google indexes your site; Google is going to crawl your site anyway. Instead, uploading your XML sitemap tells Google which pages of your site you consider most valuable and how those pages relate to one another.

For example, you could exclude technical pages of your site that contain fewer than 200 words, so the overall perceived quality of your site isn’t dragged down by your worst content.

Google explains that XML sitemaps are especially useful for very large sites, sites with deep archives of pages that aren’t well linked to one another, and newer sites with few external links pointing to them.

Note that excluding a page from your XML sitemap doesn’t mean that page won’t be indexed; if you want to keep crawlers away from a page altogether, you’ll need to block it in your robots.txt file (as I described earlier).

Does this all sound too complex? Don’t worry; the actual process of creating a sitemap is fairly simple. Most CMSs have built-in features or plugins that let you automatically generate both HTML and XML sitemaps; for example, Yoast’s SEO plugin for WordPress can create dynamic sitemaps, which automatically update as you make changes to your site.

For example, if you exclude pages that fall short of a given word-count threshold, those pages will automatically reappear in the sitemap once you add enough content to them.

It’s helpful to know how sitemaps work and why they’re important, but for your own sanity, it’s best to leave their generation in the hands of automated apps.

Meta Data and Alt Text

What I’m going to refer to as “meta data” is a blanket category that includes page titles, meta descriptions, and alt text. These are sections of text that describe your pages (or specific pieces of content within those pages). They exist in the code of your site, and are visible to Google search crawlers, but aren’t always visible to visitors (at least not in a straightforward way).

Google’s crawlers review this information and use it to categorize certain features of your site, including pages as a whole and specific pieces of content within those pages. This makes it helpful for optimizing your site for specific keywords and phrases.
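To make that concrete, here’s roughly what these elements look like in a page’s source code (the wording, file name, and alt text are placeholders):

<head>
  <title>SEO 101: A Guide for the Technically Challenged | AudienceBloom</title>
  <meta name="description" content="A plain-English introduction to the basics of search engine optimization.">
</head>
<body>
  <img src="example-photo.jpg" alt="A short description of the image for crawlers and screen readers">
</body>

The title and description feed the search listing discussed below, while alt text describes an image to crawlers (and to visitors using screen readers).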

It’s also used to produce the entries users see in search engine results pages (SERPs). Accordingly, it’s important to optimize your meta data to ensure that prospective visitors are encouraged to click through to your site. In a typical SERP entry, the title of your page appears first, followed by your page URL in green, followed by your meta description.

Your goals in optimizing your site’s meta information, then, are first to ensure that Google gets an accurate description of your content, and second to entice users to click through to your site.

Thankfully, optimizing your meta data is fairly simple. Most CMS platforms will, for each page of your site, offer blank, clearly labeled boxes that let you edit the corresponding meta data for that page. Remember, it’s a good idea to include at least one keyword in each of your titles and descriptions, but you’ll want to avoid keyword stuffing, and focus on writing meta data that makes sense to your users.

Technical Errors

The last component of technical SEO I want to cover is the potential for technical errors: common issues that can (and probably will) go wrong with your site, causing a hiccup in your rankings and interfering with your plans.

If you notice your site isn’t ranking the way it should, or if something has changed dramatically without your notice (and with no immediately clear underlying cause), your first troubleshooting step should be checking for common technical errors.

One of the most common is duplicate content: when two or more URLs on your site serve the same (or nearly the same) content, you can point Google to the preferred version by adding a canonical tag to the page’s header, like this:

<link rel="canonical" href="https://audiencebloom.com/examplepage/">

If you have an SEO plugin, you may be able to enter the canonical link manually, just as you did with titles and meta descriptions. Alternatively, you can use 301 redirects to resolve duplicate content issues, though it’s arguably easier to set up canonical tags.
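If you go the redirect route and your site runs on an Apache server, a single line in your .htaccess file will do the job (both paths here are hypothetical):

# Permanently redirect the duplicate URL to the preferred version
Redirect 301 /old-duplicate-page/ https://audiencebloom.com/examplepage/

On other platforms, redirects are usually handled through a plugin or your host’s settings rather than a configuration file.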

There are other technical issues you may encounter, such as images not loading properly, but many of them are preventable if you follow best practices, and most are easily resolvable with a quick Google search. Even if you don’t understand exactly what’s happening or why, following step-by-step instructions written by experts is a fast way for an amateur to solve complex SEO problems.

The Non-Technical Stuff

In this section, I want to cover some of the “non-technical” tactics you’ll need for a successful SEO campaign. None of these strategies requires much technical expertise, but it’s important to understand that the technical factors listed above aren’t the only things you’ll need to grow your rankings over time.

Keep in mind that each of these categories runs deep and can take months or years to master; the entries below are mere introductions to their respective topics.

High-Quality Content

Without high-quality content, your SEO campaign will fail. You need at least 300 words of highly descriptive, concisely written content on every page of your site, and you’ll want to update your on-site blog at least two or three times a week with dense, informative, practical content—preferably of 700 words or more. This content will give search engines more pages and more content to crawl and index.

Collectively, these pages will add to your domain authority and to the authority of your individual pages, and they’ll provide more opportunities for your site visitors to interact with your brand and your site. Here are some resources to help you create and publish high-quality content:

Keyword Optimization

All that on-site content also gives you the opportunity to optimize for specific target keywords. Initially, you’ll select a number of “head” keywords (short and highly competitive) and “long-tail” keywords (longer, usually conversational phrases, and less competitive) to optimize for.

When performing your keyword research, you’ll choose terms with high potential traffic and low competition, then you’ll include those terms throughout your site, especially favoring your page titles and descriptions. You’ll want to be careful not to over-optimize here, as including too many keywords on a given page (or your site in general) could trigger a content quality-related penalty from Google.

Link Building

Authority is partially calculated based on the quality and appearance of your site, but the bigger factor is the quantity and quality of links you have pointing to your site. Link building is a strategy that enables you to create more of these links, and therefore generate more authority for your brand.

Old-school link building tactics are now considered spammy, so modern link builders use a combination of guest posting on authoritative external publications and naturally attracting links by writing high-quality content and distributing it to earn shares and inbound links. In any case, you’ll need to invest in your link building tactics if you want your campaign to grow. For help, see SEO Link Building: The Ultimate Step-by-Step Guide.

Analysis and Reporting

Finally, none of your tactics will be worthwhile unless you can measure and interpret the results they’re generating. At least monthly, you’ll want to run an analysis of your work, measuring things like inbound traffic and rankings for your target keywords, and, of course, checking for any technical errors that have arisen.

By interpreting these results and comparing them to the amount of money you’ve invested in your campaign, you’ll get a clear picture of your return on investment—your ROI—and can then make adjustments to improve your profitability. For help, see The Ultimate Guide to Measuring and Analyzing ROI On Your Content Marketing Campaign.

Conclusion

Hopefully, after reading this guide, all those technical SEO details should seem a lot less technical. If you’ve followed the guide step-by-step, you should have been able to tackle tasks like building robots.txt files and improving your site’s speed even if you don’t have experience in creating or managing websites.

Even though this guide covers some of the most important fundamentals of SEO, and can help you through the basics of technical SEO, it’s important to realize that SEO is a deep and complex strategy with far more considerations than a guide like this can comprehensively cover. A good next step would be to check out 101 Ways to Improve Your Website’s SEO.

If you’re interested in further help in your SEO campaign, be sure to contact us for more guidance and expertise!
