Technical SEO covers the tactics and strategies implemented to make it as easy as possible for search engine spiders to crawl and index your website. It is the aspect of SEO that is not directly related to your content: the behind-the-scenes work you need to complete before moving forward with an aggressive content strategy. Areas like site speed, site architecture, and proper server configuration are all high-level topics of technical SEO.
Below I have provided twenty technical SEO tips for your business’s website. Be sure to bookmark this guide so you can use it as an ongoing reference point. If you do your own SEO, make sure you have addressed all of the tactics below. If you have outsourced your SEO, bring this guide to your next meeting with your SEO partner and see if there are any areas that are not being addressed.
1. Use Google Search Console
If you haven’t already done so, sign up for Google’s Search Console. Search Console lets you confirm that Google can index your content, submit new content that you need indexed, remove content that you do not want to show up in a Google search, monitor malware and spam, analyze which search queries are surfacing your site, and much more.
If you are new to Google’s Search Console, you will have to add your site and verify ownership. This is a very simple process, and Google provides several options for verification. You can verify your site by uploading an HTML file, verifying through your domain provider, adding an HTML meta tag to your home page, using your Google Analytics tracking code, or using Google Tag Manager. Personally, I find the domain provider method the easiest because you don’t have to install any code. But each option is fairly simple, and Google provides clear instructions.
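If you go the meta tag route, it amounts to a single line in the `<head>` of your home page. A minimal sketch (the content token below is a placeholder; Search Console generates your actual value):

```html
<!-- Google Search Console verification tag, placed inside <head> on your home page -->
<!-- The content value is a placeholder; Search Console generates your real token -->
<meta name="google-site-verification" content="YOUR-VERIFICATION-TOKEN" />
```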
2. Create HTML and XML Sitemaps
In the footer of a website, have you ever noticed a link titled “sitemap”? That is the HTML sitemap, and it is intended to make it easier for visitors to understand your site structure at a glance.
An HTML sitemap is a bulleted, text-only outline of the site navigation, with each entry rendered as a clickable link. HTML sitemaps used to be much more popular; today, one will neither significantly help nor hurt your SEO.
An XML sitemap, on the other hand, is extremely important. This sitemap is not for site visitors but for search engines. It helps search spiders crawl content quickly and ensures that all content is indexed. Search engine spiders are getting smarter and smarter, but sometimes they can miss certain pages. An XML sitemap ensures no pages are missed.
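If you are curious what an XML sitemap looks like under the hood, here is a minimal sketch following the sitemaps.org protocol; the URLs and dates are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- A minimal XML sitemap following the sitemaps.org 0.9 protocol -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2016-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
    <lastmod>2016-01-10</lastmod>
  </url>
</urlset>
```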
You’ll need to submit your sitemaps to Google’s Search Console and to Bing Webmaster Tools.
If your site runs on WordPress, the easiest way to create and update your sitemap is with a plugin. The most popular sitemap plugins are Google XML Sitemaps, Yoast, and Better WordPress Google XML Sitemaps.
Adding your sitemap to Google is very straightforward. Log in to Search Console, click Crawl > Sitemaps, then click Add/Test Sitemap and enter the URL of your sitemap.
In Bing Webmaster Tools, go to the Sitemaps section and submit your sitemap URL.
3. Robots.txt
You will need to add a robots.txt file to the root folder of your site in order to give instructions about your site to web robots. I know it sounds a bit bizarre and highly technical, but it’s actually pretty easy to accomplish. The technical term for this process is the Robots Exclusion Protocol.
Reference Google’s instructions for creating your robots.txt file.
The purpose of this file is to tell search engines which parts of your site you do not want accessed by search engine crawlers. So, your XML sitemap indicates which pages you want crawled, and your robots.txt file indicates which pages you do not want crawled.
For example, let’s say you set a new site live but do not yet want it crawled, or you have duplicate content that you plan to handle via your robots.txt file. A minimal sketch covering both scenarios follows.
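Here is a rough sketch of a robots.txt for those situations; swap in your own paths:

```
# Keep all crawlers out of the entire site (e.g., a new site not yet ready to be crawled)
User-agent: *
Disallow: /

# Or, allow everything except a directory of duplicate content:
# User-agent: *
# Disallow: /print-versions/
```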
Keep in mind that these files are public, so do not include sensitive or private data in them.
It’s important that the robots.txt file exists even if it is empty. Create this file with caution, and make sure you do not accidentally exclude any important site files!
4. Avoid Cloaking
Make sure you are not cloaking! Cloaking is a black hat SEO tactic that shows one version of a webpage to users and a different version to search engines. Below-board marketers use this trick to fool search engines and game the system. Unfortunately, some honest site owners are cloaking by complete accident; perhaps they have had high turnover among web developers and SEO professionals, and they don’t even know it’s going on.
Search engines want to see the exact same results that users are seeing. Any attempt to be deceptive will only work in the short term and ultimately result in penalties that can take months to recover from. Cloaking is a clear violation of Google’s Webmaster Guidelines.
Sometimes a site is loaded with content that is difficult for search engines to access, such as heavy JavaScript, images, and Flash. Rather than taking the time to make that content more accessible to search engine spiders, uneducated developers turn to cloaking instead.
5. Manage Your 301 Redirects Properly
301 redirects are used to permanently redirect one URL to another. A 301 tells browsers and search engines that a page has moved for good, passing visitors (and most link equity) on to the new URL. These redirects are common in the following circumstances (a sample server configuration follows the list):
- You installed an SSL certificate and you want to redirect from HTTP to HTTPS.
- You have an entirely new domain and want to redirect your old site.
- You are merging multiple websites and need to make sure links to outdated URLs are redirected to the correct pages.
- You are redirecting 404 error pages and expired content to newly produced relevant content.
- Visitors are accessing your site through multiple domains. You can pick one of the domains and use 301 redirects to direct all of the other domains to your preferred domain.
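Assuming your site runs on an Apache server, a minimal .htaccess sketch for two of the scenarios above might look like the following; the domain and paths are placeholders:

```apache
# Force HTTPS: send all HTTP traffic to the HTTPS version of the same URL
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

# Permanently redirect a single outdated page to its replacement
Redirect 301 /old-page/ https://www.example.com/new-page/
```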
Keep in mind that you will want to avoid lengthy redirect chains. Redirect chains are a series of 301 redirects that force the user from one URL to another, and then another, and so on. This has negative implications for authority and link juice. Lengthy redirect chains will also impact the load time of your site, and as you know, slowing down your site hurts both your SEO and the overall user experience.
Fortunately, there is a tool out there that will help you identify redirect chains: Screaming Frog. Crawl your site with Screaming Frog, open the Reports menu, and select Redirect Chains. Analyze the report, send it to your IT or web development team, and you are on your way to cleaning up those annoying redirect chains and keeping your site’s link structure nice and tidy.
Make sure you also look for canonical tags that point to URLs that 301 redirect. From my experience, people tend to get confused about when to use a 301 redirect versus a canonical tag. A canonical URL tells search engines that similar URLs are actually one and the same, and declaring it improves your SEO by notifying Google which URL is the preferred version. But when a canonical tag points to a redirecting URL, you have created a redirect chain. You can identify canonical tags that trigger 301 redirects with Screaming Frog as well.
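For reference, a canonical tag is a single line in the `<head>` of the duplicate page. A minimal sketch with a placeholder URL:

```html
<!-- Tells search engines this page is a variant of the canonical URL -->
<!-- Point href at the final destination URL, never at a URL that 301 redirects -->
<link rel="canonical" href="https://www.example.com/preferred-page/" />
```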
6. Unblock JavaScript and CSS
You may have seen an email from Google about unblocking JavaScript and CSS. The warning was widely distributed last year and had web developers and SEOs freaking out. The alert was confusing and left people wondering why Google needed access to these files.
Google needs to render these files for the mobile-friendly algorithm and the layout algorithm. Google recommended unblocking these resources so they could provide the mobile-friendly tag in the search results and apply the appropriate boost in rankings.
You can find blocked resources by logging in to Google Search Console and navigating to Google Index > Blocked Resources.
In most cases you will be able to unblock your JavaScript and CSS by checking your robots.txt file and ensuring that all JavaScript and CSS files are allowed to be crawled.
You may also want to learn about the directives you can add to your robots.txt file to ensure Google can crawl your JavaScript and CSS, as in the sketch below.
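As a rough sketch, robots.txt directives like these tell Googlebot it may fetch your script and stylesheet files (Google supports wildcard patterns in Allow rules):

```
# Explicitly allow Googlebot to fetch JavaScript and CSS files
User-agent: Googlebot
Allow: /*.js
Allow: /*.css
```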
7. Manage Duplicate Content
This one is really easy and actually has more to do with the user experience than it does with SEO. Duplicate content has a bad name, and some SEOs are under the false impression that Google is on a witch hunt for duplicate content and will penalize every instance it finds. That is not true at all. Please keep in mind, I am mostly talking about redundant content on your own site, not plagiarized content. Plagiarized content is absolutely terrible, and I certainly do not condone any form of plagiarism. If you are concerned that someone is plagiarizing your content, you can use Copyscape to check.
Siteliner is another great tool to check your own site for duplicate content. It could be that you simply used too many of the same words on each page or maybe your copy just needs a makeover. Consider elaborating on each individual page or consolidating similar topics into one page, if possible.
If your site contains large amounts of similar content by design, there are a number of responsible ways to handle that situation: 301 redirects, consistency in your internal URL linking, and the proper use of top-level domains, for example. For more tips on handling duplicate content, check out Google’s guidelines on duplicate content.
8. User Experience
When you think about your customers and prospective customers instead of Google’s algorithm, you will naturally do things that improve your SEO. There is a natural crossover between ranking criteria and the factors that contribute to a good user experience. If you visit a site that loads quickly and provides the information you are looking for, you are likely to stay on the site for an extended period of time while you browse products and services. You are happy because the site loaded quickly, and we know Google looks at site speed and pogo sticking. Pogo sticking is when a searcher clicks through to a site, immediately abandons it because of a bad experience, and bounces back to the search results to try another listing. This contributes to a high bounce rate and convinces search engines to lower your rankings.
Sites with thin content typically rank low and provide a low-quality user experience. In my book, thin content is really a measurement of how much users engage with your content. For example, if users are pogo sticking, you will have a high bounce rate and bad metrics for time-on-site. This could be a consequence of thin content.
To fix the problem, don’t just think about writing more content, but think about how you can increase value and provide site visitors with high quality content. If you can do this effectively, you will see visitors stay on your site longer and begin interacting with your brand. SEO is not just about attracting new site visitors; it’s about attracting the right site visitors to the right web properties and ultimately converting them to customers.
9. Structured Data/Microdata and Schema Markup
You have heard me advocate writing for your site visitors and not for search engines. Microdata is the exception to this rule and exists exclusively for search engines.
Microdata is an HTML specification that is used to nest metadata within existing content of webpages. Your site visitors will not see your microdata (unless they inspect your source code). But search engines can use your microdata to provide an enriched experience to site visitors.
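To make that concrete, here is a minimal sketch of microdata nested into ordinary HTML; it uses the Schema.org vocabulary covered next, and the business details are made up:

```html
<!-- Microdata attributes (itemscope, itemtype, itemprop) wrap the visible content -->
<div itemscope itemtype="https://schema.org/LocalBusiness">
  Welcome to <span itemprop="name">Example Coffee Roasters</span>!
  Call us at <span itemprop="telephone">(555) 555-0100</span>.
</div>
```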
There are a variety of different microdata vocabularies, but the most common is Schema.org markup. Schema.org is the most comprehensive microdata vocabulary and is supported by all major search engines. It is the only one you should use!
Think of microdata and schema as extra information search engines can use to show more relevant and enhanced content. The enhanced information they show is referred to as rich snippets or rich cards.
You will have a few different options for implementing schema:
- If you are using WordPress, the easiest way to implement schema is simply by using a plugin. Here are some top schema plugins.
- It’s no surprise that Google has a tool you can use: the Structured Data Markup Helper will walk you through tagging your pages.
- Add the schema manually, as in the JSON-LD sketch after this list. I wouldn’t recommend this unless you are really comfortable working on the back end of the site or you have outgrown the limitations of the plugins. If you decide to implement schema manually, be sure to check that your code is correct by using Google’s structured data testing tool.
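For those implementing manually, here is a minimal hand-written sketch in JSON-LD, one of the structured data formats Google accepts. All of the business details are placeholders, and you should run the result through the testing tool before publishing:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Coffee Roasters",
  "url": "https://www.example.com/",
  "telephone": "(555) 555-0100"
}
</script>
```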
Please note that adding schema will not directly impact your SEO rankings, nor will it guarantee that a rich snippet is displayed for every search. However, if you use schema correctly, you can make your organic listing more attractive and consequently improve your click-through rates.