Updated: 20 Sep, 2016

How to Conduct a Technical SEO Audit in 2016

A new year is just beginning, and you have to be more careful in analyzing a website now than at any time before, because Google is continuously improving its search algorithms to produce the best possible results on the first page of the SERP. As a result, the effectiveness of traditional old-school link building is waning.


So in such circumstances, to ensure your online business succeeds, whether you run a blog or an e-commerce website, you must have a clear and complete picture of the overall condition of your site. For your convenience, today I am going to discuss some critical issues that you must carefully consider in order to perform a comprehensive technical site audit of your website.

Check Indexed Pages:

Your site audit should start with searching for the domain name in Google. Do a "site:domain.com" search to check how many pages are returned in the search results. Also check whether the home page appears in the first position of the SERP. If the home page isn't showing up in the first position, there might be issues such as poor internal linking or site structure affecting the site.

Robots.txt:

Another crucial part of a site audit is making sure the website is fully accessible to search engines. Any mistake here and the search engines won't be able to crawl the site, which eventually leads to no ranking position in organic search results. A file stored on the web server, named robots.txt, lets you tell search engine bots which content to crawl and which to skip. If anyone accidentally blocks a crucial page or directory of your website, your site will dramatically lose traffic and rankings in search results. However, any problem with the robots.txt file is quite easy to identify and fix. To find out whether any crucial page has been blocked, you can use a good-quality robots.txt checker tool.
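For reference, a minimal robots.txt might look like the sketch below; the /admin/ directory and the sitemap URL are placeholder values, not recommendations for your own site:

    User-agent: *
    Disallow: /admin/

    Sitemap: http://www.example.com/sitemap.xml

The Disallow line keeps compliant bots out of the named directory, while the Sitemap line points crawlers to your XML sitemap.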

HTTP Status Codes:

When a search engine crawler requests a page from a web server, the server returns an HTTP status code along with the response. Your web server should return an HTTP 200 code, which means everything is OK. But when a web server returns an HTTP status code of 4xx or 5xx, that usually indicates an error that prevents search engines from crawling the page.
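If you want to spot-check a few URLs yourself before reaching for a dedicated tool, here is a minimal sketch using Python's requests library; the URL is a placeholder:

    import requests

    # Request a page and inspect the status code the server returns.
    # http://www.example.com/ is a placeholder; test your own key pages.
    url = "http://www.example.com/"
    response = requests.get(url, allow_redirects=False, timeout=10)
    print(url, response.status_code)  # 200 = OK; 4xx/5xx = crawl-blocking errors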

It's always great to use a tool like Screaming Frog or Xenu Link Sleuth to check for such HTTP errors. And if you have GWT access, you can check such crawling errors in the easiest way: just log in to Google Webmaster Tools and go to Crawl > Crawl Errors to check the site's health.

Canonicalization:

A canonicalization problem occurs when either the www or the non-www version of a domain doesn't redirect to the other. This is perhaps one of the most common problems websites suffer from. Because of it, many websites are actually splitting their domain authority between the www and non-www versions of the domain. If a website has this canonicalization problem and doesn't fix it, search engines will crawl and index both copies of the site, which ultimately makes the content duplicate. You can easily check whether your website has this problem by searching in Google for:

site:domainname.com -inurl:www

If the search returns no documents, then it's fine. Otherwise, you need to modify the .htaccess file (for example via cPanel) to redirect one version of the domain to the other.
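On an Apache server, for instance, a non-www to www redirect can be added to .htaccess like the sketch below; example.com is a placeholder, and you would reverse the rule if you prefer the non-www version:

    RewriteEngine On
    # Send every non-www request to the www version with a permanent redirect
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]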

XML Sitemap:

Having an XML sitemap on a website ensures better crawling of web pages by informing search engines about any updates to the site. So here are a couple of considerations you should check while analyzing your website's sitemap:

  • Is the sitemap a well-formed XML sitemap?
  • Does the sitemap follow the proper XML sitemap protocol? (See the sketch after this list.)
  • Is your XML sitemap structured so that indexation problems are easy to spot?
  • Has the sitemap been submitted to Google Webmaster Tools?
  • Are there any pages in the site crawl that do not appear in the sitemap?
  • Are there any pages in the sitemap that do not show up in the site crawl?
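For reference, a minimal well-formed sitemap following the sitemaps.org protocol looks like this; the URL and date are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2016-09-20</lastmod>
      </url>
    </urlset>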

Site Loading Time:

A website with a long loading time always sees higher visitor drop-off, and eventually an increased bounce rate and lost conversions as well. Similarly, search engine bots allocate very limited time to each website on the internet, so a fast-loading website is crawled more consistently and thoroughly than a slower one.


In most cases, a home page should weigh around 130 KB, and inner pages around 180 KB. There are plenty of online tools with which you can evaluate your site's loading time, such as Pingdom or Google's PageSpeed tools. These tools work really well at finding the issues that are acting as bottlenecks for your website.
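If you just want a rough first number before reaching for those tools, you can time a request yourself; this sketch is only a crude approximation of what Pingdom measures, and the URL is a placeholder:

    import time
    import requests

    # Time a full page download and report its size; a crude stand-in
    # for dedicated tools like Pingdom. The URL is a placeholder.
    url = "http://www.example.com/"
    start = time.time()
    response = requests.get(url)
    elapsed = time.time() - start
    print(f"{url}: {elapsed:.2f} s, {len(response.content) / 1024:.0f} KB")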

Broken Links:

As you know, Google and all other search engines crawl the web link to link, so having too many broken links can cause serious SEO problems. When Google faces too many broken links on a website, it may assume the site has a poor user experience, which often results in a decreased crawl rate as well as indexing and ranking problems. Unfortunately, broken links can also come from other webmasters linking to your website incorrectly. So broken links are very common, and you can fix them all with 301 redirection. You can identify your website's broken links using tools like Google Webmaster Tools or Xenu Link Sleuth, and then fix each of them with a 301 (permanent) redirect.
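On an Apache server, for example, a single broken URL can be permanently redirected with one line in .htaccess; both paths here are placeholders:

    # Permanently redirect a dead URL to its replacement (placeholder paths)
    Redirect 301 /old-page.html http://www.example.com/new-page.html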

Schema Optimization:

If the website is an e-commerce site, check whether Schema markup is implemented. Even if it is not an e-commerce site, you can still add Schema or microdata. So, while auditing a site, make sure that Schema data has been properly added to the code.
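As an illustration, product markup is often added as JSON-LD in the page source; the sketch below uses schema.org's Product type, and every value in it is a placeholder:

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "Product",
      "name": "Example Product",
      "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "USD"
      }
    }
    </script>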

Conducting a comprehensive technical SEO audit can be exhausting, depending on the size and complexity of a website. But once you are done with it, you will surely be able to improve your site's overall visibility and ranking in the SERPs.
