Google Webmaster Tools is one of the best free webmaster toolsets available. With its help we can submit our site to Google and keep an eye on the performance of our site or blog. GWT is a site optimization tool that helps webmasters tune up their sites in terms of site submissions, search queries, traffic reports, indexing status, crawl errors, malware warnings and more. Today we are going to discuss crawl errors: why they appear in Webmaster Tools and how to fix them.
Different Kinds of Crawl Errors in GWT
First of all, we need to know what kinds of errors can appear in Google Webmaster Tools. Right now Google provides three site-level crawl error reports: DNS, server connectivity and robots.txt fetch errors. You can see them in the snapshot of site crawl errors below. You may also like our post on the manual action button in Google Webmaster Tools.
1. DNS Errors
DNS stands for Domain Name System. This issue appears when Googlebot is unable to communicate with your DNS server. It can be caused by a heavy traffic load on your website, a DNS routing problem, or your server being down. If you get this warning for the first time, it is not a big issue; you can ignore it or check it through the Fetch as Google option in Webmaster Tools. But if you are getting this warning again and again, you need to take it up with your web hosting provider. If they are unable to give you any support, the only way out is to shift your site or blog to a better hosting provider. Two of the world's best web hosting providers are Bluehost and HostGator, which I trust the most.
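If you want to rule out a DNS problem yourself before contacting your host, a quick resolution check is enough. Here is a minimal Python sketch; `www.yoursitename.com` is a placeholder for your own domain:

```python
import socket

# Try to resolve the hostname, the same first step Googlebot takes.
def check_dns(hostname):
    try:
        infos = socket.getaddrinfo(hostname, 80)
        # Each entry's sockaddr starts with the resolved IP address.
        print(hostname, "resolves to:", {info[4][0] for info in infos})
    except socket.gaierror as err:
        # A failure here corresponds to a DNS error report in GWT.
        print("DNS lookup failed for", hostname, "-", err)

check_dns("www.yoursitename.com")
```

If the lookup fails consistently from several different networks, the issue is on the DNS side, not with Googlebot.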
2. Server Connectivity Errors in GWT
Server connectivity errors mean Googlebot is unable to crawl your website, either because your site is taking too long to respond or because a firewall is blocking Googlebot. It may also happen when your site is handling a heavy traffic load; at such times the server may return an overloaded status (HTTP 503) to crawlers, which simply tells them to crawl the site slowly or at some other time. If your site gets a decent amount of traffic, it is better to shift it to a VPS or dedicated server to avoid such errors. If your site does not have heavy traffic, the problem is likely a firewall blocking crawlers from loading your site, and in that case you need to contact your hosting provider immediately to fix the error.
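To see your site roughly the way a crawler does, you can time a plain HTTP request and watch the status code. A minimal sketch using only Python's standard library follows; the URL is a placeholder:

```python
import time
import urllib.error
import urllib.request

# Time a request to the homepage and report the HTTP status.
def check_connectivity(url, timeout=10):
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            elapsed = time.monotonic() - start
            print("HTTP", resp.status, "in", round(elapsed, 2), "seconds")
    except urllib.error.HTTPError as err:
        # 503 is the "overloaded, come back later" signal mentioned above.
        print("Server returned HTTP", err.code)
    except urllib.error.URLError as err:
        # A timeout or refused connection is what Googlebot sees as a
        # server connectivity error.
        print("Connection problem:", err.reason)

check_connectivity("https://www.yoursitename.com/")
```

If the request regularly takes more than a few seconds, or fails from outside networks while working fine from your own, a slow server or an over-aggressive firewall is the likely cause.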
3. Robots.txt File Fetching Error
A robots.txt file is created to stop search engines from indexing particular areas of your website or blog that have little or no importance. Robots.txt fetch errors appear when you misspell a page name inside the robots.txt file or make some other mistake in the file itself. Tuning up your robots.txt file is all up to you. Check the URL first, like http://www.yoursitename.com/robots.txt; if the page loads successfully, then the problem is inside the file. You can also find help for creating this file on the net.
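Python's standard library even ships a robots.txt parser, so you can confirm that the file both loads and parses as you intended. A minimal sketch, with placeholder URLs:

```python
import urllib.robotparser

# Fetch and parse the live robots.txt file.
rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.yoursitename.com/robots.txt")
rp.read()

# Ask whether Googlebot is allowed to crawl a given page under the
# rules that were actually fetched.
print(rp.can_fetch("Googlebot", "http://www.yoursitename.com/some-page"))
```

If `can_fetch` returns False for pages you expect to be crawled, the rules in the file are stricter than you intended.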
These are the three site-level crawl errors that can appear in Webmaster Tools. Sometimes these errors are temporary, but if you are getting them continuously, it's time to act to avoid any kind of penalty on your site. You can also join our SEO training program for more details.