Chapter 8 – Google Webmaster Tools: Viewing Data and Using Tools, Part 4
In this chapter, we will take a look at the “Crawl” menu and discuss the data and tools available in its submenus. There are six options under this menu. Let us take a look at them one by one.
1. Crawl Errors: When you click on Crawl Errors, you get a dashboard titled Site Errors. Don’t be confused; you are indeed looking at the errors present on your site. I am not sure why the menu says “Crawl Errors” while the dashboard says Site Errors. Naming aside, what we should focus on is the data available to us.
The data available here is a treasure for site owners. If you see a dip in the Index Status graph, this is the next place you should come. This section provides data about any crawl errors on your website, covering the last 90 days, plotted as a graph. Ideally, the graph should stay at zero. Any rise should raise your eyebrows, and you should look for the pages causing the errors. Thankfully, Google provides plenty of data to identify the issue.
First, it tells us when the crawl error occurred: whether Googlebot encountered it while crawling your website for desktop, or whether Googlebot for smartphones encountered it while crawling for mobile. This is why you see two tabs here, Desktop and Smartphone. This data is extremely helpful for maintaining your website from both desktop and mobile points of view.
Second, it tells you what type of crawl error it was, whether an access denied or a not found error. This helps you zero in on the issue.
Last, it gives you the list of URLs of the pages where the errors occurred and what type of error was on each page. I am sure you can’t ask for more after this. It helps you pinpoint the issue and the pages affected, and fix them. Once you have fixed the issue, you can click on the URL again and tell Google so by clicking the “Mark as fixed” button, so that Google can crawl your website again.
Google does not stop here, but has gone one step further in helping webmasters by providing the Fetch as Google option, which you can see in this screenshot. You can use this option to ask Google to expedite the process of crawling and indexing. We will discuss it in detail later in this chapter, as it is also in the submenu.
2. Crawl Stats: This section gives data about how Google has been crawling your website over the past 90 days. It reports the number of pages crawled per day, kilobytes downloaded per day, and time spent downloading a page (in milliseconds). A spike in any of the graphs is OK; what should be a matter of concern is a continuous drop in a graph, in which case you should check your website for crawl-related issues and fix them.
3. Fetch as Google: This is a tool provided by Google which can be of great use. Its first benefit is that it lets you see how your web page looks when Google sees it. To do this, enter the part of the URL you want Google to fetch for you and then click the “Fetch” button.
While entering the URL, make sure you enter it completely, including symbols like “-” and “/”. If you miss anything, it may return an error. In the example below you can see the same URL twice: one says Complete and has a green status, the other says Redirected and has an error status. The only difference between the two URLs is the “/”, which I missed out when trying the Fetch as Google option.
Now, to see how Google sees your page, click on the URL with Complete status. This is what your page looks like to Google.
There is another button here which says “Fetch and Render”, and there is a difference between the two. “Fetch” displays the web page as Google views it, while “Fetch and Render” displays the page as browsers show it to your audience. So if you want to see how the audience sees your website, choose this option.
Now let us come to the most important use of this tool: “Submit to Index”. I am sure that, after looking at the screenshot, you must be wondering what this little button does. It does wonders for webmasters: you can use it to submit your content to the Google index.
If you have created content and want it indexed faster by Google, you can do a Fetch as Google for that particular URL and then use the “Submit to Index” button. This speeds up the process of indexing. The option is also beneficial if a crawl error prevented a page from being indexed: fix the issue, tell Google you have fixed it by clicking the URL link in “Crawl Errors”, and then do a Fetch as Google to confirm the page is OK now. If everything appears fine, simply use “Submit to Index” to get it indexed.
“Submit to Index” has two options. You can either choose to index a single page, which allows you to submit 500 requests per month, or choose the URL and its direct links, which will try to index that URL and the pages directly linked from it; you can submit 10 such requests per month.
There is no fixed time frame for the URL to be indexed, but it will certainly be faster. One caveat: not all your pages will be indexed, as indexing depends on many factors and on the Google algorithm.
4. Robots.txt Tester: This tool tells you what Google is reading from your robots.txt file. If anything is blocked there, you can find it here. It will also tell you about any errors and warnings Google found in this file.
It also provides a tool at the bottom of the page for you to test whether a given URL is being blocked or not.
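The URL-testing part of this tool can be sketched with Python’s standard-library robots.txt parser. The rules and URLs below are hypothetical examples, not taken from any real site:

```python
# Minimal sketch of a robots.txt URL check, like the tester at the
# bottom of the Robots.txt Tester page. Rules and URLs are made up.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot may fetch the blog page but is blocked from /admin/.
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))   # True
print(parser.can_fetch("Googlebot", "https://example.com/admin/login")) # False
```

This is the same question the tester answers for you: given the rules Google read from your file, is this specific URL allowed or blocked for a given crawler?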
5. Sitemaps: This option allows you to add and test a sitemap in Webmaster Tools, and also view the data related to it. You can see all the Sitemaps added, whether by you or by someone else managing your site. It also tells you how many pages have been found using the sitemap and how many of those have been indexed, and it reports any issues with your Sitemaps.
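To make the “pages found” figure concrete, here is a minimal, hypothetical sitemap and a short script that counts the URLs listed in it, which is essentially what the report tells you on Google’s side:

```python
# Parse a tiny example sitemap and count the URLs it lists.
# The sitemap content and URLs are hypothetical.
import xml.etree.ElementTree as ET

sitemap = """\
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
  <url><loc>https://example.com/blog/first-post</loc></url>
</urlset>
"""

root = ET.fromstring(sitemap)
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]
print(len(urls), "URLs found in sitemap")
```

Whether all of those URLs end up indexed is a separate matter, which is exactly why the report shows “found” and “indexed” as two different numbers.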
6. URL Parameters: This tool is helpful when you have a dynamic website, such as a shopping site, where the URL parameters keep changing for a given page. Different visitors may be looking at the same page under different URLs, depending on what stage they are at. This can confuse Googlebot: it may treat these different URLs as different pages and consider them duplicates. Using this option, you can define your URL parameters and tell Google to ignore different forms of the same URL.
You should use this option with caution, as it may remove some of your pages from search results. Watch the video from Google to learn more about URL parameters and how to configure them.
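The duplicate-URL problem above can be illustrated in a few lines of Python. The parameter names (`sessionid`, `sort`) are hypothetical examples of parameters you might tell Google to ignore:

```python
# Two URLs differing only in ignorable parameters point to the same page.
# Dropping those parameters yields one canonical URL for both.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

IGNORED = {"sessionid", "sort"}  # hypothetical parameters to ignore

def canonical(url: str) -> str:
    """Remove ignored query parameters, keeping the rest in order."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED]
    return urlunsplit(parts._replace(query=urlencode(kept)))

a = "https://shop.example.com/shoes?sessionid=abc123&sort=price"
b = "https://shop.example.com/shoes?sessionid=xyz789"
print(canonical(a) == canonical(b))  # True: both URLs are the same page
```

Declaring a parameter as ignorable in the URL Parameters tool has the same effect from Google’s point of view: the variants collapse into one page instead of being counted as duplicates.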
I would also like to discuss two more menus in this chapter: “Security Issues” and “Other Resources”.
Security Issues: This is where you can find out whether Google has detected any security issues on your website, or whether your website has been hacked.
Other Resources: This is where you will find the resources Google provides to help webmasters. These tools are not part of your Webmaster Tools account, but can be accessed separately. To know more about them, you can follow my blog posts on these tools.
This brings us to the end of the chapter. As always, I request you to ask questions and make the best use of the forum.