Site owners and marketers have always wanted comprehensive information about how Google views their sites, and Google Webmaster Tools provides assistance in exactly this regard. These tools help elevate a site’s visibility and offer useful options for managing your web site’s listings.
To be able to use the Webmaster Tools, you need an active Google account, which you can access at https://www.Google.com/webmasters/sitemaps.
Step 1: Verify your site
Before going any further, you must tell Google which sites you want to manage.
Once you enter the prompted URL, you will be asked to verify that you control the site. You can either upload a blank HTML file or paste a META tag into your home page’s HEAD, whichever you prefer. This is simply how Google ensures you have access to a domain before showing you any of its information.
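As an illustration, the META-tag option usually means pasting a single line into your home page’s HEAD. Google generates the exact tag and token for your account, so the name and content below are only placeholders:

```html
<head>
  <!-- Placeholder: Google supplies the actual verification tag and token -->
  <meta name="google-site-verification" content="your-unique-token-here" />
  <title>Your Home Page</title>
</head>
```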
Step 2: Control Dashboard
Now that your web site is successfully verified, you can access its profile. To do so, click the Manage: http://www.yourdomain.com link in the Webmaster Tools Dashboard. You will then be taken to the “Overview” screen, which contains a high-level summary of the most relevant information about your site, along with links to the individual Webmaster Tools.
Step 3: Check different statistical tools
Once you have reached the Overview screen, click the tool links in the left-hand navigation to view various statistics about your website.
The Diagnostic tools report any errors or problems Google’s crawlers encounter while accessing pages on your site. The mobile web crawl tool, for example, reports errors on pages created specifically for viewing on mobile phones. More concretely, if Googlebot visits your site and hits a missing page, a problem in your robots.txt, or any other issue, the error will surface here right away.
The errors usually encountered are:
1. HTTP errors
2. Not found
3. URLs not followed
4. URLs restricted by robots.txt
5. URLs timed out
6. Unreachable URLs
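If you keep your own crawl or server logs, a tiny script can pre-sort results into the same buckets before you compare them against Google’s report. This is just an illustrative sketch; the function and its categories are our own shorthand, not part of Webmaster Tools:

```python
def classify_crawl_result(status_code, timed_out=False):
    """Roughly bucket a crawl result into the report's error categories."""
    if timed_out:
        return "URLs timed out"
    if status_code is None:
        # No response at all, e.g. DNS or connection failure
        return "Unreachable URLs"
    if status_code == 404:
        return "Not found"
    if status_code >= 400:
        return "HTTP errors"
    return "OK"

print(classify_crawl_result(404))   # Not found
print(classify_crawl_result(500))   # HTTP errors
print(classify_crawl_result(None))  # Unreachable URLs
```

Running this over your own logs gives you a quick sense of whether the problems Google reports match what your server is actually returning.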
Of course, we want Google to encounter as few errors as possible. It is therefore a good idea to check these reports regularly and rectify any problems, making sure fresh content is crawled without interruption.
The most important ingredients for site marketers are the statistical tools, which give access to a number of reports, some of which are described below:
Top Search Queries
Top search queries are the queries that most often returned pages from your site in a Google search, while top search query clicks are the queries that actually drove traffic to your site (based on the number of clicks to your pages). Once you access the report, informative tables show both side by side.
Often overlooked, this report also shows you where in the SERPs your site was listed for particular search terms.
What Googlebot Sees
The What Googlebot Sees report is an interesting way to learn how others link to you and how relevant those links are to your page content. It covers the phrases used in external links to your site (anchor text), the keywords in your site’s content, your inbound links, and finally the actual content on your pages.
The crawl stats report focuses more on PageRank values than on actual crawling statistics. Google scans your site and shows you how its PageRank is distributed on a scale of Low, Medium, High, and “not yet assigned”. This can be relevant information, but also somewhat demoralizing, since the vast majority of pages on most sites will always register as Low on the PageRank scale.
The most useful feature of this report is the bottom table, which clearly indicates the specific page on your website carrying the highest PageRank. The table is updated each month, allowing you to measure the success of each link building campaign.
Through this invaluable tool, you can check your site’s Google indexing status simply by using search operators; shortcut links are provided for the supported operators.
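For example, one operator you can also run directly in an ordinary Google search is site:, which restricts the results to pages from a single domain (the domain below is a placeholder):

```
site:www.yourdomain.com
```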
This is an excellent tool if you are not using an RSS feed manager to publish lists of site updates. The page displays the number of users who have subscribed to your feeds using certain Google products, making it easy to find stats on your feed subscriptions.
This effective tool lets you see which pages on your site have links pointing to them, whether internal or external. If you are striving for more popularity and exposure for your site, it is very helpful for keeping track of your link popularity.
Google Sitemaps has been an important ingredient of the Webmaster Tools for some time now. Once uploaded, these XML-based sitemap files help Google find all the pages you would like it to crawl.
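The sitemap format itself is public: a minimal XML sitemap following the sitemaps.org protocol looks something like the file below (the URL, date, and values are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yourdomain.com/</loc>
    <lastmod>2008-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

Save a file like this as sitemap.xml at your site root and submit it through the Sitemaps section of the Webmaster Tools.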
Most of the relevant data you need can be found in the Tools segment of the Webmaster Tools, which is broken down below:
The robots.txt file is simply a set of instructions for Googlebot and other spiders, granting or restricting access to certain parts of the site for indexing or public viewing. Use this tool even when you make only minor changes to your robots.txt, as a simple alteration can cause Google and other search engines to drop many pages at once.
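For reference, a very small robots.txt might look like this; the paths are placeholders, and note how a single extra character in a Disallow line could block far more than intended:

```
# Applies to all crawlers, including Googlebot
User-agent: *
Disallow: /private/
Disallow: /tmp/
```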
Manage site verification
One can call this a very flexible tool: you can go back and change your site’s verification details as and when required. It is very useful for verifying and protecting your site.
Set crawl rate
The rate at which Googlebot crawls depends on many factors. This tool reports Googlebot’s activity on your site and lets you adjust how often it visits. The main aim is to attract Google to your site more often so that it requests more and more pages; once that happens, fresh content is indexed and ranked more quickly in the future.
Increased interaction from Google shows up as spikes in the graphs, which help you measure the results of your link popularity efforts.
Set preferred domain
If you are fed up with seeing the same URL in different formats in your reports, or if you are apprehensive about canonical URLs affecting your optimization and links, then the preferred domain tool is the perfect remedy. With this tool you can direct Google to display your URLs according to your preference.
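Note that the tool only changes how Google displays your URLs. If you also want visitors and other search engines consolidated onto one form, a common complementary approach (assuming an Apache server with mod_rewrite enabled, and with yourdomain.com as a placeholder) is a 301 redirect from the non-www to the www form:

```apache
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yourdomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [R=301,L]
```

The permanent (301) redirect tells search engines that the two forms are the same site, so links to either form count toward one URL.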
Enable enhanced image search
This exciting tool lets you opt in to enhanced labeling of the images on your site. It is an effective way to improve image relevancy, though you should be a bit cautious about using it so that you don’t lose out on traffic.
When pages that no longer exist are still sitting in the Google index and causing problems, this tool is the most helpful way of resolving the matter. It can help you remove such entries from the index after you have gone back and made the relevant changes to your robots.txt file.
Even though the information above should help you use each tool in the Webmaster Tools line effectively, there are other ways to make the most of your interaction with Google. These tools are among the most effective ways of keeping tabs on your site.
The more you use these tools, the better you will get at making sense of the data available. Ultimately, they will teach you more about your site and reveal data you never knew existed.