Google introduces Sitemaps
Google is at work again, this time with its new service called Google Sitemaps aimed at webmasters, writes Puneet Mehrotra.
Hi, it's me, Google!
Google, the inexhaustible, is at work again - this time with its new service called Google Sitemaps aimed at webmasters. The programme allows website owners to submit the pages they'd like to have included in Google's web index and also to indicate how often their websites need to be revisited. Google Sitemaps is intended for all website owners, from those with a single web page to companies with millions of ever-changing pages.
From the horse's mouth
According to Google, Google Sitemaps is an experiment in web crawling - a way of informing Google's crawlers about a site's content. By placing a Sitemap-formatted file on a web server, a webmaster can enable the crawlers to find out what pages are present and which have recently changed, and to crawl the site accordingly. Shiva Shivakumar, engineering director at Google, wrote in a post on Google's blog, 'We're undertaking an experiment called Google Sitemaps that will either fail miserably, or succeed beyond our wildest dreams, in making the Web better for Webmasters and users alike. Initially, we plan to use the URL information webmasters supply to further improve the coverage and freshness of our index. Over time, that will lead to our doing an even better job of delivering more search results from more websites.'
The programme mechanics
Webmasters need to create XML files containing the URLs they want crawled, along with optional hints about those URLs, such as when each page last changed and how often it changes. There are a number of ways to create a Sitemap; one is Google's Sitemap Generator, downloadable from Google Code. The webmaster hosts the Sitemap on their server and tells Google Sitemaps where it is. When the Sitemap changes, Google is notified so that the newest version is picked up.
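To make the mechanics concrete, here is a minimal sketch of how such an XML file could be generated in Python (the language the Sitemap Generator itself is written in). The URLs and dates are made-up examples, and the namespace shown is the one from the later public Sitemap protocol; the element names (loc, lastmod, changefreq) correspond to the "optional hints" the article describes.

```python
import xml.etree.ElementTree as ET

def build_sitemap(pages, ns="http://www.sitemaps.org/schemas/sitemap/0.9"):
    """Build a Sitemap XML tree from a list of page dictionaries.

    Each dictionary must have a 'loc' (the page URL) and may carry the
    optional hints 'lastmod' (date of last change) and 'changefreq'
    (how often the page changes).
    """
    urlset = ET.Element("urlset", xmlns=ns)
    for page in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page["loc"]
        if "lastmod" in page:
            ET.SubElement(url, "lastmod").text = page["lastmod"]
        if "changefreq" in page:
            ET.SubElement(url, "changefreq").text = page["changefreq"]
    return ET.ElementTree(urlset)

# Hypothetical site pages, for illustration only.
pages = [
    {"loc": "http://www.example.com/",
     "lastmod": "2005-06-03", "changefreq": "daily"},
    {"loc": "http://www.example.com/about.html",
     "changefreq": "monthly"},
]

# Write the file the webmaster would host on their server.
build_sitemap(pages).write("sitemap.xml",
                           encoding="utf-8", xml_declaration=True)
```

The resulting sitemap.xml is what gets placed on the web server and pointed out to Google Sitemaps.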
Control: Google Sitemaps will allow webmasters to exert a higher level of control over the pages indexed by the search engine. Webmasters can indicate which pages they would like indexed first, which pages have been modified, when the changes happened and how frequently modifications are made.
Update frequency: A dynamic site gains tremendously through this programme by getting its newest pages crawled and listed in search engines.
Coverage and freshness: A pay-off of supplying frequency settings is better coverage and fresher content in the index.
Cost-effective: Google isn't charging for this service. However, webmasters need the technical know-how and server support for it: knowledge of Python and the ability to install and execute scripts on the web server are necessary. In addition, Python version 2.2 must be installed on the web server.
It's a definite step forward. Almost every webmaster and search engine optimiser has faced challenges with getting sites crawled by search engines, and Google Sitemaps could well be a move in the right direction. One of the biggest challenges will be coping with spam. Shiva Shivakumar said, 'Google Sitemaps will either fail miserably, or succeed beyond our wildest dreams.' Let's hope it succeeds beyond our wildest dreams.
Puneet Mehrotra is a web strategist at www.websitepromotion.in and edits www.MidnightEdition.com. You can email him at email@example.com.