The XML Sitemaps protocol was started by Google. Today, the standard is maintained at sitemaps.org and has been adopted by Yahoo! and MSN Search. What good does an XML Sitemap do for you?
XML Sitemaps
Unlike the sitemaps on websites that visitors browse, an XML Sitemap is written for search engine crawlers. Today's web pages go well beyond simple HTML, and crawlers have trouble recognizing content built with newer technologies, so some pages get missed. With an XML Sitemap, a crawler can index pages it would otherwise never reach. Some people believe an XML Sitemap improves search result rankings, but it does not: when pages that had never been crawled get indexed thanks to the sitemap, they start showing up in search results. That can look like a ranking improvement, but the XML Sitemap itself does not affect rankings.
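To make this concrete, here is a minimal sketch (in Python, with hypothetical example.com URLs) that writes out a small sitemap file; the element names and layout follow the sitemaps.org protocol.

    # Write a minimal XML Sitemap listing a few pages of a (hypothetical) site.
    # The structure follows the sitemaps.org protocol: a <urlset> root containing
    # one <url> entry with a <loc> element for each page the crawler should index.
    SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
      </url>
      <url>
        <loc>http://www.example.com/hidden-page.html</loc>
      </url>
    </urlset>
    """

    with open("sitemap.xml", "w") as f:
        f.write(SITEMAP)

The file typically sits at the root of the site (for example, http://www.example.com/sitemap.xml) so that search engines can find it after you submit it.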
Eye of the Crawler
First, look at your pages from a crawler's point of view. The quickest way is to use a plain text editor, such as Notepad. Instead of selecting a file in the "open file" dialog, type the full URL of the page you want to see, including the "http://" prefix. What you then see in the text editor is close to what the crawler sees. If a link does not appear there (you see JavaScript instead of an actual URL, accept-cookie messages, and so on), the crawler cannot "crawl" your site any further from that point. If links or pages are hidden from the crawler in this way, an XML Sitemap helps the crawler find and index those pages.
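If you would rather script it than use a text editor, a short Python sketch like the one below fetches a page's raw HTML, which is roughly what the crawler sees before any JavaScript runs; the URL is a hypothetical placeholder.

    import urllib.request

    # Fetch the raw HTML of a page, without executing JavaScript or accepting
    # cookies -- roughly what a search engine crawler sees.
    url = "http://www.example.com/"
    with urllib.request.urlopen(url) as response:
        html = response.read().decode("utf-8", errors="replace")

    print(html)

    # If a link only exists in JavaScript (e.g. window.location = ...), it will
    # not appear as an <a href="..."> in this output, and the crawler cannot
    # follow it.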
Update
On top of that, an XML Sitemap can include each page's date of last modification and how frequently it changes. This information helps search engines decide when to re-crawl your pages. If your site has a large amount of frequently updated content, newly added pages will be indexed more quickly. But you need to re-submit the updated XML Sitemap to the search engines. Everyone submits their very first XML Sitemap, but quite a few people forget to re-submit it after updates. So don't forget!
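As a rough sketch, here is a Python snippet that writes an updated sitemap with the optional last-modification and change-frequency elements and then "pings" Google to re-submit it; the sitemap URL is hypothetical, and the ping endpoint shown is the one Google offered around the time this was written.

    import urllib.parse
    import urllib.request

    # An updated sitemap with the optional <lastmod> and <changefreq> elements,
    # which tell search engines when a page last changed and how often it changes.
    SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/news.html</loc>
        <lastmod>2009-05-01</lastmod>
        <changefreq>daily</changefreq>
      </url>
    </urlset>
    """

    with open("sitemap.xml", "w") as f:
        f.write(SITEMAP)

    # Re-submit ("ping") the updated sitemap so the search engine knows to re-read it.
    # Google's documented ping endpoint at the time took the sitemap URL as a parameter.
    sitemap_url = "http://www.example.com/sitemap.xml"  # hypothetical location
    ping_url = "http://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap_url, safe="")
    urllib.request.urlopen(ping_url)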
© May, 2009