Monday, February 13, 2006

Common keywords and crawl-friendly robots

We often post tips for driving traffic to your site with targeted AdWords ads, and we've mentioned how Sitemaps can help users reach your business through the Google search results by making your pages more crawler-friendly. As an advertiser or site owner, you may be interested in how some of the recent changes to Sitemaps can help you.

Here's Shaluinn from the Sitemaps team with the details:

Sitemaps now shows you a list of the most common words in your site's content and in external links to your site. This can help you build and refine your keyword lists to target your audience. It also gives you additional information about why your site might come up for particular search queries.

To make your site more crawl-friendly, don't forget about your robots.txt file. As Matt Cutts explained in a recent blog post, "The robots.txt file is one of the easiest things for a webmaster to make a mistake on." Your robots.txt file is a sort of note to web crawlers that tells them which pages on your site they may and may not crawl. Having an over-protective robots.txt file can limit how many of your pages get indexed in Google and discovered in the natural search results. With the newest release of Google Sitemaps, you can get a report that shows you Googlebot's view of your robots.txt file. This way, you or your webmaster can find out if you've accidentally blocked Google from crawling parts of your website.
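To see the difference between a reasonable rule and an over-protective one, here's a sketch of a robots.txt file (the `/private/` path is just a hypothetical example, not a real directory):

```text
# Lets all crawlers visit the whole site except one directory
User-agent: *
Disallow: /private/

# By contrast, an over-protective file like the following
# blocks every crawler from every page on the site:
#
# User-agent: *
# Disallow: /
```

A single stray `Disallow: /` is the classic mistake the new report can help you catch, since it shows exactly how Googlebot interprets the file you're serving.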

And, if you're not quite clear on how Sitemaps can help your site, here’s a neat new Sitemaps success story about how it's helping a fellow advertiser, ApartmentRatings.com -- check it out!
