Googlebot crawl rate control and more

I admit I’m a little too hopeful that people will do good things, but it looks like Vanessa Fox and the rest of the Google Webmaster team actually are working hard to bring valuable tools to webmasters. The [latest offering](http://googlewebmastercentral.blogspot.com/2006/10/learn-more-about-googlebots-crawl-of.html) is the ability to choose from three different crawl speeds. Unfortunately, the fact that the setting only lasts for 90 days, so you have to keep coming back to reselect your preference, is pretty lame. _At least_ send out an optional notification email to remind webmasters that their crawl rate is about to revert if it’s set to anything other than “Normal.”

Other new features include the ability to opt your site’s images into the [Google Image Labeler](http://images.google.com/imagelabeler/) project. Good that you can increase the quality of your images’ metadata (which hopefully means increased traffic from Google Image Search); not good that you can’t see what metadata they end up producing. It’d be a nice feature if Google “donated” the keywords the Image Labeler generates back to the image owners. I’m _not_ suggesting that Google publish the metadata for all the images in Image Search — only that they tell the image **owners** what metadata was discovered.

The charting looks cool but really only has value when you are experiencing problems — otherwise it’s just for the “ooh and aah” factor.

Altogether, a nice clutch of new functionality. Thanks, Google Webmaster team!