As promised, the Crawl Test Tool has been released to the public.  I wrote a blog entry about a month ago highlighting some of the tool's features, but in a nutshell, this is what it does:
This tool tests how accessible your site is to search engines; it can help you quickly diagnose potential crawling issues and gives you an overview of your site's search friendliness. The tool will spider the URL you enter as well as all the internal links on that page (max 50 per report). For each spidered URL, it will examine the following: whether it's indexed in the major search engines, the last time Google spidered the page, the HTTP status code, the primary keywords on the page, the meta description, and the number of internal links on the page.
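For the curious, here's a rough sketch in Python of what a crawl-test-style check looks like under the hood: fetch a page, collect its internal links, and report each link's HTTP status and meta description. This is just an illustration using the standard library, not the tool's actual code; the starting URL, the User-Agent string, and the page cap are placeholder assumptions.

```python
# A minimal sketch (not the tool's actual implementation) of a
# crawl-test-style check using only Python's standard library.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import Request, urlopen

MAX_PAGES = 5  # free reports spider 5 pages; Premium reports spider 50


class PageParser(HTMLParser):
    """Collects href values and the meta description from one HTML page."""

    def __init__(self):
        super().__init__()
        self.links = []
        self.meta_description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and attrs.get("href"):
            self.links.append(attrs["href"])
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.meta_description = attrs.get("content")


def fetch(url):
    """Return (status_code, body_text) for a URL."""
    req = Request(url, headers={"User-Agent": "crawl-test-sketch/0.1"})
    with urlopen(req, timeout=10) as resp:
        return resp.status, resp.read().decode("utf-8", errors="replace")


def crawl_test(start_url):
    status, body = fetch(start_url)
    parser = PageParser()
    parser.feed(body)
    host = urlparse(start_url).netloc

    # Keep only internal links (same host), de-duplicated, capped at MAX_PAGES.
    internal = []
    for href in parser.links:
        absolute = urljoin(start_url, href)
        if urlparse(absolute).netloc == host and absolute not in internal:
            internal.append(absolute)

    print(f"{start_url}: HTTP {status}, {len(internal)} internal links")
    for url in internal[:MAX_PAGES]:
        try:
            code, page = fetch(url)
            p = PageParser()
            p.feed(page)
            print(f"{url}: HTTP {code}, meta description: {p.meta_description!r}")
        except Exception as exc:
            print(f"{url}: failed ({exc})")


if __name__ == "__main__":
    crawl_test("https://example.com/")  # placeholder URL
```

A real crawl test does much more (index checks, last-crawl dates, keyword extraction), but the loop above captures the basic fetch-parse-report cycle the tool automates for you.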
Since then, two things have changed.  First, the Crawl Test no longer uses the Yahoo! Term Extraction API to retrieve relevant keywords; instead, it uses data from our in-house Term Targeting Tool.  Second, unless you are a Premium Member, you can only run one report per day, and each report will only spider 5 pages.  Premium Members can run as many reports as they like, and each report will spider up to 50 pages.  I'd also like to point out that a good way to test the spiderability of the important pages on your site is to run a crawl test on your sitemap.

I've got a sample report available that I ran on our Web 2.0 Awards page.  As always, suggestions and ideas are welcome.