Everyone's favorite social bookmarking site, del.icio.us, appears to be serving different content to search engines than to its users. Every page on del.icio.us appears to include the following directive:

<meta name="robots" content="noarchive,nofollow,noindex"/>

This should, technically, remove all of their pages from Google's index.  Performing a site:del.icio.us search on Google, however, returns several million pages.  I checked Google's cache and didn't see the meta noindex tag.  Examining robots.txt reveals the same pattern: a request with a standard user agent is served "Disallow: /", but a request with a Googlebot user agent is not.
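One way to verify this kind of cloaking yourself is to fetch the same page twice with different User-Agent headers and compare the robots meta directives in each response. Here's a minimal sketch of the comparison step (the helper names are my own; the network fetch is left as a comment since it depends on the live site):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives from any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "robots":
                self.directives.update(
                    d.strip().lower()
                    for d in attrs.get("content", "").split(","))

def robots_directives(html):
    """Return the set of robots directives declared in an HTML page."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.directives

# To test for cloaking, fetch the same URL with urllib.request twice --
# once with a browser User-Agent, once with "Googlebot/2.1" -- and
# compare robots_directives() on the two responses.
page = '<meta name="robots" content="noarchive,nofollow,noindex"/>'
print(sorted(robots_directives(page)))  # → ['noarchive', 'nofollow', 'noindex']
```

If the two fetches return different directive sets, the site is cloaking its robots meta tag by user agent.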

They also nofollow all of the outbound links on their site, but the nofollow attributes don't appear to be cloaked, so del.icio.us won't pass any link value.
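Checking that the nofollow attributes really are present for every user agent is just as easy: sort a page's links into followed and nofollowed buckets and compare across fetches. A minimal sketch (class and field names are my own, not anything from del.icio.us):

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Splits <a> tags into links that can pass value and links marked nofollow."""
    def __init__(self):
        super().__init__()
        self.followed = []    # links a crawler may pass value through
        self.nofollowed = []  # links carrying rel="nofollow"

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            rels = attrs.get("rel", "").lower().split()
            bucket = self.nofollowed if "nofollow" in rels else self.followed
            bucket.append(attrs.get("href"))

audit = LinkAudit()
audit.feed('<a href="http://spam.example/" rel="nofollow">x</a>'
           '<a href="/popular">y</a>')
print(audit.nofollowed)  # → ['http://spam.example/']
print(audit.followed)    # → ['/popular']
```

If the nofollowed list looks the same whether you fetch as a browser or as Googlebot, the nofollow attributes aren't being cloaked.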

My guess is that this is a method of combating spam.  They're probably hoping that the meta tag will deter potential spammers from saturating their site with crap.  It might also weed out robots that are scouring the web for valuable places to inject spammy links.  It doesn't seem like a particularly effective tactic, but I suppose every little bit helps.

Thanks to Emil Stenström for pointing this out.  Apparently this was covered on SEO Speedwagon in August, but I hadn't heard mention of it until now.