From 1998 through 2005, Google and webmasters were like nerds and dating - separated by an unwritten code that kept them far apart. Aside from the occasional comment by GoogleGuy on the forums or by Matt Cutts at an industry show, there was virtually no connection between the search giant and those who marketed sites and content through its medium.

In the summer of 2005, Google started its Sitemaps program, which eventually evolved into Webmaster Central. Today, that organization comprises hundreds of individuals around the world, working on webmaster relations, webmaster problems, and webmaster tools. It's the de facto model followed by Microsoft and Yahoo!, and in many ways it epitomizes the legitimacy SEO has achieved since our darkest days as web outcasts.

However, there is one faction infuriated with what Google has built and desperate to stop the advancement of the Webmaster Central programs - in particular, those portions that require site owners to verify their domains with Google. Who are they? Google's upstart competition. I won't name the specific sources of these opinions, but they include more than a few of the major startup players in web search, as well as international engines and stealth-mode operators. What's the problem? I'll explain.

From the beginning of the web until about 2006, any company that wanted to build a web search engine had everything it needed at its disposal (at least, so long as it had the funds to make the effort happen). Despite the massive technical and financial requirements, nothing stood in the way of a creative group crawling the web, then using that data to construct an index and an algorithm to rank results. It was all there on the Internet, just waiting to be fetched. Google changed that with the introduction of Sitemaps and the later growth of Webmaster Central.
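To make that concrete, the fetch-parse-follow loop at the heart of any crawler is simple enough to sketch in a few dozen lines. The snippet below is a minimal illustration in Python, nowhere near a production engine; the seed URL, page cap, and breadth-first strategy are all my own illustrative assumptions:

```python
# A bare-bones breadth-first crawler using only the standard library.
# The seed URL and max_pages cap are illustrative assumptions.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects the href targets of anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed, max_pages=50):
    """Breadth-first crawl from a seed URL; returns a {url: html} index."""
    frontier, seen, index = deque([seed]), {seed}, {}
    while frontier and len(index) < max_pages:
        url = frontier.popleft()
        try:
            with urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip unreachable pages and non-text responses
        index[url] = html
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)
            # follow only http(s) links we haven't queued before
            if urlparse(absolute).scheme in ("http", "https") and absolute not in seen:
                seen.add(absolute)
                frontier.append(absolute)
    return index

if __name__ == "__main__":
    pages = crawl("http://www.example.com/")
    print(f"Fetched {len(pages)} pages")
```

The point is that everything this sketch consumes is openly fetchable - which is exactly the property the verified-submission model takes away.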

Today, there's a wealth of data no startup engine can access without hacking Google's servers. Information like:

  • Sitemaps - The lifeblood of many sites' crawlability and accessibility. Information about URL canonicalization and even URL exclusion is now available exclusively to the search engines that receive a site's sitemap. Even Yahoo! and Microsoft are severely disadvantaged, as webmasters are less likely to submit sitemaps to them. (A minimal example of the format follows this list.)
  • Geo-targeting information - Google allows you to set geography and language for subdomains, subfolders, or entire sites inside the Webmaster Console, which is fantastic for websites and SEOs, but it gives Google a clear competitive advantage over any player wishing to enter the search space.
  • Crawl Rate Control - For sites that want to allow faster crawling, or those that need to reduce the load on their servers, crawl control is an option inside Webmaster Tools - and another piece of site data that only Google gets to see.
  • Domain preference - Even though it's a simple thing, the ability to set a preferred domain (url.com vs. www.url.com) gives Google a decided advantage.
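For reference, the sitemap file format itself is an open standard (documented at sitemaps.org); a minimal, hypothetical sitemap with a single URL entry looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- loc is the only required element; the others are optional hints -->
    <loc>http://www.example.com/</loc>
    <lastmod>2008-06-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Nothing about the format is secret; the competitive sting is in the submission channel, since the file and the diagnostics that flow back travel through a verified account rather than sitting out in the open for any crawler to fetch.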

Any piece of information that's submitted behind a verification protocol, rather than being openly accessible to crawlers, hinders competition and reinforces the market leader's dominance. Suddenly, in the last two years, the barriers to entry for building an effective web-wide search engine have skyrocketed.

Caveat - I personally do not believe that Google built these tools with the goal of eliminating potential competition, and in fact I'm a huge fan of Webmaster Central, particularly the folks who work as analysts and evangelists to the webmaster community. However, it's certainly valuable to consider the effect a service like this has on the broader technology market, and how it squares with Google's pledges of openness and "non-evil."