Update Jan. 5, 2009 - I've actually significantly changed my mind about this advice. Yes, sitemaps can still obfuscate architectural issues, but given the experience I've had over the last 1.5 years, I now recommend to all of our clients (and nearly everyone else who asks) that sitemaps be submitted. The positives in terms of crawling, indexation and traffic simply outweigh the downsides.
It sounds bizarre, almost counterintuitive, but many of the best minds in the world of SEO appear to be rallying around the idea that submitting a feed to Google Sitemaps and Yahoo! Site Explorer is actually a terrible idea. The logic behind the practice is simple, if you follow the steps:
- Without sitemaps, a search engine visits your site's pages through links on and off the site, indexing and ranking those pages it deems worthy of being indexed and ranked.
- When a search engine crawls your site and fails to index particular pages, you have a signal from the engines that those pages lack the necessary components for inclusion, be they architectural, link strength, content-related, etc.
- Sitemaps enable search engines to crawl and index pages that they might not ordinarily include in a normal crawl.
- If a page lacks link juice, internally or externally, or has content that engines wouldn't normally deem worthy of indexing, the Sitemaps process may overlook these weaknesses and include those pages in the indices anyway.
Why are so many SEOs recommending against submitting a feed to Sitemaps? Because the data you get from the natural crawl IS valuable, and submitting an XML feed (or any other format) can cause that natural process of inclusion to be lost. If a page isn't accessible, doesn't carry enough link juice, or lacks unique, valuable content, I want to know about it, and the Sitemaps process can be a hindrance.
Enormously large sites, which will see more value from having thousands of extra pages included in the index even if it means a few stragglers are left behind, are exempt from this rule. So, too, are sites managed by a team that is unwilling or unable to take the time to detect and fix omissions.
Don't get me wrong - Sitemap submission is an amazing and valuable tool in a webmaster's arsenal, but it's also one that should be wielded with careful knowledge of the side effects. I'd love to hear your opinions on the subject.
BTW - Full credit to DaveN for first introducing me to this idea back in Chicago.
I get the logic there, Rand, but I totally disagree. Here's my reasoning:
1. It is better to get pages in the index now instead of later
2. If a page does not have the link juice needed to be indexed, it will hit supplemental and fall out (in Google, anyway) even if it is included in the sitemap
3. You can track how effective a page is after it gets in the index by tracking how many SE referrals it gets
4. Just being in the index does not guarantee ranking for anything, but not being in the index means not ranking for anything, period
And your points 3 and 4 are reason enough to include a sitemap feed IMO.
Is this a test to see how many "mozletts" are out there swallowing up anything you post? ;-)
Mozletts? Awesome! :) That's SO going to be my word of the day!
I agree with rmccarley. You'll find out if your pages have problems when they hit supplemental, and then you can work on those particular pages, just as you would under your approach of looking at what is not being indexed and working on that. I admit that indexing useless pages via sitemaps is itself useless, but the vast majority of the time they are a good way of getting your pages indexed so you can then look at what needs attention.
He's got a point (4 points, actually), Rand.
Hey rmccarley, can you tell me how to track SE referrals for the pages on my site?
Thanks
Sorry, but I'm going to flat out disagree with you on this one. There is no quicker and easier way to tell what pages Google is looking for and not finding. Come in on a client project that's been live for a few years through a few revisions, and you have no idea what they moved around and lost? Sitemaps will tell you. Do they have some funky setup or architecture in place they neglected to tell you about? Sitemaps will tell you. Has the site all of a sudden gone AWOL in the SERPs? Sitemaps can give you clues to what happened.
For a pristine site straight out of the box that's set up correctly you may not need it; however, for stuff that's been out there and may have issues, it's a no-brainer.
And I'm about as anti-google-borg as they come.
Are you talking about uploading the XML sitemap file, or logging into Google Webmaster Tools (formerly Sitemaps) and looking at reports?
I do not see any extra information gained by providing the XML file. It just tells me there are a certain number of pages. All the error reporting and such is still provided without the XML file.
Unless of course I'm missing some reports somewhere. If I am, please let me know.
Graywolf - I think you might be missing what I'm suggesting, which isn't to use the data and tools provided by Webmaster Central, but rather, simply, to avoid submitting a Sitemap to them unless you're sure the positives outweigh the possible negatives.
Interesting concept... may have to give it a ponder. Hopefully someone can come to the table with some test data.
But are we crossing crawling and indexing?
I guess I would consider the sitemaps as possibly helping to get pages crawled sooner, but I can't imagine the SEs saying, "hey, this is a really crappy page that we normally wouldn't index, but since you submitted a sitemap, we will."
I'd also have to wonder at what size it would or wouldn't matter, or whether the value of not submitting would show itself. I almost wonder if the exception for the larger sites might be where the real benefit you describe would be felt... those sites that are so large that the SEs might have a hard time crawling all of the pages might benefit more from identifying the pages that aren't being crawled and then working backwards to find the break in the system.
Pretty much exactly the point that I was trying to make, but much more eloquently put...
I always tell clients that if their site is built correctly, they don't need to submit a sitemap. I've also never submitted a sitemap for any of my own sites, aside from those used to test out their systems and capabilities. Agreed on huge dynamic sites being a possible exception, but in general, not being fully crawled and indexed is an architecture or linking problem, and submitting a sitemap only masks the symptoms of the problem and doesn't fix the cause.
If I went to a doctor who masked my symptoms, but didn't cure my disease, I'd be pissed and I think any seo client would have the right to feel the same way about site architecture problems or link problems that were treated in the same manner.
I must disagree also.
Sitemaps gets the spiders crawling your site much faster and gets your pages in the index much quicker.
I also disagree with your #2 statement very much. Each spider that comes to your site comes in from a different link and will crawl differently. Also, they are not necessarily going to do a full crawl every time so you want to encourage them to come back much more quickly in order to get indexed faster. Just because they don't index particular pages on a crawl does not necessarily mean that "those pages lack the necessary components for inclusion, be they architectural, link strength, content-related, etc."
As for your #4 point, I am curious if there is any research or anything that shows that "Sitemaps may overlook ... weaknesses and include those pages in their indices." I have never heard that using sitemaps increases a page's value in the eyes of the SEs, only that it gets them to the page quicker.
I just feel like you would want to provide every opportunity for your sites to be indexed further by Google and push your content to them instead of waiting for them to come pull it out on their own.
I think rmccarl, identity, graywolf and Aaron Pratt all have great points.
I use sitemaps to ID new content. When you have a new site with few backlinks and no history (internet footprint), you want to notify Google as quickly as you can that new content is yours. Scrapers and feeders come in from your RSS feed to duplicate content, so you want Google hitting the same channels to show that you originated it.
Try this, login to webmaster tools, look for a site that hasn't been crawled in a month because it has little activity, check and click "resubmit selected" and watch Google recrawl your content in minutes.
It's true that if a site has enough incoming links it will get crawled in the same way I mention above, BUT the idea to "not submit" comes from a lack of understanding of how Google works.
I tend to disagree. Even with small sites, where link structure is very clear, sometimes Google doesn't index all the pages, so a sitemap is a valuable tool. Especially if a page is deemed not so important to the search engine, but is in fact important to the site owner.
Oh great...
I was on webmaster radio a couple of hours ago saying I feel sitemaps is a must.
You can’t beat it for speed of indexing.
I am always adding new content: 12 pages this week, and that's in 11 languages, translated by native speakers, making almost every instance unique simply by the nature of native translation. So I guess my client's site may fall into the enormously big site category, but I never think of it that way. I guess I'm like a parent with a fat kid; it may be big but it's still a little baby to me.
Some, not all, of that new content is time-sensitive.
I'm not sure I would be comfortable advising my client to remove sitemaps and still feel confident that the time-sensitive content will be indexed within the window of opportunity.
So while I see the potential positive aspects of gleaning more data from the natural crawl, I can't see dumping sitemaps anytime soon.
P.S. Fat kids need love too
A VERY interesting post Rand - thanks.
We have sites with upwards of 30,000 pages (jobs & articles primarily) and we now submit sitemaps to Google for all of them (haven't got round to Yahoo yet).
What we have found is that it hasn't helped us rank any better (but then we didn't think that it would). What it does do is allow us to get a better grip on crawl time, error pages being found by the bots, and so on. With such huge sites, having a single page summary of all the errors and issues (data which our web analytics has, but doesn't display in such an easy to find manner) is pretty damn useful.
We may well reconsider this position, but for the moment I think the benefits outweigh any negatives for us.
Hello, I'd like to share my experience of sitemaps with you. I have a small site (170 pages) that I launched with a sitemap. After 3 months I was frustrated with only a handful of pages showing up in the SERPs, so I removed the sitemap and voila: within 48 hours 90% of my site was crawled and my traffic rocketed. I used a Google sitemap generator tool to create the sitemap in the first place; it probably wasn't correctly formatted, but it still verified OK with G. The moral of the story, IMO: don't use a sitemap unless you A) know exactly what you are doing and B) have to. BTW, I'm new here - joined after seeing Rand at SES London.
Welcome, Howard. Hope you enjoy and benefit from the daily blog for many years to come.
Just to give some insight, I don't believe sitemaps will hinder a website in any way. I did a little research and found that the regular crawl methods and sitemaps appear to be completely separate.
Taken from Google's FAQ section on Sitemaps:
"A Sitemap provides an additional view into your site (just as your home page and HTML site map do). This program does not replace our normal methods of crawling the web. Google still searches and indexes your sites the same way it has done in the past whether or not you use this program. Sites are never penalized for using this service."
Source: https://www.google.com/support/webmasters/bin/answer.py?answer=40318&topic=8465
"No. Using Google Sitemaps will not influence your PageRank; there will be no change in how we calculate the ranking of your pages."
Source: https://www.google.com/support/webmasters/bin/answer.py?answer=35177&topic=8465
I don't think that Rand is saying that they'll hinder the crawl, the gist is that they'll hide potential issues with site structure that, if identified, could improve the overall site.
I'm going to disagree with this idea.
For one thing, as you pointed out, it's not scalable. At what point does it become advisable/inadvisable? What if you start out small and then grow? How do you know when you are at the point where you should have the site map? Probably too late, for most people.
Second, when my livelihood is on the line, I'm a belt and suspenders kind of guy. I also don't take chances with other people's livelihoods, either, like clients.
It doesn't make sense to create a scenario where you know things will fail just so you can see where the failure points are. In engineering, that's considered to be an inherently unstable system.
That kind of checking for issues should be done before launch. It's like making a car then checking the brakes by taking it for a spin in the city.
These are live sites, not crash test dummies.
Say No to Crash Test SEO!
Ian ;)
Ian,
I was with you until "belt and suspenders"...
Not only bad fashion, but technically, conflicting in their approach... again, technically.
But I still gave you a thumb ;)
I'd have to agree that test sites for testing purposes would be one thing, but live sites are another matter.
At the end of the day, you have to ask yourself whether it really matters how you got there, as long as what you did couldn't necessarily hurt you.
Now if Say No to Crash Test SEO isn't a great title for an SEO article, I don't know what is. Let me know when you write that one! :)
This post is too old to read. We have better posts on SEOmoz on the same topic.
https://www.seomoz.org/blog/xml-sitemaps-guidelines-on-their-use#jtc81642
All of this complaining for a post that has 12 thumbs up and no thumbs down??
This is almost like a forum now....
Softplus, I've used your gsitecrawler for a couple of years - excellent work!
That's a very good point - why are people not voting as well as commenting? It does suggest that the mechanism to decide 'most popular' posts may not be getting used as much as it should...
Yes, I agree with you. Sometimes I submit sitemaps for static sites, but they can really have a bad effect on an e-commerce site. I always suggest my clients use an HTML sitemap for visitors, but not an XML sitemap, for an e-commerce site.
I do agree with this in part, but as somebody who writes >5 blog posts per day, I find sitemaps help them get indexed faster than waiting for a "real" crawl.
That's totally the point. They are indexed, but did that do you any good? Would it have been better if they weren't indexed, so you would then know they need more link juice? Rand, good point, and a thumbs up for David as well.
I don't use sitemaps. The simple reason is they are not needed, and I have never seen them be of value - actually for the precise reason Rand mentioned. Being indexed should not rely on the sitemap. A site should have enough value and a good structure (link structure and otherwise) that Google can find every page and then index it if there is enough value in Google's eyes. If not, I highly doubt a sitemap will do you any good. Not sure about you, but my pages being indexed do nothing for me unless they are ranking.
BTW, I find if you put thought, energy and excitement into your site, you don't need the help of a sitemap or anything else. It will virtually grow on its own. Now, I am not saying you don't need to work on your site; I am just saying that a good site takes care of itself. I find it is valuable to not use artificial means at times. You can learn a lot going natural.
I also use html sitemaps on-site which help Google crawl the site. That basically does the job for me.
I use Google's Python script to generate sitemaps every few hours for klikhir.com. If I didn't have sitemaps, Google would crawl my pages less frequently. I need those docs indexed as fast as possible. Sitemaps is my approved back door to Google Search.
Klikhir produces around 150 new articles every day and those articles compete against duplicates spread around the net as digital swag. I want mine at the top of the search result (for free) and sitemaps are the only way forward.
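For anyone curious what such a generator actually does: it just walks the site's URLs and writes out the sitemap XML. Here is a minimal Python sketch of that idea (the URLs and the get_article_urls() helper are hypothetical stand-ins, not Google's actual script):

    # Minimal sketch of a sitemap generator, in the spirit of Google's
    # sitemap_gen.py. The URLs and get_article_urls() are hypothetical
    # stand-ins for a real site's data source.
    from datetime import date
    from xml.sax.saxutils import escape

    def get_article_urls():
        # Stand-in for a database query listing the site's pages.
        return ["https://www.example.com/", "https://www.example.com/articles/1"]

    def write_sitemap(path="sitemap.xml"):
        today = date.today().isoformat()
        with open(path, "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in get_article_urls():
                f.write("  <url>\n")
                f.write("    <loc>%s</loc>\n" % escape(url))
                # Ideally use each page's real modification date here.
                f.write("    <lastmod>%s</lastmod>\n" % today)
                f.write("  </url>\n")
            f.write("</urlset>\n")

    write_sitemap()

Run on a schedule (say, via cron every few hours), this keeps the file fresh between crawls.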
Definitively, sitemap submission is a two-speed issue:
Natural crawling IS, for sure, really important data for a small/medium website that can monitor all its data.
But as soon as the website becomes a media property with a large number of pages, articles and dynamically generated lists, the natural crawl is only the first SEO step; the sitemap construction strategy becomes the way to reach more and more visitors, by giving the search engines pages that follow the important long-tail rule.
Natural crawl for high-juice consideration.
Sitemaps crawl for low-juice, unlimited audience !
Interesting Rand. The sites I've dealt with have all been small enough and easy enough to deal with that I've never had to submit a Sitemap to fix crawling issues. Any issue that has come up has been easily handled on the site. I've always assumed taking care of the issue on site was the better approach or at least made submitting a Sitemap unnecessary.
However this is the first I've seen a recommendation saying that seeing the issue in the first place is the best reason for not submitting your Sitemap feed.
And I want to slap my forehead and say d'oh, of course that makes sense. Seems so obvious after it's pointed out doesn't it?
Submitting the feed might fix a temporary problem, but it can hide the fact that there's a much bigger and more permanent problem and I want to know about those problems more than the temporary ones.
Glad to be stirring up lots of controversy; it's definitely good to have people thinking about this issue. I think that for those folks who've used the submission process successfully and are happy with it, I wouldn't take it away. However, I've seen with sites big and small, inclusion in the index and supplemental questions arise when Sitemaps are used for crawling. For some of our clients, it doesn't matter as much, for others, avoiding the "am I just in supplemental because of Sitemaps?" question is valuable.
My plan going forward is always going to be building or fixing a site so that the natural crawl can do all the work. Sitemaps is, in my opinion, sometimes used as a crutch for poor architecture and poor inlinking. That's not to say it should never be used, but as with any tool, webmasters need to be aware of what they're using it for and what side effects may accompany it.
Absolutely. Sites should be developed up front for accessibility... whether you are thinking of human users or SEs, both benefit from the same efforts. Anyone thinking that sitemaps are a "magical elixir" for poor site design should quit drinking the stuff.
So far this seems like more of a theoretical discussion... it would be interesting if anyone could bring some kind of research data to the table.
and this is one of the best, most interesting discussions for a long time!!
identity - I wouldn't have brought up the subject if we hadn't been able to compare the results first. We do have a client with a large site who's submitted certain sections to Sitemaps and has others which have only been crawled naturally. We did find value in the non-submission (or, at the least, less confusion about supplemental, etc.), but for large sites, it's often not worth the time to examine at such a granular level.
Rand,
I didn't mean to imply otherwise... I know you always have an eye for data that backs up a theory.
I'm just wondering how much of this has been substantiated (by SEOmoz as well as anyone else reading this) and might be shared, that might also detail some of the issues or concerns brought up here about size and other site specifics.
Also how much of this has been looked at and examined specifically with this in mind versus just an observation.
I completely agree with your statement that sitemaps aren't a substitute for a crawlable site (though for one of my clients they're a temporary fix while the crawlability of the site is fixed). To me, a good crawlable site is a given - it's one of the basics. Then once you have that, the question is whether to use sitemaps or not.
I am a bit confused about "am I just in supplemental because of Sitemaps?" Isn't the crux of what you're saying that if you don't use sitemaps, poorly linked pages won't get indexed at all? If that's the case, then isn't being in the supplemental index better than not being indexed? And isn't it just as easy to notice that a page is in the supplemental index as it is to notice that it hasn't been indexed? Or are you saying that a page that would get into the main index winds up supplemental because a sitemap was used? (I thought that was typical only when a poorly constructed sitemap was being used.)
I use sitemaps and every other tool that Google puts out there, just because it's Google
My business site's been around for a while and I only put about 10 pages of that up there, but for new sites just to get them indexed it's worth throwing at least the home page in there and getting it crawled quickly.
I can't imagine Google penalizing anybody for using their tools. You're helping make their life easier and they should do the same for you.
Actually, I have no idea what I just wrote - I just want to see the "premium member" button under my name :)
While interesting, I kind of wonder whether or not this post is emblematic of the fact that people are looking at the Google Sitemaps tool like it's a hammer, while it's actually a screwdriver. In my experience, the value of a tool like Google Sitemaps has not been its ability to resolve crawling and indexing issues, but rather the diagnostic info it provides.
Personally, I never put a Google sitemap on a site until after I'm certain that the site can be completely crawled by search engine spiders. For me, I like the sitemap because while it provides some insight into crawling issues, its real value has proven to be helping me identify markets that I am not targeting but should be, as well as helping me to identify issues with site content.
Not to mention, this diagnostic value has increased substantially with the backlink information site maps is now providing.
I agree with Timoon. Sitemaps are valuable when working on very large sites (i.e. 1000+ pages). Yes it's important to address navigational issues that hinder the crawling process. But it's equally important to ensure that your site's most valuable content is discovered as soon as possible.
There are methods to ensure that not all pages (including low-quality pages) are indexed by Google when using sitemaps, which would achieve quality and speedy inclusion in Google's index.
In my opinion, sitemap creation is just another thing bad SEOs need to justify their fees, along with putting in meta keyword tags and fancy reports. If you do your job well, your client's bank account is the only report they need.
Rand, I totally agree with you. I've used Google Webmaster Tools for all of my personal sites since it was first available and I've never submitted a sitemap for any of them. All of my sites are being spidered regularly and performing well despite the lack of a submitted sitemap.
I do have a dilemma with my firm's SEO clients. Currently, I do submit sitemaps to Google for most of my SEO clients (mostly small business websites). There is a lot of perceived value to our clients when we say that we will submit a sitemap to Google for their site (the ongoing drama of perceived value vs. actual benefit). This is a service offering that we've contemplated changing on several occasions, but I've never had any other opinions to back me up. Thanks for posting about this issue.
If Google cannot crawl the site, I would venture to say that the pages you are submitting will be considered orphaned. If there happen to be a lot of pages that behave this way, it could be much worse. On the other hand, really large sites with massive amounts of new content benefit from submissions like this, especially sites with decent authority.
Great Link Bait Rand :-)
Don't expect Rand to know everything about SEO, because just like you and me he evolves with the current changes. If you also notice, people do not like being tricked (linkbaited). This is why places like Digg are not kind to the "SEO": they look at us like sleazy car salesmen. I have to admit, when it comes to Digg, I agree with them even though they are a bunch of elitist lamers.
Absolutely. Good points.
One of the best things about the SEOmoz community as a whole but especially demonstrated by the company itself is that SEOmoz isn't about spreading the SEO gospel from upon high, but encouraging, prompting and promoting the SEO discussion.
I think Rand especially would be the last person to say "don't question what I say, just do it."
and now all we need to round out the day is for Matt to do a Digg letter encouraging them to use an xml sitemap!
Yes, this is quality linkbait indeed (used in the positive sense of the word). It gets people thinking and talking about an idea from a different perspective than what they were preached to this whole time. I'm thoroughly enjoying this conversation.
In our recent experience, launching lots of smaller sites, sites with SiteMaps and without really don't have a big difference in crawl-time. The deal-maker (or breaker) is age of domain/links to domain, as well as standard usability issues.
You do not need to submit a full sitemap of your website. Not so long ago I submitted a sitemap which only contains the articles and no navigational or menu pages. It works fine.
Google almost instantly indexed all the articles and sends traffic to the article pages. Without this sitemap, the crawler would have started within the menu structure/listing pages and probably would have left out the keyword-rich articles.
I would not recommend putting your navigational pages in a sitemap, but you should use it for your content pages.
Use PHP to give a new date every time the crawler revisits your sitemap, but also include a last-update date from your database with every entry. It seems that sitemaps without a last-update date on every URL do not work as well.
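To illustrate, a single entry with a per-URL last-update date looks something like this (the URL and values are hypothetical; changefreq and priority are the protocol's other optional per-URL fields):

    <url>
      <loc>https://www.example.com/articles/42</loc>
      <lastmod>2007-02-20</lastmod>
      <changefreq>weekly</changefreq>
      <priority>0.8</priority>
    </url>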
Hi!
It's three years since you've posted this and I'm curious whether your thoughts or the technology have changed.
Rand, I am with you.
LOL - Barry is like the SEO version of Obi-Wan Kenobi.
Use the force Rand. Now if I could just get the voice of Barry to influence my SEO actions... :)
Just don't be lured by the dark side.
I guess I see xml sitemaps as a road map for the SEs...
you can give 'em a map, but there's no guarantee that they'll follow it,
and whether they do or don't, it's no indication on the condition of the road.
Just seems like the possible negatives of this approach far outweigh the positives. Perhaps you're correct and not submitting it may reveal issues, but will you ever really be sure anyway or will you be making subjective guesses?
I would think submitting and then not seeing pages indexed or being moved to SI would be more valuable in identifying areas to focus on.
I am new to this field. After reading all the comments, I've come to the conclusion that a sitemap is a good thing to work with. It can be helpful for a newer site, but only when you know how to use it in your favor.
Unless you're well abreast of all the consequences of submitting a sitemap, don't do it.
I think sitemaps are excellent tools for large websites (i.e., 1000+ pages). For smaller sites, they may not be necessary if Google crawls them daily. But in cases like one comment here - where the author wants to get his articles indexed before other sites to avoid duplication - smaller sites (with very frequent posts) can also go for sitemaps.
I have a big website with more than 1 million pages, and I had a tough time creating a sitemap. I used a few applications, but all of them would crash handling that number of files.
Then I managed to generate a sitemap (with sitemap indexes and satellite sitemaps) with GSiteCrawler's SQL update. Thanks Softplus.
Back to the topic: I think there is no harm in creating a sitemap, but if you can manage without sitemaps, well and good. Sitemaps do have advantages, though.
What are your opinions on the timing and frequency of submitting sitemaps? Wouldn't that be based on how frequently your content changes and how big your site is? Or is there anything else?
What an incredibly timely post ... I was in the process of generating a sitemap this morning when I came across this article in my feeds. (I moved to a new domain so figured I best do "the sitemap thing.")
An excellent discussion. It leaves me wondering what is the best thing to do, since my site is relatively small (ballpark of 1,000 pages). Since my traffic has been down since the move, I am inclined to go ahead and add the sitemap, maybe for a week or so, to ensure a thorough crawl (hopefully). After that I suspect I will take it down. Still undecided. I expect I will refer to this post several times while I am trying to figure out what is best for me.
I wouldn't bother with adding sitemaps. Try getting some deep links, especially to category pages (pages that link to many related pages). That will send PR deep into the site...
You've got a good point there; however, I'd go for a sitemap if the site is more than 3 levels deep, but the deep links are also a valid idea.
XML sitemaps don't help ranking -- I think they only help with indexing. If you are just trying to get new content indexed, then you can use the "feed and ping" method which will generally work even if you don't have any links to the new content.
If some links are too deep then it might be worthwhile to make a different internal linking structure.
I had been struggling with deciding whether I should submit to G Sitemaps or not. Till last year, we were enjoying good indexing and ranking. However, after we implemented our sitemap on G, things turned bad.
Now, I know that perhaps it was not all due to the sitemap and there are other factors that might be affecting our rankings (such as IBLs etc) but I always felt that it was probably a mistake to include our sitemap on G.
Well, part of the reason I came across this post is that I was looking for the reason why Google didn't recently crawl my webpages that have been included in the sitemap file. Now it seems that submitting the sitemap may not be a good idea after all. Anyway, I will give Google another week or two to see if there is any improvement. Otherwise, I will remove the sitemap file from the root directory.
I was waiting with my site, without a sitemap, for over a year, and Google was showing only 300 pages. Then I created a sitemap, and two weeks later all my 3,000 pages were indexed.
So I believe in sitemaps.
I agree with you. I think that sitemaps were created by Google primarily to debug Googlebot. Google engineers can use them to gather data on what pages Googlebot finds on sites vs. what pages humans say are on a site. Then the engineers can look at specific cases and give the spiders a tuneup.
The webmaster control panel is the carrot on the stick. There is valuable data in the control panel, but I don't think that it's necessary to submit the sitemap.
Hi, help me understand this: what is the real difference between Google crawling your site's "sitemap page" and the XML feed? It is still getting to your website from either channel, and it's going to look at the page the same way. So if you have pages in your XML feed and on your sitemap page, this so-called "link juice" would be the same either way. If you have pages that are in the XML feed but not part of your site, and without enough "link juice," you're right - they're called "doorway pages." Just looking for a mutual understanding. lilbit
Hi Rand, I've always gone along 100% with your thinking on sitemaps, as it mirrors my experiences.
A question - what do you think of this? 11/04/2007: Yahoo, Google, MSN and Ask reported this morning that they have joined together to make sitemaps auto-discoverable via the robots.txt file. As long as you tell the search engines where your sitemap lives (in the robots.txt file), they will be able to find it and use it. Do you think it's worth a try?
Thanks.
PS: You are my SEO Hero and Guru!
Could you provide your source on that? THANKS
Danny's got it all here - announced at SES NY.
To add the location, just put a line like this anywhere in your robots.txt file: Sitemap: LOCATION-OF-SITEMAPS-FILE
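So a complete robots.txt using autodiscovery might look like this (the sitemap URL is a hypothetical placeholder; per the spec, the Sitemap: line takes a full absolute URL and sits independently of any User-agent block):

    User-agent: *
    Disallow:

    Sitemap: https://www.example.com/sitemap.xml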
Thanks Ciaran, great resources with all the links you need to get it setup.
Rand, I read your blog and I got a chill. Allow me to disagree with most of the people. Submitting the sitemap to Google was one of the very worst things I have done in my short life as a site owner. Briefly, we went from ranking at the top (for a few valuable keywords) down to 300th place. How about that.
We run our business in a controversial area (gambling); however, as a fan of white hat SEO, I just found us treated like black, nasty spammers. I have been fuming since the beginning of April, when I noticed the downhill slide.
At the end of the day, to rank 30th or 300th is really irrelevant - if one doesn't place in the first 5 results, the website is just filling up space on the server.
I suspect I know why this happened: instead of re-submitting a sitemap for www.mysite.com, I submitted a sitemap for mysite.com - tragic.
We enjoy a brilliant position at Y!, MSN and other smaller engines, but it's a shame what's going on.
We tried to get some feedback from Google - but that's more difficult than getting an appointment with God himself. For a few days I have been posting on the webmaster groups at Google Groups to get clued in about what to do. The Google webmasters forum is the type of place where one should be prepared to get one's site slaughtered, but after all the tests for possible black-hat tricks (we follow white hat and only white hat)... we got the advice to give up on the sitemap and improve the wording of the page titles.
The sitemap was actually erased from the server after I realized this was only a beta implementation at Google.
Just a note to chil-uno: using sitemap autodiscovery in robots.txt seems not to be working very well for many people.
Thanks for your post - as usual, very interesting and useful.
After getting rid of the sitemap... nearly 10 days later, I am happy to announce we are back to NR1 (as we were before submitting the silly XML).
Sitemaps (for the time being) no thanks.
From Google's perspective, one of the things a sitemap XML can do is increase its awareness of the number of URLs within a site.
From here it is fair to say that some ways of measuring ranking factors for a site would be based on the "size" of the site.
From there potentially some factors could be "diluted" across more pages which could affect rankings.
You are more likely to hear about the mishaps and disasters rather than success stories.
I have built around 300 sitemaps, and uploaded and monitored the effect of around 50. I have seen more positive effects than "negative ones", but truly IMHO addressing the resultant issue is better SEO.
I realised this about a year ago when lots of reports of sitemaps being "bad" surfaced.
Meaning, should a webmaster who has already submitted a sitemap in Google Webmaster Tools now delete it?
Found this post searching on XML sitemaps, and if I should bother to use them. Great reading, thanks for sharing this knowledge, it is still valuable even though it is 7 years old!
Generally we only look at sitemaps if the sites are more than three levels deep in the navigation.
This is a fantastic string of ideas and thoughts about Sitemaps. Thanks for sharing all your opinions & insights! I see the Sitemap development as simply a shift in the search engine process.
I will preface, that I am not an SEO expert, but I am curious what others think about the following statements:
1. Everything SEO is reliant on the status quo that search engines (mainly Google) deem important - based strictly on the process they use to collect info & rank it accordingly. Basically if Google changes their overall process, SEO tactics will change (for either indexing or ranking).
2. Sitemaps have been an initiative championed by Google and Yahoo. I would guess there is a clear benefit to them: The crawls become much, much more efficient in identifying sites and content.
3. What if the majority of site owners eventually embrace Sitemaps - aren't you just losing out?
From a bottom-line, cost-analysis view - there is a huge benefit for Google/Yahoo to have the Internet organized for them i.e. through a pre-determined Sitemap structure. You are doing some of the lifting for them...
So the question is if Google/Yahoo want it, why fight it?
Umm... the XML sitemaps we are referring to are meant as a "supplement" to the normal "crawling" process. Bots go from link to link, whereas an XML sitemap feeds all the pages to the engine at once. Thus the engine gets a sense of how many pages it should have found via its normal crawl, and hopefully will adjust accordingly.
In addition, I also hope this will help the engines understand dynamic pages better, such as duplicate pages caused by session IDs and so forth.
The point is, so long as links remain a vital part of search engine indexing and ranking processes, the XML sitemap will always be supplemental to the traditional link-to-link crawl.
Also, XML sitemaps are meant primarily for indexing purposes, not for ranking purposes; they have nothing to do with tweaking content or links for optimal rankings.
To answer your question about why we fight something the search engines want: search engines are a great way to get free traffic, or even paid traffic. The problem comes when they become hypocritical.
Google thinks it can tell us how to sell links, while we don't tell Google how to sell links; they can sell a link any which way they want. But if we want to stay indexed and ranked by them, we need to put a "nofollow" attribute on our links.
So there are a lot of things like that, we SEO's fight. And fight hard.
Does anyone know why Google Sitemaps would say that my site has had 124 pages indexed for more than a week, when they don't show up under any circumstance in Google's index, even with site:mysite or site:www.mysite?
If we had submitted a Sitemap, and decided to delete it now, is it possible to reverse the damage done?
The sitemap.xml for my blog is not showing all blog posts the way the feeds do. I fully agree with you, Rand.
Hello,
I'm new to SEOMoz and stumbled upon this post.
My experience (I blogged about it there: https://blog.logeek.fr/2008/1/30/about-sitemaps-and-their-submission-to-search-engines) is that sitemaps are very useful for small or starting content sites at least.
Like many commented here, with a sitemap pages get indexed in days (I got visitors through search engines roughly 10 days after each post).
This is true for Google but also for Yahoo, MSN and ASK.Com, FWIW.
cheers!
Thibaut
If you have an RSS feed and ping Google Blogsearch you can get indexed in less than 10 minutes. No need for the XML sitemap to get content indexed rapidly...
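For anyone who wants to script it, the ping is just an HTTP request to the blog search ping service. A rough Python sketch follows; the endpoint and parameter names reflect the service's documentation of the era and should be treated as assumptions, and the URLs are hypothetical:

    # Rough sketch: ping Google Blog Search after publishing a post.
    # Endpoint and parameter names are assumptions based on the
    # service's published ping API; URLs are hypothetical.
    import urllib.parse
    import urllib.request

    params = urllib.parse.urlencode({
        "name": "My Blog",                            # blog title
        "url": "https://www.example.com/",            # blog URL
        "changesURL": "https://www.example.com/rss",  # feed that changed
    })
    urllib.request.urlopen("http://blogsearch.google.com/ping?" + params)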
Thanks for the tip PocketSEO - I wasn't aware of that.
Unfortunately this doesn't seem to work for me. I pinged blogsearch 26 hours ago and my content doesn't appear when searching it with the blogurl:myblogurl query.
Is there something to do to ensure fast indexation? I'd be interested.
cheers
Thibaut
Did you write new content, or just ping an older post. Contact me through my site's contact form if you would like.
(duplicate comment)
The articles were written before my test. I went to the Google Blog Search help and now I understand:
"Why aren't my oldest posts listed ?" => Since Blog Search indexes blogs by their site feeds, it will only include items that have been posted since it started indexing a given blog. For most blogs, that will be around June 2005, or the time at which you submitted your blog for inclusion. We are working on ways to include older posts as well.
It makes sense.
I like to ping as many valid traffic sources as possible. That way the engines will pick up a post or a page from multiple sources besides my ping.
Do a search for something like "blog post ping list". WordPress comes default with a few. I think I have around 100 for my blog. I probably should go through that list. Thanks for the very indirect reminder.
Rand,
This is total speculation! Where is your data? Have you ever verified this?
It's been almost a year now since the post.
I have always had great success with getting more pages indexed using an XML sitemap.
Hands down, xml sitemaps get more pages indexed which means more pages with links to other pages.
Especially for deep content and for clients who have web dev teams who won't change or who are slower to change the website.
-Bart
Bart, I think it would be a great idea for you to author a post that positions your point of view and experience with Sitemaps. You could author it in YOUmoz or on your own blog. I think it would make for a really compelling read.
Thanks Rebecca. I am totally behind you, 100%. I have so many case studies I want to release.
I think I finally have permission to do so from OrangeSoda, Inc., provided I do not mention our client and in no way reveal who they are by keyword rankings, etc.
I'll have to think about this more. Cheers,
-Bart
I still think it helps to submit for faster rankings. However, I'm interested in reading the other posts on SEOmoz too.
Hey guys, bad news: I got anxious and decided to submit my sitemap, then I came across this writeup. I feel terrible now. Is there any way to reverse the effects of submitting a sitemap?
Hi! This post is actually from 2007 and is no longer valid. Thanks for reminding us we should add that to the top of the post. It's a good thing to submit your sitemap, so no need to stress about that.
Any update for 2012? :) With the popularity of Google / Bing Webmaster Tools increasing and their tool sets becoming more powerful, I would think that submitting a sitemap is becoming more important.
Perhaps we should be using other tools / techniques to analyze page weaknesses rather that just 'lack of indexation'.
If the site has a good structure and is a static site, it does not need a sitemap.
You guys are missing one vital element:
Time to index for modified and new content
Set up a site with 1000 pages and change a single page, add a link on that page to a new page, etc. How long will it take for Google to find that change? How long will it take to get the new page indexed without Google finding your link to it?
Now add Sitemaps: you can specify a new change-date for your page, then ping Google to pick up the modified sitemap file. Now go and watch your server logs.
There is no way (other than ESP) that Google could otherwise go directly to a modified page, crawl it and get it indexed as fast.
If your site is getting a full deep crawl every couple of hours then it won't really matter if you're a bit quicker with Sitemaps -- but whose site gets a full deep crawl that often? Of course, if your new/modified pages are always linked from the root page it'll be fairly quick as well, but that's not always the case.
Sitemaps are not a replacement for a crawlable site - but they can help the bots to perfect their crawling techniques, save your bandwidth and at the same time concentrate on the important pages. Neat stuff!
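For reference, the ping John mentions is a single HTTP GET against Google's sitemap ping endpoint. A minimal sketch (the sitemap URL is a hypothetical placeholder):

    # Minimal sketch: tell Google that a sitemap file has changed.
    # The endpoint is the one Google documented for sitemap pings;
    # the sitemap URL is a hypothetical placeholder.
    import urllib.parse
    import urllib.request

    sitemap = "https://www.example.com/sitemap.xml"
    ping = ("http://www.google.com/webmasters/tools/ping?sitemap="
            + urllib.parse.quote(sitemap, safe=""))
    urllib.request.urlopen(ping)

The idea is to regenerate the sitemap with fresh lastmod dates first, then fire the ping so the crawler fetches the updated file.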
John, we all know you have a vested interest in Google Sitemaps:
https://gsitecrawler.com/
https://gsitecrawler.com/en/copyright/
Anybody who's familiar with the 'Google Webmaster Help Group' at Google Groups is very familiar with the Softplus user profile.
Hi Somelad
Sure, I've been doing Sitemaps since the start. Does that make me incompetent on the subject?
Set up a test and then tell me what happens. Set up 2 sites with 1000 pages each, get them fully indexed, run one with Sitemaps and the other without, modify a lower-level page and add a link to a new page on that lower-level page. Watch your logs, watch the search results.
... then tell me I'm wrong. I've done this test 4 times in the last 9 months and every time I do it there's no competition at all -- with Sitemaps it goes straight to the modified page and indexes it (and crawls the link as well), without Sitemaps it's the normal slow crawl.
It wasn't always like this, but it has been for quite some time. I don't expect you to believe me, but if you don't test it for yourself (and again, it's very easy to test, assuming you get 2x 1000 pages indexed), it doesn't make sense to dismiss it as hearsay.
Of course a lower time-to-index won't make your rankings jump, but it can be worth the effort.
Give me one reason why Google would not want to optimize their crawls - to go straight to the modified pages instead of re-re-re-crawling pages that haven't been changed in months.
Miaow!
somelad - I had no idea of softplus' site until you mentioned it - he made a point, didn't promote his site, what's the problem?
I'd have to disagree as well - at least in most of the cases I've encountered.
Your argument depends on the assumption that you'll notice if something doesn't get indexed. Most of the time I know I probably wouldn't notice, and if I was spending that much time looking at the performance of a particular page, there are other tools and metrics I can use to figure out if the page has problems.
I personally like the fact that I can specify the priority of each page. I don't have hard data on it, but if users land where I want them rather than where the search engine feels like putting them, then that's a good thing that I'd lose if I didn't use a sitemap.
My other reasons for using sitemaps have been covered pretty well by other people, so I won't bother to repeat them, but suffice it to say I see a lot of advantages to using sitemaps and very few drawbacks.
Bottom line - sure, there may be some sites where it makes sense to not use a sitemap, but I think they're the exception, not the rule - in most cases, a well-constructed sitemap is an asset.
I'm in agreement here. My rule-of-thumb is about 3000 pages. Anything under doesn't get a sitemap, anything above that usually does.
It seems like on one hand you say not to include sitemaps, yet in a post one down from yours it says, "10. Be sure there is a sitemap.xml at the root of your site." Linkbait story?
Sonicko,
I'm confused, where are you referencing the...
That post wasn't made by Rand, it was made by Guillaume
Well, regardless of whether it was made by Rand or one of his employees, you would think that there would be a unified message, especially on the same day and within an hour or two of each other. I know that Rand is putting forward an "idea" and his employee is putting forth the established way of going about it, but I think this is more about link bait than anything else.
Actually, Guillaume isn't an employee of SEOmoz, to my understanding, just has been granted moz-posting-power. I would say there have been plenty of varying opinions shared on the blog.
And like many things, there is very little black and white in SEO. Opinions and theory are in fact what make this such a fascinating industry... only in the extremes are there really any right or wrongs... and I'm sure there are plenty who would debate that as well.
That and I think this view is a fairly recent one, and clearly still open to discussion.
I apologize, I couldn't figure out how to edit my comment, so yes I saw after that he wasn't a regular employee. I am curious about the industry in general about posts like this, which are just ideas or simply link baiting opportunities for the writer (SEOmoz).
No worries. This is a very open community that expects people to question everything.
Part of your civic SEO duty!
EGOL, 2K, Guillaume, GeoffreyF67, and I, Brian are not SEOmoz employees, nor are they "regular" employees. They're just blog authors.
"They're just blog authors" ... *evil glare from blog authors everywhere*
Well, I'm sure they all have day jobs--just meant that they're *just* SEOmoz blog authors and not SEOmoz employees.
I agree, Visser. "Just" was a poor word choice here. No offense meant though, as you see from Rebecca's explanation.
If the page lacks link juice or has any problems, it isn't going to rank anyway (whether it being indexed or not). So why does it hurt to not upload a sitemap?
Once you have fixed the problems that you are having, then I think it is a great idea to upload a sitemap. But I don't see the advantages of having them index a page that won't rank.
We've found in our enterprises (~25 etailer sites with 15,000-150,000 pages each) that using GSitemaps actually decreased our indexed pages and overall presence on Google, even after fixing most of the mistakes exposed by the sitemap and resubmitting. It's somewhat useful as a diagnostic tool, but for the most part I fail to understand why it results in this kind of behaviour. We've tried it on several sites and eventually pulled it out and watched our indexed count (and in some, possibly unrelated cases, SERPs) rise considerably, sometimes even 3x.
At some point you have to ask why, and I'm still not sure. I don't think any big mistakes are being made at the page/linking level but in such a large, dynamically generated site it's hard to tell. Speaking personally though, we don't use them.
This type of response makes me wonder about the syntax used in people's sitemap files. Many of the auto-generated sitemap files I've seen don't handle dynamic URLs very well, so if you aren't really careful about your lastmod settings, you could easily hurt yourself by fooling crawlers into ignoring new pages that they think changed before the last time they crawled your site. Also, I think you need to be really careful with the importance setting. If you collapse your sitemap into one or two levels of importance, is that the same way the search engine would see it?
I personally think a sitemap is a great tool, but you have to be careful to use it as it's meant to be used. If you're lazy with it, ditch it and let the search engines do their job. It's just another tool in the box.
We've had positive experiences in Our Company with Sitemaps. We have used them when we've had sites with supplemental issues. We check the problems and fix them, then we submit sitemaps for the sites and voila! After a couple weeks all the pages for the sites come out of the supplemental (hell) Index.
I really see the benefit in submitting a sitemap when launching a new site. Quality, natural external links take time, and using a sitemap gives Google a bit of a "nudge".
Does it help with ranking? Not in my findings. Do I update my sitemap every time I make changes to a page? No. The quality of my site is what will bring GoogleBot and others back, but I love having the option of getting them there in the first place.
Interesting theory though. Will have to think on it some more.
And the whole map creation can be automated through a script; some CMSs are building in sitemap creation or providing add-ons.
I saw a site I launched with a brand new domain name and little to no IBLs do very well in the results out of the gate, which was especially important as it was a site for an event happening in a month's time. There was also near-daily content change, and I resubmitted the map daily. I think you might have seen a quicker response and possibly a bump early on with Google; perhaps being in the early beta stages it was given a bit more focus for testing, but we'll never know for sure.
If there ever was a boost, certainly now there is less impact like that, either because it has been tuned down or the volume of submissions has increased dramatically over the past year or so.
Once you submit a Sitemap, it will update by itself every time you create new pages to your site.
Are you sure? Maybe I'm wrong, but not to my knowledge, unless you have a script to do that or it is part of your CMS to update.
Otherwise it is just a static file sitting on your site. Every so often the SE may pull it in to determine if there have been any changes, helping to alert them that there may be new pages or updated content.
I imagine they check against some kind of "master crawl record," since I don't believe the file is necessarily accessed every time your site is crawled like a robots.txt file is... or should be.
But I'd welcome any clarifications or corrections if I'm offbase.
Identity, my mistake, you're right; it was not my intention to mislead the readers of this blog. I was confused by a thread I read in a forum, where they mentioned: "They (meaning Google) will download the sitemap on a periodic schedule, you can ping or resubmit it to hurry that process up, but it doesn't affect the actual crawling of the site at all." So I took it as if Google actually crawls the site looking for new pages and adds them to the sitemap.
Sitemaps and robots.txt... just good tools to control things; glad to have them. Of course, it depends on which kind of site you have.
I find myself wondering why you wouldn't submit a sitemap initially to get pages indexed and then drop the sitemap. Any harm in that? I find that Google indexes my new pages quicker with the sitemap than without. Of course, I'm no SEO expert, I'm just a web designer with some SEO knowledge—probably just enough to be dangerous. :)
That is what I am wondering. So should I delete my sitemap.xml files on all my sites? One of them has like 6 important pages and about 65 pages overall. The others are smaller. What about the other sitemap files? Should I delete those too?
I tend to use Sitemaps when launching a new site (both small and big) and when I need to change the URL structure.
From my experience with launching new sites, it tends to get sites who may not be able to grow links very fast initially indexed quicker -- which is always a nice first step to show clients! ;-)
But as a necessary SEO method for all sites? Nope...
Are the new backlink features of google webmaster tools not subject to submission of a sitemap?
If they are, then that's reason enough for me to submit a sitemap in 99% of cases.
"Are the new backlink features of google webmaster tools not subject to submission of a sitemap?"
Yes that is correct.
I think of webmaster tools as an interaction with robots, when you do something new, notify and they will come and take a look. :)
I also am a little concerned about the way this post was done, Rand. If you say "Top minds in SEO believe that...", does this make what "top minds" believe true? I understand that this is just a way to give more weight to what you also seem to believe is "true," but if it turns out to be incorrect, it will mislead.
Hmm, the sites that I tend to work with are large IYPs, with over 200k pages to be indexed. For those sites I've found that having a Google sitemap works well, in that it gets the site indexed quicker than those that don't have a Google sitemap set up. (as an aside, we do need to come up with a new name for them since they're no longer Google exclusives, and sitemap is just too generic).
For my blog, I've verified, but not added a sitemap. Which means that I can answer Special K's question - I see the backlinks, so you do not need to have a G-map set up for that feature to work.
Agree, and glad to see the SEs coming to a consensus!
I propose xml-maps, xmlMaps, xmlmaps... choose your flavor, or flavour!
I have a question along the lines of what Mac discussed in a post a few days ago -- it seems the "conventional" wisdom on this topic is that only very large sites should submit sitemaps, but what about a very small site with very few links? Isn't it a good way to get "found" by Google et al quicker than it would be found organically? I put up a new site a few weeks ago, very small, very few backlinks right now, and after submitting a sitemap it was indexed and producing results within a few days. If I hadn't submitted a sitemap, wouldn't it have taken Google much longer to find it? Thanks for the advice, Mitch.
Hi Mitch
No - a sitemap will not make Google crawl more (or earlier) than it would anyway. The sitemap will just help Google find the "best" URLs to crawl next. For a smaller site, the sitemap file will help Google recognize the new and changed URLs and concentrate some "crawler-priority" on getting those covered first. It won't change your rankings, but it will help you get your new and changed pages into the index slightly faster than without a sitemap file -- that's a good thing if you're covering some kind of current event and want to be found as one of the first people to write about it.
First, I'm glad there's a "Page Down" key because scrolling down this page is a chore ;-)
We have not seen any negative impacts from submitting a Google XML sitemap for at least a dozen sites, large and small. But perhaps that's because of the way in which we submit them.
For example, we have one client with 20k+ pages, many of which follow similar navigation patterns and page structure. In their case, we simply prioritized certain pages we felt were top level and would give value to the site overall. This included pages that get updated with reviews or have "real" information on them - many with dynamic or daily updates occurring.
So out of our client's 20k+ total pages, only about 4k get the go-ahead to be included within an XML sitemap.
I think many people miss the point here about sitemaps and exactly what should and shouldn't be submitted. If you very selectively choose top-level pages, it can often act as a boost or "nudge" that gets Google to crawl faster. And with no negative experiences, we're not going to stop using it yet.
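To make that selective approach concrete, here's a rough sketch in Python. The URL list and the is_top_level() rule are hypothetical stand-ins for however you decide which of the 20k+ pages deserve a spot in the feed, with the frequently-updated review pages getting a higher <priority> value:

    def is_top_level(url):
        """Hypothetical rule: keep shallow section pages and review pages."""
        path = url.split("://", 1)[-1].rstrip("/")
        return path.count("/") <= 1 or "/reviews/" in url

    # Hypothetical stand-in for the full 20k+ URL inventory.
    all_urls = [
        "http://www.example.com/",
        "http://www.example.com/hotels/",
        "http://www.example.com/hotels/reviews/grand-plaza",
        "http://www.example.com/hotels/print/page-9981",  # near-duplicate, skipped
    ]

    selected = [u for u in all_urls if is_top_level(u)]

    entries = "\n".join(
        "  <url>\n"
        f"    <loc>{url}</loc>\n"
        # nudge the frequently-updated review pages ahead of the static ones
        f"    <priority>{'0.8' if '/reviews/' in url else '0.5'}</priority>\n"
        "  </url>"
        for url in selected
    )
    print(entries)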
You have to play the game by the rules - that's the game set by Google, and skipping it is not a good idea. It's not Google's fault if you are not maintaining your page juice...
Thanks for that Rand. That is very helpful especially since my company has so many new sites rolling out in the coming months and this will be a great piece of info to use in improving our SEO!
"I always tell clients that if their site is built correctly, they don't need to submit a sitemap. I've also never submitted a sitemap for any of my own sites"
Sugarrae's comments reflect my own. Getting hundreds of extra pages listed is of little value anyway if they don't a) carry enough weight to rank or b) provide sufficient link-juice to help other pages rank.
Like PageRank, SiteMaps is a gimmick. If you find a use for it, so be it, but I haven't found a use yet.
For the most part, sites that are in need of SEO can't afford to just sit and wait to see which pages get indexed on their own merit.
They trust that the SEO is done properly (and if your pages aren't getting indexed, I'd bet a dollah they aren't) and that their sitemap submission will only benefit their indexing.
I admit, though, that it is the last step in my own SEO process, and I don't spend much time tracking its effectiveness.
Controversial though eh? How'd you get 90 comments already?!?
Love the site, love the blog. Smooches from Down Under
I've never really looked at it that way. I think I might have to revise my article about the XML-sitemaps.com sitemap generator:
www.seoco.co.uk/xml-sitemaps-sitemap-generator.html
Hi Rand,
I don't fully agree with your opinion.
Perhaps if you have a high quality site with a lot of high ranking pages you don't need sitemaps.
However if you are launching a site from zero it can be different.
One of our websites was created recently and added to Google Sitemaps, and although it is a new domain with new pages (so without any ranking value), all pages were indexed by Google within a week.
So I think it can be important for small or new sites.
At least this is my personal opinion.
Regards
Mac
They may be indexed, but that doesn't mean they'll rank. And now that they're indexed, you've got to wait for the ones with no weight to fall into supplemental before you know which they are.
Sitemaps was created to help huge sites get every page indexed. In that situation, it's more important to have numbers and get the rankings via shotgun thinking. It wasn't meant for smaller sites.
You might want to do some more reading up on them, and what effects they have.
This is a really interesting discussion! Most of my clients have small sites (really small, like under 200 pages). Point taken about using sitemaps as a crutch to get poorly-written pages indexed, but in my experience, it's not uncommon for pages that have been carefully optimized and that have good content to still not show up. If a sitemap can help me get what I consider to be valuable pages indexed sooner rather than later, that benefit outweighs some of the disadvantages noted in this discussion.
Ha... Good news for me... I took a look at what was needed to do a sitemap and said... Nope... Not today... There is skiing to be done and money to be made.
After reading this, I don't feel too bad about that decision.
I have personally witnessed it at least two or three times with medium-sized sites that've been around for some time: submitting a Google Sitemap resulted in some pages falling OUT of the index... and the 'new' URLs in the Sitemap either never showed up in Google's index anyway, or took so many months to show up that it's unclear whether the Sitemap had any effect at all!
So I would only consider adding a Sitemap for brand new sites that perhaps are not yet even in the index.
Regarding 'Verifying' your site: are you sure this is always recommended?
I'm using one login to Google's webmaster central, and my sites admittedly vary in quality.
Do I really want to risk having the sites of lower quality associated in Webmaster Central with my better-quality and better-earning sites?
I'm paranoid that, if I verify ALL of my sites, Google will know that my crummy site A is associated with my premium site B and could, in some way, then discount the quality of my premium site.
This is a really IMPORTANT question to me and I'm surprised that I haven't seen this addressed anywhere.
Do you care to comment on this, Rand?
Do you see any danger in associating good sites with lower-quality sites in Google's Webmaster Central ?
Somelad - if you've got some spammy domains, then NO, I certainly wouldn't register them with sitemaps. SEOmoz in general, though, is giving white hat style advice, so it shouldn't be applied to more gray-black hat projects.
Rand,
When I first read your post, I had a mixed reaction - part of me agreed, but part didn't. I've been thinking about it for the last few days, and I may be coming around to your point of view. As SEOs, we all did fine before Google SiteMaps was around, and it seems to cater to less experienced webmasters and sites with poorer structure, which is consistent with your points. I have noticed that it will get a site indexed pretty fast, but then again, so will quality links.
I've been doing some testing and finding interesting results with my sites. There seemed to be a weekly dip in rankings for a couple of days after Google downloaded the sitemap file; then everything would float back up for a couple of days, until it downloaded the file again. Once I took the sitemap out, this stopped, and everything seems to be floating a little higher than before. Very strange, but you may be onto something. I'm not fully sold yet, because there may be other factors involved, but I'll keep testing. Thanks for this article - it has definitely sparked some serious thought and discussion.
Hmmm, I'm getting some contradictory results... The verdict is still out.
--NEVERMIND--
In my mind, submitting a sitemap to Google is necessary. For clients, it is very helpful for getting their content indexed - otherwise, how do our clients sell their products? As for the search engines, Google had its own purposes in developing this tool. That's just my opinion.
There are always rules in the game. When you submit to Google, you have to play by the rules. Simply saying "don't submit" isn't the right way... Indexing has the great advantage of bringing people in from the search engines, so if you want to earn, you have to get indexed.
So be careful when you write... Play by the rules instead of running away and saying to do it the other way...
Please don't speak in txt talk. Use proper grammar and spelling.
I have a website (classadrivers.com) that has 62,100 pages indexed by Google right now. We do not use a Google sitemap for this site. Another website that I own (cadwebsitedesign.com) shows 2,390 indexed pages, and we do have a Google sitemap for that site. The odd thing is, we have over 5,000 pages on that site right now (news articles, etc.).
Both sites perform equally well for our desired search terms, but I see some merit to not using the Google sitemap.
I did, however, find a great inexpensive tool that helped me generate the sitemaps at xml-sitemaps.com. If you just have to have a sitemap to submit to Google and the whole sitemaps.org movement, this one is worth looking into.
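One related note: once the file is generated, the sitemaps.org protocol also allows you to ping an engine whenever the file changes, rather than re-submitting by hand. A minimal sketch in Python, assuming Google's ping endpoint as documented at the time and a hypothetical sitemap URL:

    from urllib.request import urlopen
    from urllib.parse import quote

    # Hypothetical sitemap location - replace with your own.
    sitemap_url = "http://www.example.com/sitemap.xml"

    # Google's ping endpoint, per the sitemaps.org submission docs.
    ping = "http://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")

    with urlopen(ping) as response:
        print(response.status)  # 200 means the ping was received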
Interesting read. I've posted a response on my blog at https://www.virtualmarketingblog.com
Here is the post if you wish to comment:
https://www.virtualmarketingblog.com/index.php/20070221/some-say-not-to-submit-sitemap-to-google/