How many pages has Google indexed?
This question and the problems surrounding it run rampant through the SEO world. It usually arises when someone starts doing searches like this:
Google claims to have 93,800 pages indexed on the root domain, seomoz.org. That sounds pretty good, but when I ran that search query last week, the number was closer to 75,000 and when I run it again from Google.co.uk 60 seconds later, the number changes even more dramatically:
How about if I hit refresh on my Google.com results again:
Doh! Google just dropped 8,500 of my pages out of their index. That sucks - but not nearly as much as the fact that managers, marketing directors and CEOs use these numbers as actual KPIs! Can you imagine? A number that means nothing, fluctuates 300% between data centers, can change at a moment's notice and provides no actionable insight being used as a business metric?
And yet... It happens.
Fortunately, there's an easy way to get much, much better data than what the search engines provide through "site:" queries and this post is here to walk you through that process step-by-step.
Step 1: Go to Traffic Sources in Your Analytics
Click the "traffic sources" link in Google analytics or Omniture (it can also be called "referring sources" in other analytics packages).
Step 2: Head to the Search Engines Section
We want to find out how many pages the search engines have indexed, so the obvious next step is to go to the "search engines" sub-section.
Step 3: Choose an Engine
Choose the engine you want indexation data on and click. If you have both paid and organic traffic from this engine, you'll want to display organic only at this step, too.
Step 4: Filter by Landing Pages
The "Landing Page" filter in the dropdown will show you the traffic each individual page on your site received from the engine you've selected. This also produces the magical "total" number of pages that have received traffic, described in the last step.
Step 5: Record the Number at the Bottom
That count tells you the unique number of pages that received at least one visit from searches performed on Google. It's the Holy Grail of indexation - a number you can accurately track over time to see how the search engine is indexing your site. On its own, it isn't particularly useful, but over time (I usually recommend recording monthly, but for some sites, every 2-3 months can make more sense), it gives you insight into whether your pages are doing better or worse at drawing in traffic from the engine.
Now, technically I'm being a bit cheeky here. This number doesn't tell you the full story - it's not showing the actual number of pages a search engine has crawled or indexed on your site, but it does tell you the unique number of URLs that received at least 1 visit from the engine. In my opinion this data is far more accurate and more actionable. The first adjective - accurate - is hard to argue with (particularly given the visual evidence atop this post), but the second requires a bit of an explanation.
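If you'd rather script the count than click through the UI each month, here's a minimal sketch that tallies unique landing pages from a CSV export of this report (the column name "Landing Page" is an assumption - rename it to match whatever your analytics package actually exports):

import csv

# Minimal sketch: count unique landing pages in a landing-page CSV export.
# The column name "Landing Page" is an assumption - adjust to your export.
def count_unique_landing_pages(csv_path):
    pages = set()
    with open(csv_path, newline='') as f:
        for row in csv.DictReader(f):
            page = (row.get('Landing Page') or '').strip()
            if page:
                pages.add(page)
    return len(pages)

print(count_unique_landing_pages('google-organic-january.csv'))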
Why is Number of Pages Receiving ≥1 Visit Actionable?
Indexation numbers alone are useless. Businesses and websites use them as KPIs because they want to know if, over time, more of their pages are making their way into the engines' indices. I'd argue that actually, you don't care if your pages are in the indices - you care if your pages have the opportunity to EARN TRAFFIC!
Being a row in a search index means nothing if your page is:
- too low in PageRank/link juice to appear in any results
- displaying content the engines can't properly parse
- devoid of keywords or content that could send traffic
- broken, misdirected or unavailable
- a duplicate of other pages that the engine will rank instead
Thus, the metric you want to count over time isn't (in most cases) the number of pages indexed, it's the number of pages that earned traffic. Over time, that's the number you want to rise, the number you want marketers to concentrate on and the KPI that's meaningful. It tells you whether the engine is crawling, indexing AND listing your pages in results where someone might actually click them (and, in fact, has).
If the number drops, you can investigate the actual pages that are no longer receiving traffic by exporting the data to Excel and doing a side-by-side with the previous month. If the number rises, you can see the new pages getting traffic. Those individual URLs will tell a story - of pages that broke, that stopped being linked-to, that fell too far down in paginated results or lost their unique content. It's so much better than playing the mystery game that SEOs so often confront in the face of "lower indexation numbers" from the site: command.
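If you want to script that side-by-side rather than eyeball it in Excel, here's a rough sketch, assuming two exports with "Landing Page" and "Visits" columns (the column names will vary by analytics package):

import csv

# Rough sketch: compare two monthly landing-page exports and show which
# pages stopped or started receiving search traffic.
# Column names "Landing Page" and "Visits" are assumptions - adjust as needed.
def load_visits(csv_path):
    visits = {}
    with open(csv_path, newline='') as f:
        for row in csv.DictReader(f):
            page = (row.get('Landing Page') or '').strip()
            if page:
                visits[page] = visits.get(page, 0) + int((row.get('Visits') or '0').replace(',', ''))
    return visits

last_month = load_visits('google-organic-december.csv')
this_month = load_visits('google-organic-january.csv')

dropped = sorted(p for p in last_month if p not in this_month)
gained = sorted(p for p in this_month if p not in last_month)
# Optionally ignore the noise: only flag pages that lost 5+ visits
big_losers = sorted(p for p, v in last_month.items() if v >= 5 and p not in this_month)

print(len(dropped), 'pages lost all search traffic;', len(gained), 'are new;', len(big_losers), 'lost 5+ visits')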
Some Necessary Caveats
This methodology certainly isn't perfect, and there are some important points to be aware of (thanks especially to some folks in the comments who brought these up):
- Google Analytics (and many other analytics packages) use sampled data at times to make guesstimates. If you want to be sure you're getting the absolute best number, export to CSV and do the side-by-side in Excel. You can even expunge similar results from the two time periods to see only those pages that uniquely did/didn't receive traffic. In many of these cases, you might also only care about pages that gained/lost 5/10/20+ visits.
- Greater accuracy can be gained by shrinking the time period in the analytics, but that also reduces the likelihood that a page receiving very long tail query traffic once in a blue moon will be properly listed, so adjust accordingly, and plan for imperfect data. This method isn't foolproof, but it is (in my opinion) better than the random roulette wheel of site: queries.
- This technique isn't going to help you catch other kinds of SEO issues like duplicate content (it can in some cases, but it's not as good as something like Google Webmaster Tools reporting) or 301s, 302s, etc., which can require a crawling solution.
I'd, of course, love your feedback. I know many SEOs are addicted to and supportive of the site: command numbers as a way to measure progress, so maybe there are things I'm not considering or situations where it makes sense. I also know that many of you like the number reported in Google Webmaster Tools under the Sitemaps crawl data (I'm skeptical of this too, for the record) and I'd like to hear how you find value with that data as well.
p.s. Tomorrow we'll be announcing two webinars (open to all) about using Open Site Explorer to get ACTIONABLE data. Be sure to leave either Wednesday the 27th at 2pm Pacific or Thursday the 28th at 10am Pacific free :-)
Many valid points all around... I think the important thing to remember is that no tool or method is going to give you the exact, perfect information we'd all like. However, the level of information easily accessible to our industry is relatively unprecedented, and we often become so used to it that we can get lost in the minutiae.
Most of this information gains more usefulness when contrasted to other information, and over time.
An actual URL list/count is still important. Your CMS may provide this, but it's usually more complex than that due to guided navigation, sorting, presentation and pagination... so also do a thorough crawl of your own site with Xenu, GSiteCrawler or the crawler of your choice.
Of course, for that number to be truly useful as well, you need to then understand what level of duplication and URL bloat the site has... so it's not just how many pages there are or how many pages are indexed, but how many pages there really should be for optimal performance - and then indexation based on that.
Expanding the date range to much larger periods may better help address long tail pages. Beyond that, if a page has delivered a single visit over the last 6 months, it may be worth questioning its value... does it need to be indexed? Is it merely a supporting character? Is it more valuable to keep it but block it from crawling and indexing to maximize crawl equity?
And of course, you still need advanced query data if you are doing comparisons... I'm assuming you don't have analytics access to your competitors' sites, of course.
In which case, at least in Google, I still prefer to append the following to the SERP URL and request:
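[likely &filter=0&start=990]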
Which brings in all omitted results and goes to the last page (of a 10-per-page configuration). That said, I've seen less "wobble" in numbers reported this way than in the past. Perhaps that means the indexation reporting is more accurate, or these query modifiers may be less impactful than they were.
Sure, I'd love to have perfectly accurate information, but I know that isn't going to happen. But having multiple methods and data points at least helps to frame up the picture, and in the end, the more important number may be the relative change of all of the metrics over time.
Thank you. I was going to make this point myself. There are no exact tools in this industry. You cannot pull exact rank, you cannot pull exact PR, etc. This is all just getting good approximations with which to make educated decisions about your site.
Nicely stated.
TBH, I think Rand's blog entry is an excellent approach that just needs to be framed differently. This post isn't about getting "real numbers" or determining exactly how many pages you have indexed.
What this entry is really about (and he doesn't really state this until mid-post) is getting more useful data than advanced site search operators can provide. There's nothing actionable about performing site: searches.
Don't get hung up on data precision 'cause it's not going to happen.
Don't rely on GA to give you reliable numbers for this if you see this message:
"This report is based on sampled data."
- which you *will* see for any reasonably large site where getting indexation data is difficult.
By way of example, I've run this metric for the first two weeks of Jan and the first 3 weeks of Jan and got a higher number of landing pages for the first 2 weeks, which clearly isn't even internally consistent, let alone accurate.
I still use the site: operator on Google. It makes it easy to get information about subdirectories. I get figures that are good enough to show me trends, which is useful enough.
I have also found that you can miss out on some of the finer details when it is sampling the data; however you mention the solution - decrease the reporting period.
Some clever scripting via the API should make it feasible even if you have to reduce it to daily!
Yes this is a good point - but given that the stated idea was to find the total number of indexed pages, you're then faced with the non-trivial task of deduping the daily landing page counts.
More generally, I'd apply a smidge of scepticism to all stats, not placing too much reliance on any one source - but I use everything I can get my hands on to help me understand the whole picture. Not very controversial!
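For what it's worth, the dedup step needn't be too painful - a rough sketch, assuming one landing-page CSV per day has already been pulled into a folder (same assumed "Landing Page" column as the sketches above):

import csv
import glob

# Rough sketch: union the landing pages across a folder of daily exports so
# each URL is counted once for the whole period, regardless of sampling.
unique_pages = set()
for path in glob.glob('daily-exports/*.csv'):
    with open(path, newline='') as f:
        for row in csv.DictReader(f):
            page = (row.get('Landing Page') or '').strip()
            if page:
                unique_pages.add(page)

print(len(unique_pages), 'unique landing pages across all days')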
A useful article, but like you HenryPUK, I'm a little sceptical of GA.
Whilst I can believe the site:blah technique is far from reliable, I'm not overly confident in the accuracy of Google Analytics when it comes to providing exact figures. I've used GA alongside a couple of different web server log-crunching packages and had alarmingly different results for visitors, page views and referrers amongst other things. Google themselves say their GA reports shouldn't be taken as exact, may differ from other packages and should be used for measuring trends.
I'd be curious to see a straight comparison on a few sites between indexed pages and inbound links as counted by GA, Webmaster Tools, site:blah and Open Site Explorer.
The old adage: there are lies, damned lies and statistics is fairly true when it comes to measuring web stats in general. Which is a shame, because one of the big arguments for pushing marketing spend online is that it's a more accountable medium than its offline brethren.
Tracking this kind of metric month-to-month is not only applicable to organic search traffic, but also other traffic sources that send traffic to the long tail of your site.
We recently improved our site navigation and internal search functionality, and noticed a lift in the number of unique pages receiving traffic from unpaid referrals. This suggested that improving the navigability of our site meant more pages were getting found and shared by our users.
I could imagine situations in which you might even want to measure changes in the number of pages receiving traffic from a specific source.
That is an awesome point. Thumbs up!
Great post Rand, thanks. While I agree with the idea of actual indexation numbers not being very important, and the site: operator can be wildly inaccurate, I still use site: but I add on /* to the end in an effort to measure only the primary indexation. So for example, instead of site:seomoz.org, I would have used site:seomoz.org/* to only see the pages that either currently rank or have the potential to rank, and thus drive traffic.
Does anyone find this to yield somewhat more accurate numbers?
Though the number of listings does not hold a lot of weight, does the order of the listings mean anything? I used the "Site:" query for my company and was surprised that some of our less-frequented pages appeared on the first page. Any value in the order? Thanks!
Thank you for the easy-to-follow, step-by-step explanation.
Will definitely use it for some customers.
Rand,
Using the # of unique landing pages within Google Analytics, in my opinion, needs more filtering than the steps you list above. The reason being: if someone visits the cached version of your site via Google (checking the latest cached version of a page), GA counts "/search?q=cache:UoSXhkImiKUJ:www.domain.com/+keyword+keyword" as a unique landing page, hence driving up that number.
Granted, this would likely only be noticeable on smaller sites and in a limited capacity; however, I think it is worth noting.
Enjoyed the post!
Cheers,
Ross
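If you're scripting the count as in the sketches above, that particular noise is easy to strip - a hypothetical guard to apply before adding a page to the set:

# Hypothetical tweak to the sketches above: skip Google cache URLs that
# analytics can record as landing pages, so they don't inflate the count.
def is_real_landing_page(page):
    return '/search?q=cache' not in page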
That's true Ross, but how many people are actually visiting your pages using the "cached" version? I would think less than 1%?
This technique is only relevant if, and only if, the analytics is correctly installed on every page!
Rand,
I'd say you missed something - which is that looking at the numbers provided by the solution above is fairly limiting.
Try comparing these "indexation" numbers (whether accurate or not) to the number of pages you're actually serving up. Most CMS/server-side systems should let you tally up the approximate number of pages you're expecting.
At that point you can build an Excel sheet with the number of pages on the site, the number of pages indexed via site: in each Google domain you target, and then finally add your solution - that will give you far more actionable data than simply "pages indexed".
I'm also a bit wary of your solution, since I'm guessing it only tracks pages gaining traffic, not pages indexed yet without traffic. And if you're enterprise level (and maybe not just in those cases), you'll likely find hundreds of pages indexed but without traffic...
Good thought provoker tho.
Rob
I agree more with Rand in this, I don't really care at all if my pages are indexed if they aren't bringing in any traffic. If I show up on the 100th page of Google for 'red widgets', then is that page helping me out by just being indexed? No.
Actually my point is that whilst that report can be of use, you can spot bigger problems by combining the reports, such as indexation issues, duplicate content, a faulty robots.txt, etc. If you're excluding everything that never gets traffic, you'll never know...
The GA interface no longer has the search engine view in this demo
[You can delete this after reading]
You've got a bug. The following text:
Indexation numbers alone are useless. Businesses and websites use them as KPIs because they want to know if, over time, more of their pages are making their way into the engines' indices. I'd argue that actually, you don't care if your pages are in the indices - you care if your pages have the opportunity to EARN TRAFFIC!
Being a row in a search index means nothing if your page is:
is in a <span class="huge"> and appears in your page source, but is not displayed on the page in IE, FF, or Chrome. Which makes that part of the post quite confusing.
Excellent post, BTW
Doh!! Thanks Gil - fixed it up. I need to get someone in another timezone editing for me :-)
Hi, I know that this post was written a while ago – but I don't see "search engines" as an option in my GA now. Where should I look for this info?
Nice post! It really helped me see which of my pages are actually getting indexed and being sent traffic from Google!
It's been a while since this article and things have changed in Analytics. Can't find the Search Engine option now. Can you make a small update on how to find the indexed pages?
Yep, this works really well - we have used it on large clients in the past to highlight indexing as an issue...
It is far more efficient and less painful, if you plan on doing this more than once or for more than one of your clients' websites, to use Advanced Segmentation.
Also, I would advise remembering to remove any of the following using advanced segmentation:
Another quick tip: if you are only concerned with pages indexed that attract visitors from Australia, use the geographic features in advanced segmentation to include just Australia.
1) How many pages has Google indexed? is the first question to ask.
2) How many pages has Google not indexed? is the second one, even more important than the first.
The ratio matters. Really.
I have not found a way to produce such a report in Omniture, using SiteCatalyst 13.5 nor the Discover tool. If anyone has found a way, I'd appreciate pointers.
Our Omniture implementation relies on "content ID" values, which group thousands of content pages into a single ID. It's useful for reporting in many cases, because we often do care more about traffic to a site section, rather than traffic to a specific article (of thousands) in that section.
So, reports that one might expect to be useful, such as the "entry pages" report, end up not being useful in this context.
Great post
Thanks for sharing, I really enjoyed reading the post and the discussions.
Nice post (btw, I described the same metrics in my post at SEOmoz.org: https://www.seomoz.org/ugc/seo-kpis-to-track-with-google-analytics), but I'd argue against the idea that indexation-level metrics aren't important... Having pages indexed is critical for ANY organic traffic (no indexation = no traffic). The figure you showed us is more about converting the potential of being indexed into benefits - good exposure that brings traffic to particular pages.
Let's take the following scenario as an example:
- I can see 10,000 unique landing pages for SEO in a specific period of time
- Later on, I can see 15,000 unique landing pages for SEO reported for the next week.
Is this good information? Yes, BUT it's not enough - it doesn't really tell me whether Google is (or isn't!) able to find & crawl new pages on my site...
This would only be useful and complete if every page indexed by Google brought me at least one organic visit (and thus showed up in the GA report).
The problem with indexation metrics is that there are no accurate ones nowadays :/ We used "site:" very intensively for a long time, but in the last 4-6 months the reported data has been so strange and inaccurate that we decided to drop it.
Instead, we add a unique string to the bottom of each page within a particular web site and track the number of results reported for a query on that unique string. Far from perfect, but it still gives valuable data (especially if the web sites are not big and the number of results is not estimated or rounded by Google).
Hi
I am not sure whether this is the right place to ask this question, but I could not find a more appropriate place..
Can anyone shed some light on 'Index Saturation'?
What is 'Index Saturation'? How is it important from SEO point of view? Can it be measured??
Hi, you might find this podcast useful:
SEO101 Podcast on Index Saturation
Regards, Dave
Cheers for the post Rand.
One thing I see day-in, day-out is people using the site: command in Google and taking the indexation count as Bible truth. Nobody seems to listen when I tell them that they're wrong and that it's an inaccurate count of what has truly been indexed.
Quick question, is this the type of thing that only happens to the larger sites? For example, I have a few clients with static sites who only have ~20 pages max. I've never noticed/tested this before, so would any of these pages drop out of the index every now and then, to reappear a few seconds later? From what you said, I'd assume so - though I know the likelihood will probably increase as the size of the site increases.
Hey, great post! FYI, there seems to be a formatting error with the bullet points making the font way too big.
That count tells you the unique number of pages that received at least one visit from searches performed on Google. It's the Holy Grail of indexation - a number you can accurately track over time to see how the search engine is indexing your site.
Is this the pages Google has visited, or the pages people have visited via Google? I want to know the number of pages (including those with parameters or wrong upper-case/lower-case combinations) that Google has in its index, as I'm trying to get those pages removed.
This method is very good!
I ran across a tool from ArtCharm.com that says how many links are indexed by the different search engines, but I guess this data might not be accurate either...
Maybe we don't need the real number of indexation.
We just need to know the overall number and delve into which pages are more important.
Hey, I face the same problem, but I don't think even Google Analytics displays the exact count of indexed pages.
When I look in Google search for the total number of indexed pages with the "site:" query, it shows me 431 pages.
When I look in Google Analytics, as guided by you, it shows me 20 pages.
When I look in Google Webmaster Tools, in the Site Configuration -> Sitemaps section, it shows 216 pages.
Now whom should we trust? How do we come to know the exact number of pages indexed by Google?
Good to see I'm not the only one noticing these site command problems - I blogged about it on 2nd Jan when I noticed a client's site showing 227 indexed in google.com as against 3040 in .co.uk. So beware anyone in the UK that uses anything like the SearchStatus tool shortcuts which use the .com.
Your suggestion, while by no means perfect, is a possible alternative that has better value and gives clients something more positive and less simplistic to think about.
More importantly it highlights the lack of genuine reliable data. I'm increasingly finding Webmaster Tools data to be useless - it's been that way since they changed the layout (early-middle of last year?) and the deeper you dig into Analytics results the more you have to think about how they might be misleading. e.g as a simple example a friend blogged recently about Product Search results appearing as organic.
Stay skeptical folks - after all, they tried to tell us there was no supplemental index any more!
What about sitemaps in Webmaster Tools? It gives you the number of URLs in the sitemap and the number indexed... if you really do have every URL in your sitemaps, would you get an accurate indexation count?
Maybe, maybe not. Google can say whatever they want about indexation because they have probably seen a page at least once, sometime. That doesn't mean your page isn't in the lowest of supplemental indices.
What does this mean for link acquisition sources? I've noticed on HQ websites, even a place such as JoeAnt, that some pages weren't indexed by Google at the time of my search (I just inserted the URL into the Google search bar).
Is it possible I simply queried the wrong data center, and a search a second later could have gotten me a direct link to this page? Is the fact that this URL isn't present at every data center a strong indicator that I should not have this URL as a link acquisition target?
First time I was checking this in google it freaked me out too :)
I had an issue like this when I was trying to sort out some indexing issues with some new pages I uploaded.
I was working on a site targeted at the UK. I had just created and uploaded 9 content-rich pages, somewhat optimized them, added them to the XML sitemap, and went to stronger pages to include links to these new pages. Basically as much as I could.
Lo and behold, I was using site:www.site.com and checking.
Google.co.uk was reporting 5/9 pages indexed.
google.com was reporting 9/9.
3 hours later, google.co.uk reported 7/9, then eventually 9/9 before I left the office.
A few days later, we were back to 5/9 pages on .co.uk. The .com for some reason seemed stable.
Now the issue has been remedied (all by itself?). All 9 pages have been indexed ever since... but what was driving me a little nuts was why it was reporting pages as indexed and then later not - whether I was checking every 30 minutes or in 3-hour gaps.
If you have intentional duplicate content (with appropriate canonical info), each with unique URLs, that is not addressed by this GA breakdown. Of course, site:domain.com doesn't help with this either, but it's relevant when you are focusing the search on pages to concentrate on or improve.
I'm curious as to why you're "skeptical" of the Google Webmaster Tools data? The number has been fairly accurate in my experience. How does that number compare for you as opposed to analytics?
Well, I am always sceptical about information which Google tells me. ;) Don't believe everything Google tells you. :D But as far as I know, there was no reason not to believe the # of indexed pages in Webmaster Tools. Nevertheless, it might always be possible that they are partly projected or not 100% the truth. That's why I asked whether someone has had a bad experience with those numbers.
I must agree with Lucian, trusting what Google tells you about your site isn't the best way for Google to love your site. If you don't believe me, go search for 'search' in Google, last I checked, it was in 6th place on the SERPs.
Hi Rand,
the "Unique Yielding Pages" Metric is a very useful metric.
But if you have a reaallly huge longtail with content which is definitly not visited every month (e.g. private facebook profiles, etc.),
than this metric doesn't show you indexation status. Nevertheless it is very useful.
Especially if you use the "(Unique) Keywords per (unique) page yields" as a KPI. That one and the # of unique page yields are very actiobable.
To get your indexation status (in cases like the upper one) I do prefer more the number of indexed pages out of the webmaster tools if you upload a sitemap file with all your pages.
Does anyone have had a negative experience with that way of "Indexation monitoring" (sitemaps & webmaster tools)?
Good points Lucian. I would say that in the first scenario you describe, where pages aren't visited each month, you could certainly extend the timeframe of the metric to 2, 3 or even 6 months to help get a better overall number. However, even in those cases, the counts on a monthly basis are likely to be indicative of improvements or problems that you can measure/address respectively.
And I believe that by combining keyword data with indexed-page data, we can also see quite well the sometimes so "foggy" long tail - am I right?
Yes, combining the # of unique keywords with the # of unique pages allows us to see the thickness of our long tail.
I can't agree with this post Rand.
First of all, many pages can have poorly done optimization, so they won't rank well in search engines - in that case, you will get no traffic from those pages, even though they can be a very important part of your business offer. In the case you describe, you will not focus on them.
Also, you show all (total) results (including paid traffic). In Step 4 you should switch to non-paid traffic only - that will be more useful information, I think.
Great point! We don't have any paid traffic, so I didn't need to exclude it, but for many sites, that would be a critical move.
As far as pages that can't earn traffic - why do you want to know how many of these are indexed by the search engines (and do the wildly fluctuating, inaccurate numbers they show via site: really help)? I suppose it shows some type of missed opportunity, but it seems that internal analysis and comparison against the data above would be more helpful...
Honestly, when reading the post, I assumed that excluding CPC traffic was something not stated but almost obviously necessary (not because of being a genius, just common sense).
What about 301 redirects, which may still be listed in the SERPs but may be excluded from the landing page report?
Regardless of the method you use - the site: operator, log files, or this method outlined by Rand - I don't think you can get rock-solid, reliable numbers.
Rand said it best when he said, "On its own, it isn't particularly useful, but over time... it gives you insight into whether your pages are doing better or worse at drawing in traffic from the engine."
Using it as an overall indicator of progress (or lack thereof) is the best I think you can hope for.
How do you go about exporting CSV files of 16,000 landing pages to compare pages or types of pages that you have been losing or gaining? 500 at a time?
Edit: found out - add &limit=5000 (or whatever limit you need) to the end of the URL in GA before exporting, hit enter to update the URL, then download as CSV (not CSV for Excel).
I am left with looking at month-over-month numbers of visited pages, which I am not liking! Which is odd, since our organic traffic has been trending positively.
Thank you for this great post. I have been scratching my head several times because of the differences in my backlink counts.
Now I know what to do.
Thank you!
Annette.
I've got to cast a little doubt on this post, Rand. SEO activities can sometimes require consolidating landing pages as much as expanding their number, and this will affect the total number of pages visited.
Last year I took over SEO responsibility for a fairly ragged (SEO-wise) ecommerce site, and in preparation for the Christmas trading period took exactly this approach, broadening the number of long-tail keywords we were chasing, but at the same time consolidating the targeted landing pages into a (slightly) tighter group.
(The reason for the latter strategy was that some individual product pages were ranking highest for broad product category terms - e.g. a single product page was ranking highest for a range of 'funny gifts' keyphrases, whereas we sell many funny gifts and needed to target a more suitable 'gateway' page, i.e. offering more choice upon first entry into the site)
The result was that - for year on year December traffic - we saw our total landing pages decrease marginally, whilst non-paid visits increased hugely.
Our Google organic landing page results for December 08 and 09:
Dec 09 - landing pages down 2.3% on Dec 08, non-paid visits up 25.7%
I just wanted to sound a note of caution that, whilst an interesting exercise, this GA report may need to be considered against a broader view of your business and SEO activities.
Maybe I'm missing something here, but that method only shows you the number of pages that got visits from search engines. What if you have pages that rank for terms that are never typed in? You would have pages indexed that would not show up there. Nobody has every page on their website visited. Unless you put a unique code on every page and then found each one through Google, you would not know for sure. That number would be really inaccurate.
Thanks Rand for sharing such a useful and easy way to find the actual traffic. I have always faced such problems with my client sites, and I keep explaining to them that it is an indexation problem on Google's side.
Thanks for sharing great tips and the information on the webinar here.
This post gets a thumbs down from me. This is not the holy grail of indexation. This only tells you the number of landing pages that were reached via a keyword somebody used and that Google deemed relevant for your content. That doesn't mean that more pages weren't indexed.
Gauging this number over several months is useless, considering that you are only looking at data within certain timeframes. Say you have a website dedicated to health problems, and you are cataloging your numbers in the summer. As you progress further along, perhaps, seasonally, you find that people are now reaching your content about influenza. Maybe your site had content about H1N1 before anyone knew what H1N1 was. Does that mean those pages weren't indexed, or just that they weren't being searched for? Then what happens come next summer - your landing pages drop again if you don't maintain that original start date. Does that mean Google dropped pages from its index? No. It's a lot of work for nothing.
Off topic, I know. So try not to slam me.
Does anyone remember, a little while back (last month), there was a blog post with a list of websites for essential little tools? In that list there was a website that scans a web page for any malicious code - a pretty handy little tool. I can't seem to find it; does anyone remember?
Thanks for the post Rand. So there is no real way to find it out.
However, I believe that what really counts is how many pages actually bring traffic, as you mentioned.
Anyway, I would rather have 10x fewer pages indexed but 10x the visitors on my site.
Great information. I look forward to the webinar on Open Site Explorer.
A few days ago I put my site online, but when I tried to find it on Google it was missing. Does it usually take that long? Can I use some other tactics to speed up indexing? Any good tips?
Very good post - I reviewed it and am satisfied; a very helpful post.
Thanks to the Moz team.
Thanks for the post, Rand. I have finally experienced what I believe is a total Googlebot crawl of my 200+ page website. I am now wondering what I am seeing in Google Webmaster Tools, as it is indicating about 242 pages crawled. I really only have around 206 or so. Is a Google-indexed page really a real page, or does Google break a long page down into 2 or more pages? Or are they looking at some really old and no longer valid page versions that don't even exist? I'd appreciate your viewpoint, thanks. Rand S.
--edited by Jen - removed link
Thanks for the post, Rand... it comes almost perfectly timed for me, as a client of mine had just sent me an email crying out about exactly this issue... "Google has stopped indexing 25% of my eCommerce pages!!!!"
I was going to explain that this data depends on which Google server responds to the search at that moment, but your tip & trick is going to be more effective (and my client can check it every day, having access to the Google Analytics data).
Ciao