site:yoursite.com *** -asdf
is no longer functioning. I'm fairly certain that the source of this issue can be traced back to a Mr. Matt Cutts, who was permitted to sit in at the "Give It Up" session at SMX where one of the first broadly public mentions of the query was brought to light. If you aren't familiar, the query used to display the full list of a site's pages that resided in Google's supplemental index - the place where pages with small amounts of link juice live. After Matt heard people discussing the query, he promised that they would be shutting it off at Google, much to the disappointment of webmasters.
Let me explain why Google should bring the query's functionality back and why it's actually in their best interests to do so.
- Supplemental displays are really only useful to site owners, not competitors. There's not much I can do with the knowledge that one of my competitors has a lot of pages in the supplemental index. (OK, there's one thing*)
- Seeing those results lets site owners identify places where they may not be pointing internal links, and where they're not exposing engines and visitors to content that they've "left by the wayside"
- Having site owners "fix" the supplemental issue on their own, primarily with better internal linking, is fundamentally better for Google and their searchers, because if a site owner values content but doesn't know that Google isn't seeing links to it, they may never recover it properly.
- Googlers - you don't have to make the query public. You can put it inside Webmaster Central and make it only accessible to site owners who've verified themselves.
OK. If you're more of a visual learner type, let's have Googlebot & friends walk you through this:
[Cartoon sequence: Googlebot and friends walk through the supplemental index problem; final panel: "It's great when we can work together, Googlebot!"]
So, what we've learned is a rule that we'll learn again and again. When search engines combine transparency and trust, webmasters win, search engines win and users win. So, how about it, Google?
I just want to share one more story about this. A few weeks back, I was in Washington DC helping the great folks at NPR with some search marketing strategies, and we were able to use this query to see some serious issues with how they archive old content and their link structures on some lost but not forgotten sections of the site. With some changes in site architecture, they'll probably be able to bring some great quality pages, including audio and text content, back into the main index and ranking high in the results where it belongs.
What I'm saying is that we (webmasters) are not using this to do evil, we're using it to do good. Taking away tools that let people do good isn't good. It's evil. Do none of that :)
*That one thing is pretty bad. If you find a site with lots of pages of unique content in supplemental, and you're an evil splog creator or scraper/re-purposer, you can use that content without much risk of being kicked out for duplicate content, because Google doesn't consider those pages particularly valuable. However, Google can certainly find many ways around this, not least of which would be enabling the query through Webmaster Central.
Hey, I tried to comment a few days ago and it didn't seem to be working for some reason.
Anyway, a couple quick thoughts:
- This query wasn't new to us at the "Give It Up" session. It's been talked about publicly before, e.g. https://www.seobook.com/archives/002047.shtml back in February. If anything, the "Give It Up" mention of this query was more of a reminder to check on why this wasn't fixed yet.
- I believe it's good to remove this query because I don't want people to get fixated on Supplemental Results and focus on them to the exclusion of other aspects of SEO. We saw that happen with the toolbar PageRank bar, and ended up slowing the update rate of the visible toolbar PageRank to every 3-4 months so that people wouldn't spend too much of their time concentrating on PageRank at the expense of other parts of good SEO.
- Over time, the supplemental results are less and less supplemental and more and more likely to show up for any given query. As I mentioned at SMX Seattle, my personal preference would be to drop the "Supplemental Result" tag altogether because those results are 1) getting fresher and fresher, and 2) starting to show up more and more often for regular web searches. Especially as the supplemental results get more fresh, I'd like to leave that tag behind because it still has some negative connotations for people who remember the previous implementation of supplemental results (which has now mostly been replaced with a newer/better implementation).
Those are the main reasons why I think it makes sense to change this query.
Maybe giving the effect a different name than "supplemental" would be better. It would cause less confusion and anyone who hasn't kept up would understand that it is different now.
I mean, the definition of supplemental doesn't really match what is going on now, does it? Even the word "supplemental" has a negative connotation that just isn't deserved.
But it is definitely valuable to know which pages are firmly entrenched versus "hanging in there", so I'd rather not see it go away completely.
I like Rand's idea to add it to Webmaster Central.
Matt - I take your point about supplemental (and you've mentioned as much before), but I think a decision should be made on what exactly will become of them. If the supplemental tag is left in place, then it clearly means something (just a low PR?) - and while it means something, we would like a way of reporting on it.
From a user's perspective (or should I say, a webmaster's perspective), I tend to associate a supplemental tag with Google effectively telling me that the page in question is not really relevant, and that while it may rank for some specific highly targeted searches, it is unlikely to be returned for any more general searches. For example, I see this a lot when a site doesn't have unique title tags - a lot of its inner pages will fall into supplemental very easily.
While it may not be as harsh as it used to be, supplemental is still alive and kicking, and while it's still around, please can we get a report for it? (And to echo everyone above, I think Webmaster Central is definitely the place for it.)
Edit: the strikethrough doesn't show in my comment so had to leave it out...
Strikethroughs haven't been working for a while I don't think...
Matt, you're right that many webmasters started wasting time on removing supplemental pages, but if they shouldn't work on that, why are supplemental pages shown in public in the first place?
Hamlet - I had heard from Mr. McGee that some datacenters are still allowing use of the command, but my guess is that this won't be for long.
That would explain it. I still get the same search results when I use the datacenter tool (https://www.mcdar.net/dance/index.php), though.
Providing that information via Webmaster Central and through an API would be great.
Still getting the results in Toronto, Canada so they haven't shut it down here either yet.
WI must still be using a functioning datacenter. I should have read more before I posted.
Hm, let's wait - till then, use "*** -adghasdtrb"! ;)
I was pretty bummed when I tried it this morning and it didn't work. Seemed like a great tip once the SMX information embargo was lifted. I've heard some complaints that the tips from that session seemed pretty watered down once Matt was allowed to hang around. I guess the next one will be sans SE representation... and maybe have solid walls too, eh?
Nice post Rand.
Very informative and trustworthy blog. Please keep updating with great posts like this one. I have bookmarked your site and am about to email it to a few friends of mine that I know would enjoy reading it.
Amit I agree with you.
SEO is a composition of things. You can watch the search engines for 40 years and still be lost.
It is good to be a well rounded person to understand how people think; hence, what search engines want to serve to the people.
I tried "site:hamletbatista.com *** -asdf" from here (Dominican Republic), and I got back all my supplemental pages, 105 pages :-(
Maybe they are blocking it to US searchers only.
I just used "site:randmh.com *** -asdf"
and "site:www.randmh.com *** -sljktf"
From Wisconsin and they both appear to be working. There are way too many in there!.... but it's bringing back results:)
Until recently I was seeing less than half the results as what I'm seeing now... Hmm?
News just in!
If you're using Google.co.uk, you can still access the supplemental results. Using the site: command for UK sites on "pages from the UK" seems to bring back the supplemental results alone. Nifty, eh?
Good spot carps - although it's worth noting that the usual site:domain.com *** -asjb still works in google.co.uk so this is just an easier way of doing that.
Blimey - does this mean we've got one up on our transatlantic cousins for once? Or just that Google can't be bothered updating us as quickly?
Hmm...
The latter I do believe.
Interestingly, this seems to work for US sites as well. Try this for the supplemental results for Seomoz
I've got half a suspicion that Google is running down the 'pages from the uk' thing. Personally it's never really made much sense. If I'm on Google.co.uk, I'd assume I'm seeing the results from the UK... in which case a 'pages from the rest of the world' button would make infinitely more sense.
This is hardly the first glitch to occur with this feature, and Google hardly seem to rush around fixing it. Either the UK is an unimportant market for them (hopefully not) or they're planning for a better future (hopefully)
I agree 100% carps - the 'pages from the uk' appears to be irrelevant now. I'd be very interested to see what percentage of searchers use it though...
Anecdotally, from a seminar I attended with a guy from (I think) Nielsen, it was around 40-45% of people actually using it - which astonished me at the time. Since hearing that, I've been looking around and our MD is one of the ones who swears by it!
God alone knows what the criteria they use are though. I was looking into it in some depth and figured it was down to a combination of TLD and server location, but I keep finding sites hosted in Germany and... (throws hands up in despair and walks away muttering...)
40-45%!? Wow - I never would have put it that high. I think it's kind of misleading these days anyway, since Google usually returns pretty UK-centric results in a google.co.uk search regardless. To have a button which says "pages from the UK" suggests that without it you would have worldwide searches, which isn't really true.
On another side note - do you know if it's possible to find out who clicked onto your site from a "pages from the UK search" in google analytics (or other stats packages?). If you could get that you could get a rough idea.
When people do a 'pages from the UK' search, the url string (and hence referrer to your site) gets &meta=cr%3DcountryUK%7CcountryGB appended to it. You could dig this out of your logs. Not sure google analytics lets you find it (I haven't seen it anywhere) but log-file based ones might...
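If anyone wants to try Will's suggestion, a rough sketch of the log digging might look like this. The log-line layout here is an assumption (Apache combined format, where the referrer is the second-to-last quoted string) - adjust for your own server's format:

```python
import re
from urllib.parse import unquote

# Decoded, the appended parameter reads: meta=cr=countryUK|countryGB
UK_MARKER = "cr=countryUK|countryGB"

def count_uk_restricted_referrals(log_lines):
    """Count hits whose Google referrer carries the
    'pages from the UK' country-restrict parameter."""
    count = 0
    for line in log_lines:
        # In a combined-format log line, the referrer is the
        # second-to-last quoted string (request, referrer, user-agent).
        quoted = re.findall(r'"([^"]*)"', line)
        if len(quoted) < 2:
            continue
        referrer = unquote(quoted[-2])
        if "google.co.uk" in referrer and UK_MARKER in referrer:
            count += 1
    return count
```

Feed it the lines of your access log and divide by total google.co.uk referrals to get the rough percentage Will and carps were wondering about.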
That's a cracking point Will. I might get one of data mining geeks to pull out some data...
Try the query site:yourdomain.com -www instead.
Thoughts?
I think this should be brought back as soon as possible, or I would have to ask what G is doing to get out of supplemental for
https://www.google.com/search?hl=en&q=site%3Awww.google.com+****+-asdf
Hasan, let me explain.
When a new article is published, a short description of that article is also posted on its relevant category page, so by Google's metrics those two pages look copied from each other, and supplemental status is applied to them.
Supplementary results let you see your domain's page structure and alert you to pages that have problems. The problems may arise from duplicate content, bad link design (the lower a page sits in the domain tree from the root, the less PR it gets from the top page), and proxy attacks.
I believe supplementary results should only be shown in a Google Webmaster Tools account. By showing supplementary results to the public, Google is spamming the internet and spamming your domain name.
Hi Everyone,
I'm fairly new to the SEO world, but this seems interesting. I don't really understand what these supplemental results are or how knowing about them benefits webmasters - can anyone explain for me?
thanks ;-)
Well said Rand. I completely agree. I have been a regular user of this query to help our sites achieve better site design, and to help them recognize (in some cases) low quality pages that needed more attention in terms of improved content too.
I just found, using the asdf tool, that my website has 26,000 supplemental pages, so I think this tool is very useful.
Now I need to remove those duplicate pages, which are really penalising my website.
If that is the case, I wouldn't prefer it. A snippet of information shouldn't put the whole page to waste; it should be fine-tuned.
A snippet can be blocked from getting indexed by bots, which keeps the main article original.
Amit, forums are a search engine trap box.
There are about 10 if not more duplicate URLs for each post. You have to redesign the whole URL structure if you want to fix your forum. Get the posts to 301 to topics, change the next and previous links to actual topic numbers. Get rid of all the post links. Redirect the highlight-topic URLs to the plain topic URLs…
Please take a look at my forum which I just finished redesigning.
https://www.travelinasia.net/forum/index.php
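For anyone wanting to try the redirect side of this, a hedged .htaccess sketch (Apache mod_rewrite is assumed; the rule only handles the simple case where phpBB's `highlight` parameter comes last, and mapping post ids to topic ids would need a small script, since mod_rewrite can't look them up):

```
# Strip phpBB's &highlight=... from topic URLs with a 301,
# so each topic has one canonical URL.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^(.+)&highlight=[^&]+$
RewriteRule ^forum/viewtopic\.php$ /forum/viewtopic.php?%1 [R=301,L]
```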
This thread should have been called "how to get rid of supplementary", not "bring back supplementary"..
:)
Hope you could check out this link - https://www.google.com/search?q=site:travelinasia.net&hl=en&start=10&sa=N
which clearly shows you still have many supplemental pages. This happens in phpBB forums because of the members page, groups.php, and mainly the search pages, hence search features should be completely avoided in forums.
Also, all the forum pages should have this kind of structure rather than the default [Forum Name - Topic Title Name]: Topic Title Name - Category Name.
BTW getting back to this thread - yes the tool should be brought back asap as its very much recommended and useful. :)
Amit
Don't worry about Google supplemental results; your website's ranking will not suffer because of this.
I am obviously visual because it was when I saw the illustrations that I really got it. :) Good post!
And now the supplemental index is no more.
Well, I don't really quite believe that, but the divide appears to have narrowed to a point where it serves as an extension, rather than an auxiliary.
I agree with you there, and personally think the supplemental query was a great way to understand what was wrong with pages. With the Google supplemental indicator gone, it has become increasingly difficult to identify the pitfalls of a website. I would like to add to the request to bring back the supplemental query and allow us to correct where we went wrong.
I use site:yoursite.com *** -asdf and it still works.
Aw goddamn it all! The stupid thing is that they still give this information away, you just have to use the "site:" command and scroll through till you see the 'supplemental results' thing coming up and count them by hand. So I'm not sure what they've achieved by doing this - other than making it harder for us to try and give searchers good quality pages.
Mind you - they *are* infallible ...
Well, I noticed with several of my sites that using the 'proper' method (looking at the supps from the general site: command) gave far more accurate results than the *** -asdf method. The second method often listed pages that weren't supplemental at all.
Shame about the supplemental query. The great thing about that query is that it helps identify areas of your site that may be wasting pagerank. Funneling pagerank can help ensure that the most relevant pages receive the recognition they warrant.
I can't help but think that the thing to fix is the supplemental index rather than the query that allows you to access it.
No supp index == no fixation on supplemental results.
I think that's not true, because a lot of sites still show supplemental results when you search on Google. Maybe these fluctuations are caused by a Google update (it began a week ago).
Sorry for possible mistakes, I'm russian :).
It's easy to find supplementals.
Use the various URLs in the search box
https://searchengineland.com
In order to show you the most relevant results, we have omitted some entries very similar to the 1 already displayed. If you like, you can repeat the search with the omitted results included.
Your search - https://www.searchengineland.com - did not match any documents.
Results 1 - 10 of about 830 for www.searchengineland.com
Results 1 - 10 of about 165,000 for searchengineland.com
You would think these guys could figure out how to resolve their problem via .htaccess ;->
or that Google would fix the issues themselves, but they won't, because they don't care. Organic results are the canvas that carries the true money maker: paid links - I mean, ads.
Omitted results and supplemental results are not the same thing. See the discussion started by Michael Martinez at this older post on SEOmoz: https://www.seomoz.org/blog/whiteboard-friday-supplementary-my-dear-watson
Rand also posted a correction to that Whiteboard Friday a week later.
Really interesting post, and one which does a great job of explaining to Matt (not that I think he really needs it explained) as to why most of us aren't nasty spammers.
And the cartoons of the sad & happy pages has made my day, whilst the final image (It's great when we can work together Googlebot) almost made me laugh out loud, reminding me as it does of those cartoons where it always ends with the characters talking about what they've learned from their (mis)adventures.
Excellent!
I agree with you, Ciaran.
Man, I noticed this the other day also, but thought I was just doing something wrong. Most of my sites aren't terribly large, but it has certainly helped me re-write title tags and descriptions in the past to walk the line between unique but topical to the page's content. And obviously revise the link structure a la the example that Rand laid out in the most excellent cartoon diagrams.
For sites without a lot of PageRank "playdoh" (Matt's own term from an SMX presentation) it can be incredibly important to know which of your pages G doesn't consider to be high enough quality to include in the main index.
(Somewhat off-topic: if 25% of search phrases are supposedly phrases G has never seen, isn't it in G's best interests to show more long tail pages that might better target those extreme tail phrases?)
At any rate, I really hope they add this functionality back into WC Tools.
Couldn't comment last night - this form was throwing errors.
The only issue with working backward or scrolling through a site search is the increasing tendency of Google to block the query with the 'your request seems like an automated response'. I'm seeing this more and more as I perform site searches. Could just be me I suppose...
Ugh. I found an older post on here relating to that query, and tried it on my site to find it doesn't work. Then I stumble back here, and find this post. That explains it.
My blog recently started going into supplemental index hell, and it's either because:
1) I was sending pages (with referrer data) just to bots, while end users got the same pages without the referrer data. Aka "cloaking". This is stupid on so many levels, but whatever.
or
2) I pointed two domains I own at the same exact blog. Perhaps it tripped Google's "duplicate content" checker? Who knows.
Ultimately, queries that returned my blog as #1 or #2 now don't even show up. I'm kinda ticked.
I rarely used it anyway...and like it was mentioned, you can still do it the long way if you really need the info.
Love the idea of adding to webmaster central.
FYI - The query is still working in Houston.
I agree. Nice post.
For the record, I feel the same way about trackbacks on this blog and others that don't have trackbacks turned on. I hope you will consider this line of thought as it encourages links and helps the issue discussed above.
David - we actually have custom blog software, hand-coded from the ground up, and we've never built in trackback handling... Perhaps that's something we can address in a few months, once our other development needs have been met.
Thanks for the tip, though.
My understanding was that Matt was aware of this already, but he thought that it had been "fixed" already, so it may have just brought it back to attention at the plex.
Regardless, it is truly sad to see this functionality go away, unless it wasn't truly reliable and they are working to implement something officially.
It certainly is a blow to webmaster-Google relations. Every site will probably end up with some pages in the SI, and that's disappointing to website owners, but understandable. But it's hard to see what is wrong with a tool that helps to isolate and identify those pages, allowing a website owner to look through the results to find the ones that really are wonderful and deserve some attention and love.
So Matt, as I'm sure you'll be dropping in to make some comments, help us to understand please.
If it results in improving the quality of content in the index, just seems like one more tool that allows website owners to help Google do what Google does. Sure, webmasters can figure out their SI results without this tool, but if you have a way to make the job easier, why not?
I would have thought their spam team is also getting on top of this problem from an algorithmic point of view - they do know about content in supplemental, after all. If they then discover new content that's a duplicate of content already in the (supplemental) index, they could treat it as the duplicate rather than promoting it above the supplemental content (even if it gathers more links and wouldn't normally have been penalised).
Great post related to supplemental pages.
And the cartoons do a good job of explaining a few things. Good points :)
Great post and cartoon sequence! However I think you have the Googlebot to human ratio out of whack... Bot is bigger than a small planet, and growing. Scarybot.
Nice post Rand...
I agree with everyone who mentioned the results weren't always perfect, but I fully support Matt McGee (and others) who pointed out that this doesn't make them useless.
Adding a supplemental results tool to Google's Webmaster Central would be great, but if Google don't want to play anymore, it could be a great opportunity for SEOmoz to develop yet another useful members tool.
I'm not a coder myself, but surely it wouldn't be too difficult to come up with a script that can find any results from the "site:" command that display "Supplemental Result" in their description and output them in a nice, easy-to-read interface.
You could even allow the reports to be saved so you can come back and check which pages have been removed from the supplemental results after giving them some link love.
Anyone else think this could be a good little SEOmoz tool to play with?
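As a proof of concept, the parsing half of such a script might look like the sketch below. Heavy assumptions here: the result markup (one `<div class="g">` per result, with the literal text "Supplemental Result" in the snippet) is my guess at the 2007-era pages, and automated scraping of results is against Google's terms, so treat this purely as an illustration of the idea:

```python
import re

def extract_supplemental_urls(results_html):
    """Given the raw HTML of one page of 'site:' results, return the
    result URLs whose block is tagged 'Supplemental Result'.
    Assumes one <div class="g"> per result containing an
    <a href="..."> link."""
    supplemental = []
    # Split the page into per-result blocks on the assumed wrapper div.
    for block in re.split(r'<div class="g">', results_html)[1:]:
        match = re.search(r'<a href="([^"]+)"', block)
        if match and "Supplemental Result" in block:
            supplemental.append(match.group(1))
    return supplemental
```

A full tool would fetch each page of results, feed it through this, and then diff the saved lists over time to show which pages left supplemental after some link love.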
Yeeeeah --- I can post again!!
Rand - thank you for the very informative post.
That was a great example you used about your NPR experiences.
Would you be able to expand on that scenario in a future post?
Thanks, mb
Amit, "Now i need to remove those duplicate pages which are really penalizing my website."
You should not say that, because Google will give you a -950. Matt C. says supplementary does not penalize a Website.
I have a few Websites and I disagree with Matt. I removed the supplementary pages with a robots.txt disallow, and a noindex, follow meta tag for odd dynamic pages.
I had 25,000 sup pages on one site; after one month they were all gone and the organic search results are much better.
I recommend disallowing product pages and leaving category pages so they can be strong in the main Google index. This is especially true for low PR Websites. For forums, do a noindex, follow for deleted topics.
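For anyone unsure what a setup like Igor describes looks like, a minimal sketch (the /products/ path is a hypothetical example, not from his sites):

```
# robots.txt - keep thin product pages from being crawled at all
User-agent: *
Disallow: /products/
```

And on pages you want crawled but not indexed (e.g. deleted forum topics), in the page's `<head>`:

```
<meta name="robots" content="noindex, follow">
```

Note the two are different tools: a robots.txt disallow stops crawling, while the meta tag lets the bot in but asks it not to index the page, still following its links.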
So if your Website already has a -950 disallowing supplementary pages can only resurrect your site...
Cannot die twice, can you?
Matt C. says supplementary does not penalize a Website.
@Can Matt show me some pages that are in supplemental and still rank at the top of the SERPs?
I have a few Websites and I disagree with Matt. I removed the supplementary with robots.txt disallow and noindex, follow meta tag for odd dynamic pages. I had 25,000 sup pages on one site; after one month they are all gone and the organic search results are much better. I recommend disallowing product pages and leaving category pages so they can be strong in the main Google index. This is especially true for low PR Websites.
@This is true, and it works the same way. I also had an auto blog with 20k supplemental pages, since all the content was copied; I tried to get tonnes of backlinks and changed the content structure a bit, and now all the pages are well indexed in Google and rank better.
Supplementals always arise in forums and blogs because of very similar body structure and placement of content.
All forums have the same kind of structure and placement of content, hence some of their pages end up in supplementals.
Looks like Google Inc. is hiding many things...
Google as we know it will not live for much longer. Wall Street will make sure of that..
We are on the road to Web 3.0 and the destination is not where any of us will be.
But there will always be the ALT net where it all started many years ago, when one professor wanted to talk to another professor via a data link..
Google is basically trying to keep webmasters busy with these things while moving their own research further :p, beating Yahoo.
Anil we are not children here...
Sorry! I can see this. I was just replying to Amit's post.
Thanks for the kind suggestion - I can see what a Geocities-hosted webmaster can suggest :p @ https://www.geocities.com/anilkrsingh1/
LMAO do you need sponsors to buy a domain name for you?
Thanks Amit, I have my own domain, https://www.anilkumarsingh.com. I don't need any sponsor from anybody. Having 7+ years' experience in the search industry, I understand all of you guys. Please forgive me if I have hurt you.
Anil, no one is making personal attacks on you!
If you had read my first comment - about supplementary results hurting a Website, but not telling Matt C. that - you would have realized not to make the comment that you made.
My post was a reply to Amit.
Amit, "Now i need to remove those duplicate pages which are really penalizing my website."
I am quoting myself: "You should not say that because Google will give you a -950. Matt C. says supplementary does not penalize a Website."
So why do you want to tell us something that Matt C. has already told us?
As for you having 7 years' experience around search engines, that is nothing spectacular.
I started programming in 1985 with Fortran, did IT consulting for 5 years, ran an import and distribution company in Japan for 12 years, and have spent 7 years as a Webmaster and developer.
And I still do not know jack shit, otherwise I would not be wasting my time on this board trying to learn what to do next...
We never know everything and always have something to learn from each other.
And if you think someone doing a personal attack on you, you are totally lost.
Igor
Anyway back to the topic before this place becomes a Zoo like GWHF...
Thanks Boss. Nice to hear that you want to learn from each other. I still respect you.
Anil, nice to hear you have 7 years of search industry experience.
What did you do before that? Use a seeing-eye dog?
Experience is not always important in this industry :) Results are the main thing - I can say this because I am just 2 years into this industry and have websites which rank in the top 10 for top terms like SEO company, SEO services, web directory, business directory, health tips etc., just as an example.
Hence age is not always important ;)
Thanks - this debate will never end if I use the same language as you are using, so I apologize. Your point is well taken; I don't want to continue this debate. SEOmoz is a good platform to discuss search engine industry updates. Please don't use this website for personal comments. You are making personal attacks. Anyway, please stop this.