So last week Eric Enge interviewed everybody's favorite Spam Cop, Matt Cutts (not to be confused with everybody's favorite Axe Cop), about all sorts of juicy Googleyness. You can read the whole enchilada over at Stone Temple, or view an illustrated version Rand put together (I recommend you do both, it's really good material).
The interview shed a lot of light but also inspired some confusion and questions, especially about how much of a site Big G will or won't crawl. If you've got a big site, say an e-commerce site with lots of flexible navigation and categorization, this could be a big problem for you. Fear not! This week's Whiteboard Friday will show you a clean, simple way to allow for both great usability for your visitors and compact, straightforward crawlability for the engines.
You might also want to check out this older, but relevant post on diagrams for understanding crawl priority.
Great WBF Rand.
I loved that you used shoes as an example...you know we've worked with a few of those, though not all of them -- yet.
Actually though, footwear sites are one of the best examples of the complexities here...gender, style, color, size, etc., though faceted/guided navigation can be challenging for any ecommerce site.
I'm sure you'd agree and made mention of it in general, but I'd take the "sorting" functions out of the mix, either firing them with JavaScript or at least using the canonical link element to point to the default view. This functionality is obviously great for humans, but the bots couldn't care less; and it really falls outside of guided navigation.
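To spell that out (the URL and markup here are just a sketch of the idea, not anything from the video): a sorted view of a category page can point its canonical link element back at the default, unsorted view.

```html
<!-- In the <head> of example.com/womens/boots?sort=price_asc: -->
<link rel="canonical" href="http://example.com/womens/boots" />
```

The sorted view stays fully usable for humans, but the engines are told to consolidate everything onto the default URL.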
The hard question site owners have to ask is "are people really searching using a particular facet?" And if so, "are they using it in a way that matches up with the guided navigation?"
For example, if they do actually search for a specific shoe size, does your guided navigation provide that shoe size as a facet, or is it a size range? Even if it is that size, we have to weigh whether the overall additional URL bloat adds or detracts from performance. These aren't always cut and dried answers, though some facets often bubble up to the top as being valuable.
Something else to watch out for is facet order within the URL construct. Fortunately, I think some of the 3rd party providers of faceted navigation have gotten beat up enough by SEOs that they've made great strides in fixing this, but the issue still exists and may exist in a CMS/cart that has this built in.
So for Rand's example, I first select "brown" as the color, then "cowboy" as the style. Let's say the URL I get is something like one of the following:
Okay, but what do we get if we first select "cowboy" then "brown?" Hopefully we get something like the above. If we get something like the following, then I'd be very hesitant to allow any of the guided navigation to be crawlable:

example.com/womens/cowboy/brown/
example.com/womens/cowboy-brown/
example.com/womens/display?style=cowboy&color=brown
example.com/womens/page?abcd+12345

At this point, dreaded visions of Stats classes come to mind, calculating complex combinations of combinations and permutations... especially since when this comes up, sorting functions are probably in the mix as well as several pages of pagination.
And in that case, even what appears to be a relatively small site could grow out of control with near-infinite URL bloat, which gets us back to the concern of site quality being downgraded when the engines see an extremely large quantity of what appears to be duplication.
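Just to put rough numbers on that "near-infinite URL bloat" (the facet counts below are made up purely for illustration):

```python
from math import prod

# Hypothetical facet types and how many values each one has.
facet_values = {"gender": 2, "style": 6, "color": 12, "size": 15}

# Each facet is optional, so it contributes (values + 1) choices --
# the +1 being "not selected." The filter states multiply together:
filter_states = prod(n + 1 for n in facet_values.values())
print(filter_states)  # 4368 distinct filter combinations

# Layer on, say, 4 sort orders and an average of 5 paginated pages,
# and one modest category spawns a small city of crawlable URLs:
print(filter_states * 4 * 5)  # 87360
```

And that's before you account for facet *ordering* in the URL, which can multiply things further still.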
That's a great point - and it's why using something like cookies or session IDs for breadcrumbs can be fine, but you need to affix the URLs to a permanent, single structure. Even if you're using rel=canonical, for very large sites you can end up wasting a ton of crawl bandwidth.
Hey identity:
I see that you've got more than enough here to make a full YOUmoz post. I for one would welcome it gladly.
Awesome idea. However, what happens when your faceted navigation allows you to select more than one 'flavor' of product? For instance, our faceted navigation allows me to go to the t-shirts department and, out of all colors, show me black AND white t-shirts. I love this kind of navigation (for instance, out of all t-shirts I can select to only show the three brands that I love), but it seems impossible to create URLs for these kinds of facets!
Good WBF. I think we are in the group of sites that might have to think about faceted nav. We have over 60k products belonging to multiple product types and sub types. Each product has 10+ variable attributes that are used to refine selection, and each product has several pages on our website.
Whilst your recommendations sound like a good solution, something we have always worried about is presenting something to Google that is different from what we present to the user. It was my expectation that if they detect that they are seeing something different to our visitors, we would be penalised in the SERPs. Isn't this what you propose with techniques like differentiated content based on visitor cookies?
Is it safe to engage with techniques that show Google something different to our human visitors?
Thanks
ed
If you were to differentiate your navigation by requiring cookies, log in or by utilizing ajax, etc. the theory is that Google wouldn't ever see it to know it was different as they can't access it.
AFAIK Google can't "do" cookies, but I'm positive someone with more experience will jump in and correct me if I'm wrong.
Even if Google doesn't do cookies at the moment, technically I'm sure they can, and who is to say that they won't in the future? Likewise with AJAX, I guess a login is really the only safe way to ensure Google can't see the "other" content, if that's a concern.
Good point :)
@Rand
Great article.
Honestly, though, I can't agree with you on everything here. It's really a function of the number of facet values, and the values are "sparse-factorial."
A medium-sized number of products with a large number of facet values across a large number of facet-types is still a terrible problem, minus the pagination. Magento would do well to pay attention to that, for example.
You're also ignoring a bit of history. There are many facets that are debatably categories - we know this from viewing older sites from before the advent of facets, which built a single hierarchy that assumed the order in which filters were applied, in order to capture long-tail traffic.
Of course that only takes you so far ... and faceted navigation was invented to fix that problem. That's no reason to exclude the long-tail, though IMO.
Ideally, those pages should be included in a search index, and dumping people on product pages is pretty subpar with regard to navigation. Both users and Google will prefer a page dedicated to that set of products that people are searching for.
Blanket exclusion is not right. You just can't go too hog-wild: then it's a giant spider trap. B&H does this, while J&R opens up a spider trap. Neither is ideal, but this is still shaking out. Both seem to be using Endeca.
Ironically, I've just plugged through these things, was already writing an article ... and there should be an article on Search Marketing Standard coming up this week where I respectfully disagree with you on a few of these things. The 2nd edition of my book will also cover the math and some optimization methods that minimize duplication while preserving useful permutations.
Probably the best solution is to hide certain facet-types entirely, not allow multiple-value facet permutations (unions -- most sites don't really support this anyway) to be indexed, and build such URLs so that they're excluded via robots.txt from the start.
Regards,
Jaimie Sirovich
SEO Egghead, Inc.
Author of Professional Search Engine Optimization with PHP & ASP.NET
Ahh, good point! So where you do have multiple-value facet permutations (for instance, white AND black t-shirts), you just don't index those but index the single-value ones instead. Right?
How would you go about this with URLs though? For instance, you're in the women's leather boots department:
niceshoes.com/womens/boots/leather
You then select the color white:
niceshoes.com/womens/boots/leather/white
And then, you also select the color black. Now what should happen? You could keep the URL like this, but this seems very hard to build, technically (though I'm not very technical, so what do I know :P). You could use a hash tag for subsequent colors, so that possible URLs would be:
niceshoes.com/womens/boots/leather/cowboy/white#black&urban
Or is this totally wrong?
How about this...
The user is on the page:
niceshoes.com/womens/boots/leather
The page has a form to allow the user to refine their results. The form consists of three checkboxes - black, white & red.
The user checks black & red and submits the form.
The user is taken to a page with the URL:
niceshoes.com/womens/boots/leather/filtered.php?color=black&color=red
The bot will never (naturally) get to that page as it is not going to try every combination of the checkboxes and/or submit the form (is it?)
However, if that is a key page for your site (ie lots of your customers are searching for it) you could link to it from elsewhere in a way that gives it lots of link juice.
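A sketch of what that form might look like in markup (the path and field names are just the hypothetical ones from this thread) -- since crawlers don't tick checkboxes and submit forms, the filtered URLs stay undiscovered unless you deliberately link to them:

```html
<form action="/womens/boots/leather/filtered.php" method="get">
  <label><input type="checkbox" name="color" value="black"> Black</label>
  <label><input type="checkbox" name="color" value="white"> White</label>
  <label><input type="checkbox" name="color" value="red"> Red</label>
  <input type="submit" value="Refine results">
</form>
```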
The bot will not submit the form, that's correct.
But that's exactly the thing. Your solution works fine if you don't want the color indexed as a page. The same when you just use AJAX for the faceted navigation, while the URL always stays at /womens/boots/leather. Either way, Googlebot will not be able to crawl, and therefore not index that specific subset/page.
What happens though, if you DO want those colors indexed? If you want to rank for 'white leather women's boots'? You'd want the URL /womens/boots/leather/white indexed, and /womens/boots/leather/black as well. I just wonder how you're going to implement that in such a way that a single-value facet IS indexed as a separate page, but multiple-value facet permutations aren't.
Sorry, I misunderstood your problem. Looking back to your original question... "And then, you also select the color black. Now what should happen? You could keep the URL like this, but this seems very hard to build, technically (though I'm not very technical, so what do I know :P)." ...I don't think that is very hard to build - this should be straightforward with either a form or AJAX.
How about stuffing a string that means nothing to the app, like
?color=red&size=9&donotindexmeplease=1
then placing a wildcard rule in robots.txt to teach Google to read ? :) Only do it if facet_count >= 2.
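For anyone wanting to see that spelled out, the rule might look something like this (the parameter name is obviously made up, and Google does honor the * wildcard in robots.txt):

```
User-agent: *
# Block any URL carrying the do-not-index flag
Disallow: /*donotindexmeplease=1
```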
That might work.
You don't want to canonicalize it. You want Google not to even bother to see it. It would be nice if Rand chimed in. I'll be posting ideas in a series on facets in Search Marketing Standard over the next few months.
That was exactly my point. There are some facets that are really desirable and relevant to search. Just excluding them blindly is potentially ignoring the long tail.
I would give up on nice SEF URLs as well. You're going to need an ID to make it fast enough on large databases ;/ I know. I like pretty URLs too :)
Oh well.
Another nice whiteboard friday :)
That's great, thanks for the whiteboard and for sharing it with us!
It's great for optimizing your crawl budget by only letting the search engines see the pages that really need indexing and also pagerank will flow nicely down the tree.
But... it is now almost impossible to try to rank for "filtered" pages on a category level. For instance, if the color is a filter that is done through Ajax, hashtag, etc. Then you can't rank for "blue leather boots". Ideally you would want to optimize those on a category level and not on a product level.
Trbl - that's absolutely right. You only ever want to use AJAX type interfaces when you know those pages aren't things you need to rank for (like size 9 blue leather boots rather than just blue leather boots).
You can use search query volume (both from your own analytics and from service like Google AdWords or MSN AdCenter) to figure out what terms/phrases deserve their own indexable pages and which to ignore.
Great WBF Rand. You've turned AJAX from a SEO liability into a valuable asset.
Rand,
Excellent WBF!
What if the attributes you are blocking are 5 levels deep (sitename.com/jewelry/men jewelry/rings/ By Metal: silver | By Stone: diamond)? The long-tail keywords (e.g., silver diamond rings) are too important to block.
If your site has a lot of departments (jewelry, watches, apparel), then the site structure becomes enormously wide and deep.
sitename.com/jewelry/men jewelry/rings/silver/diamonds/
sitename.com/watches/men watches/invicta/automatic/black dial/
again, superb work..thank you!
Igor
bump ;)
We are working on a client's e-commerce clothing site that has the following facets:
We're having real issues defining the URL structure for the site as ideally we'd like to target many of the different permutations (g-star + pants, womens + jeans, mens g-star, etc.)
Conceptually it seems very simple and I'm sure we're having the exact same problems as many other sites, but I am yet to find a definitive answer.
Anyone?
Forget the URL structure as a problem. What you need to do is highlight the combination of facets that you want to show the bots, and deemphasize those that you do not. Call that 'robot scent.'
The problem is that, unlike humans, robots do not have intent. So you have to feed them intent. You can figure out which combos are important using collective intelligence or rules. Either way, facets are a giant spider trap without some attention, like Rand says.
So what is the verdict? Would it be possible, without penalties, to remove the filtering and sorting links server-side when the user agent indicates that the visitor is Googlebot?
That's not the verdict. The verdict is that you can use AJAX or hash signs to prevent Google from indexing all different kinds of pages that faceted navigation can produce. Or you could only show faceted navigation to those with cookies enabled (I'm not sure what the numbers are on this, so this may not be the best way to go).
Combined with this, make sure that you have a solid navigation structure, with the right number of levels of navigation (womens > boots > leather > cowboy > etc) to suit the search demands of your target audience.
Or at least, that's what I made out of this very helpful WBF :)
I think I got the same thing out of the video - I just wondered if the simple solution, to hide the links from Googlebot, would get me in trouble :)
If it WILL get you in trouble is hard to say, but I think the difference with the suggested solution here is that you would actually show Googlebot (specifically Googlebot) something you're not showing all users (or actually vice versa here), whereas with AJAX/cookies you would be distinguishing different kinds of users... and include Googlebot in the process.
We have exactly this problem and saw two possible solution.
1. Use canonical tags to try to group filtered pages together.
2. What this Video suggests, to let Google see a different thing than the user.
We had the same worry as edralph, that Google would penalise differentiated content - what some call "cloaking".
We instead tried the canonical approach, and now after 3 weeks we have seen some indication of a positive effect. The reindexing by Google seems to take a long time though, so we will have to wait more before we know for sure.
Edit:
About the problem identity mentioned: we have ordered the filters in the canonical tag to, for example, make sure that colour is always written before style in the URLs:
style=cowboy&color=brown -> color=brown&style=cowboy
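That reordering can be done mechanically before the canonical URL is written out - a rough sketch (the facet names are just the ones from this thread):

```python
from urllib.parse import parse_qsl, urlencode

# The site's one fixed facet ordering: color always precedes style.
FACET_ORDER = ["color", "style", "size"]

def canonical_query(query):
    """Rebuild a filter query string in the fixed facet order."""
    params = dict(parse_qsl(query))
    ordered = [(name, params[name]) for name in FACET_ORDER if name in params]
    return urlencode(ordered)

# Both click orders collapse to one canonical URL:
print(canonical_query("style=cowboy&color=brown"))  # color=brown&style=cowboy
print(canonical_query("color=brown&style=cowboy"))  # color=brown&style=cowboy
```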
Canonical can be a great solution a lot of the time, but as Matt Cutts noted in his interview last week, it does spend crawl priority, so you've got to make a decision about whether that's OK and the engines will reach all your pages regardless, or whether you want to control access to those links through some other system. It's really not cloaking - you're not trying to do anything untoward; you're doing the same thing you'd do with canonical tags or robots.txt blocking, you're just doing it in a way that doesn't waste the engine's crawl time.
Rand
Thank you for this WBF. I asked a Google engineer about this - whether we can do like Amazon does - and the answer I got was that one should be careful when showing different things to humans and the bots. A very vague answer :(
Great video and highlights why SEO is so important at the beginning of the design process
Is this video super low-res or is just my computer?
Looks solid over here.... Anyone else having trouble?
For me, too. It seems to be a low-res video. Why not via Vimeo?
For your product level keyword research, don't forget to use your internal search analytics once the site is live and getting traffic. There's a goldmine of useful data here.
This is a perfect example of why the web dev and SEO teams need to work together from the very beginning and sit in on each other's brainstorm sessions.
Excellent explanation of faceted navigation. Some guys were talking about it on a forum and I was lost, now I understand, sort of! At least, I realise it is not something I need to worry about any further!
I don't know... good info, but at the same time there was no nitty-gritty. Could have used a lot more "this is how you should handle things" - like those SEO pages that should be built out: how should that be done and incorporated into navigation? I think the best place to learn about faceted nav and SEO is to look at what the pros are doing and how they rank.
New Vimeo layout looks really clean.
Hi! Actually we use Delve Networks now for video hosting.
what are the advantages and disadvantages of using a)ajax or b)iframes (links to the parent frame) to hide useless links to the search engines?
How about a solution where some facets (the key categories) are shown as standard links and other facets are hidden inside iframes in order to prevent a useless job to the bots?
it is advisable to hide duplicated content within iframes?
I'd definitely go with Ajax (JS) over iframes in a heartbeat. SEs are getting wise to these <iframe> cloaking techniques used by darker-hat SEOs.
Ajax can also be good for users as it will let them navigate through all the "sub-pages" without any reload.
Vote #2 for AJAX, or you can also use a &noindex=1 parameter to hide those links that are superfluous.
Great video. We're doing the seo on a large watch site here in the UK where I work, so this is very helpful.
Thank you for this new demonstration !
My question :
What happens if the whole structure of my website is based on the tag "link canonical"?
For example: my current CMS operates as follows in the case of URL rewriting:
Page (front): www.mywebsite.com/epages/xxxx.sf/us_US/?ObjectPath=/Shops/xxxx/Categories/Woman_Shoes
=> Tells the bot in the HTML: <link rel="canonical" href="https://www.mywebsite.com/Woman-shoes">
The "https://www.mywebsite.com/Woman-shoes" URL is well indexed by Google, but I'm sure I lose all the benefits of the structure of my site... because the road is too long for Google (and unnecessary, but I have no choice if I want my URLs rewritten).
Is it easier to let Google index the page www.mywebsite.com/epages/xxxx.sf/us_US/?ObjectPath=/Shops/xxxx/Categories/Woman_Shoes? What is your opinion?
Thank you
David
We were deep in the same problem, and it took me about 3 months to get through every aspect. Since we had to release our shop, we managed part of the solution with nofollows and the robots.txt.
So here is our plan; any tips would be really helpful:
Magento filter navigation - our own plan:
- We had one programmer write a function which rewrites all Magento URLs to static ones, so every page has its own plain URL.
- We had another programmer build us some options for attribute pages. He programmed backend options so we have meta tags, pictures and text options; that way we can turn one option into a content page.
- We will now program an option that will let us define whether a non-content option page is reloaded via Ajax. That means you simply choose an attribute option and it simply stays at the same URL.
- Also, and I am not that sure about it: Google just released a new video in which they say the 100-links-per-page rule is gone. So I thought it would be best to set a canonical link from the pagination section of a selection to the show-all page, so Google simply gets all the links.
Good advice. Personally, I've used the canonical tag for my customers for URLs which display a color or size parameter, but inside sitemaps I've only kept canonical URLs. I think you can solve a lot of crawl problems this way.
Another awesome Whiteboard Friday :)
Rand,
First let me say that I am a big fan of the overall WB Friday series.
It seems like what you are proposing is akin to cloaking because you are presenting users with a different view/experience of the site than what you are giving the spiders.
It seems that the shoe company wants to position itself with sliced-and-diced views for each possible keyword set.
For example, they may want to position for each of these phrases... all of which are at "category" levels and not product levels:
Leather Urban Boots (there are 10 of this type)
Leather Spike Heel Boots (there are 5 of this type, 3 of which are Urban and 2 are Country)
Red Leather Boots (They have 20 types of these)
where they have perhaps 10 varieties (3 with tall heels, 4 with stub heels, 3 with no heel).
From what you recommend it appears that they will just have to choose the most popular category to try to position categories and then hope the rest positions at the product level.
Wouldn't a tag-based approach solve a lot of these problems?
This is one of the most complex areas to address in site architecture and SEO, one that can only be slightly touched on in a WBF, though Rand did a great job trying to at least get people thinking about it.
There are many ways of addressing it, and often many tactics may be applied.
The approaches that Rand mentioned aren't based on cloaking. Ideally it is creating a base crawl path and URL construct based on an ideal ordering of cats/subcats and some facets if they make sense. That ordering is meant to avoid the "facet ordering" issue I illustrated above.
It may be desirable to simply keep the bots out of the guided navigation entirely, or if that isn't possible, relying on the canonical link element to associate the variations with the parent cat or subcat.
One approach that Rand mentioned would be to use cookies for keeping track of the selected facets... but this still presents the same information to bots as to users who refuse cookies. If this is the purpose, the engines shouldn't have any issues with it.
Thanks Rand.
I particularly liked this WBF. It's very pertinent to me at the moment, as I am writing a redesign specification for our new website, a lot of which carries multiple navigation paths to vacancies. Vacancy pages on job boards can be complex. On the one hand you need to let them index; on the other hand you can't just let them run wild, otherwise you end up with multiple URLs serving the same results. This is especially true if you run rollbacks on refinement results if none are available. For this, I find a noindex helps.
If you are returning profession-based vacancies, but all those vacancies happen to be in one location and your architecture links to vacancies based on location, then again you find problems. Adding unique content to each level of your architecture is a great way around this, although it can be time-consuming, particularly on large sites.
The rub is, webmasters need to get pages indexed, but how far do you go before tripping spam filters? My advice is to get creative with your copy and bring in algos to customise results - e.g., if doing location results, inject location info into the description, with the use of switch/case dynamic title/copy scripts and so on and so forth.
Perfectly timed. That's just what we needed for the launch of a customer's new ecommerce website.
Glad you made our day. Have a good weekend, and thanks for the great video.
excellent wbf as always, I especially appreciated the translation of sneakers lol!
What would be the best suggestion for a site that already has 1000-2000 pages to get the nav better?
is it a case of rebuilding offline first? or doing it cat by cat online?
For a site that small (or really anything under 100K pages), I'm not sure that I'd worry about these types of techniques. Earning good links, producing unique content, and getting site architecture and XML sitemaps right should get you pretty close to full indexation. It's when you get into the super-high page counts, with lots of faceted ways to nav into detailed products, that you encounter this need.
Thanks for the sneakers/trainer translation Rand. Great Post TFI Friday!
Nice WBF!
I think this wonderfully underlines what some people are saying to give some counterweight to SEO in general: sometimes the 'simple' way of doing things (which is in this case creating a great architecture) is better than over-SEO-ing those elements.
Gives a nice perspective and some wonderful information as well. As dannydenhard mentioned: this is why SEO has an important role at the start of the design process.
The canonical tag works for me. I have a small ecommerce site, not a very big product selection but it is faceted navigation with dynamic url based sorting functions. I've used the canonical tag from the beginning and do not see any URL bloat at all (based on what I see in G, Yahoo/Bing and their respective webmaster tools).
Also for those dealing with dynamic URLs, keep in mind that Google Webmaster and Yahoo SiteExplorer have a function that lets you specify dynamic URL elements that are to be ignored. I currently only use this to ignore the session ID though, the canonical tag alone seems to keep the URLs straight.
Taking on a similar project and glad to see we're headed in the right direction! Thanks!
I'm not sure that I understand what the supposed solution is here... I do understand that you could create a more 'straightforward' site architecture instead of faceted navigation, but how exactly is this architecture built/presented to SEs? Is this through a sitemap? Or by adding a bunch of static HTML links on (sub)category pages to product pages that contain keywords which appear to have some search volume? Or..??
I feel I'm missing the point here. Thanks for any clarification :)
@Scott - I missed the axe cop reference the first time round, so I just followed the link.
Thanks for the loss of 30 minutes pal! That is one bizarre yet funny comic series. First time I'd heard of it (I lead a sheltered life)
Great vid.
I'm currently building a webshop based on Magento, which has a lot of great functionality built in for this layered/faceted navigation. I've been struggling to figure out the best way to limit those extra URLs, since using those functions easily pushes essentially the same content to 20-30 different URLs.
I used to think nofollow would be an easy fix, with one shortcoming in the event that someone from the outside linked to one of these URLs anyway. But with nofollow being confirmed as a technique that still dilutes link equity, I don't really consider that usable anymore.
Is the canonical tag any good in this case? I mean, in essence it's the same content, but it's not ordered or presented in the same way...
Canonical link element could be a good approach.
Even before that, since you are building out the site now, I'd recommend reviewing the category and subcategory architecture as a precursor to guided navigation. Try to refine these base classifications in a way that reduces the number of products that go into any one category/subcategory from the start.
Then if you can default to all products or a higher number of products per page, you can dramatically reduce the level of pagination.
Keep the basic sorting functions out of the mix entirely as well.
Do all that and you go a long way to minimize URL bloat and duplicate content, while making the guided elements, if any, that you do want indexed even more valuable.
I'd actually be careful with rel=canonical, too, especially if you're talking about thousands or more pages. The problem, as Matt noted in his interview with Eric, is that Google still has to spend crawl priority on rel=canonical pages. They're potentially better than nofollow (because supposedly Google will pass the PageRank/Link Juice from them to the original document), but not a perfect solution.
If you really do have views/pages/points you don't want the search engines accessing, something like an AJAX interface or showing that nav only to logged-in/cookied users may make some sense.
Just remember we're talking about a very specific technique here. 95% of the time, sites don't need to worry about it - it's really only for cases where getting full indexation is very challenging and there are an exceptionally high number of pages that need indexing (and many that don't).
Agree...it is a nice tool, but rel=canonical isn't the end-all beat-all, especially since it serves only as a recommendation to begin with. It's still up to the engines to determine how they use that recommendation, and it still impacts crawl equity.
And personally, I still think Matt downplayed the crawl equity concerns. I understand why, but it is easy to swing the pendulum too far the other direction. I've seen too many sites with not just 10x duplication, but 30x, 100x or more duplication.
The idea that it is better to let Google crawl everything and that they "might" consolidate duplication and PageRank to one canonical URL (even w/o the canonical link element) is great ... in theory.
AFAIK rel canonical is a solution to the multiple category problem. That is, if you have to provide multiple breadcrumbs to the same product, and you need that data "serialized" into the URL, it's the solution.
Especially if users might link to such pages, rel=canonical is a Godsend.
However, if the odds are low that people will link to it, exclude the URL from the get-go, especially if it looks like a spider trap, as facets do to anyone who's looked at n! :)
Hi Rand... I think that your incidental AJAX suggestion could be a really good one, and is essentially what the dev of a client of mine and I finally used for a Booking.com kind of site.
But we arrived at your same solution from the opposite track, as the website was at first plagued by AJAX use. In order to solve the SEO problems it was having, we finally worked out a principal schematic hierarchy (categories and subcategories) that was visible both to bots and users; then all the possible search splits (for instance: sport vacations, family hotels, vacations with pets...) were solved with AJAX calls.
Finally, I find very right your insistence that "normal-sized" websites don't have to worry about faceted navigation, as every time Matt or Google talks, some sort of panic reaction (the all-in-one-breath phrase "Gosh, I've got to check my sites because of this news from Google") makes any SEO's little heart beat faster. In that case, I suppose that canonicalization is the best thing to do.
Thanks for both your comic kind of post and this WBF.
Most Magento implementations I've seen fall short on exactly this.
AFAIK rel="nofollow" isn't the right move here, nor is canonicalization. The pages, esp. pagination pages, are not the same. I wish Google would clarify what rel=canonical is *really* for, but to me it's safest to consider it a silent 301.
I'd consider it for almost-identical pages -- and that's it. Page 2 and a different ORDER BY aren't identical to any algorithm Google could compute reliably in a reasonable amount of time. They may even be 100% different products. They're related, but not the same by a longshot.
You should probably consider excluding some facets via a parameter and a rule in robots.txt. You might want to exclude combinations after N=2 or 3 entirely, etc.
The best solutions will shake out after SEMs think about it for awhile . . .
Oh man Rand, don't call anchors "hashtags". We don't need people thinking anchors are in any way related to Twitter.
This is a very useful article for me. I have a job search engine website, which has more than 300,000 jobs, and it is faceted by:
- 8 job types
- 35 categories
- 4,000+ locations
- 5,000+ companies
- 5,000+ popular job titles
I find it's very hard to get search engine indexed properly and we often have problem with Google. We use canonical URL.
If you could suggest a solution for navigation, that would be great. Any comment for my site will be much appreciated.
My website is JobHits.net
Many thanks.
This is the type of question that really makes a PRO membership worth the value. You are unlikely to get a free answer on something like this in this area; however, you can most definitely get a very detailed answer by using up your question credits with your PRO membership... give it a shot. Best 80 bucks you'll spend in SEO.
Hi SEORM,
Thanks for your comment, do we get this type of advice for Pro membership?
Hey Antharas, with PRO membership, in addition to more tool options and access to the PRO webinars (like the excellent webinar yesterday that covered multiple site reviews), you have the ability to ask two questions per month of the staff and the associates.
Your question and answer will then be made public to the rest of the PRO membership which is another great place to learn, reading others questions and the answers to them.
If you have a sensitive question, you can ask it privately but it uses up both your question credits for the month.
Thanks, I think I am ready for PRO :)
To me, the whiteboard is barely readable. Looks like a lighting issue.