Google's got a blog post out today (and SELand covers it) about how they now recommend that webmasters and site owners DO NOT rewrite their ugly dynamic URLs to be clean and static. What's the reasoning behind this?
We've come across many webmasters who, like our friend, believed that static or static-looking URLs were an advantage for indexing and ranking their sites. This is based on the presumption that search engines have issues with crawling and analyzing URLs that include session IDs or source trackers. However, as a matter of fact, we at Google have made some progress in both areas. While static URLs might have a slight advantage in terms of clickthrough rates because users can easily read the urls, the decision to use database-driven websites does not imply a significant disadvantage in terms of indexing and ranking. Providing search engines with dynamic URLs should be favored over hiding parameters to make them look static.
The fundamental problem here is that Google is thinking about this from a completely different perspective than marketers do. It's not that they're wrong or lying or creating misinformation; it's just that they're looking out for their own best interests - effectively and efficiently crawling the web and serving up accurate data about the contents of pages. When URL rewrites go awry, it can screw up Google's ability to return the results their users want (and, as content publishers, the results you want).
However, the fact that some developers incorrectly create rewrite rules does not mean that sticking with dynamic parameters is now the "best practice." It simply means you have to do it right.
Let's go over the list of pros and cons for static vs. dynamic URLs and see what's really changed:
Pros of Dynamic URLs
- Umm... they're usually longer?
- Google (1 of the 4 major search engines) says they can effectively crawl and index them
Cons of Dynamic URLs
- Lower click-through rate in the search results, in emails, and on forums/blogs where they're cut and pasted
- A greater chance of cutting off the end of the URL, resulting in a 404 or other error when copying/pasting
- Lower keyword relevance and keyword prominence
- Nearly impossible to write down manually and share on a business card or read over the phone to a person
- Challenging (if not impossible) to manually remember
- Does not typically create an accurate expectation of what the user will see prior to reaching the page
- Not usable in branding or print campaigns
- Won't typically carry optimized anchor text when used as the link text (which happens frequently due to copying & pasting)
Pros of Static URLs (mostly the opposites of the above)
- Higher click-through rates in the SERPs, emails, web pages, etc.
- Higher keyword prominence and relevancy
- Easier to copy, paste and share on or offline
- Easy to remember and thus, usable in branding and offline media
- Creates an accurate expectation from users of what they're about to see on the page
- Can be made to contain good anchor text to help the page rank higher when linked-to directly in URL format
- All 4 of the major search engines (and plenty of minor engines) generally handle static URLs more easily than dynamic ones, particularly if there are multiple parameters
Cons of Static URLs
- You might mess up the rewriting process, in which case your users and search engines will struggle to find content properly on your site.
So - bottom line - dynamic URLs don't afford you the same opportunity for search engine rankings, usability or portability that rewritten, keyword-optimized URLs do. Just because one of the engines doesn't have trouble crawling them doesn't mean it's any less critical to continue optimizing this element of a site's structure.
If you buy into Google's argument that because rewriting URLs can occasionally cause problems (never mind that we've done it at SEOmoz and with our clients dozens of times without issues), you shouldn't do it at all, you're setting yourself up for something significantly less than search engine "optimization." I'd be tempted to call it conservative SEO, but it's not really even that. It's the mindset that living in fear of change rather than pursuing the best course of action is the better choice, and none of us who do SEO for a living should support that mentality.
Couldn't agree more.
In any event, in most systems these days, static-style URLs are the default:
* almost every PHP framework
* Ruby on Rails
* Django
One big problem is that people rarely do these redirects/rewrites well.
In many cases, they create as much potential Duplicate Content as they solve.
Take a simple case with two parameters, with a three and a seven digit value, and some common extra fixes that are either necessary or highly desirable:
# Specify acceptable index/root file. You could have a static
# index.html or allow index.php without parameters for root:
DirectoryIndex index.html index.php
# Redirect to remove trailing period or comma from URL request
# with parameters, such as from forum with autolink, and force
# www to always be in the URL:
RewriteCond %{QUERY_STRING} ^(([^&]+&)*)(\.|,)$
RewriteRule (.*) https://www.example.com/$1?%1 [R=301,L]
# Redirect to remove trailing period or comma from URL request
# with path, such as from forum with autolink, and force www:
RewriteCond %{HTTP_HOST} ^(www\.)?example\.com [NC]
RewriteRule ^(([^/]+/)*)(\.|,)$ https://www.example.com/$1 [R=301,L]
# Redirect two-parameter-based index.php|html? or / URL request
# (with parameters in any order) to folder-based URL format, and
# force www to always be in URL:
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /(index\.(php|html?))?(\?[^\ ]*)\ HTTP/ [NC]
RewriteCond %{QUERY_STRING} &?cat=([0-9]{3})&?
RewriteCond %1>%{QUERY_STRING} ^([^>]+)>([^&]*&)*art=([0-9]{7})&?
RewriteRule ^(index\.(php|html?))?$ https://www.example.com/%1/%3? [R=301,L]
# Force all remaining requests for named index files to drop
# the index file filename, and force www:
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/]*/)*index\.(html?|php)(\?[^\ ]*)?\ HTTP/
RewriteRule ^(([^/]*/)*)index\.(html?|php)$ https://www.example.com/$1 [R=301,L]
# General rule to force all non-www URLs to be www URLs.
# This rule must be the last one of the redirects:
RewriteCond %{HTTP_HOST} ^example\.com [NC]
RewriteRule (.*) https://www.example.com/$1 [R=301,L]
# Rewrite such that stray parameters or values on any index.html?
# or any index.php URL or on / URL request always fail to the
# 404 page:
RewriteCond %{QUERY_STRING} .
RewriteRule ^(index\.(php|html?))?$ /this.page.does.not.exist [L]
# Rewrite URL request: www.example.com/345/1234567 to internal
# path: /index.php?cat=345&art=1234567 to serve content:
RewriteRule ^([0-9]{3})/([0-9]{7})$ /index.php?cat=$1&art=$2 [L]
The website now responds directly with content as "200 OK" only for paths like / and /345/1234567; all other formats either redirect or fail to the 404 error page if a real file does not exist. It does this without having to do server-intensive !-d and !-f checks on the filesystem.
The site root can be implemented either as a static index.html page or by using index.php without any parameters. The script will also need to check that parameters are present, and that values are acceptable, and fail to the internally script-generated 404 error page if not.
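To make that parameter check concrete, here is a rough sketch of what the script would need to do. This is illustrative Python, not the actual index.php; only the value formats (a three-digit cat and a seven-digit art) come from the rewrite rules above.

```python
import re

# Illustrative sketch only -- the real site would do this in index.php.
# The formats mirror the rewrite rules above: a three-digit "cat" and
# a seven-digit "art" value.
CAT_RE = re.compile(r"^[0-9]{3}$")
ART_RE = re.compile(r"^[0-9]{7}$")

def resolve(params):
    """Return an HTTP status for the internal ?cat=...&art=... request."""
    cat = params.get("cat", "")
    art = params.get("art", "")
    if CAT_RE.match(cat) and ART_RE.match(art):
        return 200  # acceptable values: serve the content
    return 404      # missing or malformed: fail to the 404 page

print(resolve({"cat": "345", "art": "1234567"}))  # 200
print(resolve({"cat": "345", "art": "123"}))      # 404
```

The point is that the rewrite rules and the script agree on exactly which requests are valid, so no malformed URL can ever return a "200 OK".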
One minor simplification to the above code:
# Specify acceptable index/root file. You could have a static
# index.html or allow index.php without parameters for root:
DirectoryIndex index.html index.php
# Redirect to remove trailing period or comma from URL request
# with parameters, such as from forum with autolink, and force
# www to always be in the URL:
RewriteCond %{QUERY_STRING} ^(([^&]+&)*)[.,]$
RewriteRule (.*) https://www.example.com/$1?%1 [R=301,L]
I had the same thought as you when reading their post, Rand. It's almost as if it was written from an engineering point of view and not from a search marketing one. The fact that you can shorten the URL and mention your key term or product right in the address is reason enough for me to do it, so when I read their article it was clear that they were just looking at it from one side of the equation. Still, for people who are not technically inclined, I agree with Google: you can do more damage than good if you do the rewrite wrong. Instead, they should have mentioned that it's a higher-level thing that should be done by a competent webmaster or programmer, not the general business owner.
Thanks!
And totally agree on the "made by engineers"-perspective.
It reminded me of this quote by Scott Adams:
Normal people... believe that if it ain't broke, don't fix it. Engineers believe that if it ain't broke, it doesn't have enough features yet.
I'm a big believer in the usability benefits of "static" URLs - they just make more sense to people, and architected correctly, can help show the path you're on. Another advantage is security - as soon as someone sees a ".php" or ".aspx" in your URL, they know that much more about the underlying structure and how to hack it. Also, as someone hinted above, if you ever change to a different platform, you won't be able to keep those extension-based URLs.
Of course, if Google is just saying "hey, don't worry about this so much", fine. Much like duplicate content, I think we've made way too much of Google's "reversal", which isn't a reversal at all, but really just an acknowledgement that they're trying to do a better job of dealing with non-ideal circumstances. This is good news for the poor mom-and-pop e-commerce sites who don't even know what URL rewriting is, let alone how to do it. On the other hand, that doesn't mean that sloppy URLs and massive content duplication are good things that we should aspire to - they just aren't as bad as they once were for the spiders.
Like someone said above, forget about everything else and do it for usability purposes.
When such misunderstandings occur, my only advice is to go with the common sense option. If you do it right, there can't be any doubts: make 'em static.
Another benefit of static URLs is that it's more likely that you can keep your URLs when switching the underlying system. For example, when I converted a Joomla site to Drupal I could keep all static URLs, because both systems support URL rewriting, which is the case for many other CMS and frameworks as well.
One thing you should care about when using a dynamic system with URL rewriting is to redirect your dynamic URLs to the static aliases, to avoid the same content being accessible via two or more different URLs. For Drupal based sites the Global Redirect module (https://drupal.org/project/globalredirect) does this for you.
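In framework-agnostic terms, that kind of redirect boils down to something like the following Python sketch (the alias table and names are made up for illustration; this is not the Global Redirect module's code):

```python
# Hypothetical alias table: raw dynamic URL -> its static alias.
ALIASES = {
    "/index.php?cat=345&art=1234567": "/345/1234567",
}

def canonicalize(request_uri):
    """301 a dynamic URL to its static alias; otherwise serve in place."""
    alias = ALIASES.get(request_uri)
    if alias is not None:
        return 301, alias        # redirect to the one canonical URL
    return 200, request_uri      # already canonical (or no alias known)

print(canonicalize("/index.php?cat=345&art=1234567"))  # (301, '/345/1234567')
print(canonicalize("/345/1234567"))                    # (200, '/345/1234567')
```

Without this step, the same article answers "200 OK" on two URLs, and search engines may split link value between them.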
Thanks for writing this up, Rand - I was lazy and just tweeted... I completely agree with you.
I especially like this part: "While static URLs might have a slight advantage in terms of clickthrough rates because users can easily read the urls, the decision to use database-driven websites does not imply a significant disadvantage in terms of indexing and ranking. Providing search engines with dynamic URLs should be favored over hiding parameters to make them look static."
So Google recommends we take a "slightly lower" clickthrough rate. In the words of Sarah Palin, "Thanks but no thanks to that dynamic link to nowhere."
The other interesting thing is that the Google blog post was written on a blog that uses mod_rewrite to alter its URLs.
I agree this was meant to warn people who do not do rewrites properly.
I remember Adam Lasnik last year at SMX in Stockholm saying he was so frustrated by how site owners only targeted Google and didn't add the other search engines into the mix. But here is a typical case of exactly the opposite, because if you follow G's advice, you are setting up your site for a harder time in some of the other search engines out there.
Although I can see where Google is coming from for newer webmasters or webmasters just starting out with a new site, the way it was written didn't really clearly outline that. It's kind of hard to believe that post ever made live status, but more so that it hasn't been retracted or updated to a more correct explanation since.
Especially the argument that switching back ends doesn't necessarily matter if you have rewritten URLs is HUGE!!
Great post Rand! Very logical breakdown IMO.
One main con of URL rewrites, IMO, is the load they can put on your server under large traffic loads. It really depends on how many URLs you're rewriting.
Rewrites are stupid workarounds caused by programmers who have no clue about SEO. I can build dynamic sites without parameters and rewrite rules. I exploit the ability to embed variables and developed an old-style pathinfo-type URL parsing rule. This takes anywhere from 3 to 7 lines of code, plus a line for each var you embed. The embed is done dynamically when our CMS builds the static-looking pages. If you want to take it to the extreme, you can set the server to parse .HTM as .asp and then even the extension looks static.
You'll still need at least one rewrite to connect the URL request to your internal script path, no?
And, it will have to avoid the rewrite when robots.txt or any of the various search-engine WebmasterTools User-ID verification URLs, etc, are requested.
Or are you saying that your CMS actually outputs quasi-static files into a folder-based structure on the server?
I read this post and was sure I understood. Then I read some of the comments and got thrown off track a bit.
So, if I understand it right, as long as you write your static URLs correctly, then it seems they're most likely the best way to go. If you're worried about potentially making a mistake, then go with the dynamic?
I think that's right...right?
PS - Welcome back Rand!
I would say, "worried about potentially making a mistake, then..." do some research and/or hire a good web developer and still go with static URLs...
Congrats to Google for being able to crawl the /?id=3&cat=4&car=11 jungle, but I certainly think they are serving their users better with /pontiac-firebird-1988.
Ridiculous post by Google IMO.
The Google blog post was a really badly written article; it created more confusion than clarity. Nearly everyone went away thinking they had to do away with URL rewrites.
Having asked for (and had answered) more clarity in the discussion forums, a Googler stated that if you are confident that your URL rewriting is up to scratch, then continue to use it. If in doubt, leave URLs dynamic.
Not modifying their article for clarity after all the outcry leaves much to be desired - let's hope they rectify that soon.
Hang on - we could all agree that Google is telling us straight-up why we shouldn't rewrite URLs. But think about their blog post from a competitive-industry standpoint, and you'll start thinking, like me, that getting folks to stop rewriting URLs means a heavy loss of indexed content by your competitors who have not yet made such advances in the dynamic URL field.
Randfish, as soon as I read your post's title, I was with you on this one.
However, I can see another point regarding "Dynamic URLs" vs "Static URLs" (I quoted them, because those are misleading names ;)), where if they are not done properly, your pages send incorrect HTTP response codes. Obviously this is an implementation issue, rather than a show-stopper. But I figured this is a good time to mention, when you're doing your SEO-friendly URLs, make sure they're search engine and non-browser agent friendly too. An obvious example is, make sure your 404 pages return an HTTP 404 response code. If they return HTTP 200s, it makes it look to search engines like a page exists on your site where it doesn't.
This could hurt you quite badly, because it could look like duplicate content, or like your page is not optimised very well (i.e. it's your 404 page, but was linked to from some other page with a nice URL and link text, etc.)
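As a minimal illustration of that status-code point (a toy WSGI app with made-up routes, not anyone's real site):

```python
# Known pages; anything else must get a real 404 status, not a 200.
PAGES = {"/": b"home page", "/345/1234567": b"article"}

def app(environ, start_response):
    body = PAGES.get(environ.get("PATH_INFO", "/"))
    if body is None:
        # The crucial part: the error page is sent WITH a 404 status line,
        # so search engines don't index a phantom page.
        start_response("404 Not Found", [("Content-Type", "text/plain")])
        return [b"Page not found"]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [body]
```

A rewrite setup that routes every unknown URL to a friendly error page served with "200 OK" creates exactly the duplicate-content problem described above.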
Just some things I thought I'd share on the topic, anyway :)
I couldn't agree more ChristainB. I'd love nothing more than for all our competitors to take that information to heart ;)
Good point. Most of the Google blogs are for noobs. I rarely find myself reading all of their posts.
indyank... THANK YOU for bringing this up. It's annoying when people use incorrect terms. This article should say PARAMETERIZED urls vs NON-PARAMETERIZED urls. Any blog, CMS, shopping cart, whatever... all use dynamic urls. That just means they're database driven. Everything after a question mark in a URL is a PARAMETER. If it's a blog (and still dynamically created), but doesn't use parameters, that means the URL has been re-written, using .htaccess rules, or some other method.
So, maybe we can call this article PARAMETERIZED vs RE-WRITTEN urls. That would be correct use of terms as well.
I should say even the Google article is not clear on the definition... but what they actually meant is this: don't redirect to a static-looking URL from a typical dynamic URL (unless you know how to do it correctly).
CMSs like WordPress handle pretty URLs (using Apache's mod_rewrite), so there's nothing to worry about. Did we see any major problems all these days? I don't think so.
Or is that article trying to say that we won't be doing it correctly anymore if we make things pretty? That is scary and totally unnecessary.
But I believe that the article by Google is mainly intended for those folks who do things as pointed out in the examples towards the end of that article. Otherwise, most need not worry.
But I will have to say that the Google article has created unnecessary confusion, with wrong definitions making people worry unnecessarily.
Well, that's yet another thing to add to the list along with...
- "alt tag" (it's an attribute)
- "header tag" (it's a heading element),
- "dofollow" (no such thing),
- and so on.
Glad to hear your take on this post today Rand. After reading it I was very confused by the post and reading some of the comments afterwards tended to agree with a lot of them. This post is simply confusing.
Going along with what you're saying, it's almost like Google "expects" us to be incompetent and screw up URL aliasing to static URLs, so it's just advising us not to do it. That's fine if people do that and don't use keyword-rich static URLs because "Google told them it's ok"... well, it's fine because it means I'll rank better for the keywords I'm targeting.
But what Google advises people is not to tamper with dynamic URLs to make them look static (unless you are sure about how to do it) and make a mess. For example, a few people rewrite their URLs to end with a .html extension. They believe that search engines will consider them static URLs (if they end with .html) and will crawl them better.
The main reason they do it is that they do not understand the difference between static and dynamic URLs. Google's article definitely makes this clear. In fact, the article seems to be saying that if you try to do such rewrites for dynamic URLs, you are only making things complicated for Google.
Another con I'd add to static links is that if you don't trust whoever did the rewrites, it can lead to a debugging nightmare, as you constantly wonder whether the bug is being caused by the code or by the rewrite (mod rewrites are VERY difficult to debug)
No, it's all done by the CMS by parsing the requested URL. And yes, it is a folder system that the static pages are published into. Usually it's set up as /Category/subcategory/brand/sku.ext. The CMS builds product detail pages that do incredibly well, likely due to most product queries containing two or more of cat, sub, brand and sku. The text is entered in a separate file referenced using the sku. I can generally get about 3X as many pages indexed as the number of products by embedding different query sorting and filtering to cut down on matches to any dupe detection filters. Brands with only a few products are troublesome, but I am working on methods to lower the number of dupes.
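As a rough sketch of that kind of /Category/subcategory/brand/sku.ext parsing (hypothetical names and layout; the commenter's actual CMS code isn't shown here):

```python
def parse_product_path(path):
    """Split a /Category/subcategory/brand/sku.ext path into variables."""
    parts = [p for p in path.strip("/").split("/") if p]
    names = ("category", "subcategory", "brand", "sku")
    if parts:
        # Drop a trailing .ext from the last segment (e.g. ST1234.htm -> ST1234)
        parts[-1] = parts[-1].rsplit(".", 1)[0]
    return dict(zip(names, parts))

print(parse_product_path("/guitars/electric/fender/ST1234.htm"))
```

This is the "3 to 7 lines of code" idea: the URL itself carries the variables, so no ?-parameters ever appear.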
The thing I hate about "dynamic" or "parameter-based" URLs is trying to decipher analytics data. It's great to see that mysite.com/default.asp?contentID=107 is my most popular page and has a low bounce rate, but what the heck is on that page? Just another reason for me to keep rewriting.
I guess that falls under the con: "Challenging (if not impossible) to manually remember".
Thanks a lot for providing this information. I'm in a similar situation, so I wanted to learn about it. I took on a website for SEO and found long dynamic URLs.
It was built 2 years back, and now my question is whether I should change to static or not. How do I solve the issues if it is changed to static? Is it OK to change to static for a better SEO result, or should I go with the dynamic one?
Thanks in advance.
First of all, welcome back Rand!
A friend of mine actually panicked and wanted to remove his .htaccess file. Imagine what damage that would have done.
Don't fill up the URL with useless stuff or try to hide anything. Just follow the good old Google guidelines: "Don't TRY to improve your rankings; work on your content and it will come naturally".
I have decided to recreate all of my web pages from dynamic to static. Instead of a REWRITE, I am recreating each page and adding more content related to those pages. My question is: do I remove the old pages from my Sitemap and replace them with the new links, or should I leave the old URLs as well?
One question for all SEO masters. Please help me: I have a website (domain name "bankifsccodesearch dot com") where all the URLs are dynamic. I set the title tag, H1 and meta description, but the site is not ranking. Please suggest how I can rank this website.
Dear masters, my question is still pending. My website database has up to 3 lakh URLs, it is a .NET-based website, and I rewrote all the URLs in a proper way. I have been trying for 6 months but am still not getting results; other competitors' dynamic-based websites are ranking, but my website is not. Please check my website: bankifsccodesearch dot com
After reading this post, even I will do rewriting of all my dynamic URLs using .htaccess. I was getting confused about it, but now I think this post just showed me the path. Thanks Rand.
I think we should not forget that when I see something interesting, I don't remember the URL; rather, I add it to the "favorites" in my browser... so I will not agree with you that they are hard to remember.
As for the URLs, the secret is always the same: URLs should be friendly (very few slashes) and descriptive.
The tiny descriptive URL... this is the secret. SEs should not know if a URL is static or dynamic... I mean, we should not let them know...
So rewriting old-style dynamic URLs is something obvious and definitely required.
If I have a dynamic URL www.example.com/game.php?gameid=killer-instinct and I rewrite it to www.example.com/play/killer-instinct, will that be OK?
Or would it be better to have www.example.com/play.php?gameid=killer-instinct rewritten to www.example.com/play/killer-instinct?
Can someone help me, please?
https://www.juegos-gratis-ya.com/
Hi there! This question is much better suited to our Q&A forum. If you ask there, you should be able to get some help.
I'm a huge fan of rewriting all my URLs using .htaccess. Every time I see a dynamic URL, I shudder just a little bit.
I am late in commenting but I guess I can end the discussion with this -
Static URLs: 7 pros, 1 con
Dynamic URLs: 1 pro, 8 cons
Stick with static URLs, though make sure they don't cause content duplication like Drupal does.
Pushkar
edited - formatting problem
This article points out good ideas, but the main topic is based on a fundamental point being missed. There is very little difference between dynamic and static urls. Very little. Not like the examples people are using would lead you to believe. Either one can look just as messed up or pleasant as the other.
I'm so glad you wrote this. I had a Twitter temper tantrum when I saw this yesterday and was trying to figure out what to do when clients/developers came to me w/ this "Official Google" post and refused to do/pay for (correctly done) URL rewrites or re-framing of sites.
I will inevitably have to defend my insistence on "pretty" urls and the more supporting information I can find around the interwebz the better supported my arguments will be.
Completely agree with you on this Rand.
Seems like Matt Cutts added some clarification about the issue here: https://sphinn.com/story/74522#c54232
My perspective on this is that Google just wants to reassure people building sites who struggle with rewriting URLs that it can index dynamic URLs. Having worked with a number of people building sites with CMSs like Joomla, I know that beginners can really screw up URL rewriting. So perhaps for beginners this is good news, but for those who can do URL rewriting, it's definitely still worthwhile for the points you noted in your article.
I agree too. Even if your only concern is usability, static URIs are still the way to go.
Of course, there are all the other pros you mention too!
Glad to have you back Rand.
I would have to say that now that we use static URLs from mod_rewrite, there is no way we would go back. Much nicer in the URL bar, and much better for SEO.
Going against Google's official Blog post... and being right about it, lol. Got to love it!
Keep it up Rand.
First, I want to say that I love to read Rand and trust what he says, and that is why I was suddenly worried.
Secondly, the question is not whether static is better than dynamic, but: should a 4-year-old site with trustworthy URLs change them to SE-friendly URLs?
And here comes the SEO interest. Of course a new site will be better with static URLs.
But does an SEO advise his client to change all the URLs on his site? (And at least in Israel, people charge money for it.)
Thank you for your comments
Maybe there is even more we should scrutinise from Google's webmaster policy.
I see what you mean. In my opinion, it's usually a good idea to change the URLs on any kind of e-commerce site because products tend to change anyway and you get the benefit of the click through rate over time.
But in the end, it's a good idea to do some A B testing on a limited number of pages and see how it affects things in terms of ranking and ROI.
It's usually a minimal cost to the client and if you find that the overall result adds to the bottom line, you can make your decision and put forth your case for charging for the work if they can't do it in-house.
However if your site is well established and is ranking well, it's probably not worth it. But then again, I assume they wouldn't be calling you if everything was going how they wanted ;)
In the end, you need to rely on your experience and analysis. Every site has its own dynamic so you need to check all the factors and make a decision based on your test results and current rankings/inbound links and ROI.
If you have a format that works, and has been indexed for a long time, then don't mess with it in any way, except to be absolutely sure that:
- index file name requests are redirected to / or to /folder/ each time (and all internal navigation points to same format)
- non-www redirects to www (or vice-versa),
- parameters are always in the same order,
- print-friendly pages don't expose extra URLs,
- no session IDs are exposed,
- every page has a unique title element,
- every page has a unique meta description.
Those are the major issues that you need to address.
There are a few more that are "nice to do".
There is a long list over at: https://www.webmasterworld.com/google/3718246.htm to get you started.
I think you nailed it here, Rand. I was writing a similar story on my blog, then found your post. It appears some over at Search Engine Land and Search Engine Roundtable were calling Google's statement a "major shift" in SEO policy - if we were to follow that kind of mentality/advice, I'm not sure we'd be in business much longer.
That post from Google opens a can full of e-worms... how sceptical can one get about Google's latest URL outcry? I agree that static URLs always lend themselves to better ranking and general SEO, and if only for the sake of usability, I will keep on using them, though I have to say that I am also in favour of leaving a dynamic URL in its natural form as long as it is short.
Overall, I am glad I read this post, though, as I had been wondering for a while what the latest SEO thinking on dynamic vs. static URLs would be.
But one thing is pretty clear from all the discussion over the internet about static and dynamic URLs: if you know how to do static URLs, do them; they are best for your interest. I still believe static URL performance is, and will be, better than dynamic.
Thanks for putting this one to bed. I'm sure that Google blog post is going to cause a lot of confusion.
I greatly appreciate semantic URLs because they let me use the address bar to navigate a site. Flickr does this perfectly, and so I frequently use the address bar to interact with their site. If their URLs were full of naked GET vars, I wouldn't attempt it. (Side note: can anyone think of a website with a better user interface than Flickr? I can't.)
I also don't understand the implication that dynamic URLs are harder to screw up or create duplicate content with. Personally, I find the opposite.
Absolutely 100% agree Rand, these are my thoughts! Thank you for sharing!
Angelo Palma
Great post Rand. I agree 100%. Especially since in the article Google states that they have only made "some progress" in reading dynamic URLs. It's hard to take their blog post very seriously, it's so vague and light on details.
Interesting, my definition of dynamic and static urls however is different. I consider dynamic urls as session-specific URLs, i.e. only valid for the user actively surfing the site and useless for others (including search engines).
And static URLs are simply bookmarkable ones. URLs you can share with your friends. Never mind if there's a database behind it and it should therefore be called "dynamic"... if those URLs look good and have /tape-recorder.html instead of ?productid=34 in them, it's simply good usability (and better for SEs as well :-))... so just use the right web framework and you won't have to bother with ugly rewrite rules.
Rand,
I was rolling my head from the time I read the post from Big G. Thanks, your post came up at the right time to clear all my doubts. Thanks a ton.
Hey, this post is still not clear on what a dynamic URL is... Even URLs (permalinks) generated by CMSs like WordPress are dynamic, and what Rand is talking about is dynamic URLs that append parameters like session IDs etc. towards the end (appending them using ? after the URL and separating the parameters using ampersands).
Is it the voice of the SEO people not wanting to lose their clients?
Can I trust Rand totally, or maybe he protects his interest of wanting clients to pay for search-engine-friendly URLs?
Nadavben - I won't speak to whether you can trust me, but I will say that there's no charge to optimize your URLs into static, SEO-friendly ones. I don't have any self-interest to protect here - I just don't think Google's advice is going to produce "optimal" results.
BTW - for folks asking about static vs. dynamic, I realize it can be frustrating as the SEO world has adopted this nomenclature of calling URLs that contain dynamic parameters (typically marked by the appearance of a ? and = in the URL string) "dynamic" and those without parameters "static." This is, I believe, how Google intended it and it's how SEOs and those in several other fields describe it as well.
I say "folder-based URLs" for URLs that look like folders, even when the website internals may be based on a CMS, and being rewritten.
I find that "Parameter-based URLs" is a good description for stuff with a ? and with more stuff following it, but I sometimes also end up calling those "dynamic URLs".
I say "static page" or "static website" only when the URLs exactly match the internal folder-based structure on the server.
That's my "best effort" on the matter.
It's a jungle of incorrectly used terms out there.
Matt has clarified the definitions correctly here - https://sphinn.com/story/74522#c54313 .
@nadavben. If you can't see the value in Google SERPS of having:
www.mysite.com/sheepskin-leather-gloves.html
versus
www.mysite.com/index.php?productid=12345&cat=5
Then I'm not sure you understand this issue completely. Rand has made a very good case here and it has nothing to do with SEOs. Any competent programmer would be able to do this in his/her sleep. It's just that SEOs understand the value and we're in a position to point it out to our clients, thereby making them more money. Any client's IT department is free to make the changes to the site, and we don't make any more money as a result. The clients are the only ones that benefit.
*** www.mysite.com/sheepskin-leather-gloves.html
*** versus
*** www.mysite.com/index.php?productid=12345&cat=5
Ah, that looks like a rewrite that is done almost entirely inside the CMS scripting, with little help from .htaccess - except to repurpose the data from the URL and feed it to the script path like
/script.php?param=sheepskin-leather-gloves
In that case, the script can check that the request is a valid one, send a 404 error if not, and/or run a search and return a list of related posts or topics to the user. There is then a 1:1 relationship between the URL and the entry in the database.
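A minimal .htaccess sketch of that arrangement (script.php and the param name are taken from the example above; the slug pattern is an assumption):

```apache
RewriteEngine On
# Internally map a clean, folder-style URL to the CMS script.
# Only slugs made of lowercase letters, digits and hyphens get through;
# anything else falls through and draws a normal 404.
RewriteRule ^([a-z0-9-]+)\.html$ /script.php?param=$1 [L]
```

The script then looks the slug up in the database and serves its own 404 when no matching entry exists.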
In many cases what you see is something like www.mysite.com/sheepskin-leather-gloves-47284.html
In that case, the number is the key to the database entry, and the words are just "filler" for search engines; it's just stuffing the URL with keywords.
This is very dangerous, as any URL that includes the right number may very well serve the same content, irrespective of what actual words are present. That's an Infinite Duplicate Content issue.
This URL would also work:
www.mysite.com/this-site-sucks-47284.html
as would this:
www.mysite.com/our-products-are-a-ripoff-47284.html
and in many cases they do work, because the script does not do any verification of the extra information contained in the requested URL before pulling the data from the database and sending the page of content out to the browser.
That's one of the (all too common) issues that Google was likely referring to in their post.
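The missing check is easy to sketch. In this illustration (the PRODUCTS table and the URL shape are assumptions based on the example above), the script compares the requested filler words against the canonical slug stored for that ID, and 301-redirects to the one true URL instead of serving a duplicate:

```python
import re

# Hypothetical product table: database key -> canonical slug.
PRODUCTS = {47284: "sheepskin-leather-gloves"}

def resolve(path):
    """Return ('ok', id), ('redirect', canonical_path) or ('404', None)."""
    m = re.fullmatch(r"/(?P<slug>[a-z0-9-]+)-(?P<pid>\d+)\.html", path)
    if not m:
        return ("404", None)
    pid = int(m.group("pid"))
    canonical = PRODUCTS.get(pid)
    if canonical is None:
        # The number doesn't match any entry: genuine 404.
        return ("404", None)
    if m.group("slug") != canonical:
        # Right number, wrong words: 301 to the canonical URL
        # rather than serving the same content under a second address.
        return ("redirect", f"/{canonical}-{pid}.html")
    return ("ok", pid)
```

With that in place, /this-site-sucks-47284.html stops being an indexable duplicate and becomes a redirect to the real page.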
But why would those URLs ever come into play? The CMS isn't going to generate those bogus URLs and neither is the webmaster.
Along the same lines you could add an ending parameter to the dynamic URL and it would still load the same content -
www.mysite.com/index.php?productid=12345&cat=5&s=this-site-is-over-priced
So both suffer from the same problem and need extra validation added.
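The validation for the parameter-based case is the same idea: whitelist the parameters the script actually uses, rebuild the query string in a fixed order, and 301 whenever the incoming request differs. A sketch (the ALLOWED list is an assumption based on the example URL above):

```python
from urllib.parse import urlsplit, parse_qsl, urlencode

# Hypothetical whitelist: the only parameters this script actually uses,
# in their canonical order.
ALLOWED = ("productid", "cat")

def canonical_url(url):
    """Drop unknown parameters, fix ordering, and return the canonical URL."""
    parts = urlsplit(url)
    params = dict(parse_qsl(parts.query))
    kept = [(k, params[k]) for k in ALLOWED if k in params]
    return f"{parts.path}?{urlencode(kept)}"
```

If the requested URL is not equal to canonical_url(request), the script issues a 301 to the canonical form, so the &s=this-site-is-over-priced variant never gets indexed as a separate page.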
People can be quite bad at cut and paste, so all sorts of junk links will reside on other sites pointing back at your site.
Have you ever looked really deeply into the URLs listed for your site in, say, Yahoo SiteExplorer?
There you may well find a myriad of duff URLs with random punctuation on the end, partial URLs, and stuff that is just plain weird.
Leaving part of a URL open to being "wildcarded" is asking for trouble, not least from competitors who could use that fact to sabotage your site in the SERPs.
There's also the issue that search engines and other bots are getting quite, err, "creative" with what they request in the way of URLs.
They often start paring back URLs that look like folders, going on some sort of voyage of discovery, and have also been known to swap parameter ordering, or drop parameters, to see what they get back for those new requests.
Sure, parameter-driven URLs can suffer the same sort of issues, which is why my .htaccess rules only let through valid and believable URL requests to begin with.
See the simple code example above: it expressly disallows certain formats from being rewritten, redirects only when the wanted parameters are present (dumping extra parameters in the process), and so on.
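That earlier example isn't reproduced in this thread, but a hypothetical .htaccess along those lines (the URL shapes and parameter names here are illustrative) might look like:

```apache
RewriteEngine On
# 301 old parameter-based requests to the clean URL, but only when the
# browser itself asked for index.php (THE_REQUEST is the raw request line,
# so this condition does not loop after the internal rewrite below).
RewriteCond %{THE_REQUEST} \s/index\.php\?
RewriteCond %{QUERY_STRING} (^|&)productid=(\d+)(&|$)
RewriteRule ^index\.php$ /product-%2.html? [R=301,L]
# Internally map the clean URL back to the script; only the exact expected
# shape is rewritten. The trailing "?" above empties the query string,
# dumping any extra parameters in the process.
RewriteRule ^product-(\d+)\.html$ /index.php?productid=$1 [L]
```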