If you believe the rumors, we all now live in something called the real-time web. The once steady trickle of user-generated content became a torrent, and search engines face the difficult task of drinking from a fire hose without drowning. It only stands to reason, then, that fresh content is becoming more important, and anecdotal evidence seems to back that up. Every day, blog posts and Tweets seem to get indexed and ranked a bit faster.
Freshness seems important, but what signals does Google use to determine freshness? Beyond the original cache date, do the spiders pay attention to on-page signals, such as dates in body content or URLs? I thought it might be fun to try and find out.
1. Manipulating URLs (non-301)
My plan started out simple: manipulate a URL on my blog and rename it to use a date-based format (as some blogs do by default). So, for example, a URL that normally looked like this:

https://www.mysite.com/topic-goes-here

...became something like this:

https://www.mysite.com/2009-09-01-topic-goes-here

I chose a blog post that was recent enough to still be archived and spidered but not so recent or popular that it was likely to attract new inbound links. I chose 3 long-tail keyword phrases to track for that post, and then flipped the switch and changed the URL. In part 1 of this experiment, I did not 301 the old URL to the new one. By not 301'ing, I was hoping to nudge Google into updating the original cache date. The graph below (Graph I) shows what happened:
The rankings axis is inverted to show low rankings at the top, with 1 line for each keyword phrase. Here's where things got weird. Even after the spiders indexed the new URL, that URL showed up in the rankings on 3 different days for the 3 phrases (indicated by the gray, dotted lines). Some rankings dropped before the new URL appeared, others after, until they eventually stabilized slightly lower than the originals. Oddly, for the one keyword that hit #1 after the switch, the cached copy was the 404 error page (so that ranking was completely useless).
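As an aside, a quick status-code check would have flagged this kind of mess immediately. Here's a minimal sketch of that check in Python, using the requests library (the URLs are the placeholder examples from above, not real pages):

```python
import requests

# Placeholder URLs from the example above; substitute your own.
OLD_URL = "https://www.mysite.com/topic-goes-here"
NEW_URL = "https://www.mysite.com/2009-09-01-topic-goes-here"

# allow_redirects=False reports the raw status code of each URL
# instead of silently following any redirect that might exist.
for url in (OLD_URL, NEW_URL):
    response = requests.get(url, allow_redirects=False)
    print(url, "->", response.status_code)

# With no 301 in place, the old URL should return 404 and the new
# one 200; a cached copy of that 404 page is exactly what happened
# for the keyword that briefly hit #1.
```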
2. New URLs, Take Two (301)
Of course, outright changing a URL without 301 redirecting it is a bit unusual, and would mean that I lost whatever inbound link juice I had flowing to that page (it wasn't much, but it still can't be ignored). So, not generally one to learn from my mistakes, I tried again, this time with a new blog post but with a 301 in place.

Not surprisingly, the spiders were a bit better behaved, with all 3 rankings reflecting the new URL on the same day. Somewhat surprisingly, though, some keywords lost ranking, some gained, and the overall average ranking change was roughly a wash. Not a promising sign for my URL-based freshness theory.
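For anyone curious, a 301 is conceptually just a status code plus a Location header pointing at the new address. Here's a minimal, purely illustrative sketch at the application level, assuming a Python/Flask app (my blog doesn't actually run on Flask, and the paths are the placeholders from above):

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical mapping from the retired URL path to its replacement.
OLD_TO_NEW = {
    "/topic-goes-here": "/2009-09-01-topic-goes-here",
}

@app.route("/<path:slug>")
def serve(slug):
    path = "/" + slug
    if path in OLD_TO_NEW:
        # A 301 tells the spiders the move is permanent, so most of
        # the old URL's link juice should flow to the new address.
        return redirect(OLD_TO_NEW[path], code=301)
    return "Post content for " + path
```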
3. Mad Science Is Science, Too
So, what can we learn from my little experiment in freshness? I'm not entirely sure, but I'd like to offer a few takeaways to trick you into believing that reading this post was a good idea:

(1) Google Isn't That Dumb
If you were considering changing all your URLs to trick Google into thinking that your posts are brand-spanking new, here's some advice: don't.
(2) Always, Always 301
Although I had my reasons for not using 301s in the first experiment, don't ever rename an important URL without redirects in place. If nothing else, Graph (I) should be a lesson in what can happen if you do.
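If you do rename URLs, it's also worth verifying that every old URL actually answers with a 301 pointing at the right replacement. A rough sketch of that check in Python with the requests library, using hypothetical URL pairs:

```python
import requests

# Hypothetical old-URL -> new-URL pairs for a migration.
REDIRECT_MAP = {
    "https://www.mysite.com/topic-goes-here":
        "https://www.mysite.com/2009-09-01-topic-goes-here",
}

for old_url, new_url in REDIRECT_MAP.items():
    response = requests.get(old_url, allow_redirects=False)
    location = response.headers.get("Location", "")
    if response.status_code != 301:
        print("PROBLEM: %s returned %s, not 301" % (old_url, response.status_code))
    elif location != new_url:
        print("PROBLEM: %s points at %s, not %s" % (old_url, location, new_url))
    else:
        print("OK: %s -> %s" % (old_url, new_url))
```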
(3) Proceed With Caution
Even if you do rename your URLs for a perfectly good reason, and you put 301s in place, expect some short-term consequences. Rankings may fluctuate, and where you end up when you're done might not be exactly where you started. Changing your URL structure is a big job – sometimes, it’s necessary, but don't do it just to make a minor SEO tweak.
Good post Dr Pete,
I would only recommend changing the URL structure if you are planning on implementing a site overhaul, for a re-launch for example. If you do, don't forget 301s where you can.

From past experience, you will see some troughs in traffic before you see the peaks, but if you are implementing keyword-rich, friendly URLs then you should see some nice peaks.
p.s. Dr Pete, are you a Dr. in a Ross from 'Friends' kind of way???
I'd hope so, 'cause I don't think heart surgeons are going to be much help here...

Are you implying that SEOs don't have a heart :( ?
edit: posted that with the wrong account! hah!
No one's ever asked me the question quite that way, but I suppose so, yes :) I have a Ph.D. in Cognitive/Experimental Psychology.
Thumbs up for asking Dr. Pete that question :)
I love this post. Why? Because I like to test borderline black-hat SEO. I never do it on a client website, but I think it's another important testing method: try to glean as much from negative (black-hat) methods as from white-hat ones. You can test like Rand has demonstrated with Perfecting Keyword Targeting & On-Page Optimization or Search Engine Ranking Factors, but unless you look at both sides of SEO you're only receiving half the data.

Good stuff. As mentioned, black-hat SEO is not something to practice, but I think it can be useful in testing and better understanding how search engines work.
I have to admit that I had a secondary goal with this one. I see a lot of people over-optimizing lately, and URLs seem to be a big area they focus on. I'm all for clean, descriptive URLs, but revamping an entire site to fine-tune your URLs for SEO (if they're already pretty good) is a good way to get yourself in trouble.
Understandable. Can I assume, then, that your recommendation for a website that has decent rankings and average URLs, but needs some good on-page optimization, is to keep its URLs the same? This seems to be the takeaway that I am leaving with. Please correct me if I am misinterpreting your post. I also wonder: if you changed the H1 and the URL, would the results remain the same in your test, or would this combination allow you to dupe Google?
I don't like to give one-size-fits-all answers in comments, but I think there are many times when you should leave well enough alone. Any change as dramatic as rebuilding your entire URL structure carries significant risks.
I doubt Google is using header tags (H1, etc.) to determine freshness - it's just too easy to manipulate. I also strongly suspect that the value of freshness is niche-specific. A general content page on a topical site isn't going to get a boost just because it's newer than some other page about the same topic. Freshness is much more important for news items and other time-sensitive content.
Nice testing. Nice presentation. This confirms some of my suspicions and also explains a bit why when I did a directory structure revamp on one of my blogs my rankings took a temporary dive. Glad I did it early on instead of AFTER the site went from 50 pages to 500 pages. Thanks for the read.
Dr Pete, nice post and very thought-provoking. My question would be: can freshness also be a bad thing? What I mean by that is, you are renaming your URLs to have a recent date, which is great when people are searching within a short period of that date. However, what happens in 3 months? The page is no longer "fresh", so is that URL still worth the effort put into renaming, etc.?
I would much rather concentrate on creating quality content and getting links to that content than on having a "fresh" page for that day. Not that the two have to be exclusive, but just a thought.
My hope is Google doesn't go too far with showing only "fresh" information, as that isn't always the most relevant.
Honestly, I think we're all too obsessed with real-time information. Even in a fast-paced industry like ours, there are great resources out there that are a month, 6 months, or even a couple of years old. I see people dumping their RSS feeds for Twitter, and (as much as I love Twitter), I can't help but think they're missing out on valuable content from trusted sources. What if I wasn't paying attention yesterday? Should I just disregard everything written 24 hours ago and jump right to what's hot today? Pretty soon we'll only be interested in the last 15 minutes, and that's just crazy.
I couldn't agree more. It seems as though we just accept what's written because it's new, rather than making sure it's quality information. I mean, just look at MJ's death. Over the past few weeks, I think the story and "evidence" on how he died has changed half a dozen times or more. Wouldn't it be more valuable for them to find out exactly what the cause was, and then report it?
The internet is certainly great for getting info fast, but I think we need to stop and realize that it doesn't necessarily mean it's good information we're getting so fast.
This is why I appreciate the presence of SEOmoz these days. With all the SEO and social-media blogs posting articles in a relentless manner, talking about the same damn things over and over, SEOmoz comes with fresh ideas and tests that really make SEOs think.
I'm a sucker for tests like this, Pete - really good stuff.
We've always suspected that somewhere between 1% and 10% of link juice is lost in a 301 redirect. I'm also of the general opinion that Google's built some fairly hard-to-game systems around 301s ever since some friends of ours (Greg, I'm looking in your direction) decided to take advantage of some holes around them a few years back.
I'm honestly starting to worry about the number of people who seem intent on revamping their entire URL structure just to fine-tune the SEO value. It's one thing if your URLs are a mess, but to go from decent, descriptive URLs to slightly more strategic URLs is a small gain at what could potentially be a huge risk. Overoptimizing is a dangerous game that more and more people seem to be playing.
You can't 'over-optimize'; it's an oxymoron: either it's optimal or it isn't. The only other barometer would be variability to prevent such 'optimal' optimization, a.k.a. complete reverse engineering.

Also, I don't see anything remotely black-hat about this??
You're grammatically correct, but "Over-optimize" has been popping up to refer to situations where you do too much of a good thing, from an SEO standpoint. Even white-hat tactics can reach a point where it's obvious that you're just trying to game the system, and Google is getting smarter about that.
So we need to look at the 'getting smart' bit in more detail. Nice post though, Pete.
Over-optimization is possible, and it can hurt rankings. My brother has had a site at #1 for years for relevant search queries. I started working at an SEO company (to see how the other half lives), and they suggested a couple of things he could do to his site to "improve" it. He made those simple changes, the kind that would be employed by every SEO company, and his ranking dropped. He returned the site to the way it was, and his ranking moved back to #1. Too much of a good thing is still too much.
When it comes to "over-optimization", the best rule is: if it's not broken, why fix it?

If his website was ranking well and doing well, then why bother making changes?

However, as Pete said, if the website has terrible URLs and the whole site is a mess, then an overhaul would be fitting. If it's ranking well and getting solid traffic, then why bother? Just keep adding content, keep up the site maintenance, and you're fine.
"If his website was ranking well and doing well, then why bother doing changes."
For the lulz.
I have always wanted to do this - grab a couple of blog sites and see how far you can push the boundaries with regard to black- or blue-hat SEO.
I am going to run through a couple of experiments on my own and will post them over at my blog... https://www.seo-wizzards.co.cc
Or, put together a YOUmoz post. Jen's been askin' for content, and it IS Analytics month.
I saw a webmaster video Matt Cutts did where he talked about freshness and said that putting dates in the page/URL doesn't really help. With the ability to search for more recent stuff now, he said Google will know how fresh a page is, and the date in the page won't really matter because it can be spoofed or wrong.

I'll keep looking for the link to the video.
I like the title - cute. Sometimes I try to "improve" my site and come out with fewer search results. I re-did my site, and it looks much better and fresher, but I'm not coming up on page one for as many keywords. Verbiage and keyword density are very close to the same. I scratch my head!
I really enjoyed this post. I'm also a sucker for seeing experiments run and interpreting the results.
It would be great if, similar to Whiteboard Fridays, there were a "Testy Tuesdays" (OK, that's a horrible name) where a post each week was dedicated to showing the outcomes of tests run just like this.
Andy
I think that web development companies should take a long look at the graphs you have created. One thing is clear here: whenever you are launching a new site and moving content around, 301, always!
It's great to read information that is based upon hard data. Opinions can only carry us so far. Great work!
This test fits exactly with my experience in URL rewriting: URLs should be rewritten only if it's really needed by the page content, not for pure SEO...

Thanks for your work and... good morning, I'm new here!
Fascinating article! Although I've never used the methods you suggest (and don't think I will), it gives a very interesting perspective.
I'll also say that I know my sites that have lots of new content definitely get rewarded for it, but sites that are static often aren't penalized.
Still laughing, even all the way down here !
Great information. I had a feeling things worked like that, but seeing some real, tested data is always extremely helpful.
Thanks!
Nice testing and report, thanks.
Awesome post Dr. Pete. I've been hearing lots of folks yelling about automatic URL rotation lately, and it's been driving me nuts. Now I've got some empirical data to back up my instincts that it can't possibly help.
In test 2, I suspect you lost rankings slightly even though you 301'd, as I believe not all juice is passed via a 301.

A small amount of juice evaporates along the way.
Whoops - Sorry, just noticed that I seem to have just echoed what Rand has already added, sorry for the repetition...
Thanks for the testing and this post, Dr.Pete
I guess Google determines the freshness of a page from its crawl logs rather than from the URL.

After a 301 redirect, it normally takes weeks for the new URL to replace the old one, so yeah, proceed with caution - you may lose some sales.
I've always wanted to test the boundaries by creating a blog and trying every bad thing in the book, just to get it banned and just for fun. Never got around to doing it, though.
Cool things, Doc.
Thanks Dr. Pete, I love seeing tests like these :)
Dr Pete, Great post and very interesting test.
I'm new to SEOmoz and finding lots of very helpful info. But I had a question for you regarding your "over-optimize" comment: when or how can you tell if you are optimizing just for optimization's sake and it will not help you? Any thoughts/insights?
Well, on the one hand, I think we all know, deep down, when we're optimizing just for optimization's sake, but that's a bit of a Disney answer :) Let's face it, optimizing is part of our job as SEOs - in fact, it's 1/3 of the letters.
Speaking purely from a technical standpoint, Google understands patterns. When you use the same tactic over and over (don't diversify), make changes too often, or make sudden changes (like rapidly acquiring similar links), the spider's alarm bells (spidey sense?) go off. I think it's fine to pursue spider-friendly practices, but don't fine-tune so often or so blatantly that you draw attention to yourself.
In many cases, it's not just a white-hat issue. Changes often carry risk, not just for SEO, but for alienating your visitors or flat out breaking your site. Don't make changes just to tweak something so that it's 1% more optimized. Major changes should always be strategic, well-planned, and carefully executed.
Thanks for the post. I've noticed similar issues in the past with sites changing the URL to appear fresh - it doesn't work... so ideally, erm... don't do it! :)
Great post, Dr. Pete. This just shows the value that Google is putting on links. Optimization is great: getting the URLs, the keywords, the alt tags, and the like - all good. But without inbound links from a few reputable sources, optimization will only go so far.
Not true. I tested different URL and site hierarchies against generic URL/site hierarchies from competitors, and each new URL that is indexed automatically ranks higher than said competitors.
I can also personally vouch for this scenario.
We 301'd a few URLs on the blog during some design changes. It seemed like we lost all the link juice, and rankings for the whole site began to fluctuate.
This article is timely. We are redoing a website and will have mostly new URLs. If anyone has any suggestions for easing the transition, I'd be happy to hear them. The site has reasonable rankings and we don't want to lose that. Thanks, Radiance
Well… I visit your website first time and found this site very useful and interesting! Well… you guys doing nice work and I just want to say that keep rocking and keep it up!!!!
Adam
Welcome, Adam!
Really Thank you,
Actually i m also visit your web site first time
But its very intrestin and nice designied.
(Edit: link removed)
This is clearly SPAM. Do we allow users to continue to post spam?
@rorycarlyle
I agree - I'm seeing more and more spam here every day. Rand, can't you just delete all of a user's spam comments? Many SEOmoz users love reading other users' comments, and all this spam is becoming time-consuming for us too.
This is a bit off topic, but I'm fairly new to SEO and I have to ask:
Why do people spam blogs like this if they know the links have nofollow in place?
(I'm also new to being a PRO member, so please go easy on the thumbs down if this isn't the place to ask - I'd hate to start off with negative MOZpoints).
Great post BTW!
Because they're dumb
Also, because automated comment-spamming software takes a much less selective approach than a user would. It's not terribly efficient to look at the code on the comments page for nofollow, as that would slow down the process of finding a reply form, injecting a comment via cURL, and then moving on.
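For what it's worth, the nofollow check itself would only be a few lines; the bots just don't bother. A rough, purely hypothetical illustration in Python, using the requests and BeautifulSoup libraries:

```python
import requests
from bs4 import BeautifulSoup

def links_are_nofollowed(page_url):
    """Return True if every link on the page carries rel="nofollow"."""
    html = requests.get(page_url).text
    soup = BeautifulSoup(html, "html.parser")
    links = soup.find_all("a", href=True)
    # BeautifulSoup exposes rel as a list, since it's a multi-valued attribute.
    return all("nofollow" in (link.get("rel") or []) for link in links)
```

A bot that ran this before posting could skip nofollowed pages entirely, but as noted above, the extra parsing apparently just isn't worth it to them.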
Seriously though, loads of spam comments at the moment. Did you guys p*ss someone off?
BTW, I'm not recommending this as a technique.
Because it's easier than spam email. A lot more people read forums than they do their junk mail folder.
I like that "Kandy" has -8 mozpoints. :)
-9. I just contributed my thumb.