Re-launching a site is a crucial and often worrying time. There are many, many things that can go wrong, and when they do, the results are often spectacular.
Over the last few years I've seen a whole bunch of sites re-launch. Almost every one of them has had some small issue, and a small percentage have left me with some good stories to tell :-) The following is a list of 10 things that you should check prior to, and immediately after, launching a new site. I've focused the list on things that won't necessarily be immediately obvious, but will nevertheless cause you issues at some point. I've also focused the tips on a re-launched site, which has a different set of worries from launching a brand-new site.
A number of these checks involve work at the point of go-live. Any good developer will tell you to avoid doing anything at the point of go-live, so you should try to get as much as possible done in advance. The good news is that the majority of the list involves changing something external to the site, so the cost of failure is lower (but still annoying).
1) Put 301 redirects in place.
You've all heard it a million times, but it's important enough to say again: if you are changing URLs you must put 301 redirects in place from the old URLs to the new URLs. This is often a tricky thing to test (since in most cases the URL of the test site will be different from the URL of the final live site). Below is one fairly simple way to test that all your redirects are in place and working as expected.
Step 1. Ensure your test environment works when given the final live URL. If you use an Apache server you can do this by adding another ServerAlias to your vhost (there's a rough sketch of this, along with the hosts file entry from step 2, after these steps).
Step 2. Add a line to your hosts file for the live URL. The hosts file entry overrides DNS and allows you to point the live hostname at the test server. The effect is that when you type the final URL, you see the test version of the site rather than the current live version.
Step 3. Run a site: search on Google to get a list of the old URLs that are currently indexed.
Step 4. You can now run through each link from those results and ensure that when you put the site live they will correctly redirect.
Step 5. Remove the hosts file entry. A simple step that is often overlooked. If you forget to remove the hosts file entry you will always see the version on your test server.
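To make this concrete, here is a rough sketch of the pieces involved, assuming an Apache server and using invented hostnames, IP addresses and paths (none of this is copied from a real setup): a couple of 301 rules, the extra ServerAlias from step 1, and the hosts file entry from step 2.

    # --- The 301 rules themselves (e.g. in .htaccess or the vhost config) ---
    # Map individual old URLs to their new homes
    Redirect 301 /old-about.html /about/
    # Or map a whole family of old URLs with a single pattern
    RedirectMatch 301 ^/news/([0-9]+)\.html$ /blog/$1/

    # --- Step 1: let the test vhost also answer to the live hostname ---
    <VirtualHost *:80>
        ServerName test.example.com
        ServerAlias www.example.com
        DocumentRoot /var/www/newsite
    </VirtualHost>

    # --- Step 2: point the live hostname at the test server in your hosts file ---
    # (/etc/hosts on Linux/Mac, C:\Windows\System32\drivers\etc\hosts on Windows)
    192.0.2.10  www.example.com

    # --- Step 5: delete that hosts line again once testing is finished ---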
2) Add analytics code to every page
One of the most crucial times, if not the most crucial time, to see what is happening on your site is just after a re-launch. Time after time people forget to add (or update) the analytics code.
There's not much more to say about this one: unless you want to be blind to what is happening on your new site, can I suggest you ensure your developers know that they need analytics code on *every* page of the website.
3) Robots.txt and Sitemaps
I debated whether to include this in the list, because it feels like it falls into regular site testing rather than sitting on the boundary of things you would hope your developers already knew and were testing for. I decided to include it, not least because the title says 10 things to check, and without it I only had 9!
Sitemaps and robots.txt often play a crucial role in the SEO strategy, but in my experience they are normally added on as an afterthought. You should ensure that you update the robots.txt and your sitemaps to mirror your new site structure.
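As a purely illustrative example (the paths and the sitemap URL are invented), the updated robots.txt for the new structure might look something like this:

    # Allow crawling of everything except the sections you genuinely want kept out
    User-agent: *
    Disallow: /checkout/
    Disallow: /search/

    # Point the engines at the sitemap that lists the *new* URLs
    Sitemap: https://www.example.com/sitemap.xml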
4) Update Google AdWords
Prior to the site going live, you should test that the site still works with the AdWords tracking code. In order to track activity within AdWords, Google appends tracking parameters onto the URL.
A client of ours once re-launched a website that broke in horrible ways if there were any parameters appended to the URL. It took a while to track down the issue purely because no-one likes clicking on their own adverts.
Google's help documentation talks you through the gclid parameter. In a nutshell, you need to check that URLs like the following still work:
https://www.example.com/page.html?gclid=test or https://www.example.com/page.html?foo=bar&gclid=test
As part of the go-live process you should also update your AdWords campaigns. Whilst your users won't notice and probably don't care that the advert is taking them via a 301 redirect, this probably won't be doing great things for your Quality Score. At the very minimum you should update the URLs that have changed to be in their new format.
A re-launch is likely to give you a new set of opportunities for AdWords. It's likely there are new pages that open up options for bidding on new keywords, and there are probably some keywords that are no longer relevant. The worst thing you could do would be to ignore your AdWords campaigns, since this will inevitably lead to throwing money down the drain.
I've talked about Google AdWords because this is by far the most common advertising platform, but you should revisit all of your advertising to ensure it matches the new site.
We once took over and "optimised" a Google advertising campaign that had been spending *a lot* of money sending visitors to a page that no longer existed.
5) Review all conversion endpoints
No doubt the old site will have stunning calls to action that encourage your users to do something that is beneficial to your business. These conversions will (obviously) be tracked somewhere. I encourage you to review anything that tracks or reports on these conversions. Most likely there will be goals or funnels within your analytics program that rely on certain URLs. You need to ensure that you update these, or else it will look like your website no longer converts.
If you are driving traffic to your website using Google AdWords then you will probably have conversion code on certain pages of your website. Prior to the site going live you should check that the new conversion pages also have this code. On a similar note, you should check that any e-commerce tracking is updated and migrated across to the new site.
In general people are very reluctant to trust data put in front of them and are always looking for a way to invalidate the data so they can go back to gut feel. The last thing you want to do is to invalidate your reports by missing data for however long it takes to get the conversion or e-commerce code put back on the site.
6) Full end to end test
I couldn't write a list without begging you to do a full end to end test. No matter how much testing you have done, and no matter how confident you are that everything works, please, please, please perform a full end to end test. In the case of a site re-launch something will have changed since you did a test. It could be something as small as a DNS update, an IP address changing or the fact that you have removed a hosts file, but something will have changed since you last tested the site. Whatever it is that has changed will have invalidated all your previous tests, so please, please, please perform a full end to end test.
Often payment gateway providers only accept requests from a given IP address. We knew of a site that re-launched onto new servers. Everything went smoothly, all the tests completed satisfactorily, and the website worked as expected. You could add products to your cart and move all the way through the checkout process. It was only when you actually clicked the pay now button that the payment gateway failed, because the IP address had changed.
If you don't perform a full end to end test you are leaving something to chance, and will be leaving money on the table. Your customers won't ring you up and tell you that they can't order; they will most likely just find another website.
7) Server logs
The first few hours and days after a site has gone live are a scary time for all involved. No matter how much testing has gone on, something will have changed in order to put the site live.
Ask your sysadmin to give you regular reports on any status codes that aren't 200. If I were you I'd keep an eye on any 404 errors you receive, along with any server 500 errors. I'd probably also be checking Google Webmaster Central for any errors in their crawl stats.
For those of you who use Google Analytics, there is a blog post that talks nicely about tracking 404 errors. You have to add a snippet of code to the 404 page, but once that's done you can track 404 errors from within Google Analytics.
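If you're on Apache, it is also worth double-checking a couple of directives on the new vhost (the filenames below are just examples): a dedicated error log that someone actually reads, and a custom 404 page, which is also where a 404-tracking snippet like the one just mentioned would live.

    # A dedicated error log for the new site makes post-launch 404s and 500s easy to watch
    ErrorLog /var/log/apache2/newsite-error.log

    # Serve a real page for 404s; any 404-tracking snippet would go on this page
    ErrorDocument 404 /404.html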
8) Ranking reports
The title pretty much says it all. If your URLs are changing and you run ranking reports (or for that matter any reports that rely on URLs) you should update them. As we said previously there is nothing worse than invalidating all your reports by having the wrong URLs.
9) Email footers
I hope by now you have all taken Rand's 1st headsmacking tip to heart and have added link requests to any automated emails. If you have, then now is the time to change (or at least check) the URLs.
On a related note, any links that you can get updated from old URLs to the new URLs will prove very beneficial.
10) Monitor bounce rates
Bounce rates are often a good metric to determine how the public have taken to your new site. Avinash has talked about bounce rates and as a general rule, when Avinash talks, you should listen. It is most interesting to look at pages where the bounce rate has changed dramatically.
If it has reduced then there are probably lessons you can learn and apply to other pages on your site. If it has increased then there are lessons to be learnt, and these lessons should be learnt quickly!
As with all metrics, bounce rate is only an indication; just because you have seen a change in bounce rate doesn't mean you should rush through changes for the sake of change.
Thanks for the list Duncan.
While the site is still on a development server, and you have a hosts file in place, it's a good time to run Xenu over the site and find any error pages. Of course, you'll want to do that again after the site is live.
It's free to download for Windows users at Xenu's Link Sleuth.
Oh. I wrote a comment similar to this, having not read the comments. I feel terrible!
Hi, great post.
We actually create a sub-section in the project plan for the site launch; we find that simply listing the actions still leaves too much room for oversight, or for last-minute changes breaking previously checked items.
As the launch and test team includes both agency and client, it is important to identify the tasks that must be completed, their logical sequence and interdependence, and what can be carried out concurrently to save time.
Pulling this all together into a project plan clearly identifies who must do what and what collaboration is required. Task by task, this is converted into a test plan that references not only the items listed here but also the agreed specification; a site that doesn't throw back a 404 but doesn't actually perform as specified is just as broken.
Once the testing has been completed, both the client and agency sign off the site, noting any exceptions which can be addressed post-launch, and the site goes live.
So in addition to the list here I would add:
1. Create a test plan (usually a check list in excel)
2. Create a launch plan (in MS project or similar)
3. Present both the test plan and launch plan to the client ensuring everyone knows what is expected of them and when and how the process will work.
4. Execute the testing according to the plan and take physical sign off from the client
5. Launch!
Hope that adds some value
Cheers
Alex
Aloha,
Please find a French translation at this address: https://insidedaweb.com/dotclear2/index.php?post/2009/08/05/[SEOmozBlog]-10-points-%C3%A0-v%C3%A9rifier-quand-vous-relancer-votre-site-web
Thanks for your tips
Séb
Ask your customers what they think of your new site.
Ask them what they like the most, ask them what they dislike the most, and incorporate this information into learnings for next time.
I think I will re-check my website after reading your blog post. I will try to find my best showcase and tactics for doing SEO on my sites.
301 redirect is a must!!
Thanks for the nice article
Great post, really enjoyed it.
About the validation. Yeah, I am big on validating... strict too mind you... but it is low on the priority list.
Still, I do fuss about it when all the other tasks are done... I figure it's just one of those things that can potentially help somewhere, but more importantly it may remove something that could be a hindrance.
The other thing, depending on the type of site you are running... some of the more savvy/picky visitors might copy/paste your url into the validation tool just for kicks. (I am sure I am not the only freak on here who has done this! :) )
When it comes up as valid (strict, no less)... the person who decided to check may be left with the impression of "hey, they put in a little extra effort, they're pretty thorough."
Either that or... "Why did they bother going through the effort, this stuff really doesn't matter". :P
I can see this being a good little credibility booster if you are running a webdesign firm etc.
Add negative keywords to your search engine marketing campaign. Using the keyword tool, look to see if there are search queries that are irrelevant to your business, then add these keywords to your negative list. These irrelevant words can lower your Quality Score, which can increase your minimum cost per click and therefore lower your PPC campaign's ROI.
https://www.virtualkreation.com/
Re-launching a website really does need some consideration, and you did well to provide guidance on it.
For my French readers, I wrote a post based for the most part on a translation of your article. You can read it here: https://www.christophebenoit.com/grosses-refonte-web-checklist-seo.html
5 years later and this article is still relevant. Thanks for helping me migrate a very large site with minimal problems (although still some ha) by using this checklist. It's funny the simple things developers tend to forget or see as not as important.
Good stuff, very helpful.
If you are swapping hosts too, I find it's useful to leave what you have on the old host for a while after you move to the new one, until everything filters through.
Duncan, great post.
Two things to add - 1) If you can - salvage your customer database and their settings to preserve customer experience and 2) Don't forget about any campaign landing pages
We wrote a similar one geared towards marketers, it can be found here: https://www.libertyinteractivemarketing.com/blog/lessons-learned-how-to-launch-your-new-website/
Xlnt article. I just relaunched my web site yesterday. What a mess. Nothing seemed to work correctly. If I had this a couple of days ago, I would have known what to look for.
Thanks for sharing.
Thanks, I think in the future I will pay more attention to these.
Thanks for the good info. Help a noob out... After writing the 301, does it matter when you remove the files from the old url?
Thank you for these tips!
A nice and clean CSS would help also!
You can track 404's via Google Analytics but you'll be waiting a fair old while for the thing to update.
My tip is crawl your old site with Xenu and export the URL list and save as a text file. When you've set up all your redirect rules and you're ready to go - crawl that list!
You can go live with a super high certainty that your rankings will be safe and sound.
'Xenu' for those who do not know is a freeware program called Xenu Link Sleuth. Please correct me if I am wrong Richard.
These are great tips for me, as we're going through a site redesign at the moment. My scope list did, and still does, include URL redirects. I'll be sending a link to this post to my web development manager for sure. Thanks for the advice!
Good review. Question: do you think there would still be some loss of ranking and/or links?
Can I just add that this post, while totally valuable, is actually much wider-reaching than just during or after a relaunch of your website.
Most corporate sites have a structured rollout procedure; if yours doesn't, then you should take the time to structure your updates accordingly, as it makes everything much easier. Some major sites like Amazon or eBay will have six-monthly rollouts and occasional bug fixes, while medium-sized online businesses normally run monthly or bi-monthly release schedules.
Make sure you follow this checklist whenever you have a website re-release, even if the items on it appear to have nothing to do with the bits that you have changed. When you have had multiple (sometimes dozens of) developers working on a site for several years, you would be amazed how ostensibly unconnected things can affect pages in different parts of the website!
We typically test the new website on the server that will be hosting it, using the robots.txt file to block indexing of the new website until we are sure that forms, link validation and code validation are complete. Do not forget to remove that robots.txt (or at least alter it to allow indexing) before or directly after going live, or you can kiss your traffic goodbye.
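For illustration only, the difference between the staging file and the live one is just a couple of lines:

    # Staging: block everything while the new site is being built and tested
    User-agent: *
    Disallow: /

    # Live: swap it back so the site can be indexed
    User-agent: *
    Disallow: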
Also, if you're using tracking software other than GA, such as WebCEO, make sure you transfer the tracking files if applicable.
These are the two biggest mistakes I have seen, and when the client calls you up and says "thanks for the new website, but we have lost all our traffic", make sure your face isn't red and your tail isn't between your legs for having delivered a beautiful website that isn't indexable or isn't tracking data.
I'd password protect the live test site using .htaccess
Then crawlers and unwanted visitors are both blocked.
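A minimal sketch of what that might look like, assuming an .htpasswd file has already been created at the path shown (both paths are examples):

    # .htaccess on the test site: require a login before anything is served
    AuthType Basic
    AuthName "Staging site"
    AuthUserFile /var/www/.htpasswd
    Require valid-user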
I would add validate, validate, validate. I don't like sites that have all this great SEO and wonderful rankings, but when you click to check their validation, it comes up with a big red square and error message!
I really second the full end to end test as well; I had a similar problem (though it wasn't IP-address related) a while ago!
I also like to bring all the sites I work on up to W3C standards, including HTML validation, but validation is far, far less important than say HTML semantics, especially when it comes to SEO.
For example, a page will validate as (X)HTML Strict even if you use a zillion nested tables or throw inappropriate HTML elements throughout your code with reckless abandon, as long as you're free of syntax errors.
Semantics far supersede validation.
HTML validation can offer some instant gratification on the click of a button, but for the most part I reserve this phase of on-page optimization for debugging rendering problems.
As far as CSS validation goes, which likely has zero effect on indexing, the same applies; you can write external sheets that pass as valid CSS, but are far from being examples of well written CSS.
On the other hand, strict validation is a great outlet for any OCD tendencies I might have. ;)
May I second the checking of the robots.txt, as many people "exclude all" in the robots.txt on their staging environment, and if that little bad boy finds its way onto live it can kill all your SEO efforts.
Trust me, I know what it's like if it does go live.
Once I moved a site via a quick "auto" WordPress install only to find that the robots meta tag was set to "noindex, nofollow". Whoops. Might want to double-check that one too while you're at it.
Never auto-install WordPress. It's so easy to do, but if you can't do it yourself, pay someone. Auto-install nearly always goes wrong and leaves you with horrible code snippets that have to be removed by hand. It's much more trouble than it's worth, and a professional install would work out far cheaper than the hours it'll take you to sort the mess out.
Don't forget to validate all code :D
Validate all code? I'd say that's very low on the priority list. Personally, I never bother. Accessibility is important, but I don't think validation is.
While it's still a contentious issue in the SEO community, I believe that code validation does very little (if anything) to help with SEO. Most of the top websites do not validate.
If the site has an accessible and flat architecture that can be easily crawled by the engines, it works well in all major browsers, and it was developed using web standards, then I don't see the benefit of investing the extra time into making sure your XHTML/CSS validates.
Some resources on the topic:
https://www.seomoz.org/article/search-ranking-factors (ctrl-f for Validation)
https://www.seomoz.org/blog/w3c-valid-code (old, 2005)
https://www.seomoz.org/blog/web-standards-w3c-guidelines (also old, 2005)
https://www.seomoz.org/ugc/web-standards-and-seo-more-questions-than-answers
I have to agree with Whitespark here. I'm a fan of valid code and always urge my developers to write as such, however I don't think this has much SEO benefit.
I also tried to focus the list on things that you wouldn't expect your developers to do "as standard" or things that you might easily forget. I would argue that valid code is something you should expect as standard.
I agree that validating code is not a priority.
UNLESS your plan is to get listed in XHTML-valid directories - not sure how much real value that will bring vs. investment in time (time better spent creating quality content).
OR maybe you like to have that little link in your footer that says that your code is validated. That always sounds cool.
OR maybe you despise your developers and you wish to cause them mental anguish...
I think the only worthwhile option is the third one.. i.e. causing developers pain :-)
I see plenty of sites with the "my code is valid link" in the footer, but when you click through the site is riddled with errors.
I agree: judging by the number of sites that don't validate but do very well, Google couldn't care less about code validation. lol
I agree that it doesn't seem that Google cares much if at all about whether code validates (it would probably be silly of them to do so).
That being said, before you roll out a site, every page should validate as XHTML. This task falls more on the development side than the SEO side, for sure, but we're still seeing sites rolled out with an HTML 4.0 Transitional doctype, some of which even have fully JavaScript-based menus that aren't parseable by search engines. Having an antiquated or missing doctype adds a lot of time to the process of adding widgets, implementing newer Ajax/JavaScript techniques, fixing the nav to be 100% CSS, and other tasks that are involved in SEO and conversion paths.
Also, you said that if the site "...was developed using web standards, then I don't see the benefit of investing the extra time into making sure your XHTML/CSS validates.". If your code doesn't validate, then you're not developing using standards.
I would say that you're still using web standards even if the code doesn't perfectly validate.
By web standards I mean that you're generally using modern xhtml/css coding techniques for web development. You're using divs and css for layout, not tables. You're using unordered lists for your menu systems. You're using CSS and not javascript for dropdown menus. Etc.
Basically, I'm saying that if you're pretty much using "web standards" throughout your code, and you accidentally write a <br> instead of a <br /> in your code so now you get a validation error, then, oh well, it's no biggie.
Well, I just re-launched my site last week and guess what. My home page was unindexed by google. The results are a nightmare, the cause unknown...we checked and double checked everything.
I think things can just go wrong no matter how careful you are.
Any thoughts on this please email me :-)
Great article. And, for me, very timely. At work we've just taken down our old site and replaced with a holding site until the full relaunch at the end of this month.
This process has been slightly bumpy, but there are some tips on this list that should make the next stage of the migration smoother than a smooth thing.
Great post! I especially liked the instructions for testing 301s with another ServerAlias.
As far as throwing a sitemap in right from the beginning goes, I prefer waiting a while to see how juice flows through the site without one first. Especially on some smaller sites, where I sometimes don't use a sitemap at all for the sake of an unaided view of how bots are spidering the site. I'd be interested in knowing how others feel about this.
We use a great tool called Linkpatch to catch broken links on the site in addition to links from external sites that may be out-of-date and in need of a 301 redirect. Xenu and other link checkers don't account for the external links that generate a 404.
Linkpatch is a simple JavaScript snippet you can add to your 404 page; it sends you an email in real time when someone gets an error, explaining all the details and how to fix the link. It's been foolproof for managing client sites.