In the real world, things go wrong. While we might all wish that everything we did was "fix once, stay fixed", that's rarely the case.
Things that were previously "not a problem"™ can become "a problem"™ rapidly for a variety of reasons:
- someone changes something unrelated / without realising it would impact you or just screws up (e.g. deploying a staging version of robots.txt or an old version of a server config)
- the world changes around you (there was a Google update named after a black and white animal a while back)
- the technical gremlins gang up on you (server downtime, DDoS etc.)
In all of these cases, you'd rather know about the issue sooner rather than later because in most of them your ability to minimise the resulting issues declines rapidly as time passes (and in the remaining cases, you still want to know before your boss / client).
While many of us have come round to the idea that we should be making recommendations in these areas, we are too often still creating spectacularly non-actionable advice like:
- make sure you have great uptime
- make sure your site is quick
Today, I want to give you three pieces of directly actionable advice that you can start doing for your own site and your key clients immediately that will help you spot problems early, avoid knock-on indexing issues and quickly get alerted to bad deploys that could hurt your search performance.
#1 Traffic drops
Google Analytics has a feature that spots significant changes in traffic or traffic profile. It can also alert you. The first of these features is called "Intelligence" and the second "Intelligence Alerts".
Rather than rehash old advice, I'll simply link to the two best posts I've read on the subject:
- Here on SEOmoz by Rebecca Lehmann - 7 Essential Google Intelligence Custom Alerts
- Over on Blueglass by Annie Cushing - Stay Alert with Google Analytics
This is the simplest of all the recommendations to implement and is also the most holistic in the sense that it can alert you to traffic drops of all kinds. The downside of course is that you're measuring symptoms not causes so you (a) have to wait for causes to create symptoms rather than being alerted to the problem and (b) get an alert about the symptom rather than the cause and have to start detective work before paging the person who can fix it.
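If you're curious what this kind of alert is doing under the hood, here's a minimal sketch of the idea in Python - flag any day that falls well below the trailing average. The visit numbers, the seven-day window and the 25% threshold are all illustrative assumptions on my part, not GA's actual algorithm:

```python
# A toy version of a traffic drop alert: compare each day's visits to the
# trailing average and flag large drops. Window and threshold are assumptions.

def traffic_drop_alerts(daily_visits, window=7, threshold=0.25):
    """Yield (day, visits, expected) for days more than `threshold` below the trailing mean."""
    for i in range(window, len(daily_visits)):
        expected = sum(daily_visits[i - window:i]) / window
        if daily_visits[i] < expected * (1 - threshold):
            yield i, daily_visits[i], expected

# A steady week of ~1,000 visits, then a sudden drop to 400:
visits = [980, 1010, 1005, 990, 1020, 995, 1000, 400]
for day, actual, expected in traffic_drop_alerts(visits):
    print(f"Day {day}: {actual} visits vs ~{expected:.0f} expected - time to investigate")
```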
#2 Uptime monitoring
It doesn't take a rocket surgeon to realise that SEO is dependent on your website. And not only on how you optimise your site, but also on it being available.
While for larger clients, it shouldn't be your job to alert someone if their website goes down, it does no harm to know and for smaller clients there is every chance you'd be adding significant value by keeping an eye on these things.
I have both good and bad reasons for knowing a lot about server monitoring:
- the good: we made a small investment in Server Density in May last year (and scored our only link from TechCrunch in the process)
- the bad: we've been more enthusiastic users of our portfolio company's services than we might have hoped - some annoying server issues have resulted in more downtime for distilled.net than I care to think about. To add insult to injury, we managed to get ourselves hit with a DDoS attack last week (see speed chart below)
There are three main elements you might want to monitor:
- Pure availability (including response code)
- Server load and performance
- Response speed / page load time
Website availability
There are two services I recommend here (plus a DIY sketch after the list):
- Pingdom's free service monitors the availability and response time of your site
- Server Density's paid service provides more granular alerting and graphing as well as tying it together with your server performance monitoring
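If you just want a feel for what a bare-bones availability check involves, here's a minimal sketch in Python. It's only the core idea - a real service like Pingdom checks from multiple locations, retries, and handles the alerting for you:

```python
# A bare-bones availability check - a sketch of the idea only.
# Requires the third-party requests library: pip install requests
import requests

def check_site(url, timeout=10):
    """Return (is_up, status_code, seconds) for a single URL."""
    try:
        response = requests.get(url, timeout=timeout)
        return response.status_code == 200, response.status_code, response.elapsed.total_seconds()
    except requests.RequestException:
        # DNS failure, connection refused, timeout etc. all count as "down"
        return False, None, None

up, status, seconds = check_site("https://www.distilled.net/")
print(f"OK: {status} in {seconds:.2f}s" if up else f"ALERT: down (status: {status})")
```

Run something like that from cron every couple of minutes and email yourself on failure, and you have the crudest possible uptime monitor.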
Here's what the Server Density dashboard looks like:
And here is the response time graph from Pingdom:
You can see the spike in response time during the DDoS attack, and the lower average response time over the last few days since we implemented CloudFlare.
Incidentally, you may not have noticed (it had passed me by until Mike gave me the heads-up the other day) that Google rolled out site speed reporting to all Analytics accounts without the previously required change to the GA snippet, so you can get some of this data from your GA account now. Here's the technical breakdown for some of Distilled's pages:
#3 Robots exclusion protocol and status codes
This was the most ambitious of my ideas for SEO monitoring. It came out of a real client issue. A major client was rolling out a new website and managed to deploy an old / staging version of robots.txt on a Saturday morning (continuous integration FTW). It was essentially luck that the SEO running the project was all over it, spotted it quickly, called the key contact and got it rolled back before it did any lasting harm. We had a debrief the following week where we discussed how we could get alerted to this kind of thing automatically.
I went to David Mytton, the founder of Server Density and asked him if he could build some features in for you lot to alert when this kind of thing happens - if we accidentally noindex our live site or block it in robots.txt. He came up with this ingenious solution that uses functionality already present in their core platform:
Monitoring for any change to robots.txt
First create a service to monitor robots.txt - here's ours:
Then create an alert to tell you if the MD5 hash of the contents of robots.txt changes (see a definition of MD5 here):
If you copy and paste the contents of your robots.txt into an MD5 generator you get a string of gobbledegook (ours is "15403cbc6e028c0ec46a5dd9fffb9196"). What this alert is doing is monitoring for any change to our robots.txt so if we deploy a new version I will get an alert by email and push notification to my phone. Wouldn't it be nice to get alerted in this way if a client or dev team pushed an update to robots.txt without telling you?
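If you don't use Server Density, the same check is easy to approximate yourself. This is just a sketch of the idea (not their implementation) - fetch robots.txt, hash it, compare against the hash you expect, and run it regularly from cron or similar:

```python
# Sketch: alert if the MD5 hash of robots.txt ever changes.
# Requires the third-party requests library: pip install requests
import hashlib
import requests

EXPECTED_HASH = "15403cbc6e028c0ec46a5dd9fffb9196"  # current MD5 of our robots.txt

def robots_txt_hash(url="https://www.distilled.net/robots.txt"):
    """Fetch robots.txt and return the MD5 hash of its contents."""
    return hashlib.md5(requests.get(url).content).hexdigest()

current = robots_txt_hash()
if current != EXPECTED_HASH:
    print(f"ALERT: robots.txt has changed! New MD5: {current}")
```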
Spotting the inclusion of no-index meta tags
In much the same way, you can create alerts for specific strings of text found on specific pages - I've chosen to get an alert if the string "noindex" is found in the HTML of the Distilled homepage. If we ever deployed a staging version or flipped a setting in a WordPress plugin, I'd get a push notification:
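Again, a DIY sketch of this kind of check might look like the following. Note that naively searching the raw HTML for "noindex" can false-positive if the word appears in your page copy - a stricter version would parse the actual meta robots tag:

```python
# Sketch: alert if "noindex" appears anywhere in the homepage's HTML.
# Requires the third-party requests library: pip install requests
import requests

def page_contains_noindex(url="https://www.distilled.net/"):
    """Return True if the string 'noindex' appears in the page source."""
    return "noindex" in requests.get(url).text.lower()

if page_contains_noindex():
    print("ALERT: 'noindex' found on the homepage - check the latest deploy!")
```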
Doing this kind of monitoring is essentially free to me because we are already using Server Density to monitor the health of our servers so it's no extra effort to monitor checksums and the presence / absence of specific strings.
#4 Bonus - why stop there?
Check out all the stuff that Etsy monitors and has alerts for. If you have a team that can build the platform / infrastructure, then there are almost unlimited things you could monitor for and alert about. Here are some ideas to get you started:
- status codes - 404 vs 301 vs 302 vs 500 etc. (sketched after this list)
- changes in conversion rates / cart abandonment
- bot behaviour - crawling patterns etc - given how disproportionately interested I was in the simple "pages crawled" visualisation available in CloudFlare (see below - who'd have guessed we get crawled more by Yandex than Google?), I feel there is a lot more that could be done here:
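Coming back to the first idea on that list, here's a rough sketch of a status code check. The URLs and expected codes below are purely illustrative - swap in your own money pages, known redirects and deliberate 404s:

```python
# Sketch: check key URLs against the status codes we expect them to return.
# Requires the third-party requests library: pip install requests
import requests

EXPECTED_CODES = {  # illustrative URLs - use your own
    "https://www.distilled.net/": 200,
    "https://www.distilled.net/old-page/": 301,
    "https://www.distilled.net/no-such-page/": 404,
}

for url, expected in EXPECTED_CODES.items():
    # allow_redirects=False so a 301/302 is reported as-is, not followed
    actual = requests.get(url, allow_redirects=False).status_code
    if actual != expected:
        print(f"ALERT: {url} returned {actual}, expected {expected}")
```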
PS - today is the last day for early bird discounts on our Linklove conferences in London and Boston at the end of March / beginning of April. (There's also a sign-up form on that page if you want to make sure you always hear about upcoming conferences and offers). I hope to see many of you there.
The best way of SEO monitoring I have found over the years is through collaboration with your web developer(s) and clients. Most of the time these guys are responsible for making direct changes to the website, so if you know what is going on in their heads and what they are going to do today (in advance) on your precious SEOed website, you are aware of the possible SEO consequences. For example, say you learn that today your client's developer will work on rewriting URLs. Knowing this in advance, you can give him some tips on SEO best practices for rewriting URLs before he actually starts working on the task (usually in his own unorthodox way). You'd really wish you knew his 'to-do' list in advance.
The harder and more time-consuming way is to stay in touch with him on a regular basis via Skype without making him feel like you are micromanaging him. The better way is to use a project management tool like Basecamp, where you know who is doing what and when. I have been using Basecamp for the last several years and have found it indispensable.
One of my websites was attacked twice by malware, and I only found out when I saw the warning in my Google Webmaster Tools account. By that time anyone coming to my website was being warned by Google not to proceed. I learned an important lesson and seriously considered investing in website backup and monitoring tools. Then I came across an awesome tool called CodeGuard. This tool takes automatic backups of your site, checks your site hourly for any malicious change to its code and alerts you via email. You can then choose to roll back the changes, and above all the service costs just £8 a month. Considering its enormous benefits I think it is a small price to pay. I consider this cost part of my hosting fees now and recommend all my clients use this service. Imagine how much in sales an ecommerce website can lose in a single day once Google labels it a host of malware.
I use Robotto to monitor any changes to the robots.txt file. It is also good practice to check the robots.txt file of your client's website once every day. For monitoring content theft I check the trackbacks on my WordPress dashboard. A few weeks ago some guy scraped all the content and design of seobook.com page by page. Bill Slawski was able to detect this only because of his trackback URLs: when the guy scraped the content, he also scraped all the internal and external links of the site, including a link to Bill's website. This incident just proves how important it is to keep an eye on trackbacks, as the next website could be yours.
For monitoring backlinks I use Linkstant - a great tool for finding out immediately who has just linked to your site. This tool is especially useful if you manage a link building team or outsource your link building and would like to know what type of links your team is acquiring. As soon as you start seeing a bunch of spammy links from forums, you can call your link builder and ask him to stop before it is too late.
I crawl clients' websites once a week with the Screaming Frog SEO Spider to check for crawling and indexing issues like rel=canonical, noindex, nofollow, duplicate title tags, response codes etc. You never know when some pages will start using noindex, nofollow or rel=canonical, or when some pages will start using multiple title tags. Call this a weekly health check. Screaming Frog has nice drop-down menus through which you can easily filter out any issue - no need to download any data unless you want to act on it. The 'Crawl Diagnostics' reports in the SEOmoz Pro app are also useful for monitoring site crawling and indexing issues.
Woah! Thanks Himanshu... just write down the names of some of the tools you cite :D
Awesome followup! Codeguard looks VERY useful. Thanks!
Just set up Linkstant yesterday, so nice catch!
Linkstant = relationship building with link partners FTW
Hi Himanshu, thanks for your valuable comment and lots of thanks for those links. Hope your SEO takeaways and your event management blogs are doing well.
Nice post Will - the Server Density robots.txt / file monitor would make a handy free tool :-)
I'm surprised not to see more mentions of https://ifttt.com/ in the comments! For those of you that are already up to speed - I found this handy little cookbook: https://www.makeuseof.com/tag/10-great-ifttt-recipes-automate-web-life/ - I'd love to see more link monitoring ideas posts in YOUmoz with ifttt :-)
Nice idea.. If I may suggest writers... Dan Shure and Paul Gailey, you are advised :)
Thumbs up for the free tool for #3. I wanted to try it straight away...
Great post Will, and I hope there are a few people reading this who realise they do no checks at all and are sitting ducks (and then go and change their gameplan right now!).
I thought I'd throw in a few more ideas below:
Email forwarding: As with the robots.txt example, it can be very easy for a client to update a site with a page containing an old, no-longer-used email address (or to cancel an email address that exists on the site). If this happens, your email server should have a catch-all routing address that forwards any emails that can't be delivered to a central inbox. Worryingly, a lot of people don't have this and miss out on potential leads.
Broken sales process: There are a few ways to monitor this. If your ecommerce system allows you to set up filters for times when revenue dips below an expected %, that's one option. If that isn't an option, you can schedule regular tasks via Mechanical Turk for people to complete dummy purchases throughout the day, and to alert you if they fail. This is easier (and cheaper) than hassling internal staff throughout the day to test the site.
Content theft: For large sites content theft is a big issue, with scraped content being posted elsewhere on a regular basis. Using a mixture of Copyscape, brand monitoring tools and a DMCA account, you can get regular updates when content (or your entire site) is scraped. These three tools used together let you automate the whole process, so they're a manageable way to do this on a large scale.
Mike, the one thing I would absolutely like to monitor is the changing status of external links to clients' websites. Unless you go through and manually check each link you have built, it is very difficult to tell where the site is losing link juice. In addition, there might be directory sites that initially post your listing but later remove it without you knowing. How would you do this?
From my knowledge, Majestic only provides a number and not a detailed report.
Good to hear everyone's thoughts on this.
Thanks Vahe, that's definitely a problem. Linkdex actually does that - any links that drop out are mentioned in the report - so give them a go, as that should help.
Are there any other tools? Just thinking about it from a cost perspective, as we already use several other tools for the other features.
Good job!
This is a good example of an "advice post". I'd never realized the importance of alerts in SEO. I'm sure going to implement some of them.
Thanks Will!
Being able to monitor status codes - 404 vs 301 vs 302 vs 500 etc. - would be great. I always forget that you can set up alerts and how important monitoring these things is - it will save a lot of heartache!
Actually that's a good point - faster recrawl on specific pages inside the Pro toolset would make for a handy little alerting tool :-)
Will, you have provided actionable alerts which I can use to monitor my sites in more comfort, like a detective LOL. You're right that we should watch for the symptoms so that we're prepared to take the required actions to minimize the loss before it hurts. The part I liked most was the custom Server Density alert for the robots.txt file's URL path and the "noindex" phrase found in the HTML page. This was one of the serious issues I was suffering from (because the developers make changes without knowing the possible consequences). That was a great read for me, Will. Thanks a lot. Thumbs up to you :) BTW I found a 404 error for your MD5 definition link, i.e. https://dret.net/glossary/md5
Thanks for the great post. I hadn't come across pingdom.com before. Keep up the great work.
Super stuff!!! Now I can keep a hawk-like eye on robots.txt & other important files...
Thanks for the ideas. I'm a big fan of Google Analytics because of its simplicity. There's always something new to learn and improve on in this mad internet world. Uptime is surely something to keep an eye on.
I love the Google Analytics Intelligence, I didn't know about it before reading this. Thanks!
Now that is a lot of data and information. I know my company tracks site live status, and we tend to use GA Intelligence to track changes in overall traffic on a day-by-week scale (i.e. this day vs. the same day last week).
Nice post Will...
It is always good to have alerts (unlike Google Alerts, which responds only when it likes), as they help you work against any uncertain issue at the early stages rather than waiting for things to get worse! I have been using the GA custom alert for drops in traffic for quite a while now... Also, in my opinion, checking GA for multiple things at least two or three times a week is always good!
WARNING!!!! Be careful.
As I understand it, the Linkstant domain has expired and the new owners have written a script to redirect visitors away from your site.
After investigation by our webmaster, this is what we have found.
Maybe Will has more information?
This has happened to me as well.
I just removed the code from my site!!!
As I understand it, here's what happened - the domain expired, and something with the way the hosting company handled the expired domain caused the redirect you were seeing. I don't believe that the guys' code was deliberately designed to add the redirect. This should all now be fixed as they have renewed the domain - see twitter here: https://twitter.com/#!/Linkstant/status/202412221408616448 and here: https://twitter.com/#!/Linkstant/status/202428607296057344.
Hope that helps.
Hi Exclaimer,
I'm Rob, I built Linkstant. The domain was briefly unregistered; I've posted a more complete explanation here: https://www.seomoz.org/q/linkstant-redirect-issues#post-89523
The traffic drop monitoring tip is handy. Does anyone know a good free tool for monitoring backlinks you've acquired (making sure they're still there)? I've found a number of paid services that do it, but can't seem to find any free solutions.
Custom alerts are awesome! It's actually the topic of my next blog post. I didn't know all the other tools mentioned, so thanks for sharing them with us!
<3 Pingdom! Their blog is really good for techies as well - really interesting posts.
#1. Google Analytics: I like the info and suggestions for using Intelligence. I need to use more of this myself. I already have the alerts and reports being sent; thanks for the push on using the Intelligence feature. Reading some of the past articles has already helped me with tracking.
#2. Uptime Monitoring Software: Nice website monitoring tools. I used to use these all the time and just became a little lazy about it. I will have to reintegrate them into my tool set.
These monitoring tools are also very good and helpful not just for traffic and uptime monitoring but for monetization. If you are monetizing your site based on traffic or page view models and your site is down, it could cost you $100s if not $1,000s if you don't catch it soon enough.
Thanks
- Cap
I had no idea Google analytics allowed you to monitor traffic drops! I really need to sit down and explore analytics when I've got an afternoon spare...
Top post Will - always great advice
Hi Will,
Great post with some fab actionable tips.
I've been using Google Analytics custom reports and that's proven incredibly useful!! I haven't come across Server Density before though and will definitely take a look.
Cheers!
Thanks Will, this is a great selection of practical advice and real-world experience. It's so important to be vigilant and maintain awareness of changes in activity on your site.
Excellent notes, the takeaway for me will be setting up alerts to drops in traffic. Genius, thanks for bringing it up!
Great post Will.
For those with no budget to invest in Server Density, there is a great free uptime/downtime monitoring tool called Uptime Robot. When the server goes down/up you get notified by email as well as by a Twitter DM. All downtimes are recorded and you have access to all that information at any time: https://www.uptimerobot.com/
Most of the time site owners aren't aware their site has gone down, so these kinds of tools can give extra credit to SEOs.
Yep, we use Uptime Robot & Boxcar to make sure these important alerts are obvious on iOS.
Sound advice as ever Will.
I've long been a fan of custom GA alerts and Pingdom services, although we also use a series of search tools from Semetrical to monitor a selection of server monitoring and site performance requirements.
Regarding GA alerts, it's worth being mindful of your 'normal' site traffic levels and the potential impact of any offline activity or promos you've run. It's worth tweaking the sensitivity of your GA alerts, or at least notifying colleagues of any anticipated drops. If traffic has recently increased significantly due to a promo, a drop back down could be on the cards, and you will still be alerted.
Simple, effective tools. Useful post, thank you!
May need to review our custom alerts after reading this. Great article!
Useful post Will. Some of the GA alerts were already known and implemented by me, thanks also to those posts you cited and linked to, but others were not... so this is a very welcome post. And I didn't know Server Density, which looks like a very interesting service indeed. I will take it into account for some of my clients.
See you and the Distilled band in London for Linklove (ticket bought).
I've never actually implemented these Custom Alerts, but I'm going to do that straight away. And web developers changing robots.txt/.htaccess files without notice will never happen again.
Thanks Will
p.s. "gobbledegook"? Nice :)
Will, this is something really new for me. SEOmoz and Raven were the only tools I had ever used for monitoring, and I had never thought that uptime monitoring could be done or could be helpful like this. I am definitely going to try Server Density. Thanks for the Distilled advice, Will.
Great stuff, although a bit of a given. I was expecting to see a post with some betas and some fancy new future-proofed tools rather than last year's bits 'n' bobs. Great stuff for beginners, thanks for sharing.
These tools (especially Himanshu's suggestions) definitely help monitor client websites. They would help me dictate site recommendations and provide commentary when unexpected changes occur, rather than looking like a fool when the opposite happens. Either way, I think clients see this as part of the SEO service they pay for.
Thanks for the great post Will.
Thanks for the suggestions, although I'd offer a word of caution when setting up alerts: don't go overboard and monitor stuff you really don't care about.
You can easily find eyeballs glossing over alerts, unable to pick out the really important stuff, because you're tracking too many notifications! All these tools are only as good as the person receiving the alerts!
Also remember to make sure that your Google Webmaster Tools messages are being sent to a valid address. (whoops!)
Great article and great inspiration... a lot of technical stuff that influences the traffic numbers for sure. Best regards, Henrik Sandberg, seocustomer.com
I wasn't aware of the Intelligence alerts in Google Analytics. I'm assuming you can set them up to be alerted to upward spikes in traffic, not just drops. One more thing on my to-do list!
Great, effective tool... Sergio Redondo, you are absolutely right.
Thank You