Last November, I authored a popular post on SEOmoz detailing 15 SEO Problems and the Tools to Solve Them. It focused on a number of free tools and SEOmoz PRO tools. Today, I'm finishing up that project with a stab at another set of thorny issues that continually confound SEOs and how some new (and old) tools can come to the rescue.
Some of these are obvious and well known; others are obscure and brand new. All of them solve problems - and that's why tools should exist in the first place. Below, you'll find 20+ tools that answer serious issues in smart, powerful ways.
#1 - Generating XML Sitemap Files
The Problem: XML Sitemap files can be challenging to build, particularly as sites scale over a few hundred or few thousand URLs. SEOs need tools to build these, as they can substantively add to a site's indexation and potential to earn search traffic.
Tools to Solve It: GSiteCrawler, Google Sitemap Generator
GSiteCrawler: Downloadable software to create XML Sitemaps
Download a few files from Google Code and Install on Your Webserver
Looks like Google Webmaster Tools, doesn't it? :-)
Both GSiteCrawler & Google Sitemap Generator require a bit of technical know-how, but even non-programmers (like me) can stumble their way through and build efficient and effective XML Sitemaps.
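For the curious, here's a rough sketch of what these tools produce under the hood - the standard sitemap.xml format defined by sitemaps.org. This is just a minimal Python illustration with placeholder URLs, not a replacement for either tool:

```python
# Minimal sketch: write a bare-bones sitemap.xml from a list of URLs.
# The URLs and changefreq/priority values below are placeholders.
from xml.sax.saxutils import escape

urls = [
    {"loc": "https://www.example.com/", "changefreq": "daily", "priority": "1.0"},
    {"loc": "https://www.example.com/blog/", "changefreq": "daily", "priority": "0.8"},
]

entries = []
for u in urls:
    entries.append(
        "  <url>\n"
        f"    <loc>{escape(u['loc'])}</loc>\n"
        f"    <changefreq>{u['changefreq']}</changefreq>\n"
        f"    <priority>{u['priority']}</priority>\n"
        "  </url>"
    )

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + "\n".join(entries)
    + "\n</urlset>"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
```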
#2 - Tracking the Virality of Blog/Feed Content
The Problem: Even experienced bloggers have trouble predicting which posts will "go wide" and which will fall flat. To improve your track record, you need historical data to help show you where and how your posts are performing in the wild world of social media. What's needed is a cloud based tracking tool that can sync up with the Twitters, Facebooks, Diggs, Reddits, Stumbleupons & Delicious' of the web to provide these metrics in an easy-to-use, historical view.
Tools to Solve It: PostRank Analytics
PostRank's nightly emails keep me wracking my brains for better blog post ideas
PostRank sends me nightly reports on how the SEOmoz blog performs across the web - numbers from Digg, Delicious, Twitter, Facebook and more. By using this, I can get a rough sense of how posts perform in the social media marketplace and, over time, hopefully train myself to author more interesting content.
Addition: Melanie from PostRank added a discount code in the comments for SEOmoz users! Use the coupon code "SEOmoz" to get three free months instead of just one.
#3 - Comparing the Relative Traffic Levels of Multiple Sites
The Problem: We all want to know not only how we're doing with web traffic, but how it compares to the competition. Free services like Compete.com and Alexa have well-documented accuracy problems and paid services like Hitwise, Comscore & Nielsen cost an arm and a leg (and even then, don't perform particularly well with sites in the sub-million visits/month range).
Tools to Solve It: Quantcast, Google Trends for Websites
If a site has been "Quantified," no other competitive traffic tool on the web will be as accurate
Since both sites are "Quantified," I can be sure the data quality is excellent
I've complained previously about the inaccuracies of Alexa (as have many others). It's really for entertainment purposes only. Compete.com is better, but still suffers from lots of inaccuracy, data gaps, directionally wrong estimates and a general feeling of unreliability in the marketplace. Quantcast, on the other hand, is excellent for comparing sites that have entered their "Quantified" program. This involves putting Quantcast's tracking code onto each page of the site; you're basically peeking into their analytics.
Sadly, Quantcast isn't on every site (and their guesstimates appear no better than Compete when they don't have direct data). Fortunately, one organization has stepped up with a surprisingly good alternative - Google.
Google Trends for Websites allows you to plug in domains and see traffic levels. Much like AdWords Keyword Tool, the numbers themselves seem to run high, but the comparison often looks much better. Google Trends has become the only traffic estimator I trust - still only as far as I could throw a Google Mini, but better than nothing.
#4 - Seeing Pages the Way Search Engines Do
The Problem: Every engineering & development team builds web pages in unique ways. This is great for making the Internet an innovative place, but it can make for nightmares when optimizing for search engines. As professional SEOs, we need to be able to see pages, whether in development environments or live on the web, the same way the engines do.
Tools to Solve It: SEO-Browser, Google Cached Snapshot, New Mozbar
A longtime favorite site of mine, SEO Browser lets you surf like an engine
Poor Google; that's all they see when they crawl our pretty site
SEO-Browser is a great way to get a quick sense of what the engines can see as they crawl your site's pages and links. The world of engines may seem a bit drab, but it can also save your hide in the event that you've put out code or pages that engines can't properly parse.
I wonder if Googlebot ever gets tired of blue, purple and gray...
Google's own cached snapshot of a page (available via a search query, as a bookmarklet, or in the mozbar's dropdown) is the ultimate research tool to know what the engine "sees." The only trouble is that it works in the past only (and only on pages that allow caching). To get a preview, SEO Browser or our friend below can be useful.
The mozbar lets you dress up like Google whenever the occasion is right
One of Will Critchlow's feature requests in the new mozbar was the ability to switch user agents, turn off JavaScript and images and, in essence, become the bot in your browser. Luckily, he also forced us to place a gray overlay in the right-hand corner that alerts you to the settings you've changed and gives you an easy, one-click "return to normal." Browsing like a bot = solved!
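If you want to approximate the same trick without any toolbar, a few lines of script will do it - fetch the raw HTML with a crawler-style user agent and ignore JavaScript and images entirely. This is just an illustrative sketch (the URL is a placeholder), not how the mozbar itself works:

```python
# Fetch a page the way a crawler roughly sees it: raw HTML only,
# no JavaScript execution, no images, crawler user agent.
import urllib.request

url = "https://www.example.com/"  # placeholder
req = urllib.request.Request(
    url,
    headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                           "+http://www.google.com/bot.html)"},
)

with urllib.request.urlopen(req, timeout=10) as resp:
    html = resp.read().decode("utf-8", errors="replace")

# A crawler's-eye view is just the markup: title, links, text.
print(html[:500])
```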
#5 - Identifying Crawl Errors
The Problem: Discovering problems on a site like 302 redirects (that should be 301s), pages that are blocked by robots.txt (here's why that's a bad idea), missing title tags, duplicate/similar content, 40x and 50x errors, etc. is a task no human can efficiently perform. We need the help of robots - automated crawlers who can dig through a site, find the issues and notify us.
Tools to Solve It: GSiteCrawler, Xenu, GGWMT
Mmmm... Parallel Threads
She canna hold on much longer cap'n!
We've already covered GSiteCrawler in this post, but for those unaware, it can be a great diagnostic tool as well as a Sitemap builder. Xenu is much the same, though somewhat more intuitive for this purpose. Tom's written very elegantly about it in the past, so I won't rehash much, other than to say - it shows errors & potential issues Google Webmaster Tools doesn't, and that can be a lifesaver.
Doh! I think we messed up some stuff when KW Difficulty relaunched :(
Google Webmaster Tools is extremely popular, well known and well used. And yet... lots of us still have crawl errors we haven't addressed (just look at the 500+ problems on SEOmoz.org in the screenshot above). Exporting to Excel, sorting, and sending to engineering with fixes for each type of issue can save a lot of heartache and earn back a lot of lost traffic and link juice.
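If you're curious what these crawlers are doing at their core, it boils down to requesting each URL, recording the raw status code, and flagging 302s and 4xx/5xx responses. Here's a stripped-down sketch with placeholder URLs - a real crawler obviously discovers URLs on its own and checks titles, duplicates and much more:

```python
# Minimal crawl-error check: report raw status codes without following
# redirects, so 302s (which often should be 301s) are visible.
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # don't follow, so the original 3xx status surfaces

opener = urllib.request.build_opener(NoRedirect)
urls = ["https://www.example.com/", "https://www.example.com/old-page"]  # placeholders

for url in urls:
    try:
        status = opener.open(url, timeout=10).getcode()
    except urllib.error.HTTPError as e:
        status = e.code  # unfollowed 3xx and all 4xx/5xx arrive here
    except urllib.error.URLError as e:
        print(f"{url}: connection error ({e.reason})")
        continue

    if status == 302:
        print(f"{url}: 302 - consider a 301 if the move is permanent")
    elif status >= 400:
        print(f"{url}: {status} error")
    else:
        print(f"{url}: {status} OK")
```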
#6 - Determine if Links to Your Site Have Been Lost
The Problem: Sites don't always do a great job maintaining their pages and links (according to our data, 75% of the web disappears in 6 months). Many times, these vanishing pages and links are of great interest to SEOs, who want to know whether their link acquisition and campaigning efforts are being maintained. But how do you confirm if the links to your site that were built last month are still around today?
Tools to Solve It: Virante's Link Atrophy Diagnosis
Does that mean Stuntdubl & SEOmoz are "going steady?"
This tool comes courtesy of the great team over at Virante, and it's a pretty terrific application of an SEO need and Linkscape data through the SEOmoz API. The tool will check the links reported from Linkscape/Open Site Explorer and determine which, if any, have been lost. Many times it's just links off the front page of blogs or news sites as archives fall to the back, but sometimes it can help you ID a link partner or source that's no longer pointing your way in order to facilitate a quick, painless reclamation. The best part is there's no registration or installation required - it's entirely plug and play.
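The underlying idea is simple to illustrate, even though Virante's actual implementation (backed by Linkscape data) does far more. In essence: take pages that once linked to you and re-check whether the link is still there. A hypothetical sketch with made-up URLs:

```python
# Re-check pages that previously linked to a target domain and flag
# any that no longer do. Linking pages and domain are placeholders.
import urllib.request

known_linkers = [
    "https://blog.example.org/some-post",
    "https://news.example.net/weekly-roundup",
]
target_domain = "example.com"

for page in known_linkers:
    try:
        req = urllib.request.Request(page, headers={"User-Agent": "link-check/0.1"})
        html = urllib.request.urlopen(req, timeout=10).read().decode("utf-8", errors="replace")
    except Exception as exc:
        print(f"LOST? {page} could not be fetched ({exc})")
        continue
    if target_domain in html:
        print(f"OK    {page} still links to {target_domain}")
    else:
        print(f"LOST  {page} no longer mentions {target_domain}")
```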
Addition: Russ from Virante added a discount code in the comments for SEOmoz users! Use the coupon code "seomoz30" in order to get more results from these tools.
#7 - Find 404 Errors on a Site (without GG WM Tools) and Create 301s
The Problem: Google's Webmaster Tools are great for spotting 404s, but the data can be, at times, unwieldy (as when thousands of pages are 404ing, but only a few of them really matter) and it's only available if you can get access to the Webmaster Tools account (which can stymie plenty of SEOs in the marketing department or from external consultancies). We need a tool to help spot those important, highly linked-to 404s and turn them into 301s.
Tools to Solve It: Virante's PageRank Recovery Tool
3.99 mozRank for ~0.00 effort
The thinking behind this tool is brilliant, because it solves a problem from end to end. By not only grabbing well-linked-to pages that 404, but actually writing the code to create an .htaccess file with 301s to your choice of pages, the tool is a "no-brainer" solution.
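To make the "end to end" point concrete, here's a hand-rolled sketch of the same idea (not the tool's actual output): map your dead, well-linked URLs to live replacements and write out Apache 301 rules. The paths are hypothetical:

```python
# Emit .htaccess 301 rules from a mapping of dead URLs to live ones.
# The old/new paths below are hypothetical examples.
redirects = {
    "/old-category/popular-post": "/blog/popular-post",
    "/tools/retired-tool": "/tools/",
}

rules = [f"Redirect 301 {old} {new}" for old, new in redirects.items()]

with open(".htaccess", "a", encoding="utf-8") as f:
    f.write("\n".join(rules) + "\n")

print("\n".join(rules))
```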
#8 - See New Links that are Sending Traffic (and Old Ones that Have Stopped)
The Problem: Most analytics tools have an export function that, combined with some clever Excel, could help you puzzle out the sites/pages that have started to send you traffic (and those that once were but have stopped). It's a pain - manual labor, easy to screw up and not a particularly excellent use of your precious time.
Tools to Solve It: Enquisite
I love the ability to look across the past few months and see the trend of new pages and new domains sending links, as well as identifying links that have stopped sending traffic. Some of those may be ripe for reclamation; others might just need a nudge to mention or link over in their next piece/post. This report is also a great way to judge how link building campaigns are performing on a less SEO-focused axis: sending direct traffic.
#9 - Research Trending/Temporal Popularity of Keywords
The Problem: Keyword demand fluctuates over time, sometimes with little warning. Knowing how search volume is impacted by trending and geography is critical to SEOs targeting fields with these demand fluxes.
Tools to Solve It: Google Insights, Trendistic
Hmmm.... Maybe we should launch Open Webmaster Tools next?
We need to make it out to India & Brazil more often, too!
Google Insights is great for seeing keyword trending, related terms and countries of popularity (though the last of these we've found to be somewhat suspect at times). However, sometimes you're really interested in what's about to become popular. For that, turning to trend sites can be a big help.
Although Trendistic doesn't yet have a "suggest" feature to help identify terms & phrases that may soon become popular searches, it does help establish the "tipping point" at which a buzzword on Twitter may become a trend in web search. As we've discussed in the Whiteboard Friday on Twitter as an SEO Research Tool, finding the spot at which search volume begins spiking can present big opportunities for fresh content.
#10 - Analyze Domain Ownership & Hosting Data
The Problem: When researching domains to buy, considering partnerships or conducting competitive analysis, data about a site's hosting and ownership can be essential steps in the process.
Tools to Solve It: Domaintools
We should make sure to re-register this domain...
Long the gold standard in the domainer's toolbox, DomainTools (once called whois.sc) provides in-depth research about a domain's owners, their server and, sometimes most interestingly, the other domains owned by that entity. BTW - they're spot on; SEOmoz owns about 80 other domains besides our own (though we only really use this one and OpenSiteExplorer right now).
#11 - Investigate a Site/Page's History
The Problem: What happened on this page last month or last year? When conducting web research about links, traffic and content, we all need the ability to go "back in time" and see what had previously existed on our sites/pages (or those of competitors/link sources/etc). Did traffic referrals drop? Have search rankings changed dramatically? Did a previously available piece of content fall off the web? The question really is - how do we answer these questions?
Tools to Solve It: Wayback Machine
Before 2005, we were on a different domain!
If you remember this version of the site, you're officially "old school"
Yeah, yeah, you've probably heard of the Wayback Machine, powered by Alexa's archive of the Internet and endlessly entertaining to web researchers and pranksters alike. What might surprise you is how valuable it can be as an SEO diagnostic tool, particularly when you're performing an investigation into a site that doesn't keep good records of its activity. Reversing a penalty, a rankings drop, an oddity in traffic, etc. can consume massive amounts of time if you don't know where to look and how. Add Wayback to the CSI weapons cache - it will come in handy.
#12 - Determine Semantically Connected Terms/Phrases
The Problem: Chances are, the search engines are doing some form of semantic analysis (looking at the words and phrases on a page around a topic to determine its potential relevance to the query). Thus, employing these "connected" keywords on your pages is a best practice for good SEO (and probably quite helpful to users in many cases as well). The big question is - which words & phrases are related (in the search engines' eyes) to the ones I'm targeting?
Tools to Solve It: Google Wonder Wheel
Nothing about "Yellow Shoes?"
We don't know for certain that this is a technique that provides massive benefit, but we're optimistic that tests are going to show it has some value. If you'd like to participate in the experiment, take related phrases from the Wonder Wheel and employ on your pages. Please do report back with details :-)
#13 - Analyze a Page's Optimization of Images
The Problem: When image search and image accessibility/optimization are critical to your business/client, you need tools to help analyze a page's consistency and adherence to best practices in handling image dimensions, alt attributes, etc.
Tools to Solve It: Image Analyzer from Juicy Studio
Doh! We need to add some dimensions onto our images.
It's not the prettiest tool in the world, but it does get the job done. The image analyzer will give any page a thorough evaluation, showing missing alt tags and image dimensions (which can help with page rendering speed) and listing the names/alts in a thorough report. If you have image galleries you're aiming to optimize for image search, this is a great diagnostic system.
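The checks themselves are easy to reproduce in a script if you ever need a quick audit outside the tool - here's an illustrative Python sketch (placeholder URL) that flags images missing alt text or dimensions:

```python
# Parse a page's <img> tags and flag missing alt text or width/height.
import urllib.request
from html.parser import HTMLParser

class ImgAudit(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        a = dict(attrs)
        problems = []
        if not a.get("alt"):
            problems.append("missing alt")
        if "width" not in a or "height" not in a:
            problems.append("missing dimensions")
        if problems:
            print(f"{a.get('src', '(no src)')}: {', '.join(problems)}")

url = "https://www.example.com/"  # placeholder
html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
ImgAudit().feed(html)
```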
#14 - Instant Usability Testing
The Problem: Fast feedback on a new landing page, product page, tool design or web page (of any kind) can be essential to smoothing over rough launches. But tools aren't enough - we need actual human beings (and not the biased ones in our friend groups or company) giving fast, functional feedback. That's a challenge.
Tools to Solve It: Five Second Test, Feedback Army
It can't be that easy, can it?
Wow... It totally is! Here I am helping give feedback to a local geek squad.
Users are easier to come by than we think
Both FeedbackArmy & FiveSecondTest offer the remarkable ability to get instant feedback from real users on any page, function or tool you want to test, at a fraction of the price normal usability testing requires. What I love is that because it's so easy, that first, critical step of reaching out to users becomes a low barrier to entry. Over time, I hope systems like these help make the web as a whole a more friendly, easy-to-use experience. Now there's no excuse!
#15 - Measure Tweet Activity to a URL Across Multiple URL Shortener Platforms
The Problem: You've got your bit.ly, your j.mp, your tinyurl, your ow.ly and dozens more URL shorteners. Between this plethora of options and standard HTML links pasted into tweets, keeping up with all the places your URL is being shared can be a big challenge.
Tools to Solve It: Backtweets
Tweeting links in the middle of the night is fun!
Bit.ly can track bit.ly and many other services offer their own tracking systems, but only Backtweets is aggregating all of the sources and making it easy to see what people are saying about your pages no matter how they encode it. Now if only we could get this to integrate with PostRank and Search.Twitter.com and Trendistic and make the interface super-gorgeous and have it integrate with Google Analytics... and... and...
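Part of what makes that aggregation hard is that the same page hides behind a dozen different short URLs. One simple normalization trick - which I'd assume services like Backtweets apply in some form - is to follow each short link's redirects to its final landing URL and group counts by that. A sketch with made-up short URLs:

```python
# Expand shortened URLs by following their redirects, then tally
# mentions by the final landing URL. Short URLs below are made up.
import urllib.request
from collections import Counter

short_urls = ["https://bit.ly/xxxxxx", "https://j.mp/yyyyyy", "https://tinyurl.com/zzzzzz"]

counts = Counter()
for s in short_urls:
    try:
        final = urllib.request.urlopen(s, timeout=10).geturl()  # lands after 301/302 chain
    except Exception:
        final = s  # keep the short form if it can't be resolved
    counts[final] += 1

for url, n in counts.most_common():
    print(n, url)
```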
#16 - BONUS: Determining Keyword Competition Levels
Bonus! I mentioned last week in a comment that I'd make a post about the new Keyword Difficulty Tool. Since this post is all about tools anyway, I figured I'd toss it in and save you the trouble of clicking an extra link in your feedreader.
The Problem: Figuring out which keywords have more/less demand than which others is easy (and Google does a great job of it most of the time).
Tools to Solve It: New Keyword Difficulty Tool
The real problem was that our previous keyword difficulty tool attempted to use 2nd order effects and non-direct metrics to estimate the competitiveness level of a particular keyword term/phrase. While it's true that more popular/searched-for keywords TEND to be more competitive, this is certainly not always the case (and in fact, SEOs probably care a lot more about when a keyword has high traffic and relatively weak sites/pages in the SERPs more than anything else). The new tool attempts to fix this by relying on Page Authority (correlation data here) and using a weighted average of the top ranking sites and pages.
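To illustrate the weighted-average idea in the simplest possible terms (treat this as a toy example - the Page Authority numbers are made up and the real tool's weights and scaling aren't reproduced here):

```python
# Toy difficulty score: weighted average of the Page Authority of the
# top 10 ranking pages, with position 1 weighted most heavily.
top_page_authorities = [72, 68, 65, 60, 55, 54, 50, 48, 45, 40]  # made-up PA, positions 1-10

# Simple linearly decaying weights: position 1 gets weight 10, position 10 gets 1.
weights = [len(top_page_authorities) - i for i in range(len(top_page_authorities))]

difficulty = sum(pa * w for pa, w in zip(top_page_authorities, weights)) / sum(weights)
print(f"Estimated keyword difficulty: {difficulty:.1f}%")
```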
Running five keywords at a time is way better than one
(we're working to add more - promise)
The best bet here looks like "best running shoes" - relatively lower difficulty, but still high volume
Oh yeah, looking at the top positions, a few dozen good links and some on-page and we're there
Reverse-engineering the rankings is never easy, but parsing through KW Difficulty reports certainly makes it less time-consuming. Watch out for the scores, though - a 65% is pretty darn tough, and even a 40% is no walk in the park. At last, I feel really good about this tool; it was suffering for a good 18 months, and it's nice to have it back in my primary repertoire with such solid functionality.
I'm sure there are plenty of remarkable tools I've missed and there are likely questions about these problems, too. Feel free to address both in the comments!
p.s. This was written very late at night and I need to be up and on a plane at precisely butt-o'clock tomorrow morning, so editing will have to slide until Jen wakes up and gives this a good once-over. Sorry about any errors in the meantime :-)
Note from Jen: I finally woke up and made a few minor edits. :) I also added a discount code from Virante "seomoz30" AND a discount code from PostRank "SEOmoz". Tools Rule!
Thanks Rand,
and thanks even more for keeping the promise you made in reply to the chat ptech and I had in the comments of the last Dr. Pete post.
Posts like this one are really useful... a sort of "favorites bookmark" with high-value notes.
About tools... a few notes:
When it comes to clients' sites which are not too big, I use Xml-Sitemap-Generator. It's easy to use and gives various options, both for configuring the sitemap and for different file format outputs. And it's online.
I tend to prefer Quantcast, because it's more understandable for me. And many times Google Trends doesn't offer 'trends' for websites with very small traffic (which is a problem for local websites in Italy).
I discovered them by chance while looking at the recommended companies on SEOmoz, a few tabs away from here, and loved their tools at first sight. Their "Duplicate Content Tool" is also useful, as it speeds up the job of checking basic duplication issues.
And as you say, there are many others out there whose existence I surely don't know about (a couple are in your post and I'm going to check them out).
And now I'm waiting for suggestions from other Mozzers.
Ciao
AH!... I almost forgot... a free tool for analytics that can complement GA is Clicky.
It offers real-time web analytics, gives a better understanding of bounce rate data, can be used on iPhone too... and it's free for websites without a huge number of pageviews per day.
P.S.: I've nothing to do directly with any of the websites listed... saying this just to prevent any thumbs down for inappropriate linking.
"I've nothing to do directly with any of the websites listed... saying this just to prevent any thumbs down for inappropriate linking" - LOL, it's quite funny that you actually received a thumbs down only for saying you want to prevent any thumbs down :)
Well... I wrote it because many times people see a link and instinctively think it's spam, and that's not always so...
Moreover, every time I link something I also make sure the page opens in a new tab so the SEOmoz post stays open.
Thanks for your kind words on some of our tools (Virante). I wrote most of them myself, but now that we have a full development team, hopefully we can keep kicking out better and better tools.
Honestly Gianluca, you should apply for SEOmoz Associate status. Your comments are always spot on. Thanks for the additional sites.
Yeah, the duplicate content tool is an old stand-by. It is one of the first things we run (of course, we have a cooler in-house version ;-) ) whenever we get a new client.
Hey Mozzers, Russ from Virante here...
I have created a custom coupon code so you can get many more results from our tools listed above...
Just use the coupon code seomoz30
Thanks Russ!
You 'da man Russ! Total thumbs up dude!
Holy comprehensive, Batman!
Thanks for including PostRank, Rand. I do enjoy when others help me spread the word. :)
One slight correction - while as of just recently we can detect Facebook content, we still can't get StumbleUpon. (We'd love to, but they still don't have a public API.)
For those who'd like to check out Analytics, if you use the coupon code: SEOmoz, you'll get three free months instead of just one. Get started here: https://analytics.postrank.com/register/
(And if anyone's uncomfortable providing credit card info for a trial, that's cool. Just give me a holler and I'll get you organized: [email protected].)
Wow... this post is becoming what we call in Spain a "pasada" (something like an "extremely good thing").
Suggestions of free or freemium tools, the Virante coupon and yours.
Thanks
Wow! Awesome Melanie, I'll update the post with this information. Thanks!
What is up with StumbleUpon anyway? I've been asking them to release a public API for years now. They're going to end up becoming irrelevant at this rate.
What a great post and thread; thanks to Rand and those who have left their comments for sharing such a plethora of great tools. To be honest, I was unaware of some of them.
While we are at it, I want to throw in a couple more (they are free).
Search in a lot of different Google datacenters -- this one is the best of its kind, I think.
Pingdom -- test your page load speed and see which objects are slowing it down (useful to SEO since speed is now officially a ranking factor).
Pingler --ping the search engines easily.
Others will undoubtedly occur to me later. Keep sharing, folks!
Nice additions Philip. I especially liked Pingdom.
Thanks for mentioning the Google Wonder Wheel. It was horribly ignored in the last great post on keyword research.
Here is my small list:
1. Web Developer Firefox add-on: https://addons.mozilla.org/en-US/firefox/addon/60 - There was a whole SEOmoz post on this. Awesome tool for on-page analysis.
2. Page Speed: https://code.google.com/speed/page-speed/ - Tool to speed up your webpages.
3. Search engine SPAM detector: https://tool.motoricerca.info/spam-detector/ - Tool to search for keyword stuffing, hidden text, etc.
4. If-Modified-Since header: https://www.feedthebot.com/ifmodified.html - Tool to find out whether your web server supports the If-Modified-Since HTTP header.
5. Hotlinking checker tool: https://www.free-webhosts.com/hotlinking-checker.php - Tool to test if your images and multimedia files are hotlinked.
Nice list!... but I have to correct a link you gave above (it doesn't work):
Search Engine SPAM detector:
https://tool.motoricerca.info/spam-detector/
Sorry to seem pedantic...
but a problem with the editing of your comment made the word "Tool" appear as if it were part of the URLs you wrote.
Page Speed:
https://code.google.com/intl/en/speed/page-speed/
If modified since header:
https://www.feedthebot.com/ifmodified.html
Hotlinking Checker:
https://www.free-webhosts.com/hotlinking-checker.php
Thanks. I am new to commenting on the SEOmoz blog. I post something but it comes up in a different format. Someone should either come up with a 'preview' button or make it a 'what you see is what you get' editor.
Another dev wishlist item: Come up with a way to keep the extra space insertions when you edit comments.
Maybe it's a browser issue... I had editing problems with Safari when on Mac... and with Chrome sometimes (especially when inserting a link).
No problems with Firefox.
Thanks for the tip Gianluca. Usually I browse in Opera and do SEO in Firefox.
So this is now being written in Firefox.
Now to see if the spaces hold.
New paragraph in edit mode.
Snap! Same old bug in Firefox too.
Know why I sometimes call you Randman? 'Cuz pumping out quality posts like this totally makes you 'da man Rand.
While I'd be hard pressed to decide my favorite category of posts (CRO, Analytics, Business of SEO, etc.), SEO software/siteware is a definite top 3 for me. I love posts like this.
Between your excellent examples, and those in the comments, my next free weekend is going to be spent playing with these new discoveries.
"Hmmm.... Maybe we should launch Open Webmaster Tools next?"
I second that notion.
Great follow-up post. For slightly more in-depth usability testing, I've had some good results from usertesting.com - one particular test changed the whole way we displayed our home page!
Adding a couple of tools and thoughts:
3. Quantcast is great, but I wouldn't discount Compete and Alexa. From what I've seen, they are quite useful tools for websites with more than 100,000 unique visitors per month. In addition to Google Trends, DoubleClick Ad Planner (formerly Google Ad Planner) is built off of the same data as Google Trends. It gives much more detailed information about website traffic, such as visits, page views, demographics, time on site, etc. In my research, I've found it to be highly accurate. The service recently updated its methodology and was submitted for an MRC audit.
15. Backtype is useful, but I actually prefer Topsy. Their API is fantastic, and their link counts seem to be more meaningful than Backtype's. Of course there's also Tweetmeme. For a tool that looks at URL mentions all across the social media landscape, check out UberVu.
And one more to add to the list that flies under most people's radar: the Update Scanner Firefox extension.
You can set frequency for checking as well as a threshold for what triggers an alert. What's nice is it highlights the changes and you can compare between the two.
From an SEO point of view, good for monitoring key URLs on your own/client site, especially when numerous people are making content changes or structural changes are being made.
Or if monitoring potential clients (I hate to walk into a pre-sales meeting only to find out that they just fixed everything you were going to talk about). And of course competitors.
Okay, one more....
WinMerge and/or Notepad++ for file comparison.
Anyone who has done coding and possibly web design will already understand the immense value of a file comparison tool... you can load up a couple of HTML files (perhaps an old one, or one you've pulled from Google's cache) to compare against a new one to identify not only the visual changes but the underlying HTML changes.
Update Scanner sounds sooo useful... you can't imagine how hard it is to make a client understand that he should at least drop a note when he makes a content update on his site (and doesn't have an RSS feed to look at).
Wait....aren't you constantly watching and monitoring the site manually for any changes!?
I do personally monitor clients' sites.
In my comment I meant that Update Scanner sounds useful as a sort of "alarm" if something changes.
Woe betide me if I didn't check my clients' sites - and that tool is something that can make this task easier.
Sorry....was totally being facetious....as if you had nothing else to do but sit at a computer, constantly clicking through a site to monitor any changes.
;)
:)... it's ok... I didn't get it... maybe because I was checking too many updated pages today ;)
This post rocks. I'm especially stoked about Quantcast. We recently had a client who was approached by a competitor who was showing him data from Compete.com to scare him, showing that his site was lagging well behind his top competitors. We were able to debunk it by comparing the traffic pattern on Compete to the traffic pattern in Google Analytics, but it's also great to see some authoritative backup to our assertion that the data was just plain terrible. I'm super excited to have a more reliable alternative, and can't wait to try out the "Create a Network" option!
I would like to quickly add some WordPress plugins that I use which solve some of the problems Rand explained:
1. Google XML Sitemaps - a very easy-to-use XML sitemap generator.
2. Broken Link Checker - this, IMHO, is a little gem; every 72 hours the plugin will monitor your blog looking for broken links and let you know if any are found.
3. Redirection - this plugin will help you manage 301 redirections and keep track of 404 errors.
P.S. Yes, I love to help WordPress with relevant anchor text links :P
Thanks for a very interesting post! I've heard about some of the services you've written about for the first time, and will now research them.
I have 1 addition to your post:
It is also good to use Google Alerts for website monitoring. For each website you monitor, you can create an alert for "websiteurl.com" and easily analyze mentions of your website on the internet!
Google Alerts is also good for competitive analysis, link building (to find potential linking partners in your niche) and, of course, reputation management.
How do you find potential link partners using Google Alerts?
Good question. Say you are targeting 'migraine'-related keywords like 'migraine treatment', 'migraine symptoms', etc. Now go to Google Alerts and set web and blog alerts for these keywords. Whenever someone writes about them, you will get an alert. When you scan these alerts you will find websites, such as blogs, that talk about migraines, their symptoms, treatments, etc. They are your potential linking partners. Now you need to find a way to get links from them. It is a much better way of finding linking partners than manually searching for them on the net. Also, since Google is picky about the sites it shows in Google Alerts, you most probably won't get crappy sites through it. I hope it helps.
Thanks for the explanation! I will try this method too :)
Excellent tip Himanshu. It continues to amaze me how much I still have to learn about linking (and all SEO for that matter). The learning process seems never-ending!
Wow, great post! 16 problems solved in just one post! I was hooked on GSiteCrawler, although I'm on a Mac so couldn't give it a go - can't wait to grab a Windows machine! I liked SEO-Browser too - for quick checks I tend to just turn off CSS & JS, but having a tool like that will definitely be useful. Thanks again!
Never have used PostRank Analytics but it looks like I have been missing out on a powerful tool.
Love this type of post, thanks!
I just found a useful Firefox extension for searching different Google servers to view results from another country or region. It's called Google Global:
https://addons.mozilla.org/en-US/firefox/addon/5977
I'm an American living in Russia and involved in search marketing back home, so I need to see US search results for my clients' sites. The results are quite different among Google's servers. For example, one of the sites I'm promoting appears on the first page for a keyphrase on most servers, but recently has dropped to the third page only on US Google servers. Frustrating, because the site is marketed to the US and Canadian markets...
While nowhere near the 75% you mentioned above, I believe that tools are like the web in that they are always changing. Some of them either fail to match pace with our needs or quickly become ousted by better tools. It's nice to have an authoritative source come out and match tools to objectives. Thanks!
Once again, SEOmoz provides some fantastic information. Thanks for a great list of tools. Now if you could only provide a way for me to clone myself so I can do everything I want to do, while doing what I need to do.
Thanks SEOmoz!
Great info here. There were 4 or 5 tools in here I've never heard of and will be checking out. The Virante tools look awesome! Thanks again for making all of our jobs easier!
- Evan
Hey, thanks for saying nice things about Virante :) If you ever have any ideas for a tool, let me know!
It took me about an hour to read this post because with every new tool I was jumping into a new window to discover what I had been missing! Honestly, if you are into SEO, this is probably the greatest source of information available to you... it's so great it has inspired this, my first comment :)
I have to give a big shout out to Melanie over at PostRank for rapidly replying to a question I had and getting my subscriptions sorted out.
Great post Rand! Have you ever (or anyone here) used a tool called "Bad Neighbourhood Link Checker" https://www.bad-neighborhood.com/text-link-tool.htm Any thoughts on it?
Tony
Hey Tony,
I've seen this tool previously. Honestly, while I think it *can* be useful to some of us, I don't like it.
I mean, the calculations it uses to determine a so-called "bad neighborhood" seem to be a mom-and-pop algorithm. For example, it told me that Facebook was potentially a bad neighbour, since it has a high occurrence of backlinks from blogs... Well, hell, yeah!
Imho, it's better to concentrate on common sense when taking a peek at our links using OSE, for example, than to rely on this tool.
Thanks ptech....I wondered about their algorithm.....I did get some decent results about certain sites but there were obvious ones like you mentioned that didn't make any sense being on that list.
Tony
Great post Rand - a couple of these tools will be very useful in future. I really liked the usability tools and Virante's PageRank Recovery Tool.
Just a couple of comments about two of the tools, as I am not sure these should have been listed.
1) The sitemap generators: whilst they are fine if all the pages on a site are static, the two tools listed are not very useful if there are a lot of dynamic pages on the site. Also, I have found that many people use these without proper background knowledge and set priority and change frequency as high/often as possible because they think this will get them indexed quicker.
2) The Wayback Machine: I used to love this tool, but it has not been updated since mid-2008, and whilst it's still interesting data, it is now two years out of date, which for many of my projects is too much.
Nic
Thanks for the positive remarks on our tools :) If you have any improvements you would like to see, let me know! We are always trying to make our free tools better and better!
wow...
Rand, you have outdone yourself in my eyes, what a fantastic list!
I am thinking I need to pressure the boss into paying for Pro just because you give so much for free!
Great start to a Wednesday...
Any plans to remove Alexa rank from the Trifecta reports then? :)
It's as if you're making an SEO Bible and these are the commandments..
Holy cowfunk batman! How had I never heard of the link atrophy tool? Most excellent find- thanks Rand!
amazing post!
I especially love the Backtweets tool (I didn't know how to check that before).
One thing that would be great to add: in point 5 there's a part that I think is supposed to link to an older blog post about why you shouldn't block pages with robots.txt, but it doesn't link.
If i remembered the post I'd put the link here :)
Ditto - I'd like to see more on that. One eCommerce site I manage doesn't give us 100% access to all source code, and there are a few places I have had to use directives via robots.txt vs. being able to utilize nofollow for certain links to keep the spiders out.
The "good" spiders are behaving as requested - but I'd like to learn more about the downside of doing it this way (especially if you cannot implement nofollows in every case).
Cheers!
Silly me, here it is: https://www.seomoz.org/blog/headsmacking-tip-13-dont-accidentally-block-link-juice-with-robotstxt
And the best practice link is here (basically the same content, but this will update in the future): https://www.seomoz.org/knowledge/robotstxt
Once again Rand, over-delivering... you had 16 posts here but combined them into one!!
Great stuff, and always nice to get pointed at 1 or 2 new ones.
#11, the Wayback Machine, unfortunately seems to have become more hit or miss. There used to be around a 6-month delay, but I often find the lack of data even greater now. However, it is still a great tool, if only for this reason:
Limiting the view to just the site itself and mostly architectural issues, I think if I could only have 2 tools, they would still be Xenu and the Web Developer Toolbar.
Now if only I could block off the rest of the week to play with all of these!!
That's perfect for me, thanks Rand!
Thank you for the information you provide.
Awesome information. I have used some of these tools, but there are a few others, like the Twitter one, which I have failed to use. Funny thing is I use Twitter a lot to help drive traffic to all of my sites. Thank you for the information; I will make sure to bookmark this list and use it as a reference and go-to list... ;)
Rand this is such an extensive list, it's fantastic. Thank you for taking the time to compile the list. My SEO bookmarks folder is getting bigger and bigger. I don't even want to look at my Firefox bookmarks.
Super !!!
Thanks Rand these are some great tools!
I wanted to add that, for me, the online XML sitemap crawlers sometimes time out and do not get all the pages, but I have had very good luck with the free version of the WebCEO program. You can crawl a site and manually add pages or remove duplicate pages, plus you can set your priorities. It has helped me out.
I'm so overwhelmed with information at the moment! Some of this I knew about and some of it I didn't. It is going to take me forever to digest it all and put it to work!
Thank you for doing the work for us! :)
Respected Sir, I read your blog; it is marvellous and also very useful for me, but I have one question: tell me something about SEO tools. Why are they necessary? What are their main uses? Thanks & Regards
Great compilation of SEO tools, thank you Rand. I love the developers who made these tools.
Adding all of these to my routine, I am sure they will all help
Jennita, this is why I love "summary time." I am working on a few big projects with hundreds of URLs, sub-directories, and sub-domains, and the tip on sitemap generators for big websites just saved my rear. I love this stuff - THANKS
Sly
Another problem is...
How do you compare two different sites by the keywords they use?
For that you can use https://media4x.com/keyword-spy/
Great post, I love your blog.
Great stuff Rand. Although after your article featured some of these tools, the sites started charging for the "free" service! I'd never heard of the Google keyword Wonder Wheel and will give it a try. One tiny thing though... can you consider using the target="_blank" attribute for your linked tools, resources, etc.? I don't want to be taken off moz.com's site! Many thanks!
Wow! This is very useful, man. I just want to share that I heard a company named Results Driven Marketing does free negative keyword audits; I would recommend talking to them at 888-648-5526.
This is my web URL - https://webdesignkent.co - and my targeted keywords are "web design kent" and "web design sittingbourne". I did on-page and off-page SEO for this site, but when I search on google.co.uk my URL is not present in the top 300 positions. Another matter: when I search for "web design sittingbourne" on google.co.uk, I find my other site, www.dancinggemssittingbourne.co.uk/, in position 58. My question is why it appears on google.co.uk when I search for "web design sittingbourne". Please, can someone suggest what's going on with my sites, or what my next step should be?
This is a fantastic list! Thank you SEOMoz! I can now go back and reinvent the wheel!
Together with part 1, this is a kind of pro alternative to your new SEO Beginner's Guide!
Thanks from Germany!
Thank you very much.
I am researching SEO technology.
tham tu
Feedback Army is a solid tool. Always remember that you can reject junk comments. I'm going to test out the link atrophy tool. That is a very interesting statistic that 75% of the web disappears in 6 months. Is 2012 happening on the web already? ;)
Great list; thanks very much Rand. Even for the tools I've already discovered, a list like this is valuable as an added endorsement. A lot of resources look great at first glance, but prove to have some bugs or deficiencies that make them less useful once you really start to use them a lot. Seeing the ones that others have found to be great can give extra confidence that it's a good choice and worth investing the time to get to know it well.
+ Meta Search (meta.cn)
Meta.cn's rank check feature shows the top 30 search results on Baidu and google.cn. On the right side of each search result, it shows the ranking of that website in other major Chinese search engines, such as yahoo.cn, soso.com and bing.cn. This feature is just like SEOBook's rank checker. As far as I know, this is the first Chinese rank checker tool.
Thanks for sharing! Nice list!
I am overwhelmed! Keep up the great work!
Great post Rand, time to get started on all this.
I always use the Yellowpipe Lynx Viewer to pretend I'm a bot.
Not that I go around like Michael Winslow or anything.
I feel like I'm missing out on a non-visual method of determining the mark-up on headers etc with all these tools though. Bots must care about the semantics on some of the content, no?
Awesome list! It's great to see some of these that you know and get a nice reminder of why you should be using them, and there are a couple on here that I have never used before. *thumbs up*
We made a simple site for testing usability by a/b testing your images, ad copy, page layouts. 50 responses for $5 at PickFu.com.
This is such a great post - very valuable.
What's cool about Quantcast is that it shows stats about whether visitors are coming to your site from home or business.
Great post Rand!
I'm slightly new to the SEO game and have submitted a simple .txt sitemap to Google and Bing. It is of considerable interest to me to have my site better indexed on Bing. Oddly, they have many of our deep, more obscure pages indexed, but not our homepage or more popular pages. To what degree will creating an XML sitemap help me out with indexing?
Awesome tools..I will use all the free ones :)
I suggest you invest some money in the ones that also have a "premium" version, if you like them a lot and find them useful for your job.
When it comes to tools, no investment is a bad investment.
Nice roundup Rand, but there is one tool missing that I would like to see an in-depth review of.
80legs is somewhat of an advanced user's playground that pretty much shuns the casual SEO, but from the limited testing I have done, I've found it the absolute best way of extracting a full linking profile for websites and for niches in certain verticals.
I'm sure the community here would love to see a roundup ;-)
Hey MOGmartin. I vote that you do an in depth YOUmoz on 80legs. :)
I'm with goodnewscowboy..
We've deserved a YOUmoz post from you since last October, when you wrote a great post about a great WordPress plugin (I can say that because I use it).
I linked your post here, as I think it can fit perfectly in this thread.
Great stuff Rand, love most of these tools... bookmarked them and will definitely be using most of them.
Now I just have to find the money from somewhere to get Pro membership :)
Thanks
Awesome article.
I will sign up for PRO; these tools are great.
Thanks.
Wow... this post is going to be an example of CRO through quality content!
LOL. You're absolutely right. Judging from this post (as well as many previous posts here), I sense that it's a definite part of their strategy.
Very useful, thanks Rand! Favourited it!
I love the info you give out for free. I use at least one or two every time you post these kinds of help articles. Keep em coming!
These tools are great. I know most of them, but a couple of them, I've never seen. Thanks Rand.
These are great tips, Rand. Hope you can share more of your secret tools. :D
WOW! The sheer amount of information contained in this post is staggering.
Nice post randfish. After I finish reading and "applying" these 30 points, I am going to find your part one and read through it too.
I am particularly paying close attention to your point number two. Many thanks for the tools you recommended. These are huge timesavers.
Mark :)
Outstanding. What a valuable post!! Even the comments provide an incredible amount of valuable information. This one gets a bookmark!
Love this post Rand, very detailed; it will really help everyone from newbies to advanced SEO professionals. Keep up the good work.
Thanks for the list. I am using some of these right now to see how they help.
Most excellent dude!
Some new tools to me here and I think I qualify as an old timer :-)
Quick note - on number 4, GWT allows you to fetch a page as googlebot which can be pretty cool. Doesn't render the page for you, just spits out HTML but is a useful check alongside the UA switcher type tools (of which the mozbar is one).
Thanks! It's very useful.
Great screenshots as always.
Whoo hoo Rand! I didn't wait for the weekend, and my entire night's gone down the tubes as I've been playing and experimenting with the tools I hadn't used before.
So I just now tried the new and improved Keyword Difficulty Tool. I have three words for all you awesome devs at SEOmoz. Totally stinkin' awesome!
I predict it's going to become one of the most popular moz tools. I know I'll be using it a ton more myself now with its improved algorithm.
Note to non-Pro members - As of Wednesday evening, (I'm assuming it's due to the abnormally high usage volume) it's been restricted to Pro members only. Don't shoot me, I'm only the messenger!
It's been this way for weeks unfortunately :( I guess it's just going to stay Pro; that's fine, but the message should be updated.
Rand, so much great information here. Thank you so much for doing such a comprehensive post and putting in the not-overly-exciting-thing of listing tools.
Only problem for me is I need to use eight or nine new tools and don't know where to start.
Most valuable post in a long time!
Thanks!
I love free tools :)
Thank you Rand, my bookmarks became bigger today.
Outstanding post. It's been a great help. I was desperately looking for some of these tools. Going to use these tools in all future work. Thanks Rand.