Every day, SEOs are challenged in their jobs to solve problems big and small - some are technically complex, others are merely time consuming, repetitive and tedious. At SEOmoz, we love to build, use and recommend tools to help solve these issues. Tools and automation aren't always the right answer, but for many of the challenges we face, they're a welcome ally in the battle for effectiveness and efficiency.

In this first of a two-part series on the subject, I'll be covering tools on SEOmoz. My next segment will focus on tools from across the web.

#1: View Source Sucks

We've all had the experience of loading a web page, viewing the source code and sorting through it to determine whether the H1 tag was implemented properly, whether the <head> contains a rel="canonical" tag or, worst of all, counting internal/external links manually. These are essential elements of the SEO process, but they're a "Royal Pain In The Butt" (RPITB from here on out).
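For context, here's roughly what that manual check looks like if you script it yourself - a minimal sketch using the requests and BeautifulSoup libraries (my own picks for illustration, not part of any SEOmoz tool):

```python
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

def quick_onpage_check(url):
    """Pull the basics we usually dig out of view-source by hand."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    site = urlparse(url).netloc

    h1s = [h.get_text(strip=True) for h in soup.find_all("h1")]
    canonical = soup.find("link", rel="canonical")

    internal, external = 0, 0
    for a in soup.find_all("a", href=True):
        host = urlparse(a["href"]).netloc
        if not host or host == site:
            internal += 1
        else:
            external += 1

    return {
        "h1_tags": h1s,
        "canonical": canonical["href"] if canonical else None,
        "internal_links": internal,
        "external_links": external,
    }

print(quick_onpage_check("http://www.oyster.com/"))
```

That works, but it's still a script to write, maintain and run - which is exactly why a one-click overlay is so appealing.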

Solution: Analyze Page in the mozBar

Thankfully, the mozBar has this spiffy "Analyze Page" button that opens a visual overlay with critical stats like meta data, link counts, rel="canonical," Hx tags, and even counts of characters in content areas. I've used this personally in a lot of live-review and client meeting scenarios and people are consistently impressed (and it makes us look like Pros).

mozbar page analysis overlay
See tags and character counts without having to view source (example from Oyster.com Hotel Reviews)

mozbar link counts in the analysis view
Link counts and page attributes are visible at the bottom of the "analyze page" overlay

Getting the data fast is awesome - looking professional and raising eyebrows while we do it is an added bonus. I love tools that make SEOs look good - honestly, I'm focused on making more of SEOmoz's products in this vein. I wish we'd built more of our tools historically with a mindset of ease-of-use and simple, obvious value (I sometimes worry that we've gone overly advanced in past tools that I've designed - hopefully Adam & the product team will keep us better focused).

#2: Determining a PageRank Penalty

Sometimes it's hard to know whether a drop in PageRank (or the PageRank score on a page you haven't visited before) is due to natural factors or modification by Google's webspam team. Whether it's a review of a potential client's website, a look at a potential link partner or an analysis of your own site, knowing what's happened with the PageRank score is an advanced, but sometimes essential piece of the SEO process.

Solution: Historical PageRank + PageRank vs. mozRank

Thankfully, there's a very good system for solving this problem (or at least getting closer to the answer). First up is a free tool we've had for a long time - the Historical PageRank Checker:

Historical PageRank Checker

When PageRank has been lowered more than one point, particularly in a timeframe that doesn't correlate with a standard PR update, you can feel relatively confident that some sort of PR penalty was incurred.

Next are the metrics mozRank & mozTrust from Linkscape. Since mozRank in particular is both highly correlated with PageRank (on average ~0.55 off from toolbar PR) and calculated independently, you can use the comparison between these metrics to help identify disparities. When PR is significantly lower than mozRank, particularly on the homepage of a website, it's possible a PR penalty exists (though it's also possible that PR simply hasn't updated - Linkscape recalculates metrics every month, while Google updates PageRank on a fairly random schedule every 3-9 months).

The metrics from Linkscape aren't perfect, nor are they a sure identifier, but they do provide an alternate source for comparison and contrast. You can get mozRank via Linkscape itself, or use the free API if you'd like to employ it on tools or in a more scalable fashion.
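To make the heuristic concrete, here's a minimal sketch of the comparison logic. It assumes you've already looked up toolbar PageRank and mozRank for the URL (by hand or via the API), and the threshold is purely illustrative - not an official rule:

```python
def possible_pr_penalty(toolbar_pr, mozrank, tolerance=1.5):
    """Flag a potential PageRank penalty when toolbar PR sits well below
    mozRank. The ~1.5 point tolerance is an illustrative cushion on top of
    the average ~0.55 gap between the two metrics - tune it to your data."""
    gap = mozrank - toolbar_pr
    if gap > tolerance:
        return True, "PR is %.1f points below mozRank - check for a penalty (or a stale PR update)" % gap
    return False, "No obvious disparity between PR and mozRank"

# Example: a homepage showing toolbar PR 3 but mozRank 6.2
flagged, note = possible_pr_penalty(toolbar_pr=3, mozrank=6.2)
print(flagged, note)
```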

#3: Valuing a Potential Link

It's hard to compare the value of links from potential pages, and yet this is an essential task in the SEO world. Managers need to know whether link acquisition is going well or poorly. Link builders need to be able to judge the quality of the sites and pages they're targeting. SEO consultants and analysts need to determine where good links are coming from, where competitors have earned great links and what links might be spammy/low quality.

Historically, we've had a very limited set of metrics - things like link counts from Yahoo! Site Explorer and the PageRank of a site's homepage. These have low correlation with rankings (we explored this on the blog in Ben's Ranking Models post) and data accuracy issues, too (PageRank's update cycles and lack of granularity - a single point of PageRank covers a huge amount of variance).

Solution: Linkscape Metrics

Linkscape has a lot of depth when it comes to metrics (sometimes too much, actually!). You can see data about numbers of links, linking root domains, scores around raw link popularity (mozRank) and trustworthiness (mozTrust). The metrics run on both a domain and an individual page, so you can get a sense of the importance of an individual URL and the domain it's on. You can also feel confident that the metrics are built with SEOs specifically in mind - the folks behind Linkscape are uniquely focused on making them valuable, predictive and accurate.

Linkscape Metrics via the SEOmoz Toolbar

One of my favorite places to get the metrics quickly is via the mozBar, which shows them at the top of the analyze page overlay. For even more depth, you can use the data detail tab (e.g. for Raveable.com) on the Linkscape basic report - and for large amounts of data, you can view (or export to CSV) the top 3,000 links to a page or site via the advanced reports.

#4: Watching Rankings Over Time

Watching rankings is a pain and manual systems aren't scalable or a good use of anyone's time. It's also tough (perhaps even an RPITB) to track rankings across multiple engines and TLDs (.co.uk, .com.au, .co.nz, etc.) and keep track of the data in a format that can be exported intelligently.

Solution: Rank Tracker

Thankfully, there's the Rank Tracker, a serious upgrade from our previous Rank Checker tool. You can watch rankings across multiple engines and geographies, and the interface is simple + easy to use.

Rank Tracker Selection Interface

Choosing terms to track is straightforward, and the system automatically pings every week and stores the historical data, which you can download in CSV. Lately, I've been impressed with accuracy - despite the personalization and geographic modification, the team's been making great strides to ensure that the rankings are a good estimate of what a "normal" (non-logged in, geographically agnostic) user would receive.
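If you'd rather chart the exported history yourself, a quick script does the job. Note that the CSV column names below (date, keyword, rank) are hypothetical - swap in whatever headers your export actually contains:

```python
import csv
from collections import defaultdict

import matplotlib.pyplot as plt

# Hypothetical export format: date,keyword,rank (ISO dates assumed)
history = defaultdict(list)
with open("rank_tracker_export.csv") as f:
    for row in csv.DictReader(f):
        history[row["keyword"]].append((row["date"], int(row["rank"])))

for keyword, points in history.items():
    points.sort()  # ISO-formatted dates sort chronologically as strings
    dates, ranks = zip(*points)
    plt.plot(dates, ranks, marker="o", label=keyword)

plt.gca().invert_yaxis()  # rank #1 belongs at the top of the chart
plt.ylabel("Ranking position")
plt.legend()
plt.show()
```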

Rank Tracker Results

BTW - I've also heard good things about Advanced Web Ranking (and always like to recommend good competitors - definitely more of that coming in the next post in this series).

#5: Quickly Comparing Two Pages' Metrics

Answering the question "why does that page outrank me?" has plagued SEOs since time immemorial. There are so many things that go into the ranking equation that it can be tough to determine what's critical to the process vs. unimportant. It's particularly challenging to understand the difference in link metrics - is one on a more important domain? Does one have more links, but they're mostly nofollowed?

Solution: Visualization & Comparison Tool

The Linkscape Visualization Tool is a great way to "see" into the rankings when comparing two pages.

Linkscape Visualization & Comparison Tool

The visual shapes represent the degree to which the page is meeting that metric's potential, and somewhat amazingly, we see that the bigger area nearly always outranks the smaller one. It's a great way to show clients, prospects and managers the gap between your site and a competitor's and explain how far you have to go and in what direction. The tool doesn't show all of the metrics in Linkscape, but it's a good representative set and in future iterations, we plan to have more refinement and options available.
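For the curious, the "bigger area wins" observation is easy to check on your own numbers: if each metric is normalized to a 0-1 share of its potential and plotted at equal angles, the polygon's area falls out of a simple formula. This is my own back-of-the-envelope sketch, not the tool's internal code:

```python
import math

def radar_area(scores):
    """Area of the radar-chart polygon for metrics normalized to 0-1,
    plotted at equal angles around the center."""
    k = len(scores)
    step = 2 * math.pi / k
    return 0.5 * math.sin(step) * sum(
        scores[i] * scores[(i + 1) % k] for i in range(k)
    )

# Hypothetical normalized metric profiles for two competing pages
page_a = [0.8, 0.7, 0.9, 0.6, 0.75]
page_b = [0.5, 0.6, 0.4, 0.7, 0.55]
print(radar_area(page_a), radar_area(page_b))  # the bigger area is usually the page that outranks
```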

#6: Finding Competitors' Links

Who's linking to my competitors but not linking to me? It seems like a simple, straightforward question, but, as usual, the devil's in the details. Most of the existing toolsets on the web (I mentioned several in this post) use the Yahoo! link query - linkdomain:site1.com linkdomain:site2.com -linkdomain:mysite.com (for example, see this search for pages linking to hotels.com and kayak.com but not Oyster.com). The problem is you have no good way to determine whether the list returned includes nofollow links, whether you're getting the most valuable, important pages/sites listed first, or whether the list filters out some potentially great stuff.
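For reference, assembling those queries by hand gets tedious with more than a couple of competitors; a tiny helper like this one (my own sketch, nothing official) builds the linkdomain: string for you:

```python
from urllib.parse import quote_plus

def link_intersect_query(competitors, my_site):
    """Build the Yahoo! query for pages linking to every competitor
    but not to your own site."""
    parts = ["linkdomain:%s" % c for c in competitors]
    parts.append("-linkdomain:%s" % my_site)
    query = " ".join(parts)
    return "http://search.yahoo.com/search?p=" + quote_plus(query)

print(link_intersect_query(["hotels.com", "kayak.com"], "oyster.com"))
# builds: linkdomain:hotels.com linkdomain:kayak.com -linkdomain:oyster.com
```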

Solution: Competitive Link Research Tool

Enter the Link Intersect Tool in Labs. Just enter your site plus at least two competitors:

Competitive Link Finder Tool

The tool results will show you a list of domains that contain links to pages from your competitors but don't point to you:

Link Intersect Tool Results

At SEOmoz, we've been calling this "cheating" for link building. The results are so useful and instantly actionable (and the data's quite excellent, particularly when sorted in DmR order) that it just doesn't make sense not to use it.
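Conceptually, the tool is doing set arithmetic over linking domains - here's a toy sketch with made-up data (not Linkscape's actual implementation) to show the logic:

```python
# Linking root domains for each site - made-up data for illustration
links_to_competitor_a = {"nytimes.com", "tripadvisor.com", "lonelyplanet.com"}
links_to_competitor_b = {"tripadvisor.com", "lonelyplanet.com", "frommers.com"}
links_to_me = {"lonelyplanet.com"}

# Domains linking to both competitors but not to me
opportunities = (links_to_competitor_a & links_to_competitor_b) - links_to_me
print(opportunities)  # {'tripadvisor.com'}
```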

#7: Tracking Links & Mentions in the "Fresh Web"

Watching what's happening around a blog post, website or brand name is a challenge. Lots of blog search engines and some of the emerging real-time search engines can give you data points here, and some are actually quite good for their niche (I'll definitely cover a few in Part 2), but sometimes, you just want a graph of what's been happening in the blogosphere/Twittersphere with a list of URLs where the action's taking place.

Solution: Blogscape

We don't talk a tremendous amount about Blogscape, but it's getting to be a very good tool (and more upgrades are on the way). The dataset currently comprises 10 million feed sources that we found significant links to via Linkscape. These include news feeds, blogs and, yes, Twitter accounts, too. The inclusion threshold was based on the number of unique linking root domains, so while this source doesn't contain everything, it's also not bogged down by a ton of noise, helping to make the signal rise to the top.

Blogscape Graph

Don't miss the query operators page, which shows the full range of search parameters and advanced data you can pull from the index.

#8: Fast Access to Links & Anchor Text

Sometimes, you just need to see a list of links fast. Yahoo! Site Explorer has historically been the "go-to" source for this, but over time, not being able to filter nofollow'd links, see metrics or have any idea about the sort order used has made it a frustrating tool.

Solution: Backlink Analysis Tool

Labs' Backlink Analysis Tool is terrifically useful for this scenario. Not only do you get a list of links ordered by relative importance in just a few seconds (slightly longer if the URL/domain has many thousands of links), you also retrieve an ordered list of the anchor text distribution pointing to the page, subdomain or root domain.

Backlink Analysis Tool

It's not pretty, but it is simple to use.

Anchor Text Breakdown

The fast anchor text breakdown is terrific for making short work of comparing multiple sites' link profiles.

Link List

The link list itself is ordered by mozRank passed, a metric that helps show where the most "juice" is originating (though not necessarily the most important domains/pages). You can get more advanced data in full Linkscape reports, but for quick, dirty link lists and anchor text at the touch of a button, this is hard to beat.
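If you're working from an exported link list instead, the anchor text breakdown is easy to reproduce. A quick sketch, assuming a CSV with hypothetical anchor_text and mozrank_passed columns (rename to match your real export):

```python
import csv
from collections import Counter, defaultdict

anchor_counts = Counter()
juice_by_anchor = defaultdict(float)

# Hypothetical export columns - adjust to your actual CSV headers
with open("backlinks_export.csv") as f:
    for row in csv.DictReader(f):
        anchor = row["anchor_text"].strip().lower() or "(empty anchor)"
        anchor_counts[anchor] += 1
        juice_by_anchor[anchor] += float(row["mozrank_passed"])

for anchor, count in anchor_counts.most_common(20):
    print("%-40s %5d links  %.2f mozRank passed" % (anchor, count, juice_by_anchor[anchor]))
```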

#9: Quickly Comparing Metrics from Numerous Sources

There are times when client reports or c-level execs need a long list of metrics from a variety of sources - Compete, Alexa, Google PageRank, Yahoo! Link Counts, Google News mentions, etc. Going to each of the individual tools, running reports and gathering the metrics can be an especially tedious RPITB, particularly if you need to gather this data for multiple domains/pages.

Solution: Trifecta Tool

Trifecta isn't always perfect - it's pulling data from a lot of sources, some of which don't have great uptime and can be squirrelly about the ways they return information. However, it can be a much-needed ally in the fight against the laborious process of manually collecting the numbers.
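If you ever roll your own aggregation of this kind, the key design choice is to treat every source as optional so one flaky API can't sink the whole report. A sketch of the pattern - the fetcher functions here are placeholders, not real APIs:

```python
def fetch_compete_rank(domain):
    raise NotImplementedError("placeholder for a real Compete lookup")

def fetch_alexa_rank(domain):
    raise NotImplementedError("placeholder for a real Alexa lookup")

SOURCES = {
    "compete_rank": fetch_compete_rank,
    "alexa_rank": fetch_alexa_rank,
}

def gather_metrics(domain):
    """Collect whatever metrics we can; a flaky source shouldn't
    take the whole report down with it."""
    report = {}
    for name, fetch in SOURCES.items():
        try:
            report[name] = fetch(domain)
        except Exception as err:
            report[name] = "unavailable (%s)" % err
    return report

print(gather_metrics("seomoz.org"))
```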

Trifecta Tool Results

The comparison feature is also a neat way to see and collect data from multiple sources at once:

Trifecta Comparison Report

#10: Finding Competitors' Most Successful Linkbait

How is it that my competitor earned all their links? What content did they put out there that was so successful? How can I figure out their strategy? Unless you're willing to do a lot of surfing, this is a tough problem to solve.

Solution: Top Pages Tool

Thankfully, through Linkscape, we can collect data about which pages on a given subdomain or root domain have earned the most links. To give greater accuracy in the data, we use # of linking root domains. It's our sense that seeing pages that have earned large numbers of links from different sites will give the best idea of where and how links are flowing to a site and how they've been acquired. The Top Pages Tool in Labs shows this data:

Top Pages on SEOmoz

Now that you can see them, you can go visit those pages, learn how they got the links, and reverse engineer those crafty strategic moves (also great for ID'ing spam that's been created on your site).

#11: Identifying Pages that Can Flow Link Juice Internally

How do I know which pages on my site have the most link juice to share? It's a common question, as the right internal links can make the difference with both competitive SERPs and indexing problems.

Solution: Top Pages Tool

Once again, it's Top Pages to the rescue. Not only can we see which pages have earned link juice, but we can also identify potential problems (302s and blocking w/ robots.txt being two of the big ones):

Top Pages on Netflix

I'm guessing someone at Netflix should really look into this... FYI - the "0" usually doesn't indicate a problem; it's just a marker (we'll work on fixing that up as I know some of you have asked about it).
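Auditing your own top pages for those two issues is scriptable, too. Here's a minimal sketch using requests and Python's standard robots.txt parser - the URL list is hypothetical, and you'd normally feed in the pages exported from the tool:

```python
from urllib import robotparser
from urllib.parse import urlparse

import requests

def audit_top_pages(urls, user_agent="*"):
    parsers = {}
    for url in urls:
        root = "%s://%s" % (urlparse(url).scheme, urlparse(url).netloc)
        if root not in parsers:
            # Fetch and cache robots.txt once per host
            rp = robotparser.RobotFileParser(root + "/robots.txt")
            rp.read()
            parsers[root] = rp

        resp = requests.get(url, allow_redirects=False, timeout=10)
        issues = []
        if resp.status_code == 302:
            issues.append("302 redirect (passes no lasting value)")
        if not parsers[root].can_fetch(user_agent, url):
            issues.append("blocked by robots.txt")
        print(url, issues or "OK")

# Hypothetical list of link-earning URLs pulled from the Top Pages tool
audit_top_pages(["http://www.netflix.com/", "http://www.netflix.com/Top100"])
```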

#12: Get Social Media Monitoring Data

While Blogscape is a good search tool on our fresh web index, there's a lot of demand for a more functional monitoring tool. Robust solutions from companies like Visible Technologies here in Seattle are quite pricey (worthwhile if you're a big brand making a serious investment, but not geared to SMBs or most consultants).

Solution: Social Media Monitoring Prototype

The Social Media Monitoring Prototype is one of our latest releases and while it's still very much in early alpha, the data is quite compelling and usable.

Social Media Monitoring Prototype in Labs

The counts (links, mentions and tweets) can be used to help determine the value of blogosphere, Twitter and linkbait campaigns over time. Just be aware that because of how Blogscape's index and retrieval of sources functions, data from the last 48 hours is less stable and complete than older material. It's a good tool to use after the fact, not necessarily in the heat of the campaign.

#13: Streamline Common Link Search Queries

If you've ever been tasked with manual link acquisition and told to use "all the common link queries" to find potential sources, you know how incredibly frustrating the process can be. No one likes searching for the same combinations of phrases dozens of times just to retrieve the one or two credible sources that turn up in the SERPs.

Solution: Labs' Link Acquisition Assistant

Danny released a spiffy tool earlier this year - the Link Acquisition Assistant - that's a big time-saver on this front. Enter a few pieces of data about your site and the link campaign you're running and it will spit back links to tons of relevant search queries and link lists. While it doesn't automate everything, it can be a huge boost in exposing ways to find and earn links you might not have considered.
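The general idea - crossing your keywords with a set of common link-query patterns - is simple to sketch yourself. The templates below are just a few well-known examples, not the tool's actual list:

```python
from urllib.parse import quote_plus

# A handful of common link-query patterns - nowhere near the tool's full set
QUERY_TEMPLATES = [
    '"{kw}" intitle:"resources"',
    '"{kw}" intitle:"links"',
    '"{kw}" "add your site"',
    '"{kw}" "suggest a link"',
    '"{kw}" inurl:links',
]

def link_queries(keywords, engine="http://www.google.com/search?q="):
    for kw in keywords:
        for template in QUERY_TEMPLATES:
            query = template.format(kw=kw)
            yield query, engine + quote_plus(query)

for query, url in link_queries(["hotel reviews"]):
    print(query, "->", url)
```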

Link Acquisition Assistant

#14: Determine a Keyword's Relative SEO Competitiveness

How hard would it be to rank for a particular keyword? Which keyword would be easier to rank for today? These questions are tough to answer unless you're willing to dig deep into data on the top results - and that's horribly time consuming (and an RPITB).

Solution: Keyword Difficulty Tool

The Keyword Difficulty tool provides a quick view into metrics that have historically helped SEOs determine potential competitiveness, as well as a percentage score that gives a sense of relative competition level.

Keyword Difficulty Tool Results

Like Trifecta, the data isn't always perfect, and a new version of this tool is actually on its way (employing lots of the ranking models stuff we've been building with Linkscape to help actually analyze a page/site of your choice and tell you if you've "got a shot"). However, it's still quite a good tool for getting a robust dataset automatically.
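For a sense of how a score like this can work, here's an illustrative sketch that averages a couple of link metrics across the top results and scales the result to a percentage. To be clear, this is not the Keyword Difficulty formula - just the general shape of the approach, with made-up metrics:

```python
import math

def difficulty_score(top_results, max_domain_links=100000):
    """Illustrative only: average a couple of link metrics across the
    top-ranking results and scale to 0-100. Not the real formula."""
    scores = []
    for page in top_results:
        # Log-scale linking root domains so one huge site doesn't dominate
        link_part = math.log10(page["linking_root_domains"] + 1) / math.log10(max_domain_links)
        mozrank_part = page["mozrank"] / 10.0
        scores.append(min(1.0, 0.6 * link_part + 0.4 * mozrank_part))
    return round(100 * sum(scores) / len(scores))

# Hypothetical metrics for the current top three results
serp = [
    {"linking_root_domains": 5400, "mozrank": 6.1},
    {"linking_root_domains": 900, "mozrank": 5.2},
    {"linking_root_domains": 12000, "mozrank": 6.8},
]
print(difficulty_score(serp), "% difficulty (illustrative)")
```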

#15: Getting On-Page Optimization Right

Have I targeted my keywords in all the right tags? Did I misplace or mis-code anything? Am I as "on-page optimized" as I can/should be? Sure, you can dig through the source code manually and check, but that's a (last time, I promise) RPITB.

Solution: Term Target

With the Term Target tool, just plug in the keyword you're targeting and the page you want to rank, and it sends back an analysis of the keyword usage, along with recommendations for where and how to employ the query term.

Term Target SEO Tool

There's nothing particularly complex here (though, eventually, we'll be switching to recommendations based on our correlation and ranking models data), but the usefulness is easy to see. I know of members who simply run the report on lists of pages, send the results to clients and get the changes implemented.
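The underlying check is the same kind of tag inspection from #1, just keyword-aware. A rough sketch with requests and BeautifulSoup (again, my own additions, not the Term Target code):

```python
import requests
from bs4 import BeautifulSoup

def term_target_check(url, keyword):
    """Report where a keyword does (and doesn't) appear in the usual
    on-page locations. Rough sketch, not the Term Target tool."""
    kw = keyword.lower()
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    title = soup.title.get_text() if soup.title else ""
    h1 = soup.find("h1").get_text() if soup.find("h1") else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = meta["content"] if meta and meta.has_attr("content") else ""

    return {
        "keyword in URL": kw.replace(" ", "-") in url.lower(),
        "keyword in title tag": kw in title.lower(),
        "keyword in H1": kw in h1.lower(),
        "keyword in meta description": kw in description.lower(),
        "keyword in body text": kw in soup.get_text().lower(),
    }

for check, passed in term_target_check("http://www.seomoz.org/tools", "seo tools").items():
    print(check, "->", "yes" if passed else "no")
```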


Next week, I'll look to cover many of the hairy SEO quandaries that tools outside SEOmoz can help solve. If you've got other ideas, tools or requests around any of these, please do leave them in the comments!