Link auditing is the part of my job that I love the most. I have audited a LOT of links over the last few years. While there are some programs out there that can be quite helpful to the avid link auditor, I still prefer to create a spreadsheet of my links in Excel and then to audit those links one-by-one from within Google Spreadsheets. Over the years I have learned a few tricks and formulas that have helped me in this process. In this article, I will share several of these with you.
Please know that while I am quite comfortable being labelled a link auditing expert, I am not an Excel wizard. I am betting that some of the things that I am doing could be improved upon if you're an advanced user. As such, if you have any suggestions or tips of your own I'd love to hear them in the comments section!
1. Extract the domain or subdomain from a URL
OK. You've downloaded links from as many sources as possible and now you want to manually visit and evaluate one link from every domain. But, holy moly, some of these domains can have THOUSANDS of links pointing to the site. So, let's break these down so that you are just seeing one link from each domain. The first step is to extract the domain or subdomain from each url.
I am going to show you examples from a Google spreadsheet as I find that these display nicer for demonstration purposes. However, if you've got a fairly large site, you'll find that the spreadsheets are easier to create in Excel. If you're confused about any of these steps, check out the animated gif at the end of each step to see the process in action.
Here is how you extract a domain or subdomain from a url:
- Create a new column to the left of your url column.
- Use this formula:
=LEFT(B1,FIND("/",B1,9)-1)
What this does is strip out everything from the first slash after the domain name onward. https://www.example.com/article.html will now become https://www.example.com and https://www.subdomain.example.com/article.html will now become https://www.subdomain.example.com. (If some of your URLs have no path after the domain at all, see the variant formula at the end of this step.)
- Copy our new column A and paste it right back where it was using the "paste as values" function. If you don't do this, you won't be able to use the Find and Replace feature.
- Use Find and Replace to replace each of the following with a blank (i.e. nothing):
http://
https://
www.
And BOOM! We are left with a column that contains just domain names and subdomain names. This animated gif shows each of the steps we just outlined:
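One caveat worth flagging: if a URL has no slash after the domain at all (for example, a bare https://www.example.com), the FIND in the formula above returns an error. A variant I sometimes use, which should work in both Excel and Google Sheets, wraps that part in IFERROR so the whole URL is kept when there is nothing to trim:
=LEFT(B1,IFERROR(FIND("/",B1,9)-1,LEN(B1)))
If no slash is found from the ninth character onward, IFERROR falls back to the full length of the cell, so nothing gets chopped off.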
2. Just show one link from each domain
The next step is to filter this list so that we are just seeing one link from each domain. If you are manually reviewing links, there's usually no point in reviewing every single link from every domain. I will throw in a word of caution here though. Sometimes a domain can have both a good link and a bad link pointing to you. Or in some cases, you may find that links from one page are followed and from another page on the same site they are nofollowed. You can miss some of these by just looking at one link from each domain. Personally, I have some checks built in to my process where I use Scrapebox and some internal tools that I have created to make sure that I'm not missing the odd link by just looking at one link from each domain. For most link audits, however, you are not going to miss very much by assessing one link from each domain.
Here's how we do it:
- Highlight our domains column and sort the column in alphabetical order.
- Create a column to the left of our domains, so that the domains are in column B.
- Use this formula:
=IF(B1=B2,"duplicate","unique")
- Copy that formula down the column.
- Use the filter function so that you are just seeing the duplicates.
- Delete those rows. Note: If you have tens of thousands of rows to delete, the spreadsheet may crash. A workaround here is to use "Clear Rows" instead of "Delete Rows" and then sort your domains column from A-Z once you are finished.
We've now got a list of one link from every domain linking to us.
Here's the gif that shows each of these steps:
You may wonder why I didn't use Excel's dedupe function to simply deduplicate these entries. I have found that it doesn't take much deduplication to crash Excel, which is why I do this step manually.
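Another option, if you would rather skip the sorting step entirely, is a small variation on the formula above (my own tweak, not part of the original steps). It counts how many times the current domain has already appeared, so only the first occurrence of each domain gets marked as unique:
=IF(COUNTIF($B$1:B1,B1)>1,"duplicate","unique")
Copy it down the column, filter for "duplicate" and clear those rows exactly as described above. On very large sheets this can be slower than the sorted comparison, since COUNTIF rescans a growing range on every row.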
3. Finding patterns FTW!
Sometimes when you are auditing links, you'll find that unnatural links have patterns. I LOVE when I see these, because sometimes I can quickly go through hundreds of links without having to check each one manually. Here is an example. Let's say that your website has a bunch of spammy directory links. As you're auditing you notice patterns such as one of these:
- All of these directory links come from a url that contains …/computers/internet/item40682/
- A whole bunch of spammy links that all come from a particular free subdomain like blogspot, wordpress, weebly, etc.
- A lot of links that all contain a particular keyword for anchor text (this is assuming you've included anchor text in your spreadsheet when making it.)
You can quickly find all of these links and mark them as "disavow" or "keep" by doing the following:
- Create a new column. In my example, I am going to create a new column in Column C and look for patterns in urls that are in Column B.
- Use this formula:
=FIND("/item40682",B1)
(You would replace "/item40682" with the phrase that you are looking for.)
- Copy this formula down the column.
- Filter your new column so that you are just seeing the rows that have a number in this column. If the phrase doesn't exist in that url, you'll see an error value rather than a number, and we can ignore those. (A variant formula that labels these rows for you is shown after this list.)
- Now you can mark these all as "disavow".
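If you would rather have the spreadsheet write "disavow" for you instead of filtering on raw numbers and errors, a variation of the same formula (again, my own preference rather than a required step) wraps FIND in ISNUMBER:
=IF(ISNUMBER(FIND("/item40682",B1)),"disavow","")
Rows where the pattern isn't found are simply left blank. Keep in mind that FIND is case-sensitive; if the pattern might appear with different capitalization, SEARCH can be used in its place.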
4. Check your disavow file
This next tip is one that you can use to check your disavow file across your list of domains that you want to audit. The goal here is to see which links you have disavowed so that you don't waste time reassessing them. This particular tip only works for checking links that you have disavowed on the domain level.
The first thing you'll want to do is download your current disavow file from Google. For some strange reason, Google gives you the disavow file in CSV format. I have never understood this because they want you to upload the file in .txt. Still, I guess this is what works best for Google. All of your entries will be in column A of the CSV:
What we are going to do now is add these to a new sheet on our current spreadsheet and use a VLOOKUP function to mark which of our domains we have disavowed.
Here are the steps:
- Create a new sheet on your current spreadsheet workbook.
- Copy and paste column A from your disavow spreadsheet onto this new sheet. Or, alternatively, use the import function to import the entire CSV onto this sheet.
- In B1, write "previously disavowed" and copy this down the entire column.
- Remove the "domain:" from each of the entries by doing a Find and Replace to replace domain: with a blank.
- Now go back to your link audit spreadsheet. If your domains are in column A and if you had, say, 1500 domains in your disavow file, your formula would look like this:
=VLOOKUP(A1,Sheet2!$A$1:$B$1500,2,FALSE)
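One optional tweak here: domains that are not in your disavow file will come back as #N/A with the plain VLOOKUP. I sometimes wrap it in IFERROR so the column stays clean, with disavowed domains showing "previously disavowed" and everything else left blank:
=IFERROR(VLOOKUP(A1,Sheet2!$A$1:$B$1500,2,FALSE),"")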
5. Make monthly or quarterly disavow work easier
That same formula described above is a great one to use if you are doing regular repeated link audits. In this case, your second sheet on your spreadsheet would contain domains that you have previously audited, and column B of this spreadsheet would say, "previously audited" rather than "previously disavowed".
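To make that concrete, if your previously audited domains lived on a sheet called, say, Audited (the sheet name, row count and "new domain" label below are just placeholders for illustration), the adapted formula might look like this:
=IFERROR(VLOOKUP(A1,Audited!$A$1:$B$3000,2,FALSE),"new domain")
Anything returning "previously audited" can be skipped, and anything returning "new domain" goes into the current month's or quarter's review pile.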
Your tips?
These are just a few of the formulas that you can use to help make link auditing work easier. But there are lots of other things you can do with Excel or Google Sheets to help speed up the process as well. If you have some tips to add, leave a comment below. Also, if you need clarification on any of these tips, I'm happy to answer questions in the comments section.
Hello Marie,
As usual, great information; love to have it. I wish to add my point here: I always do link audits manually. I don't prefer Google Spreadsheets, but use a Microsoft Excel sheet for gathering all the backlink data. We have seen many times that one domain, like a directory or a blog with comments, points a number of backlinks at you, so getting the unique domains into the sheet becomes very hard. I would like to share the simplest method; just have a look below -
Download all the backlinks, go to Data => Text to Columns => select Delimited and press Next => tick the Other box and enter the "/" symbol, then finish. You will now have https:, www.example.com (the domain) and the rest of the URL in separate columns. Select the domain column and hit Remove Duplicates. Finally, you will get all the unique domains, and the duplicates/repeats will be removed.
It makes you feel good when you are auditing more than 1 lakh (100,000) links and, after doing this process, 40,000 of them get filtered out.
Hope it helps others :)
This is exactly the type of comment I was looking for! That makes perfect sense Shubham and may be a little simpler and faster than my method.
Hello Marie,
Thanks for considering :)
Shubham great!
Thanks for sharing.
Hello Sotelor,
You're welcome. Just try this method for your link audit campaign and you will be happy for sure :)
Good Reply Shubham, Thanks
My favorite formula to extract just the domain from a list of links in column A is
=IF(ISNUMBER(FIND("www.",A2)),MID(A2,FIND("www.",A2)+4,IF(ISNUMBER(FIND("/",A2,9)),FIND("/",A2,9),LEN(A2)+1)-FIND("www.",A2)-4),MID(A2,FIND("//",A2)+2,IF(ISNUMBER(FIND("/",A2,9)),FIND("/",A2,9),LEN(A2)+1)-FIND("//",A2)-2))
While I normally use the Text-to-Columns method described elsewhere in the comments, I found when I had to have someone else do the work, it was easier to make it cut-and-paste easy for them.
Well that's a formula to make your head spin for sure! Thanks for the tip.
That's true Marie, time consuming too!
If you don't want to get down to a domain level using a formula, you can just use find and replace:
http://
https://
www.
/*
The /* will get rid of the trailing slash and everything after it. The * is a wildcard in Excel.
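As a quick made-up example, a cell containing https://www.example.com/blog/post.html would come out of those find-and-replace passes as just example.com: the protocol and www. passes strip the front of the URL, and the /* pass removes /blog/post.html.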
If you're really clever, you can just macro up this process and do the whole thing with one click ;-)
Brilliant. I am going to try this with my next spreadsheet. Thank you!
I was just about to post the same. This is the method I was using for years. Then I found a free online tool "url to domain . com" Copy and paste your URLs in there, select the options appropriate for whatever you're doing, copy and paste the results back into the spreadsheet.
Agreed, I have also used https://www.seoweather.com/trim-urls-to-root-domain-standardise-urls-prefixes/ a couple of times and it's pretty useful, a time saver when you are working with lists in TextPad / UltraEdit, etc.
Wow, that's a great tip. Short and time-saving.
Okay so my head imploded about a minute into the "how-to" here. So you know this is REALLY good info and WAY over my simpleton analytics mind. Thank the heavens I rely on YOU for link audit work Marie. This post proves why I trust you so much for that work. :-)
my simpleton analytics mind
Alan, you are an honest man to make a comment like that. :-)
All of those spreadsheet functions like "=LEFT(B1,FIND("/",B1,9)-1)" are the real trade secrets. I'll bet only about five people on the planet know how to do that stuff. But now that you have those functions you can probably step through parsing the data. But, it probably would pay to let her do the rest of the work because there is a genius mind doing the evaluations.
Thanks Egol. I agree with you that it's quite unlikely that Alan has a simpleton analytics mind. :)
:-) LOL EGOL!
Here's the reality: One of the keys to my success in business and life is to know what I know, and know who to go to when I don't know something. Another key is "work smart, not hard". I've come a long way over the years, and much of my success is due to those people I go to when something is beyond my then current capacity.
Although I have continually evolved and learned more as each year has passed, there are some things I find hurt my brain. Regex and advanced Excel formulas are in that list. So rather than forcing myself to endure the learning curve (which yes, I admit I COULD do), I find it much more efficient in moments like this to listen to my "brain is about to be crushed by complex topics - bail out now" inner voice.
So while I most definitely do link evaluations as part of my audit services, like every other aspect of my audits, it's strategic level pattern identification only. That way I can leave the trench-work to great people like Marie who have their own path, their own passions and tolerances for such things.
I've gotten my audit process down to a highly efficient process and because of Marie and others like her, there's no need for me to reverse that.
Love your comment Alan. That's been my philosophy my whole career. It is not about how smart you are, it's about how smart the people you work with are. I say work with instead of hire, because I get right down with them and never expect them to do things I wouldn't do.
Oh Alan...I'm going to take this as a compliment, but my goal was to make things simple to understand. I may not have met the mark on that point! I greatly appreciate the link audit work you send my way. Thank you. :)
A top-notch piece from a great mind of our industry. I'm damn sure the community is going to follow it religiously.
I'd like to know: is there any method you use to check the active links? I usually use Visual Basic in Excel for that, with this script:
Sub MakeHyperlinks()
    ' Convert the text in each selected cell into a clickable hyperlink
    Dim cl As Range
    For Each cl In Selection
        cl.Hyperlinks.Add Anchor:=cl, Address:=cl.Text
    Next cl
End Sub
Great tips. I was using Scrapebox to check active links. I've tried some other tools as well. But I found out the hard way not to trust those when I got a failed manual penalty reconsideration and it turned out that some of the links I was given as examples were ones that I had marked as "page not found". It's a pain, but I actually manually check each site to make sure that the page or site is actually gone.
You're right! Do check it out this way; I'm sure you're going to find it useful :)
This is a great post that has invited some excellent comments.
Thanks Marie. The post itself is a great effort; however, the method you have shared with us is rarely known. I must appreciate and admire the guidelines you have provided.
Thanks for explaining the whole process. It has been a lot of help to me.
Yes, I totally agree.
I do something similar Marie, but seeing your method and the other comments here I realize it could be easier. Also, I include the target URLs from the links, remove the duplicates and run them through a link checker to find links that return a 404 (the good links) so that I can recapture them with a re-write rule.
Another great idea! Thanks for sharing. I do often do this as well if I am dealing with a site that has built links to particular pages and then 404'd the page. This will often help me get through the audit quickly as we don't need to disavow links that point to 404 pages. However, we have to be careful that the site isn't also linking to another page on the site.
I like the idea though of checking for 404 pages that have good links pointing to them so that those pages can be redirected to a good equivalent and the link equity can be regained.
The only thing that worries me about doing that is if I don't have full control there is a possibility the site owner might decide "to fix" the 404's, no matter what advice is given - reactivating the poor links.
Exactly. This is why I still check 404 pages. If I am analyzing a really spammy link profile and I think that that 404 might have previously hosted an unnatural link then I'll disavow just to be safe.
Thank you for sharing this, really great tips. Going back to step 2 "Just show one link from each domain": what I like to do, before deleting any duplicates, is to list all the links along with other useful tags, for example follow vs. nofollow. Let's say I use only two columns: A with the extracted domains and subdomains, and B with the link type (follow/nofollow). After that I delete the duplicated A+B pair-combinations as follows:
In Excel
On the Data tab-> Data Tools group-> click Remove Duplicates
Remove A and B combination
Click OK
In Google spreadsheet
In column C run this formula
=unique(A:B)
Press enter
You will end up with a list of unique records (unique pair-combinations), which may help you avoid missing some of the exceptions you can miss by looking at only one link from each domain.
Also, I really enjoyed the animated gifs. I must say I have never come across such great use of them.
Thanks again for sharing these really useful tricks and formulas
Ah...this is good too. Thanks for sharing Raul.
This is my first time trying out animated gifs. They were very easy to do. I used Snagit to capture the video and GIFBrewery to make it into a gif. I had to play around with the frame speed so that I didn't end up with a 6 meg gif at the end, but it was quite easy to do.
Link auditing must be a terrible job to do... what do you do when a site has 20k pages?
It's not the number of pages that's the concern, but rather the number of linking domains. Often a site that has millions of links will only have links from thousands of domains and quite often just hundreds. I have created some software that helps me eliminate some of the low hanging fruit by marking the nofollows and other things like that. I also have a blacklist that is now over 22,000 domains that I always disavow (and some that I always allow) and this helps me speed up the process as well.
But there are some link auditing projects that take me a long time to complete.
Adriana,
as someone who routinely refers my site audit clients to Marie, I can state for a fact that she's got the capacity to handle the big projects. Just one example - site has over 20 million pages indexed in Google, and when I sent her that client for her to help them clean up bad links, the site had millions of inbound links.
That is so helpful, a complicated subject explained simply.
Hi Marie, Thanks for sharing such a brilliant article. Really makes a good read!
Hi Marie.
Very interesting article.
Unfortunately my level of knowledge about links and backlinks is a bit limited so far, but I'm going to start working on my current disavow file from Google.
Thanks for your insights
Always helpful Marie, although I have to ask - how can you love link auditing?
I honestly don't know why I love link auditing so much Richard. Once you get going it's kind of mindless and I can binge watch Netflix at the same time.
It's also pretty darn cool when I have cases where I get to see the results of my hours and hours of link auditing show up as improved rankings. :)
Wish I could watch Netflix at work... guess I will have to start my own company
Hi Marie,
I wonder about disavowing links. Is it good for a webmaster to do so? I have done it before for some links which were not relevant to my targeting, so I removed those. In that scenario, if we are doing it regularly, will it affect our website's ranking?
Nithy
I only recommend doing disavow work if either of the following apply to you:
-You have built links to your site with the attempt of manipulating Google
-You are noticing a large number of links that are there as a result of a hack on your website or possibly negative SEO. In this case, Google is usually pretty good at just discounting those links but they do recommend disavowing just to be safe. More info here: https://moz.com/blog/preparing-for-negative-seo
If you haven't been actively link building, in most cases you should not have to disavow. Google knows that every site collects "junk links" that aren't relevant. The purpose of Penguin is to demote sites that have actively been trying to cheat.
The original post is great, and the comments are a great bonus! Hook up with Annie Cushing for her Excel wizard skills, and you could sell this as an ebook/worksheet template!
Hi Marie, I really would like to work this out. But somehow I get a #ERROR in the 'A' Column. I tried to copy paste your formula, but it doesn't select the cells. So I tried: '=FIND(B1-FIND("/",CELL(B1)-1))' and the cells turn orange in the formula. But I'm still getting an error. Can you tell me what I'm doing wrong?
Larry,
When you are pasting the formula, instead of doing just a straight paste (i.e. CTRL-V), try doing CTRL-Shift-V to paste without formatting. Sometimes copying and pasting a formula will also paste odd formatting which will cause an error. Alternatively, try typing out the formula. Hopefully that helps.
Unfortunately it didn't :(
Awesome sharing Marie! This sure seems to be a complex and time consuming job. Content audits are really important and always missed out on by bloggers/content marketers.
Marie, thanks for the great article. I use a set of tools called AbleBits for Excel. It really increases productivity and automates many processes without writing formulas.
Another shout out for AbleBits. Love it. Well worth the $$.
Thanks for sharing. I like it.
Hi great article!
A tip: If you want to make it simple to get just the domain from the url, try my free add-in SeoTools for Excel where there's a function called UrlProperty that does this:
https://seotoolsforexcel.com/urlproperty/
Best regards
/Niels
Excellent, Niels. I've been a big fan of your tool set for years, but hadn't actually used this one. Will save me a lot of time. Thanks!
Wow, that's a really effective way to audit links. I know about "Link Detox", a paid tool where you can do the same, but it is really expensive. Your solution is definitely the best.
For #2, I'm a big fan of using pivot tables here. You can use the same step for #1 for isolating the root domain as a separate column, then make that the top tier in the pivot with the folder level domains aligning underneath. If you use "count" as the metric for this, it also gives you a quick tally of the number of URLs from a specific domain, etc.
Thanks Jeff. It seems that every time I read about pivot tables my eyes gloss over and I just can't get it. It does sound like something that would work well if I could wrap my head around it.
Hey Marie - Pivots used to do the same thing to me, could not get my head around them - now, I wonder how I lived without them. Seriously. If I don't get to do at least one pivot table a day, I get antsy. You will too. Try Lynda video or just keep messing with one and believe me the lightbulb will come on for you. What helped me, somehow, and I hope helps you is that pivots simply "dedupe and summarize your data table". Trust me, they'll be your new BFF in short order! Ping me if I can help with a quick screen share; seriously, you will love them, and I'd be glad to help.
Thank you David. I may take you up on that offer.
Marie
I am curious - while I understand the value that can be gained from first paring down to one link per domain, what I have found in my work is that sometimes it helps to check two or three links - because when I do that I can sometimes spot artificial patterns that the first link checked APPEARS to have, yet where the confirmation I get from the 2nd / 3rd link checked is enough to move me from "looks suspicious, yet borderline" to "OMG blatant pattern".
What's your process in cases where a link may appear borderline on first pass? (If that's a trade secret, by all means no need to reply with a detailed answer!)
Ha! No trade secret here. I'll gladly share.
As I'm auditing links I have a comments column that I use to write comments to the site owner. But I also have a column for comments just for me. Here's a very common process that I go through when auditing links:
...obviously unnatural...I'll mark this as disavow.
....obviously a good link...I'll mark this as keep.
....yahoo...another good link...
...yuck...that was an ugly spam link...I'll mark it as disavow....
...hmmmm....I'm not sure about this one...it could be a good link but it "smells" a little like SEO to me.
For those debatable links I'll make a note in my column. Let's say I think this is actually an Advertorial link (i.e. a paid link from a news publication), I'll make a note saying, "Advertorial?"
As I go through my audit, if I keep seeing links that smell like this one then I'll make more notes. If at the end of my audit there are only 1 or 2 links I've marked as possible Advertorials then I may go back and be more lenient on these. My ultimate decision depends on what kind of a penalty I'm dealing with. I may let a little more slide if it's Penguin as opposed to a manual unnatural links penalty. However, if I see a real pattern where there are a LOT of links that look like Advertorial links what I'll do is discuss these with the site owner to help form my disavow decisions. In some cases the site owner will say, "Oh yeah...we bought those links years ago." Well then that's an easy decision. But in other cases the site owner will tell me that these links came about as the result of some great press they got or an article on an authoritative site that got scraped many times. So, together we make the decisions.
I also have my blacklist as a guide as well. If I see a link that *could* be natural, but it comes from a site that I have disavowed for other clients then I'm much more suspicious.
The other downfall to just looking at one link from each domain is that sometimes you get pages where the link doesn't exist such as a /page2/ page or a feed url. In these cases I'll look up the domain in my original spreadsheet that contains all of the links and find a page that contains a link to my client. With that said, my in house software that I designed for myself is pretty good at picking a url for me to evaluate that is the most likely to contain an unnatural link. And how I did that is proprietary. :)
Wow Marie - you just described my process! That's essentially EXACTLY how I evaluate the links I review. Except scale. Because wow. Your scale is insane compared to my review slice. :-)
But uh, where you said "Yahoo... Good link". I was like "wait. No - a link from Yahoo is Good?" hahahaha :-)
Haha...perhaps I should have said, "Yippie!!! another good link".
I've a different kind of question for the author: how did you make 5576101d4bfd63.36099233.gif, this image? Thanks in advance! Good tips :-)
I captured my screenshot with Snagit which I believe is a free Chrome extension. I use Snagit for all of my screenshots - it's excellent for allowing you to annotate and describe things. But it also does some basic video work. Then, I bought a $5 Mac app called GIFBrewery. I had to play around with the frame speed to get a gif that made sense and wasn't huge, but it was quite easy to do.
If you're conducting link audits by hand, knowing a few convenient spreadsheet tips can go a long way. If you've downloaded an abundance of links from many diverse sources and want to visit and carefully assess a representative link from each domain, you may find yourself overwhelmed by the sheer number of links. Extracting the subdomain or domain from every URL, however, can make the process significantly easier and more efficient. Filtering helps too: if you filter your results so that you're viewing a single link from each domain, you can streamline the process considerably. A couple of other useful manual link audit tips include looking for patterns in suspicious links and regularly checking your disavow file. If you make all of these things part of your manual link auditing routine, you should find the experience significantly easier.
Wow Marie, thanks for this amazing article; this one is a life-saving post. I was searching for different ways to do manual link audits, so thank you. I only have one site to market, so I personally don't use any paid tools. My website is for a small advertising company in a small city in Kerala, so I am sure this is going to help me a lot.
Wow. After reading the article and the comments I suddenly feel rather humble about my Excel knowledge. That said, there has to be an easier way to do this, right? Someone's built a macro that does all this with one click, right?
I'm asking for a friend.
White hats off to you, Marie, if link audits are the part of your job which you love! When I'm staring at a spreadsheet with sometimes up to 1M rows (a rare occasion thankfully) even with each domain limited to 3-4 pages...I definitely wish I was doing a monthly SEO report!
Anyway, a nice post. As you mentioned, only showing one link per domain is quite risky. I prefer to include a minimum of 3 pages per domain, as just one opens you up to missing those spammy forum pages (for example) or comment spam, or even site-wide links where you might also have an in-content link on the same site, etc.
Valid points Andrew. Scroll up to see my comments to Alan Bleiweiss about my thoughts on just looking at one link per domain.
There are some great tips here, thanks. I'm not an Excel expert so always looking for ideas and alternatives.
My (extremely) simple tip for the disavow file for domains is: domains in column A, "domain:" populated into column B, and in column C use =CONCATENATE(B1,A1)
Once populated copy column C, paste in place using values only.
Yup yup yup...and you actually reminded me of another tip I was going to put in the article but forgot. I used to do the same thing as you, but now I actually have shortened the task a little. I create a new column next to my domains in column A and type this: ="domain:"&A1
I've forgotten concatenate already! ;-)
Thanks for sharing useful information for link builders, Marie!
It's a one-of-a-kind post. Link auditing is not easy work, basically, but these formulas will surely help. Excel is such an advanced tool. Thanks for sharing this post and the valuable comments.
It's good and very valuable detail shared for manual link audits...
I didn't work like this... I search manually: if I have to find health-related sites, I go to Google and use queries like "weight loss"+"partners", etc.,
and collect the data, searching manually in an Excel sheet... and I don't collect any subdomain websites; I collect 2,000-3,000 industry-relevant domains monthly for link building, without any duplicate domains.
I think my technique is slow...
This is simply top-notch stuff, Marie! Spreadsheets have a virtually uncountable number of formulas to explore in order to make life easy for the user. Your five tips are a great starting point for any SEO professional finding themselves in the myriad of reports arising from competitive analysis. Thanks!
Very informative article Marie. I used to apply the first two steps but didn't know the remaining 3, so I used to manually audit links one by one. It was a very tedious task to check all the links manually. Now, after reading your post, I am a bit relieved. It's really helpful to anyone, regardless of whether he or she is from the SEO industry or not. Thanks for the article.
Hi Marie, you are an expert at link auditing; please help me. In GWT I see links from google.com to https://www.oodlestechnologies.com/blogs/Implement-SocialAuth-using-Grails. I have downloaded the link table and I found some unnatural links like https://plus.google.com/wm/trollface-meme-troll-gif-pics-lol-funny/+Oodlestechnologies/posts/W2sYPrRgdeE. Can I disavow them? I asked in the webmaster forum too but no help so far :| TIA
Hi. That link doesn't resolve for me, so it's hard for me to assess it. Links that should be disavowed are ones that you yourself made for SEO purposes that hold very little value outside of SEO.
Brilliant tips, thanks for this.
Thanks Marie for the tips; you explain the whole process in a really good way, very easy to understand.
Hello Marie!
Thanks for sharing your tips with us.
I'm no expert in Excel, but looking online I managed to put together a decent Excel sheet to audit my website.
Initially I used Metricspot, a free audit tool that's rather simple to use, but I like to do it in an Excel spreadsheet, maybe because I like to learn new things and it gives me more satisfaction to learn how everything works.
I love how you audit.
Very nice post Marie, I like your tips.
I would love to read a future post about Easy Product Data Feed Management for Google Shopping; I would love to know shortcuts for managing feeds.
Thanks a lot.
Me writing a post on managing data feeds would probably be the equivalent of me writing a post on the best ways to cook on the moon. I have no idea!
Hi Marie, you are an expert at link auditing.
Thanks for sharing such precious information.
What an awesome write up Marie! Side point, did you use Licecap to make the .gifs or something else?
I used Snagit to screen capture the video and GIFBrewery to make it into a gif. It was quite an easy process.
Nice to read a crash course on the link audit process. Thanks for sharing.
Great, thanks!
Hello Marie, thank you for sharing these 5 tips for manual link audits. I work in an online marketing agency but these tasks are new to me; these 5 tips will help me do a proper manual link audit. It hadn't occurred to me to use these shortcuts in Excel for auditing links.
Thank so much, Marie! :-)
Very useful article, great. Thanks!
Hi Marie,
Great post! A quick question...
We have links from a few blogrolls, i.e. 100s/1000s of links from the same domain due to the blogroll being on every page. These are natural links and they come from decent blogs. They actually form about 80% of our total links even though they're from just a few domains. I know Google's John M has already said this is fine and there's no need to get real blogroll links removed/disavowed. But the issue is that our website name is itself a keyword. I won't name our actual website here, but let's pretend it's 'dress.com' and we retail dresses. Some people who link to our homepage use 'dress.com' as the anchor text, but others would use 'dress' as the anchor text. So we have blogs who have independently and naturally added a link to our homepage in their blogroll with the anchor text 'dress' or 'dress features'.
I'm reluctant to remove or disavow these links as they are genuine good links (and there are only 4 blogs/domains that have done this), but I'm concerned Google will think it's manipulated. What would you recommend?
Cheers
Simon
This is tough. I had a penalty client who had a similar situation as you did. But in their case disavow decisions were difficult because they had keyword anchored links that were natural but they also had a boatload of them that were paid links. When Google gave us examples of unnatural links some of them were actually natural ones.
I think that my answer to this question depends on how much link building you have done. If you have truly natural keyword anchored links mixed in with self made keyword anchored links then I might decide to remove or disavow some of the natural links that came from debatable sources. For example, let's say that a site that is linking to you naturally is also linking out with links anchored with "Free Casino", "Buy Viagra" or even "Fresno Lawyer", then I'd try to get your link removed.
With that said, if you have not engaged in building your own unnatural links and these truly are natural mentions then I would not worry about them at all.
Hi Marie,
Many thanks. The sites in question that link to us via their blogroll are high quality sites and don't link to any bad quality sites or use any questionable anchor text. So based on that, presumably it's ok to keep those links and not disavow even despite the large number of perceived links?
However, it sounds like we need to consider more than just the blogrolls based on your response. We have in the past engaged in link building. We've done a great job of having loads of those removed and disavowing the rest with a focus on links involving exact match anchor tags. But when auditing all the links, we've often retained links that appear natural where those links are to our homepage with anchor text similar to the keyword in our domain name. It sounds like we need to be more ruthless and even if links are natural, if they're from low quality sites and use the keyword in our domain name, then we should disavow?
This is really tough to answer without knowing a LOT more about the situation and even then my answers are my best guess. Regarding the blogroll links, don't worry about the numbers. I know it looks bad in GSC (GWT) when you see these, but really, Google doesn't count 10,000 links from a site as 10,000 distinct links.
Regarding the self made links, there are so many factors here. If you had a manual action then I would be very aggressive in trying to remove links. If you've been affected by Penguin, I might let some of these stay depending on how overtly spammy they are and on what patterns I saw with your link audit. If you haven't been affected by Penguin but are just trying to avoid getting hit I might be even more lax on my decisions.
Hi Marie,
It's really kind of you to advise here; much appreciated!
It's for recovery from Penguin. We've been very vigilant with carefully auditing every link and disavowing anything unnatural or from content farms, link pages, spammy sites, etc. But, based on your comments, I think we need to just go through those sites that link to our homepage with the keyword from our domain name and disavow a few extra ones that look particularly low quality, even if the link is natural.
Hi Marie,
It's really kind of you to advise on this topic.
We have truly natural keyword anchored links mixed in with self-made keyword anchored links, such as wm/trollface-meme-troll-gif-pics-lol-funny/, so do we need to disavow those links?
That's tough to comment on without knowing more about the situation. And even then, no one outside of Google can really say with certainty what the best action is. Ideally, if Google's algorithms are working correctly you should be able to disavow the self made links and keep truly natural ones. With that said, I have had some stubborn manual unnatural links penalties where Google has given us example unnatural links that were good, natural keyword anchored links. In that case we elected to disavow those as well because we wanted to be safe rather than sorry.
In most cases though, a site has only a very small number of true mentions that are keyword anchors.
Thanks for sharing Marie.