Google obviously looks at a great many factors in determining site rankings, but what do they know about us as administrators of our sites, and how do they use that information? In today's Whiteboard Friday, Cyrus talks about some of the actions that Google takes based on information it can see, offering advice on how you can reap the inherent benefits and avoid the pitfalls.
For reference, here's a still of this week's whiteboard!
Howdy Moz fans. Welcome to another edition of Whiteboard Friday. My name is Cyrus. Today we're going to be talking about how Google knows what sites you control and why it matters. How much does Google know about the sites you own? Can that be used to your advantage? Can it be used to hurt you? These are important questions that webmasters often ask.
Now technically, when there are relationships between websites, when you own websites, you have a lot of websites you control, this is traditionally known as an administrative relationship, meaning that you are an administrator. You can control the links on the site. You control the content of the site. Maybe these are sub-domains that you own. Maybe these are multiple properties within your business.
But Google, over the past decade or so, has spent an incredible amount of energy trying to figure out administrative relationships between websites, sometimes to help you and sometimes to discount links between those sites or even apply negative consequences. The reason this is so important to Google is that, if you think about the link graph and relationships between sites, links between sites controlled by the same person probably shouldn't count as much as editorial links controlled by other people, because when you get links, you want them to be natural and not something that you control.
The Good Side of Related Websites
But other times, Google wants to reward you for links that are related to one another. There are some definite advantages to establishing those relationships between sites that you own, and sometimes you want to tell Google that you own multiple sites. One example is to distribute the authority between those sites.
Now a perfect example is something like eBay. eBay has a site in the United States, and they open a brand new site in Ecuador. They want that Ecuador site to rank well, but they don't want to start over. They've already built up their American site so much. They want to transfer some of that link equity. So they want to let Google know that, "Hey, this is eBay. This is us. This should be an authoritative website."
This also works on a much smaller scale too, often on sub-domains. You see a lot of blogs being started on subdomains of websites because it's easier from a development point of view, or for whatever reason. You want that sub-domain, that blog, to have the same authority as your main site. Now it's oftentimes up to Google whether or not they give that authority to your blog or your sub-domain. But if you can give them signals to tell them, "Yes, this is associated with my main domain," that often goes a long way in helping that sub-domain to rank.
Same with alternate languages. You have French content. You have Spanish content. You have English content. They're all on your site. Maybe they're on a different sub-domain or a different top level domain, but you want Google to know that they have the same authority as your main site that you worked so long to build up.
Also, we're starting to see identity play a role in administrative relationships, more at a page level with things like Google Authorship and things like that. But identity is becoming a big issue, and Google is working to figure out those identities on the web.
Negative Side Effects of Related Websites
Then there's the flip side, the bad side of administrative relationships. That's traditionally what SEOs and webmasters have been dealing with when they think about these things. The biggest problem is diminished link equity. Again, there's the problem of Google seeing that you control these sites: why should those links pass as much link equity as links from sites that you don't control? So a lot of black-hat SEOs and gray-hat SEOs go to great lengths to hide the relationships between their sites, because they don't want Google to discount that link equity.
Also, there's the idea of link schemes in bad neighborhoods. If there are 12 sites, and they're all interlinking to each other, that might be a pretty good signal to Google that it's sort of a link scheme and those links shouldn't count, or they could be penalized.
Finally, we're seeing a new phenomenon in Google: penalties following people around the web. These are instances where people are penalized. They burn their site to the ground. They're so frustrated. They decide to just start over on a completely new domain. But when they do so, ironically, amazingly, they find the penalty transferring to that new domain, even though they've cut all the backlinks. They've changed the URL, everything. How does Google know that that's the same site?
So these are important questions to ask yourself to help determine: Can you be helped by establishing these relationships between sites, or can you be hurt? If you understand some of the signals Google is using, you can take advantage of this.
Potential Signals of Related Sites
Now one thing I want to emphasize is we don't know all the signals. We have a few clues. Traditionally, Google has been looking at things like ownership, WhoIs records, very freely available on the Internet, where your site is hosted, the IP address, things like that. Elijah, what's the name of that website that we go to, to check who owns what?
Elijah: SpyOnWeb?
Cyrus: That's right. SpyOnWeb. Here's a simple experiment you can do. Go to SpyOnWeb.com. Type in a very common domain, like Moz.com. You can see all the relationships that we have, Moz, with all these sites that we either own or hosted on the same IP or same Google Analytics code or the same AdSense code. All this information is publicly available on the web. You don't need access to your Google Analytics account or your AdSense account. It's all there in the source codes of the websites.
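As a rough illustration of how public this information is, here's a minimal Python sketch that pulls classic Analytics and AdSense IDs out of a page's source. The helper names and ID formats are illustrative assumptions; newer tag styles (such as GA4 "G-" measurement IDs) use different patterns, so treat this as a starting point rather than a complete detector.

```python
import re
import urllib.request

# Regexes for the classic tracker formats visible in page source.
GA_ID = re.compile(r"UA-\d{4,10}-\d{1,4}")   # Universal Analytics property ID
ADSENSE_ID = re.compile(r"pub-\d{10,20}")    # AdSense publisher ID

def extract_tracker_ids(html: str) -> dict:
    """Return the analytics/AdSense IDs found in a page's HTML source."""
    return {
        "analytics": sorted(set(GA_ID.findall(html))),
        "adsense": sorted(set(ADSENSE_ID.findall(html))),
    }

def fetch_and_extract(url: str) -> dict:
    """Fetch a page and extract its tracker IDs (illustrative helper)."""
    with urllib.request.urlopen(url) as resp:
        return extract_tracker_ids(resp.read().decode("utf-8", "replace"))
```

Run this across many domains and group by shared IDs, and you have the core of what a tool like SpyOnWeb does.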
By scraping the web and gathering all this information together, you can create a web of ownership that's pretty easy to dissect. Traditionally, C-blocks have been an indication of relationships on the web. It's something we report here at Moz in Open Site Explorer: the number of unique linking C-blocks.
Right now though, we are in a transition, where the web is moving to a new Internet Protocol version, Version 6 (IPv6). The old C-block was based on Version 4. So C-blocks are actually going away, and the engineers here at Moz, working with some very smart people in the consulting world, such as Distilled, are figuring out some new standards to report instead of C-blocks, because we're losing them very soon.
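For anyone unfamiliar with the terminology, here's a short Python sketch of what a C-block is, plus one possible IPv6 analog. Grouping IPv6 addresses by a /48 routing prefix is my own assumption for illustration, not an announced standard:

```python
import ipaddress

def cblock(ip: str) -> str:
    """Classic 'C-block': the first three octets of an IPv4 address,
    i.e. its /24 network (e.g. all of 216.176.191.x)."""
    net = ipaddress.ip_network(f"{ip}/24", strict=False)
    return str(net.network_address).rsplit(".", 1)[0]

def ipv6_block(ip: str, prefix: int = 48) -> str:
    """One possible IPv6 analog: group addresses by a routing prefix.
    /48 is a common per-site allocation, but the 'right' grouping is
    exactly what's still being figured out."""
    return str(ipaddress.ip_network(f"{ip}/{prefix}", strict=False))
```

Two sites whose IPs fall in the same C-block are likely on shared or related hosting, which is why "unique linking C-blocks" has been a useful diversity metric.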
Also, link patterns: when you have, again, a lot of sites linking to each other, and Google has a complete catalog of links, or the most complete catalog of links on the web, when you take all this together, using various statistical analysis methods, you can determine pretty closely who's associated with what and who has control over what. These are all things that people are looking at, all that publicly identifiable information.
Some signals that people don't often consider are what I would call soft signals or content signals. These are more advanced signals that people don't always think about, but things that Google could look at, that we've seen them talk about in patent papers, are things like when two sites have identical or similar content, meaning content on Site A is the same as content on Site B. This would be a strong clue to Google that it may be the same site. They would probably look for a few of these other things as well, such as shared registration or analytics code or something like that, because a lot of sites get scraped; on its own, it's not a very clear signal.
But if you're simply moving your site from one domain to another to escape a penalty, that may not be enough if you're using the exact same content and some of those other things, such as similar images. Two images hosted on different sites with the same content could be an indication that the sites are owned by the same entity.
Formatting, CSS, you'll often see sites that are owned by the same individual use a lot of the same WordPress templates, for example, or a lot of the same CSS files or JavaScript files. Again, by themselves, this is not a definitive clue because there's a lot of templates out there, a lot of free stuff floating around the Internet. But when combined with the other signals, it can create a very, very clear indication of those relationships.
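Here's a hedged sketch of how such a comparison might work in practice, assuming you've already downloaded each site's CSS/JS files: hash their contents and count the overlap. This is my own illustration of the idea, not a description of Google's actual method:

```python
import hashlib

def asset_fingerprints(asset_contents: list[str]) -> set[str]:
    """Hash each CSS/JS file's contents so identical assets across
    two sites can be matched without comparing full text."""
    return {hashlib.sha256(c.encode()).hexdigest() for c in asset_contents}

def shared_assets(site_a: list[str], site_b: list[str]) -> int:
    """Count assets that are byte-for-byte identical between two sites."""
    return len(asset_fingerprints(site_a) & asset_fingerprints(site_b))
```

A high overlap on a stock WordPress theme means little on its own, but combined with shared IPs or analytics codes it strengthens the relationship signal.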
Even something as simple as the contact details on your About Us page, if those are the same from site to site, it can be very clear that these sites are related.
Then on the page level, we have things like authorship. I've seen this work really well with in-depth articles, certain authors. This isn't a domain level signal, but more of a page level signal that can help individual pages to rank.
For content and language signals, there's hreflang. Again, this is for when you have sites in different countries and different languages; using this attribute can help establish those relationships to help you rank.
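For reference, a minimal hreflang markup sketch (example.com and the language variants are placeholders):

```html
<!-- In the <head> of each language version: -->
<link rel="alternate" hreflang="en" href="https://example.com/" />
<link rel="alternate" hreflang="fr" href="https://example.com/fr/" />
<link rel="alternate" hreflang="es" href="https://es.example.com/" />
<!-- Each version should list all alternates, including itself. -->
```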
So in general, it's very hard to hide these relationships from Google, because they have so much data available, and it's really not worth it. But oftentimes it is worth it in the cases of sub-domains, alternate languages, authorship, that you want to help boost these signals. Understanding how these all work can give you clues as to why you're ranking, why you're not, and sometimes what you can do to help.
That's all for today. Thanks, everybody. Bye-bye.
(hint... special bonus scene starting at 8:03)
Great post Cyrus, you've covered some excellent points which, if we're all honest, we might have forgotten the odd one or two of!
You have a very interesting point about Google penalties following you around the internet. As you say, Google is very clever at finding signals to make the connection between sites. These are my tips, which hopefully might help someone #TAGFEE!
From the work I have been doing for the past couple of years specializing in (mainly Google) algorithmic and manual actions, it is possible to move a site onto a new domain to escape a penalty, if that is your only (last) option, I might add. I personally would always recommend cleaning up a domain as the first option, but in reality sometimes it's just not possible due to the amount of spam or the resources required.
How to move a website to a new domain to escape a Google link penalty
You need to remove as many signals as possible between the new and old domain. These are some of the signals to block when doing this unfortunately sometimes necessary activity:
I'm sure there are many other identifiers which Google uses. As you say Cyrus, they have a LOT of data about sites, far more than we can even imagine I bet, but these are the ones which I have found to work if done properly.
Informing customers and visitors
The only safe way I have found to inform visitors of your new URL is to include the URL of the new website in image format, without a link. Obviously this isn't great for usability, so this should be backed up with a marketing campaign helping to inform your customer base of the move.
I hope this helps someone else out who is struggling with a Google penalty!
Any questions I'm on @mocwoods
Whew! That's extreme.
Hopefully the vast majority of webmasters will never have to face this situation. 98.5% of the time, if you are faced with a penalty, it's simpler and more cost effective to address the penalty itself.
I want to stress that if you do choose to start your website over, you risk losing:
So it is definitely an option of very last resort, for sites that have burned their link profile so badly that there is nothing left to redeem, and that have little invested in their brand and community.
Yeah, after experiencing a penalty following me to a new domain, I would rather over-egg it slightly than go through it all again. Lesson learnt the hard way in the early days.
I think you're probably about right with the 98.5% of webmasters not wanting to go down this route, thanks for clarifying how last resort this is for readers. It really is as you have said, when there is nothing left to redeem, and there were no good links in the first place.
Interestingly I did this new domain tactic, and the removal / disavow strategy with the same client on different geo-targeted domains. Each domain warranted a different approach, but both got results.
A little extra tip for anyone doing this. Watch out for intermediate links in your GWMT link data; this is a very good sign that your domain flip is not quite perfect, as Google has noticed the connection.
I concur here with Martin, this does help sites with a manual penalty. good work :)
Thanks for the confirmation that it works, Nick. Usually I go down the domain-cleaning route, as I said, but this way has worked for three or four dozen projects I have worked on over the years. My first couple of domain flips didn't work, but I quickly learned from my mistakes, as you do. It's nice to start afresh sometimes.
@Cyrus Shepard
We are looking for some clarification. In the whiteboard video, you say "But if you're simply moving your site from one site to another to escape a penalty, that may be not enough if you're using the exact same content and some of those other things, such as similar images. Two images hosted on different sites with the same content could be an indication that the sites are owned by the same entity." Yet @Martin Woods commented in his initial response that "New Content for the homepage and top level pages (ideally more than this, but it's hard with very large websites)," to which you replied "Whew! That's extreme... you risk losing: Your content, if you chose to rewrite it."
Can you please advise on what should be done regarding moving to a new domain and still using the old content?
Please check out the question we just posted. We could really use your assistance and clarification on this matter - https://moz.com/community/q/manual-action-penalty-...
Escape the penalty and then what? Do link building the "old way" that caused the penalty? That's the solution? ;)
Better to cope with it and change your link building to avoid being penalized again.
Here is a massive case study of such a case - https://karakehayov.com
Key points:
1) Once Google knows a website owner doesn't comply with Google guidelines, they will try to identify all his websites and penalize them, regardless of whether they are good or bad. They will dig into WhoIs data and everything else they can use, no matter how much effort it will take.
2) They manually penalize even domains that are redirected in a relevant way to other websites. That is, killing non-existent websites for breaking guidelines.
3) Reconsideration requests don't help. Google just wants to obliterate the owner completely and forever.
4) The case is about selling backlinks. Google repeatedly penalized the sellers and not the buyers, who rank. So it's not about search quality; it's all about economic reasons (greed). Google doesn't want people to buy backlinks, because that money could be spent on AdWords.
5) In the EU, Google is considered a monopoly. As such, they have the legal obligation to treat websites equally, according to their quality. As this is clearly not the case, they are breaking European law.
Several years ago, I helped someone with several small AdWords clients. One client didn't quite understand what AdWords was, and disputed 2-3 charges from Google on their credit card. The total was less than $60.
The AdWords accounts for ALL of the accounts I had verified under GWT (several from this person, plus two others that were independent of each other) were suspended without notice or explanation. The GWT verification was the one and only common thread in all of these sites. I finally did reach someone in AdWords by email and got the other accounts reinstated. They wouldn't elaborate on what had happened, saying they needed to protect their fraud detection methods.
In response to Cyrus' comment about blog/link networks:
"link networks that survive are becoming indistinguishable from actual quality web content."
It is kind of funny when you look at it this way actually, funny because it is quite true. I have friends who run some pretty major blog networks and do so successfully- but the investment, both in terms of time and money is HUGE.
These guys very carefully build these networks, set up social accounts for the sites, pay decent writers to hand-write unique, relevant content, don't manipulate the anchor text or link the blogs together, create unique logos, About Us pages, themes, etc., and you know what? It works, sometimes anyway, but it is a lot of work, and expensive, crazy expensive.
And here is the really crazy part, recently many of these sites have actually started to rank and get traffic!
Traffic?!?!? Rankings?!?!?! Wait a minute, this is supposed to be a blog network! Did all that hard work, attention to detail and decent content actually result in REAL BLOGs with real visitors?
Successful blog networks have had to evolve with the times to the point where they are, well, becoming real blogs.
Perhaps I am dating myself here, but it reminds me of an episode of Growing Pains where Mike Seaver fails to study for an upcoming test in one of his classes. Instead of studying he stays up all night and painstakingly writes the answers on the bottom of his shoes so he can cheat during the test.
Long story short, after putting in all that work copying the answers he finds he actually learned the material and ended up not having to cheat to do well on the test.
blah blah-
I think you all see the analogy here-
At a certain point, getting good at cheating and manipulating the system is even more difficult and time consuming, or at least equally so, as just doing things the right way. I won't lie: sometimes I check out the black hat blogs and forums and even try some of it on my own testing domains. A lot of that stuff is so crazy it's like, "Is that really easier than just writing some decent content and telling people about it?"
An oversimplification, perhaps, but the idea is there. I don't know how many nights I have stayed up late trying to perfect the latest ranking scheme only to realize, "Hey, I could have written something really awesome in that same time and it might have actually worked. Instead I just auto-generated 100 WordPress blogs, filled them with spun gibberish, SPAMMED the h*** out of them, and now have a bunch of worthless crap on my hands that I wouldn't dare link to a client's (or my own) site."
In the end poor Mike Seaver got busted anyway. Feeling quite proud due to his success he leaned back in his chair and put his shoes up on the desk in front of him. In doing this he exposed the bottoms of his shoes and his teacher saw that he had written the answers on them and assumed he had cheated even though he did not. The teacher gave him an "F" on the test.
Think about it, ha ;)
I believe that if you own two sites that compete in the same SERP, Google will throttle the weaker site. You might have two sites that are capable of ranking #1 and #2, but because you own both of them, Google will throttle the second site to the bottom half of the first SERP, even if they are absolutely different in content and one might even be retail and the other informational. Anybody agree?
Cyrus really 'working the room' - "Hey Elijah, what's that site we use.. ? ... "SpyOnWeb" - thanks Elijah :)
Interesting that a penalty can follow a site around even when starting afresh. I haven't done much in the way of penalty recovery, but I guess the best advice would be to change up the content and code as much as possible, so the site is as different and "new" as possible.
On the subject of code, unlikely to happen, but with this in mind, I wonder if the same WordPress theme was used on many sites with penalties, whether the theme could get a bad name in Google, and consequently 'innocent' sites using the theme could also be penalised. Maybe a mass test for Rand's IMEC project!
Hey Greg, I think you're spot on. I don't know if a theme alone would get flagged, but combined with other signals I think it's definitely something Google would watch for.
Spammers who spin up 1000s of sites at a time don't spend a lot of time on design, and typically reuse the same themes over and over.
What about Google Chrome, Gmail, and other Google properties being used to connect the dots... Might sound like a crazy conspiracy theory, but so was the NSA scandal before Snowden blew the whistle.
They are without a doubt tracking logins from webmasters. You can find all that info deep within your settings. If they haven't worked it into their algo yet, I'm sure it's coming in the near future.
Interesting topic Cyrus. The key takeaway for me is that there are many ways Google can infer which sites an individual or team administer so using manipulative linking strategies among sites you control should be used only in very specific circumstances like the examples you mentioned (e.g. like a new ebay site in Ecuador or a blog sub domain). Manipulative link building with keyword rich anchor text requires the special ring you mentioned at the end :) What was interesting is that even if you torch your domain, Google penalties can follow you around.
I wonder what will replace c-blocks in terms of the IPv6 implementation?
Re: IPv6 - while we don't know what parts of an IPv6 address search engines will look at (if any), our engineers have been examining this issue in earnest to try to find a conceptually equivalent measuring stick. We actually use these in our metrics here at Moz (such as Domain Authority).
We'll update everyone when we've got it all sorted out.
Great timing on this topic Cyrus! I was just discussing this with one of my clients yesterday. That tool is really cool! I especially love the end of the video! My precioussss....
There was a good article on Forbes last June about online fingerprinting technologies. I'm pretty sure Google uses a similar technology to build relation networks between Google accounts.
SEOs often switch between different Analytics and WMT accounts to separate the dodgy ones, but if Google uses fingerprinting then it's completely useless.
Forbes: https://www.forbes.com/sites/adamtanner/2013/06/17/the-web-cookie-is-dying-heres-the-creepier-technology-that-comes-next/
It seems that an easy way to think about it is this:
If a very (or even somewhat) attentive human being can detect patterns that may show affiliation between web properties by seeing patterns in linking, addresses, design, content, hosting and so on- you better believe the big G is going to pick up on it.
I know people who actually use proxies to work on their different sites if they don't want Google to suspect they are affiliated. No one really knows what Google can, can't, or will do to identify footprints and other things implying ownership of web properties.
All I know is that I am not so bold as to think I can fool a super algorithm developed by a team of geniuses backed by billions of dollars- with this in mind it makes sense to play the game by Google's rules, to me anyway ;)
Thanks Cyrus! So a question which has been bugging me is this
We provide website hosting as an add-on. Clients can choose (or not) to host their website with us. When we design a site, a client can choose to keep or get rid of a 'credit link' to us as the designers. However, those sites are on the same server as ours, so I am assuming the links will be of lower value because we're effectively cross-selling and gaining hosting clients - i.e. a valid business decision has a negative impact on rankings and link value.
Is this your understanding as well?
Hi Martin,
My assumption is any repeating-pattern link likely has less value, and putting them on the same server could be another potential signal.
That said, the situation you describe is extremely common. I've heard of sites getting dinged because of similar arrangements, but I've also seen sites continue to dominate the SERPs with this setup. It seems like there needs to be a combination of more elements to tip the scale, but yeah, this is a situation I'd be wary of long term.
Thanks for getting back to me. Really tricky to avoid without the cross-sell. Guess doing something to vary it is needed. We've worked really hard to prevent too many patterns in such links (e.g. not demanding anchor text) but hard to avoid the repeating IP thing.
Lots to think about. Thanks!
So basically it's only going to be a bad thing if you're doing bad things with one or more of your websites? Also, I wouldn't rely on SpyOnWeb too much; I just punched in our site and it didn't come up with most of the shared properties we have.
Most importantly of all, is that the one true ring to rule them all? "Them" being the Search Engines, obviously! :)
Another great resource is Nerdy Data:
https://search.nerdydata.com/analytics#
Search code, backlinks, analytics data and more.
Looks more promising, thanks, will check it out.
On initial glance I can see it is definitely an awesome tool, Thanks. :-)
NerdyData looks pretty good! Thanks Cyrus.
If anyone wants a laugh, find a client of your local web design company (search for "web design by brand"), then use that spyonweb.com tool.
You'll often find that they host every clients site on the same ip address, while still linking back to their site in the footer.
And in my case, this is the no1 reason they rank top for "web design + city".
So if you think about it, they host all sites on the same ip to show they are all affiliated, all linking back to the same web design site, shouldn't this pass a penalty with the theory of the video?
Oftentimes, it does result in a penalty. Not always, by any means... but it does happen unfortunately if Google suspects manipulative, editorial links.
A well-written reconsideration request gives you a chance to get back. But I think the website will be monitored afterwards.
Krzysztof, could you elaborate on what you mean? You're leaving a lot of comments in response to other comments, but they're not always making too much sense.
It all comes from my experience and knowledge. I haven't found anything written about Google "monitoring" websites after a penalty is lifted, but I watch the SERPs and sometimes get reports from customers about their SEO afterwards. For some amount of time the SERPs stand still (there's no defined period - a week, two weeks, a month, etc.) no matter what they do, but if they use "old techniques," they get hit fast, usually within days to two weeks. That's my conclusion: sites are monitored afterwards, and if they use bad techniques again, they'll be penalized fast.
Maybe we have a little language barrier here, but I understood what he meant in the comment above.
So if you (all) don't understand something I've written here or there, send me a PM. I know I use abbreviations and not everybody is familiar with them. :)
Matt _J_UK, you are most probably right, and I've learned my lesson recently the hard way. I work for a small agency in Greece. We build sites for our customers while at the same time we host all these sites under the same hosting company (shared hosting provided by a third-party company), not to show that they are all affiliated, but because the vast majority of clients do not want to mess with hosting details. We have not taken any manipulative actions (link building wise), but we use a "dofollow" "Powered by xxxx" link pointing back to our corporate site.
Everything was OK until early February (between 3-5 Feb), when we suddenly lost over 2/3 of our organic traffic. I didn't receive any mail in GWT indicating a manual penalty. Investigating the issue, I came to the conclusion that we've (most probably) been hit by a Penguin algorithmic update that took place in early February.
Since then, I have (very recently) converted the "dofollow" links to "nofollow," but we have not yet recovered our organic reach. I also suppose that the "administrative relationship" signal is strong, as I am also the verified owner in GWT and GA for all entities.
Any suggestion from the community will be much appreciated.
By the way Cyrus Shepard, your article helped me a lot to confirm the "bad" part of the "administrative relationship".
Why change to nofollow?
Just switch to a VPS where you can still host all the sites, but can use different IPs.
Pass the extra cost of the IPs on to the customers and you should be fine.
You don't want to lose that link power.
I was wondering the same thing Cyrus has explored here for the last few days. I saw the Zurb apps. They are all very good for link building. But I was doubtful whether Google would catch on if I used all four of their apps, which are these:
https://prntscr.com/3fkto1
So after this Whiteboard Friday, my confusion is resolved. I must say Google is a clever robot.
What if the domain owner info is hidden behind "WhoIs guard"? Do you think Google can see that too?
I actually don't know the answer to this, but I think if privacy protection is activated at the time of domain purchase (not after), then the privacy company becomes the owner of record before your registration data hits the web, and you in fact remain invisible in this regard (don't take my word for it, I could be very wrong about this).
Google is really smart, and there is nothing we can hide from Google. Google controls the online world and has access to every single website. Even the Ring can't hide you from Google's Evil Eye.
By the way I really like the concept of “Lord of the Ring” at the end of the video. :) It’s my favorite movie ever.
I always wait the whole week to read Whiteboard Friday's content. I have seen people change hosting and other things while keeping the same web content, and it didn't work. I just loved the video, and Cyrus, I need that secret black hat SEO ring, please provide it.
For companies that have separated their blog from the original site by putting it on a subdomain that doesn't link well to the home site, is making sure that there is a clear administrative relationship enough? I'm talking only from the perspective of Google rankings, because I believe it creates a bad user experience for their audience anyway.
Hey Cyrus,
Thanks for the exceptional WBF. You're right that Google is now very advanced at tracking domain authorship, but so are the black hats.
Recently at a gathering, a friend of mine told me that they are running a blog network that has been very successful. I straight away asked him, "You haven't been penalized?" He said, "We know Google's game and we know how they track. We have the domains registered from different parts of the world with international IPs and hosting data. We don't link to or talk about each other at all, and this is one of our secrets."
I'd like to know, is Google still not advanced enough to track these kinds of hacks? Plus, I'd love to see any real example of a penalty carried over because of CSS/JavaScript files.
By the way, you totally rocked the last part. May I also have that ring?.. :)
Looking for your feedback.
Thanks
Some of my black hat friends have suggested it's still possible to run link networks (though most have been burned in the past couple of years) but 2 major things have changed that make it both exceptionally harder and more expensive than it used to be. The networks of today are much larger, and if the network is going to last more than a few weeks the content has to be much better. In other words, link networks that survive are becoming indistinguishable from actual quality web content.
Makes you wonder.
Thanks for your thoughts, Cyrus. :)
I could see that SpyOnWeb shows domains that are hosted on the same IP but otherwise have nothing to do with me. I guess this is because I use shared hosting. I wonder if I could get a penalty if one of those domains (where I have no relationship, just the same IP on shared hosting) gets penalized by Google. My guess is that Google would recognize that it's just shared hosting, but I'm not sure.
Ditto. And there are some domains hosted at STRATO that it can't say anything about...
I'm guessing that if Google does use this type of information it would only be for manual reviews. Given the number of sites that are on shared hosting, it wouldn't make sense in my mind for Google to penalize sites across the board for being on the same shared host as a spammer.
"Finally, we're seeing a new phenomenon in Google: penalties following people around the web."
Any more articles/case studies on this? Google vs. black hats always makes for great reading :)
Hey Nick, I've heard stories of this anecdotally around the web, and it's usually not black hats. It's typically well-intentioned small and medium-sized business owners who hired the wrong marketing agency or aggressively pursued link building tactics that worked in their day but are considered outside of Google's guidelines today.
"Penalties following people around the web" - ouch! It's like you become a criminal in Google's eyes. I would also like to see more on this.
People learn from their mistakes and a lot of companies will test out different strategies to see what does or does not work (the smaller agencies). I guess the days to "experiment" are long over since there is so much at stake...
Thanks for the reply Cyrus, that's certainly interesting to know!
I have always wondered a lot about this topic, but never found anything good to read. Reading this was a breathtaking experience. No questions!
But just one thing: what about people using a proxy for their server, location, or website? Can they trick Google with proxies?
Also, is there anything one can do (if we could, or if we just wanted to) to stay anonymous?
There's no foolproof system, for Google or anyone else. Proxies and different programs can effectively hide your behavior from Google, but it's cost and labor intensive.
From a browsing point of view, however, it's much simpler to remain anonymous than it is when hosting a website (even from the NSA).
How much for the black hat ring, and is it available on Amazon?
Does spyonweb.com not work anymore? I keep getting an empty screen, from any browser...
Found the solution already: you have to use a VPN server to be able to access the page. It doesn't work from my location.
Excellent post Cyrus.
I have a question related to the IP address issue that you spoke of.
"You can see all the relationships that we have, Moz, with all these sites that we either own or hosted on the same IP or same Google Analytics code or the same AdSense code"
So when setting up a site or account for a client on the server, do you think it is a good idea to have the accounts on a dedicated IP address?
We host and manage a lot of accounts, that have separate services and audiences, and have been using dedicated IP addresses for each account, unless they are all under the same account (multiple sites). Do you think this is the best method of setting up the accounts? Our company website is also on its own IP, and not shared with any other client account. All the accounts are located in the same analytics account though. Any advice or insight would be greatly appreciated!
Unless you have reason to hide these relationships, this is perfectly normal and expected.
Great WBF, Cyrus! Migration of sites to IPv6 is something everyone is going to go through, and it will be exciting to see what comes of it. C-block links are something that really matters when it comes to being followed by the SEO watchdog, Google, so paying attention to them may be a surefire way to get through this trap. Also, Martin's comments with tips on preventing Google from following webmasters were very commendable. Thanks again!
I would like to talk specifically about network sites. Both through manual checks and through its algorithm, Google can determine whether a set of sites is potentially part of a link network. I would say that in 60-70% of cases, it can be found out in one of the following ways:
1) Links to the network's posts are either zero in number or consist of very few poor-quality links
2) Each site in a given link network links out to at least two common money-making sites.
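The two signals above could be sketched programmatically. Here's a toy Python illustration; the site names, link data, and thresholds are all invented for the example, and real detection would obviously be far more involved:

```python
# Flag a possible link network using the two signals above:
# (1) inbound links to the network's posts are near zero, and
# (2) every site links out to the same few "money" domains.

def common_outbound_targets(outbound_links, min_common=2):
    """outbound_links maps each site to the set of domains it links to.
    Returns the targets shared by every site in the group, if there are
    at least min_common of them (a crude link-network footprint)."""
    shared = set.intersection(*outbound_links.values())
    return shared if len(shared) >= min_common else set()

def looks_like_network(outbound_links, inbound_counts,
                       min_common=2, max_inbound=3):
    """inbound_counts maps each site to its count of inbound links.
    True only when both signals fire at once."""
    shared = common_outbound_targets(outbound_links, min_common)
    low_inbound = all(n <= max_inbound for n in inbound_counts.values())
    return bool(shared) and low_inbound

# Hypothetical example data:
outbound = {
    "blog-a.example": {"money1.example", "money2.example", "news.example"},
    "blog-b.example": {"money1.example", "money2.example"},
    "blog-c.example": {"money1.example", "money2.example", "other.example"},
}
inbound = {"blog-a.example": 0, "blog-b.example": 1, "blog-c.example": 2}

print(looks_like_network(outbound, inbound))  # True for this toy data
```

With real sites you'd need crawled link data, and either signal alone produces false positives, which is why the sketch requires both at once.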
Yet another prime example of it's not what you do, but rather how you do it.
I'm pretty sure that Google knows more about what I've done online than I'll ever be able to remember. With that in mind, I try to keep as clean as possible. Thankfully, to date I've not been penalised. But what's considered clean today may not be tomorrow, much like underwear I guess. :)
It is difficult to create a blog network with different systems and languages. You have to spend more time on it.
Cooooool Whiteboard Friday! Thanks for sharing! I was asking myself those questions as well :) Well done!
Really great information. I do manage many sites, but most are not my own, and they are all over the board; we try to gain natural links and content for them. I think I'm in the clear..... I think...
I think it's more likely, and far more accurate for Google (penalising the innocent would not be good), that they track your NAP (name, address, phone number).
How I would do it.
This is just a great video. I hope that all the things you explained are the same methods Google uses to find the relationship between two domains. I learned a lot. Thanks.
Back in December 2013, Google penalized 8 websites, including some I had removed from GWT a few weeks before. I started a discussion about it at Webmaster World:
https://www.webmasterworld.com/google/4632518.htm
Guess your post explains it all, Cyrus. :)
I've always interlinked my websites whenever it was relevant. I don't have to justify myself to Google, but I understand their point, given that they run a link-based search engine.
I guess that's why not a lot of personal blogs rank in Google, as interlinking is a common practice in the personal blogging world.
Link building these days is hard, so a good approach is to have several websites, each covering only one of the services you provide. I've found this a rather cheap way to boost rankings for the desired keyphrases. As we all know, theme matters these days. A good example is a multi-flavor cake: if our company has 10 services, the cake has 10 flavors, so only 1/10 is the flavor you want (the query typed into the Google search box). Another cake is 100% one flavor, and it's likely to rank higher in the SERPs (without any link building at all, just content). If you want chocolate, you'll choose the 100% chocolate cake, not the 10% one, and Google does the same when it sets rankings. So making 10 different but useful single-service websites is good practice. I said useful and user-friendly, so this isn't a spammy technique to boost SERPs for a main website; the main site could instead be reduced to a list of services with short descriptions. Doing link building for these new single-service websites makes it easier to reach high positions than doing it for a main website that isn't focused on one service.
In my opinion, when you build a new website like that, you don't have to hide information to avoid being tracked by Google. Of course, doing it right (no spammy techniques at all, no unnatural links) and using nofollow anchors should be all you need. In that case, using the same GA account and GWT won't harm you.
But if you want to be more invisible (my thoughts on how to hide):
- make the whois private when you buy a new domain
- buy domains from different registrars
- buy not only .com but also .net, .info, .biz, etc.
- don't buy exact-match domains with money keywords in them
- don't buy all the domains at the same time
- use different server providers
- use a different C class (at least) for each domain
- every C class must be in the same country as the service you provide: if your business is located in the US, the C class must be in the US and, if possible, in the same city
- change the NS servers to, for example, ns1.domain.com
- use a different template (the more unique, the better)
- don't copy content between websites
- don't link from one website to another
- don't use GA and GWT at all, OR if you want to, create a different account for each website, BUT every login must come from a different IP (use a proxy, an IP hider, etc.)
- if you place a link to your main website, don't forget to also link to others, including competitors (yes!) and the brand's main company website
More paranoid?
- don't reach your website through Google search
- don't check the SERPs with Google
- hide your computer's IP
- block the GA code (Comodo's PrivDog, for example)
- block every Google tool on your computer, or delete them
- don't ping Google or send sitemaps at the same time for every site
- don't write content for all websites at the same time
- write content during business hours (if you have some sort of automatic tool posting the content you write, set its posting hours)
Did I miss anything?
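A couple of the checklist items above refer to C classes, i.e. the first three octets of an IPv4 address. A minimal sketch of how that footprint could be checked, using invented example addresses:

```python
def c_block(ip):
    """Return the C-class prefix (first three octets) of an IPv4 address."""
    parts = ip.split(".")
    if len(parts) != 4:
        raise ValueError(f"not an IPv4 address: {ip}")
    return ".".join(parts[:3])

def share_c_block(ip_a, ip_b):
    """True if both addresses sit in the same /24 (C-class) block,
    a classic footprint that can relate two sites to one host."""
    return c_block(ip_a) == c_block(ip_b)

# Hypothetical addresses for two domains on the same host:
print(share_c_block("203.0.113.10", "203.0.113.77"))   # True: same C block
print(share_c_block("203.0.113.10", "198.51.100.10"))  # False: different blocks
```

In practice you'd resolve each domain to its IP first (e.g. with a DNS lookup); the comparison itself is just string work on the octets.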
Thank you for your ideas! This is a correct analysis.
I've read multiple times that Google doesn't use Google Analytics data/signals when ranking your site. Are you suggesting that they might use the "metadata" from Analytics, such as which sites are linked together administratively, to a site's advantage or disadvantage in the case of a penalty or a suspect neighborhood?
I think a lot of SEOs assume that Google wouldn't use Analytics relationships because there are lots of SEO/Inbound agencies that have admin access to dozens, if not hundreds of sites. Perhaps the strong signal they use is who originally set up the account and under what email. Either way it's unsettling that they would use Analytics information, but I guess if you've got clean, honest sites there's nothing to worry about?
Right. Anyone can read the GA account number in your source code without having access to the data itself. We don't know if Google or Bing actually use this information, but they could.
So in this sense, you can't see who has access to the accounts (such as an agency) but you can see if the properties share the same account number.
This also means they don't have access to the email that set up the account, just the number itself and its association with the website.
So if you're an agency managing multiple sites, no worries.
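To illustrate how visible the account number is: the classic analytics snippet embeds an ID like UA-123456-1 in the page HTML, so anyone can extract and compare the account portion with a simple regex. A toy sketch with made-up IDs:

```python
import re

# The classic Google Analytics ID format is "UA-<account>-<property>".
# Two pages sharing the same account number (the middle part) are
# administratively related, even if the property numbers differ.
GA_ID = re.compile(r"UA-(\d+)-\d+")

def ga_accounts(html):
    """Return the set of GA account numbers found in a page's HTML."""
    return set(GA_ID.findall(html))

page_a = "<script>ga('create', 'UA-123456-1', 'auto');</script>"
page_b = "<script>ga('create', 'UA-123456-2', 'auto');</script>"

# Same account number, so most likely the same owner:
print(ga_accounts(page_a) == ga_accounts(page_b))  # True
```

This is exactly the kind of lookup tools like SpyOnWeb appear to do at scale: crawl pages, pull out the shared account numbers, and group domains by them.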
In my opinion it's naive to assume Google doesn't use the data. It's possible they don't use it, but if I had to put my money down I'd say they will use it should they feel it helps them to improve SERPs and reduce spam. I mean why else would they pour millions of dollars into a free tool, if not for the information they get back from it.
I don't know if this is just paranoia or not, but I have always wondered if a penalty has followed me or my email address without my being aware of it. As in, I was penalized by Google but never given notice. Does that happen?
Here is the history. Around 2007, when Google Maps let you edit businesses, I paid a bunch of Amazon Turkers to update locations that did not have their website listed, so they would link to the business's page on my business directory. Some did just that; others removed websites and put in my link. Google would change some back; others would slip through. In fact, I still come across some from time to time. That was my first website, and I thought I was really clever, and it worked for a time. And then it didn't. Since then I have designed other sites, kept my nose clean, and stayed away from black hat tactics. I have never been formally penalized by Google. However, my sites rank well in Bing but are in the weeds in Google. Something like 90% of my ad income for various sites comes from Bing traffic.
Paranoid? Will this follow me forever?
The Google search engine is really very smart, and every word we search is recorded on its servers. Google controls the online world, and they have access to every single website.
Cyrus, great job knowing how to both educate & entertain your demo ;)
As a marketer who is more familiar with the content side, I found this super informative. Love how Whiteboard Fridays switch it up to cover every aspect of SEO.
It seems nowadays that the SEO conversation is all about getting out of Google penalties. Just stop getting into them in the first place.
So don't use toxic spammy blackhat cheap techniques:)
Good information, but I've seen a few industries, especially Legal, where most of the websites have more or less the same content and images. Since they use standard clauses and definitions, most of the content is almost identical. I've personally experienced this: I had more than 50 legal sites for more than 15 attorneys, all using almost the same content and images, and gradually all the websites started dropping in the SERPs and I lost all my clients. I had never used any black hat technique, but as you explained, Google might have thought that these websites were owned by the same person. Isn't this a flaw in Google's technique for identifying whether websites are owned by the same person?
They may have the same clauses and definitions, but if that's the only content they had, or even the majority, why wouldn't you just link to it?
A website is a lot more than clauses and definitions. Each should have had their own unique content. Look up some more posts on duplicate content.
Haha, and that's my daily SEO job :) You forgot about blocking bots from Moz (haha), Ahrefs, MajesticSEO, etc., to avoid negative SEO by competitors. Or to avoid being found by them.
You forgot to mention deleting yourself as a user in Google Webmaster Tools for any shady domains.
Hi Cyrus! Your post is really interesting because you share some information I have never read before. Secondly, you have given the site link where we can find all that information, which was really pretty awesome. Thanks.
Hey Cyrus, great Whiteboard Friday! I wanted to get your thoughts on multiple sites on a smaller scale with the same owner. For example, a personal injury attorney that has, say, a main firm site, then a site on birth injuries, and then a site on nursing home abuse. I'm not a fan of the multi-site strategy and have seen signs of Google penalizing these companies. Do you see this as an effective strategy, or would it be more beneficial to have all the content on one domain?
Like you, I'm not a huge fan of multi-site strategies simply for ranking purposes. It's harder to build authority to four sites than it is to one. And a lot of law sites tend to interlink themselves together with way too much over-optimized anchor text.
If it makes sense for the audience, by all means do it. Many companies sell different products that benefit from their own branding.
For example, Procter & Gamble could reasonably have hundreds of different sites for their catalog of products. For the average law firm, not so much.
Great post.
It is not so easy to identify relationships between websites.
This article is really amazing, thank you.
I'm surprised you didn't mention anything about phone numbers and Android devices. Having the same phone number associated with different Analytics or Webmaster Tools accounts seems like a dead giveaway signal they are controlled by the same person or company.
I also assume that if you link the same Android device to two accounts (let's say that you use Gmail as your 'official' email-address for the domain) that Google would be able to link the two.
Some people go even further and assume that accessing different accounts from the same IP (say your residential or work IP) would signal a relationship to Google. I don't know if they do this, but I'm sure they could if they want to. The data is theirs for mining.
Thanks for a great Whiteboard Friday. The SpyOnWeb tool is great, thanks for sharing. I wonder if Google can pick up signals from shared Google Analytics or Webmaster Tools accounts (where you wouldn't technically have the same code on the page).
Cyrus, I see in my local market that a business has bought up 100+ keyword-stuffed domains, each website covering a different piece of the business, with limited unique data and very plain sites. Each website has the same contact details, names, etc., so it should be crystal clear to Google that it is the same site owner/business across domains. The sites have no backlinks (or close to none) and are overloaded with keyword stuffing, yet all rank on page 1 of Google for the keywords in their domains. If Google is as advanced as you indicate, how is this possible? I can understand one domain with keywords, but when Google sees the same contact details across 100+ domains, how can it rank each of them on page 1 for relevant keywords that many other businesses with one domain and much more relevant content are competing for?
Unfortunately, spam like this still happens.
As a long term strategy, it hardly ever works out for most folks.
Also, we only see the ones that Google hasn't caught. You don't know if there are 50 businesses that have done this and failed for every one business that is left.
Hi Keri, it would be interesting to hear about the ones Google has caught vs. not caught. It is easy to point out one case, like I did, that has gotten away with it, and likewise you can point out one case that has not. If there were an objective test of 100 websites and we saw a pattern of 70 being caught, then the writing would obviously be on the wall. Not an easy experiment, but it sounds to me like there is an assumption that Google is very advanced in these types of tracking (which may be the case), and I feel there is a lack of data to back up the claim.
Here is my opinion: if you don't install the Google Analytics code or the Webmaster Tools code on your website, then they will have a hard time tracking all your data and links! I know everyone loves Google's tools, but you can find other great ones on the internet as well.
The bottom line is that Google hates anyone who does any link building. If you're doing so-called "white hat" link building today, it will most likely be considered black hat tomorrow. All I'm saying is, if you're going to do any excessive link building, then uninstall Google's code from your website and use another tracking program. Google loves having everyone wrapped around their little finger!
Good link building - without money keywords.