Tonight's post comes via the Pubcon conference in Las Vegas and is likely of interest to many in the webmaster and search communities. Today, during the Interactive Site Review Session, Google's head of web spam, Matt Cutts, along with Vanessa Fox of NinebyBlue and Derrick Wheeler of Microsoft, took thorough dives into a number of sites. The session was well covered on Twitter, and in live form by Barry Schwartz at SERoundtable.
Matt Cutts and Vanessa Fox on the Site Review Panel (photo credit: davecolorado.com)
A few points in particular stood out and are worthy of coverage:
- Blocking Internet Archive may be a Negative Signal
Matt Cutts noted that spammers very frequently block archive.org from crawling/storing their pages and few reputable sites engage in this. Thus, it's a potential spam signal to search engines. SEO Theory has a good writeup on when and why there may be legitimate reasons to do this, but webmasters seeking to avoid scrutiny may want to take heed.
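For anyone curious whether a given site does this, here's a minimal sketch (Python 3, standard library; example.com is just a placeholder) that checks whether "ia_archiver" - the user-agent the Internet Archive's crawler identifies itself as - is disallowed from a site's homepage:

```python
# Check a site's robots.txt to see whether the Internet Archive's crawler
# (user-agent "ia_archiver") is blocked from fetching the homepage.
from urllib.robotparser import RobotFileParser

def blocks_internet_archive(domain):
    parser = RobotFileParser()
    parser.set_url(f"https://{domain}/robots.txt")
    parser.read()  # fetch and parse the live robots.txt
    return not parser.can_fetch("ia_archiver", f"https://{domain}/")

if __name__ == "__main__":
    print(blocks_internet_archive("example.com"))
```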
- Web Page Load Time can Positively Influence Rankings
Maile Ohye actually mentioned this at SMX East in New York, but Matt Cutts repeated it again today. In a nutshell - while slow page load times won't negatively impact your rankings, fast load times may have a positive effect. This comes on a day when the Google Chrome blog introduced their new SPDY research project. I'm particularly happy about this news, because it's also true that load times have a positive second-order effect on SEO. Pingomatic recently published some excellent research on load times from Akamai, noting that users' expectations for faster web browsing have doubled in the past 2 years. In addition, fast loading pages are, in my opinion, considerably more likely to earn links, retweets and other forms of sharing than their slow-loading peers. This tool from Pingdom is a great place to start testing your own site.
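If you want a very rough baseline before reaching for a full tool, even a tiny script will do. Here's a minimal sketch (Python, standard library; the URLs are placeholders) that times just the raw HTML download - only one slice of what users actually perceive, but a useful starting number:

```python
# Time the full HTML download for a few URLs. This ignores images, scripts,
# CSS and render time, so treat it as a coarse baseline rather than a measure
# of real user experience.
import time
from urllib.request import urlopen

def fetch_seconds(url):
    start = time.time()
    with urlopen(url) as response:
        response.read()  # pull the whole document body
    return time.time() - start

for url in ["https://www.example.com/", "https://www.example.org/"]:
    print(f"{url} -> {fetch_seconds(url):.2f}s")
```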
- It May be Easier to Walk Away from Banned Domains
Sites that Google's webspam team has severely penalized or banned entirely from the index can be very difficult to re-include, and thus, Matt suggested that "walking away" and "starting over" may be a more prudent strategy. In my opinion, this is largely due to link profile issues - if your site has a "spammy" link profile, it's tough to ask an engineer to sort the wheat from the chaff manually (or algorithmically) and discount only the bad links. Thus, re-consideration requests may not be as effective a use of time as registering a new site and trying to re-build a more trusted presence.
- Repetition of Keywords in Internal Anchor Text (particularly in footers) is Troubling
During a specific site's review, Matt noted that keyword usage in the anchor text of many internal links, particularly in the footer of a website, is seen as potentially manipulative. Yahoo!'s search engineers have noted this in the past and we at SEOmoz have seen specific cases where removal of keyword-stuffed internal links from a footer had immediate impacts on Google rankings (removing what appeared to be large negative ranking penalties sitewide).
- Having Multiple Sites Targeting Subsections of the Same Niche can be Indicative of Spam
Matt Cutts today mentioned that "having multiple sites for different areas of the same industry can be a red flag to Google." Though Googlers have mentioned this before, today's site review panel brought renewed attention to Google's ability, and proclivity, to carefully consider not only an individual site, but all the other sites owned by that registrant/entity/person. Given Google's tremendous amount of data on web usage behavior, many SEOs suspect that they track beyond simply domain registration records.
I also presented at Pubcon today - on a panel called Linkfluence: How to Buy Links with Maximum Juice and Minimum Risk (live SERoundtable coverage here) - as the counterpoint speaker (on why not to buy links). I'll try to have that presentation in written format early next week on the blog.
p.s. I was asked by a large number of attendees at the conference about our venture capital fundraising experience. I expect to be able to write about that very soon and certainly appreciate all the support. :-)
p.p.s. For those who are interested, my brother, Evan Fishkin (who works at Portent Interactive) had his head shaved by Google's webspam chief. On a personal note, I must say I was particularly impressed with Matt's ability to shave a head without nicks or cuts, and his foresight in bringing proper equipment. Unfortunately, I'm not fully briefed on why this occurred, but I do know that my little brother was in terrible need of a trim (photo of my shocked observance of the event here & more photos/video here).
Speed is money. Page load should be a KPI of your web developers because sure, there is some SEO benefit, but the biggest benefit is to your bottom line. Low perceived load time = high user satisfaction = profit.
A 500ms delay that Google accidentally introduced caused revenue to drop 20%. There are also numerous A/B tests showing the cost of speed to Amazon and Yahoo.
Interesting sources you have there.
What's the best practices for tracking load time?
Here you'll find a tool to download for tracking load time: https://code.google.com/speed/page-speed/download.html
In addition to the site review session, Matt also mentioned site speed in the Super Session. There, he stated that although Google hasn't used how fast a site loads as a ranking factor in the past, they were seriously considering it for the future. He specifically said that it would be a good idea to get your sites loading quickly in 2010.
Thanks for your addition, Jennita... I think speed is indeed quite important. Google already uses speed for Quality Scores in AdWords, so it's a logical step to include the factor in the organic results as well.
And the speed factor can hopefully be another good reason why many really old institutional websites (especially small companies' websites) that are still designed without CSS or other lightweight approaches to web design will finally be updated.
I don't know about you, but sometimes it's really hard to convince a company that the best first web marketing action should be a total redesign of its website.
That makes it sound like it might be in "Caffeine", or at least something they can turn on?
"Matt suggested that "walking away" and "starting over" may be a more prudent strategy."
But I've heard Matt say time and time again there's no such thing as a bad link, and that Google recognizes that Web Masters can't control who links to them.
Are you saying he said something different or is this just your opinion?
This is true, and we know that bad sites in bad neighborhoods will link to good sites in good neighborhoods to help legitimize their site.
But, it's the bigger picture that is at play here. If 5-10% (just random numbers here) of your links come from bad sites, then this probably doesn't even cause a blip on the radar; but if 90-95%, or maybe even 50%, of your links come from bad sites and neighborhoods, that starts to raise concern. This starts to look less like normal linking and more like some kind of association between sites.
Keep in mind that these scenarios aren't the norm. Most sites are not banned, or even penalized. If they are simply penalized or dampened, then there are probably steps that can be taken to reverse that. I'd say that the walk-away strategy being described is for much more severe instances.
Sounds good. I'll have to do some more research into what exactly makes a bad site and a bad link, and into the details of what you said. Got any ideas on articles that could help jump-start this research?
One of the things I want to avoid is creating such a bad neighborhood from scratch. And just general research into this area.
Anybody know what site Matt was referring to when talking about footer links? We've all seen this done in varying degrees, and many instances are spammy and understandably a concern. So I'd love to know what site he was pointing out when mentioning this... what does this site look like in the footer?
In the SERoundtable coverage that's linked to above, Barry did specifically note which sites were being reviewed during each back-and-forth, so you can see them for yourself.
you mean here:
https://www.seroundtable.com/archives/021107.html
about the nileguide_com site?
www_nileguide_com/destination/prague/best/off-the-beaten-path-things-to-do
this footer was said to be spammy?
How would you suggest titling those links? (Yeah, maybe have a bit fewer, too.)
I knew that if a page took too long to load, a user would leave the website after so many seconds; it's good to know that search engines are paying attention to this as well.
Actually, with modern web sites (data, video, etc.) it's very easy to have a page that technically takes a long time to load but appears to load almost instantly to the user. I don't suppose Google will take that into consideration - how could they? It's not a human that goes to the site.
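That gap between "technically loaded" and "feels loaded" is easy to see even with a crude script. A minimal sketch (Python, standard library; the URL is just a placeholder) comparing time-to-first-byte with the full HTML download:

```python
# Sketch of the distinction above: time-to-first-byte versus the time to pull
# the entire HTML document. A page can respond quickly yet carry a heavy total
# payload, or vice versa. The URL below is a placeholder.
import time
from urllib.request import urlopen

def ttfb_and_total(url):
    start = time.time()
    response = urlopen(url)
    response.read(1)              # first byte has arrived
    ttfb = time.time() - start
    response.read()               # read the remainder of the document
    total = time.time() - start
    response.close()
    return ttfb, total

first_byte, full = ttfb_and_total("https://www.example.com/")
print(f"TTFB: {first_byte:.2f}s, full HTML: {full:.2f}s")
```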
I wonder if this has anything to do with their whole deal with ISPs on the net neutrality thing?
I'm interested in learning more about the load time of websites and its effects on SEO. What factors are taken into consideration?
If a site targets UK users with a server located in the UK, which loads a bit slower in the US, would that have any effect on UK rankings?
How slow is too slow?
How much of an impact will it make?
On the other hand, if the speed factor is going to be important (or more important), then on the technological side it will be a must to also study the quality of the hosting company where the website to be optimized is hosted, or the quality of a company's internal server infrastructure.
As before, I'm writing having in mind small to medium size companies.
Agree, it would most certainly be something to rank hosting companies on. The question is just how.
If a server is quick in the US and slow in EU, what effects will that have on rankings?
I think that's a great question to send over to Google. If I get the chance, I'll ask myself.
Thanks for that. If load time of sites is going to be a highly important factor, it would be good to know more about it :)
Interesting about the footer links - I've seen many well-ranking sites with crazy amounts of keyword-rich internal links in the footer. One of my sites has in fact tripped a filter and been penalized 10-20 spots before, but was easily reinstated when the abundance of links was removed.
I'm really interested in the note on "Internal Anchor Text (particularly in footers) is Troubling." I'd be interested in hearing other people's thoughts on this, as my company is going through a redesign and rebuild.
This is making me think I should go easy on the footer links that go to deep-level pages and maybe just keep the links going to things like "My booking", "About Us", etc.
Really interested in the talk of page load time becoming a ranking factor - I suppose this ties in with SERP bounce rate too: if someone clicks on a result but has to wait a few seconds for the page to load, chances are they'll hit the back button and look for something else.
@shanedj - we've definitely seen it being harmful in some cases, although many times it stays under the radar. It's something I would consider stripping out if you're diagnosing ranking drops. Beyond that, I'd say use footer links (a) to link to 'new' pages that aren't already linked to from the nav menu, and (b) make them section/page specific rather than the same ones sitewide. They CAN really help the flow of juice within a site if done sensibly.
Ok, so linking to the about page with the anchor text "about us" sitewide is an issue? This doesn't really make much sense and sounds like crazy-making for webmasters.
A handful of sitewide footer links is never going to cause a problem (in fact it's a basic web design and usability practice). It's when you have 50-60 sitewide footer links with keyword-rich anchor text that it *might* be a flag.
Even so I think a lot of these kinds of things are messaged by G at big conferences to get people scared and stop using them voluntarily - so many sites get away with this kind of stuff without any discernible problems.
My company just recently added GZIP compression to increase load speed. Not only is this good from the user's point of view, but it will potentially have a positive effect on rankings. It's difficult to understand the actual effect, though, when so many other things change on the site that could affect rankings.
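If anyone wants to verify that compression is actually being served, here's a minimal sketch (Python, standard library; the URL is a placeholder) that asks for gzip and reports on-the-wire versus uncompressed size:

```python
# Request a page while advertising gzip support, then compare the compressed
# transfer size with the decompressed size. urllib does not decode gzip for
# us, so the raw body is still compressed if the server applied it.
import gzip
from urllib.request import Request, urlopen

def gzip_report(url):
    request = Request(url, headers={"Accept-Encoding": "gzip"})
    with urlopen(request) as response:
        body = response.read()
        if response.headers.get("Content-Encoding") == "gzip":
            return (f"gzip enabled: {len(body)} bytes on the wire, "
                    f"{len(gzip.decompress(body))} bytes uncompressed")
    return "gzip not applied to this response"

print(gzip_report("https://www.example.com/"))
```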
Note that SEOmoz has still not removed its footer links after Rand's post in 2008.
Good one, although in the case of SEOmoz, those 'category' footer links point to categories where relevant subcategories are present.
From a usability point of view, those types of footer links are just handy to use, especially on large sites (such as this one) with a whole bunch of categories. Because the word 'categories' is above the footer links, it makes sense to list them there, sitewide. I'm therefore quite sure it's not just a matter of 'footer links with keyword terms', but the way they're used. Google / Yahoo! are way too smart to just penalize on 'flat / single metrics' per ranking factor.
I as well find this to be a little troubling from a slightly different angle.
I don't really use many footer links, but what I do use is CSS positioning on horizontal navigation menus.
I'm speculating about the effect of the CSS positioning tweak that places a horizontal navigation menu lower in the source code, so your first 100 words of text are at the top of the page.
If you slap the menu (composed of a bunch of navigational links with anchor text) all the way down in the source code... it seems that it may be considered to be spammy footer links?
Just something I thought I would throw in here.
And so my solution (anyone feel free to jump in) is to somehow stuff that menu somewhere near the bottom-middle of the page so we don't get shafted.
Just curious. Have you seen positive results with moving the nav bar menu below the first 100 words of text?
I have seen some positive results, though along with doing a lot of other things as well. I have not isolated it on its own and then tested. It's just something I do along with other things.
To be honest with the community on here, I have not given it the rigorous testing that could give a good, firm answer about its effect on its own. Perhaps someone else has?
The only reason I'd consider moving text above the Top Nav Bar would be to improve User Engagement... Perhaps to promote relevance or provide helpful info.
Top Nav or other primary navigation tools are there to help users once they get to your site. I'd rather sacrifice a tad less traffic than to risk loss of engagement and conversion.
But good question you asked. I'm curious, too, if others have seen positive results moving the Top Nav bar down.
Hi Cathy,
Actually the top nav bar stays in the same location visually.
It is only in the source code that it is moved down, and then repositioned in the CSS to be at the top.
So visually the site has its navbar on top... but when parsed, say through something like seo-browser or whatever, the text is the first to show and not the menu.
Gotcha! I like the idea of placing the code below the 1st text paragraph, especially when adding a bunch of rollover images for some navigation bars. Thanks,
To remove any doubt about your CSS menu coding being blamed for anchor link spamming, throw the code in an externally linked CSS file. No one gets shafted and you'll sleep better at night. :)
What about adding keyword-targeted anchor text leading back to the Homepage? E.g. Return to Cheap mp3 players?
Is that acceptable? Should it be varied? Should it be limited to generic 'About Us, Contact Us, Mission Statement' type links?
As a reply to a few of these, I'd just say that Google's stance is not "never do these things" or "these are always bad" but that sites who tend to do these kinds of things (the internal anchor text links, the multiple domains in a niche, the blocking of archive.org, etc.) tend to be spam in other ways (they get manipulative links, they're not beloved by users, etc).
Don't be scared that you can't engage in any of these practices, just be aware that the more things like this you do, the more you begin to look "fishy" and fit into that "correlated with spam" bucket.
I doubt there's much to worry about if you use "nofollow" properly. Same goes for blogrolls or link lists in Wordpress. My site got a -20 penalty when I included a list of the 15-20 or so social networking accounts I use. Once I nofollowed the links, my site went back to Google page 1 for itself rather quickly.
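If anyone needs to retrofit an existing blogroll or profile list, here's a minimal sketch (Python; it assumes the third-party BeautifulSoup library and uses a made-up link list) of bulk-adding rel="nofollow":

```python
# Add rel="nofollow" to every link in an HTML fragment such as a blogroll or
# a list of social-profile links. Requires the bs4 (BeautifulSoup) package.
from bs4 import BeautifulSoup

BLOGROLL_HTML = """
<ul>
  <li><a href="https://twitter.com/example">Twitter</a></li>
  <li><a href="https://www.facebook.com/example">Facebook</a></li>
</ul>
"""

soup = BeautifulSoup(BLOGROLL_HTML, "html.parser")
for link in soup.find_all("a", href=True):
    link["rel"] = "nofollow"   # tell engines not to pass link equity

print(soup)
```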
If your FOOTER LINKS are for additional information that a user would naturally be interested in following, then please use them. It's always helpful to reinforce linking to your site's most popular pages (Top Pages). That's like giving your site visitors a quick link to the pages most visitors like the most.
Regarding your concerns about deep linking: the bottom line is to consider how your visitors would find those pages otherwise. If you are helping people find additional relevant stuff faster - good for you (and good for them).
The comment about having multiple sites targeting the same niche being seen as spammy is troubling. Why would that be the case? I happen to have multiple sites in the same niche for a couple of different niches, with each site having a specific function within that niche (directory site, forum, news site, etc.). I would hate for those sites to be penalized as being spammy.
Thanks for the recap. I am going to share this with my team now that we are back in the SEO.com offices. Can't wait to see your post about your presentation on why not to buy links... It was a great presentation and I would like to share that with my team as well.
I'm practically weeping with joy - someone finally confirmed that a fast-loading site can help with rankings.
THANK YOU THANK YOU THANK YOU Matt Cutts. All the mean stuff I've said is now erased. Er. Wait. Is this thing on?...
Actually, what he said is that, maybe, in the future it might help with rankings. This actually confirms that at the moment, and in the past, it does not / has not.
Sorry.
I believe that technically, the previous line given by Google was:
"While slow load times cannot hurt the ranking for a web page or site, the reverse is not necessarily true."
That suggests to me that it can be a positive ranking factor. Now Google's saying that in 2010, it may indeed make its way into the algo as a more direct ranking signal in both directions.
Umm...something more to think about this coming weekend...sigh.
Well, at least I know now that what I've always "suspected" is true, eh!
WooT! speed helps!
:-)Jim
Ya, he seems to have problems with just stating facts. I swear sometimes it's like listening to Dick Cheney talk. He was pretty straightforward this time though, whuh? I mean for him.
You see his vid over @ WPN? he kinda seems to enjoy toying with the truth. Is he like that in person?
Awesome post. Totally baffled as to why blocking the archive could be viewed as spam. I block it because I don't want it polling the server. No other reason.
Seems like a good way to add to global warming.
It seemed to me that it wasn't the simple act of blocking archive alone that was viewed as spam. But that doing that in conjunction with a few other things made it seem spammy and would cause his team to take a second look.
I agree with you; Google will not penalize websites on a single factor, but in conjunction with others.
However, I think web pages (not websites) might be penalized on a single factor. What do you think?
I like the idea of ranking on page load time. I can't tell you how many times I've gone to a page that took too long to load and hit the back button. This would help keep the web fast, and more and more people want answers quicker and quicker.
Regarding:
"Having Multiple Sites Targeting Subsections of the Same Niche can be Indicative of Spam"
How many is "too many" ?
I own 2 sites of the same niche. Each site caters to an even more specific part of the niche, and only 10% of each site overlaps with the other with regards to the percentage of the niche covered.
These sites are NOT spam, and they are almost 2 years old with quality content on them.
Anyone care to comment?
One simple question. How does Google diagnose what and where a footer is? Because it is the last piece(s) of code in the HTML? Seriously, that would only require moving the code higher up, right? Now no more footer links????? Confused.
I think the niche-business piece is complicated, and it would be hard to penalize sites for it unless they're doing something else black hat.
I did hear about the site speed factor, and I believe Google is providing a tool to ensure your site is performing well.
the link is : https://code.google.com/speed/page-speed/
Thanks ali
Having Multiple Sites Targeting Subsections of the Same Niche can be Indicative of Spam.
Does this imply having multiple sites in the same niche, hosted in different countries and in different languages, is also indicative of spam?
I think the SE also looks at who owns the domain names.
Interesting post once again.
"In a nutshell - while slow page load times won't negatively impact your rankings, fast load times may have a positive effect."
Even if a website's rankings are not directly influenced, it's logical that people will stay longer on a fast site. It will have a positive influence on the user experience. When that happens (and Google is able to see that), it will gain you rank, since the user experience will improve.
"Internal Anchor Text (particularly in footers) is Troubling" - I am really interested in what kind of influence this will have on sites like Wikipedia, which has scripts to automatically link to internal pages (although Google will probably exclude them from this manually).
Placing a bucketload of links in the footer has been bad news for a long time now, so that is not really a new insight as such.
I have been following your updates on Pubcon. Is there an industry standard for website speed or load time that one can measure against?
Any feedback from your session against link buying, Rand?
Regards
Neil
I just removed the archive.org block:) Thanks!
The times are changing; even being away doing offline activities for a week can mean missing out on major changes. Google rules our lives :)
Beck @ https://profitseo.com
As normal - great post with fantastic info for us SEOers!
I think the website speed ranking factor has been on the cards for a while; when the recent Caffeine update was introduced, they mentioned that they want the internet to be faster - the first good indication that speed would be important at some point!
Also, the footer links issue is interesting - after a post here on SEOmoz, and you guys finding that fewer than 25 links in the footer is best practice, I would be interested to see what you guys say now!
Thanks,
Dave
Now this is a great post. I never thought that having multiple sites targeting roughly the same niche would raise a spam alarm...
So if I have one site that targets a keyword and another site that targets a synonym of the keyword, it would raise an alarm then...
Google also tracks IPs in the same Class C block.
I'm not sure where I stand on the use of sites like archive.org as a way of increasing a site's level of trust. Look at DMOZ and the way it's gone - could happen to anyone.
Very useful stuff for those people and webmasters who are insanely pouring in links and buying links to get optimized in search engines, especially Google.
I liked that Matt Cutts mentioned that if a site has been banned, it can be better to register a new one and work on it instead of reconstructing the same site.
Overall, the article is worthwhile.
If your competitors get a higher rank from fast loading page, a slow page would have a negative effect.
Rand, if a domain is to be penalized because of a "spammy" link profile, then that would mean one can hurt sites just by linking to them.
Do you think that one can hurt sites by linking to them, especially if the domain is new?
Why? It just makes good sense to have different sites with different info about different products or different aspects of the same industry (like wholesale and retail, for example).
Consider a coat manufacturer: a site for fur coats, a site for leather coats, a site to target teens, a site that targets wholesalers, etc.
(I chose coats because I haven't ever done a site about coats).
I agree - I have seen many reputable businesses with multiple sites for different areas of the same business.
I think, actually, I agree with Google on this one. For most reputable brands and web companies, a single site with their brand name (or a few that represent a few different lines) is much more common. For spammers and manipulators, having 50-100 domains, all targeting very similar foci in a niche is common.
There's not a ton of crossover in my experience (big, quality brands with dozens or hundreds of little niche-targeted sites). Thus, Google can potentially use this as a quality signal. It's not that you should never do it, it's just that it is highly correlated with manipulative behavior.
"Having Multiple Sites Targeting Subsections of the Same Niche" is more about businesses that would have a whole bunch of domains with a similar foci all directed to the same niche market?
That must be different than having a few additional URLs that are created to draw new relevant prospects (especially those who do not know your brand), who are using a very specific keyword search phrase to find a specific product or service that a company offers... right? The content per domain is unique, additional information focused on a different product or service developed especially for that customer market.
No problems with my continuing along that strategy?
What is a paid link? Is a directory listing that you pay for a paid link?
Thanks.
Depends on the listing... and the directory. If you always get included and there is no moderation, then Google will not like it and will see it as paid links.
If there is moderation and you can be refused (even though you paid money), then you're paying a fee to have a website reviewed. It is slightly different, and Google sees it that way.
Interesting summary.
Thanks,
S
I don't see any correlation between page load time and the relevancy of a site. This seems like quite a reach.
"slow page load times won't negatively impact your rankings"
Contrary to that statement, I saw Google results for my site temporarily fall back 2 pages when a db misconfiguration made page loads take ~8-10 seconds. This is very anecdotal, but the results jumped back up to their old spots the next day, after I had fixed the bug and no content had changed on the site.
The longer a website takes to load, the less likely a visitor will wait for the page to completely load.
The longer a website (and internal pages) take to load, the less likely a visitor will return to the site.
These two alone would result in higher bounce rates and fewer return visits. In that case, I'd be less concerned about ranking relevance and more concerned about customer conversion.
My perspective: Long load times might not really be an issue, but only if the wait was totally worth it, if the website was just a hobby, and I had a lucrative side business...
Very important news. Google's rules change very fast. If we don't follow Google, we will crash.