I've recently been thinking more about some ranking signals that I dismissed in the past (perhaps foolishly). Some of these the engines have previously disavowed, while others don't get the attention or discussion they potentially deserve. My list includes:
- Mentions of a domain / brand name - particularly in sources that the engine has classified as "news." I suspect we'd find a reasonable correlation and probably plenty of examples of domains that begin ranking once they earn these mentions.
- Nofollow links from trusted sources - by running a bit of analysis across the domains on the web, engines could see, quite simply, who links to very good pages/domains and with what level of consistency. From there, it's an easy step to simply "count" those nofollowed links as followed or treat them similarly to the mentions above. This metric already gets a lot of attention, and our correlation data, at least, suggests that a high number of nofollowed links/linking root domains does correlate with better rankings.
- LinkedIn + Twitter profile links - since these sites (and likely others like them) are used primarily by real humans, most of whom can't afford to have a spammy site seen by potential employers/networkers, these links are likely golden for search engine uses.
- Traffic patterns via aggregated Google Analytics data - if the search quality team received a list of domains that sent/received traffic and the relative quantity levels, I suspect they could put this to use as a methodology to sort the spam from the real sites (spam tends not to send out traffic, nor receive it from a diverse range of good sites). It would also be an incredibly tough metric to game - how do you draw down lots of referral traffic from many unique high value sites (directly - most ads would get filtered) without actually being interesting and worth visiting? (See the sketch after this list for the kind of diversity scoring I mean.)
- Mobile visits, check-ins and interaction - Though still tough to determine/track compared to some other metrics, I'm thinking that a local business or relevant website only gets clicks and interactivity from mobile browsers/devices if it's highly relevant and useful. This could be another solid way to filter spam and get data for local/maps types of rankings (presuming the engines had access to the data at scale... can you say Android/Windows Mobile?) :-)
- Links and references in Gmail - Again, it's unlikely Google's actually reading our email, but certainly the search quality team could get a list of the number and diversity of references to sites used in email (much the same way Gmail delivers "personalized" ads based on the content of emails)
- Content that garners comments/UGC - if real people are actively participating on a site around unique content, I'd wager that content is likely the type engines would want to rank. Things like comment RSS feeds, trackbacks and content uniqueness analysis could all be leveraged to help sort.
- Rich media present on site and around the web - Spammers don't make a lot of unique graphics, images and photos. Likewise, they don't film original video, post podcasts, build Flash elements, or upload Excel spreadsheets, graphically heavy PDFs, or the like. Real websites run by real people and businesses do. Since the engines already have the indexing and segmentation capacity, there's nothing to stop them from examining the data as a quality signal.
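To make the traffic-patterns idea above a bit more concrete, here's a minimal sketch of the sort of referral-diversity scoring I have in mind - the domains, traffic numbers, trusted list, and formula are all invented for illustration; I have no idea how (or whether) the engines compute anything like this:

```python
# Minimal sketch: score domains by the diversity of good referrers sending them traffic.
# The referrer data, the "trusted" list, and the scoring formula are invented examples.
from math import log

TRUSTED_REFERRERS = {"nytimes.com", "wikipedia.org", "seomoz.org", "bbc.co.uk"}

# domain -> {referring domain: visits received from it}
referral_traffic = {
    "realblog.example": {"nytimes.com": 120, "seomoz.org": 300, "randomforum.net": 45},
    "spamsite.example": {"spamdirectory.biz": 9000},
}

def diversity_score(referrers):
    """More unique referrers, and more trusted ones, -> higher score.
    Raw visit counts are dampened with a log so volume alone can't win."""
    score = 0.0
    for domain, visits in referrers.items():
        weight = 3.0 if domain in TRUSTED_REFERRERS else 1.0
        score += weight * log(1 + visits)
    return score * len(referrers)  # reward breadth of sources, not just depth

for site, refs in referral_traffic.items():
    print(f"{site}: {diversity_score(refs):.1f}")
```

The point of the toy formula is simply that a spam site fed by one low-quality source scores far below a site with modest traffic from many unrelated, trusted sources.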
I'm not saying that Google/Bing are definitely using these, but I'd suspect that all of them have practical applications in improving search quality and relevancy. And, by running correlations and analysis of these datapoints ourselves (where possible), we may be able to learn more about what makes a site "look natural" and rank-worthy to the engines, particularly since so much of my email and our Q+A seems to be worried about false positives of late.
I'm curious - any other factors you think fit this pattern/system?
Great, the whole nofollow vs follow link debate will continue in 2011. What a difficult and most unclear tag it's turning out to be.
You should realize that the SEO industry doesn't like to let anything slip into oblivion. There are still discussions about the importance of the meta keyword tag. :)
I've been insisting on this one for a while: nofollow links from trusted sources.
Rich media present on site and around the web - I am not sure about this one. Many, many spammers have started using videos, galleries, etc. It doesn't take a lot now to rip videos and re-upload them to your own accounts, and the same with images. There are massive content farms that are full of these - you just need to wade around promoted links on Imgur to see them...
Rand's point was "unique graphics, images and photos".
Using the algo to determine "uniqueness" of rich media isn't great yet - it's on the horizon, but not strong enough IMHO to factor in yet. Maybe in 6 months' time...
Nope, it's not there - have a look: they are mostly all the same bloody image, just at different resolutions or with different file names. They need some of that image de-duplication software run over their collection a bit more frequently.
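For what it's worth, here's a rough sketch of how that kind of de-duplication could work using perceptual hashing - it assumes the third-party Pillow and imagehash packages, and the folder name and threshold are made up; just an illustration, not what any engine actually runs:

```python
# Rough sketch: flag near-duplicate images with a perceptual hash.
# Assumes Pillow and the third-party "imagehash" package are installed;
# the folder path and distance threshold are made-up examples.
from pathlib import Path

from PIL import Image
import imagehash

THRESHOLD = 5  # max Hamming distance to call two images "the same"

def find_near_duplicates(folder):
    hashes = {}       # phash -> first file seen with that hash
    duplicates = []   # (copy, original) pairs
    for path in Path(folder).glob("*.jpg"):
        h = imagehash.phash(Image.open(path))
        for seen_hash, original in hashes.items():
            if h - seen_hash <= THRESHOLD:  # imagehash overloads "-" as Hamming distance
                duplicates.append((path.name, original))
                break
        else:
            hashes[h] = path.name
    return duplicates

if __name__ == "__main__":
    for copy_name, original in find_near_duplicates("hairstyle_images"):
        print(f"{copy_name} looks like a re-upload of {original}")
```

Resizing or renaming an image barely changes its perceptual hash, which is why this catches the "same bloody image at different resolutions" case.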
I created purehairstyle.com to test image spam & image seo.
Step 1 harvest images from google serps via Outwit images plugin
Step 2 throw them into wordpress
Step 3 throw in some AdSense
Step 4 forget about it, except when a small AdSense cheque arrives a few times a year. It was doing 300 visits a day at its height.
I did this manually but it's easy to automate. Just drag a list of keywords out of "hot trends", automate the image extraction and upload, and auto-SEO your page using your initially extracted keyword.
Use stuff like skimlinks etc etc etc
This was just a little test for me, so I imagine the folks that do this for a living are a damn sight better at it and a hell of a lot more effective
Yep - quite easy to achieve - I've done that multiple times to see if it works, and have managed to get images ranking in the universal SERPs for loads of KWs, without owning the original image...
Ironically, I considered going a high value route, taking photos and making it look great but abandoned that because I knew that the pics would get ripped off immediately ;)
Funny old world
Hi Stephen, the idea of checking image duplication is a good one.
I use this website for checking image duplication.
Hope it will be useful for you and the other YOUmoz family members here.
Thanks for your comment and the info about that website, "tineye".
Actually, I was looking for a website like this. It is very useful for image duplication checking. Thanks a lot.
Thanks for your response. I use Copyscape for content plagiarism checking.
Hey Rishil, totally agree. I always say you should not even bother looking into follow or nofollow at all! The same goes for PR. Just make sure the link source is OK. It makes life that much easier, you know :)
re: the bounce rate, I do believe it is a highly usable signal, when combined with other signals. Bear in mind that Google has 3 ways to get at this:
1. Google Toolbar
2. Google Analytics
3. How quickly you come back to the search results
They can then filter this against the type of query and the behavior of other sites shown in the same SERPs.
For example, if you are searching for some information that is simply "lookup oriented", nearly all the results in the SERP would show a single page view, and the one that had the LEAST time on page would be the best.
Even if this would work only for specific types of queries, it is still something that Google (and Bing) could use just for those queries. Google already heavily segments algorithm behavior by query type, so this is not a new concept for them.
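To make that concrete, here's a minimal sketch of how a query-segmented "quick return to the SERP" rate might be tallied from click logs - the log format, field names, and 10-second threshold are all invented for illustration; nobody outside the engines knows how (or if) they actually compute this:

```python
# Rough sketch of a "pogo-sticking" style signal, segmented by query type.
# The log format, field names, and the 10-second threshold are all invented
# for illustration; this is not how any engine actually stores click data.
from collections import defaultdict

SHORT_CLICK_SECONDS = 10  # returning to the SERP faster than this counts against the result

def short_click_rates(click_log):
    """click_log: iterable of dicts like
    {"query_type": "lookup", "url": "a.com", "seconds_until_return": 4 or None}
    Returns {query_type: {url: fraction of clicks that bounced back quickly}}."""
    clicks = defaultdict(lambda: defaultdict(lambda: [0, 0]))  # [short_clicks, total_clicks]
    for row in click_log:
        counts = clicks[row["query_type"]][row["url"]]
        counts[1] += 1
        returned = row["seconds_until_return"]
        if returned is not None and returned < SHORT_CLICK_SECONDS:
            counts[0] += 1
    return {
        qtype: {url: short / total for url, (short, total) in urls.items()}
        for qtype, urls in clicks.items()
    }

# A page is only judged against other results for the *same* kind of query,
# so a quick return on a "lookup" query is not automatically treated as a failure.
sample = [
    {"query_type": "lookup", "url": "a.com", "seconds_until_return": 4},
    {"query_type": "lookup", "url": "a.com", "seconds_until_return": None},
    {"query_type": "research", "url": "b.com", "seconds_until_return": 3},
]
print(short_click_rates(sample))
```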
Hey Eric - I wonder if bounce rate can be used as a negative signal, but rarely a positive one. I.e. a top-ranking site whose bounce rate increases sees a drop in rank, but a lower-ranking site with a low bounce rate doesn't benefit. This type of metric might serve to "protect" users from sites that suddenly appear owing to spamming activity, where the result is not as relevant to the user's search intent. It might stop SEOs artificially enhancing rankings with false bounce/CTR signals. Then again, it could be pretty easy to take a site down a few positions with this concept :-)
I think all social ranking signals boil down to two things: become popular and maintain a positive profile. The more you are talked about on social media, the better it is for your business. Google has the ability to do sentiment analysis, as is evident from the way it suddenly tweaked its algorithm and downranked businesses like 'Decor My Eyes' with a lot of negative mentions. I think the citation source and its position are going to play a bigger role in the near future. You may get more weight in rankings depending on whether you are mentioned in the New York Times or on some no-name website, and on whether you are mentioned in the main body of the page or in the comments section. Spammers will abuse the comments section, so I don't think such mentions will get much weight in the near future.
If you look at one of my favorite websites, socialmention.com, you can get a rough idea of how Google may be using social ranking signals in its algorithm. It uses some unusual social media metrics like:
1. Strength - how strong your social media presence is. It is calculated as: brand mentions (within the last 24 hours) / total possible mentions.
2. Sentiment - the ratio of positive to negative mentions.
3. Passion - how frequently you are mentioned by the same sources. If someone talks about you all the time, they are passionate about your brand.
4. Reach - your range of influence. It is calculated as: number of unique sources referencing your brand / total number of mentions of your brand.
5. Top users - who is following you? If top influencers are, consider it a strong signal.
And there are many more metrics - check the website.
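Just to illustrate, here's a rough sketch of how a few of these could be computed from a list of mentions - the sample data and the "total possible mentions" ceiling are made up, and socialmention's real formulas may well differ:

```python
# Rough sketch of the socialmention.com-style metrics described above.
# The mention records and the "total possible mentions" figure are invented
# for illustration; the real formulas may differ in detail.
from datetime import datetime, timedelta

mentions = [
    {"source": "twitter.com/alice", "sentiment": "positive", "time": datetime(2011, 1, 3, 9, 0)},
    {"source": "twitter.com/alice", "sentiment": "positive", "time": datetime(2011, 1, 3, 11, 0)},
    {"source": "blog.example.com",  "sentiment": "negative", "time": datetime(2011, 1, 2, 15, 0)},
]
TOTAL_POSSIBLE_MENTIONS = 100  # hypothetical ceiling used by the "strength" ratio
now = datetime(2011, 1, 3, 12, 0)

recent = [m for m in mentions if now - m["time"] <= timedelta(hours=24)]
strength = len(recent) / TOTAL_POSSIBLE_MENTIONS

positive = sum(1 for m in mentions if m["sentiment"] == "positive")
negative = sum(1 for m in mentions if m["sentiment"] == "negative")
sentiment_ratio = positive / max(negative, 1)  # avoid dividing by zero

unique_sources = {m["source"] for m in mentions}
reach = len(unique_sources) / len(mentions)
passion = len(mentions) / len(unique_sources)  # mentions per source: higher = more repeat fans

print(f"strength={strength:.2f} sentiment={sentiment_ratio:.1f} reach={reach:.2f} passion={passion:.1f}")
```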
If you are into social media analytics or influencer tracking, you may like to consider the aforesaid metrics to measure the impact of your social media marketing campaigns. Sysomos MAP is one of the products from sysomos.com and is another great tool for media analysis. Happy analyzing :)
Was that a comment or a Youmoz post?
Hehehe... a comment a la Seo-Himanshu.
But everybody has their own style, no?
lol, sometimes I think I missed a guest post opportunity.
And what about the power of the social media profile itself, attained by gathering a huge number of powerful followers, friends, etc.? I recently read a comment about the profile strength achieved by the Twitter accounts of websites like Mashable.
Mobile visits are going to skyrocket in 2011. Everyone should have a mobile version of their site ready.
I would like to throw out the idea that maybe geo-IP-location information might form part of some signal - knowing the IP addresses of, say, colleges and universities and how they are interacting with content. A piece of content about some new technology viewed a lot from MIT could be seen as a strong signal that it's interesting and deserves promotion in the SERPs...
Great, I like it.
The SEOmoz readers are really awesome. Here at SEOmoz you don't just get information by reading the post; the comments by the users are often just as informative. This is the reason why I love SEOmoz: the community of people here is really helpful and provides useful feedback. The healthy discussions by users on most of the posts are really entertaining and provide great value to the SEO community.
@Links and references in Gmail
I can well believe they might, but... consider how they told Steven Levitt they would never release search data to the World Health Organization, even if that data could help contain disease outbreaks and save countless lives, because it impinged on their users' privacy. So they'll never admit to using data in Gmail.
When talking about inbound links, I think the trend is (and increasingly will be) to favor the trust of the site linking out. In this sense I agree with you, Rand, as you point to this "value" in almost every point.
As a consequence, I suspect that sites which link out to other trusted sources are (and will be) considered less potentially spammy.
That means the "nofollow" attribute is probably losing its relevance when it comes to rankings. Probably (and this is just my suggestion) the weight of PageRank is diminishing as Trust and SocialRank gain relevance. That would seem logical, given how varied citation sources and patterns are now.
Finally, I'll launch an idea that maybe misses the point entirely but could lead to better ones: what about "Mass Citations"? By that I mean when a post, for instance, is copied without any reference to the original. Search engines can surely recognize the original source even without a link back to it (the signals pointing to that source are usually more numerous and come from more trusted sources). This ability could be put to use.
I think Mass Citations, as you put it, covers retweets and social shares as well.
My robots.txt file, for example, ranks for "I wrote a letter to google" because it got retweeted loads of times with that message on Twitter - it is also the title tag that Google automatically gave the file... https://www.google.co.uk/search?q=I+wrote+a+letter+to+google (explicitly.me/robots.txt)
Yes, I remember that post and that tweet (I was one of the retweeters)... and followed up on your observations. It's probably also because of them that I have this idea tickling me.
This needs to be tested as you could also rank for that phrase due to inbound anchor text such as the one from this page: https://www.talksy.com/72828/i-wrote-a-letter-to-google
Very interesting Rishil, shows the power of retweets =) still ranking strong too
As the internet is a network of networks, in my view mentions, links, email accounts on your own domain, etc., present anywhere in the cosmos of the web will surely pass on the necessary link juice, as long as the content is readable by the spiders and can be indexed when the servers are crawled.
I know many might have a good laugh at this comment, but the intention of making your content look natural shall surely get back its rank-worthiness - call it your SEO karma coming back to you.
Just this morning I read the statement made by Matt Cutts: '…The objective is not to "make your links appear natural"; the objective is that your links are natural.'
All I can say is Ahh, Matt Cutts - how I love thee, let me count the ways...
I kinda like going against current trends in my SEO strategies. When everyone stops pursuing nofollow links, I start pursuing them (along with dofollow links, obviously). When the competition is moving to other parts of SEO, it's so much easier to get.
The same goes for mentions in news sources. I am not too concerned about whether a mention is in an <a href> tag - if the reference just says domain.com, it can't be hard for search engines to see that this could be treated as a link as well.
Most importantly, I think we all become better SEOs by paying attention to a broad set of (potential) ranking factors. It makes us focus on creating better websites rather than doing silly non-value-adding activities like PageRank sculpting (boy, how many hours did we all spend on that two years ago).
Happy holidays!
Thomas
I like your strategy: go where the competition is leaving. Very interesting indeed. Thanks for the post. Here is a look at my Kona coffee site. It has a long way to go. I am still trying to find a solution for shipping rates while trying to avoid 3rd party calculators.
Thanks for the interesting post Rand.
Bounce Rate from Google Analytics is a minor signal.
I have noticed with many of my accounts that "nofollow" links from authoritative sources do have positive implications, particularly if the link is in a prominent position on the page.
I disagree about the bounce rate. I don't think they could use it as a ranking signal. Bounce rate is recorded only when someone visits your site and then doesn't visit any other pages. There are countless examples out there of quality content that's all on one page. So, someone searches for information, finds the perfect content on the page they click through to, and doesn't need to look any further. A bounce will be recorded in Analytics for this, but if Google's returning the perfect page for the searcher's query, then it's actually a testament to the quality of the page.
Many people think bounce rate has something to do with time on page. It doesn't. It's simply the case where only one page of the site was visited in any given session.
See this comment for more info: https://www.seomoz.org/ugc/how-a-site-redesign-increased-traffic-by-515#jtc84706
Interesting comment about bounce rate; I had never looked that up. I think you are right in that our "bounce rate" number in Analytics cannot be used by the SEs.
But I do agree about "bounces" in another format as a signal about quality. Not the bounce rate as in Analytics, but rather the traffic patterns of a user clicking a result and within a small period of time (not going to try to guess a metric here) returning to the search results. I think that can be measured by Google and can be used to determine the quality of a site.
I have no proof of this, and someone might be able to prove me wrong, but it seems like a good indicator of quality if many users return to the search results very quickly after the initial click on a search result.
Hi Kate,
Yeah, I agree that the search engines could be using a combination of bounce rate and time on page as a quality indicator. If I was building Google, I would! Maybe there is a way to segment out the bad bounces (gone in under 2 seconds) from the not-so-bad bounces by combining bounce-rate with time-on-page in Analytics.
Hopefully someone looks into it and writes a post about it. I don't have time! ;)
Darren
A high bounce rate and low time on site could still signal a good site. Take "what is my IP address" - a site where you can quickly get your IP address. You are unlikely to go to a second page or stay more than a few seconds, yet the site was helpful. Matt Cutts said in a video that they do not use anything from Analytics, so unless they can get this from some other source, or he wasn't telling the truth, I suggest they do not use it. What I have noticed, though I have no hard proof, is that listings in local directories with address data help in local search.
Don't think of it like bounce rate on the site; think of it as user behavior in Google. If Google sees a user type in a query, visit one page, then click back within a few seconds and visit another, that first page probably didn't satisfy the information need. If, on the other hand, the user either never returns from that query or begins a totally different query, then the page probably did satisfy the user's information need.
IMO neither can be taken as a strong signal as both would be relatively easy to automatically spam, not to mention screw with competitors.
Yes I do agree with you, there is a possibility of Google considering the bounce rate as a quality metric.
But a high bounce rate does not always have a negative connotation.
In fact, if the bounce rate is high but the time on site is also high and the visitor executes the call to action of the page, then it is a positive factor.
In fact I have some statistics and analytic details of a site which was redesigned and the bounce rate went up to 61% from 28%. You can view the statistics and data on the following link:
https://blog.webpro.in/2010/12/high-bounce-rate-landing-page.html
bounce rate combined with time on site?
I would think that if you bounce back to the SE, that kind of bounce would count against you. However, I agree that good landing pages - like definitions, facts, maps, etc. - would create a bounce that would not be negative.
I'm not saying that I believe bounce rate is 100% most def. a signal, but I'm more than willing to consider it as a viable signal.
Your argument here positions this metric in a vacuum. When you consider this, or any other signal, you need to think about its relationship to other data being collected, such as number of pages indexed, time on site, time on page, unique visitors, etc.
If you have a site with a highly optimized landing page that gets the job done by itself and has an incredibly high bounce rate, rest assured, the SE would be able to factor other data in to figure that out.
On the other end of the spectrum, if the landing page has a low time on site, no goals, no events being tracked, and a low visitation level relative to the other SE results... I'd bet dollars to doughnuts that the bounce rate will also play into why the page moves down the results.
Now...is this something that is easily tested and proved? Well, I'll leave that up to Joanna.
Wouldn't Bounce Rate put together with Time on Page indicate the relevancy/quality of content?
gg
I like the point about LinkedIn potentially being an authority because of professionalism & less spam - but I wouldn't lump Twitter in there as well as the amount of spam on Twitter is horrendous. Hope Google is selective if they are using Twitter in such a way.
- Jenni
I like putting some graphics and video on my website, but not too much, because it can be too distracting to the visitors reading my blog.
Great post!!
I have a strong feeling that wikipedia links are still counted even though they are no-follow.
W
These are interesting points, and I'd be curious to see if Google Analytics will implement any of this tracking in the future, such as for mobile devices, check-ins, and mentions. Despite all the social media, I doubt Google will sway away from the most important ranking factor: quality link signals. Inevitably, the way Google ranks will keep changing and they'll keep adding to the mix, but I'd imagine the algorithm will continue to treat links as most valuable for the time being.
I have just one thing to add... maybe LinkedIn isn't filled with spam websites, but Twitter surely is... There are tons of affiliate guys who do just that.
I think it's a great idea to customize your links on LinkedIn. You can do that by just selecting the "Other" link option and customizing your anchor text.
Great post by Rand. I really like this post and the active discussion by major industry leaders - great and most valuable. I also really like what Matt Cutts says about his role as head of the web spam team.
I have recently been working with a client to improve Quality Score in their AdWords campaign, and despite many claiming the landing page is not a factor, we have had some success in increasing their QS by reducing bounces back to the SERP and driving more customers into the cart by improving the call to action.
These increases in on-page to conversion metrics have of course been helpful in all channels of acquisition, direct, referral and search including organic.
Totally anecdotally, we have also found some ranking increases at the same time. I would contend that none of the changes we made should have affected organic rankings, so I had been wondering if the decrease in bounce from organic clicks has helped with the rankings.
Another possible ranking factor that I have been trying to figure out how to test, as inspired by a successful remarketing campaign I am running for one client in AdWords, is the power of return visits. If you search for something and click on an organic link, that should be saved in your cookie for 30 days (if you don't clear cookies), and I would think that continually returning to that site directly might have some indication of page value.
Interesting that nobody brought that up before, but Google could be using Toolbar PR requests to estimate the traffic. More importantly, they could use the sequence of PR requests to let the link juice of outgoing links flow unequally. I actually have some interesting info to share about this, so if anybody is interested, drop me a DM.
If you have something interesting to say about this theory, I invite you to share it in a YOUmoz post... this is not the place for "SEO secrets", especially if the holder of the secret launches the news first :)
You are right. I submitted it to YOUmoz 10 months ago, but Imageshack messed up the images and it fell by the wayside. I'm not too motivated to submit it again now :-(
I have read somewhere, though I can't remember where, that Google uses a combination of the Toolbar and Googlebot to determine a website's speed. So the Toolbar is definitely used as a signal.
Hi Rand.
On your last point I strongly agree. I recently added a .doc file and a link to it from an internal page on an Aran Islands site. The doc is useful to my visitors and results in maybe 3 or 4 downloads daily. Rankings in the SERPs have improved considerably, but this could also be due to other ongoing improvements.
I intend to try this soon on another less active site to see if the improvements are replicated.
good comment,
We completely agree with your findings, Rand, particularly on nofollows. We have definitely seen that these send ranking signals to Google. Very excited to see what challenges Google, Bing, etc throw our way in 2011!
Yeah, all are definitely possible, so good post. I definitely think the nofollow from trusted sources is a must, and I think you're spot on the money with that one, Rand.
I have had similar thoughts on most of these , so couldn't agree more.
I've had the same suspicion about the no follow links for a while now. It's just good to hear it come out of your mouth Rand!
Good article. How about "time on site"? :D
I do not think it would really be reliable information. Just think of two cases where time on site does not really correlate with actual use of the site:
1) How many times do you open a site in a tab and then leave it open while consulting other sites in other tabs? I do it all the time, for instance keeping SEOmoz open all day long.
2) If it were just "time on site", I could open my own site, surf a little, pause, do my job using other sites, return to the tab with my site... without really interacting with it.
Maybe if it were correlated with other data (reading time, which could be calculated using existing formulas; navigation within the site...), maybe. But that data has more to do with Google Analytics, and I don't believe Google wants to use Google Analytics information as a ranking factor, especially because it would be against its own privacy policy, and we all know what problems privacy is already causing Google.
I think 'time on site' can be a strong signal. This metric, along with unique visitors and pageviews, gives useful insight into how the content is being consumed by visitors. There are data-sharing options available in Google Analytics through which Google can share your analytics data with other sites and Google services, and I don't see any reason why Google won't/can't use this data for computing rankings. Though Matt Cutts denies that - but who believes every word he says, anyway? Regarding spending time on your own site or keeping SEOmoz open all day, a session cookie (by default) automatically expires after 30 minutes of inactivity. I think website usage data plays an important role.
I just point to 80+ million Germans annoyed with Google, 70+ million French, 56+ million Italians... I mean, just look at the problems Google has had in the last 12 months with European privacy legislation; the mere suspicion that GA data (which by law could easily be argued to be essentially private) is used for rankings could open a Pandora's box between the EU and Google. #justarguing
Again, just like bounce rate. If I am looking up a definition or translating a phrase, how much time does that really take? I can buy the latest SEO book on Amazon in less than 3 minutes and do nothing else, or I can spend 20 minutes on SEOmoz reading a blog post and comments before searching for that book.
To Google, which was more relevant, all other factors being equal - the 20 minutes on SEOmoz or the 3 on Amazon?
Time on site doesn't mean the page is better or worse. We changed a page because we saw the bounce rate was high. After the change, time on site decreased as well as the bounce rate. A high time on site can mean the page is very difficult to read through, or has lots of distractions and might not give the best answer to your query. A low time on site can mean the answer is very clear and readable.
But neither of these statements is absolute.
I never thought about the links within gmail. That really does make sense that they would use that information though.
I don't agree that twitter profile pages would not contain spam. There is lots of it on twitter. I think other things would have to also be considered as to what that profile links out to.
I think as we see social media metrics and stats continually growing in popularity and impact in SERPS we will start to see the follow/nofollow metric sway. We continually discuss "natural" link profiles and the mix between follow and nofollow, but as time goes on I believe that Google will continually restore some sort of impact from nofollow.
Can I just point out that Matt Cutts has posted a video about these supposed "social signals" in the ranking algorithm. Pay close attention to how he says they are used: for real time and social search. All this garbage about social mentions and sentiments and blah blah blah is bogus. If social signals like that were used in organic web rankings, the system would be WAY too easy to game. Hello, you can already buy 100,000 Facebook fans, not one of whom is a real person.
Social networks are 100% compromised and easy to game if you just dedicate the time and resources to do so. And if the "authority" of a user becomes some kind of ranking signal, then hello spam! Those top users are going to get inundated with lots of money to tweet and mention certain websites. And why wouldn't they? Plus, wouldn't 100 mentions from 100 small users show more popularity than 1 mention from a big user?
These signals are obviously for real time and social search. You have authority then your tweets get posted on top. I wouldn't look too long at developing some full fledged social mentions/links strategy for web rankings. Google can't do that just like they can't do bounce rate or time on site as a ranking factor: too easy to spam.
With all due respect, I think you are on the wrong track. If you take Matt Cutts' word as gospel then you are taking a massive risk - his job isn't to make our jobs easier, but to add loads of misdirection :P
Check this test out - ranking purely on retweets... https://www.seomoz.org/blog/ranking-signals-hiding-beneath-the-surface#jtc129016
Social signals will be an influence, not just in real time, but in ordinary rankings.
"his job isnt to make our jobs easier, but to add loads of misdirection"
Rishil, my job isn't to make life for site owners harder or to add loads of misdirection.
My job is to ensure that our search results are of good quality--Google's highest priority has always been our users. After that, I try to help webmasters and site owners, answer questions, and dispel misconceptions. I did that new webmaster video because I wanted to highlight that since May 2010, our use of social signals had changed. But if you read the (exhaustive) article that Danny Sullivan wrote, you'll find that our use of social signals is currently quite limited.
Right - sorry Matt - I do sound like a douche in that comment - it isn't meant as a direct insult to you. I should learn to phrase this stuff better.
The point I guess I was trying to make is that neither you nor any other Google employee would give out ranking data without exposing the SERPs to significant risk, because spammers would jump on those signals - that being my point about it not being your job to make things easier for us.
Re the misdirection - that is my personal opinion and may not be fair - so on that point I apologise to you.
Rishil, I saw that robots.txt of explicitly.me. You are blessed with a good creative brain, and your blog posts are good.
I hope Google does not tolerate big brands when they buy links. I know many good sites that also invest a lot in AdWords campaigns while buying spammy links, and from what I see they are doing well in the SERPs. It REALLY frustrates me.
My fear is that Google only targets spammers who are not big brands; after all, if a brand buys links and ranks better, it is not bad for the user, since the SERP becomes more relevant. But on the other hand, it makes it difficult for those who do not buy links to compete. I hope to see big brands being penalised for buying links.
Our Twitter test from two weeks ago is still showing that the URL with 450+ tweets but 0 links is ranking ahead of the URL with 20+ linking root domains and 0 tweets. In the interview with Danny Sullivan, Google said quite literally that they're using Tweet/Share data in the web rankings, and every data point I've seen backs that up.
From the interview:
Google: Yes, we do use it (twitter references excluding the nofollowed link) as a signal. It is used as a signal in our organic and news rankings. (emphasis mine)
I'm actually really glad to see them coming straight out with this, as testing/correlation/experiments are never persuasive to all marketers, but when combined with word straight from the source, it's usually far more accepted (and SEOs need to start thinking more broadly about search, social and all organic marketing integrating, IMO).
Totally agree... our keyword for the future, if we really want to succeed and not make the "SEO is dead" fans happy, is "holistic".
Rand, I'm sure you caught it already, but Matt tweeted this and I'm sharing for the benefit of others:
@randfish in organic, primary use is e.g. realtime, not main web rankings.
You may have correlation data to support the idea that Twitter links = good rankings, but recall the experiment where someone put a gibberish word in their meta keywords tag. They then ranked in Yahoo for the gibberish word. What followed was Yahoo saying they only used the meta keywords tag in the absence of all other ranking factors - something that only happens with gibberish - and it changed how we research things in search engines.
I believe your example may be similar. You had 20 links to a page and 450 tweeted links to a different page. Perhaps there was a general lack of data coming from the big ranking factors so they used Twitter data in its absence? It's possible. All I know is if word gets out that links in tweets will boost search engine rankings, then heaven help Twitter. Spammers will take over.
I have also seen that video published by Google Webmaster Help, in which Matt Cutts confirmed the article written by Danny - that Google counts the links coming from Twitter and Facebook. Just check the link; this is the original video on YouTube: https://www.youtube.com/watch?feature=player_embedded&v=ofhwPC-5Ub4
Rishil, I love this: "If you take Matt Cutts word as gospel then you are taking a massive risk."
I think the exact opposite is true. Ignoring Matt Cutts is probably the biggest risk you can take in this industry. I've always taken his advice and implemented every tip he gives on SEO for Webmasters since I got into the industry years ago. It's funny what happens when you just listen to what the folks at Google say and do it. Instead of spending my time trying to spam or manipulate or find the silver bullet so to speak, I have followed the Gospel according to Matt Cutts in my SEO Bible and done just fine :)
Great analysis of the 'missing' factors. I would add 'time on site', but that may be a bit too obvious. Looking forward to 2011's challenges and the no-follow debates.
I still can't see how the time on site myth is being perpetuated. Can you imagine if this were true? I can see the Craigslist ad now:
"Work from home! Browse the web all day and we will PAY you! Just click on our sites in Google, stay on for a while, browse around, and then go to our competitors and do the exact opposite- so easy a caveman could do it!"
Seriously, in an industry that is getting outsourced more and more to India and Singapore, you don't think the spammers out there would love for time on site or bounce rate or other user behavior to be a ranking signal? Think of how cheap it is to pay someone halfway across the world to sit at a computer all day and do what search engines think they should to consider a site valuable and another garbage.
Instead of the "who can stuff more keywords in their meta tag" game, we would have the "who can hire more zombies to stay on their website" game. Think of SEO like this: the easier a ranking factor is to spam, the stricter Google is with it, and in some cases the less value Google bothers to give it.