2011 is here, and that means it's time for our biennial search engine ranking factors survey to be renewed. This year, we're planning something much bigger and, we hope, better. Our plan is to offer a report that provides:
- Aggregated expert opinions on the importance of factors individually and ranking influencers overall (as with 2009's version)
- Correlation numbers for the factors (matching the measured data as closely as possible, 1:1, against the questions asked of survey participants); a rough sketch of this kind of calculation appears just after this list
- "Causation" numbers on a relative scale derived from our machine learning-based ranking models (with error margins)
- A representation of the relative chunks of the algorithmic pie, sized by each area's contribution to the overall rankings
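To make the correlation bullet above concrete, here is a rough, hypothetical sketch of the kind of calculation involved (my illustration, not SEOmoz's actual pipeline): for each keyword, rank the results by a measured factor, such as linking root domains, and compute Spearman's rank correlation against ranking position. The data values below are made up.

```python
# Illustration only (not SEOmoz's pipeline): Spearman rank correlation between
# a measured factor and ranking position for one SERP. Ties are ignored for brevity.
def spearman(xs, ys):
    """Spearman rank correlation between two equal-length lists (no ties)."""
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0.0] * len(values)
        for rank, idx in enumerate(order, start=1):
            r[idx] = float(rank)
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mean = (n + 1) / 2.0  # mean of the ranks 1..n
    cov = sum((a - mean) * (b - mean) for a, b in zip(rx, ry))
    var_x = sum((a - mean) ** 2 for a in rx)
    var_y = sum((b - mean) ** 2 for b in ry)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical factor values for positions 1..5 of a single SERP.
positions = [1, 2, 3, 4, 5]
linking_root_domains = [480, 510, 120, 95, 60]
# A negative value means "more of the factor, better position"; here it prints -0.9.
print(spearman(linking_root_domains, positions))
```

In the actual study, a per-keyword number like this would presumably be averaged over many result sets; the "causation" estimates mentioned above would come from the separate machine learning-based ranking models.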
This is, obviously, a huge undertaking for SEOmoz's team, and we could use your help. First, we need your help to recruit the right experts. The form below will enable you to submit nominees:
We'll likely take somewhere between 100 and 200 participants (possibly more), so please send us your best and brightest!
In addition, we'd love your help in defining the factors we'll be measuring this year.
Rand's Current List of 224 Potential Factors
The list below represents my first stab at creating a list of datapoints to use in our correlation and ranking model analysis. Your mission (should you choose to accept it) is to add potential factors (not listed here) that we could gather and analyze in the comments below. This means they'd need to be available on the page/domain itself or fetch-able on the web through an API or other request in a scalable fashion.
In addition to adding your own ideas in the comments, please upvote your fellow mozzers if you like the ideas they're presenting. The comment with the most thumbs up at week's end will earn a special gift from the mozplex and recognition in the final report.
Obviously, not all of these will translate directly into the ranking factor ideas/concepts that survey participants vote on, but many can help inform their construction and serve as data points for comparison.
Some additional notes on our plans:
- Use only Google.com US results for this version (but plan to replicate in other geos/countries)
- Retrieve geographically agnostic results using a query structure similar to this one for barber shop and this one for ice cream
- Record the presence of ads and universal results on the page, but don't count these URLs/references in our analyses
- Segment the data in several variations (by popularity of the query according to Google's AdWords estimates and by number of words in the phrase, for example) to be provided as drill-downs from the main report; a rough sketch of this kind of segmentation follows this list
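As a small illustration of that last segmentation point (the bucket thresholds and the AdWords volume input are hypothetical; the real drill-downs may be cut differently), queries could be grouped by phrase length and estimated popularity like this:

```python
# Hypothetical segmentation sketch: bucket queries by word count and by
# AdWords-estimated search volume so results can be reported per segment.
def segment(query, adwords_volume):
    words = len(query.split())
    length_bucket = "1 word" if words == 1 else "2-3 words" if words <= 3 else "4+ words"
    if adwords_volume >= 100_000:
        popularity_bucket = "head"
    elif adwords_volume >= 1_000:
        popularity_bucket = "mid"
    else:
        popularity_bucket = "tail"
    return length_bucket, popularity_bucket

# Made-up example queries and volumes.
queries = [("barber shop", 74_000), ("ice cream", 450_000), ("best seo ranking factors 2011", 320)]
for q, vol in queries:
    print(q, "->", segment(q, vol))
```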
If you have other suggestions, feel free to comment below or use the form above. I'm looking forward to a remarkable step forward in the understanding of Google's ranking system - thanks so much for your help!
p.s. A huge thanks to our many contributors from years past, many/most of whom we'll be "nominating" for inclusion this year ourselves :-)
p.p.s. You'll likely notice lots of factors on my list that are obvious non-factors (meta keywords, for example). I'm including these only to help us show data on their impact (or lack thereof) which will hopefully assist those of us who need further evidence to help convince clients, managers, etc.
A suggestion for an improvement to the survey process: I've struggled in the past with answering questions where I didn't (honestly) have a clue (but there wasn't an option for that). If I remember correctly, previous surveys have had a scale for how big an impact a factor has, but nothing for how sure I am about my answer... I'm not certain of the best way of representing this / making it easy to answer, but I'd love to be able to at least flag those answers I'm most and least sure about.
I agree.
Great idea. That would be similar to how David Mihm did his local search ranking factors midway through last year, which I found very interesting.
So, you'd essentially create a weighting system based on degree of certainty. That could help ensure that niche experts (local, video, eCommerce, etc.) were given slightly more voice in their areas of expertise. Is it as simple as a 1-10 scale that would impact ranking factor strength but not degree of agreement?
No matter how it's implemented I really like the idea.
Thanks Will for stealing my idea ;-)
Seriously: I think a 1-10 scale as AJ suggested would be overkill. IMHO, five degrees of certainty are enough. A middle certainty should be preselected in order to save participants' time.
As I said, I don't know exactly how you'd implement it :) Agreed that five with the middle preselected might be more efficient, though that seems a bit more like a Likert scale.
I think that's an excellent idea.
Nominated.
Don't forget that we have some awesome SEOs here in the UK too, and across most of Europe as well.
Would be good to see a split of search factors based on location too.
i.e.
people in the UK think this,
people in India think this,
people in the USA think this
etc
I definitely like this idea!
May I add something to it? It would be interesting to see whether a factor correlates with website intent (i.e., localization, language, etc.)...
For example, since I'm a French link builder, not all of my links come from French websites (the most powerful ones, according to OSE, are those from well-known English websites, such as my SEOmoz profile). Since we DO NOT have an English version of our website yet (I know, we're the baker's children who have no bread), I know the juice passed from these links tends to be minimized, but I would like to know what the panelists think about it.
I agree. It would be interesting to see whether search factors vary between different locations and, if so, to what extent. I think this, coupled with what DanCristo has said below about the weighting being dynamic, would paint us an even clearer picture.
It would make sense; not to go off on too much of a tangent, but we should consider that localization has always been an important part of international/cross-cultural marketing. Despite the internet having fewer boundaries, why wouldn't it vary from country to country?
Edit: Just realised you weren't referring to what I've just said - you literally meant a varying consensus from country to country! Apologies in advance.
Great point, Shane. Something I am particularly frustrated with is Google's inability to effectively identify the difference between UK and US sites that both sit on the .com domain but use a /uk subfolder as the international SEO approach.
The main problem seems to be that every few months, Google looks at the two sites and freaks out, punishes the UK site, and then takes a month to see the error of its ways, thereby costing us a fortune in traffic and credibility!
Look forward to the new results!
Maybe how Google Places and Maps influence rankings would be another area of interest to look into.
I agree with Jackson, that would be great info!
Looking forward to it and appreciate the list of ranking factors in advance.
Offhand, I'd offer the following as possible additions to ranking factors:
Hopefully I scanned the list well enough and haven't included anything that's already been covered.
Nice call on the non-200 status codes and the microformats/RDFa. We've tried to convince a client who was building an ecommerce site to put semantic markup on it for reviews and products - to no avail. It would be nice if there was some data backing that recommendation up.
Thanks for the kind words Debi.
From experience, the non-200 response code percentage can definitely impact your rankings. Whether this is truly a ranking factor or simply a 'user experience' safeguard could be debated.
Microformats and RDFa, particularly in eCommerce, are an interesting topic. It's a bit unclear whether having them will impact rankings directly. However, they absolutely increase CTR on a listing, which could impact your rank. Perhaps Google normalizes for that impact, but I think that might prove difficult.
Whether or not they impact rank, I'm with you and would continue to lobby your eCommerce client(s) to implement them.
It would be most useful if it also included local search ranking factors
Hi Rand,
Given that you want to aggregate 200 or more opinions, I think it might be good to give the experts the option to indicate, for each answer, how sure they are about it. Participants should only vote with the highest certainty if they have done their own testing on that particular ranking factor. When you calculate the averages, you weight the votes accordingly.
In my opinion, if you don't do something like this, your survey is polluted with a lot of second-hand knowledge. After all, many people read the same blogs and go to the same conferences.
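For what it's worth, the certainty-weighted aggregation being suggested in this thread could be as simple as the sketch below (my interpretation only, with made-up scales: importance 1-10, certainty 1-5 with the middle value preselected as discussed above).

```python
# A minimal sketch of certainty-weighted averaging: each expert submits an
# importance score plus a self-reported certainty, and the aggregate is a
# weighted mean where certainty acts as the weight.
def weighted_consensus(votes):
    """votes: list of (importance, certainty) tuples, e.g. importance 1-10,
    certainty 1-5 with 3 as the preselected default."""
    total_weight = sum(certainty for _, certainty in votes)
    if total_weight == 0:
        return None
    return sum(importance * certainty for importance, certainty in votes) / total_weight

# Hypothetical votes on one factor: (importance, certainty).
print(weighted_consensus([(8, 5), (6, 3), (9, 1), (7, 3)]))  # ~7.33
```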
I would be very interested in seeing how Analytics Content Benchmarking statistics play a part (or don't at all). I'm leaning towards the idea that a site's actual success is in fact a ranking factor (e.g., remarkably low bounce rates for a non-interactive/media website, high average time on site, high user loyalty/repeat visits, etc.).
The reason being that I have been in a situation where a relatively new site performed very well in the SERPs for its primary keywords and maintained a no. 1 position, despite having relatively few links and limited internal SEO, ranking above sites like Wikipedia, Amazon, and their equivalents.
The keyword was highly competitive, in a very saturated niche. The only thing about the site that stood out was that the users were very loyal, i.e., almost all new visitors became repeat visitors, with very few instances of keywords showing 100% bounce rates. This is why I refer to the Analytics benchmarking: its usage stats are far higher than any site within its niche as well as within its size bracket (if that makes any sense at all).
Other factors I would consider: whether trigger words like "link" or "advertise here" or "useful links" have a negative impact on the site's standing.
One other thing I had an inkling about is whether Google determines whether a site could survive without being indexed or listed in its SERPs, i.e., traffic from sources other than its search engine; there is obviously a positive trend of good sites with good content receiving traffic from places other than Google, i.e., authority sites like Wikipedia, video sites, how-to sites, social media sites, and so on. I would also be very interested in knowing whether there are any bonus points in terms of SERP positions for using proper HTML formatting tags like <cite>, <abbr>, <strong>, <samp>, and so on. I once had a discussion with a fellow SEO colleague, and he had a personal belief that they do have a positive impact when used. That's all I can think of for now; will hopefully post some more later on.
Agreed.
I had an instance where a huge ranking increase occurred in correlation with a PPC campaign. I received commentary from a few individuals stating that Google would do this directly in its own best interests (which I wholeheartedly disagree with). The campaign was simply not live long enough to encourage natural linking, not for this kind of effect. The only clear changes were in site traffic and positive movements in site analytics.
I would suggest that Google Places pages could be a fairly good ranking factor to include.
i.e., having a powerful Google Places page for your business could help improve its standing.
An excellent list, though I'm missing some factors that may devalue a page, such as duplicate content (which is very common on many CMSes), spammy links, the added value of forum/comment linking, in-text vs. sitewide footer links, the added value of linking out to authority content, and a few others such as:
Server location (country specific)
Dedicated server hosting and IP
Number of searches for a brand domain
Use of on-page anchors (example.com/page.html#anchor-name) for long pages
and there are probably many others...
I would love to do duplicate content, but without a full content index of the web (Linkscape only keeps link data and some content portions about pages), this is currently non-scalable to the level we'd require. Link positioning is also something we'd need to build into the index before we could use it (checking content on ranking pages is easy, but on the thousands to millions of links that might point to a page/domain, it's not).
Search quantity for domain brand is one I love - great call. Also like use of on-page anchors - both of those are definitely going in. Thanks!
Perhaps test with two similar versions of the same page using the AdWords site optimizer.
Looking forward to the new results!
I especially like the fact that you include obvious non-factors. That indeed creates an important, trustworthy source we can use to convince clients who hold misconceptions.
My list of potential ranking factors:
1) CTR of the organic search result
2) website usage data (visits, pageviews, avg. time on site, bounce rate etc)
3) Searches for branded terms on Twitter, Facebook, and LinkedIn
4) Searches for branded terms on mobile devices
5) Site performance like speed
6) Traffic from social media sites including Facebook
7) Keyword CTR
8) Search on Google TV (will have considerable impact in the coming years)
9) UGC (user-generated content) like comments
10) Sentiment analysis (the more positively you are talked about on the net, the better)
11) Citation source, relevance, geolocation, and position (you may get more weight in rankings depending upon whether you are mentioned in the main body of the page or in the comments section)
12) Social ranking signals like:
Strength: how strong your social media presence is
Sentiment: the ratio of positive to negative sentiment
Reach: your range of influence
Passion: how frequently you are mentioned by the same source
Followers: who is following you? If top influencers follow you, consider it a strong signal.
It only allowed me to like this once, so I thought a reply "Like" was warranted.
Hi Himanshu - these are awesome, but I think, unfortunately, none of them fit the criteria of being fetch-able for automated analysis. Unless we had access to data from the sites' analytics accounts and more access to social/link data that's currently not provided by Facebook/Twitter's APIs (or collected by other third party services), these are all fascinating, but impossible to study.
If I'm wrong, and you know of a way to get this data, please let us know! It would be awesome to include.
Socialmention.com is the closest I can get to understanding how social ranking signals might be used by Google in its algorithm. It uses a variety of social media metrics (strength, sentiment, passion, reach, avg. minutes per mention, last mention, number of unique authors, etc.). It can do near-real-time sentiment analysis and advanced citation segmentation (blogs, microblogs, bookmarks, comments, etc.). You can download its data into Excel, and it also provides an API: https://www.socialmention.com/api/ When Social Mention can do all this, you can just imagine what Google is capable of. Here is the citation analysis report for seomoz: https://socialmention.com/search?q=seomoz&t=all
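For anyone who wants to experiment, here is a minimal sketch of pulling those aggregate scores programmatically. The endpoint, the parameter names (q, t, f) and the response fields (strength, sentiment, reach, passion) are assumptions based on the public search UI linked above; check the API documentation before relying on them.

```python
# Rough sketch: fetch Social Mention's aggregate metrics for a query so they
# could be correlated against rankings. Endpoint and field names are assumed.
import json
import urllib.parse
import urllib.request

def fetch_social_metrics(query):
    params = urllib.parse.urlencode({"q": query, "t": "all", "f": "json"})
    url = "https://socialmention.com/search?" + params  # assumed JSON endpoint
    with urllib.request.urlopen(url, timeout=30) as resp:
        data = json.loads(resp.read().decode("utf-8"))
    # Keep only the aggregate scores we might want to study (assumed keys).
    return {k: data.get(k) for k in ("strength", "sentiment", "reach", "passion")}

if __name__ == "__main__":
    print(fetch_social_metrics("seomoz"))
```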
Yeah - I saw similar stuff from Klout on social factors, but I worried a bit about using another third party's analysis of these items when they're not focused on the same goals/outcome as Google/Bing. Still, you're right - it would be good to at least check the correlation of these where applicable.
Any way to get any of the first 8 factors you mentioned? (excluding speed, which we can grab)
This might not be a perfect match, but you might be able to do something via BackType or Bit.ly. And while you might not be able to get traffic numbers from certain social sites, you could determine the number of Facebook fans (should a page exist) and the number of Likes via the Open Graph.
The latter is super easy for individual pages. It's a bit more complicated for domains.
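A rough sketch of the "Likes via the Open Graph" idea for a single page follows. The ?ids= URL lookup and the "shares" field reflect how the Graph API behaved around the time of this post; field names have changed over the years, so treat them as assumptions and verify against the current documentation.

```python
# Sketch: look up a URL object on Facebook's Graph API and read whatever
# share/like count it exposes. Field names are assumptions; verify them.
import json
import urllib.parse
import urllib.request

def facebook_share_count(page_url):
    api = "https://graph.facebook.com/?ids=" + urllib.parse.quote(page_url, safe="")
    with urllib.request.urlopen(api, timeout=30) as resp:
        data = json.loads(resp.read().decode("utf-8"))
    entry = data.get(page_url, {})      # response is keyed by the requested URL
    return entry.get("shares", 0)       # assumed field name

if __name__ == "__main__":
    print(facebook_share_count("https://www.seomoz.org/"))
```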
Well, you could use DoubleClick (Google) Ad Planner to get some of this data. Here's SEOmoz as an example. Hypothetically, these represent Google's estimates of your website's traffic, which they might use as ranking factors.
Perfect... just sent my feedback (no, I did not give my own name as an expert), and I'm already offering to help when the international versions of the survey come along.
Ah, and if I can help with the huge amount of data you will be receiving, feel free to hit me up :)
I vote you get the #5 SEOmoz user on the panel.
Along with the ranking factors mentioned in the sheet above, I think the following factors will also have an important influence:
I think keywords in the title and domain name will gradually become less of a ranking factor in the future, as this has become common practice and is easily exploited by black hats.
SEOs will also have the additional responsibility of helping clients manage positive and negative reviews and other UGC, helping them establish an online reputation and ensuring their SERPs are not adversely affected.
The power will lie in the hands of the people (users): if people prefer it, the search engines will promote it. Only in this way can on-site manipulation be kept in check to some extent.
https://blog.webpro.in/2010/12/power-lies-in-hands-of-people-user-in_27.html
Great ideas, but again, some of them (CTR, time on site, etc.) are sadly unretrievable/unmeasurable for us. Contact details we might be able to do something with, though. Thanks!
I have been giving a little more thought to this ambitious exercise in light of reading a recent New Yorker article here; it's a very interesting read, and in particular there are some applicable lessons the SEO industry could take heed of. Particularly the comments in the article by Schooler regarding the "establishment of an open-source database, in which researchers are required to outline their planned investigations and document all their results." I also liked this notion, which has implications for us in SEO as well: "We're wasting too much time chasing after bad studies and underpowered experiments," he says; the current "obsession" with replicability distracts from the real problem, which is faulty design.
That said, for this exercise to be worthwhile, I'd offer three suggestions for the potential problem of "faulty design":
1. Limit the panel of people reviewing the ranking factors to fewer than thirty people who all have a proven track record working in SEO for companies, spanning different verticals, that are leading the pack as reported by Hitwise or comScore. For this to have credibility, these panelists' backgrounds should be disclosed.
2. Limit the number of ranking factors being reviewed into two buckets with three groups. The two buckets would be factors that hurt rankings and factors that help rankings. The groups could be simplified into further categories: link-related, on-page, off-page, or anything along these lines. Then further limit the "ranking factors" to include only the top 15-30 factors for each group. The end result would be a simple three-column spreadsheet with no more than 30 rows.
3. Then have each panelist stack-rank the columns, putting the most important ranking factors first (or maybe, instead of "factors", change these to the top things your SEO managers should be thinking about), and then they can decide to remove or add items as well.
If you see what I am getting at, the end result will follow a K.I.S.S. model. Given that some of these factors are weighted very low, such as page speed (I can't believe Google made so much hype over it as a ranking factor only to later state that it's weighted at only 1%), it is clear that if you divide some 200 factors by 100%, it is impossible and impractical to measure the exact weight in terms of % contribution to some correlation in rankings. That said, it is possible to measure consensus among a smaller data sample. This is why I'd say reduce this down as much as possible, both in terms of factors and panelists. Again, I'm trying to avoid faulty design by focusing on the top factors and then measuring consensus.
Finally, once the thirty-odd panelists have stack-ranked each "ranking factor", with 1 representing the factor most likely to improve rankings, down to about thirty, everything on the list should be important enough to consider a top ranking factor, or it shouldn't be on the list. For example, if you get down to page speed on the list, then the list went too far. So, once the panelists have taken their three columns and stack-ranked them on their own, each line item could get a number beside it representing how many times it was put in position 1, how many times it was put in position 2, and so on. Then, from the total number of times an item was put in each position, we would be able to measure some degree of consensus on just the top factors, which is all anyone really cares about anyway. I don't think anyone really cares whether a keyword was mentioned in the first 10 words on a page vs. somewhere further down the page; stuff like that should be pulled out, because what is essentially more important is whether the content is siloed properly and whether it has any real value to the end reader in the first place, among many other factors.
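To make that tally concrete, here is a quick sketch of one way to score consensus from stack-ranked lists (my reading of the proposal, not an agreed methodology): count how often each factor lands in each position and look at the spread of those positions.

```python
# Sketch of a consensus tally: each panelist stack-ranks the same factors
# (best first); a small spread of positions for a factor means high agreement.
from collections import defaultdict
from statistics import pstdev

def consensus_tally(rankings):
    """rankings: list of ordered factor lists, one per panelist (best first)."""
    positions = defaultdict(list)
    for panelist in rankings:
        for pos, factor in enumerate(panelist, start=1):
            positions[factor].append(pos)
    report = {}
    for factor, pos_list in positions.items():
        report[factor] = {
            "avg_position": sum(pos_list) / len(pos_list),
            "spread": pstdev(pos_list),          # lower spread = more consensus
            "times_ranked_first": pos_list.count(1),
        }
    return report

if __name__ == "__main__":
    # Hypothetical panel of three stack-ranked lists.
    panel = [
        ["anchor text", "domain authority", "title tag"],
        ["domain authority", "anchor text", "title tag"],
        ["anchor text", "title tag", "domain authority"],
    ]
    for factor, stats in consensus_tally(panel).items():
        print(factor, stats)
```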
I bet Google would get a kick out of how much lack of consensus we might find. My final input on this is that it would be okay for everyone to totally disagree and for the study to show no consensus, even among people at the top. To the point in the New Yorker article: it seems that everyone is trying so hard to prove themselves right or to prove a point that it can flaw the downstream analysis.
A person whose opinion I respect recently commented on this NYT article, in short...
"Lies, Damn Lies and Statistics."
Cheers.
Warren.
https://seo-cubed.com
I hope people are clear about the difference between what factors they believe are impacting search rankings vs. what they personally believe Google should be looking at.
I am in the camp that thinks the majority of influence on final rankings comes from only a small collection of factors. Sure, there may be tons more that have some small supportive impact, but in my opinion it comes down to things like domain age, number of anchor text links matching the keyword searched, number of unique domains linking to the site and page, quality of domains linking to the site and page, and keywords in meta tags/URL. It is still pretty easy to push your way up the rankings with a quality page and exact-match (or, I suspect, synonym) anchor links from a collection of domains. The rest of the factors can probably help you a bit in various scenarios, e.g., social media signals for news/current events/blogs, or on-page factors for long-tail keywords where matching anchor text links are unlikely.
Your comment gives the impression that you believe there is a handful of signals that gets looked at for every query and always has the same weight. I believe that the weight of the signals is dynamic and changes based on a number of factors such as query intent, number of available results, user device, location, etc.
My intent wasn't to say that certain core factors have the same weight for all queries. I think Google puts queries into different pools of intent and then uses an algo that matches that pool. So yes, I would agree it's dynamic. But some people seem to want to focus on improving their site/page with the 200 or so potential factors as if they are all really important. I would argue (just my opinion) that only maybe 10 factors carry a lot of weight and the rest are pretty small. So I would rather focus my time on the big factors than on the small ones.
The best thing is that Google has been able to keep so much of this a mystery to us all. Unfortunately (or fortunately), Google's heavy weighting of certain factors allows a lot of SEOs to rank highly with poor-quality content. This can be done with keyword-match domain names, lots of pretty (unnatural) anchor text links, and generally well-thought-out on-page SEO. Why Google feels the need to favor small, simple domains with crappy/no content just because the keywords match the domain, I have no clue. But they sure do, for now at least. I hope they learn other ways of identifying branded keywords in the future.
-Brandon
Totally agree with dancristo. The weight has to be dynamic.
Without detailed analysis of ranking factors, we won't know where to focus our efforts.
OK, there are always the known factors: domain age, title tags, keyword-rich anchor text, etc., but it would be blind of any SEO not to explore other ranking factors, social media being the big one.
I would love to know how SEOmoz performs tests on these different factors to determine the top-ranking ones. It would be helpful if Rand could present this in a blog post once the analysis is done.
Thanks
I did not find coffee and cigarettes anywhere on the list. What kind of an operation are you running here, anyway? ;-)
Hah! At first I was surprised this comment didn't get more thumbs up.
Anyone else playing with the notion of testing whether or not coffee and tobacco indirectly impact rankings by surveying the penetration of coffee and tobacco use amongst internet marketers, then weighing whether coffee as a stimulant can boost work efficiency (increased link building) against studies showing that tobacco breaks can decrease marketer efficiency, and then measuring increases or decreases in 2011 search demand for these products as a proxy for the propensity of these to have a causational, correlating effect:
https://www.google.com/insights/search/#q=coffee%2Ccigarettes&geo=US&cmpt=q
might find that such thinking has perhaps been, in fact, a result of drinking too much coffee or smoking too many cigarettes themselves...
https://www.impactlab.net/2009/05/05/pictures-reveal-shocking-effects-alcohol-cigarettes-and-caffeine-on-the-brain/
Ha, just kidding... But yes, I'd say CAFFEINE should be considered as part of this survey.
https://googleblog.blogspot.com/2010/06/our-new-search-index-caffeine.html
There was a great deal of conjecture about whether or not this update changes the relative weight of some of the items mentioned in the spreadsheet in terms of their impact on rankings.
Nice comment!
Warren
Thanks! I always found them to be the most important off-page factor for me. Then again, we do things differently here in Kansas. :-)
Done. Nice idea - looking forward to the results.
I think that, besides the individual factors, one thing is very important: the relationships between the factors. But first, let's see what the experts say...
Here is my list:
1. Authority of the tweeter (measured by tweets per day, relevancy of tweets, number of retweets, number of followers, connectedness to other authoritative tweeters)
2. # of Diggs, StumbleUpon, Sphinn, and Delicious bookmarks. The authority of each bookmarker can be measured by the number of re-diggs/stumbles, number of followers, and relevancy of their bookmarks.
3. # of Facebook likes.
4. # of Facebook followers
5. # of JavaScript tags on the page. The Google spider cannot read the content within JavaScript but can certainly detect the script itself. A higher number of JavaScript tags can impact page load time, so technically this could fall under page speed, but it is possible this is a separate factor because it also impacts the content-to-script ratio.
6. Total number of duplicate pages, including ones with campaign code parameters in the URL.
7. # of links from the domains located on the same IP blocks
8. # of links from the domains located on different IP blocks
9. Geographical location of the inbound linking domains (this one is hard to measure)
10. # of 404 and soft 404 pages
11. Domain canonicalization issues
12. RSS feed subscription count (the Feedburner acquisition is a good indicator that RSS subscription count is being factored in somehow, especially for blog rankings)
13. # of times the domain/content was migrated (same here: hard to measure)
I like number 9 (especially for small sites trying to rank in certain countries). Would love to hear aggregated opinions about this. Also, I think this is actually one of the easier ones to evaluate.
In 2010, there was considerable activity in forums trying to get backlinks indexed in G (info:mybacklink).
My own personal (but tentative) conclusion is that it has an influence on ranking, but only a small one.
Even with a lot of effort, the vast majority of backlinks to a site will never be indexed in Google. For example, 90 to 95% of Yahoo Site Explorer backlinks are not indexed in Google and will not be, even with significant effort.
I'd be curious to see a section that analyzes the influence of backlinks being indexed/boosted or not.
I have used last year's ranking factors very effectively and hope I can contribute my experience to compiling the new list of SEO factors. A few things must have changed; in particular, the impact of social media must be a strong candidate for the new list.
What about adding a conversion factors list as well? As we all know, ranking is not enough these days; it's just the first part of the equation.
What about a 'social ranking' component on review sites?
Remember DecorMyEyes, which earlier this year gained rank off of tons of bad feedback? Google claimed to have 'fixed the problem'. To me, that's a declaration that they're using at least some review-based info in ranking.
This is a great list, Rand! Recently I was discussing new ranking factors with our CEO, and how much social factors into them. I know that the sharing of content is becoming an important factor, but I'm curious as to others' thoughts on the following:
Do you think that the presence of social media profiles and a link on your website to these profiles is already becoming a ranking factor?
Thanks everybody, I appreciate any input you may have!
Good idea, Rand, but recently I have seen new websites making it to the top with the keyword in the domain name and a number of inbound links (from various IPs). Still, a lot of people are easily manipulating results, and I don't see much change in the near future unless/until Google listens to your earlier post about 5 or more outbound links + red PR.
One search engine ranking factor that will never change is the focus on humans. Search engines are getting more humanized.
Rand,
Did you find a reliable source for the Social Media or UGC data yet?
www.Radian6.com from Canada has awesome data partners for their Brand Management tech platform. I know they have a sort of algo for measuring thumbs up/down, pokes, finding trend setters, and trolls.
I bet they might be willing to share some data, if not they might be willing to share their contacts with you so you can have the raw data to play with.
Today I have the perfect reason for my question of why I love the SEOmoz blog and am always eager to read new posts in my Google Reader account. My morning starts with Google Reader and an SEOmoz blog post first. Five stars is not enough for this stuff. I often run into the issue that people working in SEO are afraid to share knowledge, but on SEOmoz I have never felt that kind of issue. The posts are always very transparent and keen to serve up what's new in SEO. Anyone can become a good SEO with the help of SEOmoz. Thanks again, and I'm always eager to learn new things with the help of SEOmoz.
Unless it's already in there and I missed it (sorry, I only skimmed for now due to time constraints), how about spam reports?
As in, whether having a spam report filed with Google against a site is a factor, and whether it has much (or indeed any) effect.
I've read here and elsewhere that there are reasons behind the perceived inaction (reasons that do make some sense), but are there ever any actual changes due to spam reports? If so, what?
I'd love to include local data from Portuguese tests and analyse how that local data differs (or not) from other countries.
I could translate it to Belorussian.
# of data provider citations
I have a feeling this will be an awesome report! Can't wait to see the progression!
Nominated.
But I'd also like to add that the google.ca index/SERPs are so dang close to the google.com ones that they might be included, eh?
:-) Jim
As always, great thoughts; I appreciate that. In my experience, besides on-page factors and link enhancement, some good ranking parameters are: how many keyword-rich (targeted) videos and images you have across the web for your website, and of course social influence and server speed for your website.
I'd like to see *all* factors represented by questions (which would include grey hat, black hat, and stupid hat), with the grey, black, and stupid getting their own section titled something like DO NOT DO THESE or RISKY TACTICS or something similar. Then those factors could be rated in an inverted fashion, with a higher number indicating worse practice.
I would then use the results of your survey as my official "Best Practices" list that I show clients. And when the client says "I think we should...(insert some really bad idea here) because my brother/mother/cousin/friend/plumber/hairdresser/etc. did this and they're now getting a lot of traffic" I would email them a pdf of the DO NOT section and say to them "According to the experts, it's a baaaaad idea"
To add to what our British friend said:
It would be interesting to see geo-coded sites and their relevance to local listings inside search engines. One more item I'd like to add: for those brands with apps attached to their sites (submitted to the obvious channels: Android, iPhone, etc.), have rankings and search improved as a result?
Cheers!
This is a fairly comprehensive list. The few areas that could be added are in the realm of brand-related citation and query signals. Google Places has shown that non-link, text-based citations can lead to better "trusted" listings. 2011 is probably going to be the year that these factors move into the spotlight.
Nominated and looking forward to the release of this awesome data!
Hi all, my name is Warren Lee. I manage SEO in-house for MOVE Inc., working on many enterprise websites such as Realtor.com. I did a fair amount of testing of new possible ranking factors last year, and oh boy, has Google been keeping us all on our toes in recent months with all of its own testing as well! In any case, one particularly interesting test from last year had to do with questions that came up regarding Google Goggles and developments in text-in-image recognition. The test was whether or not bots could read rasterized text in an image; perhaps this could be a ranking factor? I posted my findings and how I approached the test here:
https://seo-cubed.com/seo-blog/2010/02/09/seotest-can-google-understand-text-in-images/
But in conclusion, I do not think that Google, with all of its power and technology, can take clues from rasterized text in an image (perhaps unless it is in a PDF?). So raster text in an image = NOT A RANKING FACTOR. But in any case, there is that one for you all, and I propose that there should be a healthy inclusion of image optimization-related ranking factors; I'm happy to contribute further. i.e., is raster vs. vector a ranking factor? YES :) <shameless plug> PLEASE VOTE THIS UP if you want others, or myself, to post more IMAGE, VIDEO, NEWS, OR OTHER BLENDED SEARCH RANKING FACTORS. CHEERS - WARREN LEE. </shameless plug>
<-- No, that's not HTML5. :) Oh yes, and what about HTML5, microformats, and reviews and ratings... ;) More to come, people.
If the 2011 ranking factors were to take blended results into consideration, then for image optimization add: # of hotlinked images; for video optimization, add the presence of a KML sitemap. There are a few more along this vein that should get folks thinking. :)
Another one (perhaps I didn't see it on the list), in a different vein from things discussed earlier and more to do with site architecture, is the volume (or lack) of 404s, such as soft 404s or simply a high volume of 404 pages. Reference: https://bit.ly/i6HP2E Cheers. W.
Also I should add that I think it would be interesting to add a layer of depth to this analysis in terms of recent changes to Google.
Petit 4, branded search, Hotpot as well as blended search 2.0
For example, let's talk about defining "BRAND CONSIDERATION" and its impact on rankings for branded terms or generic keywords that make up some branded terms. This is an important consideration for 2010/2011, don't you agree?
One way to add this to the spreadsheet might be to add: domain name (generic vs. non-generic keyword), external links with the brand name, additional branded product pages, and offline contribution to branded rankings (this is a ranking factor in an indirect way, and yes, before you say you can't really measure it, I'm sure one of the "experts" has found a solution, so it's probably worth throwing out there before we discount it).
Cheers
Warren.