Adso, if I knew the answers to everything, I would be teaching theology in Paris.
(William of Baskerville - The Name of the Rose)
I am not a mathematician, therefore I cannot give you formulas to play with; nor am I what could strictly be defined as a technical SEO, therefore I cannot give you insights into technical methodologies for fighting spam. I have an old marketing-school background mixed with humanistic studies. So my approach to the quality-of-SERPs issue, so hot during these last weeks, will be more philosophical and theoretical than high-tech and statistical.
The Socratic method will guide me here through a series of questions I ask and answer myself. Are they the Answers? They are not, but I think they paint a probable future.
Is Web Popularity the same as Web Quality?
Mostly not.
Let's be clear: even if spam did not exist, people would not usually link to the most valuable things on the Internet. People link to cats playing the piano, talking dogs, some kitsch website and, oh yes, sometimes to viral content crafted by some agency. Or to brands.
But when it comes to niches, web popularity becomes a more blurred concept, where popularity gets mixed with authority. For instance, it is more probable that we as SEOs (yes, SEO is a niche) will link to SEOmoz, because we cite its posts, blame them or commend them, than to some unknown SEO newbie.
Sometimes the miracle happens, and an authority discovers a great piece of content hidden in the web, which therefore becomes popular. Remember: authority.
Fortunately, there is SEO, and popularity can be obtained with creative link building; but "black hat" techniques make the "popularity" factor a very risky one to base the SERPs on.
And the risks of popularity are even more enhanced now that tweets and shares are officially counted as a ranking factor.
Shouldn't SERPs present popular content?
Yes, but...
If people did not find in the SERPs what everybody talks about, well, search engines would last about as long as a breath. But should popularity be based just on links, or should it be based mostly on trusted links?
Ask yourself: would you choose a restaurant suggested by one trusted link on Yelp, or one pushed by bazillions of links on hundreds of affiliate sites?
Trust, authority... again. And that is something we as SEOs always preach in our creed for a Well-Optimized Website. And Google preaches the same. But if that is so, why is it still possible to see so many artificially popular websites, owing millions of links from thousands of unrelated and non-authoritative sites?
Maybe the reason is that something is failing in Google's trust and authority check, and Google knows it.
So...is it possible to balance popularity and quality?
Yes.
Personally, I am not one of those who pretend that search engines should show only astonishing websites. OK, maybe my tastes are a little bit freaky, but I don't want search engines to become some sort of Wikipedia.
But, at the same time, I do not want SERPs polluted by clearly spammy or insignificant sites. What I want is to see and explore genuine websites, and I believe that Google could use tools and concepts that already exist, making them better.
If link popularity (like other factors) has proved too difficult a factor to control, now that billions of websites and searches exist along with a quite easy formula to game, then another factor (or factors) should be highlighted for rankings.
If these factors exist, what are they?
Authority and Trust.
And we all know they are the factors we really care about, because we rely on them already in our lives. It is simple common sense. We buy that car because we trust that brand; we see that movie because we trust what a friend of ours says; we believe what a scientist says about climate because he is an authority in climatology. Therefore it is logical that search engines, too, should base rankings mostly on those two factors: Authority and Trust.
They are already counted in the Google algorithm, as Rand explained in 2009, and TrustRank is an old dude.
This graphic from another 2009 post here on SEOmoz explains what TrustRank is better than I possibly could.
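For readers who prefer code to graphics, the core idea of TrustRank can also be sketched in a few lines: trust starts concentrated in a hand-reviewed seed set and flows through links with a damping factor, so pages far from any seed accumulate almost none. This is only a toy illustration of the published idea, not Google's implementation; the graph, the page names and the parameter values here are all invented.

```python
def trust_rank(links, seeds, damping=0.85, iterations=20):
    """Propagate trust from a hand-picked seed set through the link graph.

    links: dict mapping each page to the list of pages it links out to.
    seeds: set of pages judged trustworthy by human reviewers.
    """
    pages = set(links) | {t for targets in links.values() for t in targets}
    # All trust starts in the seed set, split evenly among the seeds.
    seed_mass = {p: (1 / len(seeds) if p in seeds else 0.0) for p in pages}
    trust = dict(seed_mass)
    for _ in range(iterations):
        # Each round, a page keeps a share of its seed mass and
        # passes the damped remainder along its outgoing links.
        new_trust = {p: (1 - damping) * seed_mass[p] for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * trust[page] / len(targets)
                for target in targets:
                    new_trust[target] += share
        trust = new_trust
    return trust
```

On a toy graph where a trusted site links to a blog while a spam network only links to itself, the blog inherits trust and the spam pages stay at zero, which is exactly the behavior the graphic illustrates.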
Someone, applying Occam's razor, could now say: "Make TrustRank the most important factor and we will see the end of spam."
But that would not be quite true at this moment.
Are trusted seeds really to be trusted?
Theory says yes, practice says no (ok, I am a little bit paranoid, but – hell! – I am Italian).
The J.C. Penney case is just one that came to light because the New York Times pointed its finger at it. Otherwise, we would probably still be seeing its site ranking quite well, like many other trusted brand sites. But J.C. Penney is not the only website that, consciously or not, makes use of illicit SEO tactics. And, on the other hand, it is a clear example of how much Google has to improve the trust factor in its algorithm.
What happened to BMW some years ago does not seem to have taught Google that much.
And we know well how easy it can be to obtain links from .edu sites, and from .gov ones too.
No, trusted seeds can be gamed... if Google forgets to vet them first.
WTF can be done (exclaimed the SEO in despair)?
In reality, a lot.
And a lot of things seem to be moving toward a new big algorithm change. Let's look at the signals Google has sent, especially over the last two months:
• December 1, 2010. Danny Sullivan publishes the famous article What Social Signals Do Google & Bing Really Count?. In the post Google, apart from saying that it uses (re)tweets as a signal in its organic and news rankings, also affirms: "Yes we do compute and use author quality. We don't know who anyone is in real life :-)". Isn't this like saying that users, too, are now counted as trusted seeds?
• December 1, 2010. Another article by Danny Sullivan that did not receive the attention it deserved, maybe because it was published on the same date as the previous one: Google: Now Likely Using Online Merchant Reviews As Ranking Signal. In that post Danny cites this declaration from the Official Google Blog: "In the last few days we developed an algorithmic solution which detects the merchant from the Times article along with hundreds of other merchants that, in our opinion, provide an extremely poor user experience. The algorithm we incorporated into our search rankings represents an initial solution to this issue, and Google users are now getting a better experience as a result." Danny adds that customers' reviews are probably used as a new factor in the algorithm (though not sentiment analysis). Again, user signals used as confirmation of a website's trustworthiness.
• Between December 2010 and the end of January, the SEOsphere saw an increasing number of posts complaining about the ever-worsening quality of Google SERPs. Somewhat as a reaction, we started to see an increasing number of ex Search Quality Googlers answering on Quora and Hacker News, usually predicting some big change in the algorithm. During this period Matt Cutts said that all the engineers who had been moved to other Google projects would return full time to the Search Quality department... does that mean more people working on the algorithm, or more manual reviews?
• January 21, 2011. Matt Cutts publishes a post on the Official Google Blog, the most official of the many blogs Google has: Google search and search engine spam. It is the famous announcement of Google's campaign against content farms. In the post, Matt Cutts affirms: "we can and should do better." Again, a move that seems to show how Google is going to favor trusted, authoritative sites. In the same post he explains how the May Day update and the later "brandization" of SERPs were meant as earlier steps in this direction.
• January 31, 2011. The always clever Bill Slawski publishes a post that gives hints on how Google may rank social networks, presenting three 2007 patents that were published a few weeks ago. Probably some of the signals described in the first patent are the ones Google is actually using to bestow authority on influencers.
• February 1, 2011. At Future Search, Google accuses Bing of copying its search results, having detected them thanks to the Bing toolbar. Ironically, another ex Search Quality Team Googler reveals on Quora that Google uses the same technique with its own toolbar. Again, users' data.
• February 12, 2011. The J.C. Penney case comes to light thanks to an investigation by the New York Times. Google intervenes, but this delayed intervention shows one thing: Google does have a serious problem on the trust side of its algorithm.
• February 15 2001. Matt Cutts presents a video where he explains how webspam works at Google (a warning?) and actively promotes the new spam-blocker Chrome extension launched on Valentine's Day. Another way to gather useful signals from users about what is relevant or not on the web.
What conclusions can be drawn?
- That Google seems to have understood that it has to go back to its origins and to the base of its core business: the quality of SERPs;
- That Google has probably understood that the classic link-based ranking factor can be gamed so easily that other factors, such as Trust and Domain Authority, should be given priority;
- That social media is influencing the way people search so much that social signals must be considered important ranking factors, and that Trust and Authority must be translated to the social reality;
- That user-generated content and users' interaction with websites are more active than ever before, therefore the user factors must be considered relevant, at least as a litmus test, even though they have to be very carefully crafted into the algo, since elements like reviews can easily be gamed.
And that the frantic series of news about search is just at its beginning.
Post Scriptum: I wrote this post between the 13th and 14th of February, totally unaware that Rand Fishkin was writing a post that touches on the same subject. Anyway, I hope mine will give another perspective on the search quality issue and on the predictions that can be made on the basis of the latest events in search.
Update - March 3, 2011
In my last line I said that we were still at the beginning of a long series of events and changes that could change - a lot - the SERPs we knew.
In fact we have had: the penalization of Forbes for selling links, the Farmer update, Google Social expanded into Universal Search, and today, March 3rd, Google has announced that it will retouch the Farmer update in order to stop penalizing legitimate sites...
Let's see if Google - to cite "Il Gattopardo" - is changing everything in order to change nothing.
You might not be a mathematician, but you do a hell of a job breaking down a very important topic as if you were one. Nicely done.
I love the idea that users could be viewed as trusted seeds. The thought of users = websites is a fantastic one. In much the same way we build up reputation and trust for websites, people could do the same.
Does this open up a whole new branch of marketing? I suppose we could call it 'Human Being Optimization'. After all, I've always liked HBO...
Thank you Ryan.
Effectively, I've tried to collect all the "sensations" that the events in search were/are provoking in me, and to give them a logical explanation of where Mr. Google wants to go.
HBO... ah, well, let's see if we will have to call ourselves "Band of Brothers", like one of the best HBO productions ;)
In the end, I suppose that's a good line to describe the SEOmoz community...a band of brothers (and sisters too!).
Good post Gianluca. You have put a lot of emphasis on trust and authority as major ranking factors in your post. But these two factors are not independent of the 'classic link-based ranking model'. When a trusted site links out to your site, you gain trust. When a lot of websites on the same or similar topics link out to your site, you gain authority on those topics. For example, if a lot of SEO blogs link out to your blog, then Google may consider your blog relevant for SEO (besides other factors). You can't gain trust and authority on their own. You can't gain trust and authority with tweets, retweets, sharing etc., at least not up to the level where you can compete with well-established sites like Wikipedia on the SERPs for a long time. Also, the life span of social signals is very short. Most tweets die in a few hours or a day. You would not like to see SERPs which collapse every breathing minute because authority users all over the world are continuously tweeting on a particular topic. Moreover, not every topic, service or product is tweeted, retweeted or shared. So a ranking model based on social signals has severe limitations.
Hi SEO-Himanshu,
and thanks for your comment, which I appreciate a lot.
I agree with what you say, and I believe that it complements what I say.
When I talked about social signals, I meant them as another ranking factor. And, as happens in the case of Bing (or so Bing says about how it considers tweets and likes), we can assign a trust value to authors/people; therefore we could also have a new authoritative signal of quality toward the retweeted link, and hence toward its website.
First of all, congratulations on your post being promoted to the main blog. You said we can assign a trust value to people. But on what basis, and how is this going to help improve search results by and large? As I said earlier, not every topic, product or service is shared, tweeted or retweeted. Gaming tweets and Facebook likes is no harder than buying links. There are already some shops set up online which sell Twitter followers in bulk. Soon they may start selling influential followers, like people sell links from authority sites. In the end we will be back at square one, and the Internet will become more untrusted than ever, as everything from links, reviews, ratings, likes and tweets can be bought and sold in large volume. Consequently people will prefer buying only from big brands with a solid online/offline reputation, and small businesses will die a quick death. While this is all just speculation, it is still not far from the realms of reality. If Google really wants to reduce spam, then it should stop telling people how it ranks pages and what it is doing to fight spam. You know what I mean. The wholesalers and suppliers of fake recommendations are just one major announcement away (the announcement that makes social signals a major ranking factor).
Ciao again :)
Your fears are my fears too. And I never said that social signals cannot be gamed, and I know that there are fields that do not have a real social life (but fewer than you might imagine, at least on Facebook).
The real challenge will be when it comes to real people selling tweets and likes (the share button is being dismissed, as Facebook has announced), but not so much when it comes to bots, because - at least right now - bots lack one human characteristic: they never answer a reply.
Personally, though mine is just a theory based on my past in linguistics and semiotics, I believe a way to discriminate genuine tweets from sponsored ones lies in the language used. By studying the linguistic patterns of tweets, it could be possible to find markers that help discriminate between the two. But, well, yes, that's just a theory.
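To make that theory a little more concrete: one very crude way to test it would be a naive Bayes classifier over tweet wording, trained on hand-labelled examples. This is just a toy sketch of the idea with invented example data, not a claim about how any search engine actually does it.

```python
from collections import Counter
import math

def train(tweets_by_label):
    """Count word occurrences per label ('genuine', 'sponsored', ...)."""
    counts = {label: Counter() for label in tweets_by_label}
    for label, tweets in tweets_by_label.items():
        for tweet in tweets:
            counts[label].update(tweet.lower().split())
    return counts

def classify(counts, tweet):
    """Return the label whose word distribution best explains the tweet."""
    vocab = set().union(*counts.values())
    best_label, best_score = None, float("-inf")
    for label, words in counts.items():
        total = sum(words.values())
        score = 0.0
        for token in tweet.lower().split():
            # Laplace smoothing so unseen words don't zero out a label.
            score += math.log((words[token] + 1) / (total + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label
```

Trained on a handful of labelled tweets, a sketch like this would lean on exactly the kind of linguistic markers mentioned above (sponsored tweets tend to repeat commercial vocabulary), although a real system would obviously need far richer features.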
I don't agree with your conclusion that "Trust and Domain Authority should be given priority".
Google already does this far too much, in such an unsophisticated way that it's causing them harm.
Trust and authority count so much for Google that all of us have seen sites ranking on these alone, without the content to match. That is actually what's wrong with Google nowadays.
Was the Farmer update about small, no-authority sites? No, it was about sites that fully understood how authority is everything to Google and "leveraged" it.
In real life, abusing your authority has a high personal risk involved and not much to gain. The climatologist might lose his credibility, you would begin to trust your friend's opinion less etc.
The opposite happens online: there is a lot to gain from abusing your authority. You'll have an accomplice in Google, which, on the one hand, turns a blind eye to your shenanigans for a long period as long as you are an authority, and, on the other hand, actively helps you monetize your authority in the meantime (by sending you loads of traffic, making you a special advertiser, looking away when you sell links...).
In real life, abusing your authority has personal and/or legal consequences; within a search-engine context, it has just business consequences to be factored into the SWOT analysis: calculated risks that many people/companies will always be willing to take.
It's time to wake up: some concepts will never work as well on the web as they do in real life - authority is one such thing.
Hi,
thanks for your comment. I liked it, because I always like it when someone contradicts me: he's helping me reconsider my ideas and practice self-criticism, which is the basis of improving ourselves.
Personally I do not think that we are really saying different things.
When I talk about trust and authority, I do not leave an implicit "as they are calculated now". If that were so, I would not have asked myself whether trusted seeds are really to be trusted.
That means that I agree with you that Google is not doing the right things when it comes to trust and authority, but my hope is that Google has understood that this is one of its main problems (and somehow they had to admit it).
That is why enlarging the concept of trusted seeds to social profiles too can be a way forward, because the social aspect is the closest to real life, since real people are easier to identify behind the avatars. Obviously, as also discussed with others above, the great challenge then would be to discriminate real, natural tweets, shares and the like from "sponsored" ones.
Finally, as I've tried to convey with the quote from "The Name of the Rose" at the beginning of the post, I do not pretend my answers are the Answers; I'm just trying to give another view, my own perspective, on the flood of facts we have been living through these last weeks.
Again, thanks for your comment.
Gianluca,
I think the same as you when it comes to contradicting opinions: it does help in finding new ways of thinking. So I definitely do not mean to disparage your opinion; it is just that I see it differently.
You believe trust & authority is not handled properly right now, but it could be done better.
I don't believe that. For a search engine, trying to make a decision on social signals will always be a losing battle against self-interest and human ingenuity + adaptability, all of which conspire to distort such signals. Even if we're talking about personal social profiles, ways of manipulation will be found, just like with everything else that came before them.
Social signals seem to be all the rage now. Eventually, they will be more of a dead end.
Instead, search engines should concentrate on better AI to help them actually understand the inherent value of information based on more objective signals, not the shifting sands of social interactions.
Good post Gianluca!
I've been doing some research into spam in both Google and Bing recently, and I have to agree with you on many of your points. Authority and trust seem to be something that the SEs have forgotten about. From the data I've pulled together so far, it seems that authority and trust play only a minor role in determining rankings, and things like the number of linking root domains play more of a role, regardless of where they come from. I'll be interested to see what Google has up its sleeve next regarding spam in the SERPs.
Thank you Casey for your comment... and it is "nice" to see that all my "theories" seem proved by the facts, as you show.
We've definitely seen this as more and more online gaming sites with no authority and no high quality links (but loads and loads of low quality links) have become a threat.
Congratulations on getting promoted to the main blog, dude! You 'da man!
Thanks GNC!
Ah... I am still waiting for your first YouMoz... ;)
Your points on trust rank are excellent food for thought. It seems like trust rank originally worked so well because the trusted "seeds" didn't realize/understand/exploit the value of being so authoritative on the web... now, the secret is out and the opportunity to make some cash by selling authoritative links is too good for a lot of sites to pass up. Manual devaluations of sites like Forbes are looking more and more like patches on a crumbling dam from my point of view. It also makes me rethink a lot of previous assumptions I had about how such a model could work on social sites.
Hi Jeffrey,
I am glad not only that you enjoyed my post, but that it made you think about why Google is doing what it is doing, beyond the actual facts.
When it comes to social sites and trust, I do not think that trust should be linked directly to the social site, but to the profiles. Using an example that Bing gave in the interview with Danny Sullivan: Bing said they were taking into account what a person like Danny Sullivan tweets; that means that Bing gives an authority score to the links he shares.
Obviously this could be a way to add new trusted seeds and improve the SERPs.
Is this a definitive solution? Not so sure, but it is a path Google should explore deeply.
Wonderful post! It's interesting to see the chronology of developments within the industry. It's clear that there is a huge algorithmic or political challenge for Google (or both). If there weren't, they would have taken action long ago.
What happens if Google DOES take the action you think is foreshadowed? If Rand's impromptu Twitter survey last week was any indication of things to come (Rand's survey results put paid links on an estimated 80% of the most competitive terms), there may be some huge changes in the SERPs to come.
Great post!
Matt Cutts has always said that these changes had been cooking for a long time... but maybe the uproar on the web against the quality of SERPs, the NYT article and, to stretch the "theory", the return of Brin & Page could have given a last push to the implementation of the algo updates.
Hi Gianluca, a very sophisticated point of view. This is one of those posts I have to read very attentively (most of the time those are the technical ones) because of its philosophical touch, and with all those links to very useful information it's kind of time-consuming - but worth reading!
Thanks for the thoughtful post Gianluca.
Speaking of trust: if people don't trust Google's results, Google is out of business. Your comment on how Google is likely putting quality results at the top of its to-do list would almost have to be right on.
I wonder if the seemingly mythic "Web 3.0" and semantic indexing could or will be brought in as tools to better evaluate trust and authority?
Do you think links from pages with original, topically relevant content would probably remain valuable however Google may refine its algorithm?
Hi Warner,
thanks for your comment.
Personally, I don't know if the basic concepts of the semantic web (the mythic "Web 3.0", as you call it) are already considered in the algo or will be, but I believe they are quite probably taken into account (as the LDA tests by SEOmoz seem to show). Still, we must never forget that they will always be just a few factors among hundreds; therefore, even though we should watch them attentively, we must not make the mistake of forgetting all the others.
Anyway, considering that Google is one of the biggest promoters of HTML5, which is a clear bet on semantic coding, your "intuition" could be in the right direction.
Nice post. So do we go back to the era of TrustRank dominance?
Links will remain relevant if connected with social networks, but what about the old beloved content? ;-)
ciao
elena
PS - someone from the SEOmoz staff should remove the "monclair" spam just above...
Even though I have not talked about content, that doesn't mean I don't think it will still be the core of any website, and the Panda update is there to confirm its importance.
But even though it is content that can finally make a site a trusted seed, we must never forget one thing: you can have written/designed astounding content, but if you do not promote it (with social, with ads, with SEO... with all the tactics web marketing offers you), it won't be discovered. And without links it won't be considered by search engines.
Thanks so much for that recent timeline. There's been a lot of upheaval, or more accurately potential upheaval, in the search engine world lately and while I was familiar with some of those developments I was not familiar with all of them. Thanks for helping me keep up. If you don't keep up, the ground may shift and knock you flat.
You're totally right.
Agreeing with Google or not, we certainly cannot say they are not acting now... and acting quite "strangely" in the open, which is something unusual, as the big changes of the past usually came all of a sudden, like thunder in a clear sky.
Gianluca-- great post!
If your conclusions are indeed true, I personally consider that to be a really good thing for searchers worldwide. It will be interesting to see how Google deals with those who are trying to game social influence (i.e. tweets to a link, and the authority and trust of that tweeter).
Yes... that's the biggest problem. For instance, how do you differentiate a tweet with a link from a popular tweep like Kardashian (who tweets for money) and, to stay with celebrities, Alyssa Milano, who is a genuine geek?
I have some ideas about how it could be done, but that could be the topic for a new post :).
I think you can almost take it back to the difference between search engines and hand-picked content. In theory, if I type in a question, the site with the answer (the "best" site for that scenario) could have no links and no tweets and still be the best place for me to land. The problem is, only a human can determine that, and humans can't rate all the content on the web, even if the web stopped growing today.
So, we have to find an algorithm, however complex, that creates a proxy for "best" - and that's incredibly tough. PageRank went a lot farther than on-page factors alone, and now the social graph comes into play. Of course, we'll game that, too, but some of our social activity will be honest, so it has value as one more piece of the puzzle.
Good points, and I'm all for an algo proxy. It's true that humans can't rate all the content on the web (despite the hopes of blekko). They can, however, rate some of it, and human rating is subjective. Look at the recent NY business owner who crapped on his customers so that they would leave negative reviews, driving him to the top of local results. Similar situations arise from social signals. The software will continue to evolve, but it seems to run into a conundrum with local social rating signals.
It seems obvious that we want positively reviewed sites to show up high. But can't the case be made that we want to see what users are saying even if it's bad? So reviewed sites, positive or negative, should be displayed. Obviously, I'm not as smart as the Googlers; however, it seems to me that social and local signals have much more complexity than links.
Interesting article overall... I do agree that search is very much manipulated by spam these days, and it's hard for search engines to provide spam-free results in the SERPs, though they are working hard to show spam-free results to users...
What I personally feel is that working more on social signals will minimize the rate of spam in the search results... (I know search engines are working on this.) I am not saying that it's hard to game social media; yes, it is fairly possible and people are doing it too, but I guess it's at least not as easy as getting a link from an unrelated .edu or .gov website...
With more work on social search, profile authority, time and other factors (if included) as ranking signals for the SERPs, I personally believe the search engines will be a less spammy place...
Timely and very topical. I arrived at the same conclusion through a different avenue. Matt Cutts' Trust and Authority video was famous. It's just an evolution of search. Ideas come and go. Algo updates come and go. Quality and value last. Well done.
Great post and certainly something to think about. (Not wanting to pull any of it apart, but it seems your date is incorrect: February 15 2001 (February 2011?).) Thanks for sharing. Rob
Oh... yes, it is indeed wrong, but it is a typo :)
I really enjoyed your approach to SEO and Google. I'm rather new to it and certainly use any and all help I can get. Tried a couple of SEOs but got burned. I'm determined to learn it on my own now. I trust myself.
Doug
Exceptional post and one that I am sure reflects the thoughts of so many in the SEO space at the moment... So, at what point does someone start saying "But the Emperor has no clothes"? How can it be that 20,000+ of what many call the smartest guys on the planet can miss these obvious manipulations of the SERPs? And if so, what about the less obvious manipulations... and some of the simply silly ones, like the effect of having keywords stuffed in a URL? We have a client that is a leader in their field, with a long-established site that delivers great content, and a company with a solid reputation for delivering high-quality products and services, that has been bumped from their number-one spot for their major keyword phrase by www.insertkeywordstuffedurl.com (URL changed to protect the innocent), with appallingly written content, a few lousy pages and a backlink profile full of junk links. Hate to say it, but... is the search engines' clothing wearing just a little thin?
Great article, congrats on the YOUmoz promotion! :-) An SEO job description! This is what a lot of smaller in-house teams are doing: popularity control and influence. All part of the big game of marketing!
Google has had trouble dealing with webspam. Google has yet to come up with an algorithm that ranks websites in a way that is not "gameable". Google still does trial and error, where it sees what impact a particular tweak has on search results.
After it's convinced that the change has enhanced SERP quality, it decides to roll out the change. Caffeine made it possible for them to roll out changes really quickly.
Congrats on making the main blog! You don't always need to bring a lot of math to the table to add a meaningful, useful post, and this is evidence for all of you hesitant to write a YOUmoz post.
The timeline was extremely useful for me; it helped add some structure to all the things I've heard in the past couple of months and puts everything in a new context. Lots of work to be done!
Does anyone know what these "trusted seeds" are? Which sites are they? And I like what SEO-Himanshu said. How he broke it down makes a lot of sense. Authority is basically a function of theme relevancy in outlinking. I hadn't thought of it like that, but that helps clarify the concept for SEO. Trust, meanwhile, is the degree of separation, the distance between the links from the "trusted seeds" and... your site. Makes a lot of sense.
Oh yes, SEO Himanshu has given a perfect definition of the two concepts.
As for whether a list exists... no, it doesn't. But trusted seeds are usually considered to be very authoritative sites, for instance university sites, or sites like the BBC or the New York Times; in other words, the sites of those institutions we already tend to trust offline.
Usually they are the sites that tend to be linked to from the highest number of websites and that link out to the largest number of pages.
But, IMO, that can also be scaled down to topics: a site could be a trusted seed for one specific topic (archaeology, for instance) but not for others.
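The "distance from trusted seeds" idea discussed in these comments is essentially the TrustRank intuition: trust starts at a hand-picked seed set and decays as it propagates along outlinks. A minimal sketch of that propagation, where the graph, the seed list, and the decay value are all illustrative assumptions rather than anything Google has published, could look like this:

```python
# TrustRank-style sketch: trust is injected at hand-picked seed pages
# and decays as it flows along outlinks, so pages farther from the
# seeds accumulate less trust. Graph, seeds, and decay are made up.

def trust_rank(graph, seeds, decay=0.85, iterations=20):
    """graph: {page: [outlinked pages]}; seeds: trusted pages."""
    trust = {page: (1.0 / len(seeds) if page in seeds else 0.0)
             for page in graph}
    for _ in range(iterations):
        nxt = {page: 0.0 for page in graph}
        for page, score in trust.items():
            outlinks = graph[page]
            if not outlinks:
                continue
            # each outlink receives an equal, decayed share of trust
            share = decay * score / len(outlinks)
            for target in outlinks:
                nxt[target] += share
        # seeds keep receiving a fresh injection of trust each round
        for seed in seeds:
            nxt[seed] += (1.0 - decay) / len(seeds)
        trust = nxt
    return trust

# Toy graph: a university site links to a blog, which links to an
# unknown page; a spam page receives no links from trusted paths.
graph = {
    "university.edu": ["blog.example"],
    "blog.example": ["unknown.example"],
    "unknown.example": [],
    "spam.example": [],
}
scores = trust_rank(graph, seeds=["university.edu"])
# Trust falls off with each click away from the seed, and the page
# with no trusted inbound links ends up with zero trust.
```

This matches the description above: the seed keeps the most trust, each additional click away dampens it, and a page unreachable from the seed set gets none, regardless of how many spammy links it builds among its own kind.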
Very thought-provoking post, G. I've been thinking about it throughout the day and it's been kinda depressing. About the only concrete thought I had was "I'm sure glad I'm not a Google engineer."
It seems that anything they produce in an algorithm can be gamed. The only 100% certain way to weed the spam out is via human intervention, but as Dr. Pete mentioned above, that's not a tenable solution because it's not scalable.
Perhaps they might adopt a hybrid solution involving human editors who are notified about spammy results and sites by actual users. That way the burden of finding poor results or poor websites would fall upon those who voluntarily report them.
I like your idea of trust and authority, but they are still subject to being warped by those trying to "game" the system.
Taking social signals as an example: when the weight of a tweet as a ranking factor reaches a certain point, spammers will focus their energies on Twitter (or Facebook, or whoever the next front runner is) and start gaming that system.
I know I must sound very negative, but after thinking about this all day, I've come to the conclusion that the SERPs are becoming... really full of beans.
Whoa! After a day out of the office what a treat to come back to a post from you Gianluca! I will save it for suppertime so I can devour it slowly.
we miss you goodnewscowboy. RIP!
Popularity, quality and authority are aspects of content that have always interested me. So many people state that amazing content is the best way to rank well, but I'm still of the opinion that authority has the edge. - Jenni
Excellent post! Yes, I agree with you: a change is needed, and nothing works without a human touch.
Google is pretty impressive, and SEO is now really one of the most effective ways to get ranked :).
With the trusted seed sites, the chart shows that three clicks away from a seed site only 86% of links are good stuff and 14% are spam. Are search engines going to treat all of these sites less kindly than those only one click away? That would mean the 86% of sites that are fine get punished for nothing, while the 14% that are spam get treated better than they deserve. This seems to me a bad metric: it may tell us the ratio, but it cannot differentiate between them. If you knew that 14% of a city's people were criminals, would you lock all of them up?
Hi Gianluca!
Great post, first of all! And excuse my poor English, everyone (since we both speak Spanish very well and we are both in Valencia).
At #searchcongress @seomom showed us a slide: one domain with a lot of backlinks and not a single tweet, another domain with no backlinks and 500 tweets. The latter ranked best. About the content of the latter: are the keywords it ranks for taken from the tweets or from the destination page on the domain?
Kind regards!
(a big hug!)
Oh yes... I remember that experiment well; it was launched here on SEOmoz a couple of months ago or so, and it demonstrated the importance of tweets for rankings. But the recent case of the Smashing Magazine tweet about the SEOmoz SEO guide for beginners seems to show that tweets can give a boost in rankings that tends to lose force over time, depending on the competitiveness of the keyword.