This week's Whiteboard Friday features the return of Rand (woo hoo!) and his self-declared biggest SEO mistakes. We screw up a lot here at SEOmoz (hell, they hired me), so we feel it is only appropriate to take this opportunity to share what we have learned in an effort to prevent you from making similar mistakes. SEO is complicated. The best we can do is practice, work hard, and compare notes.
1. Reciprocal Links + Robots.txt NoFollow
Back before the formal SEOmoz days, Rand used to dabble a bit in some grayer areas of SEO. The first mistake he mentions is a tactic involving offering reciprocal links but blocking the outbound links via robots.txt/meta robots so that he could get all of the link value. This tactic didn't really work and he ended up having to do a lot more work to get in the good graces of the webmasters he had fooled. Head smack!
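For anyone curious what this looked like mechanically, here's a hypothetical sketch (the actual files aren't shown in the video): the "reciprocal" links all lived on a page that crawlers were told to skip, so the outbound links never passed any value.

```
# robots.txt (hypothetical) – crawlers never see the page
# holding the "reciprocal" outbound links
User-agent: *
Disallow: /links/

# Or, roughly equivalently, in the head of the links page itself:
# <meta name="robots" content="noindex, nofollow">
```

The partner sees a normal-looking link on the page; the engines never count it.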
2. Buying Links for Clients
This tactic also took place before the formal SEOmoz days. At the time, Rand spent client budgets on paid links. This is a bad idea because the value of the links can't be determined (was Google even counting them?). He later found out through Google employees that the links were not being counted and that they may actually have been hurting the client sites' ability to rank. Oops!
3. Recommending People Use H1 Tags with Keywords
This mistake is a little bit more subtle. For years, SEOmoz recommended including keywords in the H1 of pages. After we started doing formal machine learning correlation tests we found out that this tactic didn't actually help very much at all (including the keywords in normal text in bigger fonts worked essentially the same). This was a shame because it meant we wasted time and energy convincing our clients to update their H1s.
4. Recommending People Not To Use XML Sitemaps
When XML Sitemaps first debuted, Rand and SEOmoz recommended against using them. While the idea was sound in theory (having an XML Sitemap can make it difficult to spot information architecture problems), that concern ended up being outweighed by the increased indexation rates we saw on sites that employed this tool.
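For reference, a minimal XML Sitemap following the sitemaps.org protocol looks like this (example.com and the dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2010-07-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

One `<url>` entry per page; only `<loc>` is required, the rest are hints.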
5. Incorrectly Redirecting Linkscape to Open Site Explorer
Recently we decided to 301 redirect all of the old Linkscape reports to our newer, better-converting Open Site Explorer reports in a 1:1 relationship. This was a good idea in theory, but unfortunately including various tracking components on the redirect URLs resulted in us losing a significant amount of traffic. We later fixed this with rel=canonical, but a lot of the damage was already done. Ouch!
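The fix works roughly like this (hypothetical URLs, not the actual ones): the version of a report reached with tracking parameters declares its clean URL in the head, consolidating the value the redirects had split across variants:

```html
<!-- Served on a URL like /report?src=linkscape&tracking=123 -->
<link rel="canonical" href="http://www.opensiteexplorer.org/report" />
```

The engines then treat every parameterized variant as the one canonical page.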
Do you have any lessons you have learned after making some noteworthy mistakes? If so, we would love to hear what you learned in the comments below.
Nice to see this. Takes a lot to admit missteps or mistakes.
Often we learn more from others' mistakes than from successes. (Sorry, this sounds like a fortune cookie.)
The reciprocal link + robots.txt NoFollow technique was very smart, Mr. BlackHat Fishkin. You failed only because you were ahead of your time and the link: command in Google still worked. Today the only way to uncover this trick is to check the robots.txt file, and I bet not many do this, but I agree it's unethical. BTW, we want more techniques from your shady past, Rand! :D As for the H1 thing, I still use it even if there's no correlation with rankings, because it's a best practice recommended by the w3.org folks and gives a page the correct structure.
I agree with Marco; I was expecting real "black hat" confessions. The article is a little too consensual. Come on Rand, we are sure you can be truly diabolical during your SEO tests :-) The "Hn" tags remain very interesting for prioritizing items, though, at least visually.
David
The H1 thing is a little shocking, but I still use keywords in the H1 tag as well. If it really doesn't help much, it would still seem a good idea to use the H1 for usability purposes.
I use H1 to simplify styling. If I get some SEO juice then that's a bonus. My theory on the diminished value of H1 is simple... not enough people used it. So the engines switched to font size and placement out of fear that "the few" who paid attention would get higher rankings and could push "good content" down. Just a theory.
I think heading tags are important because they map out the structure of the document. That's especially important for users accessing the page via screen reader software.
From https://www.webaim.org/techniques/screenreader/#headings
Another way to skim the page to get an overall impression of a page's content is to jump from heading to heading. Users can hear an outline of the page's main ideas, then backtrack to read the parts they are most interested in. The main drawback to this technique is that too many pages lack headings. Without headings, this method of skimming through content is completely useless.
Implication: Authors should organize content with headings. To the extent possible, the headings should represent an accurate outline of the content.
Yep I would still use them, and add the keywords. It just tells users and bots what the page is all about.
I agree with Marco about the h1 tag. I still use reasonable keywords in the H1 tag. I think smart way of using the h1 tag is never unethical. It determines the relevance of your page. Moreover we all learn a lot from our mistakes only.
It takes a lot of courage to admit mistakes and show them to such a large audience:)
It is quite scary to find out that even Rand makes some mistakes. What chance does an SEO newbie have? I know there are heaps of resources out there, but many of them say opposite things. I am starting to think that SEO is much more art than exact science.
Luckily it becomes harder to make big mistakes because they are well known. But there are plenty of opportunities to put energy into something that does no harm but does no good either :)
Great topic!
I remember, more than 10 years ago, my boss told me to put "index,nofollow" into all the meta tags - so as not to overstuff the "poor" search engines.
About 2 years ago a client of ours was wondering why he didn't show up in the SERPs. This was one of those cases - the site had never had a homepage relaunch. And it was pretty embarrassing to explain that.
Petra
Spurred on by Rand's confessions of a blackhatter, I thought I'd chip in with my own relatively similar one.
Back in the day I also used to employ something vaguely similar: I used to have my links page in site navigation as normally formatted text (so it looked the same as all the other links in the navigation), but it was actually a JS onclick command.
That JS onclick would most likely have been enough to prevent any of the link juice flowing out, but I went one step further and stuck my links.html files in a subdirectory that was also noindexed. In some extreme cases I seem to remember sticking it on a whole different domain.
I also suffered the wrath of many a webmaster who had in good faith linked to me.
At that point, though, it seems our paths differed, Rand: I dropped the whole concept of reciprocal linking when you built a directory - but I didn't abandon paid linking or other means of cajoling clean links..
..how different things could have been ;-)
Nice list. Maybe I'm too cynical, but surely you've got some other skeletons in your SEO closet? Never advised spamming social media or spam reporting on a competitor? What about that keyword density stuff that used to be gospel (maybe I'm showing my age).
oh! keyword density... dark ages... have heard or read about it for a very long time!
In my experience it seems to be more about placing the keywords nearer the top of the code than the actual tags. All the same, a good header should give a person an idea of what is going to be discussed, and this usually includes keywords. I think we can all agree, though, that the robotic keyword stuff isn't helping anyone, so the H1 machine learning result was a nice revelation. Your results seem to back up that theory and experience. If only links had less influence...
Very good stuff Rand. I'm guilty of the <H> tag issue. Thanks for clarifying this. Having H tags can sometimes not look as good to the visitor as merely bolding the text.
Great post...
Just throwing this out there. If buying links weighed heavily in lowering your SERPs, then hypothetically I could purchase links on porn sites, point them to my competitors, and lower their rankings.
Just saying...
Definitely a topic that a lot of folks are interested in and we wrote about it here - https://www.seomoz.org/blog/what-if-my-competitors-point-spammy-links-to-my-site - just a few weeks back.
This was covered in this SEOmoz blog post by Rand https://www.seomoz.org/blog/what-if-my-competitors-point-spammy-links-to-my-site (What if My Competitors Point Spammy Links to My Site?)
My epic fail was this in the robots.txt during the first month of an SEO contest...
User-agent: *
Disallow: /
Fortunately, we won in the end :P
I did a test with 20 sites I own around 1 year ago to see if XML sitemaps are really beneficial. This included sites from small to large, and overall I think that an XML sitemap is something that is needed 100%!!
Recommending H1 tags is also something I am guilty of. I think they are beneficial, yet only to a small extent; it's similar to meta data: is it advisable? Yes. Is it crucial? No.
Paid links are never a good strategy; it burns to see SEOs still using them =(
Hey Rand. Great WBF! I'm always impressed by you guys' ability to be humble and admit your mistakes - even though you're probably the foremost experts on the subject. Much appreciated!
Some of the mistakes you outlined are definitely ones I've made. I ventured into the reciprocal/nofollow thing last year when I was working under another SEO - never felt right about it. You can always justify it with "well, it's their fault for not looking," but at the end of the day it's just bad practice and not worth the effort.
I'm moving all our clients away from that kind of go-out-and-email-for-links approach and more toward content-based approaches. More work, but far better rewards.
Also just signed up for ProTraining! Looking forward to catching up with you and the team!
@SEOmoz
Can you run us all through SEOmoz's use of sitemaps? Noticed there isn't one in the normal spot on www.SEOmoz.com. Just curious.
We use a sitemap but compress it with gzip, so it has a slightly different URL than simply .xml. This is a common practice (and resulting URL), and it's actually a standard place Google looks for sitemaps.
You can see its URL declared in our robots.txt.
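For anyone hunting for it, a gzipped sitemap is declared in robots.txt with the standard Sitemap: directive; the filename below is illustrative, not necessarily the actual one:

```
User-agent: *
Sitemap: http://www.seomoz.org/sitemap.xml.gz
```

Engines that support the sitemaps.org protocol pick it up from there automatically.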
That's cause their domain is SEOmoz.ORG!
LOL, never mind Rand et al, we've all been there. My tip (back in the early 2000s): don't use noscript to hide a bunch of H1-wrapped links - it gets you banned for 3 months lol.
And for those who know me here, I started my new job as SEO Manager at www.blueclaw.co.uk this morning... good times ahead.
Congratulations Mark! I wish you exciting and rewarding experiences.
Your new co has a nicely done website. Small point but y'all might consider an "about" link either up top or in the footer.
It took me quite a while of hunting before I found the "Meet the team" link buried in the sidebar. IMHO of course ;-P
We learn more from mistakes than successes, and being able to admit them is huge. Great post summarizing what did not work....
Also, thanks for confirming the value of XML sitemaps. I've submitted those a few times recently and am waiting to see index results. Rand's comment that they do a great job of getting sites indexed was music to my ears.
I've got one for you:
Recently, my website underwent a major overhaul after being in operation for approximately a year. A URL was later found to have 100 backlinks to it - one hundred - and was returning a 404. We had done a press release about a product launch, and it linked to a page that didn't make the cut in the remake. Yikes.
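The usual fix (a hypothetical Apache sketch, not the commenter's actual config) is a 301, so those hundred links flow to the closest live page instead of a 404:

```
# .htaccess – permanently redirect the dead press-release target
Redirect 301 /old-product-page http://www.example.com/products/current-page
```

A 301 tells engines the move is permanent, so most of the link value is passed along.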
I think that everybody hides some "black to grey" secrets in his SEO wardrobe... mostly because SEO was (and still is) such an experience-based profession that - and I think I am telling the truth - we have all experimented with any technique that looked good for achieving our objectives, especially in our beginnings.
If I look back at my first steps in the SEO world, I can easily see how many stupid things I did (for sure, the reciprocal link "scam" was something I did too, creating the link page in a folder blocked via robots.txt).
The most stupid ones were almost all related to the "optimization" of Flash websites, using almost-cloaking techniques in order to make the content of the website visible - content that was, obviously, fantastically optimized with every possible and supposedly useful HTML tag.
Ah... the other great urban legend I got hypnotized by was the "keyword density percentage"... so much that I found myself doing very weird calculations in order to achieve the perfect formula: I looked like a necromancer ;)
Say it ain't so, Joe! I can't imagine you doing anything bad, G. I always assumed that not only your hat was white, but your trousers and pants too!... wait a minute, I'm looking at your avatar a little closer and it's now looking like you've got a grey hat on!
AH! If you look better you will see its color is olive green and not grey.
And, about white... I'd love to be like Gandalf the White, but I still have not definitively beaten the Google-Balrog.
The reciprocal link checker I wrote actually goes and does all the work... I pull the meta tags, check for nofollow, and then I even built PHP to parse and apply robots.txt rules if I detect one. It costs a couple more calls to make sure that the link isn't blocked in some way, but I think it's worth it, because now you don't have those grey hat smarties making a link look real when it isn't. I really built it for others who have relationships where the link has to be monitored to make sure the partner isn't going to "cheat" and remove it later. I don't like links I have to monitor to make sure they're still there - I'd rather have a relationship and know the link will stay - but it's useful when you want to know if a page is passing juice. I'm working on converting the robots.txt parser to JavaScript so I can include it in the next update of SEO Site Tools... but I've got crazy scope creep problems because I keep trying to roll more functionality into the tool...
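For anyone building a similar checker, the robots.txt-parsing step can be sketched with Python's standard library (the commenter used PHP; the domain and paths here are hypothetical):

```python
from urllib import robotparser

# Hypothetical robots.txt served by a "reciprocal" link partner
# who blocks crawlers from the page that hosts the links.
robots_txt = """\
User-agent: *
Disallow: /links/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# The link lives on a blocked page, so it passes no value.
print(rp.can_fetch("*", "http://example.com/links/partners.html"))  # False
# A link from an unblocked page would be fine.
print(rp.can_fetch("*", "http://example.com/about.html"))           # True
```

A real checker would also fetch the linking page and inspect meta robots and rel="nofollow", as the commenter describes.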
The other tool I'm building uses the Webmaster Tools data feeds to pull sitemaps and internal links and check the cache date for each page (essentially telling us what made it into the index and what needs some more link love). This is having some trouble on larger sites... I'm trying to figure out how to distribute the requests so you don't hit rate limits, much the way I built my snowshoe function to distribute PR queries and avoid issues with pulling those metrics.
hahahaha between the headline and Rand's face in the opening of the video, I couldn't help but watch it :P
A lot of good advice. I like how open SEOmoz has been with their SEO mistakes, pitfalls, etc. Thanks!
Thanks a lot for sharing these Rand (and others). I really like these sorts of posts and also like the conversation it generates afterwards!
Definitely look forward to hearing more mistakes and particularly like the ones from "before my time" in the SEO game... it's always crazy to hear just how much things have changed in such a short period of time.
Rand! I couldn't believe my eyes. ;) But the first blackhat technique was really smart, and I had never thought such a thing was possible before. lol.. I agree, I have also been a part of those XML sitemap mistakes before.
Good one Rand!
Good advice as always and refreshing to see the 'mistakes' mentioned. We all make them and that's how we learn. Hopefully, on our own projects and not our clients though!
I am not sure about H1s on pages... But I will keep track of this question... Thanks for the post.
re: Buying Links for Clients
Some confusion here: how would Google know that a link is from a paid site? Is the logic that the page you are getting a link from should not have lots of third-party links? Let's say we have a few domains, some in the same market and some in others, and we put our keywords in the sites to get backlinks. Will these domains or links hurt our rankings as well? And what options are left for themed links?
regards
Kashyap
Greatttt Post! Hilarious humility.
The nature of SEO means keeping pace with the search engines with intuition followed by science. It's a good lesson that most of your mistakes were corrected by hard data and facts, and you passed this knowledge onto the rest of the world. You deserve credit because not only did you learn from these mistakes, but everyone else did too.
Like I said, great post. Whoever gave you the idea must be a genius.
You know - I thought it had just come up off the top of my head, but looking at your comment, I'm guessing that's where the seed entered my brain.
Thanks a ton for the idea - much appreciated!
I have always used XML sitemaps, and I still use the H1 tag and add keywords because it is recommended by W3.ORG.
Hehe. I must admit that 'redirecting Netscape to IE' made me giggle a bit :-)
So why is H1 still used in the on-page reports?
Wow, a huge sense of nostalgia - can't believe the intro graphics & music. It's huge to see how much Moz has developed in 5 years.
It's refreshing to see a big SEO admit to mistakes; it's inherent in the nature of what we do, and we can only ever work with the best of our experience. It's just important to make sure the client knows this as well. :)
Danny, where is the code to block only the outbound links via robots.txt?
I would suggest, then, removing the H1, H2, H3 criteria from the SEOmoz tools.
Great topics!
It is always great to learn from others' mistakes :)
Thanks for making them and keep it up!
Thanks for the post. It's always comforting to know even the experts make mistakes.
The H1 / keywords thing is interesting. You see a lot of people promote this as a basic SEO technique.
I think this is an interesting topic/choice by Rand. I still think that telling someone to use an H1/header that includes their keywords is perfectly valuable, even if the markup being an H1 specifically is less important.
It's an easy way to explain to a client how to get those keywords bolded and towards the top of the page, so even if it is no MORE effective, it still seems an easier way to get uptake!
*This all assumes whoever designed the site has been logical with the style sheets and the H1 is bold and comes near the top of the page :)
Anyway, if your pages are relevant, the heading (H1) should be relevant and contain keywords too.
Making a site accessible already helps the SEO and respects best practices.
I don't fully agree with the H1 statement. As the other posters have said, if the keywords are relevant to the page, then the H1 should have them in there.
This may not have the advantages for SEO that we once all thought, but I still don't think it's a waste of time, even for best practices alone.
Better than going back to the old "Welcome to My Site" H1.
Good to see the others in there though. No Follow + Reciprocal links!! Dirty ha ha
I think the main point about the H1 tag that we all seem to agree on is that it is still relevant for usability, but no longer (if it ever was, for that matter) relevant for SEO.
That's exactly right. It's good for users (which is a priority), but it is not very useful from a pure rankings standpoint.
I still implement them, I just do it for reasons other than rankings.
I think it's a great idea to post your mistakes because newbies to SEO can get off on the right foot and not waste a bunch of valuable time doing the things that won't get them the results they want.
Cool post. It's always fun to read how things began. Especially for those of us who haven't been working in this field for such a long time!
Good post. I would not consider these 'mistakes' but rather 'learning experiences'. We've all tried all kinds of techniques to stay ahead of the competition at times with success and at times failing miserably. Nevertheless it is a comforting reminder that we've all gone through these experiences.
One more mistake: not turning up for the webinar, with no prior notice.
Topic: PRO Webinar - Viral Content: Strategies & Real Life Examples Host: SEOmoz Events
Yeah - that was a really bad technical mishap we made with the Webex software. We're going to be sending out an email ASAP with a new time to watch the webinar for those who missed it. Not quite an "SEO" mistake, but definitely one we feel terrible about.
Hi Rand,
This was the first time I was going to attend your webinar. I hurried things up to be there, but was very disappointed.
Will look forward to the next one.
Thanks,
Preet
Ugh - really sorry. I promise the content is worth making it to the next one. Viral stuff is super interesting to talk/examine/learn about.
@Rand
On my very first attempt at SEO, I added a hell of a lot of keyword-rich content to the site, along with using - display:none; - and bang! Google banned our site. Very silly, but I was not aware of it. :P
I'm actually doing SEO on a site currently, and I was planning to do an H1 revamp on the pages. Now I'm perplexed: should I do it or not? At least I'm clear that it is not gonna help me in terms of SEO.
Google recently added a new sitemap format - are you gonna recommend using it, or do you think there is a problem with it?
https://googlewebmastercentral.blogspot.com/2010/06/sitemaps-one-file-many-content-types.html
I'm aware that this sitemap format was developed independently by Google, and Microsoft and Yahoo have not given their consent to it yet.
What if I still care about the Yahoo and Bing traffic?
And in your video you said you actually test the effect of the changes you make on a site. Do you use any tools for that, or do you/SEOmoz actually own a test site where you fiddle around with SEO changes?
Obviously I'm afraid of every SEO change I make after Google banned our site once.
Asim
How refreshingly honest to admit your mistakes.
A nice little recap on the basics.
I've found that headings are commonly used for the homepage link/logo and generic navigation. I'd agree with suggestions that headers are no longer a ranking factor because of how inconsistently they are used.
However, I'll continue to recommend them because it is best practice and the use of descriptive headings to break up content provides a better user experience.
Someone may have asked for this already, but I'd really like a little more info on what you did in point 5. Could we see a few actual examples?
Sure - so here's a URL that previously existed for a Linkscape report:
https://www.seomoz.org/linkscape/intel/basic?uri=www.iditarodblogs.com%2F
And here's what it redirects to on OSE:
https://www.opensiteexplorer.org/www.iditarodblogs.com%252F//a!links!!ref!linkscape?anything%5B%5D=www.iditarodblogs.com%2F&anything%5B%5D=&anything%5B%5D=a%21links%21%21ref%21linkscape
And the rel canonical tag now points it to this version:
https://www.opensiteexplorer.org/www.iditarodblogs.com%252F/a!links
If you're a PRO member, you can still access the classic Linkscape reports, but other folks (and engines) are redirected to OSE.
Way to go on the public confession and even further about actually learning from your mistakes. I know I wouldn't be nearly where I am today if I didn't fail miserably many times over before. Thanks for the insight given here. Funny enough I am just getting ready to move a rather large section of site so the last one struck a chord with me. Think I'll double check a few things on the dev end to make sure we get it right.
Thanks for sharing your "dark" moments.
I agree to some extent, but in terms of H1 tagging, I do believe that H1 tags still give some extra credit in the SERPs.
As for buying links: it's stupid if your links are irrelevant or unstable, with poor PR value and a poor number of backlinks... but if you are getting them from a valuable website, then you will definitely get some extra credit for that.
Not extra credit - it's still a ranking factor, though around 5 or 6 on the list... BTW, this needs updating, guys, to reflect that:
https://www.seomoz.org/article/search-ranking-factors#ranking-factors
Curious as to why Term Target still wants H1 tags when they're no longer relevant?
The truth is that we've built a new version inside the web app launching in August, but haven't yet revved the existing Term Target. I hope to have that accomplished soon after the August launch, though.
Well, just to play Devil's Advocate - the H1 is utterly relevant!
Just because the ranking correlation is no different to putting the same text in the same place but using a different method (e.g. a paragraph tag with bold and font-size 200%) doesn't mean that you shouldn't use the H1 - it really is no reason to abandon semantic mark-up.
The whole "white hat" approach that you have shifted to over the years (thank goodness!) has included more and more best-practices that are not directly 'traditional' SEO, but rather focusing on wider industry best practices such as writing for the user, not for the search engine (e.g. writing well formed title tags instead of just stuffing them with keywords).
The H1 tag is incredibly important - it is semantic HTML, and as we know, the Web is all about semantics. Headings in general are vital for accessibility, and the H1 is sometimes used instead of the title when creating auto links on certain sites.
I'm not saying that what you have said here is wrong, just that it can be taken out of context way too easily: successful websites are about so much more than doing your SEO math, and we shouldn't look too closely at them through just one lens. The problem with saying the H1 is not important is that a great many readers may misinterpret that and suffer as a result.
Websites require a multi-disciplinary approach and sometimes we need to step back to look at the bigger picture.
[oh and brilliant post!! - nice to see someone not afraid to admit their mistakes and help others learn from them - I take my hat off to you]
So, as a newbie, I am paying monthly for irrelevant information???? Is there anything else that has changed that your tools do not reflect?
Not to flame your comment, but I personally feel it was very disrespectful and ignorant of you to say that, especially being newer. Now, I'm not saying that you shouldn't participate or comment, but I know when I was new, I definitely spent more time listening and reading than I did taking part in the discussions. I'll admit I am not a newbie and understand some of these concepts better than the average internet user, but you also have to understand that SEO is certainly not rocket science, and you certainly don't need to be an expert to do it well (it just helps).
#1. You DON'T HAVE TO PAY anything to learn SEO. The information is out there, and if you know how to use Google, you have everything you need at your fingertips. Keep in mind though, if you don't pay to learn it from a reputable source, like SEOmoz, then you will most likely make many of the same mistakes mentioned here, probably more. Learning anything new requires trial and error, especially with something like SEO that changes constantly and doesn't have consistent standards across the board. I would be willing to bet that every single SEO has made a mistake at one time or another, and that's also what has helped to grow our industry to what it is today.
#2. You should EXPECT that information you learn about SEO, now or in the future, will change. It always has, and it will most likely continue to. No one, including the superhumans at SEOmoz, should be expected to go back and update every tidbit of information on a blog as big as this one. That is the point of dates on blog posts, so you know when the information was published. Now, I do agree that premium content or tools should reflect current SEO strategies, but SEOmoz has always been good at staying on top of those changes and reflecting them in their premium content and tools. Your comment is exactly the reason why most companies don't post things like this publicly, which is why I'm glad SEOmoz does.
#3. A FEW PIECES of outdated SEO info here and there is no reason to question your membership here at SEOmoz. I am not a premium member, but I used to be, and the advantages / benefits of their program are very clear in my opinion. There is nowhere else that you can get all the tools, access to premium content, QandA, the community, training seminars, etc. all in the same place. Not to mention, you are paying to participate with some of the best and brightest people in the business. To me, having resources like that at my disposal is pretty invaluable. They could charge double and it would still be worth it (don't blame me if they raise their prices now - j/k).
#4. Based on your comment, I would think you have more of a reason than most to learn this stuff. That way, you won't have to rely so much on what others are saying and you'll be more capable of coming to your own conclusions about the topics being discussed. I was the same way in the beginning until I started having my own opinions and started trying things out for myself (other than the basics).
#5. No single SEO tool exists that is completely up to date with current strategies (since they can change daily) and there is no tool in existence that is meant to be an end all to SEO. If it were as easy as going to a few tools and doing what they say, it would exist, and none of us would have a job in SEO. It takes work, real work, and a human touch to do SEO correctly, and tools should only be used as an aid to measure, track or report.
#6. In the future, if you find yourself questioning whether or not something is valid for you in regards to SEO, you should do one of the following: Go directly to the SEOmoz QandA and ask tons of questions, then listen and follow through. Participate in forum and blog discussions and ask lots of questions, then listen and follow through. Attend industry events and network with as many experts as you can and ask lots of questions, then listen and follow through.
Sorry if I misread your comment, but it felt to me like you were missing the point of the post, and the fact that SEOmoz is open and transparent about what goes on around their company (which is a good thing).
If you believe in the devaluation of a second text link to any internal page, where does CSS Image Replacement for your logo rank?
I don't think you'd argue that seomoz.org is the keyword you most want to rank for?
https://www.latentmotion.com/css-image-replacement-mistakes/
Thanks for sharing, many wouldn't discuss their past mistakes so openly, well done.
You forgot to mention rel=nofollow and how strongly you defended it for site sculpting/siloing. Another big mistake: you let people write tons of stupid things in YouMoz, and without a strong editorial check it went out to the main site and pissed off a lot of people (or gave bad advice to some who believed it came from SEOmoz).
Hmm... certainly appreciate the feedback. It would be great (and very TAGFEE) if, in the future, you could give some specific examples so we know how best to improve. Happy to respond to these, though:
Hope that helps to clarify!
Sorry for #2, YouMoz :) it was provocative on my part :)
The rel=nofollow thing is very sensitive. I know that just removing them is not good; I saw some guys do it and regret it (they even put them back, with positive impact)... I took a different tactic with rel=nofollow links: I simply don't use an A tag, and some link functionality can be achieved with JavaScript if you want to allow people to 'click' elements and go to some other page.
I promise I will not provoke any more.
No worries at all! We love to be challenged - I think the best comments and conversations happen that way.
Also - glad to hear we're not the only ones who've observed the weird behavior of nofollows, even after Google's supposed change. I wish reality matched better to their messaging - it makes it hard to know when we're getting real information vs. spin.
I have a couple of sites where I gained tons of additional traffic after removal of nofollow.
Could you give us more details? I think many here would be interested to hear your story in detail.
Yes, me for example :-)
Are you talking about nofollow for page sculpting? Just on internal pages or linking to other domains?
Matt Cutts mentioned a few times that linking to reputable sources "can" help your rankings. I think it was in one of his videos. I'll see if I can find it.
The problem I see with sculpting on your own domain is that we don't know how the engines react to nofollows. They see a nofollow on your site linking to another page... does that mean you can't vouch for that content? Does it diminish the real PageRank of the site? We don't know for sure; G does a lot that they don't tell us.