I'm currently working on re-authoring and re-building the Beginner's Guide to Search Engine Optimization, section by section. You can read more about this project here.
Common Myths & Misconceptions About Search Engines
--------------------
Unfortunately, over the past 12 years, a great number of misconceptions have emerged about how the search engines operate and what's required to perform effectively. In this section, we'll cover the most common of these, and explain the "real story" behind the myth.
Search Engine Submission
In classical SEO times (the late 1990s), search engines had "submission" forms that were part of the optimization process. Webmasters & site owners would tag their sites & pages with information (sometimes even including the keywords they wanted to rank for) and "submit" them to the engines, after which a bot would crawl and include those resources in the index. For obvious reasons (manipulation, reliance on submitters, etc.), this practice was unscalable and eventually gave way to purely crawl-based engines. Since 2001-2002, search engine submission has not only been unnecessary, it's been virtually useless. The engines have all publicly noted that they rarely use the "submission" URL lists, and that the best practice is to earn links from other sites, as this will expose the engines to your content naturally.
You can still see submission pages (for Yahoo!, Google, MSN/Live), but these are remnants of time long past, and are essentially useless to the practice of modern SEO. If you hear a pitch from an SEO offering "search engine submission" services, run, don't walk. Even if the engines did use the submission service to crawl your site, you'd be very unlikely to earn enough "link juice" to be included in their indices or rank competitively for search queries.
Meta Tags
Once upon a time, much like search engine submission, meta tags (in particular, the meta keywords tag) were an important part of the SEO process. You would include the keywords you wanted your site to rank for, and when users typed in those terms, your page could appear in the results for those queries. This process was quickly spammed to death, and today, only Yahoo! among the major engines will even index content from the meta keywords tag, and even they claim not to use those terms for ranking, but merely for content discovery. This subject has been covered in great detail by Danny Sullivan of Search Engine Land.
It is true that other meta tags, namely the title tag and meta description tag (which we've covered previously in this guide), are of critical importance to SEO best practices. And, certainly, the meta robots tag is an important tool for controlling spider access. However, SEO is not "all about meta tags"; at least, not anymore.
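Since the meta robots tag is the one meta tag that directly controls spider access, it's worth seeing how mechanically simple it is for a crawler to read. Below is a minimal, hypothetical sketch in Python; the class and function names are our own invention, not any engine's actual code:

```python
from html.parser import HTMLParser

class MetaRobotsParser(HTMLParser):
    """Collects directives from any <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            content = attrs.get("content", "")
            self.directives += [d.strip().lower() for d in content.split(",") if d.strip()]

def robots_directives(html):
    """Return the list of robots directives found in the page's markup."""
    parser = MetaRobotsParser()
    parser.feed(html)
    return parser.directives

page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
# robots_directives(page) → ["noindex", "nofollow"]
```

A page with no robots meta tag simply yields an empty list, which crawlers treat as "index, follow" by default.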
Keyword Stuffing & Keyword Density
Not surprisingly, a persistent myth in SEO revolves around the concept that keyword density - a mathematical formula that divides the number of instances of a given keyword by the total number of words on a page - is used by the search engines for relevancy & ranking calculations and should therefore be a focus of SEO efforts. Despite being proven untrue time and again, this farce has legs, and indeed, many SEO tools feed on the concept that keyword density is an important metric. It's not. Ignore it and use keywords intelligently and with usability in mind. The value from an extra 10 instances of your keyword on the page is far less than earning one good editorial link from a source that doesn't think you're a search spammer.
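To make the formula concrete, keyword density is simply the count of a term divided by the page's total word count - which is exactly why it's such a crude metric. A minimal illustration in Python (the sample text and function name are ours; this only handles single-word keywords):

```python
import re

def keyword_density(text, keyword):
    """Keyword density: occurrences of the keyword divided by total words."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    count = sum(1 for w in words if w == keyword.lower())
    return count / len(words)

sample = "Cheap widgets! Buy cheap widgets here, the home of cheap widgets."
# keyword_density(sample, "cheap") → 3/11 ≈ 0.27
```

Note that the number says nothing about where the keyword appears, how readable the copy is, or whether anyone would link to it - the things that actually matter.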
Paid Search Helps Bolster Organic Results
Put on your tin foil hats; it's time for the most common SEO conspiracy theory - that upping your PPC spend will improve your organic SEO rankings (or, likewise, that lowering that spend can cause ranking drops). In all the cases I've witnessed or heard about, this has never been proven, nor has it ever been a probable explanation for effects in the organic results. Google, Yahoo! & MSN/Live all have very effective Chinese Walls in their organizations to prevent precisely this type of crossover. At Google in particular, advertisers spending tens of millions of dollars each month have noted that even they cannot get special access or consideration from the search quality or web spam teams. So long as the existing barriers are in place and the search engines' cultures maintain their separation, I believe that this will remain a myth.
Search Engine Spam
--------------------
The practice of spamming the search engines - creating pages and schemes designed to artificially inflate rankings or abuse the ranking algorithms employed to sort content - has been rising since the mid-1990's. With payouts so high (at one point, a fellow SEO noted to me that a single day ranking atop Google's search results for the query "buy viagra" could bring upwards of $20,000 in affiliate revenue), it's little wonder that manipulating the engines is such a popular activity on the web. However, it's become increasingly difficult and, in my opinion, less and less worthwhile for two reasons.
First, search engines have learned that users hate spam. This may seem a trivial and obvious lesson, but in fact, many who study the field of search from a macro perspective believe that along with improved relevancy, Google's greatest product advantage over the last 10 years has been their ability to control and remove spam better than their competitors. While it's hard to say if this directly influenced their dramatic rise to lead in market share worldwide, it's undoubtedly something all the engines spend a great deal of time, effort and resources on - and with hundreds of the world's smartest engineers dedicated to fighting the practice, I'm loath to ever recommend search spam as a winnable endeavor in the long term.
Second, search engines have done a remarkable job identifying scalable, intelligent methodologies for fighting manipulation and making it dramatically more difficult to adversely impact their intended algorithms. Concepts like TrustRank (which SEOmoz's Linkscape index leverages), HITS, statistical analysis, historical data and more, along with specific implementations like the Google Sandbox, penalties for directories, reduction of value for paid links, combating footer links, etc. have all driven down the value of search spam and made so-called "white hat" tactics (those that don't violate the search engines' guidelines) far more attractive.
This guide is not intended to show off specific spam tactics (either those that no longer work or are still practiced), but, due to the large number of sites that get penalized, banned or flagged and seek help, we will cover the various factors the engines use to identify spam so as to help SEO practitioners avoid problems. For additional details about spam from the engines, see Google's Webmaster Guidelines, Yahoo!'s Search Content Quality Guidelines & MSN/Live's Guidelines for Successful Indexing.
Page-Level Spam Analysis
--------------------
Search engines perform spam analysis across individual pages and entire websites (domains). We'll look first at how they evaluate manipulative practices on the URL level.
Keyword Usage
One of the most obvious and unfortunate spamming techniques, keyword stuffing, involves littering numerous repetitions of keyword terms or phrases into a page in order to make it appear more relevant to the search engines. The thought behind this - that increasing the number of times a term is mentioned can considerably boost a page's ranking - is generally folly. Studies looking at thousands of the top search results across different queries have found that keyword repetitions (or keyword density) appear to play an extremely limited role in boosting rankings, and have a low overall correlation with top placement.
The engines have very obvious and effective ways of fighting this. Scanning a page for stuffed keywords is not massively challenging, and the engines' algorithms are all up to the task. You can read more about this practice, and Google's views on the subject, in a blog post from the head of their web spam team - SEO Tip: Avoid Keyword Stuffing.
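As a toy illustration of why this detection isn't hard: an algorithm only needs to notice when a single term's share of the page is a statistical outlier compared to natural writing. Here is a crude, hypothetical sketch - the threshold, stopword list and function name are all our own choices, not anything an engine has published:

```python
import re
from collections import Counter

# A tiny stopword list for the sketch; real systems use far larger ones.
STOPWORDS = {"the", "a", "of", "and", "to", "in", "for", "here"}

def looks_stuffed(text, threshold=0.3):
    """Flag a page when any single non-stopword accounts for more than
    `threshold` of its words - a crude proxy for keyword stuffing."""
    words = [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOPWORDS]
    if not words:
        return False
    _, top_count = Counter(words).most_common(1)[0]
    return top_count / len(words) > threshold

# looks_stuffed("buy viagra buy viagra buy viagra cheap") → True
# looks_stuffed("this page describes our company history and our mission") → False
```

Production systems obviously layer in many more signals (term positions, markup, comparison against language models), but the core observation - stuffed pages don't look like human writing - is that simple.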
Manipulative Linking
One of the most popular forms of web spam, manipulative link acquisition relies on the search engines' use of link popularity in their ranking algorithms to attempt to artificially inflate these metrics and improve visibility. This is one of the most difficult forms of spamming for the search engines to overcome because it can come in so many forms. A few of the many ways manipulative links can appear include:
- Reciprocal link exchange programs, wherein sites create link pages that point back and forth to one another in an attempt to inflate link popularity. The engines are very good at spotting and devaluing these as they fit a very particular pattern. See this post for more about reciprocal links.
- Incestuous or self-referential links, including "link farms" and "link networks" where fake or low value websites are built or maintained purely as link sources to artificially inflate popularity. The engines combat these through numerous methods of detecting connections between site registrations, link overlap or other common factors.
- Paid links, where those seeking to earn higher rankings buy links from sites and pages willing to place a link in exchange for funds. These sometimes evolve into larger networks of link buyers and sellers, and although the engines work hard to stop them (and Google in particular has taken dramatic actions), they persist in providing value to many buyers & sellers (see this post on paid links for more on that perspective and this post from Search Engine Land on the official word from Google & other engines).
- Low quality directory links are a frequent source of manipulation for many in the SEO field. A large number of pay-for-placement web directories exist to serve this market and pass themselves off as legitimate with varying degrees of success. Google often takes action against these sites by removing the PageRank score from the toolbar (or reducing it dramatically), but won't do this in all cases.
There are many more manipulative link building tactics that the search engines have identified and, in most cases, found algorithmic methods of reducing their impact. As new spam systems (like this new reciprocal link cloaking scheme uncovered by Avvo Marketing Manager Conrad Saam) emerge, engineers will continue to fight them with targeted algorithms, human reviews and the collection of spam reports from webmasters & SEOs.
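The "very particular pattern" that reciprocal exchange programs leave behind is easy to picture: pairs of sites in the link graph that each link to the other. A toy sketch of spotting that footprint (the data structures are ours, not how any engine actually stores its link graph):

```python
def reciprocal_pairs(links):
    """Given (source, target) link edges, return the site pairs that link
    to each other - the mechanical footprint reciprocal link exchange
    programs leave in the link graph."""
    linkset = set(links)
    return {frozenset((a, b)) for (a, b) in linkset
            if (b, a) in linkset and a != b}

edges = [("siteA", "siteB"), ("siteB", "siteA"), ("siteA", "siteC")]
# reciprocal_pairs(edges) → {frozenset({"siteA", "siteB"})}
```

Real detection layers in the other signals the post mentions - shared registration data, link overlap, and so on - since a single reciprocal pair is often perfectly innocent.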
Cloaking
A basic tenet of all the search engine guidelines is to show the same content to the engine's crawlers that you'd show to an ordinary visitor. When this guideline is broken, the engines call it "cloaking" and take action to prevent these pages from ranking in their results. Cloaking can be accomplished in any number of ways and for a variety of reasons, both positive and negative. In some cases, the engines may let practices that are technically "cloaking" pass, as they're done for positive user experience reasons. For more on the subject of cloaking and the levels of risk associated with various tactics and intents, see this post, White Hat Cloaking, from SEOmoz.
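One conceptually simple check an engine can run: fetch a page twice, once identifying as a crawler and once as a browser, and compare what comes back. Here's a hedged sketch of the comparison step only (the function names and the crude tag-stripping are ours; real systems are far more sophisticated about rendering and dynamic content):

```python
import re

def _tokens(html):
    """Crudely strip markup and return the set of words on the page."""
    text = re.sub(r"<[^>]+>", " ", html)
    return set(re.findall(r"[a-z]+", text.lower()))

def cloaking_score(crawler_html, browser_html):
    """Jaccard distance between the word sets of the two responses:
    0.0 = identical vocabulary, 1.0 = completely different pages."""
    a, b = _tokens(crawler_html), _tokens(browser_html)
    if not a and not b:
        return 0.0
    return 1 - len(a & b) / len(a | b)

same = "<p>welcome to our store</p>"
diff = "<p>buy cheap pills now</p>"
# cloaking_score(same, same) → 0.0;  cloaking_score(same, diff) → 1.0
```

A high score doesn't prove bad intent (the post notes legitimate reasons pages differ), but it tells an engine which pages deserve a closer look.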
"Low Value" Pages
Although it may not technically be considered "web spam," the engines all have guidelines and methodologies to determine if a page provides unique content and "value" to its searchers before including it in their web indices and search results. The most commonly filtered types of pages are affiliate content (pages whose material is used on dozens or hundreds of other sites promoting the same product/service), duplicate content (pages whose content is a copy of or extremely similar to other pages already in the index), and dynamically generated content pages that provide very little unique text or value (this frequently occurs on pages where the same products/services are described for many different geographies with little content segmentation). The engines are generally against including these pages and use a variety of content and link analysis algorithms to filter out "low value" pages from appearing in the results.
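For the duplicate-content case in particular, one well-known family of techniques is shingling: comparing the sets of overlapping word n-grams two pages share, as in Broder's 1997 resemblance measure. A simplified sketch (the shingle size and function names are our own choices):

```python
import re

def shingles(text, n=3):
    """All n-word shingles (overlapping word n-grams) of the text."""
    words = re.findall(r"[a-z']+", text.lower())
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def resemblance(a, b, n=3):
    """Jaccard resemblance of two documents' shingle sets:
    1.0 for identical text, near 0.0 for unrelated text."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

Near-duplicates (say, the same product description with a city name swapped in) share most of their shingles, which is exactly what makes thin, geography-swapped pages so easy to filter.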
Domain Level Spam Analysis
--------------------
In addition to watching individual pages for spam, engines can also identify traits and properties across entire root domains or subdomains that could flag them as spam signals. Obviously, excluding entire domains is tricky business, but it's also much more practical in cases where greater scalability is required.
Linking Practices
Just as with individual pages, the engines can monitor the kinds of links and quality of referrals sent to a website. Sites that clearly engage in the manipulative activities described above in a consistent or seriously impactful way may see their search traffic suffer, or even have their sites banned from the index. You can read about some examples of this from past posts - Widgetbait Gone Wild, What Makes a Good Directory and Why Google Penalized Dozens of Bad Ones, Google's Sandbox Still Exists: Exemplified by Grader.com, and How to Handle a Google Penalty - And, an Example from the Field of Real Estate.
Trustworthiness
Websites that earn trusted status are often treated differently from those who have not. In fact, many SEOs have commented on the "double standards" that exist for judging "big brand" and high importance sites vs. newer, independent sites. For the search engines, trust most likely has a lot to do with the links your domain has earned (see these videos on Using Trust Rank to Guide Your Link Building and How the Link Graph Works for more). Thus, if you publish low quality, duplicate content on your personal blog, then buy several links from spammy directories, you're likely to encounter considerable ranking problems. However, if you were to post that same content to a page on Wikipedia and get those same spammy links to point to that URL, it would likely still rank tremendously well - such is the power of domain trust & authority.
Trust built through links is also a great methodology for the engines to employ in considering new domains and analyzing the activities of a site. A little duplicate content and a few suspicious links are far more likely to be overlooked if your site has earned hundreds of links from high quality, editorial sources like CNN.com, LII.org, Cornell.edu, and similarly reputable players. On the flip side, if you have yet to earn high quality links, judgments may be far stricter from an algorithmic view.
Content Value
Just as a page's value is judged against criteria such as uniqueness and the experience it provides to search visitors, so too is this principle applied to entire domains. Sites that primarily serve non-unique, non-valuable content may find themselves unable to rank, even if classic on-page and off-page factors are handled acceptably. The engines simply don't want thousands of copies of Wikipedia or Amazon affiliate websites clouding up their index, and thus employ algorithmic and manual review methods to prevent this.
Penalty Signs & Re-Inclusion Procedures
--------------------
How to Know If Your Site's Been Penalized
It can be tough to know whether your site/page actually has a penalty, or whether something has changed (either in the search engines' algorithms or on your site) that negatively impacted rankings or inclusion. Before you assume a penalty, check for:
- Errors on your site that may have inhibited or prevented crawling
- Changes to your site or pages that may have changed the way search engines view your content (on-page changes, internal link structure changes, content moves, etc.)
- Sites that share similar backlink profiles, and whether they've also lost rankings - when the engines update ranking algorithms, link valuation and importance can shift, causing ranking movements.
Once you've ruled these out, follow this flowchart for more specific advice:
While this chart's process won't work for every situation, the logic has been uncanny in helping us identify spam penalties or mistaken flagging for spam by the engines and separating those from basic ranking drops. This page from Google (and the embedded YouTube video) may also provide value on this topic.
Getting Penalties Lifted
The task of requesting re-consideration or re-inclusion in the engines is painful and often unsuccessful. It's also rarely accompanied by any feedback to let you know what happened or why. However, it is important to know what to do in the event of a penalty or banning - hence, the following recommendations:
- Make sure to thoroughly review the data in your Webmaster Tools accounts, from broken pages to server or crawl errors to warnings or spam alert messages. Very often, what's initially perceived as a mistaken spam penalty is, in fact, related to accessibility issues.
- Send your re-consideration/re-inclusion request through the engine's Webmaster Tools service rather than the public form - again, creating a greater trust layer and a better chance of hearing back.
- Full disclosure is critical to getting consideration. If you've been spamming, own up to everything you've done - links you've acquired, how you got them, who sold them to you, etc. The engines, particularly Google, want the details, as they'll apply this information to their algorithms for the future. Hold back, and they're likely to view you as dishonest, corrupt or simply incorrigible (and fail to ever respond).
- Remove/fix everything you can. If you've acquired bad links, try to get them taken down. If you've done any manipulation on your own site (over-optimized internal linking, keyword stuffing, etc.), get it off before you submit your request.
- Get ready to wait - responses can take weeks, even months, and re-inclusion itself, if it happens, is a lengthy process. Hundreds (maybe thousands) of sites are penalized every week, so you can imagine the backlog the webmaster teams encounter.
- If you run a large, powerful brand on the web, re-inclusion can be faster by going directly to an individual source at a conference or event. Engineers from all of the engines regularly participate in search industry conferences (SMX, SES, Pubcon, etc.), and the cost of a ticket can easily outweigh the value of being re-included more quickly than a standard request might take.
Be aware that with the search engines, lifting a penalty is not their obligation or responsibility. Legally (at least, so far), they have the right to include or reject any site/page for any reason (or no reason at all). Inclusion is a privilege, not a right, so be cautious and don't apply techniques you're unsure or skeptical of - or you could find yourself in a very rough spot.
I'm getting pretty excited here. Just one more section to go before the appendices, and then it's home free!
As always, feedback, corrections, suggestions and edits are greatly appreciated.
Great post with lots of valuable information. And while I agree that there is indeed a wall between paid and organic search results, I have an anecdote about a tiny gap in the wall. When I was working at iNest Realty, our most critical branding rule was that the "N" in "iNest" always had to be upper case and all other letters lower case. The issue was that "Inest" was often read as "incest". (We occasionally received some interesting referrals from X-rated sites that had fat-fingered us as their link-to address -- must have been sort of a disappointing experience for a pervert to get delivered to a real estate site - but I'm getting way off track from my original subject). If a visitor typed iNest into the Google search box, it returned the following: "Were you searching for incest?". This result was making us crazy. We complained to Google about this bitterly via every method we could find to contact them. Finally, and it took three requests, we were able to cajole our Adwords account representative into getting the change made. So, it is possible to penetrate the Chinese wall, but it probably helps if you can make a good case that doing so will improve the user experience.
Great example, Randy - thanks so much for sharing. I have found that just getting a message to the right folks on the search team that improves user experience will generally be taken seriously. Actually, one of the best ways to do that, IMO, is through the Google Webmaster Help forum - https://www.google.com/support/forum/p/Webmasters?hl=en
It seems that the best way of contacting to Google is the way how Aaron did it. https://www.huffingtonpost.com/aaron-greenspan/why-i-sued-google-and-won_b_172403.html
Holy crap! So I have to get mixed up with incest to get in touch with the Big G? I'm not sure I even want to now...
@Rand
You forgot to link to the post you mentioned in bullet 1 of the 'Manipulative Linking' section.
One myth I've come across a few times is the belief that search engines will only spider dynamic URLs with fewer than 3 variables. While it is obviously very beneficial to use static URLs, I have experience of using complex dynamic URLs that still work.
I just noticed that was the last major section - congratulations! Time to get this bad boy published already ;)
Slightly on the fringe of off-topic, but I think keyword density has become such a persistent myth because it not only used to be effective, but still matters to some extent. I think clients get confused when we start them out with keyword research and talk about how it's important to get the right keywords into titles, descriptions, headers, and content, but then go on and on about how keyword stuffing is bad.
Now, I know what we (the SEO community) mean, and we are right about this, but the distinction is sometimes hard to communicate. Yes, keywords are still important, and strategically placing them matters, but more isn't necessarily better, and there's such a thing as too much. It's a lot more nuanced position than the old-school "Put in your keyword exactly 13 times" approach.
Pete this is a good point.
Imagine a somewhat SEO-knowledgeable client who is having a conversation with his SEO:
SEO: Ok, we did all of the research and we think these are the keywords we should target to get you the greatest ROI (hands over list of keywords)
Client: Great, so what does that mean when you say "target".
SEO: Well, we want to let the search engines know that these are relevant pages to these search terms - so we're going to use these keywords in the titles of the pages, in the tags, in the tags, in the headers of the page and then we want to make sure we use the keywords in both internal and external links pointing to the page.
Client: What about the actual copy of the page, shouldn't the keywords go there?
SEO: (looks confused)... ummm no. Why would we put the keywords there? That doesn't matter at all.
---
As you can see from my little one-man play, knowledgeable clients might find this all a little confusing - since they a) know that search engines do indeed crawl and index the body of the page and b) have probably heard about keyword density.
I think its also important to note that if you insist on "stuffing" keywords or aiming for some set density (15.821%) that you try to use synonyms and such. The semantic web is here, use it to your advantage.
One thumb up for turning my comment into a 1-man play :)
That's just it - 0% vs. X% is a keyword-density issue. You wouldn't want to have zero keywords on your site, and we are strategically placing keywords, we just aren't "stuffing" them in the sense of thinking that it's only a numbers game. It's a distinction that's understandably hard for clients to grasp.
Next time I'll use puppets.
I know how you like the puppets. :-)
*pow, pow, pow* Shooting SEO sharks dead left and right.
So very, very useful. Thanks!
*pew pew pew*
Use my lazer gun it works better!
Submitting to search engines still has its place - case in point - I recently created a website that had virtually no competition for the keywords that would be used to find it. Also, I didn't have the time, nor the necessity, to do online marketing or SEO (promotion was word of mouth and offline channels).
So submitting to Google was a quick way to make sure the audience that I was targeting would be able to find the site without knowing the address.
2 minutes of work now has the site ranking number one of the crucial keywords.
Great info, thanks very much!
Nice post Rand. Very useful information. I have to say, considering that Google has a tool like Webmaster Central to communicate with webmasters, they really don't communicate that much information about penalties. I realize it's not their problem if your site loses its rankings, but they really should try to do a better job of letting you know if your site has been penalized.
Great article, takes a while to read but covers pretty much every basic thing you need to know to start seo.
Really Good!
Great post. One thing I noted when I first went to SEOmoz was the use of the Term Target tool. It seems to encourage keyword stuffing because it gives you a grade on your keyword terms... the number of times it appears.
I found Keyword Stuffing does work for local or niche sites. Probably because there's not going to be many links to your site. Anchor text on your internal links and sitemaps are also big on local sites too.
What are your thoughts on that?
Well, an A+ is still way, way below the threshold for keyword stuffing, so I certainly hope it's not encouraging that behavior! If anything, we wanted to avoid that by showing that there's no additional benefit (at least in the tool's grading) once you've used the keyword a few times in the body content. Mostly, it looks at whether you're missing it in key areas of opportunity like the title, H1, URL, meta description, body text, alt tags, etc.
Great flowchart and some hints, which will save me some steps when starting my next project. And I'm looking forward to the next Beginner's Part X: The top secret Hints - picked up fresh from Google-garden.
Just joking...
Very timely post, thank you, Rand.
Currently working with a client that was previously penalized because an SEO company promised great results...and used hidden text to attempt to do that.
sigh
Will be submitting a re-inclusion request once everything is cleaned up and pages optimized...
THANK YOU!
Helpful article - liked the flowchart.
I am a big fan of the Google Webmaster tools.
Just wondering whether keyword density makes a difference anymore with a search engine like Google?
yep that chart is useful, just used it to check a problem with one of my sites. Problem solved, just a drop in rankings phewwww.
The thing I find really annoying and struggle with is convincing people on a board of directors that the above is accurate. When they hear a pitch from an agency offering to submit the site, it just sounds nice, and while it can be explained to them over and over that this doesn't count, no progress ever comes from it. I think they're of the view that it won't do any harm, and it really does sound as though it should work.
Basic ethics of SEO being well Re-written. Good starters guide for SEO's, with simple explanations.
I Recently joined Seomoz,
What I most like is reading your posts, which I do regularly - especially those concerning quality search.
I think some of these things still work, but the problem is the competition. These things used to work very effectively because fewer people were doing them and the crowd wasn't that big yet... but not anymore.
But we can all be creative with ways to promote our site right?
By the way. here’s an additional list of SEO misconceptions that might just help us be more educated. And not waste our efforts. Link: https://www.squidoo.com/SEOmisconceptions
This was a genuinely useful post. So many points discussed and so many of my myths are busted.
Thanks!
Mythbusting posts always put me in a great mood. I know of more than a few spammers who could benefit from reading this.
I am curious about a top position for "buy viagra" being able to bring in $20,000 a day. Seems absolutely ludicrous. The same guys have been controlling that term for a long time and they've been using the same methods to do it. The only thing that really changes is the sources of authority they leech off of, but even those seem to last for quite a while.
What really baffles me is that their link sources are way, way, way overspammed, probably using botnets. I'm talking nearly 100% of outbound links being to whatever URLs these guys are dropping on a given week. Abandoned guestbooks, forums, blogs, gallery2 installations. Pure garbage. It makes me wonder a bit about how outbound link penalties really work, if they do at all.
On spamming the search engines, I'm glad that you put it that bluntly. "Search engines realized that users don't like spam." While I was doing SEO freelance and waiting tables as well, someone at the restaurant asked if he could build a site that sold random crap but have it rank well for "buying a car" or "car sales" as he assumed that a lot more people were looking for cars as opposed to the treadmill he was trying to get rid of.
I asked if he really felt that someone looking for cars would be happy to find his irrelevant site (even though it wasn't feasible anyway, and I'm pretty sure he had no real intention of building a site), but even he could see in two seconds why that would piss someone off. People hate spam. People hate irrelevancy. That's why they love Google.
This is a great source of information. SEO can be a confusing process but you have simplified this beautifully. As a designer myself, I know what a challenge it can be trying to combine all different standards that will help benefit a website. The chart diagram in particular is very helpful. It would be nice to see more css coding tips and tricks.
great post Rand.
Especially on 'Search Engine Submission': you need to tell a client the difference then between submitting a site (which is no longer valuable), and indexing a site thanks to the sitemap.xml for instance. What do you think? is it correct to present it that way ?
Submitting is a technique of indexing.
Rand claims that submitting to poor directories or search engines is not as efficient as getting a valuable link from a trusted page.
You are trying to put so called "SEOs" out of business, aren't you?
I am thankful that somebody finally had the time and will to put everything in one place. Instead of referring to several resources, I can now just rely on one. What a saviour! It does tell me that I have been lazy enough not to put one together myself :)
The re-inclusion bit was fab; I particularly liked the flowchart representation - easy yes/no logic.
Did you intentionally miss out on Page Rank?
Good post Rand, goes a long way to debunking a lot of the old myths.
Increasingly it seems more and more people are becoming aware that quality of content and user experience are key factors for SEO (in terms of drawing links), yet I'm sure there are already plenty of people trying to find ways to manipulate this either through "theft" of content or other means.
Great post & flowchart! Rand, the link to the statistical analysis algorithm on Microsoft's website is missing. Too bad, because I was very curious to see what the analysis document was about.
Clear and concise as always...This really is a goldmine of information that I'd otherwise have to spend hours trying to drag out of numerous other posts, often written by people who are out of touch with current SEO practice. Many thanks.
Rand nice post .
In my view there's no harm in keeping meta tags; though they're becoming less and less useful, even if they do one bit of good they should be there. Secondly, when we talk about code-to-content ratio, I feel that if you have a lot of code at the top of the page, having some content in a meta tag can be beneficial.
Meta description is very useful if you want to take charge of what's written below your link in the serps. In addition to that if the search phrase is mentioned in the description it will be bolded adding to another factor of relevance in the eyes of the user.
Hey Rand,
Glad to see a revision of the excellent beginners guide. I find it helpful to get back to basics every so often to ensure I haven't made the schoolboy error of believing my own hype.
Getting the 'basics' right is key to pioneering your own techniques and keeping up with the myriad changes going on in search - bring on the new version!
Great post once again Rand.
The chart is very useful for those times when we aren't seeing the ranking results we expect and get paranoid of the big G slapping us with some type of penalty...like a hypochondriac!
While I agree with your Chinese wall comment, in my experience I have noticed that in less competitive niches, you can get Google organic traffic after running AdWords for a week. The traffic isn't great (15-30 visitors/day), but not bad considering I didn't do any other link building. Note: the sites have a few pages of content.
Perhaps i should document this with graphs/traffic data in a youmoz post...
What about hidden content? Not quite cloaking, since no one really sees it. I did an experiment with putting an mp3 player in a hidden span. Google keeps throwing it out and reindexing it. It's almost a weekly happening.
Wow! What a way to pull it all together. I'm really keen to go back and read/reread the posts you highlighted about best practices for linking. -Dana
Now if only there was a good way to let Google 'n' friends know that you found people who have violated the guidelines. Report Spam doesn't seem to result in lowered rankings or exclusions of any of the sites I reported in the past years.
Awesome Post. Its always nice to have a little reflection on the evolution of SEO.
Thanks for another great post Rand!
Still being relatively new to SEO myself, articles like this are very useful.
Keep up the good work!
I have seen many sites most of them do not look this good...
Regards,
<a href="https://www.saibposervices.com/e-accounting-finance-and-book-keeping.aspx"> accounting outsourcing services</a>
You can make keywords meta-tag work for you in one way: use it as a reminder to page editors of the target words/phrases.
Also, I must check if the awesome-bar uses keywords to look up past visited pages? Anyone know?
"AwesomeBar will match what you’re typing (even multiple words!) against the URLs, page titles, and tags in your bookmarks and history, returning results sorted by “frecency” (an algorithm combining frequency + recency)."
- edit: they don't I was confusing the user-tags with HTML tags
OK, I know it is very hard to write them all, but don't you think it will take a century to write all the parts at this tempo? :)
good luck Rand