Last week, Google held a live chat session with a number of terrific engineers from the spam, search quality, and Webmaster Central teams. Barry Schwartz posted a text transcript of the chat on SERoundTable, which I read through hoping to find some interesting nuggets of information to pass on. Unfortunately, the space given to the Googlers for responses was only 3-4 lines (due to the WebEx client used), so there wasn't much of an opportunity to provide detail.
I thought it would be more valuable to provide the answers from Google alongside the answers I would have given. Hopefully, in this fashion, folks can compare side-by-side.
Andrea Moro - 5:08 pm
Q: What about a feedback status on Spam Report? I mean when I report spam site, I immediately get a message that the suggestion will be taked [sic] on mind, but nobody let us know when, or if the reported site or submission are right or not.
Matt Cutts - 5:15 pm
A: Andrea, normally we're able to take a look at the reports pretty quickly. I like the idea of giving a little more feedback though.
Rand: Google can take anywhere from a day to 2 years to take action on spam reports. Generally speaking, unless the violation is egregious (or appears publicly in the media), Google likes to find scalable, algorithmic solutions to spam issues. Thus, they'll take your report, compile it with dozens of similar reports of the same types of violations, and work from an engineering perspective to come up with a solution that will catch everyone using the tactic, not just the single site/page you reported. We've filed spam reports with Google through clients on numerous occasions and it's very rare that any fast, direct action is taken. In several cases, reports that were filed a year or more ago for cloaking, keyword stuffing, and link manipulation still haven't seen any results.
My best advice, if you're seeking to really get a competitor booted from the index or penalized in the SERPs immediately, is to write about them on major SEO-related forums or submit a thread at Sphinn or a blog post to YOUmoz. When spam is reported publicly, Google tends to take action much more quickly and directly.
BTW - For a much better answer to a very similar question, see Susan Moskwa's response later on, which read:
A: We usually use them to improve our algorithms, so changes may be more long-term than immediate. But we definitely take these reports into consideration.
https://googlewebmastercentral.blogspot.com/2008/06/impact-of-user-feedback-part-1.html
--------
seth holladay - 5:14 pm
Q: how do you define and penalize duplicate content? are syndication deals excluded?
Mariya Moeva - 5:15 pm
A: Hi Seth, we just did a post on duplicate content on the Google Webmaster Central blog which has a lot of useful information that may be helpful for you
Rand: Sadly, syndication deals are not excluded, but I also wouldn't necessarily say that duplicate content is always penalized. I believe the post Mariya is referring to is here - Duplicate Content Due to Scrapers. It's a solid discussion of the topic, and it notes that most of the time you're not going to encounter real "penalties" for copying content; those pages will simply be filtered out of the results.
However, in any syndication deal, you need to carefully manage expectations. If you are licensing the content out, you need to decide whether you still want the majority of search traffic to come to your site. If so, you'll want to write rules into the contract requiring links back to your original version, and possibly even request the use of the meta robots directive "noindex, follow" so the engines don't get confused by another version (a minimal markup sketch follows the reading list below). On the flip side, if you're taking the content, it's very wise to make sure you know how many other parties have licensed and posted that same content piece, whether you're required to link back to the original source, and what rules exist on search engine indexing. Many times, new content properties or smaller content websites will see some early search traffic and rush to acquire more content without thinking through the consequences. I'd strongly suggest reading these three pieces on the subject:
- Ranking as the Original Source for Content You Syndicate
- The Illustrated Guide to Duplicate Content in the Search Engines
- When Duplicate Content Really Hurts
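To make the licensing-partner side of that arrangement concrete, here is a minimal markup sketch (the URL and article title are hypothetical) showing the two pieces I'd ask for in the contract - the "noindex, follow" meta robots directive and a plain attribution link back to the original version:

<!-- Hypothetical syndicated copy of an article on a licensing partner's site -->
<html>
<head>
  <title>Example Article Title (Syndicated Copy)</title>
  <!-- Ask the engines not to index this duplicate, while still following its links -->
  <meta name="robots" content="noindex, follow">
</head>
<body>
  <!-- Attribution link back to the original, as required by the syndication contract -->
  <p>This article originally appeared at <a href="http://www.example.com/original-article">example.com</a>.</p>
  <!-- ...syndicated article body... -->
</body>
</html>

With that in place, the engines should consolidate their attention on your original version while the partner's readers still get the full article.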
--------
brian vastola - 5:08 pm
Q: beside content, what are the top 3 things to do to your site to reank [sic] high ( short version)
Susan Moskwa - 5:22 pm
A: https://www.google.com/support/webmasters/bin/answer.py?answer=40349
Rand: I'm not sure how much the linked-to page on building a search-friendly site will answer a question about important metrics in the search results algorithm. If you want the opinions of some very smart SEOs, I'd check out the search ranking factors document. The real answer here is that we don't know for certain, and Google wouldn't be able to freely share this information in a direct, transparent, accurate way because it's part of their proprietary operations. However, if you wanted just my personal opinion, that's here.
--------
Tim Dineen - 5:22 pm
Q: What can we do to get the geo-target country correct when ccTLD isn't available and Webmaster Tools declaration (3-4 months ago) did nothing.
Matt Cutts - 5:26 pm
A: Tim Dineen, I think we offered the feature in the frontend and then started supported [sic] it in the backend a little later, but I believe that we handle the geotargeting in the webmaster console pretty quickly these days.
Rand: There are a lot of other factors besides the Google Webmaster Tools declaration that can help put you in the right country for geo-targeting. First, I'd think about using a domain name with the proper ccTLD - you mentioned that the right name wasn't available, but I'd consider some other alternatives before giving up. Beyond that, host the site (whatever the TLD) on an IP address in the country you're trying to target, use the language of that country, get links from other domains in that country, and register with Google Local/Maps using a physical address in the country. Adding that physical address to the pages of the site and getting listings in local directories will also help. We've experienced the same problems with the Webmaster Tools country-specific targeting: although it suggests it will solve the issue, there are actually a myriad of factors Google considers before they'll "take your word" from Webmaster Tools that you're really intended for a country-specific audience.
--------
Jonathan Faustman - 5:21 pm
Q: Will hiding navigation items with css (that are displayed on certain pages/directories) have a negative impact when google indexes the site?
Mariya Moeva - 5:26 pm
A: Hi Jonathan, when building your site and considering hiding navigation elements, it's best to always think, "is this good for my users?" and "would i do this if there were no search engines?
Rand: I've got strong opinions about the phrase "Would I do this if there were no search engines?" In fact, I believe it needs to be dropped from the engines' lexicons. We wouldn't register with Webmaster Tools, we wouldn't noindex duplicate content, we wouldn't use meta tags (and many times even title tags), we wouldn't nofollow paid links, we wouldn't create sitemaps, we wouldn't build HTML alternatives to Flash and we wouldn't worry about CSS issues or AJAX if it weren't for search engines. Asking us if we'd do something if there were no engines is a completely useless way of thinking about SEO or website accessibility in the modern era.
That said, Jonathan, I'd say that so long as the number of hidden elements is very small in relation to the amount of content on the page, and so long as you're providing easy, intuitive ways for users to reach those navigational elements, you'll probably be OK. SEOmoz itself fell under a penalty for keeping a large amount of content on a page in a display:none style, even though it was done in a perfectly legitimate, user-friendly way. Be cautious about how you hide content from users and what search engines might misinterpret - you can't just build for one or the other if you want a successful SEO strategy. I'd have to look at your specific page to make a judgment call, but my general advice would be to walk on eggshells when it comes to hiding navigation with CSS, and do it sparingly.
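For a concrete picture of the "small and user-reachable" kind of hiding I have in mind, here's a minimal sketch (class names and URLs are hypothetical) of a pure-CSS dropdown: the sub-menu is set to display:none by default, but an obvious hover on the visible parent item reveals it, so nothing is hidden from users that they can't trivially reach:

<style>
  /* Sub-menu hidden until the user interacts with the visible parent item */
  .nav .submenu { display: none; }
  .nav li:hover .submenu { display: block; }
</style>
<ul class="nav">
  <li><a href="/products">Products</a>
    <ul class="submenu">
      <li><a href="/products/widgets">Widgets</a></li>
      <li><a href="/products/gadgets">Gadgets</a></li>
    </ul>
  </li>
</ul>

A handful of links treated this way is a very different animal from stuffing paragraphs of keyword-rich text into a display:none block.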
--------
Peter Faber - 5:11 pm
Q: Question: Suppose you rank #1 for inanchor: intitle: and intext: But for just the keyword phrase you're on second page. Any tips on what to do to get up for just the phrase as well?
Matt Dougherty - 5:29 pm
A: Hi Peter, echoing what John just talked about, I'd say making sure your content is useful to users is the best approach.
Rand: Matt's answer really frustrates me, as it's almost a non sequitur to the question. Peter - we see rankings like that happen quite a bit as well, and very frequently it has to do with how Google ranks the normal results vs. more modified searches. Intuition, and seeing a lot of SERPs like this, tells me that some elements of the trust and domain authority algorithms aren't coming into play as strongly in the inanchor/intitle/intext results. Generally, when I see those sorts of results, it means you're close to achieving your rankings goals but need to get more "trusted" links from high quality domains into your site. We also see rankings like this when a "sandbox"-like effect is in place - it could be that one day you'll see your domain "pop" out of the lower rankings and into top positions for many of these searches (what SEOs call "breaking out of the sandbox"). So, the good news is that you're doing a lot of things right, but the bad news is that you either need more trust juice (from high quality links) or time (to "break out") before you'll achieve those rankings in the normal SERPs.
--------
Wall-E The Robot - 5:30 pm
Q: I have some Amazon Associates webstores. Obviously they have the same content that Amazon has. And obviously Google sees duplicate content and dont indexes [sic] a lot of my webpages. Do you have any suggestions on how to solve this?
Mariya Moeva - 5:31 pm
A: Hi Wall-E, as long as your Amazon Associates store provides added value to users, there's nothing you should be worried about
Rand: I'm worried that Mariya's answer here is misleading. Wall-E has a lot to be worried about, even if the store adds value to users. First off, if the pages aren't getting indexed, duplicate content might be one issue, but PageRank/link juice accumulation might be another. Google has a certain threshold it likes pages to reach in terms of PageRank before those pages earn the right to be in the main index. If Wall-E's earning lots of good, high quality external links to his site, looking at the internal link structure to ensure good flow of that juice through the pages would be a good start.
On the duplicate content issue - Amazon's always going to get the benefit of the doubt when it comes to who owns the content. The best solution here is not just to create value for users, but to avoid making large portions of copied content indexable - using iframes, or only minor snippets at a time, is important. You probably also want to find automated ways to vary some of the input fields you receive from Amazon; just copying the titles, prices, categories, tags, photos, etc. could get you into dangerous territory. For indexation, Google effectively requires that each page you produce meet a certain (secret) threshold of unique, valuable content - you'll need to solve that issue in order to achieve consistent rankings.
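As one illustration of keeping the copied portions out of the indexable page, here's a hedged sketch (the product name, path, and ID are hypothetical): pull the Amazon-supplied description into an iframe whose source path you could also disallow in robots.txt, and keep your own unique commentary in the main HTML document.

<!-- Hypothetical product page: unique review in the main document, licensed copy isolated -->
<h1>Acme Widget 3000 - Our Take</h1>
<p>Your own original review, comparisons, and buying advice go here - this is the unique, indexable content.</p>
<!-- Amazon-supplied description loaded in an iframe; the /amazon-data/ path could additionally be disallowed in robots.txt so the copied text isn't indexed on your page -->
<iframe src="/amazon-data/description?asin=B000EXAMPLE" width="600" height="300"></iframe>

That way, each page's indexable content is dominated by material that only exists on your site.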
--------
Robert Longfield - 5:16 pm
Q: Further on Geotargetting. I run a multinational site with about 12 different languages being supported. We are implimenting geotrageting [sic] so users are directed to the appropriate language page for their country. The concern of some is that Google may penalize me...
John Mueller - 5:35 pm
A: I would recommend not redirecting users based on their location. This can be a bad user experience. It's better to allow a user to choose his version based on his searches.
Rand: Such brazen hypocrisy! Google can geotarget its search results, geotarget its homepage, and geotarget many of its other service pages, but heaven forbid anyone else do it. This is ridiculous. Robert - I'd say to simply do a quick check before you redirect your users. If their browser accepts cookies, feel free to drop one, redirect them to the appropriate page, and let your user data, feedback, and analytics tell you whether or not it's the best experience. If the browser doesn't take cookies, drop them on an international landing page that lets them choose their country/language - this also works well for search engine bots (which don't accept cookies), since they'll be able to find all of your country-targeted content from that page.
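Here's a minimal sketch of the kind of crawlable international landing page I'm describing, for cookie-less visitors and search engine bots alike; the country versions shown are hypothetical subfolders (country subdomains or ccTLDs would work the same way):

<!-- Hypothetical international landing page: plain, crawlable links to every country/language version -->
<h1>Choose your country and language</h1>
<ul>
  <li><a href="/us/">United States - English</a></li>
  <li><a href="/uk/">United Kingdom - English</a></li>
  <li><a href="/de/">Deutschland - Deutsch</a></li>
  <li><a href="/fr/">France - Français</a></li>
</ul>
<!-- Visitors whose browsers accept cookies can be redirected to their country version before ever reaching this page -->

Because every country version is linked here with ordinary anchor tags, a bot that ignores cookies still discovers all of the geo-targeted content.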
--------
David Thurman - 5:39 pm
Q: Does Google favor .html over say .php or do you treat all URL's the same
Bergy Berghausen - 5:41 pm
A: A URL is a URL. As long as it's serving content we can read, Googlebot doesn't care what the file extension is.
Rand: I hate to be a stickler, but didn't we just go through an episode over file extensions? Bergy really should point out that .exe, .tgz, and .tar (and .0, although that appears to be getting cleaned up) aren't indexed by Google.
--------
Gordy Harrower - 5:40 pm
Q: Do Google tools help a site's ranking?
Mariya Moeva - 5:43 pm
A: Hi Gordy! The most important thing to focus on is the quality of your site and the content that you provide for users
Rand: That's a really obvious, unhelpful non-answer, and disappointing to see. I think the message being conveyed here is that Google can't answer the question, which to my mind means "probably." I've long suspected that if you ran a statistical check on all the sites that have registered with Webmaster Tools vs. those that haven't, you'd find a much lower incidence of spam among the registered group, and thus registration might be a metric used in judging trustworthiness, even if it doesn't have a direct impact on rankings.
As an aside, anytime Google (or any of the engines) can't give an answer, I think it makes them look so much better when they say something like, "That's the kind of question we can't answer directly, because it's in regards to our ranking system, which we need to keep private to help prevent spam." I have so much more respect for that directness and treatment of the webmaster/question-asker as an adult and a professional than I do for the "make good sites!" malarkey. From a corporate communication standpoint, it instantly flips that goodwill switch that Google has ingrained into most of the web-using populace from "you're awesome" to "Oh man, seriously? I thought you guys were better than that."
None of this is to say there weren't some really good answers from Googlers, too. In fact, I'd say it was about 1/3 terrific answers, 1/3 mediocre and 1/3 seriously lacking. I also recognize that Google has absolutely no obligation to do this, and by engaging with webmasters in a public chat like this, they're leaps and bounds ahead of Microsoft & Yahoo!. Kudos to Google once again for their efforts to reach out.
All I really want to highlight with this post is that, for Google, or really any representative of any company or organization (US government, I'm looking in your direction), responding to your audience in direct, honest ways (even when you can't be fully truthful or revealing) gives you far more credibility and more respect than hiding behind irrelevant links/references or repetitive, company-line jargon. Whether it's at a conference, during an interview, in a private conversation or online, there's a higher level worth aspiring to, and I'm both inspired by the efforts to date and left with the feeling that even more could be done by Google's public faces & voices. Here's to hoping.
EDIT: When writing this post, I failed to realize that WebEx was the client used for the chat, and that 3-4 lines was the maximum amount of space available for responses. I think this blunts a significant amount of my criticism (and teaches me, once again, that I should try to understand more about a situation before I antagonize). I've changed the title of this post to reflect that.
As the person who spends quite a bit of her day in our PRO Q&A section, I find this really interesting. I can assure you that if I answered "create good content!" to every second question, we'd have a lot of people unsubscribing from Q&A.
Even acknowledging the limitations on characters, I can't think of a good reason to provide the answer, "do what's best for users, as though search engines didn't exist" when presented with a valid SEO question. It's always irked me that that phrase is even in the webmaster guidelines. We - and they - are here because search engines exist, but we're dealing with people whose level of knowledge surpasses the "good content" doctrine. Instructing experienced SEOs and web developers to "produce good content" is like telling Formula 1 drivers to do up their seatbelts. Of course it's important. Now what else do we need to do?
I recognise that these are only a small sample of the questions and answers. However, this is probably why SEO blogs and informational sites exist. Whilst we don't have access to the same information as Googlers, we're able to be way more candid in our answers.
You want to give kudos to Google for doing this, but I agree with Rand that a response of "we can't tell you that" is more satisfying than feeling like you're being fed a useless, boilerplate answer.
Excellent point about the seat belts Jane . . . I completely agree.
HOWEVER, imagine trying to answer all of the Q&As with only 3 lines of text and still give a sufficient answer. THEN try to answer all of the Q&As in real time. THEN try to do so when you aren't the expert on the particular subject matter.
That's what Google was faced with . . . did they learn something from it? Yeah, perhaps they learned that a better format would be good, but they tried, and I feel they tried valiantly.
Brent D. Payne
I agree that they did a great thing by trying this, and there were plenty of good responses. I do however imagine that they could have perhaps had a bit more foresight into some of the difficulties.
This almost seems to come down to whether "we can't tell you that" would have been a more respectful answer. No one likes to be fed a line, and "The most important thing to focus on is the quality of your site and the content that you provide for users" and "I'd say making sure your content is useful to users is the best approach" are totally at odds with the skill level of someone who asks a question about searches involving inanchor, intitle and intext, or about Webmaster Tools.
This said, you're right: they were answering questions in real time and I'm sure they'll restructure where necessary next time.
To paraphrase..
I'm sorry, to be fair I'm probably being too harsh; the only reason we can belittle the answers is, I guess, because they tried (as Brent points out). But I'm still with Jane & Rand that honesty is better than platitudes, which is what some of them were.
I'm going to stop now before Matt or anyone thinks I have a grudge, as I really am glad that the Googlers try; it's just that by doing so they set our expectations even higher..
Each line from a separate Brent Payne comment on this thread.
Brent, we get it. You don't need to provide the same response to every individual comment. Once is enough. Your comments alone now represent 20% of comments in this thread. Just saying there's no need to pulverize an already well-beaten dead horse.
LOL! Awesome comment! ;-)
I got a little 'overly passionate' about this one.
Glad to see you are comfortable enough with me to call me out on it.
Nice catch.
Payne
face it Brent, you're a thumb whore :)
Nothing wrong with a bit of love Brent, but we are also here to actually try and get information out of Google. I think this has been summed up here and there already; kudos for the exercise Google, yes we do want to talk to you, but NO, BAD Google - where is the real information?
Can I suggest afternoon tea and cakes in the park for the next Q & A session?
Yeah, how did you pick up on Brent's Google Ass-kissing Scott? You're like a super sleuth considering the way Brent had subliminally written those comments! ;) Brent, you know that old saying - "It's like fishing in a barrel".
Now get back to the blackboard and write the following one hundred times:
I won't kiss Google's ass in public...
The Google answers didn't strike me as not being "honest & direct", but rather "lazy and canned".
What's the sense in having a live chat session when all you're going to do is spew out the typical company line that we've heard a million times?
Google - next time you have a live chat session, maybe you should have a non-Google employee like Rand moderate, so you can be pressed for more thoughtful and valuable responses. Not kidding.
I'm with you on this one Sean - is it really that hard to just say 'Look, we can't tell you that; however, here is a useful bit of information we can share' instead of 'Hey, great content and imagining we don't exist will make you rank super-well!'? Although on the plus side, I found Rand as informative as ever - every cloud, eh?
"The Google answers didn't strike me as not being "honest & direct", but rather "lazy and canned"."
Amen brother. Amen.
Yeah but Sean to Google's credit . . . they only had 3 lines to answer a question that sometimes would require an entire blog post. Then to jump around the algo issues. The ranking questions. And still stay accurate both on a general level and on an exacting level to pass the critics of the SEO industry . . . damn tough to do.
I give them credit for TRYING. Plus, they had two dozen people in the chat session. Not all of them are experts in the Google Algo like Matt Cutts or Adam Lasnik may be. Heck I know a handful of them and their primary roles are much more niche than the questions they were answering in the Q&A. But when you have hundreds of clients online you need to have more than just Matt, Adam, and Maile answering questions. ;-)
Brent
Brent - I didn't realize the limit on the responses. I've updated the post's title and content a bit to reflect that, as my criticism was far harsher than it should have been given that limitation. Thanks for bringing it up!
Good man Rand! I didn't mean to come off so harsh but these guys put a lot of effort into this and I didn't want the PR (public relations not PageRank) to turn against them and discourage these types of interactions in the future.
Thanks for being such a true leader in this space and making corrections when you feel you may be a bit off-base. It's what makes you more authoritative. We can trust you to be accurate.
Another win for SEOmoz thanks to Rand.
Brent
"lazy and canned"?? Seriously??
300 questions . . . 3 lines of text to answer . . . the WebEx is running late already . . . and you only have a couple dozen people that can answer questions.
What were they suppose to do?
Brent
With all due respect to Matt Cutts and the rest of the Google team, the fact is, Rand's post is symptomatic of what I would consider the underlying problem of relative evasiveness when it comes to Google addressing questions more candidly and forthrightly.
If this were not the case, and if this instance were an anomaly, I doubt very much that Rand would have felt the need to post on this topic, and at the very least, he wouldn't have done so in such a pointed manner.
While I understand and appreciate Matt's explanation that the forum did not lend itself to more in-depth responses, it doesn't change the root problem.
Perhaps brent, they were "suppose" to
1) Limit the number of questions to the number they could provide useful responses for
2) Get some chat software that allows more than 3 lines for a response.
"Make awesome content for your users!" is not an all purpose respose. Shame too. I was excited they were doing this, and thought some more high quality information could have resulted.
I'm glad the thought was there, though (they didn't have to do this)... maybe it will evolve into something good later.
oh well. Perhaps the audio Q&A will offer more enlightenment.
Given new information...must update position.
Hi Rand, I enjoyed reading your take on these questions, but I think they would have been a mismatch in at least a couple of ways:
- We used WebEx for this chat, and WebEx had a hard limit of 3-4 sentences on answers, so we had to give short answers. All of the answers you give would have been severely truncated - that's why all of our answers were short as well.
- With 300+ people, there were 500+ questions asked. The questions were literally scrolling by almost too fast to grab. That lent itself to shorter answers and URL pointers as well, so that more people could get a reply to their question.
I think people did enjoy the chat, though. One thing we did at the end was to start grabbing questions and answering them by audio. That let us answer questions in more detail, but still answer a lot of questions quickly. You might want to check out the last 20-25 minutes of the MP3 audio of the chat, where we answered a bunch of questions verbally.
I haven't finished listening to the audio. Thank you for giving me something to look forward to at the end!
And extended the timeline by 30 minutes to do so. Which shows a dedication to TRYING to get things done right. I have to give Google serious credit for that. A lot of webinars would have cut it at the exact time and left lots of people without answers. Google tried to get through as many as they could by extending the session. Good for them.
Besides . . . I am sure the team was briefed extensively to err on the side of caution. In a live environment it'd be easy to 'slip up' and give too much information, so Google is probably going to be more cautious in their answers in these forums.
Brent
Matt - please tell me you purposefully posted a comment that didn't answer any of the issues, in a post about not answering questions/issues? It's irony on a truly magical scale.
The point about the 3-line thing is obviously totally valid, but doesn't change the fact that some of the ideas verge on being misleading, whether intentionally or not (maybe highly open to debate is a better way of putting it).
As Brent points out in Google's defence, not all of these guys will have been experts in the issues raised (isn't there a saying that if you don't know something, you shouldn't say anything?) and Rand is absolutely right that even doing this is a big step-up on what other engines do.
I guess that the more you do, the more we expect - and that includes clarity in responses, or even being clear that you/any Googler can't respond to a particular question.
Matt - I noted this to Brent above, but given that limitation on time and particularly on response size (which I hadn't realized at all), I think a lot of my attacks were unwarranted. I changed the post title and some content to help reflect that - thanks for the clarification. Please do relay to your team the positives I've noted above - for a 3 sentence limit, there were some really good responses.
Thanks, Rand. It was kind of like a haiku at times. I'd be writing an answer and the text box would just... stop allowing new characters. So I'd have to back up and try to figure out how to say it in less words.
You need to twitter more and then you'd be used to answering in 140 characters or less. :)
hilarious Kate . . . and very true. favorited . . . oh crap, this isn't Twitter, is it? ;-)
Considering Matt's response, relative to the limitations of "time and space", perhaps the Google team can take the time to answer these questions in a more meaningful fashion.
Based on the responses on this thread, such a move would clearly be valued and appreciated by the Google audience.
Good grief, that's got to have been annoying. I tend to glare at Twitter when I run out of room to write. It's usually 10 characters short too...
Matt Cutts tries to help
out webmasters but gives up
and plays with his cats.
/haiku
Hhahahahhaahhah
Nice to see Matt Cutts' response to this post. Perhaps it's worth cutting and pasting his response into the actual blog post.
OK, accepted that there was a 3-line restriction on the answers, and that a lot of people showed up and asked a lot of questions, so the answers in the Q&A chat session were brief. But what would be really nice to see from Google is another version of the document (maybe a blog post) where Googlers give more detailed answers to the questions (if not all of them, then at least those they consider worthy). In fact, what would be even more interesting is Matt Cutts' version (if and when you have time) of randfish's post, 'Alternative Answers to Questions from the Google Live Chat Session'.
Best take-away from Rand's "translation":
AMAZED by Bergy's answer:
Maybe Google needs to read SEOmoz (the "Don't end URLs in .0" post Rand linked to), AND they should read Matt Cutts' own blog post, "Don't end your URLs with exe."
Good job to Google for stepping up and taking so much heat from a group whose sole purpose is to dissect everything you do, analyze it, and then possibly exploit it.
As SEO and SEM pros, that's what we do: we figure out what Google is doing and make it work to our advantage. And while it would be great for Google to answer everything in detail, I don't expect, nor do I deserve, any more than what the other two (coughMSN&YAHOO!cough) are giving up... nothing.
The only thing I would ask of Matt and the others is to be forthright. You don't have to answer everything, but I'd rather hear "I don't feel comfortable answering that, check our blog" or "sorry, that one is a secret" than hear a less-than-truthful answer or outright lie.
I'm not saying they were lying, but at times it did feel as though there were not-so-accurate answers being given.
Cheers,
@trontastic
It's not lying if it is an honest mistake.
Secondly, MS and Yahoo! are much more forthcoming at conferences than Google is . . . Google is nearly worthless at conferences (at least at the mic, get them alone and it's a little tiny bit better).
Nathan Buggia is actually very forthcoming at conferences and I hope that MS will push Google to be more so as well.
I couldn't agree more. It's not lying if it's an honest mistake.
and yes AT CONFERENCES they may be...but the other 350 days a year not spent at a conference, you could hear crickets chirping.
Rand took the post right out of my mouth. well.. he wrote it better, so I don't mind. I did intend to write a post about how terrible some of the answers were in that session. Maybe I still will...
anyway:
ZING!
If there were no search engines, we wouldn't do any of this stuff, because we wouldn't have to... we'd all have different jobs. Maybe I would be a pig farmer, or Cinderella at Disney World. Who knows; the point is... there are search engines, therefore there are SEOs, therefore we do stuff to rank.
It's a fantastic idea to not have search engines... go ahead google, you make the first move, and we'll have to follow ;)
There is a quote from either Frank Abagnale or Kevin Mitnick that goes along the lines of "Security through ambiguity is the worst type of protection."
I think it's applicable to this thread and Google's lack of public disclosure. If they release too much information they risk having that information exploited. Is it a good situation? No, because it stops people who could utilise the information in a positive manner from having the information.
corporationydoublespeak!!
Rand, I like your response on the "would I do this if there were no search engines" bit. I would never make a title tag, ever. Or I would have only:
<title>anti-corporationydoublespeak website</title>
I wouldn't go so far as to say there were really "terrific" answers... :) There were a few good ones, but still not worth going through the huge transcript...
I do realize that this was more about communication than providing any valuable answers. Googlers do often help a lot via their blog and groups, but more often the only universal, suits-any-occasion ("non-")answer is "do it for people, not for search engines," which I can't stand hearing any more, and which is misleading and totally untrue.
Like Donna said, that should have been said and I hope that will be further discussed...
Thanks for taking "review Google transcript" off my task list.
Agree that more information and help is on Google Groups.
Excellent post, Rand. It needed to be said.
Some of those answers seriously reminded me of what you hear from a politician during an election year. The format is part of that, but why don't they do a more in-depth question and answer, and make it recurring?
From the transcript...I love it!
Did they use WebEx on purpose, so that they couldn't really answer the questions and therefore could just give an answer like "make better content, dude!"?
Time for Google to buy WebEx or roll out "Google Conferencing."
I can understand that Google doesn't want to reveal its limits in case people exploit them, but for some of those answers you listed, Rand, the Googlers might as well have said "make great content and they will come, content is king" X(
Rand
Thanks for adding some additional details to my question on the URLs.
Overall, I was glad to see Google at least let everyone interested have a seat on the front porch - not quite like letting us into their kitchen, but then again I don't really blame them.
I like your interpretations of the answers Rand!
Yes, I wish Google would give some personal feedback on a spam report so at least we know someone is looking into it.
Speaking of a Spam Report - has anyone ever looked at the Sponsored Results for the keyword "Viagra" on Yahoo!
There are legitimately (according to Yahoo's policy) 2 results out of 11 that should advertise there. Sounds like the perfect timing for my first Youmoz post.
Let's say I have spam-reported someone and Google takes action within 2 days. How do I know that an action has been taken? What is the most obvious symptom that an action has been taken?
Excellent post with a lot of useful information about Google. I find it very informative when you do these answers and question sections and post them online. I look forward to reading more of your great work soon.
ummm... what was wrong with just saying "Yes" to this question?
Gordy Harrower - 5:40 pm
Q: Do Google tools help a site's ranking?
Great answers, Rand!! Hats off!!
Thanks Rand for putting your perspective on these questions. I think, combined with the good answers directly from Google, this is a really useful set of information on a number of topics.
John Mueller - 5:35 pm
A: I would recommend not redirecting users based on their location. This can be a bad user experience. It's better to allow a user to choose his version based on his searches.
Just like most Google answers, this amounts to nothing but toeing the company line. So if you geo-target and serve up pages based on location, you can't be helping the user? If you sell cars nationally, what should you do - does Google think users in New York want to see the cars for sale in Montana? I don't understand how that makes sense.
I personally don't know why Google ever says anything. If they open up, then it can be exploited. When they attempt to inform, it's so vague that it can be painful. I would rather hear nothing sometimes than get spoon-fed answers.
It's the response that Google has been warning people of for the past several months. Make sure what Googlebot sees is what the human sees. Something changed in their algo that made this more important (in my opinion) so they are getting the word out more about that situation.
Plus, John Mueller is in Switzerland and works on a team not as closely associated with the direction of this question. He gave as fine an answer as he could under the circumstances: hundreds of questions to answer, only 3 lines of text to do it in, and a topic he's not as closely associated with . . .
Brent
I completely agree with both sides of the issue. I see that, on one hand, Google needs to keep some aspects of the algorithm to itself to remain in the position it's in.
I can see it also from an SEO point of view that we are out to get our clients the best result based on our knowledge, skills and abilities in our work.
Where I think it gets blurry and very disappointing is when key areas of usability and ranking in the SERPs are given such a rubbish answer. In particular, I'm referring to the issues mentioned around duplicate content and geo-targeting.
If there were one flat, direct comment saying: if you want to provide a great user experience and rank in a particular country, this is what you need to do.
If there were one flat, direct comment explaining what happens with duplicate content across affiliates and networks and so forth, and how to make sure you don't get penalised: do this, this and this - and we won't penalise you.
All the other points - how do I rank better, what are your top 3 ranking criteria, etc. - those Google can keep to themselves.
And I think that's a fair trade.
Rand, thanks for posting this. I agree that Google needs to step it up with the quality - a lot of those answers were pretty pathetic, and I'm glad you called them out in the open.
You can't have Matt Cutts, Adam Lasnik, and Maile Ohye answer hundreds of questions live with full-on detail. They knocked out as much as they could and did so as quickly as possible. Keep in mind that most of these people weren't in the SEO industry and Google was doing this mainly to get a large community involved. Think about it . . . they need more humans to help them and the best way to do this is to give a little bit back in order to get them engaged. I feel they accomplished this.
To Rand's point . . . where's MS/Y! on doing things like this?
Brent
Good point...
When targeting a larger population (especially those outside the SEO industry), the message needs to be digestible for that group.
Why doesn't anybody announce such an event? I hope that in the future, sessions like this will be publicized more widely across the web.
Usually when I have an SEO question, I just call Google directly. :)
I think the frustration is that there are some serious issues that need a better response from Google, and perhaps we were hoping this Q&A would advance that. Whether caused by the limitations of WebEx or not, the resulting session seems like a blown opportunity.
In particular, I thought the geotargeting answers were especially weak. It's a serious issue, and it needs to be answered definitively. If Google is actually going to penalize sites because they geotarget their content in a user-friendly way, that's a serious problem. From what I've seen so far, Google is saying "do not geotarget content," and we're all hoping they can't actually be serious. Google seems awfully close to implying that geotargeting and spamming are one and the same, which is absurd.
I'm sure it would make their indexing more costly, and the serving of geotargeted content more difficult, but there has to be a solution everyone can agree on by which Googlebot can identify which country it wants to be "from." It would certainly lead to a richer index.
So, I was excited that my company let me participate in this, and I was more than surprised that nobody I saw (except me) asked the question millions of us want answered: what the heck is really going on with Google's organic results, which are changing literally by the minute? Matt Cutts answered something along the lines of: their results really do update that fast, and if a new blog post or website publishes new content, it can affect the rankings (not quoted verbatim). While I do believe Google can be "that fast" to crawl sites, I don't believe that's the real answer. Is no one else seeing this???
Google answering a question by linking to their own resource?
Oh, ha. You know...when I do that (without first typing a customized answer and then "for more info go here"), what I'm implying is that you're stupid and did not do a cursory search for the answer.
And then when people do that to me, my reaction is "yeah, thanks pal...I already searched for it and read that and it didn't completely answer my question."
Canned answers mean "I'm sick of this question you amateurs keep asking over and over." But I think the thing here is that SEOs and Webmasters have matured. We're not amateurs anymore and we don't want the answers you've been giving beginners. The industry is growing up and so should the answers.
It just seems like all the answers were for beginners, and maybe they just should've advertised it as a beginner's chat.
I'm a little cranky.