Tomorrow, Mike McDonald of WebProNews, Vanessa Fox of Google's Webmaster Central and I will attempt to create a second version of our first interview together. It's late, and I'm wiped (plus I still have some presentation polishing to do), so I'm hoping you'll help me to craft some questions to pose to Vanessa tomorrow. Topics can include:
- Anything to do with Google Webmaster Central, Sitemaps, etc.
- Crawling & Indexing Issues in General
- Google's Webmaster Central Blog
- Vanessa's New Blog
- Rumors & Gossip in the Search World
Topics to generally avoid are those that center on algorithmic or penalty-type questions. Vanessa isn't part of the search spam or algorithm teams, so those questions wouldn't be as likely to generate a great response. Thanks for your help, everyone - we're looking forward to churning out some really top-notch content and coverage at the end of SES NY.
p.s. For those wondering about YOUmoz entries; Rebecca's been a bit busy partying especially hard, but she'll get to them soon :) Oh yeah, and beware the comics!
UPDATE: Vanessa and I just wrapped up the piece, and I was able to work many of your questions into the interview. Yes, Matt, even yours. Look for the finished piece on WebProNews, hopefully in the next couple of days.
Well, Matt, the rumors this year at SES are all about the people who aren't here. ;)
Here are a couple:
"Does Google plan on making linkback data more transparent (akin to Yahoo) or will the link: command continue to show a subset for the foreseeable future?"
"What changes can we expect in Google's integration of sitemaps?"
For those of us who aren't at SES NYC, what's the good gossip/rumors?
:)
read my mind
:-)
I would be interested to know if they plan on implementing something like Site Explorer's 'delete URL' function. It might be handy for them to see which URLs webmasters want included or removed (not that they should leave it up to users completely, but it would be nice to have that control once the site has been verified).
Also, it's probably been asked, but it may be cool to have a 'live PR score' instead of updating it every so often like now. Webmasters could verify their site and see the live PR. It might even cut down on the number of people who are still obsessed with it.
Good call on the "Live PR". I would also love a public ranking system comparable to Alexa.
I wouldn't think it would be that hard; they already have an area for it, so all they would have to do is switch the generic page (with the date and highest PR) to a more real-time one.
He might even be able to get in a word about the Moz PageStrength tool on WPN.
Live PR would be a nice addition. Something tells me it won't cut down on the PR obsession, but it would be somewhat more useful than the toolbar PR.
Occasionally, a domain owner may want or need to change domain names for various business reasons, even while keeping ownership of the old domain, perhaps in the case of a merger or company name change etc.
Since we can verify ownership, and 301 individual pages (or entire domain) then why must the site owners suffer a ranking drop for an extended period of time, sometimes never to fully return to their previous rank?
Might Google be considering a way to make this process less painful very soon?
I'd expect even under ideal conditions it would take some time to update all the relevant databases, but I agree this would be something great to have.
It should be easy enough to verify that both sites have the same owner and be able to simply say 'Hey we're over here now.'
Anything that could reduce the time it takes for all the links, PR, traffic to come back from a few months to a few weeks would be greatly appreciated.
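For reference, the domain-wide 301 itself only takes a few lines of server configuration. This is a minimal sketch using Apache's mod_rewrite (the domain names are placeholders; other servers have equivalent directives):

```apache
# Send every request on the old domain to the same path on the
# new domain with a permanent (301) redirect.
# Requires mod_rewrite to be enabled.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-domain.com/$1 [R=301,L]
```

So the mechanics are easy enough - the wait is all on the engines' side, re-crawling and re-associating the old URLs with the new ones.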
I would love to see a Google public site ranking system like Alexa, but more accurate. I think a lot more people have the Google toolbar installed nowadays as compared to the Alexa toolbar. And you just know that Google is already tracking it so all they have to do is make it public! Please? And if not, why not?
Re Google toolbar - I'm not sure so many people do. I don't have it. It just annoyed me, and then was a pain to uninstall. It'll be more than Alexa, but less than you'd think, is my guess.
Why the toolbar when Google has Analytics? Most if not all of us have Analytics installed on our websites already; I'm sure Google had a reason to supply it for free.
Probably about time they start using the information we give them. (PS - this is just my opinion, I have absolutely no backing for what I say; it just seemed logical to me)
Please ask Vanessa Fox what Google Base is and how it can be used to increase business to a website.
I looked on their site, but they never really say what the heck it is. Is it like eBay?
Quote from their site:
Google Base is a place where you can add all types of information that we'll host and make searchable online. Based on the relevance of your items, they may also be included in the main Google search index and other Google products such as Froogle and Google Local. You'll need a Google Account to use Google Base. A Google Account lets you sign into Google Base and most other Google services (like Froogle Shopping List and Groups). Once you've created your account, sign in everywhere with just your email address and a password of your choosing. Interested in attracting new customers and processing online sales for free? Learn more about Google Checkout.
Great question! I wondered the same for a long time... I still have doubts.. and I would love to hear about this from an official google source.
Some time ago I was reading about microformats and how useful they are because they help to "order the web through open standards, by describing the content type with special tag attributes, making it easily searchable." Then I read that this seemed to be what Google had in mind when they launched Google Base - letting people "categorize" or tag web content or products by describing them and making them easily searchable - but they didn't succeed with it because it's a closed platform that doesn't use an open standard. I don't know if this is totally true, but it's the only logical explanation I've heard.
From what I understand, Google Base is replacing Froogle, which is to go away soon. We are using Google Base for a client right now, and the products show up in a Froogle search as well as in Google Base. Basically, we create a feed with the client's products, descriptions, etc. and upload it to Google Base; the items then link into our client's website/shopping cart.
I'm still trying to learn more about optimizing the feed for Google Base. So far it's been pretty simple to set up and use, but we haven't really seen a whole lot of traffic from Google Base yet.
IMO, it's more like Craigslist, but you're right they do use the Base items classified as 'products' to help bulk up the Froogle index.
Google Base is where you go to upload your product data feed to Froogle. It is very powerful, but the UI sucks....guess that's why they call it "Beta"
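For anyone curious what such a feed looks like: Google Base accepts product data as an RSS 2.0 file with item attributes in its `g:` namespace. A minimal sketch with placeholder values (not a complete list of the attributes Base supports):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:g="http://base.google.com/ns/1.0">
  <channel>
    <title>Example Store</title>
    <link>http://www.example.com/</link>
    <description>Product feed for Google Base</description>
    <!-- one item element per product in the feed -->
    <item>
      <title>Blue Widget</title>
      <link>http://www.example.com/products/blue-widget</link>
      <description>A sturdy blue widget.</description>
      <g:price>19.99 usd</g:price>
      <g:condition>new</g:condition>
      <g:id>BW-001</g:id>
    </item>
  </channel>
</rss>
```

You upload the file (or point Base at its URL) through the Google Base account interface.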
I would like to know whether, from the search engine's perspective, a banner link is treated differently from a simple link at the side of a site, and differently from a link within the text.
The Webmaster console shows us URLs that are broken - why not also show us the referrer that pointed to them? Especially if it's an internal page, we could use that information to fix typos.
I would love this too, but sometimes I feel like we're asking the big G to do our web analytics for us...
When will Google be ready to index Ajax and Flash sites?
...and video, audio, and everything else that's in a non-text format! :)
Yes, JS or Ajax and other multimedia crawling would be nice, or some better way for them to identify it and render it in a search.
Danny Sullivan already provides the best answer to this question. It went something like this:
"When you can see radio, you can index flash."
If you're reading, Danny, perhaps you can perfect that for us. When you said it, it was exquisitely clear.
I have a few for Webmaster Tools:
That's all I can think of now.
I look forward to the interview - thanks Rand!
Shame I missed this :( I would like to know why they have the Urchin script on all of the pages within WMT. :( Anyone know the answer?
Because it's built using templates. And so it's on every page.
Not quite what I meant. I know it's in a template, I want to know why. Things like what they do with it, what have they learned from collecting all of the data etc..
Personally, I am always trying to understand the difference between the backlinks shown in my Google Sitemaps account versus the backlinks returned by the "link:" operator in a browser. Is one "truer" than the other when it comes to page prevalence?
Can't wait to hear more!
The links shown in the Webmaster Console (Sitemaps) are a 'truer' indication of what Google sees than the link: search (but still not the whole story...)
Would like to see Webmaster Central have more grouping, sorting and filtering of internal and external links. A few include:
Is Google WMC planning on ever adding a feature to show webmasters the pages on their sites that are supplemental?
That would be cool. We can certainly find the information with a site: search, but having it there in WMC would be nice. Even better might be an explanation of why a page has gone supplemental, but I won't expect to see that anytime soon.
I would give Matt Cutts a kidney for information on why pages go supplemental.
Disclaimer: This post is intended solely for the purposes of hyperbole and is in no way intended to indicate my willingness to bequeath any vital organs to Mr. Cutts or Google. I retain sole possession of all vital organs until such time as I choose to sell them on eBay.
Sorry, but it's too late for the disclaimer. Your personal information has been sent to the Cutts family, and they will be waiting on their kidney as soon as Matt lives up to his end of the deal by letting you know why your pages have gone supplemental.
Apparently Matt has already booked his flight to Turkey where the operation will take place.....
There is a patent application floating around that may shed some light on that for you... Keep in mind, though, just because it's in a patent app doesn't mean that's how it works.
I enjoyed your first interview with Vanessa and will look forward to round 2. It's a little late for me now too, but I'll see if I can help with a question or two.
A general question might be what new features are planned for Webmaster Central and Sitemaps.
A few suggestions for Webmaster Central that I would like to claim as my own, but I originally heard from Danny Sullivan and Eric. I forget who suggested which.
On the link data we get a 'last found' date. A 'first found' date would be nice to see. It would be helpful to be able to filter the link data to see links over different date ranges. It would also be helpful to filter out domains from the link data to make it easier to see past the site wide links.
Hope that helps a bit and thanks for the YOUmoz update. I've been wondering when we might see some new posts there.
Yeah, sorry about the lag in YOUmoz publications. I've, uh, been a bit busy "networking" in New York. I'll throw up as many as I can this week.
"networking" eh?
Thanks for the hard work though Rebecca, really, 4 youmoz posts in less than half an hour!
I'm multi-tasking during sessions :D
It's weird to see a post be up for days then all of a sudden have 3 or 4 new ones pop up. Makes it hard to be above the fold for very long.
Not to complain, it doesn't matter THAT much to me, but I guess what made me notice was my post was one of the ones that got pushed down from back log.
Oh well, next time maybe one of mine will be up for awhile when you guys "network" in far off places.
Yeah, sorry about that. Normally I space them out a lot better, but I had some precious free(ish) time to go through the queue, so I bombarded all of you with four new posts. I won't typically do that, so my apologies for throwing you a curveball this time around.
No worries. I'm just glad to see some there. I can appreciate all you do and know sometimes that might mean you get busy and for a few days we won't see posts and then a few will pop up all at once when you have some time.
You've been doing a great job keeping YOUmoz posts flowing and I'm not going to complain about the occasional hiccup.
Hi Rebecca - I just wondered what you meant in your edit on ecopt's post saying it was submitted with no author details? Do you mean it didn't have a bio? In which case, mine didn't either...
I will wait for Rebecca to clarify, but I think there was a bug during submission and somehow it wasn't associated with my user account. I had done one Youmoz post before, and that time went smoothly. For some reason it didn't show up in my manage section either. Matt would have to tell you what happened and what he did to fix it.
Ahhhhhhhhhhhhhhh - that makes sense
I noticed that too and wondered what happened since I saw you listed as the author. I assume whatever happened has been fixed now.
"I'll throw up as many as I can..."
Are you talking about youmoz posts or carbombs and shots of tequila? Have fun!
Hahaha, I meant I'll throw up some posts. I managed not to upchuck during this trip :)
why am I turning a pale shade of green....:-)
Oops. I see that I posted my comment in the wrong thread. (Too many windows open.) Sorry.
Sorry Rand, It's too early in the morning to think of any questions. Will go look for some coffee first...
Does Google agree with SEOmoz' Ranking Factors 2 post?
Is Google seriously considering allowing users to delete entries?
What action is being taken to eliminate click fraud?
After verifying a site using sitemaps, is it better to allow Googlebot to crawl the site or better to submit a sitemap?
Why submit a Sitemap to Google WMC?
Doesn't this just mask the real problem (internal links)?
Wouldn't you be able to identify the problem pages easier if they were not indexed via WMC Sitemap?
To my knowledge, an orphan page (a page without links pointing to it) will never rank very well. Would the same orphan page rank better if it were included in a WMC Sitemap, or does it still need links to rank well?
If you had great internal link structure, why would you need to submit a WMC Sitemap?
I've seen this example with orphaned pages. A "blue widget" page is added to a site, linked internally, added to a WMC Sitemap and subsequently indexed by Google. Later, the page is orphaned, but still remains in the WMC Sitemap. Eventually, Google catches on that the only internal link to "blue widget" is through the WMC Sitemap and de-indexes the page.
3 questions for Vanessa:
SERIOUS: Does Google have plans to crawl & index microformats (eg. hCard) for use in regular search results?
INFORMAL: How often do you get together to share ideas and war stories with your Yahoo/MSN/Ask counterparts?
HAH: Tell us the real reason Google won't allow cats at the Googleplex? :)
I'm not sure exactly how to phrase this (and I know it might be a bit algo-y) BUT here's 3 for you:
;)
Could you please find out when (if?) Google trends will be out of beta, and if there will ever be any comparative or complementary volume data to allow for more robust interpretation?
Here's what I would like to ask her:
Yahoo has started mobile PPC, which is in beta. Does Google plan to start something like this in the near future?
Ah, yes. It looks great. Never underestimate the value of world-class photography in making a site look good.
The biggest question on my mind is this:
Why is it so much more difficult to get indexed by Google than by MSN/Yahoo? (technically speaking...)
Interesting... Yahoo typically takes the longest for me. MSN is extremely quick and Google isn't far behind.
Same. I've never had any problems with Google.
I've never really had problems getting indexed with Google either. It generally takes less than a week. It's been similar for Yahoo and MSN.
If you get a link back to your site from a site that gets crawled frequently it shouldn't take long. Links in forum sigs usually do the trick.
I find the opposite is true. Usually I get Google first, and the others follow. Used to be MSN right away, but they have tightened up it seems, or at least reduced crawl rates compared to the recent past.
Google has gotten better at finding content, plus now that they have Webmaster Central, they have introduced new delivery methods like Sitemaps and robots.txt, and they provide feedback on how to improve.
Did you catch the news that you can now point to your xml sitemap in your robots.txt file and all four major search engines will be able to grab it without having to submit?
https://searchengineland.com/070411-080716.php
Yes, caught it on sitemaps.org. Hence the mention in my post above ...
"sitemaps and robots.txt"
Autodiscovery is what they are calling it I believe. Hopefully this helps end the talks in the forums about submitting to search engines. We have done a pretty good job of spreading the word about that myth, but autodiscovery will help even more.
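For anyone who hasn't seen the details yet: autodiscovery just means adding one line to robots.txt pointing at your XML sitemap (the URL here is a placeholder):

```
# Sitemap autodiscovery: this line can appear anywhere in robots.txt
# and must use the full URL of the sitemap file
Sitemap: http://www.example.com/sitemap.xml
```

All four engines then pick the sitemap up on their normal robots.txt fetch - no separate submission needed.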
I thought you might have seen it, but figured I would post the link just in case you had missed it and for anyone else who might not have seen it.
I've always found that building an internal HTML sitemap page and linking to it from the footer helps get a site crawled. With most sites, as long as you build spiderable navigation, indexing shouldn't be a big problem, but I know an XML sitemap can help some sites, and I think it's good that the search engines are taking steps to simplify the process.
Yep - similar to the autodiscovery tag for RSS which is another thing I'd recommend to anyone with frequently updated content..
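For comparison, the RSS autodiscovery tag mentioned above is a single line in the page's head section (the feed URL and title are placeholders):

```html
<!-- RSS autodiscovery: lets browsers and crawlers find the feed -->
<link rel="alternate" type="application/rss+xml"
      title="Site Feed" href="http://www.example.com/feed.xml">
```

Same idea as the robots.txt Sitemap line: publish one pointer in a well-known place instead of submitting anywhere.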
Damn it, I've got my blue SEO book at home filled with questions and scenarios I want covered! *groan*...
I'll post as soon as I get out of the office!