Social media is becoming increasingly important for SEO. Just in the last week we've seen the launch of Google +1 and early correlation data showing Facebook shares are highly correlated with rankings.
In this post I'm going to walk through a bunch of different ways of tracking your social media strategy. We'll start with the basic/easy stuff and move up to more complicated things (Google Docs! APIs! Mmmmm)
3rd Party Tools
There are a million and one social media monitoring/tracking/reporting tools. Here's just a quick sample of ones that I like personally:
PostRank
PostRank is a neat service that tracks your pages (either by inputting pages manually or via an RSS feed) and gives you engagement metrics for your content. It even hooks up to Google Analytics, giving you real stats for visit counts, which is nice. Here's a screenshot of SEOmoz content on PostRank:
There's a free plan, and the paid plan is only $15/month, so it's very easy to get started with PostRank. It's a great tool for tracking your own content - but not as powerful for tracking competitor sites.
Topsy
Topsy is an awesome Twitter tracking service - I personally love their search functionality because it's easy to use and gives nice data. For example, a search for our recent #linklove conference gives a really good summary of tweets and links shared:
But Topsy doesn't stop there - you can also run an awesome site: query in their search to show the top tweeted content from a domain. For example, a site:distilled.co.uk query over the last 30 days:
Ok, Topsy is cool - we're getting close to analysing our competitors. What about a bulk URL lookup?
Sharedcount
Sharedcount is a very lightweight app I found in a Hacker News comment which gives you a very quick and simple lookup for a given URL. For example, here's the startup Zapd's footprint on social media:
The great thing about this is that they have an API - the same data above is available by just visiting this URL. Wait - what's an API? Let me get on to that.
Using Google Docs to Fetch Social Data via APIs
Ok cool, now that we've covered off some 3rd party tools and services I like, let's move on to some DIY solutions. In this section I'm going to walk through some fairly advanced things, so you might want to read carefully. To give you a tease, here are a few things I'm going to show you how to generate:
A graph of social media activity for your site:
Mouse over an individual data point to see which blog post it refers to. Of course, I'm not just about the pretty charts - I'm also going to show you how to generate a bunch of data like this:
I should stop here and mention that a lot of this section is going to build on knowledge of Google Docs. Most people think of Google Docs as just a basic online version of Excel. But it's so much more than that - using Google Docs you can actually build spreadsheets that make web calls and interface with live data.
Disclaimer: There are lots of things in this section you can just copy and paste and start using straight away, so even for the beginner there should be lots of value. I link to a bunch of Google Spreadsheets at the bottom of the page that you can just make a copy of and get going with. However, along the way I'm going to go into some fairly advanced bits and pieces. If you're completely new to using Google Docs to make web calls, you should start out with this guide I wrote on Distilled about using the importxml function and Google Scripts:
Using Google Docs To Build Agile SEO Tools
A Basic Introduction to Social Media APIs
The foundation for this post is that we're going to look at calling some APIs to gather some data. Most normal people I know either don't know what an API is or freak out when they hear the term mentioned. Let me reassure you: an API is just a method for fetching some data from a remote resource in an efficient manner. For example, take the Facebook Graph API. You can make a call to this API by just visiting a URL like this in your browser:
https://graph.facebook.com/?ids=https://moz.com
You'll see something like this:
Yes there are curly brackets everywhere but hopefully even the most technically-averse among you can figure out what the data says. This is actually JSON and the good news is machines love reading JSON.
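For reference, the data looked something like this (a sketch - the numbers are illustrative and Facebook's exact fields have varied over time):

{
  "https://moz.com": {
    "id": "https://moz.com",
    "shares": 12345,
    "comments": 678
  }
}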
Thanks to a Hacker News thread I found a whole bunch of free APIs like this that let you get information about a site's social presence:
- Facebook - https://graph.facebook.com/?ids=https://moz.com
- Twitter - https://urls.api.twitter.com/1/urls/count.json?url=https://moz.com
- Linkedin - https://www.linkedin.com/cws/share-count?url=https://moz.com
- Stumbleupon - https://www.stumbleupon.com/services/1.01/badge.getinfo?url=https://moz.com
- Delicious - https://feeds.delicious.com/v2/json/urlinfo/data?url=https://moz.com
- Google Buzz - https://www.googleapis.com/buzz/v1/activities/count?alt=json&url=https://moz.com
- Reddit - https://www.reddit.com/api/info.json?url=https://moz.com
Using these APIs is much better than scraping data from their HTML pages - an API request puts only minimal load on their servers, whereas a full page request would load all of the JavaScript and images and so on.
Using Google Scripts To Call APIs
Google scripts are pretty easy to get your head around - anyone who's ever used a macro in Excel will feel right at home.
Of course, what I haven't shown you is how to actually write your own Google Scripts. Let's deconstruct the script I use in the video:
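In essence it's just a few lines (a minimal sketch of the same FBshares function, matching the version quoted in the comments below):

// Custom spreadsheet function - use as =FBshares("https://moz.com") in a cell
function FBshares(url) {
  // Fetch the Facebook Graph API response for the given URL
  var jsondata = UrlFetchApp.fetch("https://graph.facebook.com/" + url);
  // Interpret the response body as JSON
  var object = Utilities.jsonParse(jsondata.getContentText());
  // Hand the shares count back to the spreadsheet cell
  return object.shares;
}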
What this piece of code is saying is broadly translated as:
"Define a new function for me to use called FBshares that takes one input argument. When the function is called, fetch the contents of the Facebook graph API for the given URL, interpret the data we get back as JSON and then from the JSON give me in my spreadsheet cell the shares data from the JSON"
Once we have this setup we can just put =FBshares("https://moz.com") into a cell and our newly defined function will give us the number of FB shares for the URL. The key thing to realise here is that once we've told the computer that the data is JSON we can read the data using the dot notation very easily. Let's try a slightly more convoluted script to demonstrate what's going on.
Here's the response from the Reddit API for a recent Oatmeal comic:
Basically what this is showing us is the different submissions for a particular URL and the data for each submission. I've collapsed the view to make it easier to read (aside: this online json parser is awesome for making JSON look pretty). Within each "data" section we have the following:
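Abridged, the structure looks roughly like this (field values are illustrative):

{
  "kind": "Listing",
  "data": {
    "children": [
      {
        "kind": "t3",
        "data": {
          "subreddit": "funny",
          "score": 1312,
          "num_comments": 46
        }
      },
      ...one of these per submission...
    ]
  }
}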
Cool - so what we want to do is build a script that will go through each of the data fields in the JSON and sum the score from each. Hey presto! Here's the script that does just that:
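A sketch of that script (RedditScore is an illustrative name - call it whatever you like):

// Custom spreadsheet function - use as =RedditScore("http://example.com/page")
function RedditScore(url) {
  // Fetch every Reddit submission for the given URL
  var jsondata = UrlFetchApp.fetch("https://www.reddit.com/api/info.json?url=" + url);
  var object = Utilities.jsonParse(jsondata.getContentText());
  var score = 0;
  // object.data.children is an array with one entry per submission
  for (var i = 0; i < object.data.children.length; i++) {
    score += object.data.children[i].data.score;
  }
  return score;
}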
Don't be put off by this - it's really very simple. If you don't understand the for loop at the moment, just ignore it and focus on the fact that we can drill down into the JSON using the dot notation. object.data.children is looking at the children element within the data element within the overall object. The reason we need the for loop is that the data within children is an array of items, and we can select which one we want by saying object.data.children[0] for the first one, object.data.children[1] for the second, etc.
Hopefully you can get a feel for writing your own scripts, but if you're lazy you can just copy and paste the scripts I wrote here:
Or, if you're really lazy, you can just click the image below to be taken to a pre-loaded Google Doc with all these scripts :) (make sure you're signed in to Google, then click File -> Make a copy and you'll have a personal version of the sheet to use and play with)
Note that I'm actually dynamically pulling in the most recent SEOmoz blog posts (kind of like a DIY PostRank) but you could just run this across a static list of URLs you copied and pasted in.
The Sky Is The Limit
Hopefully by now you have a taste of the power of Google Docs and you're able to fetch your own social media data quickly and easily (and in spreadsheet form!). For bonus points though you should consider building something like this:
It's a Google Doc that lets you input a keyword - it fetches the top 10 ranking URLs and then fetches the Twitter and Facebook shares for each URL, so you can quickly analyse the social footprint of the sites that are ranking for a particular keyword (I even built in a little geolocation switcher for my UK friends!).
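If you want to roll your own, the SERP-fetching half is just an importxml call on the results page. Something like this (a sketch only - the XPath assumes Google's markup at the time of writing, and it will break whenever that markup changes):

=ImportXML("https://www.google.com/search?q="&A1&"&gl="&A2, "//h3[@class='r']/a/@href")

Here A1 holds the keyword and A2 a country code like uk (that's the geolocation switcher), and you'd then point the FBshares and Tweets functions at each URL that comes back.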
That's all for now folks - leave some links in the comments to some cool things you've built using Google Docs! Remember, if you set the access to "anyone with the link can view, not edit" then people can make a copy - this is a good way of sharing Google Docs without letting other people graffiti it or break it.
I don't get it!
EDIT: FYI I'm joking. See https://twitter.com/tomcritchlow/status/56035651329601536
Lol - you realise that this top comment is very influential in how people perceive the post right? I bet this is going to get lots of down-thumbs now :)
Gack! I didn't expect to be the first comment. Sorry :( I've edited for clarification.
Thanks for the edit - the good thing now is that this thread will probably turn it around so people thumb it up. But will people see this and think I'm gaming the system? How terribly meta-thumb......
Hi Tom, is it possible to still extract this data or any kind of Facebook analytics data into Google Spreadsheets? This was similar to something I had seen earlier https://www.computerworld.com/slideshow/detail/79661#slide1 but I was never able to make it work! :( Can you help me create a spreadsheet?
Very cool, Tom! This really inspires me to learn how to "code some sh*t" as you put it at the conference.
Thanks for all the resources too for putting together scripts to call the different social networking sites. That will be heaps useful!
Thanks :) Hopefully this is a good intro to coding for people who are new to it. Dabbling with Google Docs was how I got started on the path that ultimately led me to learn Python.
Maybe my next post will be all about Python and Appengine...
Python - great choice :)
would love to hear more about appengine!
This Indiana boy seconds that request by our fellow Hoosier, Deep Ripples. Would love to hear more about your experience using Appengine and Python.
Can you have this reference the likes of a fan page instead of the number of mentions on Facebook? Where instead of the URL you just enter the vanity name (seomoz) and it returns the likes?
I wrote this, but it doesn't seem to work:
function FBpagelikes(Url) {
  var jsondata = UrlFetchApp.fetch("https://graph.facebook.com/"+url);
  var object = Utilities.jsonParse(jsondata.getContentTest());
  return object.likes;
}
thanks
Hi,
I copied your code, but added the "toString()" function to object.likes and that seems to work:
return object.likes.toString();
For everyone who loved this blog post I just did a followup Appsumo video on the power of Google Docs - it's free for the next 15 hours:
https://www.appsumo.com/google-docs-unleashed/
It goes hand in hand with this post which is why I'm putting this here :)
Thank you for this post Tom. And not only because it was clear and not as hard to understand as you were fearing in your tweet, but especially because you have convinced this not-a-dev-at-all to finally move his old white ass and get some script s***t done. First this, and then something with the Linkscape APIs.
Hey guys,
Thought I'd drop a comment with a link to Ian's blog post. He has the SEOmoz API working in Google Docs!
https://www.conversationmarketing.com/2011/04/linkscape-google-spreadsheets.htm
Tom, this is the best post I have seen on SEOmoz in a while and I'm really excited to try some of this out.
Keep up the good work!
Freaking Awesome Data for the Data robots to chew on!
Seriouszzly great work on the api calls inside the docs!
You could trick it out even further with push notification and *presto*
You've got yourself a startup in the current bubble!
--
p.s. have you actually charged anyone yet for that?
Your pre-loaded Google Doc does not allow me to make a copy as indicated in your article and on the document itself.
Cheers
Are you logged in to Google when you visit the URL? And do you have a Google Docs account?
You need to ensure you're logged in to see the Make a copy option. Beyond that, I'm not sure, I'm afraid. It's been working for people all week :s
I also had some troubles with this. I was "only" logged in to my Google account. I then opened Google Docs, reloaded your spreadsheet and was then able to save a copy.
By the way: Great post Tom! Really learned a lot!
Great post, and I was able to start using the google docs solution straight away.
Okay, so here's my newbie question - is there any way I can use the Facebook API to create a comparison list of a bunch of Facebook pages, focusing on number of likes? I want to be able to compare my page against my competitors - having all this data on a single Google Docs spreadsheet would make my life very easy! Any feedback much appreciated!
Hey John,
Yes, absolutely you can do this. If you grab a copy of the spreadsheet in the blog post you can copy and paste your own URLs in and then analyse their FB/Twitter stats. Hopefully you can figure out how to duplicate the effort to show your URLs next to your competitors'.
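For example, with the scripts installed, put your URLs and your competitors' URLs down column A, put =FBshares(A1) in column B and fill down, and add a =Tweets(A1) column alongside for the Twitter counts.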
Awesome post! A quick and easy way to check what users did with your URLs in social media.
I'm trying to make this work automatically for other employees, but Google Docs cannot import the RSS feed or XML sitemap from the blog:
=ImportFeed("https://blog.siepomaga.pl/rss", "items url") just says Loading all the time.
=ImportXML("https://blog.siepomaga.pl/sitemap.xml","//url/loc") says that sitemap.xml doesn't exist.
In Excel 2010 I can import the XML file (it's 132 entries) and paste it into Google Docs, but then I get another error: "Service invoked too many times: urlfetch".
Are there any limitations on API queries? Do you have similar problems with Google Docs?
Is it possible to pull in more data from Facebook and Twitter?
I would like it if I could see # of likes, checkins, tweets, retweets, etc.
Whoa! I had no idea what Google Docs was capable of doing! I couldn't get the code to work for myself, however. I kind of wish I was more of a techie in that regard. Maybe I'm doing it wrong? I saved a copy of the spreadsheet into my Google Docs, and then I went into the Script Editor to replace "URL" with "https://www.domain.com", but I keep getting an error message that reads "Missing ) after closing parameters". I checked the code and it looks good. Am I missing something here? Because unlike saffure9, I really don't get it. Is the purpose of this social media tracking spreadsheet to track individual links or general share #s for a particular domain? I'm having a little trouble digesting all of this, as helpful and useful as I know it is. Thanks for sharing this by the way!
It's encouraging to see you progress from your first coding experience with www.7bks.com to playing with APIs.
With this post, you're inspiring a lot of SEOs to look a little closer at certain aspects of information retrieval, which is one thing our industry really needs.
I've been loving a lot of the social media monitoring tools for some time (my fav so far being PostRank), but there are so many DIY possibilities with a minimal amount of coding knowledge.
Thanks for sharing something so useful to all of us.
-Matthew Edward
Thanks Tom.
This is a great post! I can't wait to try it out. Food for my inner geek.
This is a really cool and helpful post. Great stuff!!!
This was absolutely amazing! Thanks so much, Tom, for this very helpful post.
Excuse the pun, but thanks for sharing.
This is really useful, I'm going to get my creative hat on and see what I can come up with, as I'm sure many others are.
Cheers
Tom, you're a genius. I was scared of APIs but now I think I get it. Any chance there's a way to track Twitter followers (rather than tweets) with one of these bad boys? I'm not that advanced yet... Thank you!
Awesome trick that we can use to get clear sharing info.
Hey Tom, I'm not sure if you're still monitoring questions on this post. But if you are: do you know if there is a script that will track who the people are that like your page? (I would like Google Docs to show the URL of their profile.)
I am trying to build a tool that will monitor a Facebook contest. If you can point me in the direction of a tutorial or website that talks about this topic, or if you know of a tool... I would really appreciate it.
-Bryant Jaquez
If this weren't amazing I would have to say it's great.
Thank you so much. I've been searching for ages trying to find the APIs to pull out data for a reporting system, and your post, even though it's quite old now, helped out so much. It goes to show that stuff does stay relevant!
Wow! This is awesome ! pure awesomeness!
I'm working on some automated stuff and this is really helpful! thx
Awesome Tom, TY!
very svelte tutorial. top.
As you say, there are a tonne of tools out there. RowFeeder is a great one for working with Excel... oh, the list goes on.
Burning question though Tom is 'did your mum see that easter egg?'
Hi Tom,
Great post! You are putting a smile on my face :-)
How can I make the Sky Is The Limit sheet work for google.nl?
Thanks
Maarten
Hi there =)
A simple question: I was looking for a function to include/insert/import just the total number of Facebook likes for a page into my website. So I tried your code on my site, but unfortunately, and as expected, it doesn't work.
You know? We have 469 likes on our Facebook page and we just want to insert this number in our sidebar. It should output something like <p>469</p>.
I know this is just a tutorial for Google Docs, but do you guys have any idea for this? I would be so thankful...
Thanks and best greetings =)
Nice post. I knew about APIs but I've kept them out of my life. Excellent moment to give them a go (and hopefully get to real coding!).
Tom, what a fantastic post! Thanks for putting this together and sharing. I should have hours of fun playing with this.
Any tips on how to incorporate Digg stats into this? (Unfortunately I know nothing of APIs and scripting :-( )
I've not used the digg api myself but hopefully you'll find the information you need here:
https://developers.digg.com/
Very useful AND I do understand it (at least after reading the "difficult" section a 2nd time). I like the Google spreadsheet you allow us to use :-). Thanks!
Didn't know Google Docs supported that much functionality yet - I'm eager to try it out.
OMG! That last point - Sky is the Limit - and the example about analysing the competition SM graph on keywords is truly awesome. Gonna dig right into this. Thanks a lot!!!
This was very helpful. I love it. I'm going to have to start getting more into google docs, your post has addicted me to the raw power therein.
Tom - I love you. In a thanks-for-telling-about-scripts-in-google-docs kind of way.
I've been playing around with the APIs and Google Docs for a while for both SEO and other purposes. I even got started on a pretty insane rank analyser using ImportXML only. Now that I know more about the scripts, I see sleepless nights ahead because the possibilities are filling my brain already :)
Thanks for the post. I love the incorporation of the visuals for each site. My personal favorite tracker is TopRank. It's so user friendly that even the social media novice can understand and make sense of all the data. Another great site is socialmention. What do you think of that platform?
Michael
Love it! My eyes started to cross a bit and I'm saving the bulk of it for later but you laid everything out and I'm excited to try it. :)
Is there a way to use an RSS feed API or something to autopopulate the most recent blog posts? Also, I am getting an error of "Service invoked too many times: urlfetch".
Thanks, awesome post!
Yep - you can import an RSS feed using =importfeed("feed url", "items url") though I found for some reason it didn't like parsing feedburner RSS URLs...
The "urlfetch invoked too many times" error is annoying - it only appears intermittently so hopefully you can work around it.
Thank you!
Thanks for the great article. I looked at your importXML expression and then tried to utilize that to import my whole XML sitemap. My sitemap is formed by Drupal and looks like this for each individual URI:
<url>
  <loc>https://www.example.com/</loc>
  <lastmod>2011-05-09T19:14:14+00:00</lastmod>
  <changefreq>daily</changefreq>
  <priority>1.0</priority>
</url>
From the XPath documentation it looks like this should work (sorry, I am not an XML expert), but it does not and the error message I get is not very helpful either.
=importxml("https://www.example.com/sitemap.xml","//@loc")
Any idea why this might happen? Is the Sitemap definition different from standard XML which might block the import?
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="https://www.sitemaps.org/schemas/sitemap/0.9" xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="https://www.sitemaps.org/schemas/sitemap/0.9 https://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd">
I just ran my sitemap through a validation check and it complained about the missing doctype.
Thanks, Jens
Try using "//loc" for your xpath instead of "//@loc". Does that work?
I tried this, but it doesn't work.
Thanks - this is good stuff. Can't wait to implement the Google Docs part!
This is great Tom!
Thanks
Great post Tom, thanks for sharing!
Great job Tom, love it!
The script I had was half-a-- :D
Quality post... props :D
One of the best posts I've read! Some great insight
An absolutely epic post - although I'm still a little bewildered by some of the code-y elements.
On a related note, but also at a massive tangent - if I ever become a rapper (unlikely) I'm going to call myself J-SON and I will become a hit with all the developers out there. A man can dream...
I love Google Docs' ability to run scripts and interface with other Google services like Gmail. It's a lot of fun to play with. However, it's not really a robust solution for scalable monitoring, so we have to look at other tools. Radian6, Scout Labs, etc. seem pricey when you consider much of the data comes from APIs that are out there for us to access for free! I think if I had a few months on my hands, I'd love to develop the killer affordable social monitoring and management tool. Or does it exist?
Thanks for including us, Tom. :)
Good to see options for those who are newer to the monitoring and measurement game, as well as some more advanced options for the data geeks (and those in the making!)
Melanie
Thanks for pointing out how useful Google Docs can be. I had been looking at Zoho Creator before. I'll now have to try both ;)
I guess I will be using Google Docs to parse and work on data extracted by https://open.dapper.net, which is an excellent web-based website scraper.
This is an amazing post. I've never really dived into the APIs for most networks, you've given me most of the resources needed to get started. I guess I have no excuse to put it off any longer.
Time to make some fancy looking graphs of my own!
Very interesting stuff, will test it all out soon =)
I've only used Google Docs for sharing spreadsheets and documents, never more than that =)
The FBshares(url) function doesn't work for me. It was working on a couple of cells, but now I get 'thinking' flash up, disappear, and then an empty cell. I know that these particular URLs would have got FB shares - many, in fact.
I've only populated a couple of cells, so I can't imagine I'm pinging the API at all hard.
Any ideas Tom?
P.S. The Tweets(url) function works fine.
Try changing the script to return the contents of the URLfetch and see what the JSON looks like. That might give you an idea of what's going wrong. Sometimes gdocs and/or facebook throws a hissy fit so you might just need to come back in 5 mins :)
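Something like this, i.e. a throwaway function just for debugging (FBdebug is an illustrative name):

function FBdebug(url) {
  // Return the raw response body so you can eyeball the JSON in the cell
  return UrlFetchApp.fetch("https://graph.facebook.com/" + url).getContentText();
}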
Thanks Tom,
OK, well it seems that some of the pages didn't have shares, but likes. However, I am unable to return likes as I have no authentication. I know we have likes by using https://graph.facebook.com/url, and the conditional coding below ("Whohooo!") which I added works fine.
When there are no likes and no shares the cell will return "No Shares" - so I know that we aren't hitting the API too hard.
So I've added to your code like so:
function FBshares(url) {
  var jsondata = UrlFetchApp.fetch("https://graph.facebook.com/" + url);
  var object = Utilities.jsonParse(jsondata.getContentText());
  if (object.shares == null) {
    if (object.likes != null) {
      return "whohooo! ";
    } else {
      return "No Shares";
    }
  } else {
    return object.shares;
  }
}
If you or anyone knows how to return likes and comments much like sharedcount.com, but then using your method, that would be awesome!
Try parsing a FQL url like this instead:
https://api.facebook.com/method/fql.query?query=SELECT%20share_count,%20like_count,%20comment_count,%20total_count%20FROM%20link_stat%20WHERE%20url=%22https://www.seomoz.org/blog/how-to-track-your-social-media-strategy%22&format=json
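Wrapped up as a custom function, that might look something like this (a sketch - FBstats is an illustrative name):

function FBstats(url) {
  // FQL query for share/like/comment counts on the given URL
  var query = "SELECT share_count, like_count, comment_count, total_count " +
      "FROM link_stat WHERE url=\"" + url + "\"";
  var jsondata = UrlFetchApp.fetch("https://api.facebook.com/method/fql.query?query=" +
      encodeURIComponent(query) + "&format=json");
  var object = Utilities.jsonParse(jsondata.getContentText());
  // The response is an array with one row per URL queried
  return object[0].total_count;
}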
Thanks Tom. Awesome. Will look at parsing this. Many thanks for your quick responses and knowledge. Hope you're loving New York.
From here in New Zealand, thanks!
I am not able to track my social media strategy correctly using the sharedcount.com website, because it isn't showing the results correctly. :-(
Wow, what an impressive post! Thanks so much for the helpful how-to. If you don't have time to do it yourself, Radian6 is an awesome tracking tool to use!