One of the biggest challenges many of my clients face is putting the right SEO processes in place, so that any problems are quickly caught before they lead to bigger issues. Below are three things you should consider when trying to create a more streamlined process for making sure the technical foundation of your site is solid. None of these are "quick" or necessarily easy wins, and they can initially take a significant amount of time, but in the long run they will make monitoring the SEO on your site more efficient. That means less time spent identifying and fixing site issues and more time focusing on other aspects of SEO, like link building, developing a content strategy, and so on. Over time, the impact this has on your site can yield high rewards.
1) Technical Annotations in Google Analytics
Currently, many of my clients with Google Analytics accounts either don't include any annotations at all, annotate only their email, PPC, and social campaigns, or use annotations solely to keep track of search engine algorithm changes (like Panda updates). However, annotating any technical changes made to the site in Google Analytics creates a far more efficient internal process.
Scenario 1: Let's say you have set up custom alerts in Google Analytics to notify you of any spikes or drops in traffic. Having technical changes annotated in Google Analytics then makes it quicker and easier for you to pinpoint the cause of that spike or drop, instead of investing hours later trying to determine what caused the change in traffic. In addition, any major technical change runs the risk of being implemented improperly (in terms of SEO considerations), simply because there are so many issues to take into account.
Here is more information on how to set up a custom alert in Google Analytics.
Scenario 2: Oftentimes SEO is not a technical priority for the development team, mostly because it is difficult to measure the ROI of what is often a significant amount of invested time and effort. Creating annotations in Google Analytics could help with this: for example, if a spike in traffic occurred and could be attributed to a technical implementation on the site, the development team could be properly recognized as the cause of that improvement.
2) Sitemaps: Google/Bing Webmaster Tools
SEOs should create an internal process where Google Webmaster Tools is checked at least once a month to ensure there are no major issues with the sitemaps or with bots crawling the site. Sitemaps are only useful if they are kept up to date and well-maintained.
Why is this important? Duane Forrester of Bing has stated that "Your Sitemap must be clean. We have a 1% allowance for dirt in a sitemap." His definition of dirt includes 404 or 500 status code errors and redirects. He continues by saying "If we see more than a 1% level of dirt, we begin losing trust in the Sitemap."
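If you want to spot-check that level of "dirt" yourself between Webmaster Tools reviews, a small script is enough. The sketch below is a minimal example, assuming Python 3 with the requests library installed; the sitemap URL is a placeholder you would swap for your own:

```python
# Minimal sitemap "dirt" check: fetches every URL in an XML sitemap and
# reports how many return errors or redirects. Assumes Python 3 with the
# third-party "requests" package; the sitemap URL below is a placeholder.
import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # replace with your own sitemap
NAMESPACE = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def get_sitemap_urls(sitemap_url):
    """Return the list of <loc> URLs in the sitemap."""
    response = requests.get(sitemap_url, timeout=30)
    response.raise_for_status()
    root = ET.fromstring(response.content)
    return [loc.text.strip() for loc in root.iter(NAMESPACE + "loc")]

def check_urls(urls):
    """Request each URL without following redirects and tally the 'dirt'."""
    dirty = []
    for url in urls:
        # Some servers reject HEAD requests; swap in requests.get if needed.
        status = requests.head(url, allow_redirects=False, timeout=30).status_code
        if status != 200:  # redirects (3xx) and errors (4xx/5xx) all count as dirt
            dirty.append((url, status))
    return dirty

if __name__ == "__main__":
    urls = get_sitemap_urls(SITEMAP_URL)
    dirty = check_urls(urls)
    dirt_rate = 100.0 * len(dirty) / len(urls) if urls else 0.0
    print(f"{len(dirty)} of {len(urls)} sitemap URLs are dirty ({dirt_rate:.2f}%)")
    for url, status in dirty:
        print(f"  {status}  {url}")
```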
Best practices include submitting a new Sitemap regularly, depending on how often new content is generated on the site. A publishing site might need to update every few hours, an e-commerce site every week, and a relatively static site every month.
Sitemaps should be checked in Webmaster Tools at least monthly to make sure nothing has gone wrong.
These checks include:
- Checking for error messages
- Checking the number of pages submitted versus the number indexed
- Checking for malware (and addressing it immediately!)
- Checking for crawl errors (like 4xx and 5xx issues)
Using Screaming Frog
If you have a Screaming Frog license, you can also use it to verify Google Webmaster Tools errors, especially because Google Webmaster Tools does not always update its errors promptly; you don't want to be chasing 404s that have already been fixed. You can also use it to check your sitemap for errors: simply upload the XML sitemap into Screaming Frog and crawl it. Craig Bradford of Distilled wrote a fantastic blog post on how to use Screaming Frog to accomplish these tasks and more.
If Google Webmaster Tools is not periodically checked, the number of errors can seem overwhelming. Joe Robison wrote a fantastic SEOmoz post on fixing an overwhelming number of errors in Google Webmaster Tools.
3) Creating Automated Scripts
404 Pages Returning Status 200 Codes:
Barry Schwartz wrote a blog post on how 404 pages should not return 200 status codes. The reasoning is that this confuses spiders, which see a page that technically exists but has no real content. Over time it can affect rankings because it creates massive duplicate content, with bots crawling the same error content over and over again across several URLs.
He also suggests creating automated scripts to check for this type of issue.
To first gauge the extent of this problem on your site and get a rough estimate of the number of 404 pages returning 200 status codes, plug a site: search query into Google. See the example below:
site:example.com/ "page not found"
If the query returns results, you know your site is returning status 200 codes for 404 pages and that this issue needs to be fixed.
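One way to automate the check Barry suggests is a small script like the sketch below, assuming Python 3 with the requests library; the domain is a placeholder and the nonexistent path is generated on the fly:

```python
# Quick soft-404 check: requests a URL that should not exist and verifies the
# server actually answers with a 404. Assumes Python 3 with "requests"; the
# SITE domain is a placeholder.
import uuid
import requests

SITE = "https://www.example.com"

def returns_soft_404(site):
    """Return True if a clearly nonexistent page comes back with a 200."""
    bogus_path = f"/{uuid.uuid4().hex}-this-page-should-not-exist"
    response = requests.get(site + bogus_path, allow_redirects=False, timeout=30)
    print(f"GET {bogus_path} -> {response.status_code}")
    return response.status_code == 200

if __name__ == "__main__":
    if returns_soft_404(SITE):
        print("Problem: the error page returns a 200 and can be indexed as duplicate content.")
    else:
        print("OK: nonexistent pages do not return a 200.")
```

Run from a cron job, a script like this can alert you as soon as a template or server change starts serving error pages with a 200.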
SEO Score Card:
I've talked about creating an SEO score card before. I also recently recommended a version of this to a client with hundreds of thousands of URLs. In this specific instance, they had difficulty making sure that only high-quality, non-duplicate content was indexed. Being an e-commerce site, it also had tons of products that were very similar (resulting in identical product descriptions and content across the site).
I suggested creating an internal score sheet that would automatically be re-run every month to make sure that all currently indexed pages are still considered high-quality, while also offering pages that were once deemed low-quality an opportunity to be reviewed regularly. Once those low-quality pages become high-quality, they are automatically indexed again.
The same process could also be used to generate the sitemaps. The broader goal is to future-proof the site against future search engine algorithm changes while improving its overall domain authority.
There are caveats to address when creating an SEO score sheet: we want to be careful about noindexing pages, especially since over time this could result in less and less of the site being indexed. Once the initial script is written, check the results and confirm these are actually pages that you want noindexed. If not, the script may have to be rewritten.
The ultimate goal is to make sure that only quality pages are indexed, while also keeping tabs on how many more pages on the site need unique content. This type of knowledge can prove useful when creating the site's linkbuilding/content strategy.
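To make the idea concrete, here is a deliberately minimal sketch of such a score sheet, assuming Python 3 with the requests library; the URL list, the 250-word threshold, and the exact-duplicate check are illustrative assumptions rather than a definitive scoring model:

```python
# Minimal SEO score-card sketch: scores a list of URLs on two simple signals
# (word count and exact-duplicate body text) and splits them into "index" and
# "review" buckets. Thresholds, URLs, and scoring rules are illustrative only.
import hashlib
import re
import requests

URLS = [
    "https://www.example.com/product-a",
    "https://www.example.com/product-b",
]
MIN_WORDS = 250  # assumed threshold for "thin" content

def body_text(html):
    """Crude tag stripper; a real score card would use a proper HTML parser."""
    text = re.sub(r"<script.*?</script>|<style.*?</style>", " ", html, flags=re.S | re.I)
    text = re.sub(r"<[^>]+>", " ", text)
    return re.sub(r"\s+", " ", text).strip()

def score_pages(urls):
    """Bucket each URL as index-worthy or needing review (thin or duplicate)."""
    seen_hashes = {}
    index_worthy, needs_review = [], []
    for url in urls:
        text = body_text(requests.get(url, timeout=30).text)
        words = len(text.split())
        fingerprint = hashlib.md5(text.lower().encode("utf-8")).hexdigest()
        duplicate_of = seen_hashes.get(fingerprint)
        if words < MIN_WORDS or duplicate_of:
            needs_review.append((url, words, duplicate_of))
        else:
            index_worthy.append(url)
            seen_hashes[fingerprint] = url
    return index_worthy, needs_review

if __name__ == "__main__":
    good, review = score_pages(URLS)
    print("Index (could feed the XML sitemap):")
    for url in good:
        print(f"  {url}")
    print("Review before indexing (thin or duplicate):")
    for url, words, dup in review:
        reason = f"duplicate of {dup}" if dup else f"only {words} words"
        print(f"  {url} ({reason})")
```

A production version would pull its URL list from the CMS or server logs and use stronger quality signals, but the monthly re-run and the index/review split are the parts worth keeping.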
Conclusion
The overall goal is to build a streamlined process for technically auditing a site that can be documented and thus communicated internally. A more efficient process means more time invested in other important elements: compiling quality content, building an online community, and social media, to name a few.
Nice job! Excellent post... I would also recommend checking for the rel=canonical tag as well as standardizing on a www version of the site. Making sure all non-www URLs 301 to the www version, as well as checking rel=canonical, can easily eliminate duplicate content issues and hence lift any penalty.
Great addition! I definitely agree :)
Hi Stephanie,
Interesting points. There are certain things that I would love to see automated:
There are many instances where a test site is hosted on a staging server as a replica of the live website. Of course, the robots.txt file of the test site blocks all search engine spiders from crawling those pages. However, there are cases where the site is transferred from the staging server to the live server along with the blocking robots.txt file, which means the site stops getting crawled by search engine spiders and eventually suffers a fall in rankings and traffic.
Wish #1 - A tool/software that would alert the people concerned when the robots.txt file is blocking search engine spiders.
For Google Analytics, there are many websites with an include.php file where the GA code can just be copy-pasted once and it appears on all the pages. Problems occur when there are multiple template files. If a website has 200,000-odd pages spread across 20 templates, then even if the GA code is removed from one of the templates, you would not see a major drop in traffic; of course there will be a fall, but chances are it might go unnoticed.
Wish #2 - A tool/software which would alert webmasters when the GA code is removed from any template on a multi-template site.
There are cases where canonical tags cannot be implemented in the page itself simply because the CMS does not allow it. In such cases one would obviously go for canonical tag implementation in the HTTP header response. The biggest problem here is when a website has thousands of PDF files and a corresponding number of HTML/PHP/ASP pages live, with the canonical tags present only in the HTTP header response.
Wish #3 - A tool/software which would crawl all such pages and alert webmasters in case canonical tags are removed from the HTTP header response.
It would be great if these pointers could somehow be incorporated while building a streamlined process for the technical audit of a website.
I am sure community members will definitely have a few more suggestions/pointers.
Cheers,
Sajeet
All three wishes are very easily granted. Is there really not a tool people use to do this? How many other people would like one? If there is demand, I might write one for you all.
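A rough sketch of what such a monitor could look like, assuming Python 3 with the requests library (all the URLs and the UA-XXXXX-Y property ID below are placeholders):

```python
# Rough monitoring sketch for the three wishes above: checks robots.txt for a
# blanket Disallow, checks one sample page per template for the GA snippet,
# and checks PDF URLs for a canonical in the HTTP response headers.
# All URLs and the UA-XXXXX-Y property ID are placeholders.
import requests

SITE = "https://www.example.com"
GA_PROPERTY_ID = "UA-XXXXX-Y"  # placeholder GA property ID
TEMPLATE_SAMPLE_PAGES = [SITE + "/", SITE + "/category/sample", SITE + "/product/sample"]
PDF_PAGES = [SITE + "/docs/sample.pdf"]

def robots_blocks_everything(site):
    """Wish 1: warn if robots.txt carries a blanket 'Disallow: /' (staging leftover)."""
    lines = requests.get(site + "/robots.txt", timeout=30).text.splitlines()
    return any(line.strip().lower().replace(" ", "") == "disallow:/" for line in lines)

def pages_missing_ga(pages, property_id):
    """Wish 2: list sample pages whose HTML no longer contains the GA property ID."""
    return [url for url in pages if property_id not in requests.get(url, timeout=30).text]

def pdfs_missing_canonical(pages):
    """Wish 3: list PDF URLs whose HTTP headers lack a rel="canonical" Link header."""
    missing = []
    for url in pages:
        link_header = requests.head(url, timeout=30).headers.get("Link", "")
        if 'rel="canonical"' not in link_header:
            missing.append(url)
    return missing

if __name__ == "__main__":
    if robots_blocks_everything(SITE):
        print("ALERT: robots.txt is blocking the whole site")
    for url in pages_missing_ga(TEMPLATE_SAMPLE_PAGES, GA_PROPERTY_ID):
        print(f"ALERT: GA code missing on {url}")
    for url in pdfs_missing_canonical(PDF_PAGES):
        print(f"ALERT: canonical header missing on {url}")
```

Run it from cron and email yourself the output, and all three checks happen without anyone having to remember them.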
Yes, please do!
Agree with Dave! :)
I agree - such a tool would be great!
Seems like there is some interest. I'll put some more thought into what is needed over the weekend and will start knocking one together in my free time. I'm in the middle of a house move at the moment, so I'm a little snowed under, but this should be a fun little project. I'll let you all know when it's ready.
It would be great if you could create such a tool.
Interesting Post Stephanie!
I am glad that you mentioned that Google Webmaster Tools (or Bing's, if you use it) should be used to check for any errors and flaws in the website; this actually helps webmasters fix issues before they become a dragon and take hours to resolve.
Another important point to take into consideration is the fact that Google Webmaster Tools does not update on a regular basis, so we should not rely on it alone... I like the idea of double-checking through Screaming Frog.
Overall a great Read!
Hi Stephanie!
A great way to follow up on the technical issues from Dave's post yesterday.
I'm a big fan of setting up annotations in Google Analytics. It's a really effective and time-saving way to examine drops and spikes, as well as to visually show, explain, and connect those changes with previous interactions for your customers.
Nice post Stephanie.
I can certainly vouch for analytics annotations, especially when you've got multiple people working on a site. Meaningful annotations can really help to unravel the cause of positive and negative changes to your site's traffic and usage.
Don't just use it for keeping track of major site updates - try and get into the habit of recording anything that may affect traffic. Being able to show your customer the changes you made, when you made them and what the impact was is very powerful!
Absolutely! If it is possible, annotate everything
Thanks for the tip on annotating, I don't know how I missed this, but this is going to be very useful. I am going to start on that tonight.
It was an interesting post to read, Stephanie! One question came to mind after reading it. I just wanted to make sure of one thing about the pages that have no content: would they be treated as duplicate content by Google? When you mentioned above that "it could be confusing to spiders if they see a page that has no content," were you saying that spiders might treat the "no content" as "duplicate content"? Also, in case Google does consider it a duplicate, should the website owner request removal of those pages via Google Webmaster Tools to avoid any plagiarism issue? Could you please clear up this query so that I can understand the scenario.
Hi Stephanie,
I read your post and it is really great for SEO. You cover sitemaps, Webmaster Tools, and automated scripts; these three things are really helpful for the SEO process. You also mention how often to update your site, and if we update it at the suggested intervals we are sure to see the benefits.
I went through the article and it was very helpful. The links to older articles are very useful too. Thanks! I will join Maria in asking for a Part 2 :)
LOVED this article! Are you planning to write a Part II? Because I know there are several other technical factors.
These are great technical factors in SEO. I also just found this article over on Search Engine Watch, which outlines a few other considerations, and the author writes in a way that's very easy to understand. I would definitely recommend checking it out:
https://searchenginewatch.com/article/2300520/Technical-SEO-for-Nontechnical-People
I most liked the "Sitemaps" details :)
Thanks,
I don't understand why people absolutely want to integrate a sitemap into their main SEO strategy... It is just an artificial means to quickly index a lot of pages from your site. If you don't use Google News, please tell me why you really want to use a sitemap :).
Nice post! The way you have described how to make technical annotations in Google Analytics is simply great. I agree with your point that Google Analytics is really helpful for keeping track of SEO algorithm changes. Google Analytics is really important, and your post helps describe how to use technical annotations to rank a website higher in leading search engines. Yes! With a solid technical SEO process, we can check for error messages by checking sitemaps at least once a month. It would be good to see more information about Google Webmaster Tools in your forthcoming posts. Overall it is a great post!
Clear article. I have some mistakes on my pages; after reading this, I'll fix them following what you say.
Can you help me check my website https://xuonginoffset.com? I'm an amateur at SEO.
Thank you very much.
Stephanie
We're just in the process of revamping our SEO process, so good timing for your post. Some good food for thought while going through and updating.
Hi Stephanie,
The information which you have provided about the technical aspects of SEO is really very useful. Thanks for sharing and reminding us of this information.
Hardik
Thanks for the great post! Very informational! Maybe you can add me on FB?
Thanks for a great post. The tip on using Screaming Frog to verify sitemaps is gonna come in very handy!
You're missing out if you haven't started using it already!
Great post! It contains very relevant information on the technical aspects of search engine optimization. Creating annotations in your Google Analytics account is very important, and this post helps define and describe how to use annotations to rank your website higher in the search engine result pages.
Thanks for a good post and sharing information
Good post and I agree with all your statements. However, checking your Webmaster Tools once a month?! This seems a bit risky to me, as problems can occur at any time... Personally, I check it every morning for any errors or issues that may have occurred overnight. It's just not worth leaving yourself wide open to potential problems if you're not monitoring, analysing and fixing.
Even better- some of my clients never check it, so once a month is already a huge step forward :)
Good response! Glad to see you're educating your clients. Extremely important!
Thanks Stephanie, was a nice read!
To be honest, I think a lot of SEOs kind of neglect the essence and/or importance of Webmaster Tools in particular. Being such a powerful snapshot of glaring problems, posts like this help to emphasise its importance.
Will be looking to scrub up in this area myself now! lol
Thanks again :-)
Hi Stephanie thanks for the post. As usual you always make sure to get your message across clearly.
If anyone has read Avinash's posts, you would know that utilising Google Analytics Intelligence reports allows you to analyse a site's unknown unknowns. You can read more about his concept here - https://www.kaushik.net/avinash/leverage-web-analytics-custom-alerts/
For GWT, this should be something all SEOs initially incorporate into their site audits; that way it is something they can follow up on for their clients on a regular basis. I also prefer to do mini reviews, considering the client's budget and the nature of the website.
Finally, doing an SEO score card, in my belief, is going to be something that will help you stay out of trouble! I'm sure SEOs have experienced situations where they have committed to a project without understanding the site's limitations. Doing an SEO score card will allow you not only to identify site opportunities, but also to understand the site's context and target market in general.
Thanks again,
Vahe
Yes, I really enjoyed reading Avinash's post. Had some great insights!
Sadly this part is all too common: "SEO is not a technical priority for the development team, mostly because it is difficult to measure the ROI of what is often a significant amount of invested time and effort." As a web developer for over 12 years, justifying development time to introduce SEO best practices, automation, analytics, and social media integration has always been a tough sell to employers and clients. They struggle to understand the concepts, and so it's often overlooked, or half-baked solutions are applied that actually make it worse.
Great post, technical SEO can make such a big difference but is often overlooked. Thanks for sharing.
Good post Chang, really informative.