In the Moz Q&A, there are often questions that are asked about, or answered with, a reference to the all-powerful .htaccess file. I've put together a few .htaccess snippets that often come in handy. For those who aren't aware, the .htaccess file is a type of config file for the Apache server, which allows you to manipulate and redirect URLs, amongst other things.
Everyone will be familiar with tip number four, the classic 301 redirect that SEOs have come to know and love. The other tips in this list are less common, but they're quite useful to know when you need them. After you've read this post, bookmark it, and hopefully it will save you some time in the future.
1) Make URLs SEO-friendly and future-proof
Back when I was more of a developer than an SEO, I built an e-commerce site selling vacations, with a product URL structure:
/vacations.php?country=italy
A nicer URL would probably be:
/vacations/italy/
The second version allows me to move away from PHP later, is probably better for SEO, and even leaves room for further sub-folders down the line if I want them. However, it isn't realistic to create a new folder for every product or category; besides, it all normally lives in a database anyway. Adding the following to the .htaccess file solves this:
<Files magic>
ForceType application/x-httpd-php5
</Files>
This will allow the 'magic' file, which is a PHP file without an extension, to then look like a folder and handle the 'inner' folders as parameters. You can test it out here (try changing the folder names inside the magic 'folder'):
https://www.tomanthony.co.uk/httest/magic/foo/bar/donk
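If you'd rather not rely on the ForceType trick, a plain mod_rewrite rule can achieve a similar effect. Here is a minimal sketch, assuming a vacations.php script that reads a 'country' query parameter as in the example above (adjust the names to your own site):
# Map /vacations/italy/ (and friends) onto the existing script internally.
RewriteEngine On
RewriteRule ^vacations/([^/]+)/?$ vacations.php?country=$1 [L,QSA]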
2) Apply rel="canonical" to PDFs and images
The SEO community has adopted rel="canonical" quickly, and it is usually kicked around in discussions about IA and canonicalization issues, where before we only had redirects and blocking to solve a problem. It is a handy little tag that goes in the head section of an HTML page.
However, many people still don't know that you can apply rel="canonical" in an alternative way, using an HTTP header, for cases where there is no HTML to insert a tag into. An often-cited example is applying rel="canonical" to a PDF, pointing either to an HTML version of the document or to the PDF's download page.
An alternative use would be for applying rel="canonical" to image files. This suggestion came from a client of mine recently, and is something a couple of us had kicked about once before in the Distilled office. My first reaction to the client was that this practice sounded a little bit 'dodgy,' but the more I think about it, the more it seems reasonable.
They have a product range that attracts people to link to their images, but those links aren't very helpful in terms of SEO (any traffic coming from image search is unlikely to convert). Apply rel="canonical" from those images to the product page, however, and suddenly they become helpful links, and the rel="canonical" starts to seem pretty reasonable.
Here is an example of applying HTTP rel="canonical" to a PDF and a JPG file:
<Files download.pdf>
Header add Link '<https://www.tomanthony.co.uk/httest/pdf-download.html>; rel="canonical"'
</Files>
<Files product.jpg>
Header add Link '<https://www.tomanthony.co.uk/httest/product-page.html>; rel="canonical"'
</Files>
We could also use some variables magic (you didn't know .htaccess could do variables!?) to apply this to all PDFs in a folder, linking back to the HTML page with the same name (be careful with this if you are unsure):
RewriteRule ([^/]+)\.pdf$ - [E=FILENAME:$1]
<FilesMatch "\.pdf$">
Header add Link '<https://www.tomanthony.co.uk/httest/%{FILENAME}e.html>; rel="canonical"'
</FilesMatch>
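The same trick can be extended to images. This is a hedged sketch, assuming each image shares its base name with the corresponding product page (the URL pattern is illustrative, so adjust it to your own structure):
# Capture the image's base name, then emit a canonical Link header pointing
# at the matching HTML page.
RewriteRule ([^/]+)\.(jpg|jpeg|png)$ - [E=IMGNAME:$1]
<FilesMatch "\.(jpg|jpeg|png)$">
Header add Link '<https://www.tomanthony.co.uk/httest/%{IMGNAME}e.html>; rel="canonical"'
</FilesMatch>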
You can read more about it here:
https://support.google.com/webmasters/bin/answer.py?hl=en&answer=139394
3) Robots directives
You can't instruct search engines not to index a page unless you allow them to access that page. If you block a page with robots.txt, Google might still index it if it has a lot of links pointing to it. To issue the instruction, you need the noindex meta robots tag on every page you want kept out of the index. If you aren't using a CMS, or are using one that makes this awkward, that could be a lot of work. .htaccess to the rescue!
You can apply directives to all files in a directory by creating an .htaccess file in that directory and adding this command:
Header set X-Robots-Tag "noindex, noarchive, nosnippet"
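If you only want to cover certain file types rather than everything in the directory, you can scope the header with a FilesMatch block. A minimal sketch for PDFs (requires mod_headers, as above):
<FilesMatch "\.pdf$">
Header set X-Robots-Tag "noindex, noarchive, nosnippet"
</FilesMatch>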
If you want to read a bit more about this, I suggest this excellent post from Yoast:
https://yoast.com/x-robots-tag-play/
4) Various types of redirect
The common SEO redirect is ensuring that a canonical domain is used, normally www vs. non-www. There are also a couple of other redirects you might find useful. I have kept them simple here, but oftentimes you will want to combine them to avoid chaining redirects (a combined example follows the snippets below):
# Ensure www on all URLs.
RewriteCond %{HTTP_HOST} ^example\.com [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]
# Ensure we are using HTTPS version of the site.
RewriteCond %{HTTPS} !on
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
# Ensure all URLs have a trailing slash.
RewriteBase /
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*)$ https://www.example.com/$1/ [L,R=301]
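And here is the combined sketch mentioned above, which enforces both www and HTTPS in a single 301 so visitors never pass through a chain of redirects (it assumes example.com is your only hostname):
# Force www and HTTPS in one hop.
RewriteCond %{HTTP_HOST} ^example\.com [NC,OR]
RewriteCond %{HTTPS} !on
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]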
5) Custom 404 error page
None of your visitors should be seeing a white error page with black techno-babble when they end up at a broken URL. You should always be serving a nice 404 page which also gives the visitor links to get back on track.
You can also end up getting lots of links and traffic if you put time and effort into a cool 404 page, like Distilled's.
This is very easy to set up with .htaccess:
ErrorDocument 404 /cool404.html
# Can also do the same for other errors...
ErrorDocument 500 /cool500.html
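One caveat: give ErrorDocument a local path, not a full URL. If you point it at a full URL, Apache responds with a redirect rather than serving the page with a 404 status, so search engines never see the 404:
# Serves the page with a proper 404 status:
ErrorDocument 404 /cool404.html
# Redirects instead, losing the 404 status - avoid this:
# ErrorDocument 404 https://www.example.com/cool404.html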
6) Send the Vary header to help crawl mobile content
If you are serving a mobile site on the same URLs as your main site, but rather than using responsive design you are altering the HTML, then you should be using the 'Vary' header to let Google know that the HTML changes for mobile users. This helps them to crawl and index your pages more appropriately:
https://developers.google.com/webmasters/smartphone-sites/details
Again, this is pretty simple to achieve with your .htaccess file, independent of your CMS or however you are implementing the HTML variations:
Header append Vary User-Agent
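If you want to be defensive about mod_headers being unavailable (an unrecognised directive in .htaccess returns a server error for every request), you can wrap the line in an IfModule check; a minimal sketch:
<IfModule mod_headers.c>
Header append Vary User-Agent
</IfModule>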
7) Improve caching for better site speed
There is an increasing focus on site speed, both from SEOs (because Google cares) and also from developers who know that more and more visitors are coming to sites over mobile connections.
You should be careful with this tip: make sure there isn't already a caching system in place, and choose an appropriate caching length. However, if you want a quick and easy way to set it (in seconds), you can use the snippet below. Here I set static files to cache for 24 hours:
<FilesMatch "\.(flv|gif|jpg|jpeg|png|ico|swf|js|css|pdf)$">
Header set Cache-Control "max-age=86400"
</FilesMatch>
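An alternative sketch uses mod_expires, which lets you set lifetimes per MIME type rather than per file extension (this assumes the module is enabled on your server):
<IfModule mod_expires.c>
ExpiresActive On
ExpiresByType image/jpeg "access plus 1 day"
ExpiresByType image/png "access plus 1 day"
ExpiresByType text/css "access plus 1 day"
ExpiresByType application/javascript "access plus 1 day"
</IfModule>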
8) An Apple-style 'Back Soon' maintenance page
Apple famously shows a 'Back Soon' note when they take their store down temporarily during product announcements, before it comes back with shiny new products to love or hate. When you are making significant changes to your site, redirecting users to such a page can be quite useful. However, it can also make it tough to check the changes you've made.
With this bit of .htaccess goodness, you can redirect people based on their IP address, so you can redirect everyone except your own IP address and 127.0.0.1 (the loopback address, i.e. requests from the server itself):
RewriteCond %{REMOTE_ADDR} !^your_ip_address$
RewriteCond %{REMOTE_ADDR} !^127\.0\.0\.1$
RewriteRule !back_soon\.html$ https://www.example.com/back_soon.html [L,R=307]
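A variation worth considering, sketched below on the same assumptions (your_ip_address is a placeholder and back_soon.html lives in the document root), is to return a 503 Service Unavailable status instead of redirecting, which tells search engines the downtime is temporary:
# Serve back_soon.html with a 503 for everyone except your own IPs.
ErrorDocument 503 /back_soon.html
RewriteCond %{REMOTE_ADDR} !^your_ip_address$
RewriteCond %{REMOTE_ADDR} !^127\.0\.0\.1$
RewriteCond %{REQUEST_URI} !^/back_soon\.html$
RewriteRule ^ - [R=503,L]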
9) Smarten up your URLs even when your CMS says "No!"
One of the biggest complaints I hear amongst SEOs is about how much this or that CMS "sucks." It can be intensely frustrating for an SEO to be hampered by the constraints of a certain CMS, and one of those constraints is often that you are stuck with appalling URLs.
You can overcome this, turning product.php?id=3123 into /ray-guns/ in no time at all:
# Rewrite a specific product...
RewriteRule ray-guns/ product.php?id=3123
# ... or groups of them
RewriteRule product/([0-9]+)/ product.php?id=$1
This won't prevent people from visiting the crappy versions of the URLs, but combined with redirects from the old URLs to the new ones, or with judicious use of rel="canonical," you can improve the situation tremendously. Don't forget to update your internal links to the new ones. :)
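If you also want to push visitors and link equity from the old URLs to the new ones, a hedged companion sketch is to 301 the parameterised version, matching against THE_REQUEST so the internal rewrite above doesn't loop (the id and path are the illustrative ones from above):
# 301 the old query-string URL to the friendly one, without looping.
RewriteCond %{THE_REQUEST} \?id=3123[\s&]
RewriteRule ^product\.php$ /ray-guns/? [R=301,L]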
10) Recruit via your HTTP headers
Ever looked closely at SEOmoz's HTTP headers? You might have missed the opportunity to get a job...
If you would like to add a custom header to your site, you can make up whatever headers and values you'd like:
Header set Hiring-Now "Looking for a job? Email us!"
It can be fun to leave messages for people poking around - I'll leave it to your imaginations! :)
Download the rules
You can grab all of these rules in quick-form from a compilation I made.
Viewing headers
If you are unsure about how to look at HTTP response headers, here's a great tool to get you started.
If you would rather do it in your browser, follow these steps:
- Chrome on Windows: Ctrl-Shift-I and click 'Network' (then reload the page)
- Chrome on Mac: Command-Option-I and click 'Network' (then reload the page)
- Firefox: Install Live HTTP Headers
Share yours!
Anything I missed, mistakes I made, or better ways to do something? Any cool ones you have up your sleeves? I'd love people to add their tips to the comments so I can come back to this post next time I get stuck. I'll try to update my download file with any cool ones the community comes up with.
Thanks for reading, and don't forget to test anything you change! :)
A handy snippet if you want the site to work only on the office intranet, which we have needed a few times. Just add the IPs you want to allow, in the format: allow from IP_address
<Limit GET POST HEAD PUT DELETE>
order deny,allow
deny from all
allow from 122.176.46.234
allow from 122.176.46.1
allow from 122.160.62.79
allow from 122.160.62.1
allow from 182.68.176.236
allow from 203.110.95.98
allow from 115.241.205.244
allow from 115.240.7.130
</Limit>
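Worth noting that on Apache 2.4 and later the order/deny/allow directives only work via mod_access_compat; the equivalent under the newer authorization syntax looks like this (reusing a couple of the IPs above):
<RequireAny>
Require ip 122.176.46.234
Require ip 122.160.62.79
Require ip 203.110.95.98
</RequireAny>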
A few more that are missing from this post:
Redirect all non www to www:
# Redirect non-www urls to www
RewriteEngine on
RewriteCond %{HTTP_HOST} !^www\.yoursite\.com
RewriteRule (.*) https://www.yoursite.com/$1 [R=301,L]
Redirect all www to non www:
# Redirect www urls to non-www
RewriteEngine on
RewriteCond %{HTTP_HOST} ^www\.yoursite\.com [NC]
RewriteRule (.*) https://yoursite.com/$1 [R=301,L]
Found this via https://www.htaccessbasics.com/force-www-nonwww-domain/
Very late comment, Tom, but hoping you're still getting notifications on this thread...
First - thanks for the useful bites of info - there were several here I wasn't familiar with.
As far as canonical headers for PDFs, your .htaccess directive is
<Files download.pdf>
Header add Link '<https://www.tomanthony.co.uk/httest/pdf-download.html>;
rel="canonical"'
</Files>
but wanted to check... I'm assuming the "download.pdf" indicates the file is in the root directory, and if it is stored deeper, the full relative path to the file would need to be included here?
Hope you can confirm.
Thanks!
Paul
P.S. Your code shows the Header add line broken over two lines - was that just for formatting? Can the code be included in the .htaccess all on one line?
You can do something like this: DocumentRoot /directory/
However, I suggest having separate .htaccess file in every directory instead of having one in root.
Thanks Tom, I wasn't aware of 8 out of these 10. But now I know at least these 10. Thanks Buddy :)
Me too. Lots of new stuff here.
One thing I HIGHLY recommend doing (that wasn't mentioned here) is to back up your current .htaccess file... because a single typo can literally make your entire website display an Internal Server Error!
What I do is test things on a site that gets little traffic. Then, when it's working, I clone the new .htaccess file to my big site.
Hi Tom, thanks for this useful piece. I just want to ask about the 404 rule "ErrorDocument 404 /cool404.html". This works fine when people reach a page in the root folder; however, if they are inside a folder, e.g. www.blabla.com/folder1/somethingnotfound, it does not work. I guess we need to put the 404 file in every subdirectory as well?
Thanks
1) Make sure your root directory is referenced correctly in the error code, i.e. / is root, but you may need to change it depending on where the file is located. Look on your server and pull the full path to the .htaccess file.
2) A .htaccess file in a sub-directory can overwrite rules from the .htaccess at the root level. Make sure you aren't fighting yourself. If you have a .htaccess in the folder that isn't working, simply add the ErrorDocument 404 /cool404.html call there too.
3) Ensure your server configs are set up properly to allow .htaccess control. Many LAMP setups default to disallowing .htaccess to improve performance and reduce load.
Nice post. I've usually worked in the Microsoft IIS space but have been doing more PHP / LAMP lately, so this will come in handy.
BTW, here are some resources on URL rewriting for IIS:
https://www.seomoz.org/ugc/microsoft-technologies-and-seo-web-development
https://www.iis.net/learn/extensions/url-rewrite-module/using-the-url-rewrite-module
Also, URL rewriting rules use Regular Expressions (RegEx) for URL matching. Here are a couple of good RegEx resources. These apply to both Apache and IIS:
https://www.ultrapico.com/Expresso.htm (free RegEx builder tool)
https://www.regular-expressions.info/quickstart.html (RegEx tutorial)
Bookmarked! I didn't know you could do some of these; they can be very useful in the future, thanks! A few more you should add to your list:
GZIP compression - make your site load faster
<ifModule mod_gzip.c>
mod_gzip_on Yes
mod_gzip_dechunk Yes
mod_gzip_item_include file \.(html?|txt|css|js|php|pl)$
mod_gzip_item_include handler ^cgi-script$
mod_gzip_item_include mime ^text/.*
mod_gzip_item_include mime ^application/x-javascript.*
mod_gzip_item_exclude mime ^image/.*
mod_gzip_item_exclude rspheader ^Content-Encoding:.*gzip.*
</ifModule>
Page redirect - useful when you move your page location and want to redirect to it
Redirect 301 /old.htm /new.htm
Site redirect - useful when you are moving an entire site to a new domain
Redirect 301 / https://www.newsite.com/
Site redirect w/ same page structure - if you are moving domains and keeping the same URL structures
RewriteCond %{HTTP_HOST} !^www\.newsite\.com [NC]
RewriteRule ^(.*)$ https://www.newsite.com/$1 [R=301,L]
Amazing stuff, especially the canonical tag for images; love that part. These really are useful hidden snippets for SEOs.
Forgot about this one too!
<IfModule mod_headers.c>
Header set Connection keep-alive
</IfModule>
Make sure Keep-Alive is turned on in your server config.
Thanks for the great tips.
Also, I can't access the test site https://www.tomanthony.co.uk/httest/magic/foo/bar/donk to check how exactly it works...
Hi,
I'm trying to redirect a www.example.com/index.html page to just www.example.com.
I've tried a bunch of different options involving [THE_REQUEST] rewrite code, but none are working with this site. It's a WordPress site: www.betterhealththrunutrition.com
I realized that if I deleted this rewrite rule
RewriteRule . /index.php [L]
from the .htaccess file, the redirect works, but none of the other sub-folders work any longer! (e.g. /about returns a 404) :D
Can someone please help!
This should help you James:
RewriteEngine on
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/]+/)*index\.html\ HTTP/
RewriteRule ^(([^/]+/)*)index\.html$ https://%{HTTP_HOST}/$1 [R=301,L]
Source: https://www.stevefortuna.com/remove-index-php-from-url/
Your guide is so comprehensive that I have no questions anymore!
Love this blog post - one thing I got stuck with is writing .htaccess for WordPress and vBulletin together.
Bookmarked!
Yeah, I meant to add in the post that some of these can be problematic with things like WordPress installed, as you can get annoying conflicts which you have to work out on a case-by-case basis.
Yes, exactly. I installed vBulletin in the domain root and WordPress under a "blog" directory with a single .htaccess (some rules were generated by vBSEO), and I powered the home page with WordPress using the tutorial on their site. I have fixed everything now except the blog home pagination, which I can't get working with .htaccess. I'd appreciate any help with this.
Tom, your article is really helpful for me. I actually only knew some of these points (1, 2, 4, 5, and 10) and didn't know about the other five, so hats off to you, and thanks for sharing this valuable information with us.
Here's another one that I don't think has been tossed up there.
<FilesMatch ".(js|css|html|htm|php|xml)$">
SetOutputFilter DEFLATE
</FilesMatch>
What does it do, Alan?
It is a method of applying gzip compression to those file extensions. It reduces file sizes and server load over time.
Awesome thanks :) Will try it out !!
Cool, ping me on Twitter and let me know if you have any questions, @Texas_Marketing ... I've done a lot of work on my sites to improve page speed and don't mind throwing out some suggestions.
Great post, Tom! .htaccess intimidated the hell out of me for several years. It wasn't until I recently took over majority control of Miles Design's website that I became more comfortable. The tips you've offered are very helpful - I'll specifically be applying the custom 404 page as soon as our site redesign is complete and ready for launch!
Do you have any insight on finding the correct place within the .htaccess to make these changes? That was always the most intimidating aspect for me...
Hey Robbie. Good question! To place things accurately, you do need to develop a decent understanding of the syntax.
Having said that, you can work through the rules from top to bottom and trace what would happen to a specific URL with those rules. If you find a redirect that sends you to a new URL which then gets redirected again, it is a good sign you might be able to improve the rule ordering.
Generally you want the most specific rules earliest. Each of these should also try to encapsulate later rules. For example, imagine you have a rule that enforces www on your domain, and then you have a rule that redirects an old product page to a new one. The product rule should go first, and should also ensure it is redirecting to the www version of the URL.
Hope that makes sense! Thanks for the comments.
Without a doubt I'm bookmarking this post.
Just be careful when editing your .htaccess live; you've got a good chance of taking down your site :)
Nice tips. I recently started experimenting with .htaccess and this post is just what I was looking for. To check Http Headers in Chrome, I personally like the HTTP Headers extension: https://code.google.com/p/chrome-extension-http-headers/
I especially like your first and last tips. Going to check them out right away, thanks!
Nice tips. Luckily, I want to learn about .htaccess right now. And thanks for letting me know about the Inspect Element feature in Chrome; I didn't know about it before. Thanks again.
This is great. The canonical for images and PDFs will be great, and the info on 404 error pages and clean URLs is just as good.
Thanks nux but it's still not working!! :(
not sure what's up with this site...
Invaluable information - I already had some of these, but it's great to have someone put them and some extras into one list :)
Thanks Tom!
Great tips to follow for a better SEO implementation. Thanks for showing the right method for making SEO-friendly URLs.
Excellent article, thanks. As far as viewing HTTP headers goes, my preferred tool is urivalet.com.
Hi, a question if possible: how should this be handled when the URLs to be redirected are image files? Thanks
Hi Tom, thanks for all 10 tips. I already knew five of them; thanks for teaching me five more about .htaccess. This Chrome extension from Ayima shows the redirect path and the type of each redirect (301, 302).
Hi there Tom, I have to say that is a very nice and complete post. I am sure the newbies who don't have much technical knowledge will be thanking you for a long time. Keep up the good work. :)
Hey, thanks for the 10 tips... I have faced a problem on a static HTML website (hosted on Apache) which has got a link from another website containing the pattern ?cid=, e.g. google.com/?cid=123
I want URLs with that "?cid=" part to be redirected to the home page (using .htaccess). Is it possible...?
Indeed, .htaccess is powerful, and I think carefully about each and every change I make.
Then I verify that everything works as I expected it to.
Hello sir,
I have a problem with the canonical tag. I want my site to open at www.example.com, and I set up a 301 redirect in .htaccess and it works, but now my website is not working properly; e.g. add-to-cart doesn't work and the site admin won't open. What can I do? Please, can anyone help me? My site is developed on the OpenCart system.
Thanks in advance.
Hello Tom,
Nothing is more productive than making your site SEO-friendly and future-proof. After the release of new algorithm updates, SEO has changed significantly. Anyway, thanks for sharing these wonderful tips. I will be using all of these later on.
I have a problem with my .htaccess file: when I type example.net in the browser, it redirects and returns 'not found'. I use this canonical code:
RewriteEngine on
RewriteCond %{HTTP_HOST} ^example\.net$ [NC]
RewriteRule ^(.*)$ https://www.example.net/$1 [R=301,L]
Tom,
Could you tell me how to redirect HTTPS to HTTP using .htaccess?
This is very useful, thank you so much.
Google is now pushing for an encrypted/HTTPS web. So here you go!
## Requests to HTTP
RewriteCond %{SERVER_PORT} 80
## Not a subdomain or other domain name within your site root.
RewriteCond %{HTTP_HOST} ^(www\.)?yourdomainname\.com$
## Send them to your HTTPS site
RewriteRule ^(.*)$ https://www.yourdomainname.com/$1 [R,L]
For some reason I ran into character set issues with WP; adding this to .htaccess solved the issue:
AddDefaultCharset utf-8
Hi all,
First of all, thanks a lot for your post!
Hope this is still being watched... I want to achieve the following:
First: myserver.ch and www.myserver.ch should show a masked display of the contents of www.myserver.com/wordpress/de (BTW, all content is on the .com server).
Second: www.myserver.com should show a masked display of the contents of www.myserver.com/wordpress
By the way, there is a Moodle installation at /moodle, which should stay as is.
Both rewrites should work simultaneously and be SEO-friendly. Can this be done simply with .htaccess in both cases? If so, how?
Thanks! maan
Wow that PDF canonical section should be a post of its own. Thanks Tom!
I don't think the rel=canonical for images would work. While it's a creative idea, rel=canonical is supposed to be used for pages that are nearly identical. It's also important to note that implementing a rel=canonical is only a suggestion for Google and is not a guarantee like a 301. I highly doubt that Google would accept a rel=canonical from an image to a product page, BUT I would love to be proven wrong if you have any data confirming that Google was following the rel=canonical for the images.
Is it redundant to include a redirect to my canonical domain (www) in my .htaccess file since I already have the correct rel="canonical" in my header?
Here is another useful snippet:
RewriteCond %{HTTP_HOST} ^111\.222\.222\.111
The above is used for ip canonicalization, to prevent duplicate content.
It is used like this:
# Ensure www on all URLs and IP canonicalization
RewriteCond %{HTTP_HOST} ^111\.222\.222\.111 [OR]
RewriteCond %{HTTP_HOST} ^example\.com [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]
Hi, thanks for this helpful information. I'm new to .htaccess, and I have a problem when using tip
9) Smarten up your URLs even when your CMS says "No!"
It didn't catch my CSS. What can I do?
Here is the link where I tried it: https://sweetluscious.com/cookies/
I will post a video tut on this soon.
Great information! I am going to bookmark this page for future use. Thanks @Tom Anthony for sharing this .htaccess stuff.
This is awesome. I am as non-technical as you get, and even I understood this. This is a great "plain English" guide to .htaccess files. Even if you're agency-side and don't go into servers yourself, this can be a great resource to send to developers. Bookmarked :)
Tom, great post. What I would like to add is using the gzip compression module, which goes directly in your root .htaccess file. It compresses your pages, which enables faster loading and also saves on bandwidth. We had this implemented for one of our clients and we did see a slight improvement in loading times. Here is a reference - https://webdesign.about.com/od/speed/ht/website-compression.htm
Tom,
Great stuff. I think this will help a lot of people with questions concerning .htaccess. It solves a lot of problems and adds a lot of value for those who didn't know everything about it (like myself). Great stuff, and keep it up.
thanks
Regards
Jarno
Hi,
thanks for your article!
Other very useful suggestions for the .htaccess file are in the HTML5 Boilerplate project: https://github.com/h5bp/html5-boilerplate/blob/master/.htaccess
Boilerplate itself: https://html5boilerplate.com/
Best wishes, Georg.
Thanks for the share; the list is fairly comprehensive for what is actually needed. There are a few notable security .htaccess tricks that are worth mentioning, though.
Great stuff Tom! Definitely a few I've never used before - tip #1 is awesome! Saving these as Coda clips!
Custom 404 pages are a lot of fun ;o)
your pal
chenzo
Hi Tom,
Great article but you could also use the Redirect Path extension for Chrome by Ayima to view headers.
Adam.
Amazing tips! If only I could get a firm grasp on the .htaccess syntax!
A complete rundown of .htaccess files, with examples of .htaccess code.
It's a really informative post; however, we have a couple of questions in terms of .htaccess and redirection.
1. My first question: is creating manual redirects through Internet Services Manager (ISM) on Windows Server safe in terms of SEO?
2. Is it good in terms of SEO for a website to redirect its custom 404 page to the home page with a 302 header status?
Thanks for sharing in-depth information about .htaccess redirections!
1. Windows Servers don't support .htaccess files. HTTP header redirects are "safe" in terms of SEO. It's more to do with what you want to achieve and how to achieve it in a way that is friendly to users and search engines.
2. A 404 is not a redirect. It is a status code a URL returns to indicate the requested URL does not exist. You should not use redirects (3XX codes) to handle missing pages (404).
A common mistake is to 302 the missing URL to a page that shows an error message and returns a 200 code.
@tiggerito
Thanks for the response. I do understand that 404 is a header status code, but my question is: if we use a 302 redirect with a pause of 10 seconds for 404 pages on the website, is that really good in terms of SEO?
@tiggerito: Windows Server using IIS does not support .htaccess files out of the box. However Windows Server running Apache does support .htaccess files.
@Venkatesh Madgundi: no.
Nice article, thanks.
For viewing headers, I would add Fiddler as a complete standalone HTTP sniffer tool.
Hey, great post. It would be great to see something like this in the form of a cheat sheet, like the anatomy of a URL cheat sheet or the web dev's SEO cheat sheet. Fingers crossed!
A complete guide that should always be kept in mind.
Thanks Tom for putting together this awesome resource. I was really excited to see you can add rel="canonical" to pdf documents and images!
I have a question: in Google's Webmaster technical guidelines, they suggest you "make sure your web server supports the If-Modified-Since HTTP header".
Can you add some instructions on how to ensure that using .htaccess? Is that possible?
https://www.feedthebot.com/tools/if-modified/
Generally this is handled on the back-end, not through .htaccess.
Hi Alan,
I tried the feedthebot.com tool some time ago, and again just now, but I'm not sure it is accurate... for all the websites I've tested it returns No, even for seomoz.com.
Or maybe the tool is accurate and not many servers support the If-Modified-Since HTTP header...
I use SEOBook's tool set to pull the server header info.
https://tools.seobook.com/server-header-checker/
This will show any HTTP information that is active on the page, including dates and relevant time information, connection status, etc.
Great stuff, I hadn't thought about several of these before. So far I've only brushed the surface of htaccess but the little bit I know is very helpful. Thanks to you, I've added a few tricks to my bag.
Great post! Thanks for sharing.
Thanks for the wonderful tips on maximising the use of .htaccess, love #5 - custom 404 page.
Thanks Tom for such detailed work on .htaccess snippets; I didn't know about the .pdf/.xlsx canonical or the site-speed tips. Thumbs up (y)
Great post, Tom. It will help me a lot. Thanks for posting.
Thank you Tom, very useful for me. I was looking for exactly this kind of snippet.
Thanks!!
Awesome list! Thank you.
One of the best posts on SEOmoz this year! Awesome work Tom!
Reading #2 triggered a light bulb. Have you tried adding rel=canonical for video transcriptions?
It's great when people make development tasks easy for non/semi/aspiring-to-hopefully-one-day-be developers.
Thanks, great advice. I am a novice when it comes to this stuff and need to become more technical, and this post has given me a kick start. Cheers!
One thing to be conscious of with (6) and (7): some caching layers will not cache items sent with a Vary: User-Agent header as Google recommends; I think Varnish won't do this out of the box, and I know Akamai won't do it at all. Be careful to check your caching layer before you implement this header!
Wow. Posts like this make me feel like the dumbest guy in the room. I appreciate that.
Thanks for the snippets. I saved them all - and now I'm going down the rabbit hole to find out what else I don't know about htaccess.
Great post bringing to light some key .htaccess means to advance a site into 2013!
Love how you mentioned the Vary Header for dynamic serving on a single URL to allow true Mobile SEO over just responsive design.
100 thumbs-up post, Tom :) It's really amazing knowledge about the .htaccess file. I like the 404 page and the canonical setup with .htaccess most. Thanks for sharing; I will share it with my social networks :)
Thank you for posting the great content.