I work with a lot of insurance agents and they often tell me that their biggest competitors are the other insurance agents that work for the same company as them. As it is with most big brands, there are multiple offices in the same city and those individual locations all want to rank first on Google for the exact same thing.
An accidental discovery
I stumbled across something a few months ago when troubleshooting ranking for one of our insurance clients, and everyone I've shared this with was just as surprised as I was.
The discovery? Google filters out a ton of pages for big brands (organically), which seems to have a direct impact on which locations rank in the local 3-pack.
To illustrate this, I'll look at a few examples. Usually when I search for a branded term, the people that rank high in the 3-pack also rank high organically, as illustrated below.
What's interesting is that you'll notice there are only 3 organic results that belong to this brand before you start seeing small business directories like Yellowpages or Yelp.
However, when you add &filter=0 to the end of the search URL, you will see that most of these location pages are being filtered out. Adding this parameter to the URL is the same as going to the last page of results and clicking the message that states:
In order to show you the most relevant results, we have omitted some entries very similar to the [X] already displayed. If you like, you can repeat the search with the omitted results included.
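As a quick illustration, here is a minimal sketch of building that unfiltered search URL with Python's standard library. The `filter=0` parameter comes from the article; the helper function name and the example query are my own for illustration.

```python
from urllib.parse import urlencode

def search_url(query, unfiltered=False):
    """Build a Google search URL. Passing unfiltered=True appends
    filter=0, which asks Google to include the results it would
    normally omit as 'very similar' to ones already shown."""
    params = {"q": query}
    if unfiltered:
        params["filter"] = "0"  # 0 = include omitted near-duplicate results
    return "https://www.google.com/search?" + urlencode(params)

print(search_url("H&R Block Houston", unfiltered=True))
# https://www.google.com/search?q=H%26R+Block+Houston&filter=0
```

Note that `urlencode` handles the escaping for you (the `&` in "H&R" becomes `%26`), which is easy to get wrong when pasting the parameter by hand.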
This isn't just happening to one brand; it's happening to most.
It also doesn't just happen in the insurance industry. Any big brand that has multiple offices/branches that are serving the same city would possibly run into this. Here is another example for the search "H&R Block Houston":
Why is Google doing this?
Most people are aware that Google filters duplicate content. They want to provide their users with unique results, not a repetitive list of the same thing over and over. This is part of why they have a filter — to keep a single domain from completely dominating the search results page.
How does Google decide who gets filtered?
Whichever pages have the highest ranking power and are the most relevant show up first and avoid the filter. To no one's surprise in the SEO community, the factor I see influencing this the most is links. Often the first couple of pages that show up have good local links, such as from a charity they sponsored in town or a professional organization they belong to, that the other offices don't have.
What should big brands be doing differently?
- Make your location pages more distinct from one another. When I compared two of the filtered pages for one insurance brand, the content on the two agents' pages was 77% the same. I looked at two location pages for a tax accountant business in the UK and they were 81% the same. A lack of unique content is an easy way to trigger Google's filter, since the pages look like duplicates. Be wary of widgets that copy the same content to every page.
- Take advantage of targeting zip codes or community names so that each location can rank better for different types of keywords that are more central to their area. For example, there's a community called Unionville which is inside the city of Markham, ON, yet most franchises target only Markham on their sites.
- Try to get more links to the individual location pages so they're more authoritative. Many sites automatically link to the homepage for a big brand instead of linking to the page that lists the information about that specific location.
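If you want a rough sense of how similar two of your location pages are, here is a sketch using Python's built-in `difflib`. The article cites 77% and 81% overlaps without naming the tool used, so treat this stand-in metric, along with the sample page texts, as my own illustration rather than the author's method.

```python
from difflib import SequenceMatcher

def similarity(text_a, text_b):
    """Return a 0-1 similarity ratio between two page texts.
    A rough stand-in for whatever duplicate-content checker
    produced the article's 77%/81% figures."""
    return SequenceMatcher(None, text_a, text_b).ratio()

# Hypothetical location pages that differ only in the city name
page_a = "We offer auto, home, and life insurance in Houston. Call our agents today."
page_b = "We offer auto, home, and life insurance in Dallas. Call our agents today."

print(f"{similarity(page_a, page_b):.0%}")  # very high, since only the city differs
```

Two pages scoring in the high 70s or above on a measure like this are exactly the kind Google's filter is likely to collapse into one result.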
Other uses for &filter=0
I also use this feature regularly to find out if Google is confused about which page to show for the keyword I'm targeting. If I want to do a good job at local SEO, I don't want Google to be confused! Often it will reveal pages that Google thinks are related to that same keyword, even if it's slightly different ("dog trainer" vs. "dog training"). I once had a client whose homepage was ranking while the service page we wanted to rank was filtered. By removing some of the duplicate content on the homepage and adjusting the meta tags, we solved that problem and saw his ranking increase over the next couple of months.
Have you found other good uses for the &filter=0 search operator? Tell me about them in the comments.
Hello, JoyHawkins
This reminded me of "&filter=0" after almost 7 years. That parameter is very useful for studying local listing results. I should say your observation is wonderful. Thank you for sharing!!
This is a fascinating article and incredibly relevant to my business as a real estate agent. Gave me a few ideas to test out and see if I can improve my ranking results. Thanks for this article Joy!
Glad I could help :)
Can this filter also be applied to Google maps results?
Yes and no. Getting filtered in the organic results doesn't mean your listing gets filtered in the map pack. You'll probably find it will increase in position if you get unfiltered, because the signals that get you unfiltered help ranking in both sections. There is another type of filter I've seen in the map pack that is based more on the similarities of the listing (name, website, phone #). It's extremely rare that I would see 2 listings for the same business (using the same phone number) ranking together in the map pack.
Sometimes these accidental discoveries teach you more than normal ones. This is a classic example.
Great observation about the use of the filter. Thanks!
Very interesting article. I had noticed this "filter thing," but I couldn't tell why specific results are filtered out or how to avoid the filtering. Thank you.
Thanks for the hat tip, Joy!! I also use another parameter, mentioned by Darren Shaw, which brings back the location search filter Google recently removed from its search bar: "&location=CITYNAME". Hope everyone's using that too.
I never tested the &filter=0 URL trick. This will surely be very helpful in improving page content and rankings.
Thanks Joy!!
This is great for understanding the filter. I've been looking into this for some time and this explains it clearly to me! Thanks for sharing!
Joy, thank you for sharing. Very interesting!
Great post, thanks! Will have to do some testing.
Question: What plugin are you using to get the position change overlay on the left hand side of the SERP?
It's called SERPTrends and it's pretty awesome!
Great article, thank you! What plugin are you using to show the rankings rising and falling in the SERPs?
Chase, It's called SERPTrends.
This would be more for SEO folks than ordinary users. Also, if you're filtered, then work on getting back in, imo.
Since the search was a regional term, it seems appropriate that the region landing page for Baltimore, MD was shown in the second position. From the user's point of view, that seems like the most relevant result for that search, as opposed to showing seemingly random agent-level results. I think this underscores the importance of having a location directory with region/state/city-level landing pages, as opposed to focusing on just the location-specific pages.
Looks like I have something new to test too :D
"&filter=0" - I'll see if I can use this for other purposes, thanks!
I just learned about this. Very helpful. Thanks, Joy Hawkins. Your very first paragraphs startled me a little, but thankfully this only applies to duplicate content on the same domain.
Thanks Joy Hawkins, for this valuable article about Google's Filter.
On subheading "What should big brands be doing differently?" #3: I think having links from a variety of sources pointing to your different locations could help search engines see the worth and high relevance of each location as a single point of interest, keeping them from being dropped by the filtering. But this all depends on the user experience: if users don't like the content and leave quickly after landing on it, you can't compete with other, more unique entries being crawled.
On the other hand, I would suspect it highly depends on where you are located with your laptop or tablet, which influences what choices are presented to you. That's fairly self-evident; I just wanted to mention it.
This was a revelation to me. Thanks so much for sharing this with us!
Best,
PopArt Studio
Thanks a bunch. My company has a few versions of a website for different countries and I think we screwed this up, I'm currently checking different regions to see if nothing's missing.
Edit: We did have one problem related to the filter and now it's solved. Thanks again!
Wow. I had no idea Google was filtering like this. It makes sense, but I'm sure that presents a bit of a challenge to rank in crowded areas. I definitely need to use the &filter=0 more often to figure out why some pages are performing better or worse. Thanks Joy!!
Very interesting article Joy. I was not aware of this up to now. Thanks for the detailed analysis as well.
Seems like common sense that they wouldn't show 10 links from the same domain in 2016. Google has evolved quite a bit since the beginning, and that would be a very 1990s, neanderthal version of the SERP. If people want to look at just one website's or brand's results, they will search for that site specifically instead of using a generic search.
Also makes sense since they switched 7-packs to 3-packs, and Matt Cutts explicitly stated that they made updates to prevent domain clustering as early as 2013. https://www.briggsby.com/google-domain-clustering-...
See Algorithm Change History - Domain Crowding, May 2013: https://moz.com/google-algorithm-change
I have seen the same thing with the search term "DIR Body Camera": it showed me 7 results from the same domain in the top positions.
I had never thought deeply about the filter parameter and the "most relevant results" message. Thinking about it now, I'd say I had unconsciously assumed that Google was filtering out results from the same website or duplicate content, ultimately showing the highest-ranked link. But this has enormous implications for the local SEO of small firms, not just big brands. They need to know not just all their neighborhood competitors, but also analyze those competitors' websites to differentiate themselves.