Based on the project in the previous posts, here’s what I learned about how GEOsets work, and how you can maximize your chances when creating or optimizing a GEOset on your site.
Definitions
First of all, a GEOset is a group of pages that all rank for the same typed keyword. They don’t cannibalize each other; they rank simultaneously because there is a geographic implication in the keyword and a geographic difference in the page content. Two users might ask Google for the same thing, but they’re standing in different places and get different answers.
Natural Granularity
First, identify the Natural Granularity of the thing your site is selling. For internet service providers that’s the city level, but it can be different for different topics. The way to know what level of granularity to divide your GEOset into is to identify where the landscape of the service offering actually changes. Combinations of ISPs are demonstrably different from one city to the next, but not from one zip to the next. For other service offerings, things might differ regionally, or even down at the neighborhood level. Wherever things actually change, those are your geographic divisions. (One crude way to measure this is sketched below.)
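Here’s a minimal sketch of that check, with made-up data: the city names, zip codes, and provider names are all hypothetical, not from the actual project. The idea is that at the natural granularity, sibling pages should have visibly different offerings, so the level with low average overlap is where the landscape actually changes.

```python
from itertools import combinations

def jaccard(a: set, b: set) -> float:
    """Overlap between two provider sets (1.0 = identical)."""
    return len(a & b) / len(a | b)

def avg_sibling_similarity(units: dict[str, set]) -> float:
    """Average pairwise overlap across sibling geographic units."""
    pairs = list(combinations(units.values(), 2))
    return sum(jaccard(a, b) for a, b in pairs) / len(pairs)

# Hypothetical data: cities differ, zips within a city mostly don't.
cities = {
    "Austin": {"ISP-A", "ISP-B", "ISP-C"},
    "Boise":  {"ISP-A", "ISP-D"},
    "Tampa":  {"ISP-B", "ISP-E", "ISP-F"},
}
zips_in_austin = {
    "78701": {"ISP-A", "ISP-B", "ISP-C"},
    "78702": {"ISP-A", "ISP-B", "ISP-C"},
    "78703": {"ISP-A", "ISP-B"},
}

print(f"city-level overlap: {avg_sibling_similarity(cities):.2f}")          # ~0.15 -> landscape changes here
print(f"zip-level overlap:  {avg_sibling_similarity(zips_in_austin):.2f}")  # ~0.78 -> too granular
```

If the zip-level number had come out low instead, zips would be the natural division for that topic.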
Goldilocks zone and what happens when you get it wrong
If we’d decided to index zip pages and build out content there, things would have gone sideways really fast. Zip pages are too similar to each other to be the naturally granular level for ISPs. At best, all our efforts would have been for nothing, since these pages would have cannibalized each other; cannibalization is what happens when many pages function the same way and target nearly identical keywords. Since geographic difference is the key to avoiding cannibalization in a GEOset, lack of geographic difference guarantees it. At worst, this looks like spam and can get your site penalized.
But there’s a danger in going too broad as well. Sites work best when any given user who lands there gets useful results, personalized to their unique situation if possible. For an ISP site, a pile of zip pages, while risky for cannibalization, isn’t necessarily bad for the user. Each page is narrow and helpful; there are just too many of them. So it’s best to keep them as non-indexable, internally searchable pages (one way to do that is sketched below). State pages, on the other hand, certainly avoid the risk of cannibalization, but they lack utility. Because the natural granularity of ISPs is the city, state-level data isn’t actionable. If someone asks you “where are you from?” and you tell them “Planet Earth”, the answer is true, but also probably useless.
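As a concrete example of keeping pages out of the index while still serving them to internal search, you can send a noindex signal via the X-Robots-Tag HTTP header, which search engines respect. This is a minimal sketch assuming a Flask app; the route and template names are hypothetical, not the project’s actual stack.

```python
# A minimal sketch, assuming a Flask app; the route and template
# names here are illustrative, not the project's actual stack.
from flask import Flask, make_response, render_template

app = Flask(__name__)

@app.route("/zip/<zip_code>")
def zip_page(zip_code: str):
    # The page still renders and stays reachable from internal search...
    resp = make_response(render_template("zip.html", zip_code=zip_code))
    # ...but search engines are told not to index it.
    resp.headers["X-Robots-Tag"] = "noindex"
    return resp
```

A `<meta name="robots" content="noindex">` tag in the page template accomplishes the same thing if you’d rather not touch headers.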
Granularity tests
There are a few ways to double-check that you’ve divided your GEOset with the right granularity. The first is the “Landscape Test”: does the answer or result on the page change when the page changes? If not, you’ve probably drilled down too narrowly. Sure, a user can get what they need, but there’s no reason for them to be on this particular page, and that looks like spam. (The overlap check sketched above is one crude way to run this test at scale.) The second is the “Utility Test”: can a user get what they need from this page, or do they have to click “deeper”? If they have to click deeper, then the GEOset should go deeper as well.
The Slinky effect, or the “critical mass” of GEOsets
Because the pages in a GEOset rank for the same keyword but provide localized results, they behave both like a group of pages and like a single page. Yes, if one page is considerably different from the rest it might rank differently, but they often move in unison. And if your GEOset covers more or less an entire population, the pages will not be weighted equally. On city-level GEOsets, about 50% of the population lives in cities represented by 15-20% of the pages in the set. This means a project can be done on some pages rather than all of them, but it must be done on enough pages to represent a significant percentage of the GEOset’s total users; we didn’t see any movement until we’d optimized several hundred pages. (The back-of-the-envelope version of this math is sketched below.) All of this depends on the granularity being appropriate for the topic, so that each page has unique value while the pages stay topically equivalent.
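Here’s that critical-mass arithmetic as a rough sketch, with an invented population distribution (the numbers are illustrative, not our actual data): sort pages by the population they serve and count how few of them it takes to cover half the audience.

```python
# A rough sketch of the "critical mass" arithmetic; the population
# figures below are invented, not real project data.

def pages_to_cover(populations: list[int], target_share: float = 0.5) -> int:
    """Smallest number of pages (largest first) covering target_share of people."""
    total = sum(populations)
    covered = 0
    for count, pop in enumerate(sorted(populations, reverse=True), start=1):
        covered += pop
        if covered / total >= target_share:
            return count
    return len(populations)

# A skewed, city-like distribution: a few big pages, a long tail.
populations = [500_000] * 5 + [100_000] * 30 + [20_000] * 100
n = pages_to_cover(populations)
print(f"{n} of {len(populations)} pages ({n / len(populations):.0%}) cover 50% of users")
# -> 18 of 135 pages (13%) cover 50% of users
```

With a distribution this skewed, prioritizing the biggest pages buys audience coverage fast, and the long tail can wait.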
Technical rebuttal against spam
Finally, getting the granularity of the GEOset right is a natural defense against accusations of spam. If your pages are truly unique and truly useful, then they satisfy Google’s ultimate goal: serving the user a good experience. I’ll talk more about a specific example of this in the next post.
