Backstory:
When I started SEO work on highspeedinternet.com, it was the first time I'd worked on a site with a GEOset. If you don't know, a GEOset is essentially a templatized group of pages that all rank for the same keyword, but it's not cannibalization because the search engine layers local intent on top. Highspeedinternet.com (which I'll call "HSI" for brevity) had about 30,000 city pages in its GEOset, and who knows how many zip pages, when I started working on it. The biggest sites I'd worked on up to that point were probably 100 pages at most. (Honestly, it was a bit overwhelming at first and forced me to learn a bunch of advanced Excel just to stay on top of things. But more on that later.)
HSI was, and still is, an affiliate marketing site that resells residential internet. Technically it's not a reseller, since the consumer gets the exact price and plan they'd get from the ISP itself. HSI and other affiliate sites make their money not from the consumer as a middleman but from the ISP, by charging a "finder's fee." Think of HSI as a search engine for one specific topic: it works just like a search engine by making money from advertisers, while the people who use it get the results for free.
Anyway, when I started on the site, it didn't rank for much. The GEOset, which targeted terms like "internet providers" and "internet providers near me," ranked on page 3 of Google at best and didn't generate significant traffic. There was also a "Providers" page designed to capture search volume for the same terms as the city pages, plus terms like "internet providers by zip code." It did double duty as a paid search landing page, and was mostly a giant zip-check field and nothing else.
When someone entered a zip code, they'd be routed to a zip code page, which was a parameterized version of the city page associated with that zip. Zip pages were canonicalized to their respective city pages, but they still showed up in search results whenever Google didn't respect the canonical tags.
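As a rough sketch of that zip-to-city relationship (the URL patterns match HSI's city pages, but the lookup table and function names here are illustrative, not HSI's actual code):

```python
# Illustrative zip-to-city mapping; the real site resolved this from a database.
ZIP_TO_CITY = {
    "90001": ("ca", "los-angeles"),
    "10001": ("ny", "new-york"),
}

def canonical_url_for_zip(zip_code):
    """Each zip page canonicalizes to its city page, consolidating signals on one URL."""
    state, city = ZIP_TO_CITY[zip_code]
    return f"https://www.highspeedinternet.com/{state}/{city}"

def canonical_tag(zip_code):
    # The tag a zip page would emit in its <head>, pointing Google at the city page.
    return f'<link rel="canonical" href="{canonical_url_for_zip(zip_code)}">'
```

The catch, as noted above, is that `rel="canonical"` is a hint rather than a directive, so Google sometimes indexed the zip pages anyway.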
Initial optimizations and goals:
My goal was to increase rankings and traffic on both the "Providers" page and the city pages. At the time, search volume for "internet providers" was about 135,000 searches/month and "internet providers by zip code" was about 1,600/month. HSI captured barely any of that, with the "Providers" page ranked around the middle of page 1 on Google and generating about 30% of the site's 800 daily sessions from organic search.
Here’s what a typical city page looked like then. [link to https://web.archive.org/web/20160709211328/https://www.highspeedinternet.com/ca/los-angeles] As you can see, it was pretty bare bones. Apart from the customer reviews at the bottom, 100% of the page was dynamically generated from a database. That is, the card for a given provider was the same template on every city page, with only the data, like speeds or availability percentage, switched out from city to city.
The metadata was templatized and dynamic too. The Los Angeles page's meta description was just "Compare high speed internet options in Los Angeles, CA See plans, pricing, and reviews of all internet service providers in your zip code!" The only difference between this meta description and that of any other city page was the city name being switched out. So although the descriptions were technically different, they weren't functionally different; Google probably considered them 30,000 duplicate meta descriptions.
First meaningful projects:
The first project that produced any growth at all was designed to increase CTR from SERPs. I had the thought that 30,000 duplicate meta descriptions probably weren't doing HSI any favors with Google, and probably weren't resonating with users either. I noticed they would sometimes show up in SERPs, but often not. So I ran the first meta description test.
Dynamic meta description test:
I worked with the front-end engineer and we figured out a way to make dynamic meta descriptions. Just as the provider cards pulled from the database to populate availability percentages and so on, we created a set of variables in the meta description. The template would find the city name and the top provider for each connection type in that city, then pull in the provider name, connection type, and percent availability. Here's the Los Angeles example: "Fiber: AT&T is 1% available in Los Angeles | DSL: AT&T is 80% available in Los Angeles | Cable: Charter is 1% available in Los Angeles | Satellite: Hughesnet is 99% available in Los Angeles".
Since the meta description was now pulling in different combinations of provider names and availability percentages, each of the 30,000 pages would be truly unique. And since the connection type was displayed, it would arguably be more attractive to users when shown in SERPs.
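The templating logic above can be sketched roughly like this (a minimal sketch; the data shape and function name are hypothetical, not HSI's actual schema or code):

```python
# Hypothetical per-city data: the top provider for each connection type.
# Values match the Los Angeles example; the structure is illustrative only.
city = "Los Angeles"
top_providers = {
    "Fiber": {"name": "AT&T", "availability": 1},
    "DSL": {"name": "AT&T", "availability": 80},
    "Cable": {"name": "Charter", "availability": 1},
    "Satellite": {"name": "Hughesnet", "availability": 99},
}

def build_meta_description(city, top_providers):
    """Join one '<Type>: <Provider> is <pct>% available in <city>' segment per connection type."""
    segments = [
        f"{conn}: {p['name']} is {p['availability']}% available in {city}"
        for conn, p in top_providers.items()
    ]
    return " | ".join(segments)

print(build_meta_description(city, top_providers))
# Prints: Fiber: AT&T is 1% available in Los Angeles | DSL: AT&T is 80% available
# in Los Angeles | Cable: Charter is 1% available in Los Angeles | Satellite:
# Hughesnet is 99% available in Los Angeles
```

Because the provider names and percentages vary by city, the same template yields a genuinely distinct string for each of the 30,000 pages.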
We launched dynamic meta descriptions in October 2016, and the change produced roughly 35% traffic growth, mostly from increased click-through rates but also from some non-zero ranking gains. Site traffic went from about 500–700 sessions a day to roughly 1,000, though the site still ranked between pages 2 and 3 for its target keywords.
I think the lesson here is that optimizing meta descriptions works! Sure, Google can replace them with whatever it wants, but a good enough description will display in SERPs. And if you do it at scale like we did on HSI, enough of them will show, and get positive engagement, to produce meaningful results. Next I'll talk about the Big Content Project that put HSI on the map.
