Konves Digital
SEO journal

Optimizing highspeedinternet.com Part 3

Let me continue the story of how I ranked Highspeedinternet.com (HSI). Obviously, as an SEO specialist, the best feeling in the world is coming to work every Monday, running your weekly report, and seeing that things are up. A lot. HSI was growing about 20% WoW as additional deep pages in the GEOset continued to be crawled and indexed, and as the top pages continued to rank better.

Rankings up and down but mostly up.

I keep saying “rank better,” but let’s break down the rankings profile: before the big content project, the best target keyword was “internet providers,” at about 135k searches a month, but the site really didn’t capture much of that volume. That keyword, along with “internet providers near me” and “internet providers in my area,” made up the bulk of the keywords that ranked below page 1 before the project and rose to more or less position 1 across the whole GEOset by the end of the year. There was even a week or two where the site ranked for “internet.” Just the one word, by itself.

This sheds light on another principle of how Google ranks things: when rankings move, they often move in fits and starts. Keywords might pop higher than they eventually settle, but the final position is usually higher than the initial one.

Anyway, as things grew every week, traffic kept going up, revenue kept rolling in, and it was increasingly easy to justify additional spend on resources. Something I neglected to mention at the start of the story is that there was new leadership on the HSI team, with Cory joining from another agency. He was instrumental in evangelizing my ideas to senior leadership and keeping the team motivated to continue building out optimizations, and he played a pivotal role in monetizing all the new traffic, which I’ll get into later.

Catastrophe.

Things hummed along great for a few months, and then disaster struck. Every SEO’s worst nightmare (well, I don’t know about every SEO, but at least mine) is popping into work after the weekend to find everything on fire. In my routine check of GSC one morning, I just happened to click the “Manual actions” section and noticed a penalty. I checked a few other sites, and a total of 7 sites had incurred a manual penalty. Overnight, about 50–90% of our traffic was gone, all from the GEOset that had been going so strong for those few months.

Penalty details.

The penalty was for “doorway pages,” a concept from back in the early 2000s or before. You might remember those sites that had a cover on their landing page with “enter site” and “exit” links before you hit the main site. Those types of pages provide no value to the user, but since everyone who enters the site has to click, they essentially double your engagement. Since user engagement is a ranking factor, it was a way for SEOs to boost rankings with little effort.

Like most SEO tactics that boost rankings without benefiting the user, Google soon cracked down on this and devalued it. In the case of HSI, the city pages were considered doorway pages because our site didn’t have pricing or a way to actually purchase internet, but required the user to click through to the actual ISP. The accusation was that because the user needed to click through to the ISP, the pages had no value, and since there were 30,000 pages with no value, that looked like egregious spam.

Rebuttal: Arguing with a Billion Dollar Company

Removing this penalty was top priority and a team effort. The plan was for me to write the reconsideration request, our SEO director would review it, and our senior copy editor would clean it up and make it more presentable. My main angle was that these pages weren’t doorway pages because they provided actual value. The value was that every city in the US has a more or less unique combination of internet providers. If you’re in Phoenix and you need internet, the Las Vegas page isn’t exactly what you need. Yes, there are ISPs that serve both Phoenix and Vegas, but some are unique to each. Thus, each page has value, and unique value, for users in its respective city.
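The core of that argument can be sketched in a few lines of code. This is a hypothetical illustration only: the provider names and city coverage below are invented, not HSI’s real data, but it shows the sense in which each city page carries information no other page has.

```python
# Hypothetical sketch: each city has a (mostly) unique combination of ISPs,
# so each city page carries unique information. All provider/coverage data
# here is made up for illustration.

CITY_PROVIDERS = {
    "Phoenix, AZ":   {"Cox", "CenturyLink", "T-Mobile Home Internet"},
    "Las Vegas, NV": {"Cox", "CenturyLink", "Rainbow Wireless"},  # fictional local ISP
    "Boise, ID":     {"Sparklight", "CenturyLink", "Ziply Fiber"},
}

def unique_to_city(city: str) -> set[str]:
    """Providers that serve this city but no other city in the dataset."""
    others = set().union(*(p for c, p in CITY_PROVIDERS.items() if c != city))
    return CITY_PROVIDERS[city] - others

for city in CITY_PROVIDERS:
    print(city, "->", sorted(unique_to_city(city)))
```

Even where two cities overlap heavily (Phoenix and Vegas both have Cox and CenturyLink here), each still has at least one provider the other lacks, which is the uniqueness the reconsideration request leaned on.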

Supporting points.

I also pointed out in the reconsideration request that these pages had maps of ISP availability, along with copy describing the map. Although every page had a map and a block of copy, the map and corresponding copy were unique to the city. When you travel somewhere, you can’t just read any map; it has to be a map of the place you’re going.

Reconsidered!

I submitted the reconsideration request and waited. It was the Friday before the July 4th weekend, and I was planning to take a few days off to backpack the Uintas. On July 5th I got a text from Cory that the penalty was lifted! We were saved! Interestingly, the other 6 sites that had gotten penalties also had their penalties lifted, even though I hadn’t mentioned them in my reconsideration request (I focused on HSI because the GEOset there was producing about 90% of the revenue for the site, and the other GEOsets weren’t as significant).

Why it worked.

In hindsight, the reconsideration request was effective because we had optimized at the correct level of granularity. The uniqueness I called out across our GEOset was true specifically at the city level. If we had used a zip-code-based GEOset, my rebuttal would have been ineffective, since it would have been more or less true that those were doorway pages. So I guess the lesson here is that GEOsets are a powerful thing: done at the right level of granularity they can produce huge gains and shrug off accusations of spam, but done at the wrong level they are at high risk of being penalized as doorway pages.
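To make the granularity risk concrete, here is a hypothetical sketch (with invented zip codes and provider lineups, not real coverage data) of why zip-level pages collapse into near-duplicates: adjacent zips usually share the exact same provider set, so most pages would say nothing unique.

```python
# Hypothetical sketch of the granularity risk: at the zip-code level, many
# zips share the exact same provider lineup, so per-zip pages would be
# near-duplicates of each other. All data here is invented for illustration.
from collections import defaultdict

ZIP_PROVIDERS = {
    "85001": frozenset({"Cox", "CenturyLink"}),
    "85002": frozenset({"Cox", "CenturyLink"}),
    "85003": frozenset({"Cox", "CenturyLink"}),
    "85004": frozenset({"Cox"}),
}

# Group would-be pages by their provider lineup.
pages_by_lineup = defaultdict(list)
for zip_code, lineup in ZIP_PROVIDERS.items():
    pages_by_lineup[lineup].append(zip_code)

duplicate_ratio = len(ZIP_PROVIDERS) / len(pages_by_lineup)
print(f"{len(ZIP_PROVIDERS)} pages, {len(pages_by_lineup)} unique lineups "
      f"({duplicate_ratio:.1f}x duplication)")
```

In this toy example, four zip-level pages reduce to only two distinct lineups, whereas the city-level grouping preserved a genuinely unique combination per page.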

In future posts I’ll get into everything else we did on the GEOpages to continue optimizing for traffic, and to better convert that traffic to revenue.

Michael Konves
