A Comparison Between SEO in the Mid-2000s and 2018

Back in the mid-2000s, SEO was a rather mechanical discipline - far more science than art. Google's algorithm was a lot cruder back then, and people took advantage of this. I remember very well that there was a huge gap between what Google said in their guidelines and how they actually ranked sites. This gap confused the average online marketer. They set out with the best of intentions, carefully reading Google's webmaster guidelines and building a site around Google's list of best practices. And then, weeks later, their content-rich sites were still struggling in the lower ranks of the SERPs. And who was on page one? Sites with thin content, lots of keyword stuffing and near-duplicate content from one page to another (often with just a town name as the only difference between any two given pages). Clearly, the shadier practices were being rewarded, despite the wholesome content Google claimed to value over the crude metrics that gave many sites a false sense of trust.

Fast forward to 2018, and we can see that Google have much more closely aligned their algorithm with their guidelines. The algorithm has smartened up and become street-wise. It's a lot more distrustful of sites now, and you have to work a lot harder to win that trust. Inbound anchor text is judged in a much more nuanced way. Google can match on-site content with searched-for keywords far more intelligently, so there's less need to mechanically build links containing specific phrases. Google also looks at user metrics to see how much time people spend on a site. Not that a long time is always good - whether a visit ought to be short or long can depend on the search term.

Penguin has been rolled into the algorithm for a few years now, so it's constantly evolving rather than being manually updated.

So what does this mean for the digital marketer in 2018? I think it's actually good news. To be honest, most whitehats in the mid-2000s were strong-armed into performing blackhat techniques because they worked so well... and results are everything. Then Google pulled a real bait and switch when Panda and Penguin came out, leaving a lot of would-be whitehats with egg on their faces as they explained to their clients that Google's updates had penalised their sites. This would have been around 2011 and 2012. Since then, I've seen whitehats increasingly rewarded for whitehat activity, as the algorithm has finally become able to recognise it. That means really focusing on content, while judiciously building links from trusted websites with a strict editorial process. Furthermore, these links are a lot more natural - author citations, for example.

Having said that, it's not all good news. We've seen Google expand its adspace on the SERPs - it's now not unusual to see organic results only below the fold. To the average searcher, that means there are NO organic results! Most searchers don't even scroll - they search and click on whatever they see above the fold. We've seen Google Maps become monetised, and a more aggressive monetisation model on YouTube. All of this shows how much pressure Google are under to turn a buck. The concern is that Google may eventually become a de facto "pay to play" platform, where even if you rank number one for a particular keyphrase, you'll only win a tiny fraction of the traffic compared to Google Ads.

To end on a positive note, the rewarding of whitehat practices has helped webmasters become better publishers. We ultimately write content for human beings, and they are the ones holding the purse strings. Quality content improves conversion rates, and that is the ultimate metric.

Article kindly provided by netcentrics.co.uk
