How have the tides changed, and what were the impacts?
SEO: Dependent on Search Engines
SEO is completely dependent on Search Engines like Google, Bing, and Yahoo. It’s a shame the ranking algorithms can’t simply be handed to us, because then we’d all be getting great results! But that’s part of the charm: testing and figuring out what works in beating a Search Engine is time-consuming and extremely resource-intensive.
So here’s what happened. Google has been evolving its Search Engine through a series of mostly animal-named updates. It all started with Boston, which through a series of basic web-development transformations became Panda with the Caffeine add-on, which became Penguin with tons of add-ons, which became Hummingbird with the RankBrain add-on as the current working model.
The Search Engine has moved from a basic hard-coded approach, to SERP generation, to giving power to the users, to cracking down on manipulation and SEO tactics, to finally being able to learn from the internet.
As a result of these extensive changes, SEO has had to adapt. SEO, while nobody likes to admit it, is in effect a technique for beating Search Engines. Of course the Search Engine owners of the world dislike this, so they come out with ever more advanced methods of mitigation until, in theory, there is no SEO left to do.
Boston. The first major update, which turned Google from a flimsy Search Engine into a very powerful informational powerhouse. Boston resulted in the compilation and “ranking” of the websites in its index for display to the user.
While primitive, this was a huge step up from spewing a load of potentially unrelated text at a user making a search. It introduced the Search Engine Results Page, which actively tried to find the best answer to a query.
Everything was still hard coded at this point, and the system was very primitive, relying on roughly 20 factors to determine the “rank” of a website.
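To make that concrete, a hard-coded ranking of that era can be pictured as a fixed weighted sum over a handful of page factors. The factor names and weights below are invented for illustration; Google never published Boston’s actual list:

```python
# Toy hard-coded ranking model in the spirit of early Search Engines.
# Every factor name and weight here is invented for illustration only.

WEIGHTS = {
    "keyword_in_title": 3.0,   # query term appears in the <title>
    "keyword_density": 1.5,    # occurrences per 100 words
    "inbound_links": 2.0,      # count of known links to the page
    "page_age_days": 0.01,     # older pages were trusted slightly more
}

def rank_score(page: dict) -> float:
    """Combine the fixed factor weights into a single rank score."""
    return sum(WEIGHTS[f] * page.get(f, 0) for f in WEIGHTS)

pages = [
    {"url": "a.example", "keyword_in_title": 1, "keyword_density": 2.0,
     "inbound_links": 5, "page_age_days": 300},
    {"url": "b.example", "keyword_in_title": 0, "keyword_density": 4.0,
     "inbound_links": 1, "page_age_days": 30},
]

# A SERP is then just the candidate pages sorted by descending score.
serp = sorted(pages, key=rank_score, reverse=True)
```

With everything hard coded like this, beating the engine only required reverse-engineering a short, static list of weights, which is exactly what early SEO did.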
Rather than a core algorithm development, Caffeine lived and operated as an overhaul of the indexing and crawling strategy. Caffeine, according to Google, resulted in a 50% fresher index, meaning extra weight was placed on content recency rather than content value and back-links.
Caffeine was the beginning of content creation and marketing at scale, as everybody wanted to mass-produce low-quality articles rather than time-consuming quality ones. Recency prevailed, and while it wasn’t perfect it certainly produced a significant increase in the total information present on the internet.
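One way to picture that recency emphasis is a freshness score that decays with document age. The half-life value here is an invented illustrative parameter, not a published Google number:

```python
# Toy freshness score in the spirit of Caffeine's recency emphasis:
# newer documents score higher, and the score halves every
# `half_life_days`. The 30-day half-life is an invented parameter.

def freshness_score(age_days: float, half_life_days: float = 30.0) -> float:
    """Exponential decay: the score halves every half_life_days."""
    return 0.5 ** (age_days / half_life_days)

# A day-old article far outscores a year-old one, which is why
# mass-producing new low-quality articles suddenly paid off.
day_old = freshness_score(1)
year_old = freshness_score(365)
```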
Panda was the first major core algorithm overhaul since 2002, and with it came dramatic changes. Panda 4.0 is still in use today in some nations and as a proving ground due to its properties as a Search Sorting Algorithm (SSA).
Panda was launched with the intent of crushing websites built on black-hat SEO and poor-quality content. Among the worst-affected websites were those with high ad-to-content ratios, keyword stuffing, white-text-on-white-background tricks, and general bad practice.
12% of all search results (an enormous number of websites) were affected by this change, for better and for worse: sites with good-quality content written long ago were suddenly being rewarded and seeing SERP growth.
Panda brought the first “law and order” into the Search Engine realm by instilling fear in black-hat SEO practitioners and those who had been considering it. Penalisations were spread far and wide, with some websites being black-marked, a feature unique to the algorithm. Black-marking has been carried over to every major overhaul since, and still exists today as a way for Google to effectively kill off a site’s organic traffic.
Penguin was the second wave of order. Penguin adjusted 14.5% of search results and negatively impacted 3.1% of all queries. It arrived as a more intrusive and intelligent version of Panda, while Panda remained the hierarchical leader.
Penguin became the finalising force in creating “quality websites”, targeting newly developing black-hat techniques such as back-link spamming and cross-page information duplication. However, Penguin had an unexpected consequence: black-hat SEO began being used to sink competitor websites. This required immediate intervention, yet action did not come for another four months.
In that period, Penguin and Panda both failed to prevent black hatters from tanking SERP leaders by fraudulently creating websites and associating them with the leading sites. Even today this is a common black-hat strategy that results in immediate black-marking, but back then it took much more time to sift through the malicious material.
After Panda and Penguin, Google’s engineers realised they needed a flexible system into which they could plug things and pull them out, like sockets in a wall. Like Linux, but for Search Engines. Hummingbird was the result. They took Panda and Penguin, combined them, and increased the flexibility of the system so that new additions could be made easily.
Hummingbird has been compared to Caffeine because, while it is a core algorithm update combining Panda and Penguin, it doesn’t actually add much itself. Hummingbird was really more of a gateway for the changes to come, such as RankBrain. It is the current algorithm model used globally.
RankBrain is the next generation of Search Engines. RankBrain is an AI component that enables Hummingbird to learn. Effectively, it mitigates “SEO” strategies and learns how best to match responses to queries so that every search is satisfied.
Since its implementation it has been dubbed the most influential ranking factor in Google’s history, followed by Panda. RankBrain has caused fluctuations, depreciations, and appreciations across the roughly 200 factors that go into the Hummingbird code. As it learns, elements are added, removed, and changed, while SEO strategies adapt to this constantly shifting environment.
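As a sketch of the idea (not Google’s actual mechanism, whose internals are unpublished), a learning layer can be pictured as nudging factor weights up or down depending on whether searchers were satisfied:

```python
# Toy sketch of a learning layer reweighting ranking factors from
# user feedback. The factor names, update rule, and learning rate are
# all invented; RankBrain's real internals have never been published.

weights = {"content_quality": 1.0, "backlinks": 1.0, "freshness": 1.0}

def update(weights: dict, factor: str, satisfied: bool, lr: float = 0.1) -> dict:
    """Nudge a factor's weight up when searchers were satisfied,
    down when they were not (e.g. quick bounces back to the SERP)."""
    weights[factor] += lr if satisfied else -lr
    return weights

# Simulated feedback: back-link-heavy results keep disappointing users,
# so that factor's weight steadily depreciates.
for _ in range(5):
    update(weights, "backlinks", satisfied=False)
```

This is why no single list of “the 200 factors” stays accurate for long: the weights themselves drift as the system learns.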
Technical SEO is among the few disciplines whose results remain stable, as both back-links and content creation have been heavily devalued. As a result, TSEO has taken precedence in modern SEO because it works with the operations Search Engines can’t yet function without: crawling, parsing, processing, and indexing.
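A minimal sketch of the crawl-parse-index pipeline those operations form, using only the Python standard library; the HTML sample and whitespace tokeniser are simplifications, not how any real Search Engine implements these stages:

```python
# Minimal crawl -> parse -> index sketch of the pipeline Technical SEO
# optimises for. The sample page and tokeniser are illustrative only.
from collections import defaultdict
from html.parser import HTMLParser

class PageParser(HTMLParser):
    """Extract outgoing links and visible text from one fetched page."""
    def __init__(self):
        super().__init__()
        self.links, self.text = [], []

    def handle_starttag(self, tag, attrs):
        if tag == "a":  # collect hrefs so the crawl frontier can grow
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        self.text.append(data)

def index_page(url: str, html: str, inverted_index: dict) -> list:
    """Parse a page, add its terms to an inverted index, return its links."""
    parser = PageParser()
    parser.feed(html)
    for term in " ".join(parser.text).lower().split():
        inverted_index[term].add(url)
    return parser.links  # next URLs for the crawler to visit

index = defaultdict(set)
html = '<html><body><h1>Technical SEO</h1><a href="/guide">guide</a></body></html>'
frontier = index_page("example.com", html, index)
```

A page that blocks crawling, serves broken markup, or hides its text from the parser simply never makes it into the index, which is why these factors stay influential regardless of how the ranking layer above them changes.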