SEO is a semi-permanent marketing investment: after enough time and work, growth becomes largely organic and compounds on its own. As Domain Authority (DA) grows, sites begin to experience a form of economies of scale, from natural backlink growth to insulation against black-hat attacks.
What are Economies of Scale in SEO?
Left alone, sites grow very slowly over time. Domain age is a ranking factor in Google's algorithm; it is weakly weighted, but a factor nonetheless. Think of this as an inflation rate, a baseline for "natural growth." If a site puts in slightly more work and actively invests in SEO, its growth should exceed that base rate.
As sites get larger, primarily through gaining backlinks, they begin to see natural benefits that initiate the flywheel effect, discussed below. Once a site hits this growth point, it gains backlinks and traffic simply by existing, which is true organic growth.
Of course, this requires work to achieve. The "base rate" has to be beaten by doing valuable technical, off-page, and on-page SEO, in order to saturate your market and enter the flywheel.
The 7 Economies of Scale
1. Flywheel Effect
In SEO, the flywheel effect refers to the up-front work required to make backlink acquisition natural, a by-product of the site's position. In short: continued growth with minimal intervention, much like the business management interpretation.
The flywheel effect appears across many sectors and industries. There are references to it in physics and electrical engineering, but the most compatible description comes from business management, where it refers to:
… the effort that a manager must exert for a system to “begin working.” Once working, the system behaves and works by itself with minimal intervention.
2. Growth Continuation
After specialising in a single niche and developing a digital presence in that area, sites begin to grow. This happens through the pillars of SEO: content, backlinks, and technical optimisation. As sites grow, they develop higher DAs, more geographically diverse backlink profiles, deeper content silos, and many more pages that require indexing.
These factors position large sites perfectly to pivot into new markets for continued growth. In the growth cycle of a site, saturating a market leads to logarithmic penetration: more and more resources must be deployed to capture the final percentages. It is far more efficient for a site to pivot into a new market, where it can ride the exponential "Growth Phase," than to grind through the logarithmic "Maturity Phase."
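The diminishing returns of the "Maturity Phase" can be sketched with a logistic (S-shaped) penetration curve. The curve parameters and time units below are purely illustrative assumptions, not measured SEO data.

```python
import math

def logistic_penetration(t, capacity=1.0, rate=0.8, midpoint=6.0):
    """Illustrative S-curve: market penetration as a function of time/effort t."""
    return capacity / (1 + math.exp(-rate * (t - midpoint)))

# Marginal gain per extra unit of effort peaks near the midpoint,
# then collapses: the "final percentages" cost the most to capture.
for t in range(0, 13, 3):
    gain = logistic_penetration(t + 1) - logistic_penetration(t)
    print(f"t={t:2d}  penetration={logistic_penetration(t):.2f}  gain={gain:.3f}")
```

With these toy parameters, the same unit of effort that moves penetration by roughly 19 points around the midpoint moves it by well under one point near saturation, which is the argument for pivoting into a fresh market instead.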
As a result, a larger site spanning several niches will generally outperform, in traffic terms, a smaller site that owns 100% of a single medium-sized niche: 70% penetration of two small niches usually trumps 100% of one medium-sized niche. As an illustrative example, 70% of two niches worth 10,000 monthly searches each yields 14,000 visits, against 12,000 for full ownership of a 12,000-search niche. This is of course subject to the niches in question, but is generally the case.
Sites which see the best pivotal strength into new markets are:
- News and Media sites
- Content curation sites
- Avid bloggers
- Research hubs
3. SERP Presentation
As sites grow, primarily in DA, they also begin to look more and more "trustworthy" in SERPs. If two sites rank similarly, the one with the better rich result data and peripheral SERP features will win the click, purely from a user behaviour point of view. At SEOSPIDRE we call this SERP Presentation, and we calibrate smaller sites to target these specific result types:
Here are some examples of great SERP Presentation:
- Rich data extract
- How-To / FAQ rich results
4. Indexation Frequency
Bigger sites are indexed more frequently, and not because they can informally request it from Google. PageRank is a calculation of how likely, mathematically, a user is to end up on a specific site by clicking links; the more likely, the more "important" that site is deemed to be. This was the original backbone of web search, found in backlinks!
One of the key foundations of a search engine is its crawlers, which are sent out to index and re-index sites, keeping the index up to date. This improves "index freshness," which became a priority with the Caffeine update in 2010. Because crawlers primarily follow backlinks for discovery, when small sites link to big sites (a very common occurrence), the crawler eventually winds up on the big site again.
This can be modelled mathematically, and became the advanced version of PageRank. A side effect was that larger sites, which crawlers reached more frequently, were also indexed more frequently. This meant Google's index had very high freshness for large sites, and since freshness is a ranking factor, large sites consequently continued to rank better than slightly smaller sites with fewer backlinks.
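The random-surfer idea behind PageRank can be sketched with a few lines of power iteration. The link graph below is a toy assumption (one "big" site receiving links from three "small" ones), not real data.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank: rank approximates the probability that a random surfer,
    who follows links and occasionally teleports, lands on a given page."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        rank = {
            p: (1 - damping) / n
               + damping * sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            for p in pages
        }
    return rank

# Three small sites link to one big site; the big site links back to one of them.
links = {
    "big":     ["small_a"],
    "small_a": ["big"],
    "small_b": ["big"],
    "small_c": ["big"],
}
ranks = pagerank(links)  # "big" ends up with by far the highest rank
```

The surfer's path repeatedly funnels through the big site, which is exactly why crawlers that follow links resurface there most often and keep its index entry freshest.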
This also feeds the flywheel: a large site constantly receives backlinks from small, yet-to-be-discovered sites, so crawlers discover the small sites and, in the process, re-visit the big site.
5. Algorithm Resistance
A diversified portfolio is the most stable one. This axiom of banking is also an axiom of SEO: the market is highly volatile and requires constant adjustment. However, the bigger you are, the less you need to worry about a single algorithm adjustment sinking you in the SERPs.
Systems such as the new Google Fact Check allow sites to "score" other sites' factuality. For a small site, only a few poor ratings are needed to dramatically impact its factuality rating and reduce the perceived value of its content. Larger sites benefit from a larger reliable : unreliable ratio, so isolated attempts to devalue their content are far less likely to succeed.
One update that was genuinely dangerous to large sites was the Google Diversity update, which made it progressively harder for a single domain to hold more than one listing per SERP: each additional link is far less likely to rank than the one before it.
6. Backlink Insulation
Big sites have bigger backlink profiles! Not only that, they tend to be far more diverse in geography, type, and anchor text. This insulates larger sites against toxic backlink attacks that would otherwise swing their SERP performance dramatically. Large sites are still impacted by toxic links, but the healthy : toxic ratio shifts far less than it would in an attack on a smaller site. It also gives larger sites more time to identify and disavow toxic backlinks before any real impact is felt.
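The ratio argument is simple arithmetic. The link counts below are illustrative assumptions, not benchmarks.

```python
def toxic_share(healthy, toxic):
    """Fraction of a backlink profile made up of attack links."""
    return toxic / (healthy + toxic)

# The same 200-link toxic attack lands very differently on the two profiles:
small_site = toxic_share(healthy=500, toxic=200)     # ~29% of the profile
large_site = toxic_share(healthy=50_000, toxic=200)  # ~0.4% of the profile
```

An attack that poisons nearly a third of a small site's profile is a rounding error for the large one, leaving ample time to spot and disavow the links.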
Black-hat SEO has been a real threat since the beginning of webmaster autonomy. Once individuals realised that competing sites could be taken down rather than their own improved, it became not only a service but a reasonably common practice.
Therefore, for small sites, it's critical that you stay on top of your backlink health, traffic, and site security.
7. Available Data
Sites with higher DAs benefit from having far more data with which to optimise user experience. Google is moving steadily towards user experience as a basis for SERP rankings, increasing the weight of factors such as time on site, sessions per user, and direct recurring traffic.
With more data on how users behave on-site, via click funnels and data imported from tools such as Hotjar, big sites can optimise their pages and run A/B tests with much larger sample sizes.
Smaller clients have to make do with smaller groups for A/B testing, which increases the probability of skew and bias within the sample.
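The sample-size point follows from basic statistics: the standard error of an observed conversion rate shrinks with the square root of the number of sessions. The rates and session counts below are illustrative assumptions.

```python
import math

def conversion_standard_error(p, n):
    """Standard error of an observed conversion rate p measured over n sessions."""
    return math.sqrt(p * (1 - p) / n)

# Same true 5% conversion rate, very different measurement noise:
small_site = conversion_standard_error(0.05, 400)     # ~1.1 percentage points
large_site = conversion_standard_error(0.05, 40_000)  # ~0.11 percentage points
```

A 100x larger sample only cuts the noise by 10x, but that is enough for the big site to detect uplifts a small site's test could never distinguish from chance.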
Bigger sites have an additional advantage in keyword rankings: a site ranking for 1,000+ keywords can regularly measure the growth and decline of rankings across those keywords.
On-page tests are also more fruitful, as they will likely impact a more diverse range of keyword rankings. For example, a single change may shift rankings across five or six keywords. This not only signals the relationship between those keywords and the page in question, it also gives a much clearer picture of what works and what doesn't than a movement in a single keyword would.