
Content Optimization – Why You Need to Go Beyond the First Page


A particular segment of content marketers believes there’s no value in looking beyond the first 10, 20, or 30 results in the SERP. From a content creation perspective, this could not be further from the truth. Subscribing to this belief jeopardizes the long-term success of your content marketing efforts. 

You’re not writing for information gain. You’re just writing me-too content.

In a way, it’s not surprising this myth exists. In an often-quoted 2009 study on human-computer interaction, researchers found that 91% of the searchers they tested didn’t go beyond the first page of results. Later, this Moz study found that “on average, 71.33% of searches result in a page one organic click.”

Fast-forward a decade, and little has changed. Most searchers rarely go past the first page or two of Google’s results. But there’s a giant leap between someone who consumes information and someone who creates information to be consumed. It’s a distinction that the “don’t bother looking past the first page of Google” promoters conveniently ignore.

Just because searchers limit themselves to a few pages doesn’t mean content creators should do the same with their research.

Let me put it to you this way. Those who believe that it’s not worth looking past the first one or even ten pages of the search results are saying that:

  • The sum knowledge of a topic is contained within those limited results.
  • There’s nothing to be gained by examining any more pages about that topic.
  • All the expertise is contained within that small set of URLs.

When Google’s Search Liaison talks about “offering the best content”, he doesn’t mean copying what the top 10 results do.

Read what Google has to say about content quality and expertise. It says nothing about copying the top N results in search.

Those top 10, 20, or 100 search results form the corpus, or collection of documents. For all intents and purposes, that limited set is treated as the embodiment of all the expertise on the topic. But it’s not enough.

We’ve written before about topic modeling for SEO. The problem with building a topic model using such a small subset of data is that it will never have the depth and richness of one created using a far larger corpus.
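To make that concrete, here’s a minimal sketch of the idea using a plain LDA topic model from scikit-learn. The toy documents, parameters, and the fit_topic_model helper are illustrative assumptions, not MarketMuse’s actual method; the point is simply that a model fit on only a handful of pages never sees many of the terms, and therefore the subtopics, that a larger corpus surfaces.

```python
# Minimal sketch (not MarketMuse's pipeline): fit a plain LDA topic model
# on a small corpus and on a larger one, then compare vocabulary coverage.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation


def fit_topic_model(documents, n_topics=3):
    """Fit an LDA model and return it along with the vocabulary it learned."""
    vectorizer = CountVectorizer(stop_words="english")
    doc_term_matrix = vectorizer.fit_transform(documents)
    lda = LatentDirichletAllocation(n_components=n_topics, random_state=0)
    lda.fit(doc_term_matrix)
    return lda, set(vectorizer.get_feature_names_out())


# Toy corpora standing in for real pages; in practice the larger corpus
# would be thousands of documents, not a few extra strings.
small_corpus = [
    "how to brew espresso at home",
    "best home espresso machines reviewed",
    "espresso vs drip coffee compared",
]
large_corpus = small_corpus + [
    "grind size, dose, and tamping pressure explained",
    "water temperature and pressure profiling for extraction",
    "portafilter baskets, puck prep, and channeling problems",
    "milk steaming, microfoam, and latte art basics",
]

_, small_vocab = fit_topic_model(small_corpus)
_, large_vocab = fit_topic_model(large_corpus)

# Terms (and therefore subtopics) the small sample never surfaces at all.
print(sorted(large_vocab - small_vocab))
```

Swap in real page text and a realistic number of topics and the gap only widens. That missing vocabulary is exactly the information gain a top-10-only analysis can’t deliver.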

While I understand everyone has their own opinion on whether looking past the top 10 pages is worthwhile, data science says otherwise. When it comes to a corpus, size matters, according to this study, that study, and many more.


Oxford Languages defines comprehensive as “complete; including all or nearly all elements or aspects of something.” Relying on a limited sample of the top few pages in the SERP, when Google’s index contains billions of entries, is not the way to create a comprehensive topic model.
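As a back-of-the-envelope illustration (a synthetic simulation, not data from the studies cited above), assume pages mention a topic’s terms with a heavily skewed, Zipf-like frequency distribution. Under that assumption, a ten-page sample predictably misses a large share of the rarer aspects that a thousand-page corpus covers:

```python
# Toy simulation of why corpus size matters: term mentions follow a skewed,
# Zipf-like distribution, so small samples miss the long tail of a topic.
import random

random.seed(0)
TOPIC_TERMS = [f"term_{i}" for i in range(2000)]                    # full topic vocabulary
WEIGHTS = [1.0 / rank for rank in range(1, len(TOPIC_TERMS) + 1)]   # Zipf-like skew


def simulated_page(mentions_per_page=300):
    """One synthetic page: a frequency-weighted sample of the topic's terms."""
    return set(random.choices(TOPIC_TERMS, weights=WEIGHTS, k=mentions_per_page))


pages = [simulated_page() for _ in range(1000)]
for sample_size in (10, 100, 1000):
    covered = set().union(*pages[:sample_size])
    print(f"{sample_size:>4} pages cover {len(covered):>4} of {len(TOPIC_TERMS)} terms")
```

The exact numbers depend on the assumed distribution, but the shape of the result doesn’t: coverage keeps climbing well past the first handful of pages.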

MarketMuse is the only software platform that analyzes thousands of documents to create a model for any given topic. Only after that topic model is created do we compare it against the top 20 results in Google.

We invest heavily in building our own patented systems and methods for semantic keyword analysis, along with the infrastructure they require.

Our competitors, on the other hand, don’t. Thus they perpetuate the myth that you only need to research a handful of results to create expert-level content. 

I think the reason they take this approach comes down to economics. Many rely on third-party data accessed through an API, which makes any extensive analysis cost-prohibitive. Those who build their own systems keep the approach simple, lacking the resources for anything more sophisticated.

Think about this for a minute. Would you consider a writer an expert just because they read the top 20 articles on a particular topic? 

I hope not.

Stephen leads the content strategy blog for MarketMuse, an AI-powered Content Intelligence and Strategy Platform. You can connect with him on social or his personal blog.
