Google made it harder for external models to access the web's depth.

KitaYama
This may be of interest to the site owner.

You can no longer view 100 search results at once. The maximum per page is now just 10.
Google has quietly changed how search results are delivered, cutting the maximum results per page from 100 to 10, a move that could reshape SEO, AI training, and digital visibility. By cutting off access to results ranked 11–100, Google has limited what both users and AI models can "see."
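For anyone who pulls results programmatically, here is a minimal sketch of the practical effect (assuming Google's standard `q`, `num`, and `start` search URL parameters; the now-ignored `num` parameter is exactly what this change removed): reaching position 100 used to take one request and now takes ten paginated ones.

```python
import urllib.parse

BASE = "https://www.google.com/search"

def serp_urls_before(query: str) -> list[str]:
    # Before the change: a single request could ask for up to
    # 100 results via the num parameter (now ignored by Google).
    return [f"{BASE}?{urllib.parse.urlencode({'q': query, 'num': 100})}"]

def serp_urls_after(query: str, depth: int = 100) -> list[str]:
    # After the change: each page caps out at 10 results, so seeing
    # positions 11-100 means paginating with the start parameter.
    return [
        f"{BASE}?{urllib.parse.urlencode({'q': query, 'start': offset})}"
        for offset in range(0, depth, 10)
    ]

print(len(serp_urls_before("example query")))  # 1 request for the top 100
print(len(serp_urls_after("example query")))   # 10 requests for the same depth
```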

The impact was immediate.
According to Search Engine Land, 88% of sites saw a drop in impressions.
Reddit, which often ranks in positions 11–100, was hit especially hard, with noticeable declines in visibility and engagement: its LLM citations plummeted, and its stock dropped 15%.
This isn't just SEO fallout. It's an AI supply chain issue.

Most large language model providers, such as OpenAI, Anthropic, and Perplexity, rely (directly or indirectly) on Google's indexed results along with their own crawlers.
By cutting off the "long tail" of results, Google just reduced what they can see by 90%.

The internet's "training data" just got shallower.

For AI startups and developers, this change goes far beyond SEO. It affects how data is gathered, how algorithms learn, and who gets discovered online. The open web just got narrower, and those relying solely on search now face a tougher battle for attention.

 
First, thank you, @KitaYama, for a very timely post about a major change that will certainly affect a lot of things that we do on a regular basis.

Second, I wonder - and this is ONLY a guess - whether Google initiated this change after some of the successful lawsuits over how AI companies used "pirated" sources to train their LLMs (and whatever else they use). Many suits have been filed complaining that the AI companies exceed the boundaries of the "fair use" exception for reproducing or accessing copyrighted content.

Third, another speculation... I wonder if Google thinks that returning too many items in a search leads to their AIs accepting irrelevant content and thus providing responses commonly called "hallucinations." We have to remember that LLMs return things deemed statistically likely to relate to a given question. If you bring in too many bad answers and don't challenge them, the AI will not know to discard whatever linked the bad answer to the question.
 
I think these two comments from Instagram tell the whole story:
  • In other words, Google made it almost impossible for smaller sites to be seen without paying for ads.
  • This gives Google an edge over all other AI competitor companies… It's a clever commercial move by them, as they ringfence the data for their own AI.
 
