Google’s recent removal of the &num=100 URL parameter is reshaping how SEO data is collected, analyzed, and reported. According to early analyses, 87.7% of sites have seen a drop in impressions in Google Search Console since the change.
While this may seem like a minor technical update, its impact on rank tracking, data accuracy, and SEO reporting is significant. Understanding what changed, who it affects, and how to adapt is essential for maintaining reliable insights into your website’s performance.
What Is &num=100 and What Changed Recently
The &num=100 parameter was a long-standing, though unofficial, way to tell Google to display up to 100 search results on a single page instead of the default 10. Many SEO tools and analysts relied on it to extract comprehensive SERP (Search Engine Results Page) data efficiently.
By appending &num=100 to a search URL, tools could retrieve 100 organic listings at once, reducing requests and improving tracking efficiency. Although Google never documented this as a supported feature, it worked consistently for years and became a standard practice in SEO data collection.
In September 2025, Google quietly removed support for this parameter. Searches that include &num=100 now return the standard 10 results per page. This means that any process, scraper, or rank-tracking tool relying on it must now issue multiple paginated requests (page 1, page 2, page 3, and so on) to retrieve equivalent data.
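To make the mechanics concrete, here is a minimal sketch of the difference: one &num=100 URL versus the ten paginated URLs (stepping through the start parameter) now needed for the same coverage. The query string is illustrative only; real collection pipelines add headers, result parsing, and rate handling.

```python
from urllib.parse import urlencode

QUERY = "rank tracking tools"

# Before: a single request could return up to 100 organic results.
old_url = "https://www.google.com/search?" + urlencode({"q": QUERY, "num": 100})

# After: the same coverage requires ten paginated requests,
# stepping through the results with the start parameter.
new_urls = [
    "https://www.google.com/search?" + urlencode({"q": QUERY, "start": offset})
    for offset in range(0, 100, 10)  # start=0, 10, 20, ... 90
]

print(old_url)
for url in new_urls:
    print(url)
```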
The change appears to be permanent, and Google has given no indication it will be reversed.
Who Does This Affect
Rank Trackers and SEO Tools
The most immediate impact is on rank-tracking tools and SERP monitoring software such as Semrush, Ahrefs, AccuRanker, and Moz. These platforms previously used the &num=100 parameter to capture up to 100 search results per keyword query efficiently. With its removal, tools now require 10 separate requests to collect the same data that was previously fetched in a single query.
As a result, operational costs have increased: more API calls, higher bandwidth use, and longer processing times. To manage this, many platforms are already reducing tracking depth, offering only the top 20 or even top 10 positions instead of the full top 100.
For SEO professionals and agencies, this means reduced visibility into lower-ranking keywords, limited competitor tracking, and potential gaps in historical rank data. Positions that once appeared in deeper ranges (for example, 50–100) may now go unrecorded, affecting long-tail keyword monitoring and historical comparisons across reporting periods.
Reporting and Data Availability
This change directly affects reporting accuracy. Instead of seeing ranking coverage across the top 100 results, most dashboards will now represent a smaller window of keyword visibility.
As a result:
- SEO performance reports will show less granular ranking data.
- Long-tail keyword visibility may appear to decline simply due to reduced tracking depth.
- Comparisons between new data and historical datasets collected with &num=100 will no longer be like-for-like.
Google Search Console Data
Industry analysts, including Google’s Gary Illyes and SEO expert Glenn Gabe, have suggested that large-scale scrapers using &num=100 may have distorted Search Console metrics for years. The removal of this parameter could therefore make GSC data more accurate.
Early observations show:
- Impressions dropped for 87.7% of sites.
- 77.6% of sites lost unique ranking terms.
- Short-tail and mid-tail keywords experienced the largest visibility declines.
Previously, many impressions in Search Console were not real user interactions but automated scraper requests. With this cleanup, reported impressions now reflect genuine search exposure rather than artificially inflated activity.
Impact on Clicks and CTR
Clicks appear less affected by this change. However, impression counts have declined sharply, leading to higher average click-through rates (CTR) in Search Console.
The dynamic echoes what happened after AI Overviews and generative answers were introduced, when the relationship between impressions and clicks also shifted: impressions rose as more queries surfaced AI-driven results, while clicks declined, pushing CTR down.
Now, with &num=100 removed, the shift runs the other way: impressions have dropped because artificial scraping impressions have been filtered out, while clicks hold steady, so CTR rises.
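A quick illustration with hypothetical numbers (the figures below are invented for the example, not measured data):

```python
# Hypothetical figures: clicks hold steady while scraper-inflated
# impressions are filtered out of Search Console.
clicks = 500
impressions_before = 50_000  # includes automated scraper requests
impressions_after = 30_000   # closer to genuine user exposure

ctr_before = clicks / impressions_before * 100  # 1.0%
ctr_after = clicks / impressions_after * 100    # ~1.7%

print(f"CTR: {ctr_before:.1f}% -> {ctr_after:.1f}% with no change in clicks")
```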
One Positive Outcome
Despite the disruption, the net effect is cleaner measurement: impressions, CTR, and ranking data in Search Console now describe genuine user exposure rather than scraper activity, giving teams a more trustworthy foundation for analysis.
How SEO Professionals Should Adapt
The removal of &num=100 changes how SEO professionals measure performance and interpret metrics. It also increases the value of thoughtful analysis over automated data collection.
1. Reassess Your Data Baseline
Treat September 2025 as a new baseline. Any comparison between pre- and post-change metrics should be interpreted cautiously.
Impression declines or ranking drops may not indicate loss of real visibility. Instead, they likely reflect the removal of non-human impressions.
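One way to operationalize the new baseline is sketched below, under assumed names: a daily Search Console export in gsc_daily.csv with date and impressions columns (the file name, column names, and cutover date are placeholders for your own data).

```python
import pandas as pd

# Approximate date the &num=100 change landed; adjust to where your data shifts.
CUTOVER = pd.Timestamp("2025-09-15")

df = pd.read_csv("gsc_daily.csv", parse_dates=["date"])

pre = df[df["date"] < CUTOVER]    # old methodology: scraper-inflated impressions
post = df[df["date"] >= CUTOVER]  # new baseline for all future comparisons

# Report the two eras separately rather than as one continuous trend.
print("Pre-change mean daily impressions: ", round(pre["impressions"].mean()))
print("Post-change mean daily impressions:", round(post["impressions"].mean()))
```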
2. Adjust Rank Tracking Processes
- Update any rank-tracking scripts or APIs to use paginated queries (start=0, start=10, and so on) instead of &num=100 (see the sketch after this list).
- Expect slower data collection and possible rate-limit constraints.
- Focus tracking on top-performing keywords where ranking fluctuations have meaningful business impact.
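As a minimal sketch of what that adjustment can look like, the code below paginates with the start parameter and tracks priority keywords deeper than the rest. The fetch callable, keyword tiers, depths, and delay are placeholders for whatever client and thresholds your stack already uses:

```python
import time
from urllib.parse import urlencode

PRIORITY_DEPTH = 100  # high-value keywords: full top 100
DEFAULT_DEPTH = 20    # everything else: top 20 is usually enough

def serp_urls(query: str, depth: int):
    """Yield paginated search URLs (10 results per page) down to `depth`."""
    for offset in range(0, depth, 10):
        yield "https://www.google.com/search?" + urlencode(
            {"q": query, "start": offset}
        )

def track(keywords: dict, fetch, delay: float = 2.0):
    """Fetch SERP pages per keyword; `fetch` stands in for your HTTP client.

    `keywords` maps each query string to True if it is a priority term.
    """
    results = {}
    for query, is_priority in keywords.items():
        depth = PRIORITY_DEPTH if is_priority else DEFAULT_DEPTH
        pages = []
        for url in serp_urls(query, depth):
            pages.append(fetch(url))
            time.sleep(delay)  # pace requests to stay under rate limits
        results[query] = pages
    return results
```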
3. Focus on Click and Conversion Metrics
With full ranking depth now harder to collect, clicks, conversions, and engagement should be prioritized as the key indicators of SEO performance.
This approach aligns SEO reporting more closely with business outcomes, making it easier to demonstrate ROI to stakeholders.
4. Communicate the Change Clearly
Agencies and consultants should inform clients about this shift. The decline in impressions is not a penalty or ranking loss—it’s a data correction.
Clear communication helps maintain trust and prevents unnecessary concern over metrics that appear negative but are actually more accurate.
5. Reevaluate Keyword Strategy
Because deeper ranking positions (beyond page 2) are harder to monitor, keyword targeting should focus on achievable, high-intent queries.
Optimizing existing pages for stronger topical relevance, user engagement, and content depth will become more valuable than tracking hundreds of marginal terms.
6. Embrace Data Quality
This update effectively filters out noise. Moving forward, SEO analysis will rely more on clean, verified data sources rather than scraped metrics.
Professionals who adapt quickly will deliver insights that are more consistent, credible, and defensible.
What It Means for Your Brand
For brands, the practical implications are twofold:
- Reporting Changes — Expect to see fewer impressions and ranking terms in dashboards and reports. This is a data recalibration, not a performance loss.
- SEO Team Value Increases — As automated rank data becomes less accessible, strategic interpretation and technical expertise become more important. SEO professionals who understand data context and accuracy will have greater value than before.
Brands should encourage teams to focus less on vanity metrics and more on actionable insights that connect SEO to revenue, engagement, and visibility goals.
Why Google Made This Change
Google has not issued an official explanation. However, several likely motivations exist:
- Preventing Large-Scale Scraping: The &num=100 parameter made it easy for bots and automated tools to extract large datasets quickly. Disabling it reduces data scraping and server load.
- Protecting Search Integrity: Limiting bulk data collection helps Google preserve result accuracy and consistency across devices and regions.
- Responding to AI Data Harvesting: With the rise of AI platforms such as ChatGPT, Perplexity, and other search-integrated tools, Google may be taking steps to restrict automated mass data gathering from its SERPs.
While the change creates short-term disruption for SEOs, it aligns with Google’s broader direction: protecting proprietary data and ensuring search metrics better represent real user behavior.
Conclusion
The removal of the &num=100 parameter marks the end of a long-standing shortcut in SEO data collection. It introduces challenges for rank tracking and performance analysis, but it also improves data reliability.
SEO professionals should view this not as a setback, but as a correction that elevates the quality of available data. By focusing on accurate metrics, actionable insights, and meaningful performance indicators, teams can adapt smoothly and continue to drive measurable growth.
In an era of AI-driven search and evolving metrics, precision and adaptability define effective SEO.