Key Takeaways
Google disabled the &num=100 URL parameter in mid-September 2025, eliminating the ability to view 100 search results on a single page. This seemingly small technical change has triggered dramatic shifts in Google Search Console data: 87.7% of websites are experiencing sudden impression drops, 77% of sites lost keyword visibility, and rank tracking tools are facing a roughly 10x cost increase. However, your actual search rankings haven’t changed – only how they’re measured.
The change removes bot-driven impressions from SEO tools, resulting in cleaner, more accurate data that better reflects real human search behaviour. Website owners should expect lower impression counts but improved average positions, whilst marketers need to prepare for potential increases in SEO tool subscription costs.
What Is the &num=100 Parameter and Why Did Google Remove It?
For many years, you could add &num=100 to the end of a Google search URL to force Google to show 100 results on one page instead of the standard 10.
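The old trick can be sketched in a few lines. This is an illustrative snippet only – it just constructs the URL string (it doesn’t scrape Google), and as the article explains, Google no longer honours `num` values above the default:

```python
from urllib.parse import urlencode

def build_search_url(query: str, num: int = 100) -> str:
    """Build a Google search URL requesting `num` results per page.

    Note: since mid-September 2025, Google ignores num values
    above the default 10, so this no longer returns 100 results.
    """
    base = "https://www.google.com/search"
    return f"{base}?{urlencode({'q': query, 'num': num})}"

print(build_search_url("rank tracking tools"))
# https://www.google.com/search?q=rank+tracking+tools&num=100
```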
This URL parameter became an industry standard for SEO professionals and rank tracking software. Tools like Semrush, Ahrefs, AccuRanker, and Moz relied heavily on this feature to efficiently gather comprehensive search engine results page (SERP) data with a single request, rather than making multiple paginated calls.
On 12th September 2025, the SEO community discovered that Google had disabled the ability to use the parameter &num=100. The change appeared without any prior announcement, developer documentation update, or official communication from Google, leaving the industry scrambling to understand and adapt.
When questioned about the change, a Google spokesperson stated: “The use of this URL parameter is not something that we formally support.” However, this stands in contrast to Google’s informal support of the feature for over a decade, suggesting the removal was a strategic decision rather than simply discontinuing an unsupported feature.
Why the sudden change? Industry experts theorise several motivations:
- Combating excessive bot scraping: The explosion of AI tools and language models scraping Google’s results placed an unprecedented load on Google’s infrastructure
- Reducing server costs: Serving 100 results per request consumed significantly more resources than standard 10-result pages
- Data integrity: Bot traffic was distorting Google Search Console metrics, making impression data less reliable for genuine website owners
- Competitive positioning: Limiting bulk data access may help protect Google’s competitive advantage in search data
How Does This Change Affect SEO Tools and Rank Tracking?
SEO tool vendors now face a “10x increase in crawl load” – and in the server power and infrastructure required to gather the same dataset – because retrieving 100 results now requires 10 separate page requests instead of one.
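The arithmetic behind the 10x figure is simple: instead of one request with &num=100, a tracker must now walk Google’s standard pagination (the long-standing &start offset parameter) in steps of 10. A sketch, generating the URLs only, not fetching them:

```python
from urllib.parse import urlencode

def paginated_urls(query: str, depth: int = 100, page_size: int = 10):
    """Yield the paginated search URLs now needed to cover `depth`
    positions. Before the change, a single &num=100 request returned
    the same data in one call."""
    base = "https://www.google.com/search"
    for start in range(0, depth, page_size):
        yield f"{base}?{urlencode({'q': query, 'start': start})}"

urls = list(paginated_urls("rank tracking"))
print(len(urls))  # 10 requests where one used to suffice
```

At identical query volumes, that is ten times the requests, bandwidth, and anti-bot friction per keyword tracked to depth 100.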
The immediate consequences across the industry have been significant:
Operational disruptions: Multiple reports emerged of broken dashboards, missing ranking data, error states, and incomplete SERP reports immediately following the change. Major platforms, including Semrush, Ahrefs, and Keyword Insights, all acknowledged the issue and worked to implement workarounds.
Cost implications: The exponential increase in API calls directly translates to higher infrastructure costs for tool providers. These expenses will inevitably be passed to customers through subscription price increases in the coming quarters.
Reduced tracking depth: Some providers, like AccuRanker, have adapted by changing their default tracking depth to the top 20 results, explaining that “retrieving more results from Google requires pulling additional pages, and each extra page significantly increases costs”.
Slower reporting cycles: The need to perform more API calls means collecting, processing, and generating rank tracking reports now takes considerably more time, potentially delaying insights and decision-making.
For website owners and marketers, this means:
- Expect potential price increases from your SEO tool subscriptions
- Consider whether you genuinely need tracking beyond the top 20-50 positions
- Evaluate whether your tools still provide adequate value given potential depth reductions
- Diversify your SEO tool stack to avoid over-reliance on a single provider
What Happened to My Google Search Console Data?
Following Google’s removal of the &num=100 parameter, 87.7% of sites experienced impression declines in Google Search Console, according to analysis of 319 properties by Tyler Gargula at LOCOMOTIVE Agency.
If you checked your Search Console around mid-September 2025, you likely noticed two seemingly contradictory changes:
Sudden impression drops: A noticeable decline beginning around 10th–12th September 2025, as bot-driven impressions vanished from reporting, with desktop searches particularly affected.
Improved average position: Your average ranking may have improved overnight, appearing closer to position 1.
Why This Makes Perfect Sense
Google counts an impression whenever a link to your site appears on the page of results that is actually loaded. When rank-tracking bots loaded 100-result pages using the parameter, a website ranking at position 99 still registered an impression, even though no human ever scrolled there.
Here’s a practical example of how bot impressions distorted your data:
Imagine your website ranks at position 8 for a valuable keyword where real users frequently find you. Throughout the day, rank tracking tools also query Google and load all 100 results, so your site registers impressions not only at position 8 but also at positions 67, 78, or 92 for long-tail keyword variations that no human ever saw.
Because average position is an impression-weighted average, those deep bot impressions dragged your reported figure to a far worse (numerically higher) position than where real users actually encounter your website in search results. With bot impressions removed, your average position naturally improved to better reflect genuine user behaviour.
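To make the weighting concrete, here is the calculation with entirely hypothetical impression counts (the formula – impression-weighted mean position – matches how Search Console reports average position; the numbers are invented for illustration):

```python
def average_position(impressions):
    """Impression-weighted average position:
    sum(position * count) / sum(count)."""
    total = sum(count for _, count in impressions)
    return sum(pos * count for pos, count in impressions) / total

# Hypothetical figures: real users see the site at position 8,
# but bot page-loads also logged deep long-tail positions.
with_bots = [(8, 500), (67, 200), (78, 150), (92, 150)]
without_bots = [(8, 500)]

print(average_position(with_bots))     # 42.9
print(average_position(without_bots))  # 8.0
```

The same site, the same rankings: removing the bot rows moves the reported average from 42.9 to 8.0 without anything changing in the real world.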
The Silver Lining: More Accurate Data
With the elimination of the &num=100 parameter, artificial impressions are no longer recorded, so your average position becomes more representative of where real users actually find your website, resulting in cleaner data that better reflects true visibility to potential customers.
What you should expect moving forward:
- Lower impression counts – but these represent actual potential visibility
- Improved average positions – reflecting genuine user search patterns
- Stable click numbers – because bots never clicked
- Higher click-through rates – as the denominator (impressions) decreased whilst clicks remained constant
- More actionable insights – data now aligns with real user behaviour
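The CTR effect in the list above is pure arithmetic: clicks stay fixed while the denominator shrinks. A quick sketch with made-up numbers:

```python
def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate as a percentage."""
    return 100 * clicks / impressions

# Hypothetical: clicks unchanged, bot-driven impressions removed.
clicks = 120
impressions_before = 10_000  # included bot page-loads
impressions_after = 4_000    # human impressions only

print(ctr(clicks, impressions_before))  # 1.2
print(ctr(clicks, impressions_after))   # 3.0
```

The jump from 1.2% to 3.0% reflects no change in behaviour, only cleaner measurement, which is why post-change CTR should not be compared directly against pre-change figures.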
Should I Be Worried About My Website’s Performance?
No. Your actual search performance likely hasn’t changed at all.
This change doesn’t impact your actual search visibility or rankings—what’s changed is how accurately that performance is being measured and reported. The dramatic shifts you’re seeing in Search Console are accounting corrections, not performance declines.
Think of it this way: if a footfall counter outside your physical shop logged both genuine customers and the delivery couriers who walk past without ever looking in, then suddenly started counting only genuine customers, your foot traffic numbers would drop significantly – but your actual business hasn’t changed.
Key Reassurance Points
- Your rankings remain the same: Nothing about how Google ranks your content has changed
- Real user traffic is unaffected: Human-driven traffic remains stable, as bots never clicked results
- Your SEO strategy should continue: Quality content, technical optimisation, and link building remain as important as ever
- The data is now more reliable: You can make better decisions based on accurate human behaviour patterns
However, there are strategic adjustments to consider for optimising in this new landscape.
How Should Website Owners and Marketers Adapt Their Strategy?
The removal of the &num=100 parameter marks a broader shift towards more realistic, user-focused SEO metrics. Here’s how to adapt:
Focus on Metrics That Matter
Prioritise top-20 positions: The vast majority of users never go past the first page of results. Concentrate your efforts on keywords where you can realistically reach page one rather than tracking positions 50-100.
Track business outcomes: Monitor clicks, conversion rates, and revenue rather than vanity metrics. These measurements were always the true indicators of SEO success.
Set new baselines: Use the week-over-week changes since 10th September as a new baseline and note any substantial changes in your reporting. Don’t compare post-September data directly to earlier periods without accounting for the methodology change.
Adjust Your Reporting Approach
Educate stakeholders: Explain to clients, managers, or board members why impression numbers dropped significantly, but this represents better data quality rather than a performance decline.
Emphasise CTR improvements: The natural increase in click-through rates post-change demonstrates your content’s genuine appeal to real searchers.
Benchmark against competitors carefully: Ensure you’re comparing like-for-like data periods when conducting competitive analysis.
Diversify Your SEO Intelligence
Evaluate tool subscriptions: Assess whether your current rank tracking tools still provide adequate depth and value given potential cost increases or feature reductions.
Consider alternative search engines: Bing, DuckDuckGo, and other platforms aren’t affected by this Google-specific change and may offer opportunities for diversification.
Invest in first-party data: Build direct relationships with your audience through email lists, communities, and owned channels rather than relying solely on search platform data.
What Does This Mean for AI-Powered Search and the Future?
The &num=100 removal sits within a broader context of rapid changes in search behaviour and technology. The timing is particularly significant given the explosive growth of AI-powered search alternatives like ChatGPT, Perplexity, and Google’s own AI Overviews.
The Connection to AI Scraping
Industry experts have drawn a connection between the recent spike in impressions and excessive scraping by AI tools, with some suggesting that “all of the AI tools scraping Google are going to result in the shutdown of most SEO tools”.
The proliferation of large language models (LLMs) training on web data, combined with AI-powered search tools querying Google at scale, created unprecedented demand on Google’s infrastructure. The &num=100 parameter made this scraping far too efficient from Google’s perspective.
Optimising for AI-Era Search
To thrive in this evolving landscape:
Create genuinely helpful content: AI systems prioritise content that directly answers user questions with expertise and authority.
Build topical authority: Comprehensive coverage of subject areas signals trustworthiness to both traditional search algorithms and AI systems.
Structure for featured snippets: Clear, concise answers at the beginning of sections increase chances of being cited by AI assistants.
Maintain E-E-A-T signals: Experience, Expertise, Authoritativeness, and Trustworthiness remain critical ranking factors.
The fundamental shift is clear: bulk ranking data becomes less accessible, whilst quality, user-focused content becomes more important than ever.
Author Bio
Aaron Smith is Head of Digital at Blacksmith SEO & Digital with 20 years of experience in digital marketing. Based in Vancouver, he has guided over 30 recent clients through the transition to AI-powered search whilst maintaining and improving organic visibility. Aaron holds a Commerce Degree from Royal Roads University and regularly contributes to industry publications on search marketing evolution. Connect with Aaron Smith on LinkedIn.
What People Are Asking
Q: Will Google bring back the &num=100 parameter?
A: Highly unlikely. Google has officially stated the parameter was never formally supported, and the change addresses significant infrastructure and data quality concerns. The industry consensus is that this change is permanent, and SEO tools have already adapted their systems accordingly.
Q: Are my lower Search Console impressions bad for my SEO?
A: No. The lower impression counts don’t reflect any decline in your actual search performance or rankings. They simply represent more accurate data by removing bot-driven impressions that were never from real potential visitors. Your clicks and genuine user engagement remain unaffected.
Q: Should I cancel my rank tracking tool subscription?
A: Not necessarily. Evaluate whether the tool still provides value for your specific needs. If you primarily need to track top-20 positions and your tool continues to provide reliable data at that depth, it may still be worth the investment. However, compare options and consider whether competitors offer better value following industry-wide adjustments.
Q: How long will it take for Search Console data to stabilise?
A: Most websites saw immediate changes around 10th–14th September 2025. The data should already be stable, reflecting the new methodology. Allow 2-4 weeks of post-change data to establish reliable new baselines for your reporting and decision-making.
Q: Does this affect international or mobile search differently?
A: The impact appears most significant on desktop search data, where rank tracking tools primarily operate. Mobile and international search are affected similarly in terms of the technical change, but the data distortion may have been less severe depending on tool usage patterns in different regions.
Take Action: Get Expert Help Navigating Search Changes
The removal of Google’s &num=100 parameter represents just one of many rapid changes reshaping digital marketing. As search becomes increasingly AI-powered and user-focused, having expert guidance ensures you’re investing in strategies that deliver genuine business results rather than chasing vanity metrics.
Our team specialises in helping website owners and marketers adapt to platform changes whilst maintaining and improving organic visibility. We focus on sustainable, white-hat tactics that work regardless of which parameters Google decides to deprecate next.
Ready to future-proof your SEO strategy? Contact us today for a complimentary audit of your current approach and personalised recommendations for thriving in the new search landscape.