Avg. Position Updates: Why Your Search Console Data Got Better Overnight

Your boss is thrilled. Your average position in Google Search Console improved from 45 to 8.5 overnight. Your rankings look incredible. Your visibility appears to have skyrocketed.

You did absolutely nothing to deserve this.

Welcome to the most confusing data anomaly of 2025, where everyone's rankings improved simultaneously and nobody wants to admit their sudden "success" is completely meaningless.

What Actually Happened (And Why Nobody Told You)

In mid-September 2025, Google quietly killed support for the &num=100 parameter. This obscure technical change eliminated the ability for tools and crawlers to pull 100 search results per query instead of the standard 10 or 20.

Sounds boring. Was catastrophic for how we measure SEO performance.

Here's what that parameter was doing. Third-party SEO tools, automated crawlers, and various monitoring systems were using it to track where websites ranked across all 100 possible positions on Google. When Google stopped supporting it, all those data points vanished. Search Console recalibrated to reflect only what actual humans see—typically the first 10 to 20 results.
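
If you want to picture the mechanism, here's a rough sketch in Python of the request pattern those tools depended on. The query and URLs are illustrative, not any particular tool's code; the point is that a single query-string flag asked Google for ten times the normal page depth.

```python
from urllib.parse import urlencode

# What a normal human search looks like: roughly 10 results per page.
user_query = "https://www.google.com/search?" + urlencode({"q": "running shoes"})

# What rank-tracking crawlers were sending: num=100 pulled positions 1-100 in one request.
crawler_query = "https://www.google.com/search?" + urlencode({"q": "running shoes", "num": 100})

print(user_query)     # .../search?q=running+shoes
print(crawler_query)  # .../search?q=running+shoes&num=100  <- no longer supported as of mid-September 2025
```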

Your rankings didn't improve. Google just stopped counting the noise.

The result? Average positions that were being dragged down by rankings in positions 47, 68, and 93 suddenly disappeared from the calculation. What remained were your stronger rankings in positions 1-20. Your average position mathematically had to improve because the worst data points dropped out of the calculation entirely.
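
A quick back-of-the-envelope sketch with invented rankings makes the arithmetic obvious:

```python
# Hypothetical keyword rankings for one site (the positions where it appeared).
positions = [3, 8, 12, 47, 68, 93]

# Old calculation: every tracked position counted, including bot-only depths.
old_avg = sum(positions) / len(positions)   # (3+8+12+47+68+93) / 6 = 38.5

# New calculation: only positions real users might see (roughly the top 20) remain.
visible = [p for p in positions if p <= 20]
new_avg = sum(visible) / len(visible)       # (3+8+12) / 3 = 7.7

print(old_avg, new_avg)  # 38.5  7.67 -- same site, same rankings, very different average
```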

The Alligator That Finally Closed Its Mouth

SEO professionals had been watching something strange since February 2025. Impressions were climbing steadily while clicks remained flat. When you charted impressions against clicks, it created a shape that looked like an alligator opening its mouth wider and wider.

Everyone had theories. AI Overviews were inflating impression counts. Zero-click searches were exploding. Google was showing more results per query. All plausible. All wrong.

The alligator's mouth snapped shut on September 12th. Impressions dropped dramatically overnight. Not because visibility declined, but because automated crawlers had been artificially inflating the numbers for months.

Those "impressions" were never human eyeballs. They were bots scraping Google results beyond what any reasonable person would scroll through. Some were legitimate SEO tools tracking rankings. Others were likely large language models like ChatGPT using third-party tools to harvest search data.

You were getting credit for visibility that existed only in automated systems.

Why Sites Are Seeing "Better" Rankings Everywhere

The math is simple but the implications are weird.

Before the change, if you ranked #3 for one keyword and #87 for another, your average position was 45. After the change, Google only counts the #3 ranking because #87 is beyond what it now tracks. Your new average position is 3.

Congratulations on your 93% improvement in average position. You achieved this by doing nothing while Google changed its methodology.

This is happening across virtually every website with any meaningful search presence. Everyone's average position improved simultaneously. Which means nobody's competitive position actually changed.

It's like if a teacher announced they're only grading the top three assignments from each student instead of all ten. Everyone's GPA improves, but class rank stays exactly the same.


The Shift From Bot Traffic to Human Data

Here's what makes this change actually valuable despite the confusion it created.

We're now measuring what matters. Real human search behavior. Actual visibility in results people might conceivably see.

The old system was counting impressions for rankings so deep in search results that they were functionally invisible. Position 87 on Google isn't a ranking—it's SEO purgatory. Nobody scrolls that far. Including it in visibility metrics was technically accurate but practically meaningless.

The new baseline reflects reality. If you rank on page six of Google results, does that ranking exist in any meaningful sense? Philosophically interesting. Operationally irrelevant.

Search Console is now showing us the search landscape as users actually experience it, not as bots catalog it.

What Marketers Should Actually Trust Now

Stop comparing pre-September data to post-September data without massive caveats. You're comparing two different measurement systems.

Annotate everything. In any report that spans the September 12th change, include a clear note explaining the methodology shift. Your future self will thank you when someone asks why metrics jumped or dropped.
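
If your reporting comes out of a script rather than a slide deck, bake the note into the chart itself. Here's one rough way to do it with matplotlib, using made-up numbers; swap in your own Search Console export.

```python
import matplotlib.pyplot as plt
import pandas as pd

# Illustrative daily impressions with a fake step-change around the cutover.
dates = pd.date_range("2025-09-01", periods=30, freq="D")
impressions = [5000] * 11 + [3200] * 19

fig, ax = plt.subplots()
ax.plot(dates, impressions, label="Impressions")

# Mark the methodology change so future readers don't mistake it for a ranking event.
cutover = pd.Timestamp("2025-09-12")
ax.axvline(cutover, linestyle="--", color="gray")
ax.annotate("num=100 support removed\n(methodology change)",
            xy=(cutover, 3200), xytext=(cutover, 4200), ha="left")

ax.legend()
plt.show()
```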

Use the current data as your new baseline. The numbers you see today are what Google Search Console will report going forward. This is your new normal. Build your expectations and targets around these figures.

Focus on clicks, not impressions. Clicks were always the more reliable metric, but now they're essential. Impressions got weird for months and then recalibrated. Clicks remained consistent because they measure actual user behavior, which didn't change.

Track trends within the new system. Month-over-month changes after September 12th are meaningful. Comparing October 2025 to March 2025 is comparing apples to whatever that alligator was eating.

Watch queries ranking in positions 1-20 specifically. This is now the range Search Console reliably tracks. Your performance within this tier matters more than ever because it's what actually gets measured.
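
Putting those last few points together, here's a rough sketch of what building that new baseline might look like against a Search Console performance export. The file name and column headers are assumptions based on a typical CSV export; adjust them to match whatever your export actually contains.

```python
import pandas as pd

# Load a Search Console performance export (columns assumed: date, query, clicks, impressions, position).
df = pd.read_csv("search_console_export.csv", parse_dates=["date"])

# 1. Only trust data from the new measurement regime.
post_change = df[df["date"] >= "2025-09-12"]

# 2. Focus on the range Search Console now reliably tracks.
top_20 = post_change[post_change["position"] <= 20]

# 3. Track month-over-month trends within the new system.
monthly = top_20.groupby(top_20["date"].dt.to_period("M"))[["clicks", "impressions"]].sum()
monthly["ctr"] = monthly["clicks"] / monthly["impressions"]

print(monthly)
```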

The Metrics That Still Matter (Maybe More Than Before)

Average position is still useful—just differently. Instead of showing where you rank across all possible positions, it shows where you rank in the zone that actually drives traffic. This is arguably more valuable information.

Impressions stabilized at lower levels, but they're more accurate. You're seeing real search activity instead of bot-inflated numbers. Lower doesn't mean worse. Lower means honest.

Click-through rate became more reliable. When impressions were inflated with bot traffic, CTR calculations were artificially depressed. Now that impressions reflect human searches, CTR shows actual human behavior.
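
A quick worked example with invented numbers shows how the same clicks read as a healthier CTR once the bot impressions drop out:

```python
clicks = 500

# Before: impressions padded by crawler requests deep in the results.
inflated_impressions = 50_000
old_ctr = clicks / inflated_impressions   # 1.0%

# After: impressions reflect human searches only.
human_impressions = 20_000
new_ctr = clicks / human_impressions      # 2.5%

print(f"{old_ctr:.1%} -> {new_ctr:.1%}")  # 1.0% -> 2.5%, with no change in user behavior
```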

Total clicks remain the north star metric. They never lied. They still don't. If clicks are growing, your SEO is working regardless of what impressions or average position claim.

Moving Forward With Clearer (If Smaller) Numbers

The hardest part of this change isn't the data—it's explaining why your metrics suddenly look different to stakeholders who weren't obsessively tracking the alligator effect.

Frame it as a refinement, not a loss. You didn't lose visibility. You lost the illusion of visibility in places that never mattered.

For most reporting contexts, simply note the methodology change and use current figures as the baseline. If you're running sophisticated models or long-term forecasting, you might need to normalize historical data. But for everyday performance tracking, just move forward with what Search Console shows now.

Google made fewer results available to automated systems. Real searchers still find what they're looking for. Your actual SEO performance hasn't changed.

Just your ability to pretend position 73 was meaningful.
