The Real Cost of Blocking AI: Traffic and Revenue Impact Analysis
December 8, 2025
Editorial Policy
All of our content is generated by subject matter experts with years of ad tech experience and structured by writers and educators for ease of use and digestibility. Learn more about our rigorous interview, content production and review process here.
Key Points
- Using an AI blocker filter shows no measurable impact on search traffic. Studies tracking thousands of publisher sites show traffic variations within 1% between sites that block AI bots and those that do not.
- AI referral traffic is growing rapidly but remains small. ChatGPT referrals grew 25-fold in 2024-2025, yet AI platforms still account for less than 0.15% of global internet traffic compared to 48.5% from organic search.
- AI Overviews are reducing click-through rates significantly. Zero-click searches jumped from 56% to 69% between May 2024 and May 2025, with publishers reporting click-through rate reductions of 34% to 46% when AI summaries appear.
- The revenue equation is nuanced. While blocking protects your content from unauthorized training, the emerging AI referral channel shows higher engagement and conversion rates than traditional search.
- Publishers should focus on maximizing the value of traffic they do receive. Regardless of your blocking decision, optimizing ad yield on existing traffic remains the most controllable lever for revenue.
Why Publishers Need to Block AI Crawlers
The publishing world is facing an uncomfortable reality: AI companies are crawling publisher content at unprecedented rates while providing minimal traffic in return. Cloudflare's data reveals a stark imbalance that fundamentally breaks the traditional value exchange that made the internet work.
OpenAI's crawl-to-referral ratio sits at approximately 1,700:1 as of June 2025. Anthropic's ratio is even more lopsided at 73,000:1. Search engines crawl your content, index it, and send traffic back. AI crawlers consume your content and often answer user queries without ever directing visitors to your site.
Publishers like The New York Times, Wall Street Journal, Reuters, and hundreds of others have responded by updating their robots.txt files to implement AI blocker filters.
The question every publisher needs to answer: should you follow their lead, and what will it actually cost you? For a comprehensive overview of your options, our complete publisher's guide to AI crawlers covers whether to block, allow, or optimize for maximum revenue.
Does Blocking AI Bots Hurt Your Traffic?
Let's cut through the speculation and examine what happens when publishers block AI crawlers. The evidence might surprise you.
No Measurable Search Traffic Penalty
Raptive, managing over 6,000 publisher sites, tracked traffic patterns from June 2024 to May 2025 across sites with varying blocking configurations. Sites that blocked one or more AI bots saw no statistically significant change in traffic compared to sites that did not block bots.
The average traffic variation across cohorts was within 1%, which falls within expected fluctuations due to variables like core updates and changes in search behavior. This finding addresses the primary concern publishers have: that implementing an AI blocker filter might somehow anger Google and hurt their search rankings. The data suggests this fear is unfounded, at least for now.
Need a Primer? Read this first:
- A Guide to Increasing Website Traffic: Foundational strategies for growing your traffic before making AI blocking decisions
Understanding Google-Extended and Your Block AI Strategy
Many publishers hesitate to block Google-Extended, the user agent Google uses to access content for Gemini and other AI products. Google has stated that blocking Google-Extended does not affect search rankings or inclusion in AI Overviews.
However, blocking Google's main crawler (Googlebot) would be catastrophic for your traffic. The distinction matters significantly. Google-Extended and Googlebot are separate user agents with different purposes. You can block one without touching the other.
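A minimal robots.txt fragment makes the distinction concrete (illustrative example):

```
User-agent: Google-Extended
Disallow: /
```

Googlebot needs no entry at all: under the Robots Exclusion Protocol, a crawler with no matching user-agent group (and no wildcard `*` group) retains full access.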
| AI Crawler | Purpose | Safe to Block? | Impact on Search |
| --- | --- | --- | --- |
| GPTBot | OpenAI model training | Yes | None |
| ChatGPT-User | ChatGPT web browsing | Yes | None |
| Google-Extended | Gemini/AI training | Yes | None |
| ClaudeBot | Anthropic model training | Yes | None |
| PerplexityBot | Perplexity search | Yes | None |
| Googlebot | Search indexing | No | Devastating |
The Emerging AI Referral Channel
While the blocking debate rages on, something interesting is happening. AI platforms are starting to send meaningful traffic to publishers, even as their crawling behavior remains problematic. Understanding these dynamics is crucial for publishers focused on increasing website traffic through strategic channel diversification.
AI Referral Traffic Growth Rates
ChatGPT referral traffic to publishers is growing rapidly. According to Similarweb data reported by Digiday, ChatGPT sent 243.8 million visits to 250 news and media websites in April 2025, up 98% from 123.2 million visits in January. This represents real visitors arriving at publisher sites through AI-generated answers.
The growth trajectory is significant even if the absolute numbers remain small. Research from SE Ranking shows AI referral traffic's share jumped from just 0.02% in 2024 to 0.15% in 2025, a more-than-sevenfold increase. ChatGPT dominates this channel with 77.97% of all AI-driven referral visits, followed by Perplexity at 15.10% and Gemini at 6.40%.
Higher Quality Traffic from AI Sources
Here's where it gets interesting for publishers focused on revenue. AI referral traffic isn't just growing; it's converting at remarkably higher rates than traditional channels.
Microsoft Clarity's research found that visitors arriving from large language models converted to sign-ups at 1.66% compared to 0.15% from search, 0.13% from direct traffic, and 0.46% from social media. Subscription conversions followed similar patterns, with LLM traffic converting at 1.34% while search managed just 0.55%.
Users arriving via AI platforms have often pre-qualified their intent through detailed conversational queries. They know what they're looking for and are further along in their decision journey.
The AI Blocker Paradox
Publishers face a genuine paradox when deciding whether to block AI crawlers. Blocking protects your content from being used to train models that could eventually replace your site as a traffic source. But blocking might also reduce your visibility in emerging AI referral channels. The good news is that selective AI blocking allows you to permit beneficial bots while blocking harmful scrapers.
What Implementing an AI Blocker Filter Actually Accomplishes
The robots.txt approach has clear limitations. It's a request, not an enforcement mechanism. AI companies can simply ignore it, and some evidence suggests traffic continues flowing to sites that have implemented blocks.
According to Digiday, The New York Times received 240,600 visits from ChatGPT in January 2025, despite the publisher blocking the ChatGPT and Perplexity crawlers in its robots.txt file.
This suggests that blocking primarily affects future training data collection rather than current AI-powered search functionality. Your existing content may already be in the training data, and AI search tools may still surface and link to your content regardless of your robots.txt settings.
The Real Trade-Off When You Block AI
The decision to block isn't really about traffic impact today. It's about several other considerations that cost-conscious publishers and portfolio managers need to weigh carefully.
- Content control is the first factor. Do you want your original content training AI models without compensation?
- Future positioning matters too. Are you betting on licensing deals becoming the norm, and does blocking position you to negotiate?
- Industry solidarity plays a role as well. Does joining other publishers in blocking strengthen the collective bargaining position?
- Finally, there's legal strategy. Does blocking support any potential copyright claims? Publishers navigating these complex questions should understand the legal landscape around blocking AI scrapers in 2025.
Related Content:
- Beyond Zero-Click Search: How publishers can reclaim their audience as AI Overviews reduce click-through rates
- Traffic Shaping and QPS Optimization: Technical strategies for optimizing your traffic quality and ad performance
How to Implement AI Blocker Filters
If you decide to block AI crawlers, implementation is straightforward. For a comprehensive walkthrough, check out our guide on how to block AI bots with robots.txt for publishers. Add the following to your robots.txt file:
User-agent: GPTBot
Disallow: /
User-agent: ChatGPT-User
Disallow: /
User-agent: Google-Extended
Disallow: /
User-agent: anthropic-ai
Disallow: /
User-agent: ClaudeBot
Disallow: /
User-agent: PerplexityBot
Disallow: /
User-agent: FacebookBot
Disallow: /
User-agent: Bytespider
Disallow: /
For WordPress sites, plugins like Yoast SEO provide simple interfaces for managing robots.txt entries. Simply copy and paste the relevant user-agent rules.
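After editing robots.txt, it's worth sanity-checking that the directives actually do what you intend. A quick sketch using Python's standard-library robots.txt parser, with an illustrative rule set that blocks only GPTBot:

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules: block GPTBot everywhere, leave other crawlers alone.
rules = """\
User-agent: GPTBot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())
parser.modified()  # mark the rules as loaded so can_fetch() evaluates them

# GPTBot is denied everywhere; Googlebot, with no matching group, is unaffected.
print(parser.can_fetch("GPTBot", "https://example.com/article"))     # False
print(parser.can_fetch("Googlebot", "https://example.com/article"))  # True
```

In practice you would point the parser at your live file with `set_url()` and `read()` instead of pasting rules inline.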
Beyond robots.txt: Stronger AI Blocker Options
More aggressive blocking options exist for publishers wanting stronger enforcement against AI crawlers. Our technical implementation guide covers advanced methods to block AI from scraping your website.
Server-level blocking lets you configure your web server or firewall to reject requests from specific user agents entirely. CDN-level controls through services like Cloudflare now offer one-click AI crawler blocking, which over a million sites have enabled. IP-based blocking lets you reject known IP ranges associated with AI crawlers, though this requires ongoing maintenance as those ranges change.
Keep in mind that sophisticated scrapers can spoof user agents and bypass these controls. Determined AI companies or their partners can access content through various means if they choose to.
The Bigger Revenue Picture
While the blocking debate captures attention, the more pressing revenue question for most publishers involves the declining click-through rates from traditional search. Publishers focused on sustainable growth should explore programmatic monetization solutions to automate ad revenue and maximize earnings.
How AI Overviews Impact Publisher Traffic
According to Digital Content Next's survey of 19 member publishers, zero-click searches increased from 56% to 69% between May 2024 and May 2025. While this captures trends beyond AI Overviews, the timing aligns with the rollout.
Over eight weeks in May and June 2025, median Google Search referral traffic to publishers fell in almost every week, with weekly losses outpacing gains two to one. Across those eight weeks, the median year-over-year decline in Google Search referral traffic was 10%. This decline affects publishers whether they block AI crawlers or not.
Revenue Impact Comparison
The following table summarizes the current state of various traffic sources for publishers making decisions about AI blocker implementation.
| Traffic Source | Trend | Publisher Control |
| --- | --- | --- |
| Google Organic Search | Declining (-10% YoY median) | Limited |
| AI Overviews Click-Through | Declining (34-46% reduction) | Very Limited |
| ChatGPT Referrals | Growing (98% increase) | Moderate |
| Perplexity Referrals | Flat to declining | Moderate |
| Direct Traffic | Stable | High |
Maximizing Revenue from Your Existing Traffic
Here's the uncomfortable truth: you can't fully control whether AI erodes your search traffic. But you can control how effectively you monetize the traffic you do receive.
Publishers navigating this landscape should focus on the levers they can actually pull. Every visitor represents revenue potential, and optimizing that conversion matters more than ever when total traffic faces headwinds. For publishers looking to take control of their digital presence, website development strategies for increased traffic and revenue generation provide a solid foundation.
Ad Layout and Density Optimization
The balance between user experience and ad revenue becomes more critical when you can't rely on endless traffic growth. Publishers seeing traffic declines need to extract more value from each session without destroying engagement.
Strategic ad placement that prioritizes viewability while respecting user experience can significantly increase effective CPMs. High-impact units like video and interactive formats command substantially higher rates than standard display.
Session Value Focus
With fewer sessions available, session RPM becomes the essential metric. This measures the total revenue generated per visitor session rather than per individual pageview.
Publishers should track session depth to understand how many pages each visitor views. Time on site matters because longer sessions create more impression opportunities. Ad viewability determines whether your ads are actually being seen. Fill rate reveals whether you're leaving unfilled impressions on the table. Understanding invalid traffic and how to combat IVT is also essential for protecting your revenue from fraudulent impressions.
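Session RPM is simply revenue normalized per 1,000 sessions, which makes sites and time periods comparable even as raw traffic fluctuates. A quick sketch with illustrative numbers:

```python
def session_rpm(total_revenue: float, sessions: int) -> float:
    """Revenue earned per 1,000 visitor sessions."""
    return total_revenue * 1000 / sessions

# Illustrative numbers: $450 in ad revenue across 30,000 sessions.
print(session_rpm(450.0, 30_000))  # 15.0
```

Tracking this figure weekly, alongside session depth and viewability, shows whether yield work is offsetting traffic declines.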
Demand Diversification
Relying on a single demand source is risky in any environment. Publishers should ensure they have access to premium programmatic demand through multiple SSPs competing in header bidding auctions. Direct sales relationships bring higher CPM campaigns from brands that value your audience. High-impact format inventory through video and interactive units commands premium rates.
Making the Right Decision for Your Site
The AI blocking decision isn't one-size-fits-all. Your calculus depends on your specific situation and business model.
Publishers Who Should Probably Block AI Crawlers
Original content creators should consider implementing an AI blocker filter if their business model depends on unique, original content that AI could replicate or summarize. News organizations fall into this category, particularly those with paywalls or subscription models where AI summaries directly compete with their value proposition. Publishers pursuing licensing deals should also consider blocking, as it establishes your position and intent for potential negotiations.
Publishers Who Might Wait Before Implementing an AI Blocker
Small technical publishers should weigh their options carefully. If AI referral traffic represents a meaningful percentage of total traffic (even if small in absolute terms), blocking could hurt. Tool and reference sites may benefit from AI visibility since users are likely to click through for functionality rather than just information. Publishers benefiting from AI citations who see meaningful traffic from being surfaced in AI responses might want to monitor the situation before blocking.
Frequently Asked Questions About AI Blocker Filters
What is an AI blocker filter?
An AI blocker filter is a set of instructions added to a website's robots.txt file that tells AI crawlers not to access and scrape content. These filters target specific user agents like GPTBot, ClaudeBot, and Google-Extended to prevent AI companies from using publisher content for model training without consent or compensation.
Does blocking AI crawlers affect my Google search rankings?
No, blocking AI crawlers does not affect Google search rankings. Google has confirmed that blocking Google-Extended (their AI training crawler) does not impact search rankings or inclusion in AI Overviews. Studies tracking thousands of publisher sites show no statistically significant traffic changes after implementing AI blocker filters.
How do I block AI crawlers from my website?
You can block AI crawlers by adding user-agent directives to your robots.txt file. For each AI crawler you want to block, add two lines: "User-agent: [crawler name]" followed by "Disallow: /". Common crawlers to block include GPTBot, ChatGPT-User, ClaudeBot, Google-Extended, and PerplexityBot. WordPress users can use plugins like Yoast SEO to manage these entries.
Will blocking AI crawlers reduce my traffic from ChatGPT and other AI platforms?
Possibly, but the impact is nuanced. AI platforms may still reference and link to your content even after you implement blocking, as your existing content may already be in their training data. Current AI referral traffic remains less than 1% of total traffic for most publishers, so the immediate impact is typically minimal.
Next Steps:
- Ad Layout Optimization: Maximize revenue from your existing traffic with strategic ad placement
- Best Practices Managing Poor Ad Yield Performance: Diagnose and fix revenue drops before they impact your bottom line
Amplify Your Ad Revenue with Playwire
The AI disruption creates genuine uncertainty for publishers. Traffic sources you've relied on for years may deliver fewer visitors tomorrow. The best defense is maximizing the revenue from traffic you do control.
Playwire's RAMP Platform provides publishers with the tools to optimize every impression. Our AI-driven yield optimization works continuously to improve CPMs, while our direct sales team brings premium brand campaigns that command rates far above programmatic averages.
See It In Action:
- Traffic Shaping Revolution: How Playwire's ML algorithm boosted publisher revenue by 12% through intelligent optimization
Whether you choose to block AI crawlers or embrace the emerging AI referral channel, you need a monetization partner focused on your revenue, not just impression volume. Playwire's approach to quality, performance, and transparency means we're aligned with your success.
Ready to stop leaving revenue on the table? Apply now to see how Playwire can amplify your ad revenue regardless of where your traffic comes from.


