It may be the end of the world as we know it for digital publishers. The rise of ChatGPT and other LLMs has caused a shift in traffic away from search engines like Google towards AI search tools, leading to a substantial decrease in traffic for many online publishers.
Leading publications like CNN and Business Insider have reported drops in traffic of as much as 30–40% following the introduction of Google’s AI overviews in May 2024. Less traffic inevitably means less ad revenue, which immediately raises concerns over content monetization—not to mention the future of careers in the publishing industry.
Part of the issue is that users are less likely to click on links when an AI summary appears in search results. If a third-party AI tool synthesizes content and provides a summary to the user, there’s a high chance they’re not going to click through to the original creator. No visit, no ad revenue.
Cloudflare, a content delivery network used by many of the world’s biggest websites, implemented a block on AI crawlers in July 2025 in response to the decline in traffic. This gave website owners control over which crawlers could access their content. From July to December 2025, Cloudflare denied over 416 billion AI scraping requests.
The debate over blocking AI crawlers raises some interesting questions for online publishers: Block everything, and you’ll lose referral traffic from AI overviews and third-party AI search tools. Do nothing, and you’re practically handing over your content to be accessed on a third-party site for free.
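At the simplest level, a selective block can be expressed in robots.txt. Here is an illustrative sketch using user-agent tokens that major AI providers have published (GPTBot for OpenAI, ClaudeBot for Anthropic, Google-Extended for Gemini training); note that compliance with robots.txt is voluntary, which is part of why Cloudflare enforces blocks at the network level instead:

```
# Block AI crawlers while leaving ordinary search indexing alone
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Google-Extended
Disallow: /

# All other crawlers keep full access
User-agent: *
Allow: /
```

A publisher taking the middle path could swap `Disallow: /` for a narrower rule (for example, disallowing only premium sections), keeping summaries visible to AI tools while shielding high-value content.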
How traffic passes from AI search to publisher
AI searches are initiated through chatbots or via search engines with AI overviews. Via chatbots, AI platforms accounted for just 0.15% of global internet traffic in 2025, compared with 48.5% from organic search. While the overall level of traffic is far lower than organic search, it has grown sevenfold since 2024, suggesting it could account for a larger share of traffic in the future.
Research conducted by Ahrefs generated similar findings on chatbot traffic. The study examined 3,000 websites and found that just 0.17% of the average website’s traffic comes from AI chatbots. Click-through rates from AI overviews also tell a similar story, with research showing the presence of AI-generated overviews on Google correlates with a 58% lower average click-through rate.
The limited data available suggests that online publishers are unlikely to see traffic volumes comparable to organic search, at least for the foreseeable future. However, there is evidence that although AI search platforms send fewer visitors, those visitors convert at a higher rate than traditional search traffic. Superprompt analyzed 12 million website visits across more than 350 businesses and found that AI search traffic converts at 14.2% compared to Google’s 2.8%—roughly five times higher.
This points in the same direction as another Ahrefs study, which found that AI search visitors convert at a 23 times higher rate than traditional organic search visitors. The exact multiplier varies between studies, but the pattern holds: AI search traffic may be lower in overall volume, but it can offer a significantly higher conversion rate.
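The volume-versus-conversion trade-off is easy to quantify. A quick illustrative calculation using the Superprompt rates cited above—the visitor counts here are hypothetical, chosen only to make the arithmetic concrete:

```python
# Expected conversions = visitors * conversion rate
ai_rate, organic_rate = 0.142, 0.028       # Superprompt: 14.2% vs 2.8%

organic_visitors = 10_000                  # hypothetical monthly organic traffic
ai_visitors = 1_000                        # a tenth of the volume from AI search

organic_conversions = organic_visitors * organic_rate   # 280 conversions
ai_conversions = ai_visitors * ai_rate                  # 142 conversions

# Break-even: AI traffic matches organic conversions at this fraction of the volume
break_even_ratio = organic_rate / ai_rate               # ~0.20
print(f"AI traffic matches organic conversions at {break_even_ratio:.0%} of the volume")
```

In other words, at these rates AI search would need only about a fifth of the visitor volume to deliver the same number of conversions—which is why some publishers see it as worth courting despite the smaller numbers.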
In this sense, permitting AI crawlers may be worthwhile for publishers seeking high-quality traffic that converts into sign-ups and subscriptions. While publishers are unlikely to enjoy the same traffic levels they saw before AI search, there are still opportunities to promote and monetize content.
Enter the world of Answer Engine Optimization (AEO). Publishers need to decide if the referral traffic they can gain from ChatGPT and other tools is worth allowing crawlers to parse their intellectual property, and whether it’s worth investing in AEO.
Life after AI search: AEO
Tim Sanders, CIO of software comparison website G2, appears enthusiastic about the potential opportunities presented by AI search and has used segmented blocks on crawlers since the Cloudflare update. While G2 isn’t an online publisher in a traditional sense, it maintains a diverse library of software reviews, in-house research, and blog posts—content that could be scraped and served to users on a third-party chatbot, to the detriment of the company’s traffic. But the reality is a little more nuanced.

Image credit: G2
“In the beginning, AI search felt like, ‘wow, traffic’s going down for everyone, not just us,’ but every customer we talked to said traffic’s down. Well, traffic was down because people are going to ChatGPT, getting their answer, and they’re not having to go through Google and read a bunch of articles to synthesize their own answer,” Sanders said.
He continued, “What we understood later though, as the models begin to mature, and it really started for us like spring of last year, as the models begin to really value review sites before they make purchase recommendations, we started to see a lot more influence, a lot more activity, and it’s made a huge difference to our perceived value in front of customers that we talked to.”
Sanders also shared results of a survey conducted by G2 in August 2025, which found that 50% of buyers claimed they started research on AI search chatbots like ChatGPT, instead of going to Google.
“If you’re looking for a CRM solution for your hospital that works on an iPad, you just go to ChatGPT and say, ‘give me the best three CRM solutions for an iPad’ and it becomes the new shortlist. It saves the buyer so much time from where they used to have to go to Google, create a list, and then go to [review websites] and read reports,” Sanders said. “The world’s changing really fast.”
Sanders notes that blocking the crawlers of tools like ChatGPT and Gemini “can cause companies to be invisible.” We can interpret this to mean that companies that block AI search tools from parsing their content will lose out on the users who click through from these tools. That being said, there are a number of measures publishers can take besides blocking crawlers to limit their exposure to LLMs, if they want to protect intellectual property or high-value content.
These measures include slowing down parts of the website to delay crawling activity and force crawlers to abandon sessions, whitelisting to allow traffic from selected sources only, and locking content up in PDFs so that it’s only parsed in a deep research session. Websites can also use JavaScript instead of HTML, as many AI crawlers struggle to render JavaScript content. Sanders suggests one way to manage crawlers is to have the HTML header language available to AI agents but deeper content as subscription only.
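A tiered policy along these lines can be sketched as a small request filter. This is a minimal illustration, not production code: the crawler tokens, path rules, and function name are all assumptions for the example, and a real deployment would sit behind a web framework or CDN rule engine:

```python
# Hypothetical per-crawler access policy: allow, throttle, or deny a request
# based on which bot (if any) appears in the User-Agent header.
AI_CRAWLERS = {"GPTBot", "ClaudeBot", "PerplexityBot"}   # example AI bot tokens
WHITELIST = {"Googlebot", "Bingbot"}                     # traditional search crawlers

def crawler_policy(user_agent: str, path: str) -> str:
    """Return 'allow', 'throttle', or 'deny' for an incoming request."""
    bot = next((b for b in WHITELIST | AI_CRAWLERS if b in user_agent), None)
    if bot is None:
        return "allow"        # ordinary human visitors pass through
    if bot in WHITELIST:
        return "allow"        # whitelisted search crawlers keep full access
    if path.startswith("/reports/"):
        return "deny"         # high-value content stays subscriber-only
    return "throttle"         # slow AI crawlers down on public pages
```

For example, `crawler_policy("GPTBot/1.0", "/reports/crm-2025")` would deny the request, while the same bot hitting a public blog post would merely be throttled—mirroring the kind of segmented blocking described above.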
In the case of G2, category data and basic reviews are available to crawlers, while deeper review content and reports require users to log in. Online publishers will have to learn to adjust to a new paradigm, balancing old school SEO alongside AEO and Generative Engine Optimization (GEO) to rank in next generation AI search tools.
New monetization models emerge
The growth in AI search tools also presents new opportunities for monetization, a process which Cloudflare is aiming to facilitate. “I really think of it as building new models of agentic paywalls,” Will Allen, VP of Product at Cloudflare, told IWAI.
“It’s yet another tool that they can have in their arsenal to decide, what are these new monetization models? They’ll again rely upon subscriptions, ad revenue, conferences, brand recognition, and email newsletters. Now they can also think about, ‘Can I do licensing deals? Can I pay per crawl?’ Another variant of those licensing deals… It’s one of many,” Allen said.

Image credit: LinkedIn
Monetization models like pay-per-crawl are still experimental, but they present another potential channel for publishers to get compensated for their content, alongside licensing deals with AI providers, ad revenue, and subscriptions. That being said, Allen noted that Cloudflare doesn’t have any general numbers on how much publishers can expect to make from these types of arrangements, because they are so domain- and site-specific.
But is it too late? “If your content has been readily available online for anyone to access for many years, it’s [already] been indexed,” Allen said. Though he did note that there’s still an enormous opportunity to deliver value through new content. Billions of people consume content every day, so there’s still demand for quality digital publishing.
Publishing has changed, and its future stands on the edge of a knife
There’s no turning back the clock. What’s scraped has been scraped; regardless of the outcome of The New York Times lawsuit or any of the other high-profile legal conflicts to come, AI search tools aren’t going anywhere. Online publishers must grapple with new ways to deliver value, particularly if more users migrate to AI search tools over Google.
It’s undeniable that many publishers have seen catastrophic losses in traffic, and digital publishing will likely not return to what it was pre-ChatGPT. But there are still opportunities out there for publishers to monetize content. Knowing how to leverage AEO, being selective about which AI crawlers to block, and embracing monetization models like pay-per-crawl is a good place to start. But for many, it’s not going to be enough.