So, you create a fresh and juicy piece of content for your website, hit publish, but your first reader is… not even a human, but a bot (or two, or a whole party of bots). Some are your good old buddies (hello, Googlebot), others are something new – AI crawlers. And, of course, just like any innovation, these guys spark plenty of questions and debate: some publishers dream of squeezing into ChatGPT’s recommendations, while others believe AI crawlers are here to eat your content for breakfast and never send a single click back.
Either way, it feels dramatic, like you can’t win, but is that really so? Let’s figure it out.
Just a couple of years ago, life was so much simpler! You mainly cared about Googlebot and maybe Bingbot, depending on the markets you work in. They crawled your pages, indexed them, and sent you traffic. Now you’re dealing with a whole zoo: GPTBot (OpenAI), Google-Extended, ClaudeBot, PerplexityBot, CCBot, and more.
Several reports show that AI crawlers are no longer an exotic sliver of global traffic. They are already so entrenched that some analyses suggest a big slice of all hits to many sites now comes from bots related to LLMs (Large Language Models), not humans.
Take a recent, much-discussed case from the open-source documentation host Read the Docs. The team measured its bandwidth and found that after it blocked AI crawlers, traffic dropped from around 800GB a day to 200GB – a 75% cut in bandwidth and about $1,500/month saved.
So, on one side, AI crawlers may give you some visibility in AI tools – that’s true. What is more, chatbots are commonly perceived as less commercially motivated than other digital spaces, meaning they come across as more objective, less salesy, and therefore more trustworthy. This sounds like a good-to-have traffic source.
However, on the other side, they cost money, burn server resources, and don’t show ads or buy anything. What is more, they can deliver content from your website straight to the user, which means people have fewer reasons to visit your site. And this already makes some publishers reach for that big red “block” button.
As we already know, the real pain for publishers is not just that AI crawlers read your site, but rather what happens next in search results. The problem is that Google’s AI Overviews and other AI summaries answer user questions directly on the SERP, often using publisher content as raw material.
When we asked our SEO expert for a blunt summary, he said:
AI crawlers are not malicious, but they are existentially disruptive.
They are built with one main goal in mind: to keep users inside their own platforms (and it looks like they are doing pretty well). Now, Google, OpenAI, Perplexity, and similar players all want to be the final answer box, not just the gateway that sends people to your site. To do that, they extract and summarize your content, then serve it back to the user – often without giving you a meaningful click in return.
This shifts SEO from the old game of rankings to a new game of feeding large language models. Those models don’t always credit you properly, and they definitely don’t always send traffic. Before, Google’s classic cannibalization didn’t go beyond snippets, but now it’s deeper than that. The question is now asked and answered inside a closed system, and in many cases, your site is simply background fuel.
That said, the picture isn’t all doom, because AI doesn’t completely flatten everyone. Models still prefer content that is authoritative, structured, and well-maintained. They still struggle to fully replace things like real experts, proprietary datasets and databases, first-hand experience, failures, and scars.
So yes, AI crawlers are disruptive. But if you build something more than just another how-to article, you’re not powerless in this new landscape.
Our SEO expert states that AI-driven results can break visibility and CTR, but they do it selectively, not across the board. The most vulnerable group is classic informational, “zero-click” intent searches – all those queries where users really just want a short, clear answer:
For this type of query, the same pattern keeps showing up. Impressions often stay stable or even grow, so on paper, your visibility looks fine; however, once AI Overviews or similar blocks appear, clicks start to collapse. Users see a ready-made answer at the top of the SERP and simply have no reason to visit the site that actually owns the content.
This is very similar to what we already saw with Featured Snippets and “People Also Ask,” just with the volume turned up. You can still have 100,000 impressions for a page, but your CTR evaporates because the answer is already on the results page.
Recent third-party research backs that up:
From an SEO perspective, as our SEO expert highlights:
The pattern is now pretty clear: when a SERP includes AI modules, CTR tends to drop. When it doesn’t, the click curve looks much closer to the classic pre-AI world. For many sites, top-3 organic positions are suddenly behaving more like positions 6-10 used to.
In practice, that often looks like this:
Branded queries are still relatively stable, because users are actively looking for a specific brand or site.
Non-branded informational queries fall sharply once AI blocks appear above the traditional results.
On real projects, as our SEO specialist says, it’s no longer unusual to see a number one ranking slip from something like 28-40% CTR before AI features to roughly 10-22% CTR afterwards, depending on the niche and query type. It doesn’t happen for every keyword, but it happens often enough that you can’t pretend it’s a fluke.
Now, it looks like this question needs additional attention, so we asked our SEO specialist which kinds of content are getting hit the hardest, and which still have a real shot at winning clicks. He said:
At the very top of the casualty list sits simple informational content. This includes definition posts, very basic step-by-step tutorials, and short guides.
If the answer comfortably fits into a short paragraph or a neat little list – “How to reset your iPhone,” “What is CPC,” or “Symptoms of food poisoning” – AI and Google’s own modules can:
The verdict from our SEO expert is tough but honest: this kind of content is unsustainable on its own. It only makes sense if it is clearly connected to:
Next comes “commodity” news – general tech updates, trending gossip, crypto price blurbs, and basic “Apple announced new feature” pieces.
Here, AI summaries stacked on top of Google Top Stories create a particularly bad combo. Users get the gist of the story right there and rarely dig deeper, unless you can deliver:
And if you cannot, you’re basically competing against hundreds of AI-written rewrites. The verdict: this segment is essentially dead unless you’re already a recognized news site or brand.
Classic SEO/affiliate pages are another big casualty. You know the type:
AI can now generate comparison overviews for these products in seconds. Users get a solid high-level summary without needing to click through ten nearly identical listicles that most likely include bias and ads.
The only review pages that still reliably rank and earn are those that show:
In other words, they survive only if you inject serious E-E-A-T steroids (Experience, Expertise, Authoritativeness, Trustworthiness).
Simple, mainstream recipe content is in a similar boat. For queries like “how to cook chicken breast,” Google now often shows:
Users can basically cook the dish without ever visiting your site. There are still exceptions, though:
Overall, the verdict here is low viability unless you’re clearly niche, opinionated, or building a brand rather than just chasing keywords.
The story changes when user intent shifts from “give me the answer” to “help me decide.” As our SEO expert claims, AI crawlers and AI-generated SERP modules do not break visibility equally for all query types, and this is where things get more encouraging for publishers.
Our SEO expert told us that queries with transactional or commercial intent are much less affected:
Long, thoughtful analysis is one of your best weapons. This includes:
AI can summarize existing information, but it cannot magically generate:
Think of articles like:
Search results increasingly reward experience, evidence, and authority. The direction is basically: “Tell me something nobody else knows.”
Interactive tools are another powerful category, and they are extremely future-proof. We’re talking about:
These assets actually solve the user’s problem and keep them inside your interface, not Google’s or a chatbot’s. What is more, this type of content tends to generate shares, bookmarks, and backlinks.
Think of mortgage or loan calculators, pregnancy due date calculators, keyword clustering tools like Thruuu or Keyword Cupid, and YMYL financial health tools. If you want future-proof SEO, don’t just publish another “10 tips” article – build something users will come back for.
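To make “build something users will come back for” a bit more concrete: the engine behind a simple loan calculator is only a few lines of logic, with the real value sitting in the interface, data, and upkeep around it. Here is a minimal, illustrative sketch in Python using the standard annuity formula (the function name and sample numbers are just examples):

def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    """Fixed monthly payment for an amortizing loan (standard annuity formula)."""
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # total number of monthly payments
    if r == 0:
        return principal / n
    return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

print(round(monthly_payment(250_000, 0.05, 30), 2))  # roughly 1342.05 for a $250k loan at 5% over 30 years

Wrap that in a friendly UI, keep it accurate and updated, and you have an asset a chatbot summary can’t substitute for.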
Any content that revolves around a structured, proprietary database is also hard for AI to fully replace. That includes pages where you collect and normalize data, and users can browse, filter, and search through it.
Examples might include:
These kinds of pages often become natural link magnets, strong brand assets, and massive topical authority boosters in their niche.
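For illustration, the core of such a page is often nothing more than a normalized dataset plus a few filters users can combine. A tiny sketch in Python – the dataset, field names, and values here are entirely made up:

# Hypothetical, hand-collected dataset - the kind of structured asset an AI summary can't easily replace
ad_networks = [
    {"name": "Network A", "model": "CPM", "min_payout_usd": 5,  "verticals": ["utilities"]},
    {"name": "Network B", "model": "CPA", "min_payout_usd": 50, "verticals": ["finance", "e-commerce"]},
    {"name": "Network C", "model": "CPC", "min_payout_usd": 10, "verticals": ["content", "utilities"]},
]

def filter_networks(records, model=None, max_min_payout=None, vertical=None):
    """Return only the records that match every filter the visitor actually set."""
    result = []
    for r in records:
        if model and r["model"] != model:
            continue
        if max_min_payout is not None and r["min_payout_usd"] > max_min_payout:
            continue
        if vertical and vertical not in r["verticals"]:
            continue
        result.append(r)
    return result

print(filter_networks(ad_networks, max_min_payout=10))  # Network A and Network C

Collecting and maintaining the data is the hard (and defensible) part; the filtering layer itself is trivial.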
AI can mimic tone, but it cannot fake your actual experience. It doesn’t have your personal failures, your €10k campaign scars, or your accurate industry intuition. Formats that tend to win here include:
This is E-E-A-T in its most weaponized form: instead of recycling already-existing knowledge, you’re sharing lived experience.
AI Overviews currently struggle with video-first knowledge and rich, proprietary visuals. Real footage, authentic screenshots, and complex charts are still hard to compress into a neat AI answer. Things that tend to survive and thrive:
If you show real data and real screens, you’re automatically harder to replace.
Finally, there’s opinion! What can be more human than a critical (and maybe just a bit provocative) opinion?
We know that AI is mostly trained to be safe, generic, and neutral; Google, on the other hand, increasingly rewards content with a clear, strong point of view: contrarian takes, deep niche expertise, and bold, opinionated frameworks.
Think of headlines like:
Opinions require identity, and identity requires authority – that’s something AI fundamentally doesn’t have.
Now, should you be radical towards those AI crawlers? You’ve probably seen the advice: “If you don’t want AI to steal your content, just block GPTBot and its friends.” That naturally leads to a nervous follow-up: if we block AI bots like GPTBot or CCBot but keep Googlebot, are we doing something dangerous to our SEO strategy? Our in-house expert responds:
The first thing to understand is that robots.txt is granular. You can allow Googlebot while blocking AI scrapers. A simple example:
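A minimal sketch might look like this (the user-agent tokens below are the ones these companies document for their crawlers, but double-check each vendor’s current documentation before relying on it):

# Block common AI crawlers...
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

# ...while leaving normal search crawling untouched
User-agent: Googlebot
Allow: /

(The Googlebot block is technically redundant – anything you don’t disallow is crawlable by default – but it makes the intent explicit.)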
In practice, it works like this:
Googlebot handles organic search, rankings, and classic SEO.
AI bots (GPTBot, ClaudeBot, CCBot, PerplexityBot) handle model training, retrieval, and AI-style answers.
These are related but separate ecosystems, so breathe out. However, to make things even clearer, from here, it’s helpful to split the answer into two: direct SEO impact and indirect ecosystem effects.
Our SEO expert’s verdict is clear: there is no direct SEO penalty just for blocking AI bots. Google’s ranking systems primarily care about:
As long as Google’s own crawlers can access your pages as intended, your indexing and rankings are safe. Blocking GPTBot or other non-Google AI crawlers does not automatically tank your SEO.
The crucial nuance comes from the indirect side. If your content is not available to AI models or AI-powered aggregators, you may miss out on:
According to our SEO expert, this can be especially important if you rely heavily on news content, product comparison and review hubs, tutorial/how-to content, or long-tail informational blogs.
In that sense, blocking AI bots can mean fewer brand mentions inside AI tools and fewer referrals from AI-powered experiences. So, from an SEO viewpoint, you can frame it like this:
Direct rankings are safe, as long as Googlebot and friends are allowed.
Indirect AI visibility can be at potential risk – your site might be invisible inside AI answers and AI search experiences.
Publishers don’t block AI crawlers just out of principle. There are very concrete reasons, but they are not about SEO:
Blocking GPTBot, ClaudeBot, CCBot, and similar agents can reduce that exposure and help protect both your content footprint and your infrastructure.
So if you decide to get rid of those crawlers, we also have a piece of advice for you. Instead of copying someone’s robots.txt snippet from Twitter, you’d better think in terms of a real crawl policy.
Robots.txt is still your first line of defence, but many AI scrapers completely ignore it. That’s why more and more publishers implement layered protection, such as:
Some large platforms now even experiment with “pay per crawl” models, where heavy access by big AI companies is monetized rather than handed out for free. Reddit is a well-known example: first restricting many crawlers, then turning its data into licensing deals with Google and OpenAI.
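If robots.txt gets ignored, the next layer is usually refusing those requests outright at the server, CDN, or application level. Here is a minimal, illustrative sketch in Python – a WSGI middleware that returns 403 for a hypothetical list of AI user agents (real setups more often do this in nginx, a CDN, or a WAF, and combine it with rate limiting, since scrapers can spoof the User-Agent header):

AI_BOT_TOKENS = ("gptbot", "claudebot", "ccbot", "perplexitybot")

class BlockAICrawlers:
    """WSGI middleware that refuses requests whose User-Agent matches a known AI crawler."""

    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        ua = environ.get("HTTP_USER_AGENT", "").lower()
        if any(token in ua for token in AI_BOT_TOKENS):
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Automated AI crawling is not allowed on this site.\n"]
        return self.app(environ, start_response)

# Usage with any WSGI app (e.g. Flask): app.wsgi_app = BlockAICrawlers(app.wsgi_app)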
Don’t rely on CAPTCHAs alone
During research for this piece, we came across an affLIFT thread where a member described a ChatGPT-based agent successfully navigating Cloudflare’s “I’m not a robot” Turnstile check on its own. The agent mimicked human behavior – mouse movement, timing, browser signals – well enough to pass.
One member joked that agents controlling a computer like a human “will have no problem clicking that,” and another pointed out that security systems need to evolve fast or CAPTCHAs will become useless.

So yes, CAPTCHAs still help against basic scripts – but they are no longer a serious shield against modern AI agents. If you care about who copies your pages, you’ll need:
Watch logs and analytics like a hawk! Check your server logs for known AI User-Agents regularly, monitor bandwidth, response times, and suspicious IP ranges, and constantly compare analytics with monetization (a small log-parsing script like the sketch after this list is enough to get started):
Many “sessions” but low engagement and earnings? Likely bots.
Huge bandwidth spikes without any revenue bump? Also suspicious.
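A short script is usually enough to get that first picture. A minimal sketch in Python – it assumes a combined-format access log at a hypothetical path, and the bot list covers only the user agents mentioned above, so adjust both to your setup:

import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"  # hypothetical path - point this at your own access log
AI_BOT_TOKENS = ("GPTBot", "ClaudeBot", "CCBot", "PerplexityBot")

hits = Counter()
bytes_served = Counter()

# Matches the tail of a combined-format log line: "request" status bytes "referer" "user-agent"
line_re = re.compile(r'"[^"]*" (\d{3}) (\d+|-) "[^"]*" "([^"]*)"')

with open(LOG_PATH, encoding="utf-8", errors="replace") as f:
    for line in f:
        m = line_re.search(line)
        if not m:
            continue
        _status, size, ua = m.groups()
        ua_lower = ua.lower()
        for token in AI_BOT_TOKENS:
            if token.lower() in ua_lower:
                hits[token] += 1
                bytes_served[token] += 0 if size == "-" else int(size)
                break

for token, count in hits.most_common():
    print(f"{token}: {count} requests, {bytes_served[token] / 1e9:.2f} GB served")

If the bots at the top of that list are eating serious bandwidth while your earnings stay flat, that’s your answer.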
In short, AI crawlers aren’t pure evil, but they are rewriting the rules: they hit simple informational content the hardest, leave some commercial and expert formats still standing, and force you to think in terms of a real crawl policy instead of blind trust.
Your best move now is to protect your site from pointless bot load, invest in content and tools that AI can’t easily replace, and squeeze as much value as possible from every human who still lands on your pages. And if you want those human visits to turn into actual money, make sure your Monetag Smartlink or other preferred ad format is set up. Track the efficiency of your Monetag ads and see whether profit stays as high as ever – that’s also a great sign your website is getting out of the AI “jail”!