
Fabrice Canel and Krishna Madhavan from the Microsoft Bing team published a useful blog post titled Does Duplicate Content Hurt SEO and AI Search Visibility? The short answer is yes, and the long answer is also yes.
I like how Fabrice Canel put it on X: "Similar pages blur signals and weaken SEO and AI visibility." Yep, when you give search engines like Bing mixed and confusing signals, that can cause issues with both traditional and AI search. This is why John Mueller from Google keeps talking about consistency.
That being said, the Bing blog post goes into the AI aspect of all of this. It reads:
- AI search builds on the same signals that support traditional SEO, but adds additional layers, especially in satisfying intent. Many LLMs rely on data grounded in the Bing index or other search indexes, and they evaluate not only how content is indexed but how clearly each page satisfies the intent behind a query. When several pages repeat the same information, those intent signals become harder for AI systems to interpret, reducing the likelihood that the correct version will be selected or summarized.
- When multiple pages cover the same topic with similar wording, structure, and metadata, AI systems cannot easily determine which version aligns best with the user’s intent. This reduces the chances that your preferred page will be chosen as a grounding source.
- LLMs group near-duplicate URLs into a single cluster and then choose one page to represent the set. If the differences between pages are minimal, the model may select a version that is outdated or not the one you intended to highlight.
- Campaign pages, audience segments, and localized versions can satisfy different intents, but only if those differences are meaningful. When variations reuse the same content, models have fewer signals to match each page with a unique user need.
- AI systems favor fresh, up-to-date content, but duplicates can slow how quickly changes are reflected. When crawlers revisit duplicate or low-value URLs instead of updated pages, new information may take longer to reach the systems that support AI summaries and comparisons. Clearer intent strengthens AI visibility by helping models understand which version to trust and surface.
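To make the clustering point in that third bullet concrete, here is a minimal sketch of how near-duplicate detection generally works: break each page's text into word shingles, compare shingle sets with Jaccard similarity, group pages above a threshold, and pick one representative per group. This is an illustration of the general technique, not Bing's actual pipeline; the example.com URLs, the 0.8 threshold, and the "longest text wins" rule are all assumptions for the demo.

```python
# Sketch of near-duplicate URL clustering, in the spirit of the Bing post's
# description. NOT Bing's actual pipeline: the example pages, the 0.8
# threshold, and the representative-selection rule are illustrative only.

def shingles(text: str, k: int = 3) -> set:
    """Break text into overlapping k-word shingles (word n-grams)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity: |intersection| / |union|."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def cluster_near_duplicates(pages: dict, threshold: float = 0.8) -> list:
    """Greedily group pages whose shingle sets overlap above the threshold.

    pages maps URL -> page text. Returns a list of URL clusters.
    """
    fingerprints = {url: shingles(text) for url, text in pages.items()}
    clusters = []
    for url, fp in fingerprints.items():
        for cluster in clusters:
            # Compare against the cluster's first member (its seed).
            if jaccard(fp, fingerprints[cluster[0]]) >= threshold:
                cluster.append(url)
                break
        else:
            clusters.append([url])
    return clusters

def pick_representative(cluster: list, pages: dict) -> str:
    """Pick one URL per cluster; here, simply the one with the most text."""
    return max(cluster, key=lambda url: len(pages[url]))

if __name__ == "__main__":
    # Hypothetical pages: two exact duplicates, one near-duplicate, one distinct.
    pages = {
        "https://example.com/widgets": "Our blue widgets ship fast and are built to last for years.",
        "https://example.com/widgets?utm=ad": "Our blue widgets ship fast and are built to last for years.",
        "https://example.com/widgets-old": "Our blue widgets ship fast and are built to last for ages.",
        "https://example.com/gadgets": "Gadgets page with entirely different content about gadgets.",
    }
    for cluster in cluster_near_duplicates(pages):
        print(pick_representative(cluster, pages), "<-", cluster)
```

Production systems typically use scalable fingerprinting like SimHash or MinHash rather than pairwise Jaccard comparisons, but the consequence the Bing post describes is the same: when your variants differ only trivially, the representative the system picks for the cluster may be an outdated page or not the one you intended to highlight.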
It is really a good blog post, and it goes into tons of detail. And I believe this is all relevant for Google Search and Google AI responses as well.
So read the Does Duplicate Content Hurt SEO and AI Search Visibility blog post.
🎁 Santa may check his list twice, but duplicate web content is one place where less is more. Similar pages blur signals and weaken SEO and AI visibility. Give your site the gift of clarity in our latest Bing Webmaster Blog. https://t.co/TPrOQGywHJ #Bing #SEO #AIsearch pic.twitter.com/EXwx8cSFw0
— Fabrice Canel (@facan) December 19, 2025
Update from John Mueller of Google:
Yes! I think it's even more the case nowadays. Mainstream search engines have practice dealing with the weird & wonky web, but there's more than just that, and you shouldn't get lazy just because search engines can figure out many kinds of sites / online presences.
— John Mueller (@johnmu.com) December 24, 2025 at 8:41 AM
Forum discussion at X.

