Below are the most recent 30 comments. I try to keep it clean of comment spam, but sometimes things get through and it takes me several hours to get to it, so please excuse any comment spam.
The only difference is that optimizing for AI doesn't make sense, because there's no reward for appearing in AI-generated answers. Users will not click through to your site, so you'll never recoup the costs you invested in optimizing your content.
Just shows you again why the "AI economy" is unsustainable, and why it will lead to the complete degeneration of the web, even for the content that stays on it or still gets published.
I block China and other countries through CloudFlare, and loads of visits still show up in GA. They don't show up in CloudFlare's own stats though, so I don't trust the info in GA - I'm sure a lot of that traffic is estimated.
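For reference, a block like that is typically a Cloudflare custom rule with the action set to Block; a minimal sketch of the rule expression, assuming the ip.geoip.country field (newer Cloudflare docs call it ip.src.country) and an example country list:

    (ip.geoip.country in {"CN" "RU"})

Requests blocked this way never reach the origin, so any "visits" from those countries that still show up in GA have to be coming from somewhere else, e.g. proxies outside the blocked countries or estimated/modeled data.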
What sticks with me most about noindex is an experiment I once saw where someone put the noindex tag inside the body, and it actually worked: the page didn't get indexed.
That story never fails to make me smile.
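For anyone who wants to picture that experiment, a minimal sketch of the setup, with the robots meta tag deliberately misplaced in the body instead of the head:

    <!DOCTYPE html>
    <html>
    <head>
      <title>noindex-in-body test</title>
    </head>
    <body>
      <!-- deliberately misplaced: this tag belongs in <head> -->
      <meta name="robots" content="noindex">
      <p>Visible page content</p>
    </body>
    </html>

The usual explanation is that HTML parsers are lenient with stray metadata and may still treat the tag as head content, so it can take effect; either way, it's a quirk, not something to rely on.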
Strange that it's stopped reporting CWV for mobile but is still reporting for desktop. I serve the same version for desktop and mobile. My site is predominantly a desktop site, but that still doesn't explain the drop.
Also, I've noticed a sudden increase in desktop CWV issues, but I can't reproduce the error in Chrome or WebPageTest :(
https://uploads.disquscdn.com/images/36291d5995212451d6fdc1f5b8ddf61032e8fe44434a94f7c6016bbc09d6e231.png
Yes, the GEO IP DB is paid and works extremely well for country blocking. But Alibaba, Tencent, etc. have servers all over the world, so those must be addressed manually. We're ecom and get attempted credit card testing fraud orders too, and those come from all over the world, including compromised PCs in the USA running hidden proxies. With country blocks in place, and by dealing with the more elusive scraper bots as they become known, we've locked most of them out and have it under control.
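For the manual part, a minimal sketch of that kind of range check, in Python, with documentation CIDRs standing in for whatever published provider ranges you actually maintain (these are not real Alibaba or Tencent ranges):

    import ipaddress

    # Hypothetical example ranges; in practice, maintain this list
    # from the providers' published IP ranges.
    BLOCKED_NETWORKS = [
        ipaddress.ip_network("203.0.113.0/24"),
        ipaddress.ip_network("198.51.100.0/24"),
    ]

    def is_blocked(ip: str) -> bool:
        """Return True if the IP falls in any blocked network."""
        addr = ipaddress.ip_address(ip)
        return any(addr in net for net in BLOCKED_NETWORKS)

    print(is_blocked("203.0.113.42"))  # True
    print(is_blocked("8.8.8.8"))       # False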
Is anyone able to access the "settings" tab within GSC? Across all my properties, I've been hit with a 504 error for at least 3 weeks now. https://uploads.disquscdn.com/images/1bacd4efc80ae7e3e78447cf98eb0d35053968b1cf59f53e3d3a73610d3df8ff.png
Down 98% in the last few days since the update, impressions have tanked to their lowest in years, absolute crickets, no articles being indexed. What a lovely way to celebrate the festive season. Stay safe out there, everyone, and happy holidays!
Honestly, the sheer amount of headaches this specific disconnect causes is exhausting.
It’s frustrating to be the one constantly telling brilliant dev teams that they have to “dumb down” modern architectures because the crawler won’t execute past the first signal. We’re building spaceships for users, but we still have to park them in a horse stable so the bot can make sense of them.
I’m tired of debugging pages that work perfectly in real browsers but remain invisible to Googlebot. It’s 2025, we shouldn’t still be fighting the crawler at this level.
Yeah, I know what you're thinking: why have I reposted your comment with paragraphs? I want to test a theory, to see if Disqus treats one large paragraph as spam.
If we used GA and removed our block list, I'm sure our traffic would look similar. You folks using GA need to get a handle on these bots because I'd say 90% of the hits our site gets are from bots.
GA traffic data has to be highly unreliable due to its failure to distinguish valid human traffic. Doesn't anyone have or sell an updated exclusion list for those using GA? If not, that would be a great opportunity for anyone willing to put in the time and effort.
If all else fails, maybe it's time to move beyond GA and find an analytics service that excludes bots.
That would be far better than flying blind with GA, because flying blind and being oblivious to the lack of real human traffic is what Google wants; otherwise they would have fixed it already.
Thanks. Has Disqus rolled out something new that's doing this or maybe a new mod setting that needs to be dialed down? It appears I'm not the only one with this problem, and there were no links in that post either. Just weird....
Still stuck. I was the one posting yesterday on X about it. The past 2 days, traffic has been almost dead. I really have no clue what is happening. Traffic was never this low in any other core update.
If we used GA and removed our block list, I'm sure our traffic would look similar. You folks using GA need to get a handle on these bots because I'd say 90% of the hits our site gets are from bots. GA traffic data has to be highly unreliable due to its failure to distinguish valid human traffic. Doesn't anyone have or sell an updated exclusion list for those using GA? If not, that would be a great opportunity for anyone willing to put in the time and effort. If all else fails, maybe it's time to move beyond GA and find an analytics service that excludes bots. That would be far better than flying blind with GA, because flying blind and being oblivious to the lack of real human traffic is what Google wants; otherwise they would have fixed it already.
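On the exclusion-list idea, a minimal sketch of the homegrown version, in Python, counting only non-bot hits from a server access log; the user-agent patterns and the combined log format here are illustrative assumptions, not a vetted list:

    import re

    # Illustrative substrings; a real exclusion list would be longer
    # and updated as new scrapers show up.
    BOT_PATTERNS = re.compile(
        r"bot|crawler|spider|facebookexternalhit|GPTBot|Bytespider",
        re.IGNORECASE,
    )

    def count_human_hits(log_path: str) -> int:
        """Count log lines whose user agent doesn't match a bot pattern."""
        human = 0
        with open(log_path) as log:
            for line in log:
                # Combined log format: the user agent is the second-to-last
                # double-quoted field on the line.
                parts = line.rsplit('"', 2)
                ua = parts[-2] if len(parts) == 3 else ""
                if not BOT_PATTERNS.search(ua):
                    human += 1
        return human

    print(count_human_hits("access.log"))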
“What the hell is going on?… Is it full of bots everywhere or what?… We’re getting visits from countries outside our target.” https://uploads.disquscdn.com/images/7f4de04de38be19c4fdbcb120454029b1f1d595b57bfb38c3f088f06a2ff8079.jpg
The GEO IP DB is a paid product, right? I also block some IPs from the USA if they are heavy visitors, because no real user would read that many pages. For me, I can't see why FB needs to send bots to my site at such a heavy rate; they're not a search engine and I don't get that many visitors from them, so I block them.
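For what it's worth, a minimal sketch of a user-agent block at the web server, assuming Apache with mod_rewrite and facebookexternalhit as the crawler's UA (Meta runs other agents too, and UA strings can be spoofed, so an IP-level block is the stricter option):

    # .htaccess sketch: return 403 to Facebook's crawler UA
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} facebookexternalhit [NC]
    RewriteRule .* - [F,L]

Note that since the same crawler builds Facebook link previews, blocking it will also break previews for your pages when people share them.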