This blog is indexed by Bing and DuckDuckGo but not by Google

I had a chat with ChatGPT (CG) today (13 Sep.), titled 'Google indexing issues', about this issue. CG made some suggestions which I tried out, but they did not solve the problem. However, I now have the satisfaction of having tried out more possible solutions than I had in the past. CG also agreed with my view that Google is probably not indexing this blog due to policy changes it has made over the past few years, whereby it has become more selective about indexing Blogger blogs, especially new ones. Further, these policy changes seem to be private and not disclosed to Blogger users.

Given below are edited extracts from the abovementioned chat.

Me: My blog https://raviswdev.blogspot.com/ was created on or around Jan. 26, 2024 and has over 200 published posts now. Bing indexed it early on and so site:raviswdev.blogspot.com in bing.com search lists many results from my blog. But Google has not indexed it so far. site:raviswdev.blogspot.com in Google.com search gives 0 results. I had also added raviswdev.blogspot.com as a domain property in Google Search Console many months ago but that has not helped at all. 

Any ideas why? Any solutions?

CG: 

Why Bing Indexed but Google Didn’t

  1. Different Crawling Policies

    • Bing is often quicker to index new or smaller sites.

    • Google can be more conservative, especially with Blogger subdomains (*.blogspot.com) because there are many spammy ones.

  2. Domain Property in GSC

    • Adding your site to Google Search Console (GSC) is a good step, but it doesn’t guarantee crawling/indexing.

    • Google still needs to “want” to crawl your pages.

  3. Possible Issues Specific to Google

    • Crawl budget / prioritization: Low-priority blogs sometimes take months to get picked up.

    • Technical issues: Robots.txt, meta tags, canonical issues (though Blogger defaults usually don’t block Googlebot).

    • Content / trust factors: New blog with lots of posts in a short time might look “thin” or “low quality” to Google until it establishes trust.


What You Can Try

  1. Check Coverage Report in GSC

    • Go to Google Search Console → Indexing → Pages.

    • See if Google shows errors like “Crawled – currently not indexed” or “Discovered – currently not indexed.”

    • That will tell you if Google is seeing your pages but choosing not to index them.

  2. Inspect URL Tool in GSC

    • Pick one post URL (like https://raviswdev.blogspot.com/2025/09/...)

    • Use the URL Inspection Tool.

    • Click Request Indexing.

    • If Googlebot fetches it successfully, indexing usually happens in days/weeks. If it says “URL is not on Google” with crawl errors, that’s the clue.

  3. Submit Sitemap

    • Blogger auto-generates a sitemap:

    • https://raviswdev.blogspot.com/sitemap.xml

    • https://raviswdev.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500

    • Go to GSC → Sitemaps → add these.

    • This often speeds up discovery.

  4. Check Robots.txt

    • Visit: https://raviswdev.blogspot.com/robots.txt

    • Make sure it doesn’t block / (the site root) or your post paths.

    • The default Blogger robots.txt usually allows Googlebot (a quick programmatic check is sketched just after this list).

  5. Build Some External Links

    • Share your blog posts on social media, forums, or even link them from another site you control.

    • Google is much more likely to crawl if it finds inbound links.

  6. Patience (but not blind patience)

    • Some new blogs take 6–12 months before Google starts indexing, especially if they’re on Blogspot.

    • If you’ve crossed 8+ months with zero indexing, it’s a strong signal something is off (robots.txt, sitemap not submitted, or Google applying quality filters).
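
[For readers who want to repeat the robots.txt check in point 4 themselves, here is a minimal Python sketch using the standard library's urllib.robotparser. It assumes Blogger is serving its default robots.txt (shown later in this chat) to plain Python clients, and the post URL used is one of this blog's real posts that comes up further down:

from urllib.robotparser import RobotFileParser

ROBOTS_URL = "https://raviswdev.blogspot.com/robots.txt"
POST_URL = "https://raviswdev.blogspot.com/2025/06/using-chatgpt-to-write-react-native.html"

rp = RobotFileParser()
rp.set_url(ROBOTS_URL)
rp.read()  # fetch and parse the live robots.txt

# Googlebot falls under the "User-agent: *" group of Blogger's default robots.txt.
for agent in ("Googlebot", "Bingbot", "*"):
    print(f"{agent:10s} may fetch the post:", rp.can_fetch(agent, POST_URL))

# Blogger's default rules disallow /search pages, so this should print False:
print("Googlebot may fetch /search:",
      rp.can_fetch("Googlebot", "https://raviswdev.blogspot.com/search?q=react"))
]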


Possible Cause in Your Case

  • Since Bing indexed you quickly, and you already added the domain in GSC months ago, the main issue is likely Google not prioritizing Blogspot subdomain content until it gains some trust signals (backlinks, engagement, sitemap submission, etc.).

  • It’s unlikely to be a robots.txt block, unless you changed Blogger settings.


My suggestion for next steps:

  1. Check GSC coverage report + URL Inspection right away (a programmatic way to check the same status via the Search Console API is sketched below).

  2. Submit sitemap.xml manually.

  3. Pick 3–5 key blog posts and do “Request Indexing.”

  4. Try to get at least one external backlink (even from your old blog, GitHub, or social media profile).

That should trigger Googlebot to start crawling.
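
[For checking the index status of several posts without clicking through GSC one URL at a time, Google's Search Console URL Inspection API can be used. A minimal sketch, assuming the google-api-python-client and google-auth packages, a service account key file (gsc-key.json is a placeholder name) and that the service account has been added as a user on the GSC domain property; method and field names are as I understand the API docs, so do verify them against the current documentation. Note that the API only reports status; it cannot request indexing:

from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
SITE = "sc-domain:raviswdev.blogspot.com"  # GSC domain property
POSTS = [
    # add the post URLs whose status you want to check
    "https://raviswdev.blogspot.com/2025/06/using-chatgpt-to-write-react-native.html",
]

creds = service_account.Credentials.from_service_account_file("gsc-key.json", scopes=SCOPES)
service = build("searchconsole", "v1", credentials=creds)

for url in POSTS:
    body = {"inspectionUrl": url, "siteUrl": SITE, "languageCode": "en-US"}
    result = service.urlInspection().index().inspect(body=body).execute()
    status = result.get("inspectionResult", {}).get("indexStatusResult", {})
    # coverageState is where messages like "Crawled - currently not indexed" appear
    print(url)
    print("  verdict:      ", status.get("verdict"))
    print("  coverageState:", status.get("coverageState"))
]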



Me: I had requested indexing for some posts many months back. When I check the status, it shows "Crawled - currently not indexed". I have LinkedIn posts that refer to my posts on this Blogger blog, and those LinkedIn posts usually get indexed by Google. .... Note that I have four or five more Blogger blogs. I suspect that having more than two or three Blogger blogs is tripping up the later Blogger blogs I added, when it comes to Google search indexing. I will now explore your suggestions 3 (Submit Sitemap) and 4 (Check Robots.txt).

...

Me: Output of https://raviswdev.blogspot.com/robots.txt :
User-agent: Mediapartners-Google
Disallow: 
User-agent: *
Disallow: /search
Disallow: /share-widget
Allow: /
Sitemap: https://raviswdev.blogspot.com/sitemap.xml

CG: ... 
✅ Conclusion from robots.txt
  • No crawling block on your posts.

  • Sitemap is correctly linked.

  • So robots.txt is not the cause of Google’s non-indexing.

...

Me: Submitted the two sitemap entries you suggested, separately: https://raviswdev.blogspot.com/sitemap.xml https://raviswdev.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500 
------------- 
GSC gave a message for each that the sitemap was submitted successfully. The entries shown under Submitted sitemaps include both added sitemaps, but the status for each is 'Couldn't fetch'.

...
[I later shared partial outputs of:
https://raviswdev.blogspot.com/sitemap.xml
https://raviswdev.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500 ]

CG: Excellent — thanks for sharing the partial outputs. Both feeds are valid and accessible (they open fine in a browser, show proper XML, list your posts, and have lastmod/updated fields). That means the “Couldn’t fetch” status in GSC is not because the sitemaps are broken. ....
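
[The same check can be scripted. A minimal sketch using the third-party requests package plus the standard library XML parser; it assumes, as is usual for Blogger, that sitemap.xml is a sitemap index pointing at paged sitemaps:

import requests
import xml.etree.ElementTree as ET

FEEDS = [
    "https://raviswdev.blogspot.com/sitemap.xml",
    "https://raviswdev.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500",
]

for url in FEEDS:
    resp = requests.get(url, timeout=30)
    root = ET.fromstring(resp.content)  # raises an error if the XML is malformed
    # Count <loc> elements (sitemap index / URL set) and <entry> elements (Atom feed),
    # ignoring XML namespaces by matching on the local tag name only.
    entries = [el for el in root.iter() if el.tag.split("}")[-1] in ("loc", "entry")]
    print(url)
    print("  HTTP status: ", resp.status_code)
    print("  Content-Type:", resp.headers.get("Content-Type"))
    print("  entries listed:", len(entries))
]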

Me: I could not find a resubmit option on clicking the sitemap. The display says 'Sitemap could not be read', but the Open Sitemap button opens the sitemap.

CG: ...

Bottom line

  • 🚫 “Sitemap could not be read” in GSC ≠ your blog is blocked.

  • ✅ Since your posts are already showing as “Crawled – currently not indexed,” Googlebot is definitely reaching them.

  • 🔑 The main issue is indexing trust/quality, not sitemap fetch failure.


[Test Live URL failed for https://raviswdev.blogspot.com/sitemap.xml]

Me: Test Live URL says: 
URL is not available to Google 
Page availability 
Page cannot be indexed: Excluded by ‘noindex’ tag 
Indexing allowed? 
error 
No: 'noindex' detected in 'X-Robots-Tag' http header

CG: Ah, that explains everything. ✅

The sitemap itself (https://raviswdev.blogspot.com/sitemap.xml) is being served with an X-Robots-Tag: noindex HTTP header, which tells Google: “do not index this URL.”

That’s actually normal for Blogger sitemaps — Google does not index the sitemap page itself. Instead, it uses the sitemap as a discovery feed. The “noindex” header doesn’t apply to your actual posts; it only applies to the sitemap document.
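
[This header difference is easy to see from outside GSC. A minimal sketch using the requests package, comparing the sitemap with an ordinary post (the post URL is the June 2025 article that comes up later in this chat); if the Live Test result above is right, the sitemap should show a noindex X-Robots-Tag while the post should show no such header:

import requests

URLS = [
    "https://raviswdev.blogspot.com/sitemap.xml",
    "https://raviswdev.blogspot.com/2025/06/using-chatgpt-to-write-react-native.html",
]

for url in URLS:
    resp = requests.get(url, timeout=30)
    print(url)
    print("  HTTP status: ", resp.status_code)
    print("  X-Robots-Tag:", resp.headers.get("X-Robots-Tag", "(header not set)"))
]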

👉 But the real issue is that your posts themselves are also stuck at “Crawled – currently not indexed.” That means Googlebot can fetch them, but it’s choosing not to add them to the index.

What this means in practice:

  • “Sitemap couldn’t fetch” in GSC is often misleading for Blogger blogs. The sitemap is accessible, but GSC sometimes fails to process it properly.

  • The bigger concern is why Google isn’t indexing any of your 200+ posts, even though Bing is.

Next steps I’d suggest:

  1. Check one actual blog post in URL Inspection (Test Live URL)

    • Example: https://raviswdev.blogspot.com/2024/07/some-post-title.html

    • See if it shows “Indexing allowed” or also blocked by some noindex.

  2. Confirm Search Preferences in Blogger

    • Go to Blogger Dashboard → Settings → Crawlers and indexing.

    • Ensure:

      • “Enable custom robots.txt” = ON (yours looks fine).

      • “Enable custom robots header tags” = ON.

      • Posts/pages should be set to all (not “noindex”).

  3. Check robots.txt manually for a sample post

    • Try: https://raviswdev.blogspot.com/robots.txt?url=/2024/07/some-post-title.html

    • Make sure it isn’t disallowed.


⚠️ Key point:
If your posts themselves are returning “noindex” in the HTTP header, that’s the real blocker. If only the sitemap has “noindex” (which is expected), then the problem lies elsewhere — possibly low trust / duplicate content filters.
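
[One way to rule out a per-post noindex problem in bulk is to walk the sitemap and check every listed post for a noindex signal, either in the X-Robots-Tag header or in a robots meta tag. A minimal sketch, again assuming sitemap.xml is a sitemap index of paged sitemaps; it samples only the first 20 posts and has no retries or rate limiting:

import re
import requests
import xml.etree.ElementTree as ET

BASE = "https://raviswdev.blogspot.com"

def loc_urls(xml_bytes):
    """Return the text of every <loc> element, ignoring XML namespaces."""
    root = ET.fromstring(xml_bytes)
    return [el.text.strip() for el in root.iter()
            if el.tag.split("}")[-1] == "loc" and el.text]

def has_meta_noindex(html):
    """True if any <meta> tag names robots/googlebot and contains 'noindex'."""
    for tag in re.findall(r"<meta\b[^>]*>", html, re.I):
        if re.search(r"""name=["'](robots|googlebot)["']""", tag, re.I) and "noindex" in tag.lower():
            return True
    return False

# sitemap.xml is an index; each <loc> in it is a paged sitemap listing post URLs.
post_urls = []
for page in loc_urls(requests.get(f"{BASE}/sitemap.xml", timeout=30).content):
    post_urls.extend(loc_urls(requests.get(page, timeout=30).content))
print("posts listed in sitemap:", len(post_urls))

for url in post_urls[:20]:  # sample only; drop the [:20] to scan every post
    resp = requests.get(url, timeout=30)
    header_noindex = "noindex" in resp.headers.get("X-Robots-Tag", "").lower()
    if header_noindex or has_meta_noindex(resp.text):
        print("NOINDEX found:", url)
print("scan done; no 'NOINDEX found' lines above means the sample looks clean")
]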

Me: https://raviswdev.blogspot.com/2025/06/using-chatgpt-to-write-react-native.html - Jun 6, 2025 post. Test Live URL worked. ... As in my past such attempts, I clicked on Request Indexing, which completed successfully.

CG: Perfect 👍 — that’s actually very good news.

Since your actual post URL (the June 6, 2025 article) passed Test Live URL without the noindex issue, that means:

  • ✅ Your posts are indexable (no robots.txt block, no noindex header).

  • ✅ GSC allowed you to request indexing, so Googlebot can fetch it.

  • ❌ The fact that it’s still not showing in Google Search means this is a “Crawled – currently not indexed” case — i.e., Google is deliberately holding back indexing.


Why Google does this

  • Quality/trust filters: New blogs (especially on Blogger, with many posts in a short time) sometimes get held back until Google trusts the site more.

  • Multiple blogs: As you suspected, having 4–5 blogs on the same Google account/IP can make Google slower to index later ones (not an official rule, but many bloggers report this pattern).

  • Thin/duplicate content suspicion: If Google thinks content overlaps with other sites (even your own LinkedIn posts quoting your blog), it may prefer indexing the LinkedIn version.


What you can do now

  1. Keep requesting indexing for your highest-value posts (like the React Native one). Do this gradually — maybe 1–2 per day.

  2. Get backlinks from outside LinkedIn. Even a single link from another site/blog that Google already trusts can “unlock” indexing.

    • Example: GitHub Pages, Medium, or a personal site with a custom domain.

  3. Strengthen E-E-A-T signals: Add an About page, Contact page, and link back to your main profile (book, GitHub, LinkedIn). Google treats such signals as indicators of a real author.

  4. Be patient: Some Blogger blogs take 6–12 months before Google fully indexes them. Bing is much faster, but Google is more selective now.


⚡ Suggestion: Since your LinkedIn posts referencing your blog are indexed, you can try this trick:

  • Write a new LinkedIn post that links only to one blog article you want indexed.

  • Request indexing for that blog article in GSC.

  • Sometimes, Google follows the link from LinkedIn and finally decides to add the blog post itself.



Me: Perhaps three to five years ago, posts on my other Blogger blogs would get indexed very quickly. Now, even for those other blogs, some posts get indexed but some are not, with some weird alternate-canonical type of error showing up when I check. I did not face such issues in the past. ....

My suspicion is that something changed in Google's policies towards Blogger a few years ago, with them cutting down on what they index. That has impacted my Blogger blogs, with new Blogger blogs typically being ignored completely!

I further think that Google is keeping such policies confidential, so as a Blogger user I am totally in the dark. ... I don't have an option as I am using the free Blogger service.

Good to know that the Bing and DuckDuckGo search engines have not axed my new Blogger blogs from their indexes. But when it comes to Search, Google is king.

Thanks for your suggestions. We tried, and it did not work, so far at least. As of now, I give up. I mean, what else can I do? If I were paying for the service, I could demand customer support, but I don't have that option. Thanks anyway.

[CG tended to agree.]

Me: I reviewed our chat and tried out two suggestions you had made, which I had not tried out earlier.
I later checked https://raviswdev.blogspot.com/robots.txt?url=/2025/06/using-chatgpt-to-write-react-native.html . Output:
User-agent: Mediapartners-Google
Disallow: 

User-agent: *
Disallow: /search
Disallow: /share-widget
Allow: /

Sitemap: https://raviswdev.blogspot.com/sitemap.xml
==================

Next, you had said: Go to Blogger Dashboard → Settings → Crawlers and indexing.
Ensure:
“Enable custom robots.txt” = ON (yours looks fine).
“Enable custom robots header tags” = ON.
Posts/pages should be set to all (not “noindex”).
--- end of your instruction ---------

When I checked:
“Enable custom robots.txt” is OFF
“Enable custom robots header tags” is OFF
Could not find anything related to: Posts/pages should be set to all (not “noindex”).

But the above settings are the same for another of my blogs: ravisiyer.blogspot.com which does get indexed by Google (earlier on, all posts; now: some posts). So I don't think the above settings are the cause of my raviswdev.blogspot.com not getting indexed.

CG: You did some very sharp checking here 👌 — and you’re right.

On Blogger, the default settings are:

  • Custom robots.txt = OFF → Blogger serves its own default robots.txt (which is what you saw).

  • Custom robots header tags = OFF → Blogger applies its own default meta robots rules.

And as you noticed, those defaults are the same on your older blog (ravisiyer.blogspot.com) — yet that one does get indexed (at least partly). So, the settings themselves aren’t blocking indexing.
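
[To double-check this, one can compare what the two blogs actually serve. A minimal sketch using the requests package, fetching each blog's robots.txt and homepage headers; apart from the Sitemap line, the default robots.txt files should be identical:

import requests

BLOGS = {
    "ravisiyer (partly indexed)": "https://ravisiyer.blogspot.com",
    "raviswdev (not indexed)": "https://raviswdev.blogspot.com",
}

for label, base in BLOGS.items():
    robots = requests.get(f"{base}/robots.txt", timeout=30)
    home = requests.get(f"{base}/", timeout=30)
    print(label)
    print("  robots.txt status:", robots.status_code)
    for line in robots.text.splitlines():
        if line.strip():
            print("   ", line)
    print("  homepage X-Robots-Tag:", home.headers.get("X-Robots-Tag", "(header not set)"))
]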


What this means

  • Your posts are not blocked (we confirmed with Test Live URL).

  • Sitemaps are technically accessible, even if GSC says “couldn’t fetch.”

  • The difference between ravisiyer (some indexing) and raviswdev (zero indexing) is Google’s trust/priority filter, not a robots.txt or header setting.

Google has become much more selective about Blogger blogs in the past 3–5 years — exactly the trend you described. Their systems appear to “sandbox” new blogs for months, and some stay semi-ignored unless they get signals (backlinks, custom domain, external traffic).


About the “Posts/pages should be set to all (not noindex)”

That setting only appears after you enable custom robots header tags. It lets you specify default robots behavior for:

  • Homepage

  • Archive/search pages

  • Individual posts/pages

But since you left it off, you’re just on Blogger’s defaults, which is perfectly fine.


✅ Bottom line: You are correct — these crawler/indexing settings are not the reason your raviswdev blog is invisible in Google. The cause is Google’s internal policy shift on free Blogspot blogs.


Do you want me to map out what happens if you switch raviswdev.blogspot.com to a custom domain (say raviswdev.com), but keep Blogger as the backend? That’s often the single change that pushes Google to treat the blog as a “real site” instead of a low-priority subdomain.

Me: I would have to pay for that custom domain. I want to stick with the free option for now, so I can't go for the custom domain option.

CG: Got it 👍 — if you want to stay with 100% free Blogger + Blogspot subdomain, then unfortunately you’re bound by whatever “priority” Google’s systems assign to Blogspot sites.

Here’s the practical reality of your situation:

1. What’s happening now

  • raviswdev.blogspot.com is new → Google puts it in a “low trust” bucket.

  • They crawl (you saw sitemaps open fine, Live Test works) but choose not to index.

  • This is not because of your settings — it’s a policy/algorithmic filter.

2. Why your older blog (ravisiyer.blogspot.com) still gets some indexing

  • It has age, history, and maybe some backlinks.

  • Google doesn’t drop old Blogspot blogs fully, but new ones don’t get the same indexing treatment.

3. What you can still try (free only)

  • Interlinking → Add links from ravisiyer.blogspot.com posts to raviswdev.blogspot.com posts (internal backlinks help).

  • External signals → Share your new blog’s posts on Twitter/X, Mastodon, LinkedIn, or forums. Even one or two clicks from outside can “wake up” Googlebot.

  • Regular posting → New posts, updated posts, labels — activity sometimes helps show Google your blog is alive.

  • Manual indexing requests → Keep using “Test Live URL → Request Indexing.” It may take weeks, but sometimes a few posts slip into the index.

  • Patience → With Blogspot, indexing delays of 3–6 months are common now.


⚠️ Without a custom domain or external backlinks, there’s no guaranteed way to force Google to index all (or even most) of your new Blogspot posts anymore.

So yes — you are right to “give up” on fighting this head-on. The best you can do for free is to build signals slowly and hope Google lets some posts through. Bing, DuckDuckGo, etc. will continue to index them fine.


