This blog is indexed by Bing and DuckDuckGo but not by Google
Why Bing Indexed the Blog but Google Didn’t
Different Crawling Policies
- Bing is often quicker to index new or smaller sites.
- Google can be more conservative, especially with Blogger subdomains (*.blogspot.com) because there are many spammy ones.
Domain Property in GSC
- Adding your site to Google Search Console (GSC) is a good step, but it doesn’t guarantee crawling/indexing.
- Google still needs to “want” to crawl your pages.
Possible Issues Specific to Google
- Crawl budget / prioritization: Low-priority blogs sometimes take months to get picked up.
- Technical issues: Robots.txt, meta tags, canonical issues (though Blogger defaults usually don’t block Googlebot).
- Content / trust factors: A new blog with lots of posts in a short time might look “thin” or “low quality” to Google until it establishes trust.
What You Can Try
1. Check Coverage Report in GSC
- Go to Google Search Console → Indexing → Pages.
- See if Google shows errors like “Crawled – currently not indexed” or “Discovered – currently not indexed.”
- That will tell you if Google is seeing your pages but choosing not to index them.
2. Inspect URL Tool in GSC
- Pick one post URL (like https://raviswdev.blogspot.com/2025/09/...).
- Use the URL Inspection Tool.
- Click Request Indexing.
- If Googlebot fetches it successfully, indexing usually happens in days/weeks. If it says “URL is not on Google” with crawl errors, that’s the clue. (A scripted way to read this status via the URL Inspection API is sketched below.)
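If you prefer to read that status from a script instead of the GSC UI, the Search Console URL Inspection API returns the same information. Below is a minimal Python sketch: the access token is a placeholder you would obtain via OAuth 2.0 with the Search Console (webmasters.readonly) scope, the property string must match exactly how the site was added in GSC, and the response field names follow the public API documentation as I understand it, so verify them against your own account:

import json
import urllib.request

# Hedged sketch: query the Search Console URL Inspection API for one post's
# index status. ACCESS_TOKEN is a placeholder; a real token needs the
# https://www.googleapis.com/auth/webmasters.readonly scope.
ACCESS_TOKEN = "ya29.PLACEHOLDER"
SITE_URL = "https://raviswdev.blogspot.com/"  # must match the GSC property ("sc-domain:..." for a domain property)
POST_URL = "https://raviswdev.blogspot.com/2025/06/using-chatgpt-to-write-react-native.html"  # any post URL from the blog

req = urllib.request.Request(
    "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect",
    data=json.dumps({"inspectionUrl": POST_URL, "siteUrl": SITE_URL}).encode(),
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}", "Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req, timeout=30) as resp:
    result = json.loads(resp.read())

index_status = result["inspectionResult"]["indexStatusResult"]
print("Verdict:        ", index_status.get("verdict"))
print("Coverage state: ", index_status.get("coverageState"))  # e.g. "Crawled - currently not indexed"
print("Last crawl time:", index_status.get("lastCrawlTime"))

Note that this only reads status; there is no general-purpose API for “Request Indexing” on ordinary blog posts, so that step stays manual in the GSC UI.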
3. Submit Sitemap
- Blogger auto-generates a sitemap (and a feed that can also be submitted):
  https://raviswdev.blogspot.com/sitemap.xml
  https://raviswdev.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
- Go to GSC → Sitemaps → add these.
- This often speeds up discovery. (A quick way to verify the sitemap yourself is sketched below.)
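If you want to verify the sitemap yourself before or after submitting it, here is a minimal Python sketch (standard library only) that fetches the sitemap URL above and lists the URLs it declares; Blogger sitemaps can be either a urlset of posts or a sitemap index of child sitemaps, and the <loc> element carries the URL in both cases:

import urllib.request
import xml.etree.ElementTree as ET

# Minimal sketch: fetch a Blogger sitemap and list the URLs it declares.
SITEMAP_URL = "https://raviswdev.blogspot.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def list_sitemap_urls(url):
    with urllib.request.urlopen(url, timeout=30) as resp:
        tree = ET.fromstring(resp.read())
    return [loc.text.strip() for loc in tree.findall(".//sm:loc", NS) if loc.text]

urls = list_sitemap_urls(SITEMAP_URL)
print(f"{len(urls)} URLs declared in {SITEMAP_URL}")
for u in urls[:10]:
    print(" ", u)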
4. Check Robots.txt
- Visit: https://raviswdev.blogspot.com/robots.txt
- Make sure it doesn’t block / or your post paths.
- The default Blogger robots.txt usually allows Googlebot. (A small script to test a post path against robots.txt is sketched below.)
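To go beyond eyeballing the file, you can test a concrete post URL against the robots.txt rules with Python’s standard-library robotparser; the post URL below is just one of your own posts used as an example:

from urllib import robotparser

# Minimal sketch: ask robots.txt whether Googlebot may fetch a given post.
ROBOTS_URL = "https://raviswdev.blogspot.com/robots.txt"
POST_URL = "https://raviswdev.blogspot.com/2025/06/using-chatgpt-to-write-react-native.html"

rp = robotparser.RobotFileParser(ROBOTS_URL)
rp.read()  # downloads and parses robots.txt

for agent in ("Googlebot", "*"):
    verdict = "allowed" if rp.can_fetch(agent, POST_URL) else "blocked"
    print(f"{agent}: {verdict} for {POST_URL}")

With the default Blogger robots.txt (shown later in this exchange), post URLs should come back as allowed, and only /search and /share-widget paths as blocked.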
5. Build Some External Links
- Share your blog posts on social media, forums, or even link to them from another site you control.
- Google is much more likely to crawl if it finds inbound links.
6. Patience (but not blind patience)
- Some new blogs take 6–12 months before Google starts indexing, especially if they’re on Blogspot.
- If you’ve crossed 8+ months with zero indexing, it’s a strong signal something is off (robots.txt, sitemap not submitted, or Google applying quality filters).
Possible Cause in Your Case
- Since Bing indexed you quickly, and you already added the domain in GSC months ago, the main issue is likely Google not prioritizing Blogspot subdomain content until it gains some trust signals (backlinks, engagement, sitemap submission, etc.).
- It’s unlikely to be a robots.txt block, unless you changed Blogger settings.
✅ My suggestion for next steps:
- Check GSC coverage report + URL Inspection right away.
- Submit sitemap.xml manually.
- Pick 3–5 key blog posts and do “Request Indexing.”
- Try to get at least one external backlink (even from your old blog, GitHub, or social media profile).
That should trigger Googlebot to start crawling.
Me: I had requested indexing for some posts many months back. When I check the status it shows "Crawled - currently not indexed". I have LinkedIn posts that refer to my posts on this Blogger blog. These LinkedIn posts usually get indexed by Google. .... Note that I have four or five more Blogger blogs. I suspect that having more than two or three Blogger blogs is tripping up the later Blogger blogs I added, when it comes to Google search indexing. I will now explore your suggestions of 3. Sitemap and 4. Robots.txt
...
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Disallow: /share-widget
Allow: /
Sitemap: https://raviswdev.blogspot.com/sitemap.xml
- No crawling block on your posts.
- Sitemap is correctly linked.
- So robots.txt is not the cause of Google’s non-indexing.
Bottom line
- 🚫 “Sitemap could not be read” in GSC ≠ your blog is blocked.
- ✅ Since your posts are already showing as “Crawled – currently not indexed,” Googlebot is definitely reaching them.
- 🔑 The main issue is indexing trust/quality, not sitemap fetch failure.
The sitemap itself (https://raviswdev.blogspot.com/sitemap.xml) is being served with an X-Robots-Tag: noindex HTTP header, which tells Google: “do not index this URL.”
That’s actually normal for Blogger sitemaps — Google does not index the sitemap page itself. Instead, it uses the sitemap as a discovery feed. The “noindex” header doesn’t apply to your actual posts; it only applies to the sitemap document.
👉 But the real issue is that your posts themselves are also stuck at “Crawled – currently not indexed.” That means Googlebot can fetch them, but it’s choosing not to add them to the index.
What this means in practice:
- “Sitemap couldn’t fetch” in GSC is often misleading for Blogger blogs. The sitemap is accessible, but GSC sometimes fails to process it properly.
- The bigger concern is why Google isn’t indexing any of your 200+ posts, even though Bing is. (A quick way to confirm none of them is technically blocked is sketched below.)
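One free way to rule out a hidden technical block across all of those posts at once is to walk the sitemap and record each URL’s HTTP status plus any X-Robots-Tag header. A minimal sketch, assuming the sitemap URL above and using only the Python standard library (it follows a sitemap index one level down to the child sitemaps):

import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://raviswdev.blogspot.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def fetch_locs(url):
    # Returns (is_sitemap_index, list of <loc> values).
    with urllib.request.urlopen(url, timeout=30) as resp:
        tree = ET.fromstring(resp.read())
    locs = [loc.text.strip() for loc in tree.findall(".//sm:loc", NS) if loc.text]
    return tree.tag.endswith("sitemapindex"), locs

is_index, locs = fetch_locs(SITEMAP_URL)
post_urls = []
if is_index:
    for child in locs:  # follow child sitemaps one level down
        post_urls.extend(fetch_locs(child)[1])
else:
    post_urls = locs

for url in post_urls:
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req, timeout=30) as resp:
        print(resp.status, resp.headers.get("X-Robots-Tag", "-"), url)

A 200 status with no noindex X-Robots-Tag on the posts themselves would confirm that the holdup is Google’s choice, not a technical block.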
Next steps I’d suggest:
1. Check one actual blog post in URL Inspection (Test Live URL)
   - Example: https://raviswdev.blogspot.com/2024/07/some-post-title.html
   - See if it shows “Indexing allowed” or whether it is also blocked by some noindex.
2. Confirm Search Preferences in Blogger
   - Go to Blogger Dashboard → Settings → Crawlers and indexing.
   - Ensure:
     - “Enable custom robots.txt” = ON (yours looks fine).
     - “Enable custom robots header tags” = ON.
     - Posts/pages should be set to all (not “noindex”).
3. Check robots.txt manually for a sample post
   - Fetch https://raviswdev.blogspot.com/robots.txt and make sure the post’s path (e.g. /2024/07/some-post-title.html) isn’t disallowed by any rule; the robotparser sketch earlier in this exchange can automate that check.
⚠️ Key point:
If your posts themselves are returning “noindex” in the HTTP header, that’s the real blocker. If only the sitemap has “noindex” (which is expected), then the problem lies elsewhere — possibly low trust / duplicate content filters.
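A quick way to check exactly that is to print the X-Robots-Tag response header and look for a robots meta tag in the HTML, for both the sitemap and one of your posts. A minimal sketch (standard-library Python; the post URL is just one of your own posts used as an example):

import re
import urllib.request

# Minimal sketch: check a URL for noindex signals in the HTTP header
# and in an HTML <meta name="robots"> tag.
URLS = [
    "https://raviswdev.blogspot.com/sitemap.xml",
    "https://raviswdev.blogspot.com/2025/06/using-chatgpt-to-write-react-native.html",
]

for url in URLS:
    with urllib.request.urlopen(url, timeout=30) as resp:
        header = resp.headers.get("X-Robots-Tag", "(none)")
        body = resp.read(200_000).decode("utf-8", errors="replace")
    meta = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*>', body, re.IGNORECASE)
    print(url)
    print("  X-Robots-Tag:", header)
    print("  robots meta: ", meta.group(0) if meta else "(none found)")

If the post shows noindex in either place, that is the real blocker; if only the sitemap does, the problem lies elsewhere, as noted above.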
Me: https://raviswdev.blogspot.com/2025/06/using-chatgpt-to-write-react-native.html - Jun 6, 2025 post. Test Live URL worked. ... As in my past such attempts, I clicked on Request Indexing which got done successfully.
CG: Perfect 👍 — that’s actually very good news.
Since your actual post URL (the June 6, 2025 article) passed Test Live URL without the noindex issue, that means:
- ✅ Your posts are indexable (no robots.txt block, no noindex header).
- ✅ GSC allowed you to request indexing, so Googlebot can fetch it.
- ❌ The fact that it’s still not showing in Google Search means this is a “Crawled – currently not indexed” case — i.e., Google is deliberately holding back indexing.
Why Google does this
- Quality/trust filters: New blogs (especially on Blogger, with many posts in a short time) sometimes get held back until Google trusts the site more.
- Multiple blogs: As you suspected, having 4–5 blogs on the same Google account/IP can make Google slower to index later ones (not an official rule, but many bloggers report this pattern).
- Thin/duplicate content suspicion: If Google thinks content overlaps with other sites (even your own LinkedIn posts quoting your blog), it may prefer indexing the LinkedIn version.
What you can do now
- Keep requesting indexing for your highest-value posts (like the React Native one). Do this gradually — maybe 1–2 per day.
- Get backlinks from outside LinkedIn. Even a single link from another site/blog that Google already trusts can “unlock” indexing.
  - Example: GitHub Pages, Medium, or a personal site with a custom domain.
- Strengthen E-E-A-T signals: Add an About page, Contact page, and link back to your main profile (book, GitHub, LinkedIn). Google treats such signals as indicators of a real author.
- Be patient: Some Blogger blogs take 6–12 months before Google fully indexes them. Bing is much faster, but Google is more selective now.
⚡ Suggestion: Since your LinkedIn posts referencing your blog are indexed, you can try this trick:
- Write a new LinkedIn post that links only to one blog article you want indexed.
- Request indexing for that blog article in GSC.
- Sometimes, Google follows the link from LinkedIn and finally decides to add the blog post itself.
Me: Perhaps three or maybe five years ago, I would see that my posts on my other Blogger blogs would get indexed very quickly. Now even for my other Blogger blogs, some posts get indexed but some are not indexed with Blogger giving some weird alternate canonical something error when I would check. I would not face such issues in the past. ....
My suspicion is that something changed in Google’s indexing policies for Blogger a few years ago, with them cutting down on the stuff they index. That has impacted my Blogger blogs, with new Blogger blogs being typically ignored completely!
I further think that Google is keeping such policies confidential. So as a Blogger user, I am totally in the dark. ... I don't have an option as I am using the free Blogger service.
Good to know that Bing and DuckDuckGo search engines have not axed my new Blogger blogs from their indexing. But when it comes to Search, Google is king.
Thanks for your suggestions. We tried - it did not work, so far at least. As of now, I give up. I mean, what else can I do? If (I was) paying for the service, I could demand customer support. But I don't have that option. Thanks anyway.
CG: On Blogger, the default settings are:
- Custom robots.txt = OFF → Blogger serves its own default robots.txt (which is what you saw).
- Custom robots header tags = OFF → Blogger applies its own default meta robots rules.
And as you noticed, those defaults are the same on your older blog (ravisiyer.blogspot.com) — yet that one does get indexed (at least partly). So, the settings themselves aren’t blocking indexing.
What this means
- Your posts are not blocked (we confirmed with Test Live URL).
- Sitemaps are technically accessible, even if GSC says “couldn’t fetch.”
- The difference between ravisiyer (some indexing) and raviswdev (zero indexing) is Google’s trust/priority filter, not a robots.txt or header setting.
Google has become much more selective about Blogger blogs in the past 3–5 years — exactly the trend you described. Their systems appear to “sandbox” new blogs for months, and some stay semi-ignored unless they get signals (backlinks, custom domain, external traffic).
About the “Posts/pages should be set to all (not noindex)” setting
That setting only appears after you enable custom robots header tags. It lets you specify default robots behavior for:
- Homepage
- Archive/search pages
- Individual posts/pages
But since you left it off, you’re just on Blogger’s defaults, which is perfectly fine.
✅ Bottom line: You are correct — these crawler/indexing settings are not the reason your raviswdev blog is invisible in Google. The cause is Google’s internal policy shift on free Blogspot blogs.
Do you want me to map out what happens if you switch raviswdev.blogspot.com to a custom domain (say raviswdev.com), but keep Blogger as the backend? That’s often the single change that pushes Google to treat the blog as a “real site” instead of a low-priority subdomain.
Me: I will have to pay for that custom domain. I want to use the free option now and so I can't go for the custom domain option.
CG: Got it 👍 — if you want to stay with 100% free Blogger + Blogspot subdomain, then unfortunately you’re bound by whatever “priority” Google’s systems assign to Blogspot sites.
Here’s the practical reality of your situation:
1. What’s happening now
- raviswdev.blogspot.com is new → Google puts it in a “low trust” bucket.
- They crawl (you saw sitemaps open fine, Live Test works) but choose not to index.
- This is not because of your settings — it’s a policy/algorithmic filter.
2. Why your older blog (ravisiyer.blogspot.com) still gets some indexing
- It has age, history, and maybe some backlinks.
- Google doesn’t drop old Blogspot blogs fully, but new ones don’t get the same indexing treatment.
3. What you can still try (free only)
- Interlinking → Add links from ravisiyer.blogspot.com posts to raviswdev.blogspot.com posts (internal backlinks help).
- External signals → Share your new blog’s posts on Twitter/X, Mastodon, LinkedIn, or forums. Even one or two clicks from outside can “wake up” Googlebot.
- Regular posting → New posts, updated posts, labels — activity sometimes helps show Google your blog is alive.
- Manual indexing requests → Keep using “Test Live URL → Request Indexing.” It may take weeks, but sometimes a few posts slip into the index.
- Patience → With Blogspot, indexing delays of 3–6 months are common now.
⚠️ Without a custom domain or external backlinks, there’s no guaranteed way to force Google to index all (or even most) of your new Blogspot posts anymore.
So yes — you are right to “give up” on fighting this head-on. The best you can do for free is to build signals slowly and hope Google lets some posts through. Bing, DuckDuckGo, etc. will continue to index them fine.