Have you observed a sudden drop in your Blogger sitemap index data inside your Webmaster Tools account? Do you see a major drop in your submitted URLs compared to the indexed count? If yes, there is nothing to worry about: just this week Blogger updated the total number of items in its Atom and RSS feeds. The maximum number of blog post entries in these XML files was previously 500 links, but after this latest update your blogspot XML feed is broken down into smaller groups containing a maximum of 151 links per page. This means that Webmaster Tools will now show a count of 151 in the Submitted column for URLs in the web index, where it previously showed 501. The default feed count remains the same, i.e. 26 links in both the RSS and Atom feeds.
This surely looks like a much better approach for faster indexing of your blog posts, and it will also help Google Webmaster Tools organize your sitemap pages into smaller parts that are easier to crawl and index, thus saving processing time. The same update applies to the Blogger sitemap.xml file, where each sub-page now consists of only 150 links; previously 500 URLs were grouped per page.
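Blogger's Atom feed can be paged with the `start-index` and `max-results` query parameters. The sketch below is a minimal illustration of how the paginated feed URLs line up, assuming pages of 150 entries; the blog address is a placeholder, not a real site:

```python
import math

def feed_page_urls(blog_url, total_posts, page_size=150):
    """Build paginated Atom feed URLs for a Blogger blog.

    Uses Blogger's start-index/max-results feed parameters;
    page_size=150 mirrors the per-page count described above.
    """
    pages = math.ceil(total_posts / page_size)
    return [
        f"{blog_url}/atom.xml?redirect=false&start-index={i * page_size + 1}"
        f"&max-results={page_size}"
        for i in range(pages)
    ]

# "yourblog" is a placeholder address
urls = feed_page_urls("http://yourblog.blogspot.com", 1435)
print(len(urls))  # 10 feed pages for ~1435 posts
print(urls[0])
```

Each page picks up where the previous one left off: the first page starts at index 1, the second at 151, and so on.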
How to Fix the Difference Between Submitted and Indexed Data?
You are seeing this difference in sitemap data inside your Webmaster Tools (Search Console) account because you submitted multiple custom sitemaps to Google, either using our Sitemap Generator tool or using the custom sitemap submission method based on Atom feeds. These sitemaps are located at atom.xml. The screenshot below shows this difference:
In the screenshot above you can observe that the submitted URLs are 151 while the indexed URLs are 496. The indexed count is greater than the submitted count because Webmaster Tools is still showing the old cached index count for the Blogger sitemap, from when each sitemap contained a maximum of 500 links. The indexed data is thus an old cached count, while the submitted data shows the refreshed link count after Blogger updated its sitemaps.
Blogger introduced dynamic sitemaps for all blogspot blogs in January 2015. Now you can access all your sitemap data from a single sitemap.xml file, as shared below:
Your sitemap.xml file is hosted at your blog's root, i.e. http://yourblogname.blogspot.com/sitemap.xml
Our Blog Parent sitemap is hosted under this link:
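To see how many sub-pages a parent sitemap contains, you can parse the sitemap index XML with Python's standard library. This is a minimal sketch run against a small inline sample; the blog address and `?page=N` URLs are placeholders for illustration:

```python
import xml.etree.ElementTree as ET

# Inline sample of a sitemap index in the standard sitemaps.org
# format; the blog address is a placeholder, not a real site.
SAMPLE_INDEX = """<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://yourblog.blogspot.com/sitemap.xml?page=1</loc></sitemap>
  <sitemap><loc>http://yourblog.blogspot.com/sitemap.xml?page=2</loc></sitemap>
  <sitemap><loc>http://yourblog.blogspot.com/sitemap.xml?page=3</loc></sitemap>
</sitemapindex>"""

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
root = ET.fromstring(SAMPLE_INDEX)
sub_sitemaps = [el.findtext(f"{NS}loc") for el in root.findall(f"{NS}sitemap")]
print(len(sub_sitemaps))  # 3 sub-pages in this sample
```

For a live blog you would fetch the real sitemap.xml instead of the inline sample; the counting logic stays the same.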
All you need to do is delete the custom atom.xml sitemaps and instead submit sitemap.xml as the only sitemap inside your Webmaster Tools account. Follow these steps:
1. Go to Webmaster Tools > Crawl > Sitemaps
2. Delete all your old sitemaps that use the atom.xml feed structure
3. Next submit the dynamic sitemap.xml as the only sitemap
4. That's it!
This single parent sitemap will automatically update its sub-pages and submit your new posts to the Google index. If you click the sitemap link you can easily see how many sub-pages the sitemap contains. This is a better, standard way to create sitemaps, widely followed for WordPress blogs as well. The parent sitemap is divided into sub-pages, each containing up to 150 links. Since we have published around 1435 posts so far, our sitemap consists of 10 pages of up to 150 links each. It is basic math: 10 x 150 = 1500, which is enough capacity to cover all 1435 posts.
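The page-count arithmetic above can be checked with a couple of lines; this is just illustrative math, not a Blogger API call:

```python
import math

total_posts = 1435    # approximate post count mentioned above
links_per_page = 150  # links per sitemap sub-page

pages = math.ceil(total_posts / links_per_page)
print(pages)                   # 10 sub-pages
print(pages * links_per_page)  # 1500 links of total capacity
```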
It takes around 24 hours for all links to get indexed.
Already submitted sitemap.xml?
If you have already submitted the dynamic sitemap months ago but still see a difference in your submitted/indexed data then you just need to resubmit the sitemap so that Google may fetch all newly submitted links.
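Besides resubmitting from the Webmaster Tools interface, a sitemap can also be pinged to Google directly via its sitemap ping endpoint. The sketch below only builds the ping URL rather than sending the request; the blog address is a placeholder:

```python
from urllib.parse import urlencode

sitemap_url = "http://yourblog.blogspot.com/sitemap.xml"  # placeholder address
ping_url = "http://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})
print(ping_url)
```

Opening the resulting URL in a browser (or fetching it with any HTTP client) asks Google to re-fetch the sitemap.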
After resubmitting our sitemap the index data got back to normal:
I hope this answers all your questions and helps you troubleshoot this technical SEO problem you were facing recently. Let me know if I can be of any further assistance. Feel free to post your questions in the comment box below. Wishing you a great weekend, buddies. Peace and blessings! =)
This article was requested by our reader Basit Khan. I personally thank him for bringing this problem to our notice.