I have posted around 60 articles on my keepmequiet-com blog, but Google is not indexing all of them. Only 47 posts are indexed, and the rest of the results are search pages, which I don't want indexed in Google. So I previously added robots.txt rules and parameter settings to block them all from being indexed. But those search pages still appear, showing "A description for this result is not available because of this site's robots.txt". So why are they indexed?
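For context, the behavior described above is expected: a robots.txt Disallow rule only stops Google from crawling a page, not from indexing its URL. A sketch of the kind of rule involved (the /search path is an assumption; it is where Blogger's search and label pages typically live):

```
# Hypothetical robots.txt rule of the kind described above.
# Disallow only stops Googlebot from *crawling* these URLs.
# If other pages link to them, Google can still index the bare
# URL without content, which produces exactly the
# "A description for this result is not available" snippet.
User-agent: *
Disallow: /search
```

Because the page is never fetched, Google also never sees any noindex directive on it, which is why a header-level or meta-level noindex (as in the answer below this would normally accompany) is the reliable fix.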
You need to noindex them with X-Robots-Tag headers. Do this:
Go to Blogger > Settings > Search preferences > enable "Custom robots header tags" > tick the "noindex" checkbox under "Archive and search pages".
It's that simple. Then resubmit your sitemap. It can take anywhere from around 24 hours to a week, depending on your blog's crawl rate.
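Once the setting is live, you can check it from the command line. A minimal sketch, assuming your blog is at a placeholder address like yourblog.blogspot.com (substitute your own URL):

```shell
# Fetch only the response headers (-I) for a search page, quietly (-s).
# After enabling "Custom robots header tags" with noindex on search
# pages, the response should include an X-Robots-Tag header
# containing "noindex".
curl -sI "https://yourblog.blogspot.com/search?q=test" | grep -i "x-robots-tag"
```

If the grep prints nothing, the header is not being sent yet and the setting may not have taken effect.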
Thanks for the reply and for mentioning this tip. I have already implemented it. But since, as you said, this depends on crawl rate, how can I increase it?