
How To Stop Search Engine From Indexing Unnecessary Posts And Pages Of Blog

Search engine crawlers generally index all the posts and pages of a website, and it is an automated process. Whether or not you add an XML sitemap to 'Webmaster Tools', your blog will eventually be indexed by the crawlers; it just takes considerably longer. Adding an XML sitemap simply expedites the process and prompts the search engine to index your blog or website faster.
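If you are curious which URLs your sitemap actually hands to the crawlers, you can fetch and parse it yourself. Below is a minimal sketch in Python, assuming your blog is served at yourblog.blogspot.com (a placeholder; substitute your own address) and that Blogger exposes the sitemap at /sitemap.xml:

# Sketch: list the URLs advertised by a Blogger sitemap.
# The blog address below is a placeholder; replace it with your own.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://yourblog.blogspot.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)

# The file may be a sitemap index (pointing at sub-sitemaps) or a plain URL set;
# either way, each child element carries a <loc> entry we can print.
for element in tree.getroot():
    loc = element.find("sm:loc", NS)
    if loc is not None:
        print(loc.text)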

There may be certain posts and pages of a blog, such as 'privacy policy', 'terms & conditions', 'search', 'archive' or 'label' pages, that we do not want web crawlers to index.

But how do you do that? The following steps explain it in an easy manner.

1) To do this, you first need to enable 'custom robots.txt' in Blogger. Read: how to
enable custom robots.txt in Blogger.

2) Once you have done this, open the specific post or page in the 'Page Editor' from the Blogger Dashboard and click 'Custom Robots Tags' under 'Post/Page Settings', marked in red in the screenshot below:


[Screenshot: the 'Custom Robots Tags' option under 'Post/Page Settings']


3) Now, untick the 'default' and 'all' boxes and tick the 'noindex' and 'none' boxes ('none' is shorthand for 'noindex, nofollow'), as shown below and marked in red.

[Screenshot: 'noindex' and 'none' ticked under 'Custom Robots Tags']



4) Click 'Done'. Now this specific post/page of your blog will not be indexed by the web crawlers of search engines.
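To confirm that the directive is actually being served, you can fetch the post and look for it. The snippet below is only a rough sketch, and the post URL is a placeholder; Blogger may expose the directive either as a robots meta tag in the page's HTML or as an X-Robots-Tag HTTP header, so the check looks in both places.

# Sketch: check whether a page is served with a 'noindex' directive.
# The URL is a placeholder; point it at the post you just edited.
import re
import urllib.request

PAGE_URL = "https://yourblog.blogspot.com/2016/01/example-post.html"

request = urllib.request.Request(PAGE_URL, headers={"User-Agent": "Mozilla/5.0"})
with urllib.request.urlopen(request) as response:
    header_value = (response.headers.get("X-Robots-Tag") or "").lower()
    html = response.read().decode("utf-8", errors="replace")

# Look for 'noindex' (or 'none', which implies it) in the response header...
in_header = "noindex" in header_value or "none" in header_value
# ...and for a <meta> tag mentioning both 'robots' and 'noindex' in the HTML.
meta_tags = re.findall(r"<meta[^>]*>", html, re.IGNORECASE)
in_meta = any("robots" in tag.lower() and "noindex" in tag.lower() for tag in meta_tags)

print("noindex via X-Robots-Tag header:", in_header)
print("noindex via robots meta tag:", in_meta)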


HAPPY BLOGGING
