If you are a blogger, then you are probably aware of the “search robots” that crawl your website to detect new updates and report them to the search engines. Bloggers who are struggling to gain organic traffic will be glad to know that these robots have a direct influence on their website’s search visibility.
The only condition is that you must know how to use the custom robots header tags correctly. These settings shape how you communicate with the crawlers, and that communication has to be maintained properly to increase your search visibility. Doing this well requires some in-depth knowledge, so if you are curious, read ahead.
This article includes a step-by-step guide to setting up custom robots header tags in Blogger, along with the purpose of each tag. If you have a blog on Blogspot and want to increase your organic traffic but are unsure of the right way to do it, this post is just for you.
But before jumping into the setup guide for custom robots header tags in Blogger, let’s briefly cover the benefits of using them and introduce each specific tag.
Custom Robots Header Tags And Reason For Enabling These
Enabling this feature is essential for controlling how your Blogger blog is crawled and indexed, and it helps make your blog more SEO-friendly. However, even slight negligence can drag your website out of the search engines entirely, so being careful while implementing these tags is mandatory.
In the custom robots header tags option in your Blogger dashboard, you will see the following tags. Please read the details given with each tag carefully.
All – this is the default option and does not need to be enabled separately. The “all” tag means the robots face no restrictions; they can freely crawl and index your page.
Noindex – this tag prevents search engines from indexing your web page. It is used when you want to keep a specific page hidden from search results. Note that “noindex” only stops the page from being indexed; crawlers may still visit it.
Nofollow – by default, any external link you put in a post is “dofollow”, which means the crawlers will follow the URL you mention and consider it for search engine rankings. If you want to disable that, select the nofollow tag so that the robots do not follow any URL in your post.
None – this tag applies both noindex and nofollow at once. In short, if you neither want your page indexed nor want the outbound links on it followed by the crawlers, then the “none” robots tag is the most suitable option for you.
Noarchive – Google keeps a copy of your website on its servers, which appears as the “Cached” link on the search results page and is shown whenever your site’s server goes down. Using the “noarchive” robots tag disables this feature and removes your website’s “Cached” link from the search results.
Nosnippet – the text snippet is the meta description of your website; it summarizes what’s inside the web page and helps capture readers’ attention so they open the link. Leave this tag turned off if you want your readers to see a summary of your web page, because enabling the “nosnippet” header tag removes the snippet from the SERPs.
Noodp – this tag prevents search engines from replacing your meta description with your site’s listing from the Open Directory Project (DMOZ). Since DMOZ has shut down, this tag is now largely obsolete.
Notranslate – you may have noticed that search engines offer to translate certain web pages. If you don’t want a translation option offered for your site, use “notranslate” to disable the feature.
Noimageindex – this tag prevents the images in a post from being indexed. The post itself remains indexed in the SERPs, but the images on that specific page won’t show up in image search results. This tag helps keep your post images hidden from image thieves on the web.
Unavailable_after – this tag is used when you want a specific web page of your site removed from search results after a particular time. Here you also need to mention the date and time after which the page should be dropped from the SERPs.
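Behind the scenes, these settings end up as directives in a robots meta tag (or X-Robots-Tag header) on each page. As a minimal sketch of how the directives above combine into a single tag value, here is a small helper function (the function name `robots_meta` and the combination logic for “none” are our own illustration, not Blogger code):

```python
def robots_meta(*directives: str) -> str:
    """Build the content value of a <meta name="robots"> tag
    from directive names like "noindex", "nofollow", "noarchive".
    "none" is shorthand for both noindex and nofollow."""
    expanded = []
    for d in directives:
        if d == "none":
            expanded.extend(["noindex", "nofollow"])
        else:
            expanded.append(d)
    # de-duplicate while keeping order, then join with commas
    return ", ".join(dict.fromkeys(expanded))

print(robots_meta("noindex"))                 # noindex
print(robots_meta("none"))                    # noindex, nofollow
print(robots_meta("noarchive", "nosnippet"))  # noarchive, nosnippet
```

So ticking “none” in the Blogger dialogue has the same effect as ticking both “noindex” and “nofollow”.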
How To Set Up Custom Robots Header Tags For Blogger
Step 1 – Sign in to your Blogger account.
Step 2 – you will see a list of your blogs; select the blog whose robots tags you want to customize.
Step 3 – click “Settings” in the left sidebar, then scroll down until you find the “Crawlers and indexing” section.
Step 4 – in the Crawlers and indexing section, the first option you will see is “Enable custom robots.txt”. Beneath it is “Enable custom robots header tags”; turn on the switch to the right of this option.
Step 5 – turning it on highlights the page names listed below it; these are the pages whose robots tags you can modify: “Home page tags”, “Archive and search page tags”, and “Post and page tags”.
Step 6 – click any of these pages and a dialogue box will appear with a long list of robots tags for that page. Be careful while turning on any of the tags, and don’t forget to hit the Save button once you are done with your selection.
The procedure for enabling custom robots tags is the same for archive and search pages and for posts and pages. Set up the tags as per your preference and hit the Save button once you are done.
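After saving, you can verify your settings by viewing the source of a published page and checking its robots meta tag. As a small sketch using only the Python standard library (the class name and the sample HTML are our own illustration; Blogger’s actual markup may differ slightly):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the directives from any <meta name="robots"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            # split "noindex, nofollow" into individual directives
            self.directives += [d.strip() for d in a.get("content", "").split(",")]

# Sample page head, standing in for the source of your published blog page
sample = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
parser = RobotsMetaParser()
parser.feed(sample)
print(parser.directives)  # ['noindex', 'nofollow']
```

If the directives you ticked in the Blogger dialogue show up in the parsed list, the header tags were applied correctly.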
Caution – avoid setting up custom robots header tags if you are unsure about them. Our recommendation: before customizing these tags, read the details given above for each tag carefully, as even a slight mistake can remove your blog from the search engines.