A website can serve many purposes, and not everything on it is meant to be shared with the world. There are situations when you want to hide a website.
Take a wedding website, for example: you want to share the URL only with the guests you are inviting. Keeping the website open to the whole world is like inviting the whole world to your wedding, which is not something most of us can afford.
Private party pages, membership sites, and the like fall into the same category: you want to hide them from search engines so that the information about the event or site cannot be indexed by search engine bots.
Google's spiders are the most advanced of the search engine bots. They can crawl and understand many kinds of content, whether it is in PDF, PPT, Flash, or other formats.
If you need to hide your website from Google, here are three effective ways to do it.
1. Block Search Engines From Your Website Using The noindex HTML Meta Tag
Meta tags are HTML tags that hold a web page's metadata for the benefit of search engine spiders. Search engine bots read this metadata to learn more about the page and its content.
Search engines like Google follow the noindex directive strictly. When a page is marked noindex, they remove it from their search results, so the page loses its Google rankings.
The HTML code for the noindex meta tag is as follows:
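The tag itself is a single line (this is the standard form of the robots meta tag):

```html
<meta name="robots" content="noindex">
```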
This code goes between the <head> and </head> tags of an HTML page. It tells every kind of crawler, addressed collectively as "robots", not to index the content of the page.
But if you want to block only Google from indexing your website, the tag changes slightly:
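With the generic robots name swapped for Google's own crawler name, the tag becomes:

```html
<meta name="googlebot" content="noindex">
```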
This version addresses only Google's web crawler, called "googlebot", telling it not to index the page where the code appears.
Fortunately, Google respects this tag and does not index pages or posts that carry the noindex meta tag.
If you are using WordPress, you do not need to add a noindex meta tag to every page. There is an easier way that takes just a few clicks.
Go to the WordPress admin dashboard, open Settings > Reading, and tick the "Discourage search engines from indexing this site" checkbox under Search Engine Visibility.
As you can see, with a single setting in the WordPress dashboard you can discourage all search engines from indexing the website.
2. Using Robots.txt File To Block Google
Adding a meta tag with the noindex value is the easiest way to deindex a website from search engines. But you will not always want to deindex the whole website.
When you want finer control over what gets indexed and what does not, turn to the robots.txt file in your website directory.
A robots.txt file is placed at the root of your website's files. It instructs robots how you want your website to be treated.
A robots.txt file looks like this:
User-agent: Googlebot
Disallow: /dontindexthispage/

User-agent: *
Allow: /
In the first instruction, we tell Googlebot not to crawl any page on your website whose URL falls under the /dontindexthispage/ path.
In the second instruction, we allow all other bots and spiders to crawl the whole website. No restrictions are placed on any page for other search engine spiders.
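If you want to sanity-check rules like these before uploading them, Python's standard library ships a robots.txt parser that interprets the file the same way well-behaved crawlers do. A small sketch using the example rules above (example.com is just a placeholder domain):

```python
import urllib.robotparser

# The example robots.txt rules from this section.
rules = """\
User-agent: Googlebot
Disallow: /dontindexthispage/

User-agent: *
Allow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot is blocked from the disallowed path...
print(rp.can_fetch("Googlebot", "https://example.com/dontindexthispage/"))  # False
# ...but may still crawl the rest of the site.
print(rp.can_fetch("Googlebot", "https://example.com/about/"))  # True
# Other crawlers are not restricted at all.
print(rp.can_fetch("Bingbot", "https://example.com/dontindexthispage/"))  # True
```

Note that a Disallow rule stops crawling, not necessarily indexing; for a guaranteed noindex, combine it with the meta tag from the previous section.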
As you can see, the robots.txt file gives you more control over the indexation of your website and its different parts. You can write your own set of rules here.
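For example, if you want to hide the entire website from every crawler that respects robots.txt, two lines are enough:

```
User-agent: *
Disallow: /
```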
If you are still not sure how to write the right rules for your robots.txt file, do not worry. Here is a free, easy-to-use online robots.txt generator: https://en.ryte.com/free-tools/robots-txt-generator/#custom
Generate the instructions with this robots.txt generator and add them to your website's robots.txt file. The tool lets you block many different kinds of search engine bots.
As you can see, many different search engine crawlers are constantly spidering websites for different purposes. You can block all of them or only selected ones, depending on your website's purpose.
3. Password Protecting The Website Pages To Block Search Engines
Blocking a website using meta tags and robots.txt rules is the most commonly used way to keep it out of search engines. But that does not mean it is the most effective way.
Sometimes search engines do not respect the instructions in a robots.txt file or a noindex meta tag. There can be many reasons for that; for instance, Google can still index a page blocked by robots.txt if other sites link to it, showing the bare URL in its results.
Since search engine bots are just computer code running without a human brain, errors are bound to happen at times.
To protect yourself from these errors, you need a bulletproof way to make it impossible for search spiders to index your website. That is what password protection gives you.
Password protecting a website works the same way your password protects your email. Have you ever seen one of your emails indexed in Google? Never, right?
Password protecting a website is the bulletproof way to keep it out of search engine results. Anyone who wants to see the content of the website needs to enter the password, and Google's bots do not have it.
Password protection can cover the whole website or only parts of it. Most membership websites lock their content behind a password.
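If your site runs on an Apache web server, a minimal sketch of password protection uses HTTP Basic Authentication via an .htaccess file (the file path and realm name below are placeholders, not something specific to your setup):

```
# .htaccess in the directory you want to protect
AuthType Basic
AuthName "Private Area"
AuthUserFile /home/user/.htpasswd
Require valid-user
```

The .htpasswd file holding the usernames and passwords can be created with Apache's htpasswd command-line tool, e.g. `htpasswd -c /home/user/.htpasswd yourname`. Other web servers such as nginx offer equivalent features.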
If you use WordPress as your website's CMS, there are many plugins that can password protect your website. Check out the list of plugins for restricting content on a WordPress website here: https://wordpress.org/plugins/tags/restrict-content/
Depending on why you want to block search engines like Google from indexing your website, you can use any of the three methods discussed above.
Also, depending on how sensitive the content on the website is, you can choose either the easiest or the most robust way to block it from search engines.
Whatever the reason, these three effective methods will help you block your website from search engines and keep your content from being exposed on the internet.
If you liked this post, or have other methods you would like to share with us, do comment below.