Some resource owners mistakenly believe that indexing happens by "personal invitation" that must be addressed to the bots, i.e. by sending them a link to the resource. In fact, search spiders constantly examine the Internet, so a site that is not closed from indexing will sooner or later be assessed.
It's good if, by that point, the resource meets the requirements of search engines. But what if it doesn't? Then, instead of optimization, you can get the opposite effect, called "pessimization", which will later require effort to restore the site's reputation.
There are typical situations in which a site should remain closed from indexing:
- The resource has been created recently, and work is underway to fill it with content, change the interface, etc. It is better to open the resource to the analytical work of spiders after it has been fully finished and configured.
- There is a duplicate resource that lets webmasters test innovations and possible changes under real conditions. This copy should be made invisible to robots; otherwise they may see duplicate content, which entails extremely unpleasant consequences. If the site is a testing platform for developing scripts, templates, etc., it should not be indexed either.
- If a web resource is created and developed directly on its hosting, it is available at any time. This lets you work on it whenever it is convenient, implement emerging ideas immediately, and test developments. Only the finished resource should be indexed and added to the search engines' database; a minimal robots.txt sketch for keeping the site closed in the meantime follows this list.
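The standard mechanism for this is a robots.txt file in the site root. Here is a minimal sketch, assuming a typical site under development (the domain is a placeholder); keep in mind that robots.txt is a request that well-behaved crawlers honor, not access control:

```
# robots.txt — placed in the site root, e.g. https://example.com/robots.txt
# Ask all crawlers to stay out of the entire site while it is being built.
User-agent: *
Disallow: /
```

For a test copy that must never leak into the index, many webmasters additionally protect it with HTTP authentication, since a password actually blocks access while robots.txt merely asks.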
First of all, it is necessary to hide service information intended only for the owner and the webmasters working on the site. Duplicate pages containing non-unique content should also be blocked from indexing.
For example, online stores with catalogs that run to hundreds of pages often suffer from the consequences of this. Originality will also be low on pages with users' personal information, which should likewise be prohibited from indexing; this is done for ethical reasons as well. Owners rightly hide pages with shopping carts and customer order forms from search crawlers: they only make sense to a specific person, and there is no need for them to appear in search results. Spiders regard such information as "garbage" and will be grateful if the owner relieves them of the need to index it.
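As an illustration, the path names below are assumptions for a generic store, but this is how such pages are typically closed in robots.txt:

```
# robots.txt — the path names are hypothetical, for a generic online store
User-agent: *
# Pages that only make sense to a specific visitor
Disallow: /cart/
Disallow: /checkout/
Disallow: /account/
# Service area for the owner and webmasters
Disallow: /admin/
```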
So, you should hide from bots:
- service information;
- various “garbage”;
- duplicate content.
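Besides robots.txt, individual pages can be excluded with a robots meta tag in the page's head; this is a generic snippet, not tied to any particular search engine:

```
<!-- In the <head> of any page that should stay out of the index -->
<meta name="robots" content="noindex, nofollow">
```

Note that a crawler can only obey this tag if it is allowed to fetch the page, so the same URL should not simultaneously be blocked in robots.txt.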
Google is the most severe in the latter case and can impose sanctions on a resource for a large percentage of duplication. Such actions by the search engine, to put it mildly, do not have the best effect on the further promotion of the site.
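When duplicates cannot simply be removed, for example catalog pages reachable under several URLs, a common remedy is a rel="canonical" link that tells search engines which version to treat as the original; the URL below is a placeholder:

```
<!-- On each duplicate page, point to the preferred version -->
<link rel="canonical" href="https://example.com/catalog/item-123/">
```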
Duplication of around 40% is considered the critical level at which it is important to take care of hiding the page from robots.