Increase the speed of site indexing
Most site visitors arrive through search engines. For a resource to appear in search results, it must first be indexed: search engines crawl each section of the site and add it to their database. Without this, the site simply will not show up in search results, because the search engine does not know it exists, and no new visitors will come from that channel.
It often happens that too many unnecessary pages are open for indexing – service pages, duplicates, and so on. This not only slows indexing considerably but also hurts the site's ranking: the search engine sees a lot of useless content on the site and does not push it toward the top of the results. Fixing the situation is not difficult – it is enough to hide the useless pages. This can be done in two ways: by forbidding indexing, or by merging the pages.
Method one – forbidding indexing
There are two ways to do this:
- Through the Disallow directive in the robots.txt file. The User-agent directive lets you target a particular search engine, or all of them at once, and forbid crawling of specific pages.
- Through noindex. To do this, add a robots meta tag to the page and set noindex in its content attribute.
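As a minimal sketch, the two options above might look like this (the bot name and the /admin/ and /duplicates/ paths are hypothetical examples, not taken from any real site):

```text
# robots.txt – forbid all crawlers from a service folder
User-agent: *
Disallow: /admin/

# forbid only Googlebot from a folder with duplicate pages
User-agent: Googlebot
Disallow: /duplicates/
```

```html
<!-- robots meta tag in the page <head>: exclude this page from the index -->
<meta name="robots" content="noindex">
```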
The Disallow directive helps to avoid wasting the crawl budget, unlike noindex: with noindex, pages are still downloaded, and only then does the robot learn that they should not be indexed. So to save the crawl budget, it is better to prefer the first option. The second is useful when the blocked pages contain links to other sections of the site that should be indexed: the robots meta tag also accepts a value that lets crawlers follow links on a page that is itself excluded from the index. Therefore, if there are such internal links, it is advisable to choose the second option.
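For the case described above – a page that should stay out of the index while its links remain crawlable – the standard combined value of the robots meta tag can be used:

```html
<!-- keep the page out of the index, but let crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```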
Merging pages – how it works and what it affects
When pages are merged, non-text signals from the attached page are transferred to the target page. If you instead forbid indexing by one of the methods above, that value is lost. So if the pages you want to hide carry useful signals that positively affect ranking, merging is the better choice. Three methods are used for this:
- If the pages to be merged have the same content, use a 301 redirect. It tells the search engine that the page has moved permanently.
- Use the Clean-param directive in the robots.txt file. This method is intended for URLs with dynamic query parameters.
- Use the rel attribute of the <link> element with the value "canonical".
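Hedged sketches of the three methods above, assuming an Apache server and illustrative URLs and parameter names (example.com, /old-page/, ref, sort are placeholders):

```apache
# .htaccess (Apache): 301 redirect from a duplicate page to the main one
Redirect 301 /old-page/ https://example.com/new-page/
```

```text
# robots.txt – Clean-param (a Yandex-specific directive): ignore the ref
# and sort query parameters on URLs under /catalog/
Clean-param: ref&sort /catalog/
```

```html
<!-- in the <head> of the duplicate page: point to the canonical version -->
<link rel="canonical" href="https://example.com/main-page/">
```

Note that Clean-param is supported by Yandex but not by Google, so for Google the redirect or the canonical link is the appropriate choice.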
For the site to be indexed successfully, block the unnecessary pages. If they contain links, behavioral signals, and so on, prefer merging. Choose the method based on the characteristics of your resource.