CityHost.UA

What is the meta tag robots

16.09.2019

Site indexing by search engines can be managed with the robots.txt file and the sitemap. They let you pass information about your site's pages to the search engine and allow or disallow the indexing of specific pages.

Beyond these options, there is also a special tag that, depending on the task, can be used on various types of sites. In this article we will talk about the robots meta tag, which can be used alongside the robots.txt file and lets you manage indexing both by template and individually for each page.

What is the robots meta tag

For all its advantages, the robots.txt file has one drawback: it cannot completely remove an already indexed page from search results. Even after the page is listed in the file, it can still be found, but its metadata (for example, the description) will be missing; instead, users see a note such as "A description for this result is not available because of this site's robots.txt". In practice, the file is convenient for hiding a page entirely from the start, but it does not allow fine-grained tuning.

The robots meta tag, by contrast, allows a more flexible indexing setup: the required directives are listed in its content attribute.
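A minimal illustration of the syntax (the page content here is only a placeholder):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Forbid indexing this page and following its links;
       the directives go into the content attribute -->
  <meta name="robots" content="noindex, nofollow">
  <title>Example page</title>
</head>
<body>
  <p>Page content</p>
</body>
</html>
```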

In the meta tag, the very possibility of indexing the page is set first, and only then come the rules telling search robots how to treat the content and links on it.

For search engines to read it correctly, one or more directives must be listed in the content="…" attribute:

  • index/noindex – indexing of the page is allowed/disallowed;
  • follow/nofollow – following the links placed on the page is allowed/disallowed;
  • all/none – shorthand for "index, follow" / "noindex, nofollow", i.e. full indexing of the page is allowed/disallowed;
  • noimageindex – forbids indexing any images placed on the page;
  • noarchive – forbids showing the "Cached" ("Saved copy") link in search results, which lets users view a copy of the page in the Google or Yandex cache;
  • nosnippet – forbids displaying a snippet describing the page in search results;
  • noodp – tells the Google bot not to take snippet text from the DMOZ directory (this directive is obsolete now that DMOZ has closed).
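As a sketch, several of these directives can be combined in one tag, and a specific crawler can be addressed by using its name instead of robots (the combinations below are only examples):

```html
<!-- Allow indexing, but hide the cached copy and the snippet -->
<meta name="robots" content="index, follow, noarchive, nosnippet">

<!-- The same mechanism can target one bot: here only Googlebot
     is asked not to index the page -->
<meta name="googlebot" content="noindex">
```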

How to use the robots meta tag

The follow/nofollow values tell the robot whether to follow the links on the page, and index/noindex whether to index its content. Note that rel="nofollow" is not a meta tag but an attribute placed on an individual link: it blocks the search engine from following that specific link, whereas the meta nofollow directive applies to all links on the page at once. These values are used in various combinations during search engine optimization.
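The difference in scope can be sketched as follows (the URLs are placeholders): the meta directive covers every link on the page, while rel="nofollow" is applied per link:

```html
<head>
  <!-- Page-level: index the content and follow all links by default -->
  <meta name="robots" content="index, follow">
</head>
<body>
  <a href="https://example.com/partner">This link is followed</a>
  <!-- Link-level: this one link is excluded from following -->
  <a href="https://example.com/untrusted" rel="nofollow">This one is not</a>
</body>
```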

Depending on what is required (for the robot to index the content but not follow the links, or to follow the links without indexing the content), different values can be set. For example, allowing indexing while forbidding link-following can be useful on a page listing the regional stores of a large brand. Each store has its own reputation, which is not always perfect and does not always match the principles the main business promotes. In such cases, a meta tag with the values "index, nofollow" is set: the page with the list of stores remains available to the search engine for scanning, but the robot will not follow the individual links.

The meta tag is also convenient because it lets you set up complex indexing carefully: some links can be closed to search robots while different behavior is prescribed for others. Besides the examples above, there are other values that tell the robot how to interact with the page content in one way or another.

What is the difference between the robots meta tag and the robots.txt file

Unlike the robots.txt file, the robots meta tag is better suited for defining how search engines should treat the links on a page. Where the file acts more bluntly and affects the page as a whole, the meta tag makes it possible to close individual parts of it and set up a certain logic of actions. You can work on each page separately or set up templates for groups of pages.

Note that if a page is closed both by the meta tag and by the robots.txt file, the search engine treats the file's rules as primary: a robot that is forbidden by robots.txt to crawl the page never fetches it and therefore never sees the meta tag. For this reason, if you want a directive in the tag to take effect, make sure the page is also available for crawling in the file. Then the robot will correctly execute the intended scenario.
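For example (the path is hypothetical): for a noindex directive in the tag to be seen at all, the page must stay crawlable in robots.txt:

```txt
# robots.txt — the page must remain crawlable,
# otherwise the robot never reads its meta tag
User-agent: *
Allow: /old-page.html
```

```html
<!-- old-page.html — the tag that actually removes it from the index -->
<meta name="robots" content="noindex">
```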

Was the publication informative? Then share it on social networks and join our Telegram channel. We remind you that you can order cheap hosting from the hosting company CityHost. For technical questions, contact the online chat or call 0 800 219 220.
