CityHost.UA

What are duplicate site pages?

12.06.2019

In internal site optimization, duplicates are web pages whose content is identical to that of other pages on the same site. Most often they appear because of flaws in the CMS or mistakes by the webmaster: for example, when the site is simultaneously available at https://www.example.com/ and https://example.com/.

Below we discuss the harm duplicates cause to blogs, news portals, and online stores, and how to find duplicate pages using both free and paid tools.

Why you should avoid duplicate pages

Duplicate web pages are dangerous because the uniqueness of their content is zero, and unique content, as you know, is one of the main components of successful search engine optimization. Beyond that, duplicates are harmful for the following reasons:

  • Imagine the situation: you publish a great article on your site, commissioned from a top author. A week later, while analyzing backlinks, you find that other blogs have started linking to it. Everything seems to be coming together: the link profile is growing, and with it the site's position should grow too. But it doesn't, because those sites are linking to a duplicate of the page where the article is published. Promotion cannot be effective in such cases.
  • The artificial intelligence on which large search engines are built is trained to identify duplicate pages. But sometimes it gets this wrong, and the duplicate is what ends up in the search results. As a result, the canonical (original) page, whose link profile was built up over months or even years, stays out of sight of search engine users. Again, this hurts the effectiveness of promotion.
  • Web spiders that detect duplicates on a site begin to visit it less often, so new content takes longer to be crawled and indexed. The risk of ranking drops also grows if the crawler keeps finding duplicates.

Speaking of duplicates, note that not only the page content visible in the browser window can be duplicated, but also the site's meta tags. In such cases, search results will show pages with identical titles and descriptions. This confuses users and lowers the CTR, a value that seriously affects the site's ranking.
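To illustrate the meta-tag problem, here is a minimal Python sketch that groups a set of already-fetched pages by their `<title>` tag, so duplicated titles stand out. The function names and the URL-to-HTML input format are our own illustration, not part of any tool mentioned in this article.

```python
from collections import defaultdict
from html.parser import HTMLParser

class _TitleParser(HTMLParser):
    """Collects the text inside the <title> element of one HTML page."""
    def __init__(self):
        super().__init__()
        self._in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def duplicate_titles(pages):
    """pages: dict mapping URL -> HTML source (fetched by any means).
    Returns {title: [urls]} for titles shared by more than one page."""
    groups = defaultdict(list)
    for url, html in pages.items():
        parser = _TitleParser()
        parser.feed(html)
        groups[parser.title.strip()].append(url)
    return {title: urls for title, urls in groups.items() if len(urls) > 1}
```

Any group returned by `duplicate_titles` is worth a manual look: pages sharing a title often share a description and body content as well.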

You now know what duplicate pages are and the damage they do to search engine optimization. It's time to learn how to find this problem on your site.

How to find duplicates on the site

To check the site for duplicate pages, we recommend performing the following actions:

  1. Check whether the site is accessible over both protocols at once: HTTP (http://example.com) and HTTPS (https://example.com/). If the browser opens both the http and https versions of the resource, that is a clear sign of a duplicate problem. Also check the site address with a trailing slash (https://example.com/) and without one (https://example.com), and with www (https://www.example.com/) and without www (https://example.com/).
  2. Use Google's and Yandex's webmaster tools. If your blog or online store has duplicate pages, they will show up in the "Coverage" report in Google Search Console and in the "Site Information" section of Yandex.Webmaster. Knowing how to detect duplicates with these free webmaster tools means you do not have to spend money on SEO software.
  3. Use the Screaming Frog SEO Spider tool. Its free version can crawl up to 500 URLs, which is enough for small web projects. If you need more, buy an annual subscription (£149). Screaming Frog SEO Spider reliably finds duplicate titles and descriptions, and pages with identical meta tags often have identical content.
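The manual check in step 1 can also be scripted. Below is a minimal sketch in Python that generates the eight protocol/www/trailing-slash combinations for a domain and requests each one; the function names are our own, and the live check assumes the site is reachable from where you run it.

```python
from urllib.request import urlopen

def url_variants(domain):
    """Return the eight protocol/www/trailing-slash combinations that
    commonly resolve to duplicate pages."""
    variants = []
    for scheme in ("http", "https"):
        for host in (domain, "www." + domain):
            for path in ("", "/"):
                variants.append(f"{scheme}://{host}{path}")
    return variants

def check_variants(domain, timeout=5):
    """Request each variant and report whether it opens directly
    (a possible duplicate) or redirects to a single canonical URL."""
    for url in url_variants(domain):
        try:
            # urlopen follows redirects; geturl() is the final address
            final = urlopen(url, timeout=timeout).geturl()
            verdict = "redirects to " + final if final != url else "opens directly"
        except OSError as err:
            verdict = f"unreachable ({err})"
        print(f"{url} -> {verdict}")
```

Running `check_variants("yourdomain.com")` against your own site shows at a glance which variants answer directly; ideally all but one should redirect to the canonical address.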

Note. We also recommend the Netpeak Spider software, which can find duplicate pages, text, meta tags, and H1 headings. The program is free to use for 14 days. In addition, Netpeak Spider identifies pages that take a long time to load. If the scan finds many such pages, consider ordering hosting (cheap, fast, and convenient) from CityHost. Online stores, blogs, and portals hosted with us load instantly, and both users and web spiders enjoy visiting them.

How to prevent duplicates from appearing

Regularly searching for duplicate pages with the methods listed above will protect your online resource from the negative consequences of this problem. So that you do not forget to run the check, set a weekly reminder in Google Calendar or any similar application on your smartphone.

Was the publication informative? Then share it on social networks and join our Telegram channel. We remind you that you can buy Ukrainian hosting from the hosting company CityHost. For technical questions, contact the online chat or call 0 800 219 220.

