10 Critical Errors in SEO for 2019

Website mirrors open for indexing

The leading error is website mirrors: duplicates of the website that have different addresses but identical content. Mirrors are created for various reasons – reserving domains, expanding into other countries, advertising campaigns (for example, when each channel must lead to a separate landing page). However, there is one simple rule: only the main mirror should be in the index. If a check reveals additional mirrors, they should be removed or closed from indexing.

It is dangerous because when several mirrors reach the search engine at once, there is no telling which of them will be indexed first. The rest will be treated by the search engine as duplicates, which automatically lowers the website in the search results.
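
One practical way to verify this is to check that every secondary mirror answers with a permanent redirect to the main domain instead of serving its own copy of the content. Below is a minimal Python sketch of such a check; the domain names are hypothetical placeholders and the third-party requests library is assumed to be installed.

```python
# Minimal sketch: confirm that secondary mirrors redirect to the main mirror.
# Domain names are hypothetical placeholders.
import requests

MAIN_MIRROR = "https://www.example.com"          # hypothetical main mirror
OTHER_MIRRORS = [
    "http://example.com",                        # non-www variant
    "https://example-landing.com",               # hypothetical campaign domain
]

for mirror in OTHER_MIRRORS:
    # Request the page but do not follow redirects, so we see the raw answer.
    response = requests.get(mirror, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    if response.status_code in (301, 308) and location.startswith(MAIN_MIRROR):
        print(f"OK: {mirror} permanently redirects to the main mirror")
    else:
        print(f"PROBLEM: {mirror} answered {response.status_code} - "
              "it may be indexed as a separate copy of the site")
```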

Missing or incorrectly filled sitemap.xml

Sitemap.xml, or sitemap, is a special file that lists all pages of the website. The sitemap is created for the convenience of the search robot, so that it does not miss any important pages when crawling. The more correct and complete this file is, the faster the website is indexed. A sitemap can be compared to the table of contents of a book: it is immediately clear what to look for and where. If new sections or pages are added to the website, add them to sitemap.xml right away, or configure automatic sitemap generation so that new pages appear in it immediately.

It is dangerous because a missing or broken sitemap.xml impedes the work of search robots. Because of this, some pages may not be indexed and therefore will not appear in the search.
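
To keep the file in step with the website, sitemap generation is usually automated. A minimal sketch using only the Python standard library might look like the following; the page list would normally come from your CMS or database, and the URLs below are placeholders.

```python
# Minimal sketch of automatic sitemap generation with the standard library.
import xml.etree.ElementTree as ET
from datetime import date

# Hypothetical page list; in practice this comes from the site's database.
pages = [
    "https://www.example.com/",
    "https://www.example.com/catalog/",
    "https://www.example.com/contacts/",
]

urlset = ET.Element("urlset",
                    xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page
    ET.SubElement(url, "lastmod").text = date.today().isoformat()

# Write the file that will be referenced from robots.txt and submitted
# to the search engines' webmaster tools.
ET.ElementTree(urlset).write("sitemap.xml",
                             encoding="utf-8", xml_declaration=True)
```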

Missing or non-unique title and description meta tags

Title and description are the kings of the meta tags: they are where the user's acquaintance with the website begins. Title is the name of the landing page; it is displayed in the browser tab and in the search results. Description is a brief summary of the page; it is not visible on the website itself, but the user can read it under the title in a search engine. Every landing page promoted in the search results must have filled-in and unique title and description meta tags. If they are missing, fill in the gaps, and make any duplicates unique.

It is dangerous because search engines do not tolerate emptiness. If the robots do not find a title and description, they may pull text from another part of the page into the title. In addition, missing meta tags affect the assessment of the page's quality.
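
A simple audit script can surface both problems at once: pages where the tags are missing and pages that share the same values. The sketch below uses regular expressions for brevity (a real audit would use a proper HTML parser); the URL list is hypothetical and the requests library is assumed to be installed.

```python
# Minimal sketch: flag missing or duplicated title/description tags.
import re
from collections import defaultdict
import requests

URLS = [                                   # hypothetical pages to audit
    "https://www.example.com/",
    "https://www.example.com/catalog/",
    "https://www.example.com/contacts/",
]

titles = defaultdict(list)
descriptions = defaultdict(list)

for url in URLS:
    html = requests.get(url, timeout=10).text
    title = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    descr = re.search(
        r'<meta[^>]+name=["\']description["\'][^>]+content=["\'](.*?)["\']',
        html, re.I | re.S)
    if not title or not title.group(1).strip():
        print(f"MISSING title: {url}")
    else:
        titles[title.group(1).strip()].append(url)
    if not descr or not descr.group(1).strip():
        print(f"MISSING description: {url}")
    else:
        descriptions[descr.group(1).strip()].append(url)

# Any value shared by two or more URLs is a duplicate that should be rewritten.
for label, seen in (("title", titles), ("description", descriptions)):
    for text, pages in seen.items():
        if len(pages) > 1:
            print(f"DUPLICATE {label} on {len(pages)} pages: {text[:60]}")
```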

Incorrect handling of the 404 error

If a user clicks a broken link or types gibberish into a URL, they should see “Error 404: This page does not exist” on the screen – the standard server response for a missing page. In practice, it often happens that non-existent pages give the robot the response of a normal page, “200 OK”, or display content from a higher level, for example a category page. This should not happen. In addition, the 404 page should contain a link to the main page, a site map and the necessary contact details (for example, an email address or a phone number), so that the user can continue navigating the website.

It is dangerous because a large number of 404 pages annoys both robots and users. A visitor who cannot quickly find a page or get back to the main page will be disappointed and will not come again. Search robots perceive a large number of errors as a sign of a poor-quality website and may lower the resource in search results.
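
Checking the server response is straightforward: request a few deliberately non-existent addresses and confirm that the site answers with 404 rather than 200 or a redirect. A minimal sketch, with hypothetical probe URLs and the requests library assumed:

```python
# Minimal sketch: verify how the site answers for non-existent pages.
import requests

# Deliberately broken URLs that should never resolve to real content.
PROBE_URLS = [
    "https://www.example.com/definitely-not-a-real-page/",
    "https://www.example.com/abracadabra-12345",
]

for url in PROBE_URLS:
    response = requests.get(url, allow_redirects=False, timeout=10)
    if response.status_code == 404:
        print(f"OK: {url} correctly returns 404")
    elif response.status_code in (301, 302):
        print(f"PROBLEM: {url} redirects instead of returning 404")
    elif response.status_code == 200:
        print(f"PROBLEM: {url} returns 200 OK for a page that does not exist")
    else:
        print(f"CHECK: {url} returned {response.status_code}")
```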

Low website loading speed

A good website is a fast website: its pages load before the user has time to blink. If the loading spinner keeps spinning for several seconds, that is a problem. The cause can be heavy images, third-party scripts, a cheap hosting plan and many other factors that are important to consider. To increase page loading speed, eliminate the cause of the slowdown – for example, compress the images or switch to a better hosting plan.

It is dangerous because low page loading speed affects both indexing and user behavior. If the robot takes a long time to receive content from the website, it manages to index fewer pages in one session. As for users, few people in the world are patient enough to wait more than three seconds for a page to load.
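
Dedicated tools such as Google PageSpeed Insights give the fullest picture, but a rough first signal can be collected with a few lines of Python. The sketch below measures only the HTML response time and size, not scripts or images; the URLs are placeholders and the requests library is assumed to be installed.

```python
# Minimal sketch: measure server response time and HTML weight per page.
import requests

PAGES = [                                  # hypothetical pages to test
    "https://www.example.com/",
    "https://www.example.com/catalog/",
]

for page in PAGES:
    response = requests.get(page, timeout=30)
    # Time from sending the request until the response headers were parsed.
    seconds = response.elapsed.total_seconds()
    kilobytes = len(response.content) / 1024   # size of the downloaded HTML
    status = "SLOW" if seconds > 3 else "ok"
    print(f"{status:4} {page}: {seconds:.2f}s, {kilobytes:.0f} KB")
```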

Robots.txt errors

When a search robot comes to the website, it must understand which sections to index and which ones to skip. This information is written in the robots.txt file. The format of this file is standard for all websites. The main User-agent directive defines which search bots – Google's and those of other search engines – a group of rules applies to. In addition, the robots.txt file should contain a Sitemap directive with the path to the XML sitemap file.

It is dangerous because the robots.txt file is a constitution for search bots. Without it, the robot can do whatever it wants, and most of all it wants to index every page of the website: mirrors, junk pages, system directories, filter pages, search results, and so on.
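
The Python standard library can parse robots.txt for you, which makes it easy to verify that the file exists, that it names a sitemap, and that service sections are actually closed. A minimal sketch, assuming Python 3.8+ for site_maps() and hypothetical paths for the checks:

```python
# Minimal sketch: check a site's robots.txt with the standard library.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")   # hypothetical domain
rp.read()                                           # download and parse the file

# Sitemap directives found in robots.txt (None if there are none);
# site_maps() requires Python 3.8+.
print("Sitemap:", rp.site_maps())

# Paths that usually should NOT be crawled on this hypothetical site.
for path in ("/search/", "/admin/", "/catalog/?filter=price"):
    allowed = rp.can_fetch("Googlebot", "https://www.example.com" + path)
    status = "PROBLEM" if allowed else "ok"
    print(f"{status:7} {path} is {'open' if allowed else 'closed'} for Googlebot")
```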

Low page relevance

Relevance is how well a page matches users' queries in search engines. High relevance is what puts a page at the top of the search results. Technically, relevance is calculated from the ratio of the keywords and phrases on the page to a specific query. All pages being promoted must be highly relevant to their keywords; otherwise, they simply will not be visible in the search. Relevance can be increased through internal optimization (keywords, meta tags), external optimization (link mass), usability and the presence of the necessary conversion elements on the page.

It is dangerous because low relevance means the page does not reach the top of the search results for its query. As a result, all the work on technical optimization goes out the window.
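
As a toy illustration of the "keywords versus query" idea above, the sketch below computes a crude term-frequency score for a page against a query. Real search engines weigh far more signals, so treat this only as a way to compare your own drafts; the sample text and function name are invented.

```python
# Minimal sketch: crude term-frequency relevance of a text to a query.
import re
from collections import Counter

def relevance_score(page_text, query):
    """Share of page words that belong to the query (0.0 - 1.0)."""
    words = re.findall(r"\w+", page_text.lower())
    query_terms = set(re.findall(r"\w+", query.lower()))
    if not words:
        return 0.0
    counts = Counter(words)
    matched = sum(counts[term] for term in query_terms)
    return matched / len(words)

# Hypothetical example: a short product description scored against a query.
text = "Handmade leather wallets. Buy a leather wallet with free delivery."
print(f"{relevance_score(text, 'buy leather wallet'):.2%}")
```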

“Nauseous” texts

The text on the pages of the website should be unique and have a low level of “nausea.” Uniqueness is a familiar concept; nausea is trickier. Nausea measures how often a word is repeated in a text. A few years ago, it was believed that the more keywords in the text, the higher the website would rank in search results. Much changed after special ranking algorithms were introduced: search engines learned to punish over-optimized pages written for robots. Today, keyword-stuffed text is a big problem for SEO specialists.

It is dangerous because the higher the nausea rate of a text, the worse it ranks. In the worst case, the website will fall under search engine filters or drop in the results. Write for humans, not robots.
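
Text-analysis services commonly report two flavours of nausea: “classic” nausea, the square root of the count of the most repeated word, and “academic” nausea, that word's share of all words. The exact formulas and thresholds differ from service to service, so the sketch below, on invented sample text, is only a rough self-check.

```python
# Minimal sketch: one common way "nausea" metrics are computed.
import math
import re
from collections import Counter

def nausea(text):
    """Return (classic, academic) nausea for a text."""
    words = re.findall(r"\w+", text.lower())
    top_word, top_count = Counter(words).most_common(1)[0]
    classic = math.sqrt(top_count)              # sqrt of the top word's count
    academic = top_count / len(words) * 100     # top word's share of all words, %
    print(f"Most frequent word: '{top_word}' ({top_count} times)")
    return classic, academic

classic, academic = nausea(
    "Cheap phones, buy phones online, phones with delivery, best phones here"
)
print(f"classic nausea: {classic:.1f}, academic nausea: {academic:.1f}%")
```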

Interlinking errors

Interlinking is an important mechanism for improving a website's position in search results. However, there is a limit to everything: too many links have the opposite effect. As a rule, a single page should not have more than 400 internal and 1,000 external links. A page may also contain broken links and links that go through redirects; both should be kept to a minimum. These are general tips – there are more unspoken rules of interlinking than there are jokes about blondes.

It is dangerous because incorrect interlinking risks over-optimization, while broken and redirect links make it harder to index pages and hurt user loyalty. The result is a drop in search positions.
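
A quick way to check a page against these limits is to count its internal and external links and spot-check their status codes. The sketch below uses the 400/1,000 limits mentioned above; the page URL is a hypothetical placeholder and the requests library is assumed to be installed.

```python
# Minimal sketch: count internal/external links and flag broken or redirecting ones.
import re
from urllib.parse import urljoin, urlparse
import requests

PAGE = "https://www.example.com/catalog/"        # hypothetical page to audit
host = urlparse(PAGE).netloc

html = requests.get(PAGE, timeout=10).text
hrefs = re.findall(r'<a[^>]+href=["\'](.*?)["\']', html, re.I)

internal, external = [], []
for href in hrefs:
    full = urljoin(PAGE, href)                   # resolve relative links
    (internal if urlparse(full).netloc == host else external).append(full)

print(f"internal: {len(internal)} (limit 400), external: {len(external)} (limit 1000)")

# Spot-check each link's status code; 3xx means a redirect, 4xx means broken.
for link in internal + external:
    code = requests.head(link, allow_redirects=False, timeout=10).status_code
    if code >= 400:
        print(f"BROKEN   {code} {link}")
    elif 300 <= code < 400:
        print(f"REDIRECT {code} {link}")
```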

Indexed duplicate pages

Just like website mirrors, pages that duplicate each other's content can end up in the search index. This happens when instructions for search robots are missing from the robots.txt file. The bot then indexes pages that are not needed in the search results – product filters, internal search pages or pagination – which dilute the content with duplicates. Most often, the problem is solved by configuring robots.txt correctly or by using the rel="canonical" attribute.

It is dangerous because, just like mirrors, duplicate pages harm promotion – the website goes down in search results.
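
One way to confirm the fix is to request the likely duplicates (filter and pagination URLs) and verify that each of them declares a rel="canonical" link pointing at the main page. A minimal sketch with hypothetical URLs, assuming the requests library is installed:

```python
# Minimal sketch: check that probable duplicates declare the right canonical URL.
import re
import requests

CANONICAL_TARGET = "https://www.example.com/catalog/"     # hypothetical main page
DUPLICATE_CANDIDATES = [
    "https://www.example.com/catalog/?page=2",             # pagination
    "https://www.example.com/catalog/?sort=price",          # filter/sort variant
]

for url in DUPLICATE_CANDIDATES:
    html = requests.get(url, timeout=10).text
    match = re.search(
        r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\'](.*?)["\']',
        html, re.I)
    canonical = match.group(1) if match else None
    if canonical == CANONICAL_TARGET:
        print(f"ok       {url} -> canonical points to the main page")
    else:
        print(f"PROBLEM  {url} -> canonical is {canonical!r}")
```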