Step-by-step instructions for free SEO site optimization


The technical side is the foundation of any online resource. Advanced, convenient functionality, attractive design and usability all matter, of course, but the key factor for promotion is still SEO optimization: it allows search engines to correctly determine what a resource contains. Let's first go through the key points required for competent SEO optimization.

  1. Setting up webmaster tools and counters

When starting to promote any resource, you must first of all register it in the search engines' webmaster services. This signals a new resource to the search engines, and their robots will begin to visit the site. Keep in mind that simply entering the domain is not enough: for the service to be effective, it must be configured correctly. For example, when registering with Yandex.Webmaster, do the following:

  • add a site map (sitemap.xml) to the required section;
  • specify your region;
  • run the robots.txt analysis to verify the work done.

Try to make full use of the many features these webmaster services provide:

  • a validator – to check that various documents (site maps, semantics, etc.) are composed correctly;
  • query performance monitoring (number of impressions, conversions, etc.);
  • visualization of page indexing data (which pages the search engine has indexed and which it has missed).

Visit counters help you collect all the data on visits and visitor actions on the resource. They should also be installed right after registration. A counter is an effective assistant that helps you understand which parts of the site need improvement.
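As an illustration, a counter is typically installed by pasting the service's snippet into every page. A minimal sketch using Google's standard gtag.js tag might look like this (G-XXXXXXX is a placeholder for the measurement ID you receive when creating the counter):

```html
<!-- Google tag (gtag.js); G-XXXXXXX is a placeholder measurement ID -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXX');
</script>
```

Yandex.Metrica works the same way: its snippet is generated in your Metrica account and pasted into the page template.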

  2. The robots.txt text file

Ideally, every resource should have a robots.txt file available at a known address in the site root. Search engines consult this file first on each visit to your resource; it contains the main crawling rules. Simply put, robots.txt is a list of recommendations about which files to index. However, you should not expect search engines to follow this list strictly. The exact contents depend on the content management system your resource runs on, since it is the CMS that shapes this list of recommendations. Here is a brief list of what to put in robots.txt:

  1. Rules addressed to specific robots via User-agent directives, such as GoogleBot, Yandex and their analogues.
  2. Blocks that exclude all system resources from indexing.
  3. Rules closing all files that should be excluded from search (login and registration pages, search filter results, etc.).
  4. The Host directive for YandexBot.
  5. A Sitemap directive pointing to sitemap.xml.
  6. For GoogleBot, open access to the files responsible for rendering the resource (pictures, fonts, photos, js and css).
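As a rough sketch, a robots.txt combining these points might look like this (example.ru, the directory names and the sitemap URL are placeholders, not taken from the article):

```text
User-agent: *
Disallow: /admin/
Disallow: /login/
Disallow: /search/

User-agent: Yandex
Disallow: /admin/
Disallow: /login/
Host: https://www.example.ru

User-agent: GoogleBot
Allow: /*.js
Allow: /*.css
Allow: /images/

Sitemap: https://www.example.ru/sitemap.xml
```

Note that Host is a Yandex-specific directive, and Allow/Disallow rules are recommendations that robots may interpret differently.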

The configuration tools related to this section are covered in our other articles.

  3. Site map – Sitemap

A Sitemap is not only a tool that helps search engines navigate an Internet resource, but also additional functionality that makes visits easier for users. For search engine indexing, a Sitemap must be generated in xml format and include all the links you want crawled and shown in search results.

Special plug-ins have been developed for most popular CMS; they let you generate and adjust the map. If add-ons cannot be used to create a Sitemap, various online generators come to the rescue, such as MySitemapGenerator, or even desktop applications such as ComparseR or Sitemap Generator. For additional ease of navigation for visitors as well as robots, add a sitemap in html format – a regular page with a list of links to sections. Avoid excessive volume: for an online store, a standard hierarchy of the main categories and subsections is enough.
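For reference, a minimal sitemap.xml following the sitemaps.org protocol looks like this (the domain and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.ru/</loc>
    <lastmod>2020-01-15</lastmod>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.ru/catalog/</loc>
    <lastmod>2020-01-10</lastmod>
    <priority>0.8</priority>
  </url>
</urlset>
```

Each `<loc>` entry is a page recommended for indexing; `<lastmod>` and `<priority>` are optional hints.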

  4. Remove broken links

Broken links are addresses leading to a deleted item (a page, an image, and so on). Broken links – especially in significant numbers – cause a negative reaction from users of the resource: it is always unpleasant not to get what you expected in response to a search query. Search engines likewise react negatively to sites with an excessive number of such links, irrelevant content, lack of updates and other problems. To avoid this, monitor the site regularly and remove non-working addresses. We recommend using free online services designed for such a search.
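As a sketch of how such a check works under the hood (the sample HTML below is invented for illustration), you can extract every link from a page with Python's standard library and then request each address, flagging those that return 404:

```python
# Sketch of a broken-link audit: collect all <a href> targets from a page's
# HTML; in a real run you would then request each one (e.g. a HEAD request
# via urllib.request) and report any address that returns 404.
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collects the href attribute of every <a> tag."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links


sample = '<p><a href="/abc">ok</a> <a href="/deleted-page">dead?</a></p>'
print(extract_links(sample))  # ['/abc', '/deleted-page']
```

Free online checkers do essentially this crawl-and-request loop for you across the whole site.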

  5. Remove duplicate pages

Duplicate pages give access to identical data via different links. Duplicate content on a resource leads to the following technical problems:

  1. There is a high probability that search engines will pick the wrong main (relevant) page, which can greatly affect the site's position in the TOP. Having found such a duplicate, robots glue the addresses together and may choose the wrong one among them.
  2. Duplicates increase the duration of indexing by the search engine, raising both the crawl load and the time spent on the process.
  3. The total server load increases.

This process causes the greatest difficulties in optimizing the technical side of a site. If duplicates cannot be blocked technically, we recommend specifying the canonical address via the rel="canonical" attribute of the link tag.

Let us consider an example of access to the same material for different links:

  • www.ооо.ru/abc
  • www.ооо.ru/abc1

Suppose the first address is chosen as the main one; the tag will then look like this:

<link rel="canonical" href="www.ооо.ru/abc" />

This tag is placed in the code of the duplicate pages, inside the <head> … </head> section. Alternatively, you can set up a 301 redirect from the duplicates to the main page, or close them from indexing via robots.txt.

  6. Setting up the URL path

There are several names for this concept: SEO-friendly URL, semantic URL, or human-readable URL. The path should consist of intelligible terms that make the most sense both to the user and to the search engine. An adequate URL path has the following advantages: it quickly identifies page content by its title and improves overall indexing thanks to the presence of key words.

Before you start forming links on the site, decide which search engine you prefer, since the adjustment will be made for it. Yandex prefers transliteration, while Google prefers translation. For an "About us" page the URL path might look like this (illustrative examples):

  • Yandex: /o-nas (transliteration);
  • Google: /about or /about-us (translation).

What to consider when setting up the URL path (page addresses):

  • favour simplicity and brevity;
  • match the content of the page closely (using the main title is optimal);
  • remove extensions (php, html, htm and others);
  • separate words with a hyphen (-), not an underscore (_);
  • if possible, avoid opaque identifiers (such as idXXX and the like).

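The naming rules above can be sketched as a tiny slug generator (an illustration, not part of the article; CMS plug-ins normally do this for you, and Cyrillic titles would additionally need a transliteration step):

```python
# Minimal slug generator for SEO-friendly URL paths: lowercase the title,
# replace every run of non-alphanumeric characters with a hyphen (not an
# underscore), and trim stray hyphens from the ends.
import re


def slugify(title):
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # non-alphanumerics -> hyphen
    return slug.strip("-")


print(slugify("About Us"))         # about-us
print(slugify("Contact (2020)!"))  # contact-2020
```

Note that extensions and numeric identifiers never enter the slug, in line with the checklist above.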
  7. Main address or primary mirror

Almost any domain has synonymous addresses. The most common variants differ by the presence of www at the beginning; a Cyrillic version of the domain may also be attached to the main one. Once an SSL certificate is installed, the number of mirror addresses grows to four variants, for example (site.ru is a placeholder domain):

  • http://site.ru
  • http://www.site.ru
  • https://site.ru
  • https://www.site.ru

If all versions remain accessible, over time the search engines glue them into one and pick a main one themselves, so you should do the following in advance:

  • connect SSL and choose one main, optimal address;
  • enter the Host directive in the robots.txt file for YandexBot;
  • configure 301 redirects from all auxiliary addresses to the main one;
  • keep only the main mirror in the webmaster service and do all further work with it alone.
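On an Apache server, the 301 redirects from auxiliary mirrors to the main one are usually set up in .htaccess; a hedged sketch (example.ru is a placeholder, and the chosen main mirror here is assumed to be https://www.example.ru):

```apache
RewriteEngine On
# Redirect plain http, and the non-www host, to the single main mirror
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.example.ru/$1 [R=301,L]
```

The R=301 flag makes the redirect permanent, which tells search engines to transfer the old addresses' weight to the main mirror.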
  8. Using an SSL certificate

Even a simple, free SSL certificate guarantees the uniqueness of the domain and secures the information exchanged with it. Read more about this certificate in a separate article, which explains how to formalize the transition to httpS.

  9. Semantic markup

Semantic markup describes information in detail using additional html attributes (a vocabulary such as Schema.org) tied to selected components of the site's content. It improves the external presentation of snippets shown in response to queries. Competent markup helps improve and optimize your search visibility. Illustrative examples of the main categories:

  • product offers;
  • informational texts of various kinds and accompanying how-to material;
  • various step-by-step instructions and recipes.

Markup also allows you to highlight any other important information. We cover microdata and how to install it in more detail in a separate article.
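As an example of the product-offer category, Schema.org microdata can be embedded directly in the page's html attributes (the product name and price are invented for illustration):

```html
<div itemscope itemtype="https://schema.org/Product">
  <span itemprop="name">Example Phone</span>
  <div itemprop="offers" itemscope itemtype="https://schema.org/Offer">
    <span itemprop="price" content="199.00">199.00</span>
    <meta itemprop="priceCurrency" content="USD" />
  </div>
</div>
```

Search engines read the itemscope/itemtype/itemprop attributes and can then show the price and availability right in the snippet.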

  10. Server response

When a link is followed in a browser or scanned by a search engine, the server hosting the resource sends a response to the request – an HTTP status code – automatically reporting the state of the resource and the specific page:

  • 200 – site (page) is working;
  • 404 – site (page) does not exist;
  • 503 – the server is temporarily unavailable.

Sometimes the returned status code is incorrect. Such failures should be tracked down and the status codes corrected in the special .htaccess file. A particularly important case is a 404 error returned even though the page exists but the server for some reason does not see it: such a failure prevents indexing. Status codes can be checked with Yandex.Webmaster or browser extensions.
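A hedged sketch of checking these codes with Python's standard library (the classification table simply mirrors the three codes listed above; check_url is illustrative and needs network access):

```python
# Classify HTTP status codes and, optionally, fetch one for a live URL.
import urllib.error
import urllib.request

STATUS_MEANING = {
    200: "page is working",
    404: "page does not exist",
    503: "server temporarily unavailable",
}


def classify(code):
    """Return a human-readable meaning for an HTTP status code."""
    return STATUS_MEANING.get(code, "other (%d)" % code)


def check_url(url):
    """Return the HTTP status code for url (requires network access)."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.getcode()
    except urllib.error.HTTPError as err:
        return err.code  # e.g. 404 or 503 also arrives as an exception


print(classify(200))  # page is working
print(classify(404))  # page does not exist
```

In practice a crawler runs check_url over every page and flags any address whose code disagrees with the page's actual state.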

  11. Fast site loading

One of the key optimization criteria is the overall responsiveness of a resource. Users do not like waiting long for site elements to load or for reactions to their clicks – the usual answer to such sluggishness is leaving. Search engines, which are designed to imitate user reactions, take this point into account as well.

To identify and evaluate the factors slowing the site down, you can use Google PageSpeed Insights. As a result you will receive a list of recommended improvements to speed up loading, reported separately:

  • for smartphones and tablets;
  • for PCs and laptops.

Following the proposed improvements also adds to the rating taken into account when forming the TOP. It does not by itself guarantee ideally high speed, but it helps you find errors, correct them and speed the site up.



We have reviewed the key factors affecting the technical optimization of a resource. Keep in mind that proper SEO optimization, including its technical side, is quite an impressive list of actions, but in the end it pays off. Do not neglect any of the recommended factors, since they work as a complex. To check the technical optimization of your resource, we recommend ordering a professional SEO audit, and if you need not only a check but also configuration, order comprehensive promotion.
