What a semantic core is and how to compose one: creating a high-quality semantic core


How to correctly compose the information component of a website so that customers can find it quickly

So you have decided to create a portal where people can find interesting information, but you know that promotion requires certain skills - above all, compiling a semantic core. Semantics implies that the site will be filled with meaning. This article is about how to kill two birds with one stone: attract an audience with useful information and avoid making search engines “swear”.

Old and new approaches to filling a website with information and a core of meaning

When creating a website, you first need to know what interests users and how they search for information - after all, the same data can be sought in different ways. You also have to take the user's interests into account: everything presented on your site should be worth reading, and people need a reason to read it. And you cannot do without search engines - Yandex and Google simply will not “accept” the portal unless a number of conditions are met.

One of those conditions is distributing the keywords that make up search phrases across the portal. That is why it is important to fill the text with meaning. This meaning is carried by the semantic core - the set of words and phrases that reflect the thematic focus and structure of the web resource. Semantics itself is a branch of linguistics that studies the meaning of language units. Everyone has probably seen phrases on websites like “The main character helps his friends watch movies online in order to avoid becoming victims of the villain” (the wording is approximate, but the point should be clear). The user plainly sees that the keyword “watch movies online” is there, inserted not for the reader but for the search engine. As a result, the reader may feel deceived - there is no need to slide into this, and it brings nothing good. Competent text built around a core of meaning is perceived far better.

In order for the user to find an Internet resource, you can use two methods:

  • first analyze clients' search queries and build the portal structure from the results (here the semantic, or meaning, core plays the decisive role in the framework and design of the resource);
  • first plan what the site structure will look like, and only then analyze what interests users (the semantic core is distributed across the ready-made portal framework).

The first approach means adapting to current conditions - and it really works. In this case the resource structure is fitted to the keywords and remains an object. The second option is like the Time Machine song: “You shouldn't bend under the changing world - one day it will bend under us.” With this approach the businessman himself chooses what he wants to tell potential users. It can be called proactive, and the businessman here becomes the subject.

It is important to understand that the main goal of marketing and business is customer focus, and the second method provides exactly that. The entrepreneur or marketer decides what data to present to the audience through the portal - and, of course, should know the subject the site will cover. So first he plans an approximate design of the resource and a preliminary list of pages, and only after that analyzes how users search for the information they need. The information content of the resource then answers the questions that users ask the search engine.

The first option is the “SEO” method. It led the field for quite a long time and is still used today. With this method, key phrases were found for which the site creator simply wanted to reach the very top of the search results; only then was the resource structure created and the keys distributed across the pages. The content was optimized for those keywords and phrases.

But practice shows that while a search engine may be deceived, people are not. The informational value of the resource falls: people are not interested in reading keyword-stuffed texts and feel they are being deceived somewhere. Marketing is not created for this - business creates trends, and the businessman chooses what to tell users. Marketing should not “dance to someone else's tune”, otherwise the audience will stop respecting it; it must shape the environment itself while remaining client-oriented. The “SEO” approach offers neither, which is why it is becoming obsolete.

Meanwhile, because of it, some queries that are promising for the search engine get discarded - understandable, given how much competition there is on the Internet today. In addition, sites end up stuffed with the keywords that search engines supposedly love.

The planned result of constructing a semantic core is a list of key queries that are dispersed across the pages of the portal. It includes page URLs and requests indicating frequency.

Site design

The structure, or design, of an Internet resource is essentially a hierarchical scheme of its pages. Creating it solves the following problems:
  • Planning an information strategy and structure for presenting information to the user;
  • Ensuring that the portal complies with search engine requirements;
  • Guarantee of resource ergonomics for the client.

To do this, you can use whatever is convenient - even MS Word or Paint; you can also draw it by hand or on a tablet with a stylus. When planning a structure, you need to answer two questions for yourself:

  • What information do you, as a businessman, want to convey to your clients?
  • Where should each piece of content be published?

If we take the design of a small confectionery shop portal as an example, it will include information pages (recipes, history of a particular cake), an articles section and a product catalog (showcase). If you imagine this in diagram form, it might look like this:

Hierarchical site diagram

Next, the design is presented as a table. It shows the hierarchy and the page names and includes columns for keywords with their frequency and for the page URLs. A table for the design of a confectionery website could look like this:


This is how you can present the structure (design) of an Internet resource as a table

To begin with, we only know the “Page Titles” and “Legends”, and the “URLs”, “Keys” and “Frequency” will be filled in later.
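
For illustration, here is a minimal sketch of how such a table can be kept as structured data; the page titles and legends below are hypothetical examples, not values from the screenshot:

```python
# A minimal sketch of the semantic-core table (hypothetical example rows).
# At the start only "title" and "legend" are known; "url" and "keys"
# (queries with their frequencies) are filled in at later stages.
semantic_core = [
    {
        "title": "Cakes to order",         # page title
        "legend": "product catalog page",  # what the page is for
        "url": None,                       # filled in later
        "keys": {},                        # e.g. {"cakes to order": 2500}
    },
    {
        "title": "Napoleon cake recipe",
        "legend": "informational article",
        "url": None,
        "keys": {},
    },
]
```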

Keywords

It is important to understand what keywords are and what search queries clients use - without this, creating a website and presenting information to users will not be effective. You can use one of the keyword-selection services, but it is still important to check that the chosen words are appropriate.

So, keys are words or phrases that users enter to find the information they need. A simple example: in order to bake a pie, a user types the query “apple charlotte recipe with photo” into the search engine.

Keys can be divided into several groups:
Depending on popularity there are:

  1. Low-frequency queries (about 100-1,000 impressions per month);
  2. Mid-frequency (1,000-5,000 impressions);
  3. High-frequency (5,000-10,000 impressions per month).
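
As a rough illustration, this grouping can be expressed as a simple function; the thresholds below follow the approximate figures from this list and, as noted further on, depend heavily on the topic:

```python
def classify_by_frequency(impressions_per_month: int) -> str:
    """Rough grouping of a query by monthly impressions (conditional thresholds)."""
    if impressions_per_month < 1_000:
        return "low-frequency"
    if impressions_per_month <= 5_000:
        return "mid-frequency"
    return "high-frequency"

print(classify_by_frequency(6_000))  # high-frequency
```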

Depending on customer needs, they vary:

  1. Informational (if the user needs to find any information - for example, “how to clean clothes from fucorcin”, “what vitamins improve skin condition”);
  2. Transactional (requests issued with the aim of making a transaction, but without specifying a specific site or store - “buy a sofa”, “download a game”, “apply for a loan”);
  3. Navigation (if the client wants to find information on a specific site - for example, “webmoney create a card”, “track Belpochta track code”, “European wholesale discounts”);
  4. Others (when it is difficult to determine what the user wants - for example, for the query “brain” it is unclear whether the person wants the structure of the organ, interesting facts about it, or something else, and it is not even clear which brain is meant - the spinal cord or the brain in the head).

Now for each point. As is clear from the context, the popularity grouping depends on how popular a particular topic is among users. The division is conditional, and some experts set lower impression thresholds. An example: for a site that sells smartphones, the query “buy a Samsung phone” with 6,000 impressions per month is mid-frequency, while for a sports club the query “Thai boxing training” with 1,000 impressions is high-frequency.

All of this must be taken into account, and the semantic core should be made as broad as possible and enriched with low-frequency phrases: according to statistics, 60% to 80% of all user queries are low-frequency. So low-frequency keys - a kind of narrowly targeted keyword - should be the main resource for attracting potential customers to the site. They then need to be diluted with high- and mid-frequency queries.

To use the second grouping of keys effectively, take customer needs into account when distributing keywords across pages or drawing up a content plan. Articles that provide information should answer the user's questions; most such key phrases carry no specific intent, so words like “buy” and “download” should not be inserted into informational articles. The “Shop”, “Catalog” and “Showcase” sections are the ones meant to satisfy users' transactional queries.

Please note that most transactional queries are commercial. Accordingly, if you decide to sell cakes, you will have to compete with “Cake Moscow”, “Dobryninsky and Partners” and “Vienna Workshop” - the largest confectionery manufacturers. But if you apply the recommendations above correctly, everything becomes much simpler: expand the semantic core as much as possible and work with lower-frequency queries. For example, the query “buy American-style chopped cake” will have a lower frequency than “buy American-style cake”.

Structure of search queries

A phrase is a general concept that contains particular parts. The same goes for search phrases: they consist of a body, a specifier and a tail. For example, taking the query “cake” as a basis, we cannot tell what the user needs - a definition of the confectionery product, a purchase, or just pictures. The query itself is high-frequency, which means high competition in the results. Moreover, such a query brings many visits from people who are not at all interested in the information you provide, and this hurts the behavioral factor. All because such a query contains only the body.

If we add the word “buy”, the query also gains a specifier - the part that expresses the client's intention. Replace “buy” with “recipe” and the query becomes informational; enter “cakes in I love cake” and it becomes navigational. So it is the specifier that determines which type a key belongs to.

Sometimes you may encounter a situation where a user, wanting to sell a certain item, enters the request “buy” to see where people buy this item the most.

If you enter the phrase “buy a cake in Moscow” or “buy a cake to order,” then the last part of the search query is the tail. It just specifies some details about how or where the client intends to do it. So, if the client needs to know a specific store, then the request will become a navigational one.

Search phrase structure

If we look at the following examples: “buy a homemade cake in Almaty”, “Napoleon cake recipe”, “buy a cake with delivery”, we will see that in each situation there is a specific user goal, and the tail only clarifies the details.

Therefore, for the semantic core, it is necessary to identify the basic terminology associated with the services and goods that will be presented on the portal, or with the business activity and customer needs. If a person needs a confectionery product, he will be interested in cakes, marshmallows, waffles, cookies, meringues, cupcakes, etc. This is the body of the key query. Then we find the specifiers and tails. Thanks to phrases with “tails”, your reach increases while the number of “search competitors” decreases.
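
To make the body/specifier/tail structure concrete, here is a small sketch that represents a query as those three parts, using examples from this section:

```python
from dataclasses import dataclass

@dataclass
class SearchQuery:
    body: str            # the basic term, e.g. "cake"
    specifier: str = ""  # the intent, e.g. "buy", "recipe"
    tail: str = ""       # clarifying details, e.g. "in Moscow", "with delivery"

    def text(self) -> str:
        return " ".join(p for p in (self.specifier, self.body, self.tail) if p)

examples = [
    SearchQuery(body="cake"),                                     # body only
    SearchQuery(body="cake", specifier="buy"),                    # transactional
    SearchQuery(body="cake", specifier="buy", tail="in Moscow"),  # with a tail
    SearchQuery(body="Napoleon cake", specifier="recipe"),        # informational
]
for q in examples:
    print(q.text())
```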

Internet resources that allow you to formulate a semantic core (selection of key meanings)

In order to collect keywords for your website, there are many assistants that make life easier for a businessman. There are paid ones, which are needed if the site is huge or there are many of them, and free options, suitable for a small portal.

In this article we will look at the following resources:

  • Key Collector (paid);
  • SlovoEB (free);
  • Wordstat from Yandex (free);
  • AdWords from Google (free).

Key Collector

This is a paid tool with many features that automates the operations needed to build a semantic core. You can, of course, use free analogues, but then you will have to combine several services at once, because this program's capabilities are almost unlimited. It is practically irreplaceable if you own more than one site, if a large site needs semantic content, or if you simply prefer to keep everything in one program rather than hunt for third-party resources.

It offers the following features:


This is what KeyCollector looks like

SlovoEB

This service is free and is made by the same developers who created Key Collector. To use the program, you need to provide the login of an additional Yandex Direct account: Yandex may block an account for automated requests, so you should not use your main one.

The resource offers the following features:

  • Collecting keywords via Wordstat;
  • Filtering queries by frequency;
  • Parsing search suggestions.

SlovoEB interface
How does the program work? First, create a new project. Then select “add phrases” and enter the phrases customers use when searching for information about a particular product.


Adding a search phrase to the program

In the “Collection of keywords and statistics” menu, select the required element and start the service. For example, if you need to collect key phrases, then choose this option.


Determining the frequency of key phrases

Wordstat (Yandex service)

This is a free resource for selecting and analyzing search phrases. It is necessary if you are ready to analyze and classify queries manually. The service offers the following options:
  • Displaying impression and query statistics for a keyword or search phrase, for both general and mobile data (that is, you can see how popular the query is on mobile devices);
  • Showing statistics by region;
  • Displaying how popular a specific query has been over time (“query history”);
  • Showing a phrase or query only in the specified form (put the phrase in quotation marks);
  • Showing statistics without a stop word (put a minus sign in front of the word to exclude it);
  • Showing data with a specific preposition included (put “+” in front of it);
  • Displaying information for a group of queries at once (put the query groups in parentheses and separate the key variants with a vertical bar “|”: for example, to get data at once for “order a cake”, “buy a cake”, “order a cupcake”, “buy a cupcake”, “order a pie” and “buy a pie”, follow the picture below and the operator sketch after this list);
  • Displaying data for queries tied to specific regions.
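
Here is a short sketch of these operators as query strings, based on the descriptions above (the cake-related phrases are just illustrations):

```python
# Yandex Wordstat operator examples, mirroring the list above
# (the cake-related phrases are illustrative):
wordstat_queries = [
    '"buy a cake"',                    # quotes: only the phrase in this exact form
    "cake -recipe",                    # minus: exclude a stop word from the stats
    "cakes +to order",                 # plus: force the preposition to be counted
    "(order|buy) (cake|cupcake|pie)",  # groups and "|": six combinations at once
]
for q in wordstat_queries:
    print(q)
```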


Request for “cupcakes”, general statistics


Key data by region


Here you can see when the request was most or least popular


Show the phrase in the specified form


Information for the key without word forms


Statistics without taking into account the stop word


Data for six requests at once - a convenient thing if you need to quickly get information


If you select a specific region, you can see what is popular there

Google AdWords (Google Keyword Planner)

If Google is a significant leader in a particular region, then it is better to use this service. It is designed precisely to calculate the needs of users of this search engine. The service is free, but there are paid services (for example, for advertisements).

The tool offers the following features:

  • Collection of information on search queries;
  • Development of new combinations of queries and forecast of their relevance and dynamics.

To get statistics on specific queries, select this option on the tool's main page. Enter the phrases of interest or upload them as a CSV file, then select the region for which you need statistics; you can also specify stop words (as described for Wordstat). Everything is ready - press the “Find out the number of requests” button.
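
If you then export the collected statistics to a CSV file, further processing might look roughly like this; the file name and column names are assumptions and must be adjusted to the actual export format:

```python
import csv

# Read exported keyword statistics; the file name and column names
# ("keyword", "monthly_searches") are assumptions - adjust them to your export.
with open("keyword_stats.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# Keep only the queries with a noticeable number of monthly searches.
popular = [row for row in rows if int(row["monthly_searches"]) >= 100]
for row in popular:
    print(row["keyword"], row["monthly_searches"])
```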


Information on queries from Google

Analytics services

You can also use Google Analytics or Metrica analytics systems if you need to build a semantic core for an existing resource. These tools help you identify what search phrases your customers are entering.


You can also find inspiration for creating keywords here

In addition, data on the phrases clients commonly use to search for certain information can be checked in the Yandex and Google webmaster accounts. For Google, the data is in the Search Console under “Search Traffic - Search Query Analysis”.

In Yandex.Webmaster, use the “Search queries - Popular queries” section.

Tools that allow you to analyze competitors' websites

Competing sites are another place to look for keyword inspiration. To identify their keys, it makes sense to read their publications or check the “keywords” meta tag in the page's HTML source code. Services such as Advego and Istio can also help.


Istio interface

If you want to analyze the entire competitor’s portal, you can use the following tools:

Now in more detail on each point.

To determine the main keys, they need to be written down - either on a piece of paper or in a computer program. You will need the ideas of all your colleagues, and every one of them should be written down without exception: any of them may turn out to be the very Holy Grail that attracts clients to you.

The list might look something like this:


Sample list of phrases to search for

In this list, almost all the keys are high-frequency, without any specifics. Phrases with medium and low frequency will allow you to expand the core to the maximum. So let's move on to the next stage.

This task is solved with a keyword tool. For example, you can use the Yandex service - one of the most convenient, despite its apparent initial complexity. There you can also restrict the data to a specific region if you offer a product or service in a particular geographical area.

So, at this stage we analyze all the keys compiled by our colleagues.


Main Query Analysis

Copy the phrases from the left column of the service and paste them into the table. Then focus on the right column of the assistant - here Yandex offers phrases that visitors used along with the main one. In one click you can pick the suitable keys and add them to your list. Don't worry if some of them don't fit - those phrases will be eliminated at the final stage. And it is already close, like winter in Game of Thrones.

The result of this phase is a compiled list of search phrases for each main key. At this stage there may be hundreds or even thousands of different queries.
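
A small sketch of this collection step - merging the phrases gathered for each main key into one deduplicated list (the phrases themselves are placeholders):

```python
# Merge the phrases collected for each main key into one list and remove
# duplicates while preserving order (all phrases here are placeholders).
collected = {
    "cakes to order": ["cakes to order Moscow", "cakes to order with delivery"],
    "buy a cake": ["buy a cake with delivery", "cakes to order Moscow"],
}

all_phrases = []
seen = set()
for main_key, phrases in collected.items():
    for phrase in [main_key, *phrases]:
        if phrase not in seen:
            seen.add(phrase)
            all_phrases.append(phrase)

print(len(all_phrases), "unique phrases:", all_phrases)
```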


List of phrases

Let's move on to the final stage. However easy it may seem, it is not: this is the most time-consuming and complex part of working with the core. You have to manually exclude from the semantic core everything that does not fit its meaning.

But you should not remove low-frequency keys - under no circumstances. “Old-school” optimizers may continue to consider such keys garbage, but don't fall for this trick. Example: for the key “diet cake”, the service shows 3 impressions per month in the Cherepovets region. The “SEO” method would throw such keys away. Below you will see why you shouldn't - and I hope you will keep applying this advice.

SEO specialists, in order to get their pages to the top of search engines, purchased or rented links, and those links had to use certain keys. The method is still in use today. And they can be understood: phrases with a low display frequency, as a rule, do not recoup the money spent on the link.

But if you look at “diet cakes” through the eyes not of an old-school SEO specialist but of a client-oriented businessman, additional opportunities open up. Some potential clients are genuinely interested in this - not least girls who watch their figure. So we know for sure that this query interests someone, and it can therefore be included in the semantic core with a clear conscience. If the confectioners in your company make such a product, the key will come in handy where the products are described; if not, this content can be saved for the information section of the portal.


“diet cake”, which may be considered garbage, is actually not so

What then should be excluded? Let's figure it out:

  • Firstly, these are phrases where other brands are present;
  • Secondly, repeating phrases - for example, from 3 keys “cakes to order New Year”, “cakes to order new”, “cake to order New Year”, the first key will be enough;
  • Thirdly, if you are not involved in such a thing as “dumping”, then, accordingly, keywords using the words “cheap” and “inexpensive” will certainly not be useful to you;
  • Fourthly, keys with inappropriate regions - if you trade only in Cherepovets, but do not deliver to nearby villages or do not serve a certain district of the city, this data is not needed;
  • Fifthly, keys with links to products that you know for sure that you are not going to sell and, accordingly, do not sell;
  • And sixthly, you certainly won't need phrases spelled incorrectly - whether grammatical errors or typos - since the search engine itself will help a visitor who typed "gbhj;yst" instead of "cakes" or misspelled "cupcakes".

Voila: once you have removed all the keys that do not suit you, you are left with the “cakes to order” keys you need. The same must be done for every other main key. The next stage is classifying the phrases into types.
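
As a minimal sketch, the exclusion rules above can be expressed as a simple filter; all the stop lists here are hypothetical examples and must reflect your own brands, regions and assortment:

```python
# Sift out keys that do not fit, following the exclusion rules above.
# The stop lists are hypothetical examples - use your own brands,
# price words and regions.
OTHER_BRANDS = ("dobryninsky", "vienna workshop")
PRICE_WORDS = ("cheap", "inexpensive")
WRONG_REGIONS = ("moscow",)  # example: you trade only in Cherepovets

def keep(phrase: str) -> bool:
    p = phrase.lower()
    if any(b in p for b in OTHER_BRANDS):   # other companies' brands
        return False
    if any(w in p for w in PRICE_WORDS):    # "dumping" queries you do not target
        return False
    if any(r in p for r in WRONG_REGIONS):  # regions you do not serve
        return False
    return True

phrases = ["cakes to order", "buy a cake cheap", "cakes to order moscow"]
print([p for p in phrases if keep(p)])  # ['cakes to order']
```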

Construction of a correspondence (relevance) map and classification of key phrases

Search phrases that the target audience will use to find the data that leads to your site are grouped into so-called “semantic (meaning) clusters” - categories of queries that are similar in meaning. The “cake” cluster, for instance, includes all phrases directly or indirectly associated with this word; here this unit of language acts as the “particular”, and the phrases around it as the “general”. This is what you can see in the picture below.

Please note that there are also clusters of the second, third and fourth levels. The broader the topic, the more levels the cluster has, although in practice second-level clusters usually turn out to be sufficient.


Cluster levels

Most of the clusters were identified at the very first stage of creating keywords. Naturally, for this you just need to understand the presented topic, because without knowing anything about cakes, it is unlikely that you can create a competent semantic core. The compiled site diagram will also serve as an assistant for creating a cluster.

Second-level clustering is very important. Here specifiers are added that indicate the clients' goals - for example, “buy cakes” or “history of the creation of the Napoleon cake”. The latter cluster goes into the information section, the former into the catalog.
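
A rough sketch of this grouping: first-level clusters by the body term and second-level clusters by the specifier (the body and specifier lists, and the phrases, are illustrative assumptions):

```python
from collections import defaultdict

# Group phrases into clusters: first level by body term, second level by
# specifier. The body and specifier lists are illustrative assumptions;
# longer body terms come first so "cupcake" is not caught by "cake".
BODIES = ["cupcake", "cake", "marshmallow"]
SPECIFIERS = ["buy", "order", "recipe", "history"]

def cluster(phrases):
    clusters = defaultdict(lambda: defaultdict(list))
    for phrase in phrases:
        p = phrase.lower()
        body = next((b for b in BODIES if b in p), "other")
        spec = next((s for s in SPECIFIERS if s in p), "general")
        clusters[body][spec].append(phrase)
    return clusters

phrases = ["buy cakes", "history of the creation of the Napoleon cake", "cupcake recipe"]
for body, by_spec in cluster(phrases).items():
    print(body, dict(by_spec))
```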

Now we return again to the hierarchical diagram of the web page and the table developed on its basis. “Cakes to order” was identified using the Yandex service and was subsequently not excluded from the list. Now this key should be distributed between the pages of the corresponding section.


This way you can distribute the search phrase on your site

Let’s take this example: in the cluster there are search phrases for “custom cakes with a football theme.”


Football cakes turn out to be of interest to users

And if a confectionery produces this type of product, then we know in which section this page will be located. It should be placed in “Mastic Cakes”, because this material is used to create such a confectionery product. This means that here we create the corresponding page. We include it in the design of the Internet resource, indicating the URL and search phrases with frequency.


Creating a page in the appropriate section

You can use the same tool that helps you choose the right keys to see what else users are asking for regarding football topics. These phrases should also be added to this page.


Let's figure out what else clients are interested in regarding football and cake

We mark these keys as used and distribute the remaining search keys in the same way.

The diagram that was drawn at the very beginning can be changed an unlimited number of times - if necessary, you can create new categories and sections. So, if the “Children’s Cakes” page did not exist before, then, remembering that the company can make custom cakes with the cartoons “Peppa Pig” or “Paw Patrol,” you can make changes and create such a page. At the same time, these keys may also be located in the “Mastic Cakes” section.


Creating a new section in the hierarchical table of the site “Children's Cakes”

There are two important points to keep in mind:

  • The cluster may not contain a suitable phrase for the page you want to create. The reasons may be incorrect use of a keyword, shortcomings in the keyword-selection services, or simply the low popularity of the product or service. But this is not a reason to abandon the page or stop selling the product. For example, if you did not find the query “Peppa Pig cake” in the search engine, but the confectionery can make such a product, you can check customer demand through another service. In most cases such a query will then be found;


People are also searching for Peppa Pig
  • After eliminating unnecessary keys, some queries may remain that do not quite fit. They can either be removed or assigned to another cluster. Say a confectionery specializes in unique recipes, and you think time-tested cakes like “count's ruins” or “Napoleon” are better left in the past - such keys can still be placed in the section where the user gets general information, in this case “Recipes”.


The key phrase can also be placed in the information section if it is very popular among visitors

So, at the final stage, having dispersed all the keys across the pages, you receive a list of the portal’s web pages, where the URLs, queries and their frequency are indicated. Let's move on, that's not all.

The final stage of enriching the semantic core

So now we have everything we need. We have a table with a semantic core, a list of preliminary web pages and key phrases that define the needs of certain clients. All this will help in drawing up a plan for the information content of texts (content plan). Now, when composing it, you will need to specify the name of the web page or article, and include in it the main query for the search engine. But it should be borne in mind that this does not always have to be the most common key from the point of view of Yandex or Google. It should reflect what you want to convey to users and what customers want to receive.

Other key phrases should be used as an answer to the question “What should I write about?”. Of course, you should not immediately cram every phrase found with a query-selection tool into one section or another - be it an informational page or an offer to buy a certain service or product. It should be repeated once more at the very end: pay attention first of all to the information needs of users, not to key phrases and “stuffing” the text with them like pills. The user always notices when something is being “fed” to them; if the text is composed correctly, it will not even occur to them that keywords were used.

Finally, what should not be done with the semantic core?

I hope that you have no questions left about what has already been said and could now build a dozen websites on this knowledge. But it is still worth listing some things you should not do. Later you will understand this intuitively, but for now learn them by heart. Here are some tips that will help you compile an online resource like a professional:
  • You should not refuse keys that have too much competition. Yes, you don’t really need to get to the very top for the search queries “order marshmallows”. Just use the phrase as a content idea;
  • Also, you should not get rid of phrases with low frequency - these are the very content ideas with which you will most likely be able to satisfy those who were able to find similar services even from the largest companies;
  • Do not use formulas and coefficients (like KEI, the ratio of popularity to competition) to evaluate keywords. Let's be clear once again: semantics is a branch of linguistics. It is not an exact science like physics or mathematics; it is closer to art than to precise research, and when forced to conform to a formula or coefficient, semantics loses its zest. You would lose many content ideas that a program would exclude - but it is not the program that will read the text afterwards;
  • You should not create a separate page for a single key. Everyone has probably come across online stores with separate pages for “buy a cake” and “order a cake”. The semantic core is lost here, because in essence these describe the same action. Likewise, “buy cheap” and “buy inexpensively” are synonyms, so there is no point filling separate pages with useless content;
  • There is no need to completely automate the construction of the semantic core. Of course, you used special tools to collect key phrases, and for huge projects such tools are simply irreplaceable - especially Key Collector. But without human analysis the value of the list of keys is low. This is no great secret - even those who rely on old-school knowledge know it. Services only make our lives easier by collecting information that would otherwise require long and painful filtering; they cannot write the text itself. More precisely, such programs exist, but their value to people is small and they are intended for a completely different purpose - not for reading by the user. Only someone who actually understands the field can really gauge the degree of competition, draw up a plan for an information campaign, or analyze the situation in the niche. All three tasks are indirectly linked to designing the structure of the web resource and distributing the keywords;
  • Don't overdo it - there is no need to obsess over collecting key phrases. When just starting a business, there is little point in thoroughly spying on competitors, collecting as many keywords as possible from every available search engine down to HotBot, and digging through search suggestions. It is enough to use one or at most two resources, and these can be Yandex or Google. Or Rambler or Mail.ru at worst, if that particular search engine is popular in your region; Tut.by if you are specifically interested in the Belarusian region, or uaportal.com for Ukraine. But they are used only for a link to the region: if, for example, residents of Belarus are interested in “cakes with Ksenia Sitnik”, this will say nothing at all to a resident of Russia. So do not over-saturate your site with keys either.

You need to remember what you are building the core for - and that it is a semantic one.

So is it marketing or SEO?

It cannot be said that one contradicts the other. A marketer can be a good SEO, and vice versa. It is just that correctly forming the semantic core of a website requires, first of all, the logic of a businessman and marketer (customer focus), and only then the skills of an SEO specialist (correct placement of keywords). You need to understand what you, as a businessman, can offer a potential consumer. Next, you need to understand how clients search for and find the data they need - and the tools described above will help with this. Analyze, filter out the unnecessary, find keys that fit the meaning, classify them, and distribute them ergonomically across the entire site structure. And voila - the moment has come when you can start creating a content plan.

Before starting SEO promotion, you need to create the semantic core of the site - a list of search queries that potential clients use when looking for the goods or services offered. All further work - internal optimization and work with external factors (link purchasing) - is carried out in accordance with the list of queries defined at this stage.

The final cost of promotion and even the expected level of conversion (number of calls to the company) also depend on the correct collection of the core.

The more companies promote using the chosen word, the higher the competition and, accordingly, the cost of promotion.

Also, when choosing the list of queries, you should not rely only on your own ideas about what words your potential clients use, but also trust the opinion of professionals: not all expensive and popular queries convert well, and promoting some words directly related to your business may simply be unprofitable, even if an ideal result in the form of TOP-1 is achievable.

A correctly formed semantic core, all other things being equal, ensures that the site is confidently positioned in the top positions of search results for a wide range of queries.

Principles for compiling semantics

Search queries are formed by people - potential site visitors, based on their goals. It is difficult to keep up with the mathematical methods of statistical analysis embedded in the algorithm of search engine robots, especially since they are continuously refined, improved, and therefore changed.

The most effective way to cover the maximum number of possible queries when forming the initial core of a site is to look at it as if from the position of a person making a search request.

The search engine was created to help a person quickly find the most suitable source of information for a search query. The search engine is focused, first of all, on a quick way to narrow down to several dozen the most suitable answer options for the key phrase (word) of the request.

When forming a list of these keywords, which will be the basis of the site’s semantics, the circle of its potential visitors is actually determined.

Stages of collecting the semantic core:

  • First, a list of the main key phrases and words found in the information field of the site and characterizing its target orientation is compiled. In this case, you can use the latest statistical information about the frequency of requests in the direction in question from the search engine. In addition to the main variants of words and phrases, it is also necessary to write down their synonyms and variants of other names: washing powder - detergent. The Yandex Wordstat service is perfect for this work.

  • You can also write down the components of the name of any product or subject of the query. Very often queries include words with typos or misspellings, simply because a large share of Internet users are not very literate. Taking this into account can also attract additional visitors, especially when new names appear.
  • The most common queries, also called high-frequency queries, rarely lead a person to the site they are looking for. Low-frequency queries - queries with a clarification - work better. For example, the query “ring” will return one kind of top results, while “piston ring” will give much more specific information. When collecting, it is better to focus on such queries: this attracts target visitors, for example potential buyers if the site is commercial.
  • When compiling the list of keywords, it is also worth taking into account widespread slang - “folk” names that have become generally accepted and stable for certain objects, concepts or services, for example, cell phone - mobile phone - mobile. Accounting for such forms can in some cases noticeably increase the target audience.
  • In general, when compiling a list of keys, it is better to initially focus specifically on the target audience, that is, those website visitors for whom the product or service is intended. The core should not contain a little-known name of an item (product, service) as the main option, even if it needs to be promoted. Such words will be found extremely rarely in queries. It is better to use them with clarifications or use more popular similar names or analogues.
  • When the semantics are ready, they should be passed through a series of filters to remove junk keywords - those that bring the wrong audience to the site.

Taking into account the semantics of associated queries

  • To the initial list of SEO core, compiled from the main keys, you should add a number of auxiliary low-frequency ones, which may include important but not taken into account words that did not come to mind when compiling it. The search engine itself will help you with this. When you repeatedly type key phrases from the list on a topic, the search engine itself offers for consideration options for frequently occurring phrases in this area.
  • For example, if the phrase “computer repair” is entered and then a second query such as “matrix”, the search engine treats them as associated, that is, related in meaning, and helpfully suggests other frequent queries from this area. Such key phrases can be used to expand the original semantics.
  • Knowing a few main words from the core of the text, you can significantly expand it with associated phrases via a search engine. If the search engine does not produce enough such additional keys, you can obtain them using a thesaurus - a set of concepts (terms) for a specific subject from the same conceptual area. Dictionaries and reference books can help here.

Logical scheme for selecting semantics for a site

Formation of a list of requests and their final editing

  • The key phrases that make up the semantics generated in the first two steps require filtering. Among them there may be useless phrases that only make the core heavier without bringing any tangible benefit in attracting the target audience. Phrases obtained by analyzing the site's target orientation and expanded with associated keys are called masks. This is an important list that makes the site visible: in response to a query, the search engine will also show this site in its list of results.
  • Now you need to create lists of search queries for each mask. To do this, you will need to use the search engine that this site is oriented to, for example, Yandex, Rambler, Google, or others. The created list for each mask is subject to further editing and cleaning. This work is carried out based on clarification of the information posted on the site, as well as the actual search engine ratings.
  • Cleaning consists of removing unnecessary, uninformative and harmful requests. For example, if the list of a building materials website includes phrases with the words “course work,” then it should be removed, since it is unlikely to expand the target audience. After cleaning and final editing, you will get a version of actually working key queries, the content for which will be in the zone of so-called visibility for search engines. In this case, the search engine will be able to show the desired page from the semantic core using internal links of the site.

Summarizing all of the above, we can briefly say that the semantics of a site is determined by the total number of search engine query formulations used and their total frequency in the statistics of hits for a specific query.

All work on the formation and editing of semantics can be reduced to the following:

  1. analysis of the information posted on the site, the goals pursued by the creation of this site;
  2. compiling a general list of possible phrases based on site analysis;
  3. generating an extended version of keywords using associated queries (masks);
  4. generating a list of query options for each mask;
  5. editing (cleaning) the list to exclude unimportant phrases.

From this article you learned what the semantic core of a website is and how it should be compiled.

The semantic core is a scary name that SEOs came up with to denote a rather simple thing. We just need to select the key queries for which we will promote our site.

And in this article I will show you how to correctly compose a semantic core so that your site quickly reaches the TOP, and does not stagnate for months. There are also “secrets” here.

And before we move on to compiling the semantic core, let's figure out what it is and what we should end up with.

What is the semantic core in simple words

Oddly enough, the semantic core is just a regular Excel file containing a list of key queries for which you (or your copywriter) will write articles for the site.

For example, this is what my semantic core looks like:

I have marked in green those key queries for which I have already written articles. Yellow - those for which I plan to write articles in the near future. And colorless cells mean that these requests will come a little later.

For each key query I have determined the frequency and competitiveness and come up with a “catchy” title. You should end up with roughly the same kind of file. Right now my semantic core consists of 150 keywords, which means I have “material” for at least 5 months in advance, even if I write one article a day.

Below we will talk about what to expect if you decide to order the collection of a semantic core from specialists. I will say briefly here: they will give you the same kind of list, only with thousands of “keys”. However, in a semantic core it is quality that matters, not quantity, and that is what we will focus on.

Why do we need a semantic core at all?

But really, why do we need this torment? You can, after all, just write quality articles and attract an audience, right? Yes, you can write, but you won’t be able to attract people.

The main mistake of 90% of bloggers is simply writing high-quality articles. I'm not kidding, they have really interesting and useful materials. But search engines don’t know about it. They are not psychics, but just robots. Accordingly, they do not rank your article in the TOP.

There is another subtle point with the title. For example, you have a very high-quality article on the topic “How to properly conduct business in a face book”. There you describe everything about Facebook in great detail and professionally, including how to promote communities there. Your article is the highest-quality, most useful and most interesting on the Internet on this topic; nothing else even comes close. But it still won't help you.

Why high-quality articles fall out of the TOP

Imagine that your site was visited not by a robot but by a live inspector (assessor) from Yandex. He realized that you have the coolest article and manually put you in first place in the search results for the query “Promoting a community on Facebook”.

Do you know what will happen next? You will fly out of there very soon anyway. Because no one will click on your article, even in first place. People enter the query “Promoting a community on Facebook,” and your headline is “How to properly run a business in a face book.” Original, fresh, funny, but... not on request. People want to see exactly what they were looking for, not your creativity.

Accordingly, your article will vacate its place in the TOP of the search results. And a living assessor, an ardent admirer of your work, can beg the management as much as he likes to leave you at least in the TOP 10 - it won't help. All the first places will be taken by empty articles, like sunflower seed husks, that yesterday's schoolchildren copied from each other.

But those articles will have the correct “relevant” title - “Promoting a community on Facebook from scratch” (step by step, in 5 steps, from A to Z, free, etc.). Is it annoying? Of course it is. Well then, let's fight the injustice: create a competent semantic core so that your articles take the first places they deserve.

Another reason to start compiling your semantic core right now

There is one more thing that for some reason people don’t think much about. You need to write articles often - at least every week, and preferably 2-3 times a week, in order to gain more traffic and quickly.

Everyone knows this, but almost no one does it. And all because they have “creative stagnation”, “they just can’t force themselves”, “they’re just lazy”. But in fact, the whole problem lies in the absence of a specific semantic core.

To collect the basic keys (step #1), I entered one of my broad keys, “smm”, into the search field, and Yandex immediately gave me a dozen hints about what else might interest people who are interested in “smm”. All I have to do is copy these keys into a notebook. Then I will check each of them in the same way and collect hints on them as well.

After the first stage of collecting key words, you should end up with a text document containing 10-30 broad basic keys, which we will work with further.

Step #2 — Parsing basic keys in SlovoEB

Of course, if you write an article for the request “webinar” or “smm”, then a miracle will not happen. You will never be able to reach the TOP for such a broad request. We need to break the basic key into many small queries on this topic. And we will do this using a special program.

I use KeyCollector, but it's paid. You can use a free analogue - the SlovoEB program. You can download it from the official website.

The most difficult thing about working with this program is setting it up correctly. I show how to set up and use SlovoEB properly in a separate article, but there I focus on selecting keys for Yandex Direct.

And here let’s look step by step at the features of using this program for creating a semantic core for SEO.

First, we create a new project and name it by the broad key that you want to parse.

I usually give the project the same name as my base key to avoid confusion later. And yes, I will warn you against one more mistake. Don't try to parse all base keys at once. Then it will be very difficult for you to filter out “empty” key queries from golden grains. Let's parse one key at a time.

After creating the project, we carry out the basic operation - we parse the key through Yandex Wordstat. To do this, click the “Wordstat” button in the program interface, enter your base key, and click “Start collection”.

For example, let's parse the base key for my blog “contextual advertising”.

After this, the process will start, and after some time the program will give us the result - up to 2000 key queries that contain “contextual advertising”.

Also, next to each request there will be a “dirty” frequency - how many times this key (+ its word forms and tails) was searched per month through Yandex. But I do not advise drawing any conclusions from these numbers.

Step #3 - Collecting the exact frequency for the keys

Dirty frequency will not show us anything. If you focus on it, then don’t be surprised when your key for 1000 requests does not bring a single click per month.

We need to determine the exact frequency. To do this, first select all the found keys with checkmarks, then click the “Yandex Direct” button and start the process again. Now SlovoEB will look up the exact monthly frequency for each key.

Now we have an objective picture - how many times what query was entered by Internet users over the past month. I now propose to group all key queries by frequency to make it easier to work with them.

To do this, click the “filter” icon in the exact-frequency column and set the filter to keys with a value “less than or equal to 10”.

Now the program will show only those queries whose frequency is less than or equal to 10. You can delete these queries or copy them to another group of key queries for later. Less than 10 is very little - writing articles for such queries is a waste of time.
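
If you export the collected keys with their exact frequencies (for example to CSV), the same split can be reproduced outside the program; the file name and column name below are assumptions about your own export:

```python
import csv

# Split exported keys into "too rare" and "worth keeping" by exact frequency.
# The file name and the "exact_frequency" column are assumptions about your export.
with open("keys_exact_frequency.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

too_rare = [r for r in rows if int(r["exact_frequency"]) <= 10]
worth_keeping = [r for r in rows if int(r["exact_frequency"]) > 10]
print(len(worth_keeping), "keys left,", len(too_rare), "set aside")
```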

Now we need to select those key queries that will bring us more or less good traffic. And for this we need to find out one more parameter - the level of competitiveness of the request.

Step #4 — Checking the competitiveness of requests

All “keys” in this world are divided into 3 types by frequency: high-frequency (HF), mid-frequency (MF) and low-frequency (LF). They can also be highly competitive (HC), moderately competitive (MC) and low-competitive (LC).

As a rule, HF queries are also HC. That is, if a query is searched often, there are a lot of sites wanting to rank for it. But this is not always the case; there are happy exceptions.

The art of compiling a semantic core lies precisely in finding queries that have a high frequency and a low level of competition. It is very difficult to manually determine the level of competition.

You can look at indicators such as the number of home pages in the TOP 10, the length and quality of the texts, and the trust and citation index of the sites in the top results for the query. All of this gives some idea of how tough the competition for rankings is for this particular query.

But I recommend you use Mutagen service. It takes into account all the parameters that I mentioned above, plus a dozen more that neither you nor I have probably even heard of. After analysis, the service gives an exact value - what level of competition this request has.

Here I checked the query “setting up contextual advertising in google adwords”. Mutagen showed us that this key has a competitiveness of "more than 25" - this is the maximum value it shows. And this query has only 11 views per month. So it definitely doesn’t suit us.

We can copy all the keys that we found in Slovoeb and do a mass check in Mutagen. After that, all we have to do is look through the list and take those requests that have a lot of requests and a low level of competition.

Mutagen is a paid service, but you can do 10 checks per day for free. Besides, the cost of a check is very low - in all the time I have been working with it, I have not yet spent even 300 rubles.

By the way, about the level of competition. If you have a young site, then it is better to choose queries with a competition level of 3-5. And if you have been promoting for more than a year, then you can take 10-15.

By the way, regarding the frequency of requests. We now need to take the final step, which will allow you to attract a lot of traffic even for low-frequency queries.

Step #5 — Collecting “tails” for the selected keys

As has been proven and tested many times, your site will receive the bulk of traffic not from the main keywords, but from the so-called “tails”. This is when a person enters strange key queries into the search bar, with a frequency of 1-2 per month, but there are a lot of such queries.

To see the “tail”, just go to Yandex and enter the key query of your choice into the search bar. Here's roughly what you'll see.

Now you just need to write down these additional words in a separate document and use them in your article. Moreover, there is no need to always place them next to the main key. Otherwise, search engines will see “over-optimization” and your articles will fall in search results.

Just use them in different places in your article, and then you will receive additional traffic from them as well. I would also recommend that you try to use as many word forms and synonyms as possible for your main key query.

For example, we have a request - “Setting up contextual advertising”. Here's how to reformulate it:

  • Setup = set up, make, create, run, launch, enable, place...
  • Contextual advertising = context, Direct, teaser, YAN, AdWords, KMS...

You never know exactly how people will search for information. Add all these additional words to your semantic core and use them when writing texts.
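
A quick sketch of expanding one query into variants from such synonym lists (the lists below repeat only part of the examples above and are purely illustrative):

```python
from itertools import product

# Expand "setting up contextual advertising" into variants using short
# synonym lists (illustrative, not exhaustive).
setup_synonyms = ["setting up", "creating", "launching", "enabling"]
context_synonyms = ["contextual advertising", "Yandex Direct", "Google AdWords"]

variants = [f"{a} {b}" for a, b in product(setup_synonyms, context_synonyms)]
print(len(variants), "variants")  # 12
for v in variants[:3]:
    print(v)
```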

So, we collect a list of 100 - 150 key queries. If you are creating a semantic core for the first time, it may take you several weeks.

Or maybe you shouldn't strain your eyes over it? Maybe you can delegate the compilation of the semantic core to specialists who will do it better and faster? There are such specialists, but you don't always need their services.

Is it worth ordering a semantic core from specialists?

By and large, specialists in compiling a semantic core will only give you steps 1 - 3 from our scheme. Sometimes, for a large additional fee, they will do steps 4-5 - (collecting tails and checking the competitiveness of requests).

After that, they will give you several thousand key queries that you will need to work with further.

And the question here is whether you are going to write the articles yourself, or hire copywriters for this. If you want to focus on quality rather than quantity, then you need to write it yourself. But then it won't be enough for you to just get a list of keys. You will need to choose topics that you understand well enough to write a quality article.

And here the question arises: why then do we actually need semantic-core specialists? Agree, parsing the base key and collecting exact frequencies (steps #1-3) is not difficult at all - it will literally take you half an hour.

The most difficult thing is to choose HF queries that have low competition. And then, as it turns out, you need those HF-LC queries on which you can write a good article. That is exactly what will take 99% of your time working on the semantic core - and no specialist will do it for you. So is it worth spending money on such services?

When are the services of semantic-core specialists useful?

It’s another matter if you initially plan to attract copywriters. Then you don't have to understand the subject of the request. Your copywriters won’t understand it either. They will simply take several articles on this topic and compile “their” text from them.

Such articles will be empty, miserable, almost useless. But there will be many of them. On your own, you can write a maximum of 2-3 quality articles per week. And an army of copywriters will provide you with 2-3 shitty texts a day. At the same time, they will be optimized for requests, which means they will attract some traffic.

In this case, yes, calmly hire semantic-core specialists, and let them draw up the technical specifications for the copywriters at the same time. But, as you understand, this will also cost money.

Summary

Let's go over the main ideas in the article again to reinforce the information.

  • The semantic core is simply a list of key queries for which you will write articles on the site for promotion.
  • It is necessary to optimize texts for precise key queries, otherwise even your highest-quality articles will never reach the TOP.
  • The semantic core is like a content plan for social networks. It helps you avoid falling into a “creative crisis” and always know exactly what you will write about tomorrow, the day after tomorrow and in a month.
  • To compile a semantic core, it is convenient to use the free program Slovoeb - it is all you really need.
  • Here are the five steps of compiling a semantic core: 1 - selection of basic keys; 2 - parsing of basic keys; 3 - collection of exact frequencies for the queries; 4 - checking the competitiveness of the keys; 5 - collection of “tails”.
  • If you want to write the articles yourself, it is better to compile the semantic core yourself, for yourself. Semantic core specialists will not really help you here.
  • If you want to work on quantity and use copywriters to write the articles, then you can quite reasonably delegate the compilation of the semantic core as well - as long as the budget stretches to everything.

I hope this instruction was useful to you. Save it to your favorites so as not to lose it, and share it with your friends. Don't forget to download my book. There I show you the fastest way from zero to the first million on the Internet (a summary from personal experience over 10 years =)

See you later!

Yours Dmitry Novoselov

Greetings, dear reader of the web-revenue blog!

Today I decided to tell you about the basics of SEO promotion, namely compiling the semantic core of a site.

The semantic core is a library of search words or phrases, and their morphological forms, that most accurately characterize the activities of the site and the goods or services it offers. Roughly speaking, compiling a semantic core means compiling a structured list of the target queries for which the site is planned to be promoted!

Why is the semantic core of a website created?

1. The semantic core forms the theme of the site, which search engines will take into account.

2. A correctly formed semantic core is the basis for the optimal structure of a web resource.

3. It links each page of the site to a specific part of the semantic core (specific keywords).

4. It forms a limited set of keywords so that promotion resources can be allocated rationally across specific queries.

5. It allows you to estimate the cost of promoting the website in search engines.

Basic Concepts

Before we begin compiling a semantic core, let’s look at a few basic concepts.

1. All queries that users enter into search engines can be divided into:

High frequency (HF)

Mid frequency (MF)

Low frequency (LF)

How do you find out which group a given query belongs to, you ask? In general, there are no strict boundaries separating high-frequency queries from mid-frequency ones, and mid-frequency from low-frequency ones. Much depends on the theme of the site. If we take average values, we will consider queries entered up to 450-700 times a month low frequency; up to 1.2-2 thousand times a month mid frequency; and over 2 thousand times a month high frequency.
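If you keep your queries in a spreadsheet or script, these rough thresholds translate into a small helper like the one below (the cut-offs are the average figures just mentioned and should be adjusted for your own topic):

```python
# Sort queries into LF / MF / HF bands using the rough thresholds above.
def frequency_band(shows_per_month: int) -> str:
    if shows_per_month <= 700:        # up to ~450-700 a month - low frequency
        return "LF"
    if shows_per_month <= 2000:       # up to ~1.2-2 thousand a month - mid frequency
        return "MF"
    return "HF"                        # over ~2 thousand a month - high frequency

for phrase, shows in [("cocktail recipes", 5400), ("non-alcoholic mojito recipe", 320)]:
    print(phrase, "->", frequency_band(shows))
```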

Many webmasters recommend starting to promote a site with low-frequency and mid-frequency queries. On the one hand this is correct, but there is one caveat: some low-frequency and mid-frequency queries have such high competition that promoting for them is no easier than promoting for high-frequency ones.

So, when compiling a site’s semantic core, you should not rely only on the frequency of the queries; you also need to determine how difficult it will be to compete for a given query.

Therefore, we will introduce 3 more groups of queries:

Highly competitive (HC);

Moderately competitive (MC);

Low competitive (LC);

Many people assume that high-frequency queries are always highly competitive, mid-frequency ones moderately competitive, and low-frequency ones low competitive. However, this is not always the case. Nowadays, in many niches, some low-frequency queries have become so in demand that it is better not to even try to reach the TOP with them, while it is sometimes easier to reach the top for a mid-frequency query (though this is also rare). Sometimes you should also take into account words that people often misspell (for example, Volkswagen can be typed as Volcwagen or Volswagen) or words that a person types having forgotten to change the keyboard layout - “cjplfnm uba fybvfwb” instead of “create a GIF animation”. Such mistakes can also be used to promote a website well!
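In other words, frequency and competitiveness are two independent axes, and it helps to label every query on both before deciding what to target. Below is a sketch with purely illustrative thresholds - where the competition score comes from (promotion cost in an aggregator, the number of exact-match titles in the top 10, and so on) is up to you:

```python
# Label each query on both axes: frequency band and competition band.
def frequency_band(shows: int) -> str:
    return "LF" if shows <= 700 else "MF" if shows <= 2000 else "HF"

def competition_band(score: float) -> str:
    if score < 3:
        return "LC"   # low competition
    if score < 7:
        return "MC"   # moderately competitive
    return "HC"       # highly competitive

queries = [
    {"phrase": "create a GIF animation", "shows": 900, "competition": 2},
    {"phrase": "buy an apartment in Moscow", "shows": 40000, "competition": 9},
]
for q in queries:
    # An HF query can turn out to be LC and vice versa - that is the whole point.
    print(q["phrase"], frequency_band(q["shows"]), competition_band(q["competition"]))
```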

And three more important concepts:

Primary queries are queries that characterize the resource “in general” and are the most general for the subject of the site. For example, the primary queries for my website are: website creation, website promotion, making money on a website, etc.

The main ones are queries included in the list of the semantic core, those for which promotion is advisable. For my blog: how to create a website, how to promote a website, making money on a website, etc.

Auxiliary (associative) - queries that were also typed by people entering the main queries. They are usually similar to the main queries. For example, for the query SEMANTIC CORE, internal optimization, website promotion, SEO will be associative.

I have explained the basic theory, now we can move on to the basics of compiling a semantic core:

1. If you are compiling a semantic core for your own website, first sit down and think about what your website is about and what queries a person could use to find it. Try to come up with as many keywords (and phrases) for your topic as possible and write them down in a text document. For example, if you are going to make a website about various drinks, cocktails and so on, after a little thought you might write down something like: soft drinks, cocktail recipes, making cocktails, fruit drinks, etc.

And if you are doing this for any client, then we find out from the client a list of words by which he wants to promote his site.

2. We analyze the sites of competitors from the top 10 (we look at what queries they are promoted for and receive most of the traffic)

3. We use the client’s price list (name of goods, services, etc.)

4. We try to find synonyms for keywords (hard drive - hard drive - HDD)

5. Collection of keywords that are suitable for your personal blog, Internet resource or business. Here you can use wordstat search query statistics, or it is better to use special software such as Key Collector.

6. Traffic counting for selected search queries. For this, you can also use a key collector or link aggregators: seopult or webeffector.

7. Removing dummy queries. These are search queries whose impression figures are greatly inflated or even artificially boosted. You won’t get visitors from dummy queries.

8. Removing keywords with a very high promotion budget. You can again estimate the approximate budget in SeoPult or WebEffector. At the same stage you can filter out highly competitive queries.

Then we distribute them throughout the site.

The general scheme for compiling a semantic core, then, consists of the eight steps above (a rough code sketch of the whole chain follows below).

As a result, we will receive a list of keywords for our site. That's basically the whole scheme. It is not that complicated, but it is quite labor-intensive and takes quite a lot of time. But as I wrote above, this is the basis of the site, which is worth paying close attention to.
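As a rough illustration only, here is how the filtering part of that scheme (steps 6-8) might be strung together in a script, assuming the queries collected in steps 1-5 have already been exported to a hypothetical queries.csv with the columns phrase, base_frequency, exact_frequency and budget; every threshold below is just an example:

```python
# Minimal end-to-end sketch: filter parsed queries down to the working core.
import csv

MIN_EXACT = 50          # drop keys that almost nobody types exactly
MIN_RATIO = 0.05        # exact/base ratio below this -> dummy query (step 7)
MAX_BUDGET = 3000       # promotion budget ceiling, e.g. in rubles (step 8)

with open("queries.csv", newline="", encoding="utf-8") as f:
    queries = list(csv.DictReader(f))

core = []
for q in queries:
    base = int(q["base_frequency"])
    exact = int(q["exact_frequency"])
    budget = float(q["budget"])
    if exact < MIN_EXACT:
        continue                       # too rare to bother with
    if base == 0 or exact / base < MIN_RATIO:
        continue                       # dummy query - inflated impressions
    if budget > MAX_BUDGET:
        continue                       # too expensive to compete for
    core.append(q["phrase"])

print(f"{len(core)} keys left for distribution across the site")
```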

Mistakes that are usually made when compiling a semantic core:

When selecting keywords, try to avoid the following problems:

The semantic core should not consist of overly general phrases that poorly characterize your site, nor, conversely, of phrases that are too narrow. For example, if a visitor wants to learn about creating a vertical drop-down menu in WordPress, he will type “creating a vertical drop-down menu in WordPress”, and not “creating a website”, “creating a site menu”, “web site”, etc. So, as a rule, you should cover fairly specific queries. At the same time, queries that are too narrow will not bring you enough visitors. Try to find a middle ground.

If you have little text, you shouldn’t put a lot of keywords on it. For example, this article is tailored to 3 keys! But the volume of text is quite substantial - more than 6 thousand characters. Ideally, there should be 1 key per article. But you can use the following rule: one or two keywords per 2 thousand characters of an article.
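That rule of thumb is easy to keep in front of you as a tiny helper; treat the result as a ceiling, not a target:

```python
# "One or two keywords per 2,000 characters" as an upper bound (a rule of thumb, not a law).
def max_keys_for(text_length_chars: int) -> int:
    return max(1, (text_length_chars // 2000) * 2)

print(max_keys_for(6000))  # 6 at most; the article above deliberately uses only 3
```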

When creating a site’s semantic core, people forget to take into account the misspelled variants that users accidentally type. I spoke about them above.

Well, I think that’s enough theory for now - we will continue the topic in the next article!

Hello, dear readers! Often, in personal conversations with webmasters, I see a lack of understanding of how important keywords are for search engine promotion. So I decided to write a separate post on this important topic. What the semantic core of a site is, its pros and cons, an example of a ready-made core, and tips for composing it correctly - that is the topic of this article. You will learn why correctly selected keywords are so important for a web resource in terms of SEO promotion, feel their importance and see the power that comes from using them.

Why did I write this post? Firstly, my article is intended for novice webmasters who have just begun to understand SEO promotion. And secondly, in the top 20 in Yandex there is not a single article that would reveal in detail to the search user the essence of the semantic core. Let's fix this bug. I hope the Russian search engine will take this into account. 🙂

The concept of the semantic core

What it is

Each of us knows how to use a telephone directory. Whether it is a pocket alphabet version in the form of a notebook or a large printed Talmud of all the telephones in the city - the principle of its use is very simple. We are looking for a specific last name for all the contact information of the person we are looking for. To do this, we need to know the person’s full name. Usually, already knowing a person’s last name, we can easily look at all the entries in the directory and select the one we need. That is, knowing a certain word (last name), we can get all the data on a person that is recorded in the book.

The semantics of the site work on this principle - we enter a search query in the search engine (analogous to the name for which we are looking for contacts in the directory) and receive a list of the most accurately answering documents from various web resources. These documents are individual pages of websites or blogs, the content of which is optimized for the search query we are requesting, which is called a keyword. Thus, the usual semantic core is a set of search queries (keywords) that accurately characterize the subject of a web resource and the type of its activity (usually for commercial sites that offer services or goods on the Internet).

Therefore, the semantic core performs a very important function - it allows a web resource to receive users from search engines on its pages, which are necessary to perform various tasks. For example, for a commercial site such tasks may be selling goods or services, for a news portal - advertising third-party sources using contextual or banner advertising, for a blog - advertising affiliate programs, etc. That is, the core is exactly the foundation that is necessary to receive search traffic using SEO promotion. Without the correct and high-quality set of keywords, it is not possible to get such traffic.

Therefore, if we want to use the resources and capabilities of search engines to promote our web resource, it needs to have a competent semantic core. If we don’t want to receive search traffic, if we don’t need a target audience from search, then the presence of semantics for our site is not justified. We don't need keywords.

Types of semantic core

In search engine promotion there is a main and a secondary semantic core. The main core is the set of principal keywords with whose help the site realizes its goals and objectives. For example, these could be the search queries through which potential buyers of goods or services come to a commercial website from search. That is, thanks to the queries of the main core, the site or blog fulfills its purpose. It is precisely these keywords that turn an ordinary search user into a client.

A secondary semantic core allows a web resource to solve less significant problems or receive additional traffic to attract more potential customers. In this case, the range of keywords can be not only the main subject of the site, but also related topics. Typically used in a number of commercial sites that want to convert visitors with additional articles to sell services or products on sales pages. For example, this is what many sites that provide SEO services do. On their web resources, in addition to the main selling and explanatory pages, there is an additional large set of documents that together form a blog option.

For bloggers, the main core is a list of keywords that bring them search traffic, since this is the main task when monetizing a blog with the help of a target audience from search. And a blog usually doesn’t have secondary semantics. Because whatever the blog’s task, all traffic on all pages promoted in search engines goes to solving its commercial task. But, usually, keywords on the main topic give much more commercial conversions than keywords on a non-core topic.

What you need to know when creating a semantic core

Keywords for the semantic core are selected according to various parameters (frequency, competitiveness, etc.). I wrote about this in detail in. Depending on these parameters, semantic cores are selected for commercial sites, blogs, news portals, etc. The choice of certain values ​​of these characteristics depends on three components - on the promotion strategy, on the amount of budget allocated for it and on the subject of the area of ​​activity of the site being promoted.

The promotion strategy dictates the plan for selecting keywords, focusing on the relationship of the promoted pages with the site and their quality. Firstly, the page linking scheme is important here - the number and importance of documents that will be promoted in search engines depends on its correct choice. Each web resource should have its own scheme, with the help of which there will be a better distribution of weight among the promoted documents. Secondly, you need to know the volume of content of the page being promoted (for which keywords are selected) - depending on the number of characters, one or another number of keywords is used for it.

The possibility of using competitive queries in the semantic core depends on the amount of budget. The more financial resources a webmaster can allocate to his project, the better words he can include in its semantics. It’s no secret that the highest quality promoted keywords always cost a decent amount of money to get into the top 10 for them. This means that internal optimization is not enough to promote pages for these key queries - you need to purchase external links, and you need material resources to purchase them.

First of all, the minimum keyword frequency threshold that can still bring search traffic depends on the topic. The more competitive and the narrower the topic (in terms of its popularity), the more the webmaster has to look towards low- and micro-frequency words (with a frequency below roughly 50-100, depending on the topic). For example, the topic of my blog (SEO and web analytics) is very narrow, so I cannot use higher-frequency keywords in my semantic core. That is why most of the keywords on my blog are low- and micro-frequency queries with a frequency of no more than 100. If you compare this with popular topics (in cooking, for example, low-frequency queries can reach a frequency of 1000!), it is easy to see that getting serious traffic in a small topic is very hard - you need a huge number of low-frequency (but non-competitive) keywords, which means a huge number of promoted pages.

Plan for creating a semantic core

To create a semantic core, you need to perform a number of sequential actions:

  1. Select the website topics for which keywords will be selected. Here you need to write down all the areas of activity the web resource is dedicated to. If it is an online store, list the main categories of goods (keywords are usually not selected for individual products unless the product has a specific model number rather than just a general name). If it is a blog, list all the subtopics of the general theme, as well as the titles of individual important articles. If the site provides services, the future keywords will be the names of those services, etc.
  2. Based on the selected topics, create a list of masks - the initial queries that will be used to parse our semantic core. You can learn more about the concept of masks in another post on this topic.
  3. Find out the region where your website is being promoted. Depending on the location of the resource, some keywords may have varying degrees of competition. The narrower and more precise the region, the greater its importance in terms of promotion. For example, many keywords are not as competitive in the Russia region as they are in Moscow.
  4. Parse all kinds of search queries from the Yandex and Google search engines (if necessary) using various methods (manual, in the Key Collector program, competitor analysis, etc.). The described parsing options, for which I have written detailed step-by-step guides, can be found via the links in the block of additional articles after this post.
  5. The received search queries must be analyzed and only the highest-quality ones retained. This analysis will allow you to weed out dummy words, separate competitive words from non-competitive ones and show the most effective keywords.
  6. Distribute the received keywords among the documents of the web resource. This is the final stage of creating a semantic core, in which you assign keywords to each promoted page depending on the promotion conditions. Thanks to this distribution, the webmaster can immediately see both the main query and the additional ones for a page, and also sketch the structure of the future content (a rough grouping sketch follows this list). Here the core is divided into primary and secondary (if necessary).
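As promised, here is a very naive sketch of that distribution step: grouping parsed queries into per-page clusters and picking a main query for each page by exact frequency. Real grouping (by lemmas or by top-10 overlap) is smarter; the data and the four-word heuristic below are only an illustration:

```python
# Naive per-page clustering of queries, with the main query chosen by exact frequency.
from collections import defaultdict

queries = [
    ("how to create a website yourself", 720),
    ("how to create a website for free", 540),
    ("how to promote a website in Yandex", 610),
    ("how to promote a website for free", 300),
]

pages = defaultdict(list)
for phrase, exact_freq in queries:
    key = " ".join(phrase.split()[:4])      # crude: the first four words act as the "topic"
    pages[key].append((phrase, exact_freq))

for topic, group in pages.items():
    group.sort(key=lambda item: item[1], reverse=True)
    main, *extra = group
    print(f"Page '{topic}': main query = {main[0]}, additional = {[p for p, _ in extra]}")
```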

After the full semantic core of the site has been created, in terms of promotion, it’s time to take on internal page optimization and then do effective linking.

Example of a ready-made semantic core (keyword table)

In order for you to see the finished result, I decided to post a fragment of a semantic core that was ordered from me. In this screenshot you can see a table of the selected keywords:

As you can see, all keywords are distributed among articles (distribution is carried out by the customer himself or by me for a fee). You can immediately see which search queries from the table can be used as the main query, and which in the form of additional keywords. All this allows the customer to immediately implement the received keywords into finished posts without delay or make a rough plan for future posts - everything is clearly and clearly visible.

Where can you order an excellent semantic core?

If you want to assemble a full-fledged semantic core for a commercial website or information project, I recommend contacting.

Pros and cons of the semantic core

As in any business, creating a semantic core has its positive and negative sides. Let's talk about the cons first:

  • Obtaining a high-quality semantic core for a website involves costs of one kind or another. If you order your list of keywords from a specialist, these costs add up to a decent amount: the more words you order, the more rubles you give away. If you select the keywords yourself, you can spend quite a lot of time learning how to pick the most effective queries. But the experience gained and the quality of the keys will repay the hours spent collecting the core in the form of excellent conversion and targeted search traffic.
  • To select and analyze the best keywords from those already parsed, knowledge is required. Not every webmaster will delve into this matter - after all, in addition to knowledge of optimization, you need to know the basics of working in Excel and be able to count (for formatting data, first of all). But, again, if all this happens, the quality of your semantic core will be many times higher than a simple set of parsed keys from Wordstat.

As you can see, there are few disadvantages, but they are significant. And if material resources are not so important, then studying the necessary material requires time, which is valued most of all. After all, it cannot be returned... Let's move on to the advantages:

  • The biggest and most important advantage is that you get a strong foundation for promoting your web resource. Without keywords, the visibility of a website or blog will be very low, which will not attract much search traffic to it. This means that the assigned tasks and goals will not be achieved. Simply writing high-quality content is not enough to attract search engine users. Search engines also need the presence of keywords in the article (in the text) and on the page (in its meta data). Therefore, creating a semantic core is the first stage, without which search engine promotion is simply unrealistic!
  • Thanks to a good semantic core (of course, taking into account its competent internal optimization), the promoted pages of a web resource will not only receive exactly the target users from the search, but will also allow the site to have excellent behavioral factors. When using bad keywords (or their absence), the bounce rate increases noticeably, and the number of targeted conversions decreases.
  • Having a ready-made semantic core at hand, any webmaster will have an idea of ​​the future of his web resource in terms of content creation. By looking at the keywords, he will see future topics of his articles on the site. That is, he can plan his activities much more effectively - give the copywriter a task in advance, better prepare for seasonal jumps, or come up with the next series of posts.
  • As a rule, a blog contains a large number of posts, so you need to find a lot of keywords for them. But you shouldn’t take the first ones you come across - look for the best options, evaluate them, analyze the parameters of your search queries. You may spend more time, but you will end up with high-quality keywords that can attract better traffic than a pile of the first search queries you saw. Set yourself the goal of searching for keywords for one article every day. Once you get the hang of it, you will collect them for your core much faster, covering 3-5 posts at a time.
  • You don’t need to read a pile of SEO materials, chase every super-duper new optimization trick, or look for the perfect way to create semantics for your blog - it doesn’t exist! Take the approach that you understand and use it to select keywords. But trust only verified materials about the semantic core.
  • It often happens when we find it difficult to come up with a topic for our future post. Especially from the point of view of search engine promotion, when the priority is to obtain maximum search traffic. A ready-made semantic core helps to get rid of this problem once and for all - before your eyes at any time (!) you will be able to observe the ready-made topics of the following articles. And not only - in this way you can estimate a future series of posts, which can then turn into your own information product. 🙂

And finally, the most important advice. Don’t chase big, fat keywords. There will be no benefit from it; you will only create problems for yourself and waste a lot of time. Take note of the following three-phase principle for selecting keywords, which I successfully use on my blog (more than 80% of my keywords are in the top 30, about 49% in the top 10 in Yandex and more than 20% in Google - and this is in my complex, competitive topic!); a small code sketch follows the list:

  1. Check the resulting parsed words that you collected in your own way (for example, using Wordstat, the Slovoeb program, competitor analysis, etc.) for competitiveness - estimate their promotion cost (how to do this, you can find out in my free book or you can read in separate additional article - link after this post). Remove all competitive queries, leave only those with the minimum cost (for example, in SeoPult these are keywords worth 100 rubles, in SeoPult Pro - 25).
  2. For the remaining search queries, rate their quality. To do this, you need to calculate the ratio of the exact and basic frequencies (available both in the book and in the post about analyzing search queries). Depending on the topic, exclude low-quality words and dummy words.
  3. Now, among the quality words received, distribute them among articles. If a post has more than one keyword, choose the boldest one (but this keyword should have a higher exact volume than the others). It will be the main query that brings most of the search traffic for the promoted article. If two of the keywords proposed for the article are suitable for dominance (more than two are rare), take both. Write one in the title, the other in h1.
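Here is the promised sketch of those three phases as a script. The file name, column names and thresholds are my own illustration (the 100-ruble cut-off echoes the SeoPult example above); treat it as a starting point, not a finished tool:

```python
# Three-phase selection: cost filter, quality (exact/base ratio) filter, then ranking.
import csv

MAX_COST = 100        # phase 1: drop everything more expensive to promote
MIN_RATIO = 0.1       # phase 2: exact/base frequency ratio below this = dummy

with open("parsed_keys.csv", newline="", encoding="utf-8") as f:
    keys = list(csv.DictReader(f))

good = []
for k in keys:
    cost = float(k["promotion_cost"])
    base, exact = int(k["base_frequency"]), int(k["exact_frequency"])
    if cost > MAX_COST or base == 0 or exact / base < MIN_RATIO:
        continue
    good.append(k)

# Phase 3 starts from the "fattest" surviving keys: the highest exact frequency
# in each planned article becomes the main query (title), the runner-up goes into the h1.
good.sort(key=lambda k: int(k["exact_frequency"]), reverse=True)
for k in good[:10]:
    print(k["phrase"], k["exact_frequency"], k["promotion_cost"])
```

After this automatic pass, assigning keys to specific articles is still a manual decision, made exactly as described in point 3 above.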
A couple of words about tools: there is the unique multifunctional platform Key Collector, and for a blogger there is also a free assistant for compiling his own semantic core - it works quickly, is absolutely free, and I recommend it!