How to improve your site with SEO and usability: 4 mini-cases

One of the main problems with old-school SEO is the conflict between the interests of search robots and those of live users. Artificially stuffing texts with keywords is the simplest example of how clumsy promotion methods clash with the needs of the audience.

Today many search marketers declare a transition from regular SEO to SEO 2.0. The new SEO goes by many names: internet and content marketing, search and inbound marketing, SEM. One of the main hallmarks of SEO 2.0 is that the interests of live users and search robots coincide. This is logical, since search engines serve people and are even learning to “think” like a human being. However, robots have not yet become human, so contradictions between the interests of search engines and people still arise.

In this article you will learn how to improve a site while keeping in mind both the interests of visitors and search engine promotion.

SEO and usability go hand in hand

The conflict between the interests of users and robots arose artificially. At some point, old-school optimizers decided that manipulating search results was a shorter and cheaper path to success than developing a site for people. Search engines, however, constantly improve their algorithms, trying to meet users' needs as well as possible. They understand that users need quality sites, so they close loopholes that allow search results to be manipulated and recommend that webmasters improve the quality of their resources.

More advanced algorithms have allowed search engines to eliminate many of the contradictions between high-quality websites built for people and resources tailored to a specific search query.

Search engines have always recommended creating websites for people, not for robots. Old-school SEOs greeted this recommendation with a smile while they wrote key phrases into their texts at a density of 3.141592% and bought links. To succeed today, search marketers need to learn to take the recommendations of search engines literally: create websites for people, that's all.

The examples below show how you can take care of users' happiness and the site's high positions in search results at the same time.

1. Content or product list

On the home page of a commercial site, users want to see brief information about the business and links to the main sections, products, and useful information. Search engine optimizers, meanwhile, try to promote the home page for every possible query. To increase its search value, they publish a long block of text.

Live visitors are hardly interested in long texts on the home page, but that's not all. Publishing text at the top of the page reduces the visibility of important links, the list of products and services, and other information that real users are looking for.

You can solve this problem simply and elegantly: publish the bulk text at the top of the page as a collapsible block (a table or div). The text of the collapsed block remains visible to robots, and at the same time it does not prevent live visitors from finding the links they are interested in, as the sketch below shows.
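Here is a minimal markup sketch of such a block using the standard HTML details element; the heading, class names, and copy are hypothetical.

```html
<!-- A collapsible text block at the top of the home page.
     The copy stays in the HTML, so robots can read it,
     but visitors see only the summary line until they expand it.
     All class names and text below are placeholders. -->
<details class="seo-intro">
  <summary>About our store</summary>
  <p>
    We sell garden furniture and deliver across the country.
    ... the rest of the optimized copy goes here ...
  </p>
</details>

<!-- The product list and key links follow immediately,
     so the text does not push them below the fold. -->
<nav class="catalog-links">
  <a href="/chairs">Chairs</a>
  <a href="/tables">Tables</a>
  <a href="/delivery">Delivery</a>
</nav>
```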

2. Video and text

Given the choice between reading text and watching a video, many people choose the latter, and brands meet them halfway by offering video content. The problem is that search robots find it difficult to determine the relevance of a page that contains nothing but a video. Even if you publish a video description and use video markup and a video sitemap, users will hardly be able to find the page for anything other than its title.

You can take the interests of users and search robots into account at the same time by publishing the video together with its transcript as text. You can also hide the transcript from users in a collapsible block, as in the sketch below.
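The sketch below assumes a hypothetical embedded clip and placeholder copy: the video sits at the top of the page, and the full transcript lives in a collapsible block underneath it.

```html
<!-- Hypothetical page layout: the video embed plus its transcript.
     The transcript is collapsed for visitors but present in the HTML,
     so robots can judge the page's relevance from the text.
     The video URL, title, and copy are placeholders. -->
<h1>How to choose garden furniture</h1>

<iframe src="https://www.youtube.com/embed/VIDEO_ID"
        title="How to choose garden furniture"
        allowfullscreen></iframe>

<details class="transcript">
  <summary>Read the transcript</summary>
  <p>In this video we explain how to pick materials and sizes ...</p>
  <p>... the full text of the clip continues here ...</p>
</details>
```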

3. PDF file and HTML page

Users love to download and view PDF files. Reports and white papers in PDF are popular, not least because these documents are easy to print. Search robots do crawl PDFs, but they clearly prefer content published in HTML.

The solution is similar to the video-and-text case. Copy the content of the PDF file onto a site page and publish the PDF itself on that page as a download. Once search engines have indexed both the page and the document, make the PDF invisible to robots by adding the appropriate rule to robots.txt, as in the sketch below. This way you avoid duplicate content.
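A minimal robots.txt sketch of such a rule might look as follows; the file path is a placeholder. Keep in mind that Disallow stops robots from crawling the file rather than removing anything that has already been indexed.

```
# Hypothetical rule: once the HTML version of the white paper is indexed,
# stop robots from crawling the duplicate PDF. The path is a placeholder.
User-agent: *
Disallow: /files/whitepaper.pdf
```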

4. Duplicate content

Sometimes webmasters are forced to publish the same content on different pages of a site. Such a situation is inevitable, for example, when the owner has to warn visitors about the legal aspects of working with the company. The webmaster is obliged to give users this legal information to protect them from misunderstandings, yet search engines see the same content on every page, and this adversely affects the site's ranking.

To solve this problem, it is enough to make the legal information invisible to search engines. This can be done by publishing the “Terms of Use” and “Disclaimer” as images. Live users will still be able to read the text, while search robots will assume they are dealing with an ordinary JPEG file. A markup sketch is shown below.
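Here is a minimal markup sketch of this approach, with hypothetical file names; the short alt labels name the documents without repeating their full text.

```html
<!-- Hypothetical footer block: the repeated legal text is served as images,
     so robots see pictures while visitors can still read the wording.
     File names and dimensions are placeholders. -->
<footer class="legal">
  <img src="/images/terms-of-use.jpg" width="600" height="400"
       alt="Terms of use">
  <img src="/images/disclaimer.jpg" width="600" height="300"
       alt="Disclaimer">
</footer>
```

One trade-off of this design: text served as an image cannot be copied or resized by visitors, so the images should stay legible at typical screen sizes.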

SEO 2.0 doesn’t work without people’s interests

You can count on successful search marketing only if you think about the interests of the live visitors to your site. Search engines spend huge sums just to bring their robots closer to the ability to think like a human being.

However, contradictions between the interests of search engines and people still arise. They can be resolved in two ways. First, you can find a solution that suits both robots and humans; the examples above illustrate this approach. Second, you can skip the search for non-standard workarounds and immediately choose the “human” option. That is the more promising path. Do you agree?
