
The team at Visitorgen.com has produced a very interesting study. If its observations hold true, it could fundamentally change how we think about SEO as a profession. The study summarizes the results of roughly four months of intensive work. For the next few minutes, let us try to set aside our assumptions and existing knowledge and look at the details of the study objectively.
It’s time to look at the results list differently
The main goal of every website owner and SEO professional is to place the website they manage as prominently as possible in Google’s results, preferably for as many search terms as possible. But are you sure that consecutive result pages are a continuation of one and the same list? They are not. Each page, or group of pages, is built on a separate relevance set.
What does this mean exactly?
It means that the links on the first page received the most relevance points from the algorithm’s ranking system. The results most relevant to your search appear here: websites whose content closely matches the query and which comply with most of Google’s policies.
If many websites of similar relevance fall into this set, page 1 of the results list can be very long; some searches return first pages of up to 50 items. The assumption that a results page always contains 10-20 hits is simply wrong, because the number varies greatly. Even more interesting, the second or third page does not necessarily contain the same number of items as the first. Each page has its own item count, depending on the size of the relevance set behind it.
What haven’t we noticed yet?
This might not be significant on its own, but it is not only the number of items that changes as you page through. On closer inspection, every search shows an estimate at the top of the results list of how many web pages match that query. For popular searches this number can reach tens or even hundreds of millions. This is the full set of pages Google considers, from which it lists the most relevant ones for us. But what about the far end? After a few pages, the relevance set shrinks noticeably: a later results page no longer lists results from the same relevance set.
Let’s take an example to illustrate relevance sets. If we search for the term Paleo Cookbook (say, from an American IP address), the first page will indeed list cookbooks about paleo food culture. The second page will mainly contain content related to the paleo lifestyle, especially articles. As we move through the following pages, relevance keeps dropping. Eventually the term paleo barely appears at all, and the results are about cookbooks or cooking in general. Yet the first page claims millions of relevant results, while by the 50th to 100th result the hits have little to do with our original keyword.
OK, but what are these relevance sets based on?
It is difficult to give an exact answer, but an approximate explanation is possible. Essentially, a relevance set is a group of results organized by relevance according to several different criteria. Let’s look at some of those criteria.
Based on profiles of searchers
To understand to whom and how a website will be relevant, we need to understand the habits and search history of searchers, in short, their profile. It is no secret that Google keeps fairly detailed statistics. You have probably noticed that if you search for a city or street and perhaps plan a route on your computer, then get into your car and open your phone’s GPS, it will almost immediately offer you that route. This is not accidental. Google constantly tries to understand your current needs and responds minute by minute, and the same is true for your searches. The more searches you run on a particular topic, the better Google understands what you want to do with that information. After a while you will see not only the results that seem relevant because of a website’s popularity, but also those where you can find exactly the information you are looking for. Whether you really found it on a given website is inferred from your time on the site and your actions there. The primary results list is therefore the outcome of SEO in the classical sense, achieved with keyword-rich content and link building, but this relevance can be overridden by the search habits of your visitors.
If we run the same search from two independent devices on the same IP address, where one user has already searched for similar content and the other has not, there is a good chance we will not see the same results list. Pages we have visited before (and spent a meaningful amount of time on), or that Google considers the closest match to our previous searches, will be placed higher. So our accumulated search history affects the list as well.
And now comes the point: the results list is constantly in motion, because relevance has to be examined not only against the website’s content and the search terms, but also against ever-changing search habits.
Ah, it’s just the Google Dance… or is it?
In practice, the Google Dance is an A/B testing process. If a website’s SEO signals change suddenly, Google investigates whether the change was really driven by visitor demand or by manipulation. For a control group of searchers, Google periodically and deliberately brings the website to the top of the results list to verify that visitors who have not seen the site before genuinely value its content. In other words, it checks whether the website sits in the right relevance set. This can be measured through CTR, bounce rate, and session length, which is why improving these three metrics is such an important factor in improving rankings. When the algorithm is updated, we often notice that placements are completely overturned for a short period: Google builds entirely new relevance sets according to its updated selection method, which in some cases causes a radical rearrangement of the results list. Over time this settles again, following the mechanisms described above.
This is why it is difficult to state exactly where a particular website ranks for a particular search term: too many parameters are involved. A more useful approach is to look at how many times a website is clicked for a given search term per month. This is a far more practical way to measure the success of our SEO activity.
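To make these signals tangible, here is a minimal Python sketch (not part of the Visitorgen study) that computes CTR, bounce rate, average session length, and monthly clicks per keyword from a small, made-up data export; every field name and figure is an invented assumption, used only for illustration.

```python
from collections import defaultdict

# Hypothetical export rows: one dict per (keyword, month) from a search report,
# plus simple session records from an analytics-style export. All names are illustrative.
search_rows = [
    {"keyword": "paleo cookbook", "month": "2021-05", "impressions": 12000, "clicks": 480},
    {"keyword": "paleo cookbook", "month": "2021-06", "impressions": 15000, "clicks": 690},
    {"keyword": "paleo recipes",  "month": "2021-06", "impressions": 8000,  "clicks": 210},
]
sessions = [
    {"duration_sec": 310, "pageviews": 4},
    {"duration_sec": 12,  "pageviews": 1},   # single-page visit: counted as a bounce
    {"duration_sec": 95,  "pageviews": 2},
]

# CTR per keyword per month: clicks / impressions
clicks_per_month = defaultdict(int)
for row in search_rows:
    ctr = row["clicks"] / row["impressions"]
    clicks_per_month[(row["keyword"], row["month"])] += row["clicks"]
    print(f'{row["keyword"]} {row["month"]}: CTR = {ctr:.1%}')

# Bounce rate: share of sessions with a single pageview
bounce_rate = sum(1 for s in sessions if s["pageviews"] == 1) / len(sessions)

# Average session length in seconds
avg_session = sum(s["duration_sec"] for s in sessions) / len(sessions)

print(f"Bounce rate: {bounce_rate:.1%}, average session length: {avg_session:.0f}s")
print("Monthly clicks per keyword:", dict(clicks_per_month))
```

In a real setup these numbers would come from Search Console and Analytics exports rather than hard-coded lists, but the calculations stay the same.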
Let’s approach this assumption with a simple example
If, in 2021, you search for a car, phone, or any device manufactured in 2021, there is a good chance the ranking will rest on classic SEO values (PA, DA, and so on). If you repeat the same search for 2021 products five years later, your site may no longer be listed by Google. For the sake of the example, set aside the objections that “the algorithm changes in the meantime” or “the SEO quality metrics of the websites change”; suppose these stay constant. By then, the results list will be based on statistics about which website provided the best content to the most visitors of a similar profile in a given period. There will still be overlap with websites that have good SEO values, but the list can be rearranged quite strongly depending on user habits.
You may want to look at this through the following three parallel scenarios:
- If we search on Google for content we have never searched for before and have no similar browsing history, the results list is sorted solely by the websites’ SEO values.
- If we already have a similar history, our search profile will influence the list.
- And if the content we are looking for is no longer new, the popularity of the topic has fallen, and we search for it only afterwards, then the websites’ SEO values determine the structure of the results list together with average popular search habits and our own individual search habits.
Location or the influence of community events
The other determinants of the relevance struggle outlined above are location and the interpretation of community events. If we search for, say, the word “pub”, there is a good chance the engine will list pubs near us, or ones it knows (or guesses) we have visited before, for example because we once commented on the place. On the other hand, if Google detects a connection between watching football in pubs and a European Championship or World Cup match taking place around the search date, chances are it will suggest pubs where you can watch a match, or pubs built around a football-fan crowd. This is further filtered by our search history: it is unlikely to suggest a pub in Italy to an Arsenal fan. At least not on the first page of results, because if we return to our assumption that the first results come from a different relevance set than the following pages, such less relevant results may well exist, but will only turn up later.
And what is the conclusion?
If our goal is to move from the back of the results list onto the first page, it is by no means certain that link building alone will get us there. Many more factors have to be influenced for our website to move into a new relevance set: plenty of textual content built around a particular topic, and the user habits and reactions that content evokes from visitors.
So can traffic and visitor habits be more important factors than link building?
If we approach the question properly, then yes. As part of the Visitorgen project, a number of case studies were carried out in which traffic to the managed websites was simulated through their service. The length and intensity of these visits were adjusted so that they did not register as bounces and so that they significantly improved the averages of the CTR, bounce rate, and session time statistics. A fairly precise set of rules was developed for this, which in most cases produced a lasting improvement in rankings for several keywords. In the three months before the four-month test period, no new content was published and no link building was planned on these websites, so the results can genuinely be attributed to the processes performed during the test.
But how is this possible? Can it really work? Why hasn’t someone else done it before?
Someone has to be the first at everything, including this, although the idea itself is not new. A similar service is promised by so-called traffic bots. However, most of them only simulate direct traffic, which is not enough to improve all three of the statistical indicators mentioned above. Bots that do click through from Google’s results list typically guarantee only a few seconds of time on the website; in that form they hurt our rankings more than they deliver on their promises. In contrast, the system used in the study fills these gaps with a solution whose effect can be measured in both Analytics and Search Console, with good results. This was achieved by creating each search individually, in each case with a new profile, so every visit was counted as a completely new session. This could also be monitored through UTM parameterization. In addition, scrolling and mouse movement were simulated on the visited website. Outbound links and AdSense or other advertisements were never clicked; they maintain strict ethical standards on this. We will not go into deeper technical details here, but to understand how it works, let’s look at how Google worked in the past and, presumably, works now.
How did Google work in the past, in the way many people still assume it works today?
SERP (Search Engine Results Page) ranking used to be the product of a Markov-chain-based model whose scoring system everyone knew as PageRank. This solution, however, calculated relevance purely from links between websites. The basic logic was that if N pages link to a website, then when that website links elsewhere, the strength of its outgoing links is determined by those N incoming links. Taken together, this forms a very complex graph. The result is often visualized in 3D as columns, where each column is a web page and its height is the PageRank, determined essentially by the number and weight of the edges pointing to it. The system was updated continuously.
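For readers who want to see the classic idea in code, here is a minimal sketch of PageRank computed by power iteration over a tiny, made-up link graph. It illustrates only the original published algorithm, not Google’s current system, and the graph and damping factor are illustrative assumptions.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Compute PageRank scores for a small link graph.

    links maps each page to the list of pages it links to.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}           # start from a uniform distribution

    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:                      # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share     # each outgoing link passes on a share
        rank = new_rank
    return rank

# A made-up four-page link graph, purely for illustration.
graph = {
    "home": ["blog", "shop"],
    "blog": ["home", "shop"],
    "shop": ["home"],
    "about": ["home"],
}
for page, score in sorted(pagerank(graph).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

Each page’s score settles at a value determined by the number and weight of the links pointing to it, which is exactly the “column height” picture described above.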
Then, around 2010, Google announced a major overhaul in which, alongside links, actual content and the way a website operates became increasingly important. It was in this period that rules were introduced stating that load time and document structure (the HTML structure) are at least as important as how many sites link to a given website.
Development did not stop there. The 2010s are considered the golden age of social media, and that influenced everything. Google began to pay close attention to user habits, with the basic logic of “if someone else is interested, it may interest you too”. When someone spends a lot of active time on a website and does not return to the results page to start a new search, Google interprets the content as relevant, because visitors only stay engaged when they have found what they were looking for. This is perhaps more important than ever, since almost nothing describes the relevance of a website better than human reactions.
In light of this, in the new system, three factors affect a website’s ranking and position:
Link building – still important, because the popularity of a website is partly determined by how many places it can be reached from, and the traffic flowing through those links remains an important factor. Note, however, that without an accompanying increase in traffic, link building only produces a momentary lift (essentially Google A/B testing the site for relevance). Continuous link building is therefore required; if the rate of new links drops sharply and the traffic they carry does not improve, the link power they represent decays and the effect of link building on the results list weakens.
The linking website and the linked website also affect each other. A link is truly valuable if it generates potential traffic and the content of the linking website is relevant to our visitors. The lack of this was one of the main reasons for the devaluation of so-called link farms: these listing pages offered no value consistent with the content of the sites they linked to, and once Google’s newer methods stopped sending them meaningful organic traffic, their click-throughs also fell sharply. Because Google weighs the overall relevance of the referring website, the same fate befell the once-popular PR-article sites, which never had a coherent theme, just a series of unrelated articles with outbound links. A poorly structured link building strategy can still raise our organic traffic for a while, but if our content turns out not to be relevant to visitors in the way Google first assumed from the linking sites’ content, that advantage disappears over time.
Website quality
As already mentioned, load time, content structure, and the soundness of the markup are all aspects that Googlebot monitors when examining a website. A slow or poorly built website creates a bad user experience, which helps in no way.
Community evaluation
This factor is derived specifically from the real experience of users; we could say it evaluates the website on the basis of real user experience, which gives an essentially objective picture of the site.
So a website’s classification currently evolves according to these three factors. Most SEO professionals tend to prioritize the first, because it is the easiest to manipulate without modifying the website. With the second, most will fix the content structure, but improving load time and markup usually requires a web developer who is either available to do it or, more typically, is not, so this remains a weak point. As for the third factor, almost no one knows what to do with it, and there is no explicitly proven methodology for improving it.
What if your SEO values and content are exceptionally good, but you’re not on the first page for everything yet?
This question probably concerns everyone. Let us try to explain the reason based on the lessons learned above.
Have you heard of the amoeba principle?
Instead of the older columnar picture, and to better capture the new ranking considerations, we can now imagine each website as an amoeba whose shape represents the site’s SEO qualities. The properties of the amoeba describe the status of each value of the web page (a toy code sketch follows the list below):
- The result of link building shows in the size of the amoeba, just as in the older PageRank-based picture, except that here it determines the size of the amoeba rather than the height of a column.
- The quality of the website determines the amoeba’s mobility: overall, how easy or hard it is to move the site forward in the results list.
- Community evaluation gives the amoeba its shape; in a healthy state its legs are roughly uniform.
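To make the metaphor concrete, here is a toy model of the amoeba in code. It is our own illustration rather than anything from the study: each subpage contributes a leg whose length stands for its engagement, the total link strength sets the body size, and site quality stands for mobility. All names and numbers are invented.

```python
from dataclasses import dataclass, field

@dataclass
class AmoebaSite:
    link_strength: float                      # size of the amoeba (link building)
    site_quality: float                       # mobility (0..1, technical quality)
    legs: dict = field(default_factory=dict)  # subpage -> engagement score (community evaluation)

    def balance(self) -> float:
        """Ratio of the shortest to the longest leg: 1.0 means evenly grown legs."""
        values = list(self.legs.values())
        return min(values) / max(values) if values else 1.0

site = AmoebaSite(
    link_strength=42.0,
    site_quality=0.8,
    legs={"/dog-articles": 9.0, "/cat-articles": 3.0, "/shop": 4.5},
)
print(f"size={site.link_strength}, mobility={site.site_quality}, balance={site.balance():.2f}")
```

A balance close to 1.0 corresponds to the “healthy” amoeba with roughly uniform legs described above; a low balance means one subpage dominates the others.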
Based on our observations, Google likes to single out one part of a website’s content to prioritize, so some subpages may appear significantly higher or lower in the results than others, even when we think both are equally relevant to the topic.
For example, if a website has three articles that have nothing to do with one another, but one of them gets exceptionally high traffic while the other two get less, Google will designate the topic of the most visited article as the site’s primary target. The other two articles will then be less relevant to that target topic, so while the targeted subpage soars in the results, the others may rank poorly or not at all, even though there is no real difference in quality or other factors between them.
An average, well-optimized website has legs of equal size: there is no significant difference between the relevance of its subpages, so in practice either all of them sit toward the back of the results or all of them sit toward the front. Which of the two happens is influenced mainly by the size of the amoeba.
However, if traffic to one subpage clearly rises, that subpage moves ahead on the relevance scale and starts to stand out from the other branches: that leg of the amoeba grows. Then something interesting happens, because the other legs begin to lose relevance. The simple reason is that Google starts prioritizing that subpage (highlighting it as the target) based on user habits, so the other pages fall back within the website’s theme. The only exception is when visitors also reach those other pages, in which case they too rank better. And if visits to all our pages grow proportionally, the size of the whole amoeba starts to increase.
This can ultimately lead to a serious shift in relevance within the website, which can be remedied by achieving roughly the same increase in relevance on every subpage, even if each subpage was created for a different topic (for example on a general blog or a general webshop).
To use a simple example: if you have a website you have so far written about dogs, and your link building also comes mainly from websites relevant to that topic, the relevance of your website will suffer if you suddenly start uploading kitten content and those new articles get optimized more successfully. This upsets the relevance balance of the site, so you need to keep enough visitors on the dog articles as well, so that both your dog and your kitten content can rank well in the results list.
Tightly themed websites can be more effective in the results list than websites that deal with general topics.
So if you publish articles across a wider range of topics, you need links from as wide a range of sources as possible so your website can grow uniformly across all topics; otherwise you may even hurt your relevance by trying to cover a broader area. Later this can mean that some of your keywords appear very high in the results list while your less relevant content is pushed back roughly in proportion: some legs of the amoeba extend while others retract.
In summary, the older PageRank-based evaluation did not disappear completely, so long-established techniques such as link building remain important, but meeting community needs added two further factors to the process, and the amoeba model makes the current system much easier to follow.
Can rankings be improved without link building?
According to the Visitorgen study and their service, it is possible: if searchers with the most diverse profiles reach your website from a large number of searches and spend a significant amount of time there, Google will see the site as worth recommending in a growing number of relevant result sets. As a website improves its CTR, bounce rate, and average session time, it can enter more and more relevance sets and move up within them, which also helps it permanently gain better rankings for relevant keywords. The task during the trigger period of Google’s A/B testing process is to provide these new visitors with quality content that they genuinely find valuable.
In summary
It’s time to think about a successful SEO strategy in a much more nuanced way than the simple, straight-line method of link building. If you are interested in the full study with deeper technical details, visit visitorgen.com, where you can find more.