The SERP Bounce: the most underestimated ranking factor?
Most articles about ranking factors do not even mention it, and yet, in my opinion, it is one of the most important ranking factors and the sine qua non for a page to rank well on search engines.
The SERP Bounce
The SERP Bounce is the percentage of users who click on a result on the results page for a given query, then return to that same results page to choose another one. The lower it is, the better you will rank.
Its opposite would be the satisfaction rate of a result for a query: the percentage of users who type the query, click on the result, and do not return to the results page (the user is satisfied). The higher it is, the better you will rank.
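These two definitions can be made concrete with a small sketch. Everything here is illustrative: the click-log schema (one record per click, with a flag saying whether the user came back to the results page) is an assumption, not how any search engine actually stores its data.

```python
from collections import defaultdict

def serp_metrics(events):
    """Compute per-result SERP Bounce and satisfaction rate from a
    simplified click log. Each event is (session_id, query, url, returned),
    where `returned` is True if the user came back to the results page
    after clicking. This schema is purely illustrative."""
    clicks = defaultdict(int)    # (query, url) -> total clicks
    returns = defaultdict(int)   # (query, url) -> clicks followed by a return
    for session_id, query, url, returned in events:
        clicks[(query, url)] += 1
        if returned:
            returns[(query, url)] += 1
    metrics = {}
    for key, total in clicks.items():
        bounce = returns[key] / total  # SERP Bounce: lower is better
        metrics[key] = {"serp_bounce": bounce,
                        "satisfaction": 1 - bounce}  # higher is better
    return metrics

# Hypothetical log: two users bounced off site-a, one stayed on site-b.
log = [
    ("s1", "lyrics", "site-a.com", True),   # clicked, came back
    ("s2", "lyrics", "site-a.com", True),
    ("s3", "lyrics", "site-b.com", False),  # clicked, stayed: satisfied
]
m = serp_metrics(log)
print(m[("lyrics", "site-a.com")]["serp_bounce"])   # 1.0
print(m[("lyrics", "site-b.com")]["satisfaction"])  # 1.0
```

In this toy log, site-b would be the result to promote: every user who clicked it was satisfied, while everyone who clicked site-a came back to the results page.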
Why would Google use the SERP Bounce in its algorithm?
Why not just use the bounce rate and visit duration?
The SERP Bounce goes much further than the bounce rate, because it incorporates other dimensions that make it more relevant for ranking results: the query typed by the user and the landing page.
Your page may be relevant and still have a high bounce rate. This is the case for cooking recipe sites or song lyrics sites. On the other hand, if someone searches for the lyrics of a song, clicks on the first result, returns to the results page, then clicks on the second result because they did not find what they were looking for (a sing-along video, for example), and the majority of users behave the same way, the second result is very likely to move ahead of the first, without any additional links or on-page optimization.
In the same way, if I search for the ingredients of a French toast recipe, I will only stay a few seconds on the recipe site, and that does not detract from the relevance of the page.
So take a step back from these metrics, which depend on the topic, the page, the path the user took before finding the page, and the user's intent.
This is an effective relevance criterion
Let’s run a little survey. Put yourself in Google’s place: its second goal, after selling advertising, is to display the most relevant results in order to sell even more advertising in the long run.
In what order would you rank these 5 pages, with (in theory) equal popularity and authority, for the query [motorcycle jacket]?
- The home page of a site that sells nothing directly but redirects users to Amazon and eBay via affiliate links, with keyword-optimized content that does not really help the user in their search for a motorcycle jacket
- The home page of a guide / comparison site covering the different types of motorcycle jackets and pointing to various e-commerce sites via affiliate links
- The home page of an e-commerce site offering all types of jackets (leather, synthetic) at all price points
- The home page of an e-commerce site offering only high-end leather jackets
- A Wikipedia page on motorcycle jackets
I have not run a survey of bikers, but I would bet that these pages, ranked by increasing SERP Bounce, would come out in the following order:
- #3, because it satisfies the majority of users, who intend to buy (and this is a query with strong purchase intent)
- #2, because it satisfies the majority of users who have not yet made up their minds
- #1, below the other two, because Google has no interest in ranking a site with no added value that lengthens the user’s journey by an extra click
- #4, because it satisfies only a minority segment of the users typing this query
- #5, because despite Wikipedia’s authority, it is unlikely that the user is interested in encyclopedic information about motorcycle jackets…
It is a simple and inexpensive relevance criterion
Technically, collecting this behavioral data is very simple and inexpensive compared to the cost of developing an artificial intelligence. Google does not need to vacuum up all of Chrome’s traffic to compute it. And if it does collect that traffic, it is for advertising-profiling purposes; let’s set the SEO paranoia aside for two minutes!
Google cannot rely solely on this criterion
If this criterion is so magical, one might wonder why Google does not drop its other ranking factors, such as HTTPS (lol) or inbound links, a factor whose manipulation it has had so much trouble stopping.
Some elements of an answer:
- The site must be minimally optimized so that Google can understand what it is about.
- Google must take the site into consideration; one or two directory links are not enough to give it the authority needed for Google to consider it at all.
- Google must have a minimum of behavioral data to compute the SERP Bounce, and its results must still be relevant for queries typed for the first time, or never typed at all.
- Finally, when two sites are of equivalent quality, other criteria come into play: the quality of links (authority) and their number (popularity), for example.
The question everyone then asks: can it be manipulated?
I would tend to say yes, it can be manipulated, but doing so is more complex and less legal (use of botnets) than manipulating links.
One could imagine that, to guard against this, Google only collects data from logged-in users with natural activity (receiving emails, sending emails, regular browsing). That would block the majority of manipulation attempts.
The simplest approach is still to influence it rather than try to manipulate it, by outdoing your competitors: enticing calls to action, better-quality content, related content to browse, answers to the needs of all users, and so on. This has real value, since these changes also positively impact your business.
This explains…
Why you cannot clean up your online reputation
SERP Bounce! Who isn’t fond of sensational stories about a person or company whose name is being searched on Google? Nobody! And that is why these pages sometimes stay in place for a very long time, despite hard reputation-management work.
The solution? Consolidate the first page of results for your name in advance with high-authority pages, so that a single negative article never reaches the first page. If the news is picked up by several newspapers, the battle is lost in advance, given the boost those sites receive when Google detects a wave of news on a given topic.
Why is LinkedIn generally ranked better than Viadeo on a person’s name? (a totally arbitrary observation)
SERP Bounce! You have to register to see someone’s Viadeo profile, unlike LinkedIn, which displays at least some of the information… including the person’s photo. The latter therefore satisfies the majority of users. Two results this easy for Google to decide between!
Why Google is better than Bing
Setting aside the fact that Bing is published by Microsoft, the main reason Google is better than Bing and any other engine is its overwhelming market share, thanks to which it can accumulate far more user-behavior data and thus exploit it to rank results more precisely and relevantly.
Let’s break some myths!
No, Google does not use data from Google Analytics or Chrome in its algorithm.
Not all sites use Google Analytics, and computing the SERP Bounce without this data is immensely simpler and cheaper. Besides, manipulating Google Analytics data is a breeze.
Yes, you can rank with unnatural links.
What interest does Google have in penalizing a site that is more relevant than another? Isn’t the end goal to satisfy the user? Unless it wants to make an example of the site, a penalty is not really in its interest. That is also why the big penalized sites come back into Google’s good graces a few weeks after the penalty, despite only a cursory link cleanup. I am thinking in particular of the RapGenius case, a lyrics site clearly better than the others (meaning of the lyrics, comments, videos, overall user experience) that was penalized because an artificial-link scheme had become public and Google had to act for the sake of appearances. The site was back 10 days later.
No, good links and a good internal structure are not enough.
Your page must answer the user’s query exactly. For example, some plumbers working in only one or two arrondissements of Paris persist, at the cost of thousands of euros of wasted SEO, in trying to rank for the query [plombier paris], when it is more logical to show, first and foremost, companies operating in every arrondissement, or directories. There is no shortage of those.
Google is a robot, and will remain a robot. Sure, it can quite easily detect when a text is duplicated or automatically generated and downgrade you for it, but the world’s best link and a 2,000-word text written by the best writer in the universe will have no effect if you do not answer what the majority of users are looking for. Google understands the value of your texts by analyzing user behavior.
No, the number of clicks does not affect your ranking.
Don’t put words in my mouth! It is the quality of the clicks that counts, not the quantity. Like links, actually. The results of that Moz study should be taken with a grain of salt, since most of the participants certainly clicked on the requested result.
No, loading time is not just about increasing the number of crawled pages.
That is secondary. Loading time mainly influences the user experience, as shown in this KissMetrics infographic. If your site is too slow, users will go to another result, and in the end that will negatively influence your ranking.
How to observe this phenomenon?
I have been observing this phenomenon since late 2011; the use of this criterion is particularly obvious in the following cases:
Daily fluctuations on brand queries
Did you think everyone stops at the first result for a query containing a site’s name (“zalando”, “badoo”, etc.)? You would be mistaken: even if the click-through rate has dropped since these pages moved to 7 results and mega-sitelinks became widespread, a significant share of the traffic goes to the sites listed below the official one. Their SERP Bounce is nevertheless catastrophic (unless the result is “shocking” or “negative”), and Google does not know which results to display. It is not uncommon to see the results for these queries change completely from one day to the next, with new pages pushed overnight, as if Google could not get anything out of this user data and was trying new results in order to find a more relevant one.
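This daily reshuffling is easy to spot if you log the top results for a query each day and compare consecutive days. A minimal sketch, assuming you already collect daily top-N URL lists with your own rank tracker (the domains below are made up):

```python
def serp_churn(day1, day2):
    """Return the fraction of top-N positions whose URL changed between
    two days, plus the URLs that entered and left the list."""
    changed = sum(1 for a, b in zip(day1, day2) if a != b)
    entered = set(day2) - set(day1)
    left = set(day1) - set(day2)
    return changed / len(day1), entered, left

# Hypothetical top-5 results for the same brand query on two days.
monday  = ["brand.com", "wiki.org", "news.com", "forum.net", "blog.io"]
tuesday = ["brand.com", "news.com", "review.co", "wiki.org", "blog.io"]

churn, entered, left = serp_churn(monday, tuesday)
print(churn)    # 0.6: 3 of the 5 positions hold a different URL
print(entered)  # {'review.co'}
print(left)     # {'forum.net'}
```

A churn value near zero day after day suggests a stable, settled SERP; repeated high values on a query are the kind of overnight reshuffling described above.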
The arrival of a disruptive result
Is a start-up shaking up a market by halving prices, or by approaching it differently? If its site quickly ranks for competitive queries, it is certainly not because of some hypothetical “freshness boost”. It is simply because what it offers is better than what its competitors offer.
A concrete example
There was a time when I ran dating sites. I had the idea of adding a free, registration-free chat to every page, with a user base shared across all the sites for more activity. Overnight, the average visit duration doubled, the bounce rate dropped sharply (and the SERP Bounce certainly dropped too), and rankings rose sharply.