Bounce Back or Bounce Through
-
Bounce rate is defined, as I understand it, as 'single-page visits to a site divided by total visits to the site'. It could be argued that a well-designed site might vector people on to other sites effectively (I generally use Wikipedia this way, for instance). On the other hand, a site that bounces people straight back to where they came from may be genuinely poor. So the questions:
Is the bounce rate really calculated in the stated way by Google?
Is it used, as far as we know, as a metric for the search engine?
What should we do to mitigate the effects of this poor metric?!
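To make the definition in the question concrete, here is a minimal sketch of the calculation, assuming we have per-visit page counts (the session data here is hypothetical, purely for illustration):

```python
# Bounce rate as defined above: single-page visits / total visits.
# 'sessions' maps a visit ID to the number of pages viewed in that visit.
def bounce_rate(sessions):
    visits = len(sessions)
    if visits == 0:
        return 0.0
    bounces = sum(1 for pages in sessions.values() if pages == 1)
    return bounces / visits

sessions = {"v1": 1, "v2": 3, "v3": 1, "v4": 2}
print(bounce_rate(sessions))  # 2 of 4 visits were single-page -> 0.5
```

Note this simple ratio says nothing about *where* the visitor went afterwards, which is exactly the bounce-back vs. bounce-through distinction the question raises.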
thanks,
Mike
-
Actually, bounce rate would be of concern to search engines, at least for visits that originate from the search engine. The SEs want users to have a good experience, and if a user clicks on a result and then comes right back to the results page, the SEs may conclude that the user did not have a good experience with that result and that a different result should perhaps be shown for that query.
-
Thanks, yes, it looks from this as if the experts think Google is doing what we would hope and not taking account of bounce-through. Although of course there may be good reasons for a site not wanting bounce-through either (as EGOL notes), it shouldn't be a concern for the search engines.
-
As far as I'm aware, Google will use your 'bounce back' rate (whereby users return to the search results page straight away) as a search metric, as this could indicate whether the site is relevant for that specific search query. This was mentioned in the 2011 SEO Ranking Factors Report.
Hope that helps
-
If search engines are using this data they are certainly only using it for sites competing for the same or similar keywords.
A high bounce rate can be bad, or it can be "normal". It would be bad if your site is offensive (and people run away); it can be bad if your site has irrelevant content for the query; it can be bad if your site has thin content; you can probably think of more.
It can be normal if you have a dictionary site and the searcher finds the word, gets the definition and leaves happily.
THE IMPORTANT THING TO DO: I believe that everyone should be working to reduce their bounce rate, and any webmaster should be able to find improvements.
The best way to do it is to have relevant links, obviously placed, on every page. For example, on the dictionary site your goal should be to have linked words within the definition, links to related words adjacent to the definition, and links to a few enticing articles along the side.
On an article site you can add links within the text to related articles, a "recommended" box of links beside the article, and even a few enticing links to "popular" or "related" articles where everyone will see them.
Try to reduce your bounce rate by improving your site and making your relevant content visible on every page.
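The dictionary-site suggestion above can be automated. This is a hypothetical sketch, not any particular CMS's API: the `link_related_terms` helper and the `/define/` URL scheme are assumptions for illustration. It turns occurrences of other defined words inside a definition into internal links, giving visitors somewhere relevant to click instead of bouncing:

```python
import re

# Turn whole-word occurrences of other defined words in a definition
# into internal links (URL scheme '/define/<word>' is assumed).
def link_related_terms(definition_html, defined_words):
    def to_link(match):
        word = match.group(0)
        return f'<a href="/define/{word.lower()}">{word}</a>'

    # One alternation of whole-word matches, longest words first so that
    # e.g. "definitions" wins over a shorter overlapping term.
    pattern = r"\b(" + "|".join(
        re.escape(w) for w in sorted(defined_words, key=len, reverse=True)
    ) + r")\b"
    return re.sub(pattern, to_link, definition_html, flags=re.IGNORECASE)

html = "A glossary is an alphabetical list of terms with definitions."
print(link_related_terms(html, ["terms", "definitions"]))
```

In practice you would also want to skip text that is already inside a link, and cap the number of links per definition so the page doesn't read like a link farm.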