Bounce Back or Bounce Through
-
Bounce rate is defined, as I understand it, as 'single-page visits to a site divided by total visits to the site'. It could be argued that a well-designed site might effectively vector people on to other sites (I generally use Wikipedia this way, for instance). On the other hand, a site that bounces people back to where they came from may be genuinely poor. So the questions:
Is the bounce rate really calculated in the stated way by Google?
Is it used, as far as we know, as a metric for the search engine?
What should we do to mitigate the effects of this poor metric?!
thanks,
Mike
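For concreteness, the definition Mike quotes can be sketched as a simple calculation. This is a hypothetical illustration of the stated formula only, not Google's actual implementation; the function name and the session data shape are invented.

```python
# Hypothetical illustration of the quoted definition:
#   bounce rate = single-page visits / total visits
# The visit records below are invented example data.

def bounce_rate(sessions):
    """sessions: list of page-view counts, one entry per visit."""
    if not sessions:
        return 0.0
    single_page = sum(1 for pages in sessions if pages == 1)
    return single_page / len(sessions)

visits = [1, 3, 1, 5, 2, 1]          # six visits; three viewed only one page
print(f"{bounce_rate(visits):.0%}")  # 50%
```

Note this metric alone cannot distinguish a "bounce back" to the results page from a "bounce through" to another site, which is exactly the distinction the question raises.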
-
Actually, bounce rate would be of concern to search engines, at least for visits that originate from the search engine. The SEs want users to have a good experience, and if a user clicks on a result and then comes right back to the results page, the SEs may infer that the user did not have a good experience with that result, and that a different result should perhaps be shown for that query.
-
Thanks, yes. It looks from this as if the experts think that Google is doing what we would hope and not taking account of bounce-through. Although of course there may be good reasons for a site not wanting bounce-through either (as EGOL notes), it shouldn't be a concern for the search engines.
-
As far as I'm aware, Google will use your 'bounce back' rate (whereby users return to the search results page straight away) as a ranking metric, as this can indicate whether the site is relevant for that specific search query. This was mentioned in the 2011 SEO Ranking Factors Report.
Hope that helps
-
If search engines are using this data they are certainly only using it for sites competing for the same or similar keywords.
A high bounce rate can be bad or it can be "normal". It would be bad if your site is offensive (and people run away), it can be bad if your site has irrelevant content for the query, it can be bad if your site has thin content, you can probably think of more.
It can be normal if you have a dictionary site and the searcher finds the word, gets the definition and leaves happily.
THE IMPORTANT THING TO DO..... I believe that everyone should be working to reduce their bounce rate, and any webmaster should be able to find improvements.
The best way to do it is to have relevant links, obviously placed, on every page. For example, on the dictionary site your goal should be to have linked words within the definition, links to related words adjacent to the definition, and links to a few enticing articles along the side.
On an article site you can add links within the text to related articles, a "recommended" box of links beside the article, and even a few enticing links to "popular" or "related" articles where everyone will see them.
Try to reduce your bounce rate by improving your site and making your relevant content visible on every page.
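The dictionary-site suggestion above — linking headwords that appear inside a definition to their own entries — could be automated along these lines. This is a minimal sketch: the function name, the word list, and the `/define/` URL scheme are all assumptions for illustration.

```python
import re

# Minimal sketch: wrap known dictionary headwords found inside a
# definition with links to their own entries, giving the visitor a
# relevant next click. The /define/ URL scheme is hypothetical.

def link_related_words(definition, known_words):
    if not known_words:
        return definition
    # Longest words first so longer headwords win over substrings.
    alternation = "|".join(
        re.escape(w) for w in sorted(known_words, key=len, reverse=True)
    )
    pattern = re.compile(r"\b(" + alternation + r")\b", re.IGNORECASE)
    return pattern.sub(
        lambda m: f'<a href="/define/{m.group(1).lower()}">{m.group(1)}</a>',
        definition,
    )

html = link_related_words(
    "A sudden spell of dizziness or giddiness.",
    {"dizziness", "giddiness"},
)
print(html)  # both headwords wrapped in <a href="/define/..."> links
```

In practice you would feed this your real headword list and run it when rendering each entry, so every definition page carries the internal links the answer recommends.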
Related Questions
-
Do the back-links go wasted when anchor text or context content doesn't match with page content?
Hi Community, I have seen a number of back-links where the content in the link does not match the page content. Like page A linking to page B, but the content is not really relevant besides the brand name: for instance, a page about "vertigo tiles" linked to a page about "vertigo paints", where "vertigo" is the brand name. Will these kinds of back-links be completely wasted? I have also found some broken links which I'm planning to redirect to existing pages just to reclaim the back-links, even though the content relevancy is not much besides the brand name. Are these back-links beneficial or not? Thanks
Algorithm Updates | vtmoz
-
Google search console: 404 and soft 404 without any back-links. Redirect needed?
Hi Moz community, We can see 404 and soft-404 errors in Google Webmaster Tools. Usually these are non-existent pages which Google found somewhere on the internet. I can see that some of these reported URLs don't have any back-links (checked with the Ahrefs tool). Do we need to redirect each and every link reported here, or ignore them, or mark them as fixed? Thanks
Algorithm Updates | vtmoz
-
Any disadvantages of employing additional images which open in new window? Will it impact bounce rate and rankings?
Hi all, Our website is all about our software product. Generally our website pages are filled with 3 to 6 screenshots of our product's features. As Google has recently shifted to mobile-first indexing and page-load speed is going to be a priority, we decided to compress the images on our pages and show the same images at a larger size in a new window when someone clicks on an image. I wonder if this helps or has any disadvantages? Users may click on these clickable images while browsing the pages and may shift to the new window to view the image. Will this have any negative impact on bounce rate? Please share your thoughts. Thanks
Algorithm Updates | vtmoz
-
Our Journey back to Good Rankings.
17-year-old support site on the topic of hair loss. The home page (and pretty much all internal pages) enjoyed Page 1, Place 1 ranking out of 64 million search results for 12 of those years, for our main search phrase: hair loss. Other internal pages ranked #1 for other search phrases. I believe we were blessed by Google because we did everything the best we could: genuine, manually constructed, unique, relevant content created from the heart. Other generalized health sites linked to our site for more information on hair loss, and we had a couple thousand back-links that we never had to pay for. For the last 7 years or so, core content and the news center went stagnant, but user-driven content (discussion forums) continued chugging along. Very old CMS systems had created duplicate content (print pages, PDF pages, share pages) and the site was not mobile-friendly at all. By the end of 2013, our home page had been bumped to the middle of Page 2 for "hair loss" as Google began pushing us down, replacing our 700-page site dedicated to the topic of hair loss with random news articles and dermatology organization sites that had little more than a paragraph of content on the topic. Traffic and income dropped by over 75% with this change, and by 2015 we were looking at a 9-year-old site design that wasn't mobile-friendly and had no updated content outside of the forums for about as long. Mid 2015 we began a frantic renovation. The store was converted to a mobile-friendly design and moved to HTTPS, but our developer screwed up, forgetting to put canonicals in place. Soon after, our store rankings dropped to almost zero. By the end of 2015 this was fixed, and we were spending tens of thousands to convert a very large, very old site into WordPress with a responsive, mobile-friendly, lightning-fast page-load design. We had no Google Analytics data prior to this either.
Actions taken from Jan 1, 2016 to May 2016:
- Static homepage + core content > now in WordPress (80 pages), with proper 301s.
- News section running a 10-year-old "PostNuke" CMS > now in WordPress (300 pages). 301s.
- Forums running a 5-year-old vBulletin > now in XenForo (160,000 pages). 301s.
- Profiles section running a 10-year-old "SocialEngine" CMS > now in a new SocialEngine (10,000 pages).
- Site moved from HTTP > HTTPS, with proper 301s.
- Store CMS already finished months prior, but sales dropped by 90%. Almost zero.
- Old forum CMS had created countless duplicate URLs. All of these 410'd.
- Old forum CMS had 65,000 pointless member profile pages indexed. All 410'd.
- Old news CMS created 4+ duplicate pages for every article (print, etc.). All 301'd to the new article URL.
- Our HTACCESS file is thousands of lines long, trying to clean everything up and redirect everything back to one accurate, proper URL for each piece of content. It was a lot of work!
- After 17 years, we obviously had spammy sites linking to us. I quickly deleted the content on my site that the worst offenders were linking to, then hired an SEO person to create a disavow audit on the other 20,000 sites linking to us. He settled on around 300 URLs needing disavow, but commented that he didn't see any evidence we'd been penalized by Panda. He finished Friday and we will submit the disavow Monday.
- Ran a Screaming Frog audit on the site.
- Cleaned up Google Search Console fully; created properties and submitted new sitemaps there. Monitored each property for the last 3 months and addressed 100% of the issues raised.
- Revived Facebook, Twitter, Google+, Pinterest, and Instagram accounts.
- Began publishing new content in our /news/ section and cross-posting to social media.
- Began improving our title tags in the forums, as they often were pointless: "Hi! Need help!?"
Despite this, nothing has helped. Nothing has budged. Our traffic hasn't moved an inch since January. Sales have dropped 90% and site income has almost dried up. I have taken out a $25,000 personal loan just to cover my mortgage and pay my bills while I attempt to identify what's going wrong and how to fix it. It bought me about 3 months, and that 3 months is almost up. I hired 2 or 3 different SEO experts with varying levels of experience. Due to there being no Google Analytics data to draw on, none of them could come up with any specific explanation for our drop in rankings over the last 4 years. That's why I took the approach to just "do everything" to fix all the problems identified, and then cross my fingers. It hasn't worked. As of today our home page is not even found in Google for our main search phrase: hair loss. It's simply not there. At all. And the only thing that is ranking is our forums, ranked at #67, which is horrible. But I don't understand why a site that was doing so well for over a decade has now been completely dropped from Google, without a single notice in Search Console or otherwise explaining any problems. I realize this is a massive undertaking, and an equally massive post. But any time you can spend helping me will be forever appreciated.
Algorithm Updates | HLTalk
-
Moving to https and back to http, would it it hurt?
We have redirected everything on our blog from http to https. Our blog is in a subfolder, so it now looks like this: https://ourdomain/blog. But everything else, i.e. our shop, continues to be on http at http://ourdomain. We are wondering:
1. Does domain authority for SEO purposes have different values for the http and the https versions of a domain?
2. If yes, is there a way to check the difference in authority between the http and the https versions?
3. If we do have higher authority on our http version (as historically we have mostly been on http), would it make sense to move the blog back to http to enjoy that authority too?
4. Would changing our minds and going back to http after a few months of having just moved to https send any negative signals to Google? Would Google care if we do a back-and-forth, essentially?
Many thanks!
Algorithm Updates | TVape
-
Spam Back Link Removal Problem.
I have just paid a lot of money to have spam back-links removed from directories owned by the same person. The links were on pages that were set up for me without my knowledge, with my domain name at the end of each URL. The links have now been removed, leaving directory pages with no other links on them; however, each URL still ends with my domain name, and my domain name appears in each search box. I have asked for the pages to be removed altogether, as I said before I paid the money that I did not want my domain name on any of his directories. He has come back and said that leaving my domain name in the URLs is not a problem as far as Google is concerned. Can anyone please advise? I can ask for a refund from PayPal. There are over 768 links on different sections of a number of directories. Thank you in advance.
Algorithm Updates | Palmbourne
-
How do these people have so many external back links?
A few of the sites that I 'compete' with have Blogger accounts, and according to the research I've done they have over 500 million external, followed links! These are mom blogs, and they aren't trying to do any SEO. I've noticed that if you are using Blogger (as opposed to WordPress) and embedding YouTube videos in your blog, you can set it to auto-populate to YouTube and also get a link in return. From what I can gather this just happens with Blogger accounts, since Blogger, YouTube, and Google are all related. Does that have something to do with the crazy number of followed external links they are getting? They also have a domain authority of 96 (which is because Blogger is a trusted domain). Is the sub-domain a better number to look at, since their blog is a sub-domain on Blogger? Also, does anyone know how to get the same sort of exposure on YouTube as Blogger blogs get, or if it's even possible? Thanks.
Algorithm Updates | NoahsDad
-
Yahoo/Bing cache date went back in time
Within 12 hours of submitting a new site to Yahoo/Bing Webmaster Tools, it was ranking #3 for the primary homepage search term and in the top 5 for about a dozen others. On 7/23 the rankings were steady or climbing, with the most recent cache date of 7/21. Now the site only comes up when searching for the domain name, with a cache date of 7/11. I launched the site about 14 days ago, so I am not expecting results yet, but I had never seen this happen, so I am just curious whether anyone else has.
Algorithm Updates | jafabel