Shouldn't a Lower Bounce Rate Correlate with a Greater Click-Through Rate for a Website?
-
Greetings:
I run a real estate website in New York City with about 650 pages, of which 330 are property listing pages. About 250 of those listing pages contain fewer than 150 words of content.
In late August I set about 250 of the listing pages that generated the least traffic (generally those with the least content) to "noindex, follow". Google has since removed those pages from its index.
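For reference, the directive being described would look something like this in the head of each listing page (a sketch; note the standard spelling is "noindex", without a hyphen):

```html
<meta name="robots" content="noindex, follow">
```

This tells Google to drop the page from its index while still following (and passing signals through) the links on it.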
The overall bounce rate for the site has dropped from about 69% to about 64% since the removal of these low-quality listing pages.
However, the click-through rate has not improved; it is stuck at about 2.2 pages per visitor.
Shouldn't the click-through rate improve if the bounce rate goes down? Am I missing something?
Also, is a lower bounce rate something that Google will take into account when calculating rank?
Thanks, Alan
-
Good idea; however, according to Google Webmaster Tools, under Google Index > Index Status the number of indexed pages has been dropping. It is down by 120, which is about half of the 250 pages we set to "noindex, follow" on August 20th. I suspect it may actually be down a bit more, as the Webmaster Tools figures may lag.
I just can't explain why pageviews per visitor have not increased if the bounce rate is down. The bounce rate decreased from about 69% in August to 63% in September, which means 37% of visitors are staying on the site instead of 31% — a significant relative improvement (roughly 18–19%). I would expect this to translate into more pageviews per visitor, but it has not: pageviews per session were 2.38 in August and 2.18 in September. This seems impossible.
Thanks, Alan
-
Hey Alan,
You said that you made these changes in late August. Could it be that Google hasn't updated this number in the week or so since you made the changes? It does seem odd that the number would stay exactly the same.
-
Hi Alan,
I assume you mean pageviews per visit rather than click-through rate, since you mentioned 2.2 pages/visit.
Pageviews per visit is the average number of pages viewed per session, while bounce rate is the share of single-page visits.
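To make the distinction concrete, both metrics fall out directly from a list of per-session pageview counts (the data here is made up purely for illustration):

```python
# Per-session pageview counts for a handful of visits (made-up data).
sessions = [1, 1, 5, 1, 3, 1, 2, 8]

# A "bounce" is a single-page session.
bounce_rate = sum(1 for s in sessions if s == 1) / len(sessions)

# Pageviews per visit averages over ALL sessions, bounces included.
pages_per_visit = sum(sessions) / len(sessions)

print(bounce_rate)      # 0.5  -- 4 of the 8 sessions viewed one page
print(pages_per_visit)  # 2.75
```

Note that every bounced session still contributes exactly one pageview to the average, which is why the two metrics are related but not interchangeable.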
Normally the answer is yes: a lower bounce rate usually correlates with a higher average pageviews per visit. However, the correlation is not that strong, since many other factors are involved (traffic sources, landing pages, search keywords, and so on).
The assumption here is that you are getting the same quality of visitors but with fewer single-page visits, which would normally push pageviews per visit up. However, a bounce rate moving from 69% to 64% is not that big a difference, and I am not sure what the sample size is for these visits.
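One way to see that both numbers can fall at once without any contradiction: since every bounced session counts as exactly one pageview, pageviews per visit blends bounces with engaged visits, and the average depth of the engaged visits can drop even as fewer people bounce. A quick back-of-the-envelope sketch using Alan's reported figures and an assumed 1,000 visits per month (the real visit totals were not given):

```python
def engaged_depth(visits, bounce_rate, pages_per_visit):
    """Average pageviews among non-bounced (engaged) sessions,
    assuming every bounced session counts as exactly 1 pageview."""
    bounced = visits * bounce_rate
    total_pages = visits * pages_per_visit
    return (total_pages - bounced) / (visits - bounced)

# Hypothetical 1,000 visits each month; rates/averages as reported.
aug = engaged_depth(1000, 0.69, 2.38)  # ~5.45 pages per engaged visit
sep = engaged_depth(1000, 0.63, 2.18)  # ~4.19 pages per engaged visit
print(round(aug, 2), round(sep, 2))
```

So the two months are internally consistent: fewer visitors bounced in September, but the visitors who stayed viewed noticeably fewer pages each — plausibly because the 250 removed listing pages were among the pages engaged visitors used to click through.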
I would recommend checking which landing pages already have high pageviews per visit and focusing your marketing efforts there; that should lift the site-wide average pageviews per visit.
With regard to bounce rate affecting rankings: this would only apply to the bounce rate on organic traffic, since Google cannot actually see your overall website bounce rate (or at least they claim they don't use Analytics data). So make sure your top organic landing pages are well optimized for their target terms, with clear calls to action, to avoid bounces there.
Hope this was helpful.
Have a great day,
Moe
-
Moz Tools
Chat with the community about the Moz tools.
-
SEO Tactics
Discuss the SEO process with fellow marketers
-
Community
Discuss industry events, jobs, and news!
-
Digital Marketing
Chat about tactics outside of SEO
-
Research & Trends
Dive into research and trends in the search industry.
-
Support
Connect on product support and feature requests.