Discrepancy between Google analytics and Alexa
-
I understand that the data in Google Analytics and Alexa will probably differ a little, but in our case we see a substantial difference between the metrics the two report. For example, Alexa shows approximately 11 page views per visit, whereas Google Analytics gives a figure closer to 5 pages/visit. The average time spent per visit is approximately 15 minutes on Alexa but closer to 4 minutes in Google Analytics!
Any inputs on why this huge discrepancy would exist?
TIA
Asif
-
I'll second what Martijn is saying: Alexa can be hugely inaccurate, and the smaller your site is (traffic-wise), the more inaccurate it gets. If you have the Alexa toolbar installed in your browser, then you are inflating your own site's Alexa stats, since you spend more time on your site than anyone else.
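To put some (entirely made-up) numbers on that, here is a quick sketch of how a single owner's sessions can dominate a toolbar panel's view of a small site. All figures are hypothetical, just to show the arithmetic:

```python
# Illustrative arithmetic (hypothetical numbers): how a site owner who has the
# Alexa toolbar installed can skew the panel's pages-per-visit figure for
# their own low-traffic site.

def blended_pages_per_visit(owner_visits, owner_pages_per_visit,
                            other_visits, other_pages_per_visit):
    """Average pages/visit across all toolbar-tracked visits."""
    total_pages = (owner_visits * owner_pages_per_visit
                   + other_visits * other_pages_per_visit)
    return total_pages / (owner_visits + other_visits)

# Suppose in a month the panel records 30 visits from the owner (long,
# frequent sessions) and only 9 visits from ordinary toolbar users.
skewed = blended_pages_per_visit(owner_visits=30, owner_pages_per_visit=25,
                                 other_visits=9, other_pages_per_visit=4)
true_avg = 4.0  # what ordinary visitors actually do

print(round(skewed, 1))  # panel's blended pages/visit, heavily owner-weighted
print(true_avg)          # pages/visit without the owner's own sessions
```

With those assumed numbers, the blended figure comes out around 20 pages/visit against a true 4, which is the kind of gap Asif is describing.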
I only use Alexa the odd time, to get a very rough idea of competitors' traffic.
-
If you ask me, Alexa and GA are not just slightly different; in many cases you will see major differences, even bigger than the ones you are facing on your websites.
I believe you should rely on your GA data, as it reports what was actually measured, whereas Alexa usually gives an estimate.
Hope this helps!
-
Hi Asif,
It's fairly simple: Alexa uses a really small sample of users to estimate how many visitors you have. Only people with the Alexa toolbar installed are measured as they visit pages. They probably happen to visit more pages on your site than the average visitor, and because the data is extrapolated from that small sample, the figures end up far off. In my experience, Alexa's visitor data is a decent guideline for comparing how much traffic one site gets relative to another, but for high-traffic sites the absolute numbers can be off by millions.
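The extrapolation point above can be simulated in a few lines. This is a minimal sketch with invented numbers (panel size, population, visit rate are all assumptions), not Alexa's actual methodology; it just shows how noisy a population-level estimate gets when it is scaled up from a tiny panel:

```python
# Minimal sketch (hypothetical numbers): extrapolating site traffic from a
# tiny toolbar panel produces unstable estimates, even when the panel itself
# is unbiased.

import random

random.seed(42)

POPULATION = 1_000_000    # internet users in the market (assumed)
TOOLBAR_PANEL = 2_000     # users with the measurement toolbar (assumed)
TRUE_VISIT_RATE = 0.003   # fraction of users who actually visit the site

true_visitors = int(POPULATION * TRUE_VISIT_RATE)  # 3,000 real visitors

estimates = []
for trial in range(5):
    # Count how many panel members happen to be among the site's visitors.
    panel_hits = sum(random.random() < TRUE_VISIT_RATE
                     for _ in range(TOOLBAR_PANEL))
    # Scale the panel's count up to the whole population.
    estimates.append(panel_hits * (POPULATION // TOOLBAR_PANEL))

print(true_visitors, estimates)
```

Each panel hit is worth 500 extrapolated visitors here, so a swing of just a few toolbar users moves the estimate by thousands; a site measured directly (as Google Analytics does, by tagging every page view) has no such scaling step.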