A/B Tests: How To Verify Difference In Average Order Value?
-
Hi there! When the data from an A/B test shows a difference in AOV between the variants, how do you determine whether this difference is statistically significant or not? Best Regards, Martin
-
Thank you, David.
-
Perhaps this will get you on the right track, from http://flintanalytics.com/how-to-determine-if-you-should-advertise-higher-priced-goods-to-mac-users/
Average Order Value Statistical Significance
To do this you need to download the transaction data for both variants, because you need to be able to determine the variance in order value. You can download this data by going to Conversions > Ecommerce > Transactions in Google Analytics. Once there, note that Google only lets you export 500 rows of transactions at a time, so to keep it simple I made sure each date range contained fewer than 500 transactions.
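Because of that 500-row export limit, you'll usually end up with several CSV files per variant that need to be stitched back together. Here is a minimal sketch in Python; the file names and the "Revenue" column name are assumptions about how your export is laid out, so adjust them to match your actual files:

```python
import csv

def load_order_values(files, column="Revenue"):
    """Combine several chunked transaction exports (max 500 rows each)
    into a single list of per-order values for one variant."""
    values = []
    for path in files:
        with open(path, newline="") as f:
            # Each export is assumed to have a header row naming the columns.
            for row in csv.DictReader(f):
                values.append(float(row[column]))
    return values

# Usage (file names are hypothetical):
# group_a = load_order_values(["variant_a_week1.csv", "variant_a_week2.csv"])
# group_b = load_order_values(["variant_b_week1.csv", "variant_b_week2.csv"])
```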
Once you have the order values for each set of customers in Excel, you need to perform a t-test to compare the means of the two groups. The t-test gives you the p-value that you will use to determine whether the difference in average order value between the groups is significant. Your formula in Excel should look like the following:
=T.TEST(array of group 1, array of group 2, 2, 2)
In Excel's T.TEST the third argument sets the tails (2 = two-tailed distribution) and the fourth sets the test type (2 = two-sample equal variance, i.e. homoscedastic).
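If you'd rather not do this in Excel, the same two-tailed, two-sample equal-variance t-test can be run in Python with SciPy. A minimal sketch, with made-up order values standing in for the two exported groups:

```python
from scipy import stats

# Per-transaction order values for each variant (illustrative numbers only).
variant_a = [54.20, 61.10, 48.75, 70.00, 52.30, 66.40, 58.90, 49.95]
variant_b = [72.10, 68.50, 80.25, 75.60, 69.90, 83.40, 77.15, 71.80]

# equal_var=True matches Excel's T.TEST(..., 2, 2): a two-tailed,
# two-sample equal-variance (homoscedastic) Student's t-test.
t_stat, p_value = stats.ttest_ind(variant_a, variant_b, equal_var=True)

# A p-value below your chosen threshold (commonly 0.05) suggests the
# difference in average order value is statistically significant.
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
```

With real data you would feed in the full transaction lists from both variants rather than eight values each; the larger the samples, the more reliable the p-value.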
Related Questions
-
Is it possible to reverse a G algo update (Penguin/Panda)?
...if yes, how? Can you share resources / blogs / etc.? I want to reverse my site's rankings. Here's the gist of it: I recently purchased a website that has 600+ pieces of aged content on it. The domain was ranking great about 10 years ago (1M uniques a year). It apparently got hit by a G algo update in 2012/2013 (Penguin and Panda?), because the rankings have tanked (10 hits a day). In the past two years, the previous owner published about 100+ off-topic blog posts and appears to have been using the site as a PBN. The UX sucks and there's a ton of 404s. (NOTE: I am in the process of removing that content and have cleaned up the 404s.) Domain stats: 20+ years old (1998), DA 32, 850+ linking domains, 16k+ inbound links. What I've done: disavow (550 domains), fix all the 404s. What I'm doing / about to do: remove spammy content, write new/fresh on-topic content, update the site UX, start a backlink-building campaign. My questions: Is it common to bounce back from a G algo update? Is it hard / am I in over my head / am I a sucker for trying to bring the site back to life? Are there articles about bouncing back that you can share so I can learn more about this process? Or agencies / consultants, etc. that you recommend? What other recommendations / suggestions do you have that would help reverse this 8-year-old penalty?
Reporting & Analytics | seo.owl0
-
Cross-domain / subdomain tracking in GA?
Hi there, My client has a site, website.com, and a booking engine, booking.website.com. They are currently tracking the main site and the booking subdomain as two separate properties in the same GA account. The issue is we can't see where users are originating on the subdomain property; it's all being counted as referral traffic. My understanding is we need to set up subdomain tracking using Google Tag Manager in order for GA to pass the user data between the two subdomains. This is fine, except for this one line I am reading in Google's guide to cross-domain tracking: "Subdomains: If you have updated your tracking code to analytics.js, then no additional configuration is required to track subdomains. You can use cross domain tracking to collect data from a primary domain, like www.example.com, and a subdomain, like www.subdomain.example.com, in a single Analytics account property." That last line makes it sound like we should be using cross-domain tracking for this purpose. Are we correct in setting up subdomain tracking and NOT cross-domain tracking to be able to track users across subdomains on the same domain?
Reporting & Analytics | FPD_NYC0
-
Google Analytics shows a 404 URL for google / cpc that doesn't even exist in my database
Hello All, In Google Analytics, google / cpc is showing a URL that returns a 404 and doesn't even exist in my database, more than 300 times per day. How is this possible? It is showing /black-friday-offers, but I don't have such a page. Thanks!
Reporting & Analytics | pragnesh96390
-
Google Analytics Average Position
I'm looking at the Google Analytics -> Acquisition -> Search Engine Optimization -> Queries report, specifically at keywords and their average position. What Google reports and what I see in a Google incognito search are different (usually my actual position is much lower). For example, for one search term, Google reports an average position of 5.8, but every time I search, the position is 8. My local result is 4. Anyone know why this is? I'm wondering if Google is averaging the local results into the number?
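That averaging explanation is easy to sanity-check with arithmetic. As a toy illustration (all impression counts below are made up, just to show the mechanics): if the reported figure is an impression-weighted average over different result types or regions, a mix of position 8 and position 4 can come out at exactly 5.8:

```python
# Hypothetical impression counts for one query, split by result type.
positions_and_impressions = [
    (8, 9),   # organic result at position 8, seen 9 times
    (4, 11),  # local result at position 4, seen 11 times
]

total_impressions = sum(n for _, n in positions_and_impressions)

# Impression-weighted average position, the kind of blend that can
# produce a number you never see in any single search.
weighted_avg = (
    sum(pos * n for pos, n in positions_and_impressions) / total_impressions
)

print(weighted_avg)  # 5.8 for these made-up counts
```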
Reporting & Analytics | CalicoKitty20000
-
Index.php and /
Hello, We have a PHP system, and in the Moz error report our index.php shows up as a duplicate of / (home page). I added a rel=canonical on index.php because / ranks better than the other. That said, the Moz error report still shows them as duplicates. Should I be using a 301 instead? Please help! Also, I would love a good technical SEO book (for bridging the gap between SEO and programmer) if someone can recommend one. Thanks in advance!
Reporting & Analytics | lfrazer0
-
How come the results from the old Google Keyword Tool are so different from the results from the Keyword Planner?
When I insert the keyword 'IT jobs' with Belgium as country and Dutch as the language, there is a huge difference between the results in the keyword planner and the keyword tool. Keyword tool says 1,000,000 local searches per month Keyword planner says 590 local searches per month This is a really big difference and I don't know which results I should trust. Can anybody help me with this?
Reporting & Analytics | Murielleu0
-
Any harm and why the differences - multiple versions of same site in WMT
In Google Webmaster Tools we have set up:
ourdomain.co.nz
ourdomain.co.uk
ourdomain.com
ourdomain.com.au
www.ourdomain.co.nz
www.ourdomain.co.uk
www.ourdomain.com
www.ourdomain.com.au
https://www.ourdomain.co.nz
https://www.ourdomain.co.uk
https://www.ourdomain.com
https://www.ourdomain.com.au
As you can imagine, this gets confusing and hard to manage. We are wondering whether having all these domains set up in WMT could be doing any damage. Here http://support.google.com/webmasters/bin/answer.py?hl=en&answer=44231 it says: "If you see a message that your site is not indexed, it may be because it is indexed under a different domain. For example, if you receive a message that http://example.com is not indexed, make sure that you've also added http://www.example.com to your account (or vice versa), and check the data for that site." The above quote suggests that there is no harm in having several versions of a site set up in WMT. However, the article then goes on to say: "Once you tell us your preferred domain name, we use that information for all future crawls of your site and indexing refreshes. For instance, if you specify your preferred domain as http://www.example.com and we find a link to your site that is formatted as http://example.com, we follow that link as http://www.example.com instead." This suggests that having multiple versions of the site loaded in WMT may cause Google to continue crawling multiple versions instead of only the desired versions (https://www.ourdomain.com plus .co.nz, .co.uk and .com.au). However, even if Google does crawl any URLs on the non-https versions of the site (i.e. ourdomain.com or www.ourdomain.com), these 301 to https://www.ourdomain.com anyway... so shouldn't that mean that Google effectively cannot crawl any non-https://www versions (if it tries to, they redirect)? If that were the case, you'd expect the ourdomain.com and www.ourdomain.com versions to show no pages indexed in WMT; however, the opposite is true. The ourdomain.com and www.ourdomain.com versions have plenty of pages indexed, but the https versions have no data under the Index Status section of WMT, showing this message instead: "Data for https://www.ourdomain.com/ is not available. Please try a site with http:// protocol: http://www.ourdomain.com/." This is a problem, as it means that we can't delete these profiles from our WMT account. Any thoughts on the above would be welcome. As an aside, it seems like WMT is picking up on the 301 redirects from all the ourdomain.com and www.ourdomain.com domains, at least with links: no ourdomain.com or www.ourdomain.com URLs are registering any links in WMT, suggesting that Google is seeing all links pointing to URLs on these domains as 301ing to https://www.ourdomain.com... which is good, but again means we now can't delete https://www.ourdomain.com either, so we are stuck with 12 profiles in WMT... what a pain. Thanks for taking the time to read the above, quite complicated, sorry!! Would love any thoughts...
Reporting & Analytics | zingseo0
-
Differences in organic search visits and non-paid keyword visits
Hi folks, I was just wondering about the disparity between the "Organic Search Visits" total in the Traffic Data tab and the total visits from "Non Paid Keywords Sending Search Visits". Once you add up all the traffic generated by the individual keywords (including the not-provided numbers), shouldn't the total match the number of organic search visits? Or is there something I'm missing? At the moment the total visits from non-paid keywords is about 2,000 short of the total organic search visits.
Reporting & Analytics | BrettCollins0