Visual Website Optimizer vs. Optimizely vs. Conductrics
-
I've always done my own A/B testing and haven't used a software package. I've heard a lot of good things about Visual Website Optimizer. However, I like that Conductrics uses statistical modeling where the others don't. So what do you see as being the really big pluses?
Note: I need to find a platform so that we can do testing faster and for more clients at a time... I just don't have the time to hand-build and track each client's multivariate tests. Thus the need for software. Therefore, ease of use, quick integration, and good analytics are important in my decision-making process.
Thanks in advance!
-
I meant anger - it's just shorthand for "haven't used it on anything live". I've played around with demos etc. but haven't used it personally on live projects.
I believe they're doing what's called bandit optimization in the statistical literature. Fascinating area of study. I like it from a mathematical perspective - though I prefer declaring a specific winner and then implementing that version rather than always having the split-test software running and serving up variations.
It feels tidier, is easier to maintain, and results in a faster website...
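For anyone curious what bandit optimization looks like in practice, here's a minimal epsilon-greedy sketch. This is purely illustrative - Conductrics hasn't published the details of its algorithm, and the class and parameter names here are my own invention:

```javascript
// Minimal epsilon-greedy bandit sketch. Each "arm" is a page variation;
// reward is 1 if the visitor converted, 0 if not.
class EpsilonGreedyBandit {
  constructor(numArms, epsilon) {
    this.epsilon = epsilon;                    // fraction of traffic spent exploring
    this.counts = new Array(numArms).fill(0);  // visits served per variation
    this.values = new Array(numArms).fill(0);  // running conversion rate per variation
  }

  // Pick a variation: usually the current best, occasionally a random one.
  selectArm() {
    if (Math.random() < this.epsilon) {
      return Math.floor(Math.random() * this.counts.length); // explore
    }
    return this.values.indexOf(Math.max(...this.values));    // exploit
  }

  // Record the outcome and update that arm's running conversion rate.
  update(arm, reward) {
    this.counts[arm] += 1;
    // Incremental mean: newAvg = oldAvg + (reward - oldAvg) / n
    this.values[arm] += (reward - this.values[arm]) / this.counts[arm];
  }
}
```

The key difference from a classic A/B test is visible in `selectArm`: traffic shifts toward the winner continuously rather than splitting evenly until you call the test - which is exactly the "always running and serving up variations" behavior I mentioned.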
-
Will,
Thanks for the awesome feedback - especially about the page snapshots.
"I haven't used Conductrics in anger" - not sure whether you meant anger or ages, but either way I'd be interested to know why you haven't used it. I like the idea behind machine learning and real-time implementation, but I haven't found a lot of people who have much to say about it. On the flip side, I'm always a little nervous that the machine learning process will leave out key information that could only be noticed by a human and therefore optimize incorrectly - thus, I lean more towards Optimizely/VWO.
Thanks again!
Steve
-
I haven't used Conductrics in anger, but have recently started using Optimizely more - I really like the fact that it's all jQuery based. The big benefits of that for us are:
- Super-easy point-and-click interface for non-technical team members to make changes
- More advanced (edit jQuery code) options for more involved tests
- Not limited to page snapshots (as I believe VWO is) - this was something that caused us issues with our XSS protection scripts and meant we couldn't use VWO effectively on distilled.net
Hope that helps - sorry I haven't got comprehensive comparison data. Optimizely has a free trial though so you can test it out for yourself (really quick to get up and running).
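As a rough illustration of one thing these tools all do under the hood: each returning visitor has to keep seeing the same variation, so the visitor ID (usually a cookie) is hashed into a stable bucket. A sketch of that idea - the function names and hash here are hypothetical, not Optimizely's actual implementation:

```javascript
// Simple 32-bit rolling hash of a string (illustrative, not cryptographic).
function hashString(s) {
  let h = 0;
  for (let i = 0; i < s.length; i++) {
    h = (h * 31 + s.charCodeAt(i)) >>> 0; // keep within unsigned 32 bits
  }
  return h;
}

// Deterministically assign a visitor to one of the named variations,
// so the same visitor always sees the same version of the page.
function assignVariation(visitorId, variations) {
  // variations: e.g. ['control', 'new-headline', 'new-cta']
  return variations[hashString(visitorId) % variations.length];
}
```

Because the assignment is a pure function of the visitor ID, no server-side state is needed to keep the experience consistent across visits.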