What's the best way to A/B test a new version of your website that has a different URL structure?
-
Hi Mozzers,
Hope you're doing well. We have a website that has been up and running for a good while, with millions of pages indexed in search engines. We're planning to go live with a new version of it, i.e. a new experience for our users and some changes to the site architecture, including a changed URL structure for existing URLs and the introduction of some new URLs as well.
Now, my question is: what's the best way to run an A/B test with the new version?
We can't launch it for just a subset of users (say, make it live for 50% of users while the remaining 50% still see the old/existing site), because the URL structure has changed and bots will get confused if they start landing on different versions.
Would this work if we reduce the crawl rate to zero for the duration of the A/B test? How would that impact us from an SEO perspective? And how will the old-to-new 301 redirects affect our users?
Have you ever faced or handled this kind of scenario? If yes, please share how you handled it, along with the impact. If this is new to you, I'd still love to hear your recommendations before we make the final call.
Note: We're taking care of all existing URLs by properly 301 redirecting them to their newer versions, but there are some new URLs that are supported only on the newer version (the architectural changes I mentioned above). These URLs aren't backward compatible; we can't redirect them to a valid URL on the old version.
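For illustration, the old-to-new mapping described above is often expressed as a redirect table consulted before serving a request. Here's a minimal sketch in Python; the URL paths are hypothetical placeholders, not the actual site's URLs:

```python
# Hypothetical old -> new URL mapping for a migration like the one
# described above. Real sites usually generate this from a pattern
# or database rather than listing every URL by hand.
REDIRECT_MAP = {
    "/old-category/item-123": "/new-category/item-123",
    "/old-about": "/about-us",
}

def resolve(path):
    """Return (status, location) for an incoming request path.

    Old URLs get a permanent 301 to their new equivalent;
    anything else is served as-is (200).
    """
    if path in REDIRECT_MAP:
        return 301, REDIRECT_MAP[path]
    return 200, path

print(resolve("/old-about"))  # -> (301, '/about-us')
print(resolve("/something-else"))  # -> (200, '/something-else')
```

A 301 tells both users' browsers and search engine bots that the move is permanent, which is why it's the standard choice for URL migrations; users are simply carried to the new URL transparently.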
-
Hi Martijn,
Yeah, we're not planning to block them via robots.txt, of course. By blocking, I meant temporarily reducing the crawl rate to zero to make sure we're not creating any URL-related confusion for bots.
But this might not be a good solution for our customers either: a customer might be redirected to /new-url on their first hit and then get an error on that URL in their next session (if that session is served by the old site).
-
Hi Nitin,
Yes, that's why I mentioned that you should block these URLs via robots.txt, so bots don't even find them in the first place.
-
Hi Martijn,
Thank you so much for sharing your thoughts on this, really appreciate that.
But the problem here is that we're planning to launch some new URLs that aren't backward compatible. Exposing them to bots isn't a good idea, since they won't work on the old website.
Also, if I 301 redirect /old-url to /new-url, I'd need to redirect it back to /old-url whenever a hit lands on the old website. This might confuse bots.
By the way, the number of URLs affected by this is almost the whole website, i.e. a very large number of indexed pages.
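To make the loop concern concrete: if the old stack 301s /old-url to /new-url while requests routed to the old site send /new-url back to /old-url, a crawler following redirects never reaches a final 200 page. A toy trace, with hypothetical URLs:

```python
# Hypothetical bidirectional redirects, as described above: each
# version of the site bounces the other version's URL back.
REDIRECTS = {
    "/old-url": "/new-url",  # new stack's mapping
    "/new-url": "/old-url",  # request routed to the old stack
}

def follow(path, max_hops=5):
    """Follow redirects until a non-redirecting URL or the hop limit."""
    hops = [path]
    while path in REDIRECTS and len(hops) <= max_hops:
        path = REDIRECTS[path]
        hops.append(path)
    return hops

# The chain alternates forever and is cut off only by the hop limit,
# which is roughly what a crawler sees as a redirect loop.
print(follow("/old-url"))
```

Most crawlers and browsers give up after a handful of hops and treat the URL as an error, so neither version of the page would end up indexed.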
-
Hi Nitin,
Don't change the crawl rate for an A/B test; it will probably hurt you more in the long run than it will do any good for the time being. It also depends on how many URLs are affected by the change. If it's only one page that will have a duplicate, I really wouldn't worry about it. If it's a dynamic page with thousands of variants, then please make sure you block these pages via robots.txt so search engines won't find them in the first place.
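For reference, blocking a section of new-architecture URLs for the duration of the test might look like the following robots.txt fragment. The path name is a hypothetical placeholder; Disallow rules match by URL-path prefix:

```text
# Hypothetical robots.txt entry blocking the new-architecture
# section until the migration is final.
User-agent: *
Disallow: /new-section/
```

One caveat worth keeping in mind: robots.txt controls crawling, not indexing. A disallowed URL that picks up links elsewhere can still appear in search results without content, so this is a stopgap for the test window rather than a permanent solution.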