What's the best way to A/B test a new version of your website that has a different URL structure?
-
Hi Mozzers,
Hope you're doing well. We have a website that has been up and running for a long time, with millions of pages indexed in search engines. We're planning to go live with a new version of it, i.e. a new experience for our users, with some changes in site architecture, including changes to the URL structure of existing URLs and the introduction of some new URLs as well.
Now, my question is: what's the best way to do an A/B test with the new version?
We can't simply launch it for a portion of users (say, making it live for 50% of users while the remaining 50% continue to see the old/existing site) because the URL structure has changed, and bots will get confused if they start landing on different versions.
Would it work if I reduced the crawl rate to zero for the duration of the A/B test? How would this impact us from an SEO perspective? And how will those old-to-new 301 redirects affect our users?
Have you ever faced or handled this kind of scenario? If so, please share how you handled it and what the impact was. If this is new to you, I'd love to hear your recommendations before we make the final call.
Note: We're taking care of all existing URLs, properly 301 redirecting them to their newer versions. However, some new URLs are supported only on the newer version (the architectural changes I mentioned above); these URLs aren't backward compatible, so we can't redirect them to a valid URL on the old version.
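For what it's worth, the old-to-new redirects we have in mind could be modeled as a simple lookup that issues a 301; a minimal sketch (the paths below are hypothetical examples, not our real URL structure):

```python
# Hypothetical old-to-new URL mapping; in practice this would be
# generated from the full list of existing indexed URLs.
REDIRECT_MAP = {
    "/old-category/page-1": "/new-category/page-1",
    "/old-category/page-2": "/new-category/page-2",
}

def redirect_for(path: str):
    """Return (status, location): 301 plus the new URL if mapped, 404 otherwise."""
    target = REDIRECT_MAP.get(path)
    return (301, target) if target else (404, None)
```

The point of the permanent (301) status, as opposed to a 302, is that it tells search engines to transfer the old URL's signals to the new one.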
-
Hi Martijn,
Yeah, we're not planning to block them via robots.txt, of course. By blocking, I meant temporarily reducing the crawl rate to zero to make sure we're not creating any URL-related confusion for bots.
But this might not be a good solution for our customers either, as a customer might be redirected to /new-url on their first hit and then get an error in their next session.
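That inconsistency between sessions could be avoided with deterministic bucketing, so the same visitor always lands in the same variant. A rough sketch, assuming we have some stable visitor ID (a cookie or login ID; the names here are hypothetical):

```python
import hashlib

def bucket(visitor_id: str, split: float = 0.5) -> str:
    """Assign a visitor to 'old' or 'new' deterministically, so the
    same visitor sees the same variant on every session."""
    # Hash the ID so assignment is stable and roughly uniform.
    digest = hashlib.sha256(f"ab-test:{visitor_id}".encode()).hexdigest()
    fraction = int(digest[:8], 16) / 0xFFFFFFFF  # map to [0, 1]
    return "new" if fraction < split else "old"
```

With this, a visitor who was 301-redirected to /new-url on their first hit would be routed to the new site again next session rather than hitting an error.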
-
Hi Nitin,
Yes, that's why I mentioned that you should block these URLs via robots.txt so bots don't even find them in the first place.
-
Hi Martijn,
Thank you so much for sharing your thoughts on this, really appreciate that.
But the problem here is that we're planning to launch some new URLs that aren't backward compatible. Exposing them to bots isn't a good idea, since they won't work on the old website.
Also, if I 301 redirect /old-url to /new-url, I would need to redirect it back to /old-url whenever a hit goes to the old website. This might confuse bots.
By the way, the number of URLs affected by this is almost the whole website, i.e. a very large number of indexed pages.
-
Hi Nitin,
Don't change the crawl rate for an A/B test; it will probably hurt you more in the long run than it will do any good for the time being. In this case it also depends on how many URLs are affected by the change. If it's only one page that will have a duplicate, I really wouldn't worry about it. But if it's a dynamic page with thousands of variations, then please make sure you block these pages via robots.txt so search engines won't find them in the first place.
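As a sanity check before launch, Python's standard-library robotparser can verify that a Disallow rule actually covers the new URL pattern; the /new- prefix below is just an illustrative example:

```python
from urllib import robotparser

# Hypothetical robots.txt rules blocking the new URL prefix.
rules = """User-agent: *
Disallow: /new-
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# New-structure URLs should be blocked; old ones should stay crawlable.
blocked = rp.can_fetch("*", "https://example.com/new-category/page-1")
allowed = rp.can_fetch("*", "https://example.com/old-category/page-1")
```

Here `blocked` comes back False (the rule applies) and `allowed` comes back True, confirming the Disallow line covers only the new prefix.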