Best approach to launch a new site with new URLs - same domain
-
We have a high-volume e-commerce website with over 15K items, an average of 150K visits per day, and 12.6 pages per visit. We are launching a new website this spring, currently on a beta subdomain, and we are looking for the best strategy that preserves our current search rankings while throttling traffic to the new site (possibly 25% per week) to measure results.
The new site will be soft launched, as we plan to slowly migrate traffic to it via a load balancer. This way we can monitor performance of the new site while still having the old site as a backup. Only when we are fully comfortable with the new site will we put the 301 redirects in place and migrate everyone over. We will have a month or so of running both sites.
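To make the throttling idea concrete, here is a minimal sketch (Python, all names hypothetical) of the kind of deterministic split we have in mind at the load balancer, so a given visitor always sees the same version of the site across requests:

```python
import hashlib

NEW_SITE_PERCENT = 25  # bumped each week as confidence in the new site grows

def route_to_new_site(visitor_id: str) -> bool:
    """Deterministically bucket a visitor so they stay on the same
    version of the site for the duration of the soft launch."""
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % 100 < NEW_SITE_PERCENT

# Roughly 25% of visitor IDs fall into the new-site bucket.
print(route_to_new_site("visitor-cookie-abc123"))
```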
Except for the homepage, the URL structure of the new site is different from that of the old site.
What is our best strategy so we don’t lose ranking on the old site and start earning ranking on the new site, while avoiding duplicate content and cloaking issues?
Here is what we got back from a Google post, which may highlight our concerns better:
Thank you,
Sincerely,
Stephan Woo Cude
SEO Specialist
-
Hi there,
I was just reading this old thread to get some info, but I'd love it if you could share your actual results from the launch. What did you do, and how much did traffic change? How long before you were back to normal?
I usually find that with a new website and all-new URLs, I end up seeing maybe a month or so of a dip in traffic that can be up to 10%. But that seems to be less and less as time goes on. The search engines are usually on top of it, though; they recrawl and recatalog quite quickly.
Would love to hear from you.
Thanks!
Leslie
- 9 months earlier
-
Just to chime in on this, albeit maybe a little late now... I had the same thought as I was reading through: use rel=canonical to point the old pages to the new ones for now, so the search engines don't have any duplicate content issues until a 301 redirect can take over once the new site is fully launched.
However, depending on your rollout schedule, this would mean that the search engines would soon be indexing only the new pages. You'd need to ensure that the traffic diverter you are using can handle this. Otherwise, you could put the rel=canonical on the new pages for now, which would avoid the duplicate content until you are fully launched. Then you'd remove it and 301 redirect the old pages to the new.
Just something you maybe want to think about! Hopefully your traffic diverter can handle this though.
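If it helps to picture it, here's a rough sketch of that interim mapping (Python; the URL pair is taken from the examples in this thread, everything else is hypothetical):

```python
# Old-site URL -> new-site URL (pair taken from the examples in this thread).
OLD_TO_NEW = {
    "/Mens-Clothing.html": "/mens-clothing~d~15/",
}

def canonical_tag(old_path, domain="http://www.sierratradingpost.com"):
    """Build the <link rel="canonical"> element an old page would emit
    so the engines consolidate ranking signals on its new counterpart."""
    new_path = OLD_TO_NEW.get(old_path)
    if new_path is None:
        return ""  # no counterpart yet: leave the page self-canonical
    return '<link rel="canonical" href="%s%s" />' % (domain, new_path)

print(canonical_tag("/Mens-Clothing.html"))
```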
-
Thank you very much for the insight!
-
Ah ok. I understand now. I wasn't picking up on what you were saying before.
If with the soft launch you are already putting the "new" version of the site on its intended final URLs, then yes, you can let the engines start crawling those URLs. For each new URL you let the search engines crawl, make sure to 301 its corresponding old URL (on the old site) to the new version to minimize any duplicate content issues.
If for whatever reason you can't quite 301 the old URLs yet (for example, if you still need instant access to reroute traffic back to them), you could try using rel=canonical on the old pages to point them to their new counterparts, but only if the main content on each pair of pages is almost exactly the same. You don't want Google to think you're manipulating them with rel=canonical.
-
Sorry this is so confusing and thank you so much for your responses... there would be no subdomain when we do the soft launch... it would be http://www.sierratradingpost.com/Mens-Clothing.html (old site) vs http://www.sierratradingpost.com/mens-clothing~d~15/ (new site)...
-
As I'd said, there really isn't a reason to let them get a head start. The URLs will be changing when you transition the new site out of the subdomain (i.e. beta.sierratradingpost.com/mens vs sierratradingpost.com/mens - those are considered 2 completely different URLs), and the engines will have to recrawl all of the new pages at that point anyway.
-
We do plan to do that... it is just that, since we plan a soft launch, we will essentially have 2 sites out there. We are wondering when to remove the noindex from the new site. We will have 2 sites for about a month... should we let the bots crawl the new site (new URLs, same domain) only once we take down the old site and have the 301s in place, or let Google crawl earlier to give the new site a head start on indexing?
-
And when you drop the subdomain, you definitely want to 301 all of the old site structure's URLs to their corresponding new URLs. That way nothing gets lost in the transition.
-
We would drop the subdomain - so we would have 2 "Men's Clothing" department pages - different URLs, slightly different content...
-
Yeah, just refer to our conversation above as I think it will pertain better to your situation.
-
The only issue is that you have to keep in mind that Google/Bing define pages on the internet by their URLs, not their content. The content only describes the pages.
So if you let the engines pre-crawl the pages before dropping the subdomain - simply for the reason of letting them have a "sneak peek" - you won't really be doing yourself much of a favor, as the engines will just be recrawling the content on the non-subdomain URLs as if it were brand new anyway.
The reason to do it the pre-crawl way would be if you're already building backlinks to the new beta pages. Then it could make sense to let the engines index those pages and 301 them to their new non-subdomain versions later. In my opinion, the benefit from this route would outweigh any potential duplicate content issues.
-
But the URL structure is different... does that matter?
-
What YesBaby is talking about is something like Google's Website Optimizer. When someone goes to sierratradingpost.com/mens-stuff, for example, it will give 50% of the people the old version of the site for that page, and the other 50% the new version. It will eliminate any duplicate content issues, as the 2 page variations will still be attached to the same exact URL.
Definitely a viable option if it fits with your game plan of how you want to do things.
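To illustrate the concept server-side (this is not how Website Optimizer itself works under the hood - just a minimal sketch in Python/Flask, with hypothetical template names and cookie):

```python
from flask import Flask, render_template, request

app = Flask(__name__)

@app.route("/mens-stuff")
def mens_stuff():
    # One URL for everyone; only the template varies per visitor, so
    # there is no second URL for the engines to treat as duplicate content.
    in_new_group = request.cookies.get("site_version") == "new"
    template = "mens_stuff_new.html" if in_new_group else "mens_stuff_old.html"
    return render_template(template)
```

However the bucketing is done, the important part is that both variants answer at the same URL.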
-
Since all of the URLs except for the homepage are changing - what do you think about letting the new site get crawled maybe 2 weeks before it is 100% launched? We would have some duplicate content issues, but I am hoping this would give us a head start with the new site... then when we go 100% we add the 301s and a new sitemap. It is my understanding we will be dropping the subdomain for the soft launch.
Thank you so much!
-
First of all - I love the new design. It looks great!
The absolute best way to go about it, in my opinion, would be to simply have the new site ready, and then launch it fully under the base domain (no subdomain) while 301 redirecting important old pages to their related new versions. That way the search engines will have the easiest time discovering and indexing the new site, while proper 301ing makes sure you don't lose anything in the transition.
I can't say it would provide you with a massive benefit to set up a way for the search engines to start crawling the new site now, as you're just going to be moving all of those URLs off of the subdomain in the near future anyway, where they will then need to be recrawled on the parent domain as if they were brand new.
If the traffic diverter you have set up automatically 301s requests for old site pages to their new beta URL versions, then you might as well let those new versions be indexed for the time being. Just make sure that when you transfer the beta site to the parent domain, you 301 the old beta URLs to their new permanent home.
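For what it's worth, the redirect layer itself can be very small. A minimal sketch (Python/Flask purely as illustration; the mapping is hypothetical, built from the example URLs in this thread):

```python
from flask import Flask, redirect, request

app = Flask(__name__)

# Map old-architecture URLs (and, once the beta is retired, the beta URLs
# too) straight to their final homes, so no visitor or crawler ever has
# to follow a two-hop redirect chain.
REDIRECTS = {
    "/Mens-Clothing.html": "/mens-clothing~d~15/",
}

@app.before_request
def redirect_moved_pages():
    target = REDIRECTS.get(request.path)
    if target is not None:
        return redirect(target, code=301)  # permanent: passes link equity
```

The design point is the direct mapping: when the beta subdomain comes down, add its URLs to the same table pointing at the final URLs rather than chaining beta to old to new.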
-
So with the service - the new site is not crawled until we launch it?
-
The new site is beta.sierratradingpost.com, where we will be dropping the beta. The old one has catalog departments, i.e. Men's Classics, which, at this time, are not being carried over to the new site. I guess we are wondering when we should allow the robots to crawl the new site?
-
Hey Stephan,
I'm assuming you want to measure how the traffic is converting on the new site, hence the strategy to send small portions of traffic to new pages?
If so, the easiest way might be to just straight up A/B split test the new pages with a service like Adobe/Omniture Test&Target. This doesn't cause any cloaking/dupe issues. When you are happy with the results you can release the site with all the 301s in place.
-
Let me make sure I have this straight... you're not going to be directing the new site format to a subdomain permanently, right? You were only using the subdomain for beta purposes?
The way I see it, when I go to Sierra Trading Post's site now, I can make out what looks like 2 different types of architecture. You have one link on the page pointing to Men's Clothing, which resolves to a single defined .htm file. Then you can see that you have "Men's Classics" (still general men's clothing?), which points to a directory that I'm guessing is your new site. Correct me if I'm wrong on this, or if I'm right but have the old vs. new reversed.
If that is the case, your best bet to try and minimize any ranking impact would be to 301 redirect pages from the old catalog architecture to the new. That way you could remove the old site files completely and let the server take care of the redirection.
If you need to leave the old site up for throttling purposes like you said, you could use canonicalization tags to refer the old pages to the new ones. That, along with employing 301 redirects, would help train the search engines into understanding what you're doing.
I'm sorry if I didn't answer your question as you needed. I'm still not sure if I understood your issue as intended. =P
Related Questions
-
Does redirecting from a "bad" domain "infect" the new domain?
Hi all, So a complicated question that requires a little background. I bought unseenjapan.com to serve as a legitimate news site about a year ago. Social media and content growth has been good. Unfortunately, one thing I didn't realize when I bought this domain was that it used to be a porn site. I've managed to muck out some of the damage already - primarily, I got major vendors like McAfee and OpenDNS to remove the "porn" categorization, which has unblocked the site at most schools & locations w/ public wifi. The sticky bit, however, is Google. Google has the domain filtered under SafeSearch, which means we're losing - and will continue to lose - a ton of organic traffic. I'm trying to figure out how to deal with this, and appeal the decision. Unfortunately, Google's Reconsideration Request form currently doesn't work unless your site has an existing manual action against it (mine does not). I've also heard such requests, even if I did figure out how to make them, often just get ignored for months on end. Now, I have a back-up plan. I've registered unseen-japan.com, and I could just move my domain over to the new domain if I can't get this issue resolved. It would allow me to be on a domain with a clean history while not having to change my brand. But if I do that, and I set up 301 redirects from the former domain, will it simply cause the new domain to be perceived as an "adult" domain by Google? I.e., will the former URL's bad reputation carry over to the new one? I haven't made a decision one way or the other yet, so any insights are appreciated.
Intermediate & Advanced SEO | Apr 3, 2021, 3:22 AM | gaiaslastlaugh
-
Combining Two Sites With Similar Domain Authority
Hello, We run two sites with the same product, product descriptions and URL structure. Essentially, the two sites are the same except for domain name and minor differences on the home pages. We've run this way for quite a few years. Both sites have a domain authority of 48 and there are not a large number of duplicate incoming links. I understand the "book" to say we should combine the sites with 301s to the similar pages. I am concerned about doing this because "site 2" still does about 20% of our business. We have been losing organic traffic for a number of years. I think this mainly has to do with a more competitive environment. However, where Google used to serve both our sites for a search term, it now will only show one. How much organic benefit should we see if we combine? Will it be significant enough to justify merging the two sites? Understandably, I realize the future can't be predicted, but I would like to know if anyone has had a similar experience or opinion. Thanks
Intermediate & Advanced SEO | Jan 31, 2019, 4:18 PM | ffctas
-
Switching URLs after acquisition to retain domain authority?
Hey everyone! My company just acquired our biggest competitor and we're switching to their platform because they have a better technical structure for SEO--what's the best way to do that, other than a 301 redirect? Can we even rename their domain to ours? How do we ensure we keep both our and their domain authority and SEO juice? Thanks!
Intermediate & Advanced SEO | Oct 16, 2017, 11:12 AM | genevieveagar
-
What is the best practice for URLs for E-commerce products in multiple categories?
Hello all! I have always worked successfully with SEO on E-commerce sites, however we are currently revamping an older site for a client and so I thought I'd turn to the community to ask what the best practices that you guys are experiencing for url structures at the moment. Obviously we do not wish to create duplicate content and so the big question is, what would you guys do for the very best structure for URLs on an E-commerce site that has products in multiple categories? Let's imagine we are selling toy cars. I have a sports car for sale, so naturally it can go in the sports cars category and it could also go in to the convertibles category too. What is the best way you have found recently that works and increases rankings, but does not create duplicate content? Thanks in advance! 🙂 Kind Regards, JDM
Intermediate & Advanced SEO | Aug 12, 2014, 3:21 PM | Hatfish
-
Where is the best place to put a sitemap for a site with local content?
I have a simple site that has cities as subdirectories (so URL is root/cityname). All of my content is localized for the city. My "root" page simply links to other cities. I very specifically want to rank for "topic" pages for each city and I'm trying to figure out where to put the sitemap so Google crawls everything most efficiently. I'm debating the following options, which one is better? Put the sitemap on the footer of "root" and link to all popular pages across cities. The advantage here is obviously that the links are one less click away from root. Put the sitemap on the footer of "city root" (e.g. root/cityname) and include all topics for that city. This is how Yelp does it. The advantage here is that the content is "localized" but the disadvantage is it's further away from the root. Put the sitemap on the footer of "city root" and include all topics across all cities. That way wherever Google comes into the site they'll be close to all topics I want to rank for. Thoughts? Thanks!
Intermediate & Advanced SEO | Feb 26, 2014, 10:25 PM | jcgoodrich
-
Merging Domains... Subdomains, Directories or Separate Sites?
Hello! I am hoping you can help me decide the best path to take here... A little background: I'm moving to a new company that has three old domains (the oldest is 10 years old), which get a lot of traffic from their e-letters. Until recently they have not cared about SEO. So the websites have some structural, coding, URL and other issues. The sites are indexed, but have a problem getting crawled and/or indexed for new content - haven't delved into this yet but am certain I will be able to fix any of these issues. These three domains are PR4, PR4, PR5 and contain hundreds of unique articles. Here's the question... They want to move these three sites to their main company site (PR4) and create subdomains for each one. I am wondering if this is a good idea or not. I have merged sites before (creating categories and/or directories) and the end result is that the ONE big site is much more effective than TWO smaller, less authoritative sites. But the subdomain idea is something I am unsure about from an SEO perspective. Should we do this with subdomains? Or do you think we should keep the sites separate? How do Panda and Penguin play into this? Thanks in advance for the help! SD P.S. I'm not a huge advocate of using PR as a measurement tool, but since I can't reveal the actual domains, I figured I would list it as a reference point.
Intermediate & Advanced SEO | Aug 27, 2012, 11:14 AM | essdee
-
Best Practice for Inter-Linking to CCTLD brand domains
Team, I am wondering what people recommend as best SEO practice to inter-link to language-specific brand domains, e.g.: amazon.com, amazon.de, amazon.fr, amazon.it. Currently I have 18 CCTLDs for one brand in different languages (no DC). I am linking from each content page to each other language domain, providing a link to the equivalent content in a separate language on a different CCTLD domain. However, with Google's discouragement of site-wide links I am reviewing this practice. I am tending towards making the language redirects on each page javascript-driven and to start linking only from my home page to the other pages with optimized link titles. Anyone having any thoughts/opinions on this topic they are open to sharing? /Thomas
Intermediate & Advanced SEO | Aug 15, 2012, 6:06 AM | tomypro
-
How long does a new domain need to get a specific level of trust?
We are a small start-up in Germany in the sports and health sector. We currently are building a network of people in that sector and give each person a separate WordPress blog. The idea is to create a big network of experts. My question is: how long does it take for Google to trust a completely new URL? We set up each project and create content on the page. Each week the owner of the site puts up an expert article that contains keywords. And we set certain links from other blogs, etc. Also, do you think it is more important for a site to get, say, 20 backlinks from anywhere, or 5 backlinks from very trusted blogs, etc.?
Intermediate & Advanced SEO | Oct 27, 2011, 9:14 AM | wellbo