Massive URL Migration with Thousands of 301s
-
Hey Everyone!
I'm currently working on a project where we have a lot of product pages and thousands of URLs that need to be 301'd over. I know this can be a major issue and could lead to tons of errors. What is everyone's thought on doing such a huge migration? Should I do it all in phases, or should I do them all at once so they can all be indexed together?
What would you suggest to be the best way to go about doing such a massive migration?
-
I have never done a migration of a website with so many URLs; it is kind of overwhelming. I do have a 404 page with a catch-all rule in place. See, I'm coming from the business side of SEO; I'm not really the developer getting in there and actually doing the migration (I tip my hat to all you developers — without you I am nothing but a voice). I'm guiding my developer to the safest route. Doing the migration all at once does make sense to prevent both sites being cached at the same time.
-
Webmaster Tools has been pretty good to me about reporting 404s I may have missed after a migration, but using the tactic I described really limits missed URLs, assuming you have a good grasp of the subdirectories of your site.
Phasing is an option, but in the case of, say, a redesign or domain migration, I like to do it all at once. That way Google isn't trying to index the website on two different URL structures or domains. Doing it in one shot makes it clear to Google what has happened, since everything is moved at once instead of just a fraction at a time. I'm sure this is a point of debate, and not necessarily the definitive way to do it.
-
Hey, thanks for the response!
I'm currently dealing with a site that has a large duplication problem, and I'm canonical-tagging the duplicates before 301ing them all over. Why shouldn't I take a phased approach, though? Wouldn't it be easier to see where errors happen if I come at this migration in phases?
-
Assuming not all of your pages hold incredible value or get visited a lot, I don't think you'd need to do it in phases. You can save yourself some time by throwing in some regex to grab large chunks of URLs at a time and redirect them that way.
Your more valuable pages should be redirected one-to-one, so as not to confuse users and to send the most juice to the right place; for less important pages, grab a bunch at a time.
One way to do this is:
RedirectMatch 301 ^/sub/directory/(.*)$ http://site.com/newdirectory/
This would send every page under site.com/sub/directory/ANYTHING to site.com/newdirectory/.
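To make that concrete, here's a rough .htaccess sketch pairing one-to-one redirects for the high-value pages with a regex catch-all for the rest. The page and directory names are hypothetical, purely for illustration:

```apache
# Hypothetical paths for illustration only.
# One-to-one redirects for the valuable pages go first, since
# mod_alias applies directives in the order they appear.
Redirect 301 /sub/directory/best-seller.html http://site.com/newdirectory/best-seller
Redirect 301 /sub/directory/top-category.html http://site.com/newdirectory/top-category

# Catch-all: everything else under /sub/directory/ lands on the
# new section's main page.
RedirectMatch 301 ^/sub/directory/(.*)$ http://site.com/newdirectory/
```

Order matters here: keep the specific one-to-one rules above the catch-all so they take precedence.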
Hope that helps.
Related Questions
-
Massive Amount of Pages Deindexed
On or about 12/1/17 a massive amount of my site's pages were deindexed. I have done the following:
- Ensured all pages are "index,follow"
- Ensured there are no manual penalties
- Ensured the sitemap correlates to all the pages
- Resubmitted to Google
All pages are gone from Bing as well. In the new Search Console interface, there are 661 pages that are Excluded, with 252 being "Crawled - currently not indexed: The page was crawled by Google, but not indexed. It may or may not be indexed in the future; no need to resubmit this URL for crawling." What in the world does this mean, and how the heck do I fix it? This is CRITICAL. Please help! The URL is https://www.hkqpc.com
Intermediate & Advanced SEO | D.J.Hanchett
-
URL Rewriting Best Practices
Hey Moz! I'm getting ready to implement URL rewrites on my website to improve site structure/URL readability. More specifically I want to:
- Improve our website structure by removing redundant directories.
- Replace underscores with dashes and remove file extensions from our URLs.
Please see my example below:
Old structure: http://www.widgets.com/widgets/commercial-widgets/small_blue_widget.htm
New structure: https://www.widgets.com/commercial-widgets/small-blue-widget
I've read several URL rewriting guides online, all of which seem to provide similar but overall different methods to do this. I'm looking for what's considered best practice for implementing these rewrites. From what I understand, the most common method is to implement rewrites in our .htaccess file using mod_rewrite (which will find the old URLs and rewrite them according to the rules I implement).
One question I can't seem to find a definitive answer to: when I implement the rewrite to remove file extensions and replace underscores with dashes in our URLs, do the webpage file names need to be edited to the new format? From what I understand the webpage file names must remain the same for the rewrites in the .htaccess to work, but our internal links (including canonical links) must be changed to the new URL format. Can anyone shed light on this?
Also, I'm aware that implementing URL rewriting improperly could negatively affect our SERP rankings. If I redirect our old website directory structure to our new structure using these rewrites, are my bases covered in regards to having the proper 301 redirects in place to not affect our rankings negatively? Please offer any advice/reliable guides to handle this properly. Thanks in advance!
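For what it's worth, a minimal mod_rewrite sketch of the extension-removal part (assuming Apache with mod_rewrite enabled and rules living in the root .htaccess) could look like this. It leaves the .htm files on disk untouched; the underscore-to-dash piece is deliberately left out, since that generally requires either renaming the files or a mirrored internal rewrite mapping dashes back to underscores:

```apache
RewriteEngine On

# Externally 301 old .htm URLs to the extensionless form, e.g.
# /commercial-widgets/small_blue_widget.htm -> /commercial-widgets/small_blue_widget
RewriteCond %{THE_REQUEST} \s/([^\s?]+)\.htm[\s?]
RewriteRule ^ /%1 [R=301,L]

# Internally serve the .htm file for the extensionless URL,
# so the file names on disk don't have to change.
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.htm -f
RewriteRule ^(.*)$ $1.htm [L]
```

This is a sketch under those assumptions, not a drop-in solution; test it on a staging copy first.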
Intermediate & Advanced SEO | TheDude
-
Dealing with 404s during site migration
Hi everyone - What is the best way to deal with 404s on an old site when you're migrating to a new website? Thanks, Luke
Intermediate & Advanced SEO | McTaggart
-
Multiple 301 Redirect Query
Hello all, I have two 301 redirects on some of my landing pages and am wondering if this will cause me serious issues. I first did 301 redirects across the whole website, as we redid our URL structure a couple of months ago. We also had location-specific landing pages on our categories, but due to thin/duplicate content we got rid of these by doing 301s back to the main category pages. We do have physical branches at these locations, but given that we didn't get much traffic for those specific categories at those locations, and the fact that we cannot write thousands of pages of unique content, we did 301s. Is this going to cause me issues? I would have thought that 301s drop out of the SERPs, so if this is an issue it would only be a temporary one? Or should I have 404'd the location category pages instead? Any advice greatly appreciated. Thanks, Peter
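One common cleanup for this situation is collapsing the chain, so the oldest URLs point straight at the final destination instead of passing through two hops. A hypothetical .htaccess sketch (the paths are invented for illustration):

```apache
# Before (a two-hop chain):
#   Redirect 301 /old-widgets/london /widgets/london
#   Redirect 301 /widgets/london /widgets
# After (each old URL collapsed to a single hop):
Redirect 301 /old-widgets/london /widgets
Redirect 301 /widgets/london /widgets
```

Both rules now resolve in one hop, which keeps things simpler for crawlers and users alike.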
Intermediate & Advanced SEO | PeteC12
-
Should we use URL parameters or plain URLs?
Hi, the development team and I are having a heated discussion about one of the more important things in life, i.e. URL structures on our site. Let's say we are creating an Airbnb clone, and we want to be found when people search for "apartments new york". As we have both houses and apartments in all cities in the U.S., it would make sense for our URL to at least include these, so clone.com/Apartments/New-York. But the user is also able to filter on price and size. This isn't really relevant for Google, and we all agree that clone.com/Apartments/New-York should be canonical for all apartment/New York searches. But how should the URL look for people with a max price of $300 and 100 sqft?
clone.com/Apartments/New-York?price=300&size=100
or (we are using Node.js, so no problem)
clone.com/Apartments/New-York/Price/300/Size/100
The developers hate URL parameters with a vengeance, think the last version is the preferable and most user-readable one, and say that as long as we use canonical on everything to clone.com/Apartments/New-York it won't matter to good old Google. I think the URL parameters are the way to go, for two reasons. One is that Google might by itself figure out that the price parameter doesn't matter (https://support.google.com/webmasters/answer/1235687?hl=en), and it is also possible in Webmaster Tools to actually tell Google that it shouldn't worry about a parameter. We have agreed to disagree on this point and to let the wisdom of Moz decide what we ought to do. What do you all think?
Intermediate & Advanced SEO | Peekabo
-
How to set up 301 redirect for URL with question mark
I have encountered an issue with a 301 redirect in my .htaccess file. I need to redirect the following URL:
http://www.domain.com/?specifications=colours/page/3
to:
http://www.domain.com/colours
The 301 redirect command I wrote in the .htaccess file is as follows:
Redirect 301 /?specifications=colours/page/3 http://www.domain.com/colours
and it doesn't work at the moment. What is the correct way to set up the 301 redirect here? Your help will be sincerely appreciated!
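The usual reason this fails: mod_alias's Redirect directive matches only the URL path and never sees the query string, so a query-string URL has to be handled with mod_rewrite and a RewriteCond on %{QUERY_STRING}. A sketch, assuming the rules sit in the .htaccess at the site root:

```apache
RewriteEngine On
# Match the query string (plain Redirect can't see it); the trailing
# "?" on the target URL strips the old query string from the redirect.
RewriteCond %{QUERY_STRING} ^specifications=colours/page/3$
RewriteRule ^$ http://www.domain.com/colours? [R=301,L]
```

The `^$` pattern matches the bare "/" path of the original URL; adjust it if the rules live somewhere other than the root .htaccess.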
Intermediate & Advanced SEO | robotseo
-
URL blocked
Hi there, I have recently noticed that we have a link from an authoritative website; however, when I looked at the code, it looked like this:
<a href="http://www.mydomain.com/" title="blocked::http://www.mydomain.com/">keyword</a>
You will notice that in the code there is 'blocked::'. What is this? Does it have the same effect as a nofollow tag? Thanks for any help
Intermediate & Advanced SEO | Paul78
-
Best way to migrate to a new URL structure
Hello everyone, we're changing our URL structure from something like this:
example.com/index.php?language=English
to something like this:
example.com/english/index.php
The change is implemented with mod_rewrite, so all the old URLs still work. We have hundreds of thousands of pages that are currently indexed with the old URL structure. What's the best way to get Google to rapidly update its index and to maintain as much ranking as possible?
1. 301 redirect all the old URLs to the new equivalent format?
2. If we detect that the URL is in an old format, render the page with a canonical tag pointing to the new equivalent format, as well as adding a noindex, nofollow tag?
3. Something else?
Thanks for your input!
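For the 301 option, a hedged mod_rewrite sketch (assuming Apache with the rules in the root .htaccess; the parameter name comes from the question above, everything else is illustrative):

```apache
RewriteEngine On
# 301 the old query-string URL to the new path-based structure;
# the trailing "?" strips the old query string from the target.
# Shown for one language only — each language would need its own
# pair of rules, since lower-casing arbitrary values requires a
# RewriteMap defined in the server config, not .htaccess.
RewriteCond %{QUERY_STRING} ^language=English$
RewriteRule ^index\.php$ /english/index.php? [R=301,L]
```

This is only a sketch of option 1, not a definitive implementation; verify the redirect behavior on a staging environment before rolling it out across hundreds of thousands of URLs.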
Intermediate & Advanced SEO | anthematic