Huge e-commerce site migration - what to do with product pages?
-
My very large e-commerce client is about to undergo a site migration in which every product page URL will change. I am already planning my 301 redirect process for the site's top ~1,000 pages (categories, products, and more), but this will not account for the more than 1,000 remaining products on the site. The client has specified that they don't want to implement many more than 1,000 redirects, to avoid impacting site performance. What is the best way to handle these pages without causing hundreds of 404 errors on migration day?
Thanks!
-
What old format are they changing from, and what new format are they changing to? Sometimes, if there is logic to the URL structure, you can handle thousands of redirects with a single rule built on a regular expression.
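For example, if the change is systematic, one pattern rule in Apache (mod_alias) can cover every product page at once. This is only a sketch; the old and new path formats below are made-up placeholders, not the asker's actual URLs:

```apache
# Hypothetical: old URLs like /shop/product/<slug>.html move to /products/<slug>.
# One regex-based rule issues a 301 for all of them.
RedirectMatch 301 ^/shop/product/(.+)\.html$ /products/$1
```

Because the server evaluates one pattern instead of thousands of literal entries, this also sidesteps the performance concern about a huge redirect list.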
-
Hmm, I'm not too familiar with Oracle ATG. Is that Java-based? I mainly work with PHP. To answer your question: yes, it returns a clean 301 redirect response for crawlers.
I came across this link in Oracle's documentation for ATG. It looks like you could use the URL templates to accomplish this. Do the URLs you plan on redirecting share a common pattern (e.g., switching 100 URLs from http://example.com/example1/* to http://example.com/*)? If so, you could separate them into these common "groups" and develop one redirect for each group instead of one for each URL.
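To illustrate the grouping idea with the example.com pattern above (the group prefixes here are invented for illustration), a handful of pattern rules can replace hundreds of one-off redirects:

```apache
# Hypothetical groups: one rule per common prefix rather than one per URL.
RedirectMatch 301 ^/example1/(.*)$ /$1
RedirectMatch 301 ^/example2/(.*)$ /$1
RedirectMatch 301 ^/old-sale/(.*)$ /clearance/$1
```

The same grouping logic applies whether the rules live in Apache config or in ATG's own URL-template layer; the point is to find the shared structure first, then write one redirect per group.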
-
The new site will be built on the Oracle ATG platform. Does your method still return a 301 response code?
-
What language is the new site coded in? Rather than creating such a large .htaccess file for redirects, you can build a redirector into the new site's code: look up the requested legacy URL with a database query and send the visitor on to the amended URL. I may be able to help if you let me know which language you're using and provide examples of the types of redirects you'd like to set up.
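Since the new platform is Java-based, here is a minimal sketch of that database-driven approach. Everything here is hypothetical: the class, the paths, and the map contents are illustrative stand-ins (not ATG APIs), and in a real build the map would be populated by a database query pairing each product's old URL with its new one.

```java
import java.util.HashMap;
import java.util.Map;

/**
 * Hypothetical sketch of a database-driven redirector: instead of thousands
 * of .htaccess lines, the application loads an old-URL -> new-URL map once
 * and answers lookups at request time.
 */
public class LegacyRedirects {
    private final Map<String, String> redirects = new HashMap<>();

    public LegacyRedirects() {
        // In practice this would iterate over a database result set that
        // joins each product's old URL slug to its new one.
        redirects.put("/old-category/widget-123.html", "/products/widget-123");
        redirects.put("/old-category/gadget-456.html", "/products/gadget-456");
    }

    /** Returns the new location for a legacy path, or null if none is known. */
    public String lookup(String path) {
        return redirects.get(path);
    }
}
```

At request time, a servlet filter or 404 handler would call lookup() and, on a hit, set a 301 status plus a Location header, so crawlers still see the clean permanent redirect mentioned above.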
Related Questions
-
Impact of Removing 60,000 Pages from Sites
We currently have a database of content across about 100 sites. All of this content is exactly the same on all of them, and it is also found all over the internet in other places. So it's not unique at all and it brings in almost no organic traffic. I want to remove this bloat from our sites. Problem is that this database accounts for almost 60,000 pages on each site and it is all currently indexed. I'm a little bit worried that flat out dumping all of this data at once is going to cause Google to wonder what in the world we are doing and we are going to see some issues from it (at least in the short run). My thought now is to remove this content in stages so it doesn't all get dropped at once. But would deindexing all of this content first be better? That way Google would still be able to crawl it and understand that it is not relevant user content and therefore minimize impact when we do terminate it completely? Any other ideas for minimizing SEO issues?
Intermediate & Advanced SEO | | MJTrevens1 -
Thinking about not indexing PDFs on a product page
Our product pages generate a PDF version of the page in a different layout. This is done for two reasons: it's the standard across similar industries, and it helps customers print the page when working with the product. So there is a use for customers, but what about search? I've thought about this a lot, and my thinking is: why index the PDF at all? Only allow the HTML page to be indexed. The PDF files are on a subdomain, so I can easily noindex them. The way I see it, I'm reducing duplicate content. On the flip side, because they're hosted on a subdomain, a PDF appearing when the HTML page doesn't is another way of gaining real estate; if it appears alongside the HTML page, that's more coverage. Has anyone else done this? My instinct tells me this could be a good thing; it might even stop backlinks from being generated to the PDF and lead to more HTML backlinks. Can PDFs exist solely as a form of data, accessible on the page but invisible to search engines? I find them a bane when they are on a subdomain.
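For reference, the "easily noindex them" part is usually done with an X-Robots-Tag response header on the PDF subdomain, so no individual file needs editing. A sketch, assuming the subdomain is served by Apache with mod_headers enabled:

```apache
# In the PDF subdomain's config: tell crawlers not to index any PDF.
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```

The header works where a robots meta tag can't, since PDFs have no HTML head to put one in.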
Intermediate & Advanced SEO | | Bio-RadAbs0 -
External resources page (AKA a satellite site) - is it a good idea?
So the general view on satellite sites is that they're not worth it because of their low authority and the amount of link juice they provide. However, I have an idea that is slightly different to the standard satellite site model. A client's website is in a particular niche, but a lot of websites that I have identified for potential links are not interested because they are a private commercial company. Many are only interested in linking to charities or simple resource pages. I created a resource section on the website, but many are still unwilling to link to it as it is still part of a commercial website. The website is performing well and is banging on the door of page one for some really competitive keywords. A few more links would make a massive difference. One idea I have is to create a standalone resource website that links to our client's website. This would be easy to get links from sites that would flat out refuse to link to the main website. This would increase the authority of the resource and result in more link juice to the primary website. Now I know that the link juice from this website will not be as good as getting links directly to the primary website, but would it still be a good idea? Or would my time be better spent trying to get a handful of links directly to the client's website? Alternatively, I could set up a sub-domain to set up the resource, but I'm not sure that this would be as successful.
Intermediate & Advanced SEO | | maxweb0 -
"Noindex" page still shows in search results, and paginated pages show page 2 in results
I have "no index, follow" on some pages, which I set 2 weeks ago. Today I see one of these pages showing in Google Search Results. I am using rel=next prev on pages, yet Page 2 of a string of pages showed up in results before Page 1. What could be the issue?
Intermediate & Advanced SEO | | khi50 -
Want to merge high ranking niche websites into a new mega site, but don't want to lose authority from old top level pages
I have a few older websites that SERP well, and I am considering merging some or all of them into a new related website that I will be launching regardless. My old websites display real estate listings and not much else. Each website is devoted to showing homes for sale in a specific neighborhood. The domains are all in the form of Neighborhood1CityHomes.com, Neighborhood2CityHomes.com, etc. These sites SERP well for searches like "Neighborhood1 City homes for sale" and also "Neighborhood1 City real estate" where some or all of the query is in the domain name. Google simply points to the top of the domain although each site has a few interior pages that are rarely used. There is next to zero backlinking to the old domains, but each links to the other with anchor text like "Neighborhood1 Cityname real estate". That's pretty much the extent of the link profile. The new website will be a more comprehensive search portal where many neighborhoods and cities can be searched. The domain name is a nonsense word .com not related to actual key words. The structure will be like newdomain.com/cityname/neighborhood-name/ where the neighborhood real estate listings are that would replace the old websites, and I'd 301 the old sites to the appropriate internal directories of the new site. The content on the old websites is all on the home page of each, at least the content for searches that matter to me and rank well, and I read an article suggesting that Google assigns additional authority for top level pages (can I link to that here?). I'd be 301-ing each old domain from a top level to a 3rd level interior page like www. newdomain/cityname/neighborhood1/. The new site is better than the old sites by a wide margin, especially on mobile, but I don't want to lose all my top positions for some tough phrases. I'm not running analytics on the old sites in question, but each of the old sites has extensive past history with AdWords (which I don't run any more). 
So in theory Google knows these old sites are good quality.
Intermediate & Advanced SEO | | Gogogomez0 -
Huge Google index on E-commerce site
Hi guys, I have a question I can't figure out. I'm working on an e-commerce site which recently got a CMS update, including URL updates.
We did a lot of 301s on the old URLs (around 3,000-4,000, I guess) and submitted a new sitemap (around 12,000 URLs, of which 10,500 are indexed). The strange thing is, when I check the indexing status in Webmaster Tools, Google tells me there are over 98,000 URLs indexed.
Doing a site:domainx.com search, Google tells me there are 111,000 URLs indexed. Another strange thing, which another forum member describes here: cache date has been reverted. On top of that, old URLs (which have had a 301 for about a month now) keep showing up in the index. Does anyone know what I could do to solve the problem?
Intermediate & Advanced SEO | | ssiebn70 -
More Indexed Pages than URLs on site.
According to webmaster tools, the number of pages indexed by Google on my site doubled yesterday (gone from 150K to 450K). Usually I would be jumping for joy but now I have more indexed pages than actual pages on my site. I have checked for duplicate URLs pointing to the same product page but can't see any, pagination in category pages doesn't seem to be indexed nor does parameterisation in URLs from advanced filtration. Using the site: operator we get a different result on google.com (450K) to google.co.uk (150K). Anyone got any ideas?
Intermediate & Advanced SEO | | DavidLenehan0 -
What is the best tool to crawl a site with millions of pages?
I want to crawl a site that has so many pages that Xenu and Screaming Frog keep crashing at some point after 200,000 pages. What tools will allow me to crawl a site with millions of pages without crashing?
Intermediate & Advanced SEO | | iCrossing_UK0