What is the best way to fix legacy overly-nested URLs?
-
Hi everyone,
Due to some really poor decisions I made back when I started my site several years ago, I'm lumbered with several hundred pages that have overly-nested URLs. For example:
/theme-parks/uk-theme-parks/alton-towers/attractions/enterprise
I'd prefer these to feature at most three layers of nesting, for example:
/reviews/alton-towers/enterprise
Is there a good approach for achieving this, or is it best just to accept the legacy URLs as an unfixable problem and make sure that future content follows the new structure? I can easily knock together a script to update the aliases for the existing content, but I'm concerned about having hundreds of 301 redirects (could this be achieved with a single regular expression in .htaccess, for example?).
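For reference, a quick sketch of the kind of alias-update script I have in mind (Python; it assumes every legacy URL follows the example pattern above, which may not hold for every page):

```python
import re

# Collapse /theme-parks/uk-theme-parks/<park>/attractions/<ride>
# down to /reviews/<park>/<ride>. Pages that don't match the
# legacy pattern are left alone (the function returns None).
LEGACY = re.compile(r"^/theme-parks/uk-theme-parks/([^/]+)/attractions/([^/]+)/?$")

def new_alias(old_path):
    """Return the three-level alias for a legacy path, or None if it doesn't match."""
    m = LEGACY.match(old_path)
    if m is None:
        return None
    return "/reviews/{}/{}".format(m.group(1), m.group(2))

print(new_alias("/theme-parks/uk-theme-parks/alton-towers/attractions/enterprise"))
# → /reviews/alton-towers/enterprise
```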
Any guidance appreciated.
Thanks, Nick
-
Thanks Alan and Irving, your responses are both very helpful. In reality, these pages have relatively few external links pointing to them compared to other sections of the site, so I think I will opt to redirect them. The newer sections of the site have a nice clean URL structure and good on-page optimization, so I think it's best to bite the bullet and move the older pages over to a new system.
-
Except that hundreds of 301s mean hundreds of link juice leaks.
-
There's no problem with having hundreds of 301s.
Having /theme-parks/ twice in the URL is slightly spammy, but it's probably better than your second example, where you don't have "theme parks" even once in the URL. I would make it /theme-park-reviews/ unless your domain name already has "theme parks" in it.
If you're getting great rankings with the current pages, you may want to just leave the legacy pages alone and use the new structure for posts going forward. But if they're not bringing you a ton of traffic that your business depends on, then I would 301 them to the new structure; it should be fine, and you could always revert if you see negative effects.
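If you do go the 301 route, a single rule in .htaccess can usually cover the whole pattern; a rough sketch, assuming all the legacy URLs share the /theme-parks/uk-theme-parks/&lt;park&gt;/attractions/&lt;ride&gt; shape (adjust the pattern if some pages don't fit it):

```apache
# Map /theme-parks/uk-theme-parks/<park>/attractions/<ride>
# to /reviews/<park>/<ride> with one permanent redirect rule.
RewriteEngine On
RewriteRule ^theme-parks/uk-theme-parks/([^/]+)/attractions/([^/]+)/?$ /reviews/$1/$2 [R=301,L]
```

That way you don't maintain hundreds of individual redirect lines, just one rule per URL pattern.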
-
I would accept it. Google does not mind; Bing is said to count the folders as a signal, but with modern routing engines like MVC and friendly URLs it is common to have many sections in a URL, and I assume Bing takes that into account.
Related Questions
-
Best way to deal with 100 product pages
It feels good to be BACK. I miss Moz. I left for a long time but am happy to be back! 🙂 My client is a local HVAC company. They sell Lennox systems. Lennox provides a tool that we hooked up to that allows visitors to their site to 'see' 120+ different kinds of air quality, furnace and AC units. The problem is (I think it's a problem) that Google and other crawl tools are seeing these 100+ pages, which are not unique, helpful or related to my client. There is a little bit of cookie-cutter text, images and specs, and that's it. Are these pages potentially hurting my client? I can't imagine they are helping. What's the best way to deal with these? Thank you! Matthew
Technical SEO | | Localseo41440 -
How best to fix 301 redirect problems
Hi all. Wondering if anyone could help out with this one. Roger Bot has just performed its weekly error crawl on my site and I appear to have 18,613 temp redirect problems!! Rather, the same one problem 18,613 times. My site is a Magento store, and the errors it is giving me are due to the wishlist feature on the site. For example, it is trying to crawl links such as index.php/wishlist/index/add/product/29416/form_key/DBDSNAJOfP2YGgfW (which would normally add the item to one's wishlist). However, because Roger isn't logged into the website, all these requests are being sent to the login URL with the page title "Please Enable Cookies". Would the best way to fix this be to enable wishlists for guests? I would rather not do that but cannot think of another way of fixing it. Any other Magento people come across this issue? Thanks, Carl
Technical SEO | | daedriccarl0 -
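For the Magento wishlist question above, an alternative to enabling guest wishlists is to block the wishlist URLs in robots.txt so well-behaved crawlers like Roger never request them; a minimal sketch, assuming the default Magento paths shown in the question:

```
User-agent: *
# Wishlist "add" links require a logged-in session, so keep crawlers out.
Disallow: /wishlist/
Disallow: /index.php/wishlist/
```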
Should I make a new URL just so it can include a target keyword, then 301 redirect the old URL?
This is for an ecommerce site, and the company I'm working with has started selling a new line of products they want to promote. Should I make a new URL just so it can include a target keyword, then 301 redirect the old URL? One of my concerns is losing a little bit of link value from redirecting. Thank you for reading!
Technical SEO | | DA20130 -
Removing links - Best practice
Hi, I have noticed on Webmaster Tools that I have a lot of links to my sites from link building directories. Either I did this many years ago or somehow they've linked to me. Would links from link building directories (e.g. linkspurt.com, pingerati.net) harm my site? I have quite a few and am just wondering what to do with them. Also, I have some customer sites which are massive; one site has 38,000 links coming to my site, as I have put a credit that I built the site with a link back to mine. It has a low score in Google; would this also harm my site? Any advice would be appreciated.
Technical SEO | | Cocoonfxmedia0 -
Would these be considered dynamic URLs?
Hi, I have a (brand) new client (outdoor recreation), and its site links to many different lodges. It's built in WordPress (Pagelines), and the partner page URLs look like this: http://www.clientsite/?partners=partner-name. Although they do have the "?" in there, each has only a single parameter. Google is indexing the URLs, and I do plan to increase the amount of on-page content for each. Yet I'm weighing the risk/reward of rewriting all of these URLs.
Technical SEO | | csmithal0 -
Formatting dynamic urls?
We have a long-time, previously well-established website that was hit by Panda. On one section of the site, we have dynamic URLs that include %20 in them (e.g. North%20America). It's recently come to our attention that Google has indexed both a version of the URL with a plus sign (e.g. North+America) and the version with the %20 (space). Upon researching this, it seems that a hyphen (-) is preferable to either of the above. We obviously need to remove the %20s from the URLs, as they can cause issues. So, should we stick with the plus sign since it's already indexed and ranking, or do a 301 rewrite and change them all to hyphens instead? This is the one section of the site that has maintained rankings through the Panda debacle, so we need to take that into consideration, as we don't want to lose the rankings that we have. Along the same lines, we have two other sections of the site that provide search results as well, though these are all formatted to use a plus sign. Is it advisable to do a 301 rewrite to change the plus signs to hyphens on these as well, or just leave them alone? This particular section has lost rankings over the last year with Panda updates.
Technical SEO | | Odjobob0 -
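On the %20 / + / hyphen question above, a quick illustration of how the three forms relate (Python standard library; the place name is just the example from the question):

```python
from urllib.parse import quote, unquote, unquote_plus

raw = "North America"

# Percent-encoding turns the space into %20.
print(quote(raw))                     # North%20America

# Both encoded forms decode back to the same string;
# '+' only means "space" in query-string contexts.
print(unquote("North%20America"))     # North America
print(unquote_plus("North+America"))  # North America

# The hyphenated slug avoids encoding entirely, which is
# why it is usually preferred in URL paths.
print(raw.lower().replace(" ", "-"))  # north-america
```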
Best way to setup large site for multi language
Hello, I am setting up a new site which is going to be very large: over 250,000 products. Most of our customers are in the UK (45%); the rest are from various European countries and the USA. Unfortunately we only have a team of two people writing content for these pages, in English. I would value some input on the best way to set up my website structure for ranking. The best option would obviously be individual country-oriented domains, i.e. domain.fr, domain.de, domain.co.uk. However, we wouldn't have the time to create content for every page, and most pages would contain the same content as the English domain. Would I get a penalty for this from Google? The second choice is to follow the example of overstock.com and pull in information relating to each country, i.e. currency and delivery time. This would be a lot easier, but I am concerned that the lack of geo focus would affect my rankings. Does anyone have any ideas?
Technical SEO | | DavidLenehan0 -
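For the multi-language question above, whichever domain structure is chosen, hreflang annotations let search engines match each language/country variant to the right audience; a sketch using hypothetical example.com URLs (each page lists all of its variants, including itself):

```html
<!-- Hypothetical URLs: every variant of the page carries the same set of annotations -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/product/123" />
<link rel="alternate" hreflang="de-de" href="https://www.example.de/product/123" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/product/123" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/product/123" />
```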
Best Way to Handle - International Content - Different Language
Our site is currently focused on the USA, and the entire site is in the English language. We have considered broadening our scope to include content from foreign countries, e.g. Brazil. What is the best way to approach this? Can we use our existing domain and just have a specific section of the site that is dedicated to a particular country, with content translated into that country's predominant language? Or could this create SEO issues, having a domain with both English and some other language? Would it be better to have this on a totally different domain with a country extension? This is totally foreign territory for me (bad pun intended). Any advice or help would be appreciated. Thanks. Matt
Technical SEO | | MWM37720