Changing URLs: from a short, well-optimised URL to a longer one – what's the traffic risk?
-
I'm working with a client who has a website that is relatively well optimised, though it has a pretty flat structure and a lot of top-level pages.
They've invested in their content over the years and managed to rank well for key search terms.
They're currently in the process of changing CMS, and as a result of the new folder structure in the CMS, the URLs for some pages look to have changed significantly.
E.g. the existing URL is website.com/grampians-luxury-accommodation, which ranked quite well for "luxury accommodation grampians".
The new URL when the site is launched on the new CMS would be website.com/destinations/victoria/grampians.
My feeling is that the client is going to lose out on a bit of traffic as a result of this. I'm looking for information or case studies to demonstrate the degree of risk, and to help me make a recommendation to mitigate it.
-
There's some good news about 301 redirects that you may have missed: since early-to-mid 2016, changing the directory structure (alone) and creating 301 redirects isn't going to (or shouldn't) cause any loss in PageRank.
The old rule that 301s generally cost around 15% of PageRank (confirmed by Matt Cutts in 2013: https://www.youtube.com/watch?v=Filv4pP-1nw) simply no longer holds.
Sceptical? So was I. Read on...
In February 2016, Google Webmaster Trends Analyst John Mueller confirmed this:
Q. Do I lose "link juice" from the redirects?
A. No, for 301 or 302 redirects from HTTP to HTTPS no PageRank is lost.
(Source: https://plus.google.com/+JohnMueller/posts/PY1xCWbeDVC)
Further, Google's Gary Illyes confirmed this in July 2016 on Twitter:
"30x redirects don't lose PageRank anymore."
(Source: https://twitter.com/methode/status/757923179641839616?ref_src=twsrc%5Etfw)
Bear in mind, PageRank is not the only ranking signal.
So, changing URLs for SEO purposes, including improving the directory/subfolder structure, is considered less risky now that 301 redirects preserve PageRank (as long as the content and structure remain the same).
There's a great article on the subject, "301 Redirects Rules Change: What You Need to Know for SEO", here: https://moz.com/blog/301-redirection-rules-for-seo
Remember: For this to work out for you, the content of the page at the receiving end of the 301 needs to match the original source as closely as possible.
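In practice, the lowest-risk migration maps every old flat URL straight to its final nested URL in a single hop (no redirect chains). Purely as an illustrative sketch – the second mapping entry and the .htaccess-style output format are assumptions, not the client's actual site – the redirect map could be kept in one place and the server rules generated from it:

```python
# Build one-hop 301 redirect rules (Apache mod_alias "Redirect" style)
# from a mapping of old flat paths to new nested paths.
# Only the first entry comes from the question; the second is hypothetical.
REDIRECT_MAP = {
    "/grampians-luxury-accommodation": "/destinations/victoria/grampians",
    "/yarra-valley-wine-tours": "/destinations/victoria/yarra-valley",  # hypothetical
}

def htaccess_rules(mapping):
    """Emit one 'Redirect 301' line per old URL, each pointing directly
    at the final destination so no redirect chains are created."""
    return [f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())]

for rule in htaccess_rules(REDIRECT_MAP):
    print(rule)
```

Keeping the map as data also makes it easy to audit later – you can re-check every old URL against its expected destination after launch.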
Good luck!
-
Thanks James, appreciate the speedy response. I found huge amounts of information on URL structure, but many varying points of view and nothing I could relate specifically to the situation.
The site does have a good amount of content for each state and region landing page – which are all top level in the current flat structure.
There are over 100 different regions, so I did feel there would be value in the directory structure the digital agency had recommended, but I also felt there was a large risk of traffic loss in introducing the new sub-directory / hierarchical structure – the general rule of thumb being a 15% PageRank loss for any URL change...
So I'm trying to do a bit of a risk/benefit analysis: whether what the client may lose in traffic initially, they will gain back over time.
Related Questions
-
SEOs: Structuring Your Work Week
Hi, I wanted some feedback on how other SEOs structure their time. I feel as though I'm falling into the trap of firefighting tasks rather than working on substantial projects... I don't feel as though I'm being as effective as I could be. Here's our set-up: an ecommerce site selling thousands of products – I'm more of a generalist with 5 focus areas. 2 x product/merchandising teams – bring in new products, write content/merchandise products. Web team – me (SEO), webmaster, ecommerce manager. Studio – print/email marketing/creative/photography. A lot of my time is split between working for the product teams doing keyword research, briefing them on keywords to use, and checking meta. SEO tasks – site audits/crawls, reporting. Blogs – I try to do a bit as I need it so much for SEO, so I've put a content/social plan together, but getting a lot of things actioned is hard... I'm trying to coordinate this across teams. In between all that, I don't have much time to work on things I know are crucial, like a backlink/outreach plan, blog/user guide/content building, etc. How do you plan your time as an SEO? Big projects? Soon I'm going to pull back from the product optimisation and try focussing on category pages, but for an ecommerce site they are extremely difficult to promote. Just asking for opinions and advice 🙂
Intermediate & Advanced SEO | BeckyKey
-
Changing the permalink structure: worth the risk in 2017?
Hello, I have a WordPress blog that is more than 10 years old. When I created the blog, the permalinks included the date, for example: site.com/2007/02/02/my-post/. Do you think it's worth the risk of changing my URL structure to remove the date? Of course I would do the 301 redirects and such... What I want to know is whether this will have any significant SEO advantage, considering Google has evolved so much.
Intermediate & Advanced SEO | Glinski
Thank you very much for reading my question 🙂
-
What's the best possible URL structure for a local search engine?
Hi Mozzers, I'm working at AskMe.com, which is a local search engine in India, i.e. if you're standing somewhere looking for the pizza joints nearby, we pick up your current location and share the list of pizza outlets nearby, along with ratings, reviews etc. about these outlets. Right now, our URL structure looks like www.askme.com/delhi/pizza-outlets for the city-specific category pages (here, "Delhi" is the city name and "Pizza Outlets" is the category) and www.askme.com/delhi/pizza-outlets/in/saket for a category page in a particular area (here "Saket") in a city. The URL looks a little different if you're searching for something which is not a category (or not mapped to a category, in which case we 301 redirect you to the category page): it looks like www.askme.com/delhi/search/pizza-huts/in/saket if you're searching for Pizza Huts in Saket, Delhi, as "pizza huts" is neither a category nor mapped to any category. We also deal in ads and deals, along with our very own e-commerce brand AskMeBazaar.com, to make a better user experience and a one-stop shop for our customers. Now we're working on a URL restructure project, and my question to all you SEO rockstars is: what is the best possible URL structure we can have? Assume we have kick-ass developers who can manage any given URL structure at the backend.
Intermediate & Advanced SEO | _nitman
-
My blog's categories are outranking my landing pages – what should I do?
Hi, my blog's categories for the ecommerce site are organised by subject and are similar to the product landing pages. Example: Domain.com/laptops, which sells laptops, and Domain.com/blog/laptops, which shows news and articles on laptops. Within the blog posts, links with the anchor "laptop" point to the store. What should I do? Thanks
Intermediate & Advanced SEO | BeytzNet
-
Should I disallow my sub-folder country TLDs via robots.txt?
Hello, my website is in English by default, with Spanish as a sub-folder "TLD". Because of my Joomla platform, Google is listing hundreds of soft-404 links for French, Chinese, German etc. sub-folders. Again, I never created these country sub-folder URLs, but Google is crawling them. Is it best to just disallow these sub-folders, like the example below, then "mark as fixed" in the crawl errors section of Google Webmaster Tools?
User-agent: *
Disallow: /de/
Disallow: /fr/
Disallow: /cn/
Thank you, Shawn
Intermediate & Advanced SEO | Shawn124
-
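As an aside, Disallow rules like the ones in this question can be sanity-checked locally with Python's standard-library robotparser before deploying. A quick sketch (example.com is a placeholder domain):

```python
from urllib import robotparser

# Parse the proposed robots.txt rules locally and check which paths
# a rule-respecting crawler would be blocked from fetching.
rules = """\
User-agent: *
Disallow: /de/
Disallow: /fr/
Disallow: /cn/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The unwanted language folders are blocked...
assert not rp.can_fetch("*", "https://example.com/de/some-page")
assert not rp.can_fetch("*", "https://example.com/fr/some-page")
# ...while the real English and Spanish content stays crawlable.
assert rp.can_fetch("*", "https://example.com/es/some-page")
assert rp.can_fetch("*", "https://example.com/")
print("robots.txt rules behave as intended")
```

Note that Disallow stops crawling, not necessarily indexing of already-known URLs, so it's worth checking the soft-404 pages drop out of the index over time as well.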
Creating 100,000s of pages: good or bad idea?
Hi folks, over the last 10 months we have focused on quality pages, but have been frustrated with competitor websites outranking us because they have bigger sites. Should we focus on the long tail again? One option for us is to take every town across the UK and create pages using our activities, e.g.:
Stirling
Stirling paintball
Stirling Go Karting
Stirling Clay shooting
We are not going to link to these pages directly from our main menus, but from the sitemap. These pages would then show activities within a 50-mile radius of the towns. At the moment we have focused our efforts on regions, e.g. Paintball Scotland, Paintball Yorkshire, focusing all the internal link juice on these regional pages, but we don't rank highly for towns that the activity sites are close to. With 45,000 towns and 250 activities we could create over a million pages, which seems very excessive! Would creating 500,000 of these types of pages damage our site? This is my main worry – or would it make our site rank even higher for the tougher keywords and also get lots of traffic from the long tail like we used to? Is there a limit to how big a site should be?
Intermediate & Advanced SEO | PottyScotty
-
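As an aside on the "50 mile radius" mechanics in this question: the filtering itself is a straightforward great-circle distance check. A minimal sketch – the venue names and coordinates below are made up for illustration, not real locations:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_MILES = 3958.8

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))

def activities_near(town, activities, radius_miles=50):
    """Return the names of activity sites within radius_miles of a town."""
    return [a["name"] for a in activities
            if haversine_miles(town["lat"], town["lon"], a["lat"], a["lon"]) <= radius_miles]

# Hypothetical data: one town and two activity sites.
stirling = {"lat": 56.12, "lon": -3.94}
sites = [
    {"name": "Paintball A", "lat": 56.00, "lon": -3.80},   # roughly 10 miles away
    {"name": "Go Karting B", "lat": 54.97, "lon": -1.61},  # well over 100 miles away
]
print(activities_near(stirling, sites))  # only the nearby site survives the filter
```

Whether generating 500,000 such pages is wise is a separate (content-quality) question; this only shows that the radius filter itself is cheap to compute.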
Has anyone managed to change "At a glance:" in local search results?
On Google's local search results, i.e. when the Google Places data is displayed along with the map on the right-hand side of the search results, there is also an element "At a glance:".
The data being displayed is from some years ago, and the client would, if possible, like it to reflect their current services, which they have been providing for some five years. According to Google support here – http://support.google.com/maps/bin/answer.py?hl=en&answer=1344353 – this cannot be changed. They say: "Can I edit a listing's descriptive terms or suggest a new one? No; the terms are not reviewed, curated, or edited. They come from an algorithm, and we do not help that algorithm figure it out." My question is: has anyone successfully influenced this data, and if so, how?
Intermediate & Advanced SEO | DeanAndrews
Tool to calculate the number of pages in Google's index?
When working with a very large site, are there any tools that will help you calculate the number of pages in the Google index? I know you can use site:www.domain.com to see all the pages indexed for a particular URL. But what if you want to see the number of pages indexed for 100 different subdirectories (i.e. www.domain.com/a, www.domain.com/b)? Is there a tool to help automate the process of finding the number of pages from each subdirectory in Google's index?
Intermediate & Advanced SEO | nicole.healthline