URL Rewriting Best Practices
-
Hey Moz!
I’m getting ready to implement URL rewrites on my website to improve site structure/URL readability. More specifically I want to:
- Improve our website structure by removing redundant directories.
- Replace underscores with dashes and remove file extensions for our URLs.
Please see my example below:
Old structure: http://www.widgets.com/widgets/commercial-widgets/small_blue_widget.htm
New structure: https://www.widgets.com/commercial-widgets/small-blue-widget
I've read several URL rewriting guides online, all of which describe broadly similar but slightly different methods. I'm looking for what's considered best practice for implementing these rewrites. From what I understand, the most common method is to implement them in our .htaccess file using mod_rewrite (which will match requests for the old URLs and rewrite them according to the rules I define).
One question I can't seem to find a definitive answer to: when I implement the rewrites to remove file extensions and replace underscores with dashes, do the webpage file names themselves need to be renamed to the new format? From what I understand, the file names must remain the same for the rewrites in the .htaccess to work, while our internal links (including canonical links) must be updated to the new URL format. Can anyone shed light on this?
Also, I'm aware that implementing URL rewriting improperly could negatively affect our SERP rankings. If I redirect our old website directory structure to our new structure using a rewrite, are my bases covered as far as having the proper 301 redirects in place so our rankings aren't hurt?
Please offer any advice/reliable guides to handle this properly.
Thanks in advance!
-
Thanks for clearing that up and all of the help!
-
I'm saying: rename the files first, and use a rewrite only for removing the extensions.
You will have to do a rewrite for replacing underscores with hyphens anyway, just for redirect purposes.
So: rename the files from underscores to hyphens; add a rewrite rule redirecting underscore URLs to the hyphen versions to ensure the old pages are redirected; add another rewrite for removing the file extensions. In some time (2-4 months), once the old file names (with underscores) are out of Google's index, delete the first rewrite.
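In .htaccess, the redirect stage could look roughly like this (an untested sketch: it assumes the files have already been renamed to hyphens per step one, sit under the site root, and keep their .htm extension on disk, so test on a staging copy first):
<code>
RewriteEngine On
RewriteBase /

# Step 2: redirect any URL still containing an underscore to its
# hyphenated form. One underscore is replaced per request, so URLs
# with several underscores pass through a short redirect chain.
RewriteRule ^([^_]*)_(.*)$ /$1-$2 [R=301,L]

# Step 3: redirect direct .htm requests to the extensionless URL,
# so only one version of each page stays reachable...
RewriteCond %{THE_REQUEST} \s/([^\s?]+)\.htm[\s?] [NC]
RewriteRule ^ /%1 [R=301,L]

# ...and internally map the extensionless URL back to the .htm file
# on disk (no redirect visible to the browser).
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.htm -f
RewriteRule ^(.*)$ $1.htm [L]
</code>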
-
Hey Dmitrii,
I was planning on using two rewrites.
One rewrite for replacing the underscores with hyphens.
And another rewrite for removing the file extensions.
Just so I fully understand, you recommend implementing the rewrite for replacing the underscores with hyphens in our .htaccess file. Then once the new URLs are indexed, change the webpage file names themselves by replacing the underscores with hyphens, make the newly named files live and remove this rewrite from our .htaccess. Is my understanding correct?
Again...thanks for all of your help!
-
Well, I thought that's what you were going to do, and use the rewrite just for deleting file extensions. Honestly, I'd leave the file extensions alone and simply rename the files to hyphens. That way there's no server processing involved.
-
Another question just popped into my head...
Once our new website directory structure and URL format have been rewritten, redirected, and indexed by search engines, would it make sense to edit the actual webpage file names (replacing the underscores with hyphens) and then remove the URL rewrite that replaces the underscores with hyphens? Or is this not recommended?
-
Thanks for the help Dmitrii!
Both the rewrite I posted above and the one you provided for removing file extensions failed to work. However, this one (taken from the Apache help forums) seems to do the trick:
<code>
# Redirect any direct .htm or .php request to its extensionless URL
RewriteCond %{THE_REQUEST} ^[A-Z]{3,}\s([^.]+)\.htm [NC,OR]
RewriteCond %{THE_REQUEST} ^[A-Z]{3,}\s([^.]+)\.php [NC]
RewriteRule ^ %1 [R,L]
</code>
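Note for anyone using this: the bare [R] flag defaults to a 302 (temporary) redirect; since the whole point is permanently passing ranking signals, it's worth making the 301 explicit:
<code>
RewriteRule ^ %1 [R=301,L]
</code>
(Incidentally, the REQUEST_FILENAME snippet further down the thread checks for .html files while these pages end in .htm, which may be why it never matched.) Depending on the server configuration, an internal rule mapping the extensionless URL back to the .htm file may also be needed so the redirect target doesn't 404.
-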
Yes, I believe so; that's the only rewrite you'd need to avoid messing up rankings.
I don't know if one of those snippets is better than the other. All I know is that my piece of code works, and I haven't used the one you wrote. It looks OK to me, but just test it. If it works, I don't think there's any difference.
-
Hey Dmitrii,
This rewrite that I posted above...
<code>RewriteRule ^old/(.*)$ /new/$1 [L,R=301]</code>
...isn't intended to remove the file extensions. I'm using it to redirect the old directory structure to our new directory structure.
I was asking whether this rewrite, when changing our directory structure, is all I need to have the necessary redirects in place so our SEO/SERP rankings aren't negatively affected. Any idea?
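For our site, that directory redirect would be something like this (assuming the only structural change is dropping the redundant /widgets/ directory):
<code>
# 301 old /widgets/... URLs to the same path without the directory
RewriteRule ^widgets/(.*)$ /$1 [R=301,L]
</code>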
Also, would you recommend the rewrite you provided above over the one below when removing file extensions?
<code>
RewriteEngine on
RewriteBase /
# Serve /page by internally appending .html when such a file exists
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.*)$ $1.html [L]
</code>
Let me know if I'm being clear enough.
Thanks!
-
The rule you wrote won't work for removing extensions.
What it will do is redirect this: domain.com/old/small_blue_widget.htm to this: domain.com/new/small_blue_widget.htm
To remove the extension you would need:
<code>RewriteRule ^([^\.]+)$ $1.htm [NC,L]</code>
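If the site also has real directories or existing files at extensionless paths, a slightly more defensive variant (an untested sketch) only applies the mapping when a matching .htm file actually exists:
<code>
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME}.htm -f
RewriteRule ^([^\.]+)$ $1.htm [NC,L]
</code>
-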
Thanks for the response Dmitrii!
Thanks for confirming that I don't need to update the webpage file names.
Do you know if redirecting the old directories to the new ones (using the rewrite below) is all I need to do regarding redirects? In other words, when redirecting directories with the rewrite below, is there any need to also redirect the old URL format (small_blue_widget.htm) to the new one (small-blue-widget)? My understanding is no, all I need to do is redirect the directories; but please share your knowledge.
Thanks in advance!
<code>RewriteRule ^old/(.*)$ /new/$1 [L,R=301]</code>
-
Hi there.
Well, as for best practices, you've got it covered: substitute hyphens for underscores, remove redundant directories, make URLs readable and understandable for users, and implement redirects for pages that are being renamed.
As for removing extensions from files, I'm not sure it has any effect on SEO or user experience at all. But no, you don't have to create new-format pages. Basically, what mod_rewrite does is this: when somebody requests a page, the server says "I'm going to serve you this file with this name, because you sent me this specific request." Just make sure the original URL and the rewritten URL aren't both accessible at the same time, since that would create duplicate content issues.
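To make that concrete with the example from the question, a single explicit mapping could look like this (just an illustration; in practice you'd use a general rule rather than one per page):
<code>
# Visitors request the clean URL; the old file keeps its name on disk
RewriteRule ^commercial-widgets/small-blue-widget$ commercial-widgets/small_blue_widget.htm [L]
</code>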
As for the effect on rankings: as long as all redirects are done properly and the URLs target the keywords on the page, you should be fine.