URL Rewriting Best Practices
-
Hey Moz!
I’m getting ready to implement URL rewrites on my website to improve site structure and URL readability. More specifically, I want to:
- Improve our website structure by removing redundant directories.
- Replace underscores with dashes and remove file extensions for our URLs.
Please see my example below:
Old structure: http://www.widgets.com/widgets/commercial-widgets/small_blue_widget.htm
New structure: https://www.widgets.com/commercial-widgets/small-blue-widget
I've read several URL rewriting guides online, all of which describe broadly similar but slightly different methods. I'm looking for what's considered best practice for implementing these rewrites. From what I understand, the most common method is to implement the rewrites in our .htaccess file using mod_rewrite (which matches the old URLs and rewrites or redirects them according to the rules I define).
One question I can't seem to find a definitive answer to: when I implement the rewrite to remove file extensions and replace underscores with dashes in our URLs, do the webpage file names need to be edited to match the new format? From what I understand, the webpage file names must remain the same for the rewrites in the .htaccess to work; however, our internal links (including canonical links) must be changed to the new URL format. Can anyone shed light on this?
Also, I'm aware that implementing URL rewriting improperly could negatively affect our SERP rankings. If I redirect our old website directory structure to our new structure using this rewrite, are my bases covered as far as having the proper 301 redirects in place so our rankings aren't negatively affected?
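To make the directory piece concrete, this is roughly the kind of rule I have in mind (just a sketch based on my example paths above, not yet tested):
RewriteEngine On
# permanently redirect anything under the redundant /widgets/ directory up one level
RewriteRule ^widgets/(.*)$ /$1 [R=301,L]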
Please offer any advice/reliable guides to handle this properly.
Thanks in advance!
-
Thanks for clearing that up and all of the help!
-
I'm saying rename the files first, then add a rewrite for removing extensions.
You will have to add a rewrite for replacing underscores with hyphens anyway, just for redirect purposes.
So: rename the files from underscores to hyphens; add a rewrite rule mapping underscores to hyphens to ensure the old pages are redirected; then add another rewrite for removing the file extensions. In some time (2-4 months), once the old file names (with underscores) are out of Google's index, delete the first rewrite.
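Something like this for the underscore part (a rough sketch, untested; keep in mind mod_rewrite replaces one underscore per pass, so a URL with several underscores goes through a few redirect hops before settling):
# redirect underscore URLs to their renamed hyphen equivalents, one underscore at a time
RewriteRule ^([^_]*)_(.*)$ /$1-$2 [R=301,L]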
-
Hey Dmitrii,
I was planning on using two rewrites.
One rewrite for replacing the underscores with hyphens.
And another rewrite for removing the file extensions.
Just so I fully understand, you recommend implementing the rewrite for replacing the underscores with hyphens in our .htaccess file. Then, once the new URLs are indexed, change the webpage file names themselves by replacing the underscores with hyphens, make the newly named files live, and remove this rewrite from our .htaccess. Is my understanding correct?
Again...thanks for all of your help!
-
Well, I thought that's what you were going to do, using the rewrite just for removing file extensions. Honestly, I'd leave the file extensions alone and simply rename the files to use hyphens. That way there's no server processing involved.
-
Another question just popped into my head...
Once our new website directory structure and URL format have been rewritten, redirected, and indexed by search engines, would it make sense to edit the actual webpage file names (replacing the underscores w/ hyphens) and then remove the URL rewrite that replaces the underscores with hyphens? Or is this not recommended?
-
Thanks for the help, Dmitrii!
Both the rewrite I posted above and yours for removing file extensions failed to work. However, it seems this one does the trick (taken from the Apache help forums).
RewriteCond %{THE_REQUEST} ^[A-Z]{3,}\s([^.]+)\.htm [NC,OR]
RewriteCond %{THE_REQUEST} ^[A-Z]{3,}\s([^.]+)\.php [NC]
RewriteRule ^ %1 [R=301,L]
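One caveat for anyone copying this: the snippet above only issues the redirect; the extensionless URL still has to be mapped back onto the real file internally, which is what a rule like Dmitrii's $1.htm one further down does. A guarded version might look like this (sketch, untested):
# skip real directories, and only add the extension when the matching .htm file exists
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.htm -f
RewriteRule ^([^.]+)$ $1.htm [L]
(Incidentally, my other snippet tested for .html while our files are .htm, which may be part of why it failed.)
-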
Yes, I believe so; that's the only rewrite you'd need to avoid messing up your rankings.
I don't know if one of those snippets is better than the other. All I know is that my piece of code works, and I haven't used the one you wrote. It looks OK to me, but test it. If it works, I don't think there's any difference.
-
Hey Dmitrii,
This rewrite that I posted above...
<code>RewriteRule ^old/(.*)$ /new/$1 [L,R=301]</code>
...isn't intended to remove the file extensions. I'm using it to redirect the old directory structure to our new directory structure.
I was asking whether using this rewrite when changing my directory structure is all I need to have the necessary redirects in place so our SEO/SERP rankings aren't negatively affected. Any idea?
Also, would you recommend the rewrite you provided above over the one below when removing file extensions?
RewriteBase /
RewriteEngine on
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.*)$ $1.html
Let me know if I'm being clear enough.
Thanks!
-
The rule you wrote won't work.
What it will do is redirect this: domain.com/old/small_blue_widget.htm to this: domain.com/new/small_blue_widget.htm
To remove the extension would be:
<code>RewriteRule ^([^\.]+)$ $1.htm [NC,L]</code>
-
Thanks for the response, Dmitrii!
Thanks for confirming that I don't need to update the webpage file names.
Do you know if redirecting the old directories to the new ones (using the rewrite below) is all I need to do regarding redirects? In other words, when redirecting directories with the rewrite below, is there any need to redirect the old URL format (small_blue_widget.htm) to the new one (small-blue-widget)? My understanding is no, all I need to do is redirect the directories; but please share your knowledge. Thanks in advance!
<code>RewriteRule ^old/(.*)$ /new/$1 [L,R=301]</code>
-
Hi there.
Well, as for best practices, you've got it covered: substitute hyphens for underscores, remove redundant directories, make URLs readable and understandable for users, and implement redirects for the pages being renamed.
As for removing extensions from files, I'm not sure it has any effect on SEO or user experience at all. But no, you don't have to create pages in the new format. Basically, with mod_rewrite, when somebody requests a page the server says, "I'm going to serve you this file with this name, because you sent me this specific request." Just be aware that you shouldn't leave both the original URL and the rewritten URL accessible at the same time, since that would create duplicate content issues.
As for the effect on rankings: as long as all the redirects are done properly and the URLs target the keywords on their pages, you should be fine.