How to force a trailing slash after the domain name
-
My campaign analysis is predictably listing domain.com and domain.com/ as duplicate content. I've searched and searched but cannot find a way to force a trailing slash on the end of the domain name unless there's a file or directory after it.
Is there a way to accomplish this using .htaccess?
-
I've gone with this .htaccess, based on the one from your soulgorithm.com test site:
Options +FollowSymlinks
RewriteEngine on
RewriteBase /
RewriteCond %{HTTP_HOST} ^www.domain.co.uk [NC]
RewriteRule (.*) http://domain.co.uk/$1 [L,R=301]

RewriteCond %{REQUEST_URI} (.*)/$
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule (.*)/$ $1.php [L]

RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule .* %{REQUEST_FILENAME}/ [R=301,L]

and I'm now getting the results I'm after. I'm seeing the same behaviour as you in Firefox and IE, which explains a lot. I really appreciate the lengths you've gone to in helping me here, so a big thank you!
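As an aside, one small optional refinement to the host condition used above: the unescaped dots in ^www.domain.co.uk are regex wildcards that match any character, so escaping them (and anchoring the end) makes the match exact. The behaviour is otherwise identical:
RewriteCond %{HTTP_HOST} ^www\.domain\.co\.uk$ [NC]
RewriteRule (.*) http://domain.co.uk/$1 [L,R=301]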
-
Test Site: soulgorithm.com
In the .htaccess file for this site:
Options +FollowSymlinks
RewriteEngine on
RewriteBase /
RewriteCond %{HTTP_HOST} ^www.soulgorithm.com [NC]
RewriteRule (.*) http://soulgorithm.com/$1 [L,R=301]

RewriteCond %{REQUEST_URI} (.*)/$
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule (.*)/$ $1.html [L]

RewriteCond %{REQUEST_URI} (.*)/$
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule (.*)/$ $1.php [L]

RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.html -f [OR]
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule .* %{REQUEST_FILENAME}/ [R=301,L]

Which has the following effect:
soulgorithm.com > soulgorithm.com/ (slash is added, but it only shows in IE and looks like it's being stripped by Firefox, although the page still loads fine)
soulgorithm.com/ > soulgorithm.com/ (loads fine, but the slash only shows in IE and looks like it's being stripped by Firefox; the page still loads fine)
soulgorithm.com/test > soulgorithm.com/test/ (loads fine, slash even shows in Firefox)
soulgorithm.com/test/ > soulgorithm.com/test/ (loads fine)
soulgorithm.com/testdir > soulgorithm.com/testdir/ (loads fine, slash even shows in Firefox)
soulgorithm.com/testdir/ > soulgorithm.com/testdir/ (loads fine, slash even shows in Firefox)
Let me know if this is what you see. I feel like it's getting close to working.
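As an aside on the first two results: the trailing slash after a bare hostname is simply the root path of the URL, and every HTTP client sends it whether or not the address bar displays it. The request for http://soulgorithm.com (with or without the typed slash) always looks like this on the wire:
GET / HTTP/1.1
Host: soulgorithm.com
So any difference between IE and Firefox on the bare domain is purely an address-bar display choice, not a sign that the redirect is failing.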
-
Thanks for sticking with this. Rather than sharing the domain here, do you know of any example sites using your code (or something similar) that add a trailing slash after the domain name? I'd like to rule out my browser stripping it.
-
Man, my mind is blown right now. I'm not giving up and hopefully someone else can chime in on this discussion and shed some light on this issue.
The code provided should have worked. Let me look into it some more. Also, if you don't mind, what is the actual domain name?
-
That's right - nothing in there but the code you supplied.
-
Is this the only thing you have in your .htaccess file?
If not, I would remove everything in the file, keep only what I posted above, and let me know if it works.
-
Nope. Still no trailing slashes being added.
-
Try just the following:
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !index.php
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*)$ http://www.domain.com/$1/ [L,R=301]
Let me know if this works for you.
-
Thanks for the reply, but this looks like all the other examples I've found. My .htaccess file looks like this:
DirectoryIndex index.php
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ $1.php [L,QSA]

RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*)$ http://domain.co.uk/$1/ [L,R=301]

But I get the following redirects going on:
domain.co.uk > domain.co.uk (i.e. nothing happens)
domain.co.uk/ > domain.co.uk (i.e. the slash is removed)
domain.co.uk/page2 > domain.co.uk/page2 (i.e. nothing happens, but the page loads)
domain.co.uk/page2/ > Internal server error
Any ideas?
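A likely explanation for the 500 error, and a possible restructuring (this is only a sketch, assuming the content files are .php files in the document root, as the rules above imply): for domain.co.uk/page2/, the first rule sees something that is neither a file nor a directory and rewrites it to page2/.php, which again matches the conditions, so .php keeps being appended until Apache hits its internal redirect limit and returns an internal server error. Doing the trailing-slash redirect first, and only rewriting to .php when that file actually exists, avoids the loop:
DirectoryIndex index.php
RewriteEngine On
RewriteBase /

# 1) External 301: add a trailing slash to anything that is not a real file
#    or directory and does not already end in a slash.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*)$ http://domain.co.uk/$1/ [L,R=301]

# 2) Internal rewrite: serve /page2/ from page2.php, but only when that
#    .php file actually exists, so this rule cannot loop.
RewriteCond %{DOCUMENT_ROOT}/$1.php -f
RewriteRule ^(.+?)/?$ $1.php [L,QSA]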
-
Hi Clive.
Yes, you can easily do this with an .htaccess file. Here is the code:
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*)$ http://domain.com/$1/ [L,R=301]
Just replace "domain.com" with the proper URL for your site. This should be all that is needed.
Hope this helps!
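For anyone skimming, here is the same suggestion again with each directive annotated (nothing extra added; domain.com is a placeholder, as above):
RewriteEngine On
# Only act on requests that are not for a real file on disk
RewriteCond %{REQUEST_FILENAME} !-f
# ...and whose path does not already end in a slash
RewriteCond %{REQUEST_URI} !(.*)/$
# 301-redirect everything else to the same path with a trailing slash appended
RewriteRule ^(.*)$ http://domain.com/$1/ [L,R=301]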