Duplicate content on URL trailing slash
-
Hello,
Some time ago, we accidentally made changes to our site that modified the way URLs in links are generated. Suddenly, trailing slashes were added to many URLs (only in links).
Links that used to point to example.com/webpage.html were now linking to example.com/webpage.html/. URLs in the XML sitemap remained unchanged (no trailing slash).
We started noticing duplicate content (because our site renders the same page with or without the trailing slash). We corrected the problematic PHP URL function so that all links on the site now point to the URL without the trailing slash.
However, Google had time to index these pages. Is implementing 301 redirects required in this case?
-
Yes. You want everything to match the canonical version, so the most effective method is a 301 redirect so that the URLs match the canonical tag, the sitemap, robots.txt, etc. You can use a regex pattern like /?$ at the end of the URL; in the case of category URLs, it will still allow the trailing slash when needed.
If you use proper 301s, you will not have to deal with the category issue anyway.
rel="canonical" href="https://moz.com/community/q/duplicate-content-on-url-trailing-slash" />
I hope this sheds more light on the issue. Great answer, Eric.
Hope I was of help,
Tom
-
Hi Eric,
I was at step 3 of your 3-step plan, looking for confirmation as to whether or not the 301 redirects were required in this situation.
Thanks!
-
Hi yacpro13! Did Eric or Thomas answer your question? If so, would you mind marking one or both responses as a "Good Answer"?
Otherwise, what questions do you still have?
-
If your URLs have been changed to versions with trailing slashes, then there are a few things you'll want to do:
-
make sure all the internal links on your site are updated to point to the proper version.
-
make sure that the sitemap.xml file(s) are correct, pointing to the proper version.
-
set up 301 permanent redirects so that the URLs with the trailing slash redirect to the original URLs without it.
As long as you have corrected the links internally, updated the sitemap file, and set up the 301 redirects, everything should go "back to normal" within a fairly short period of time. You will need to give it time, though, as Google will need to re-crawl all of those URLs and get it all ironed out.
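As a sketch of that third step for this specific case (the .html/ URLs described in the question), a rule like the following would send every URL ending in ".html/" back to the version without the slash. This assumes Apache with mod_rewrite enabled, and example.com is a placeholder for the real domain:

RewriteEngine On
# 301 any request ending in ".html/" to the same URL without the trailing slash
RewriteRule ^(.+\.html)/$ http://www.example.com/$1 [R=301,L]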
-
I have provided the Apache and Nginx configurations you would need, along with a URL that will convert Apache .htaccess rules to Nginx configuration. The instructions are right here:
Remove Trailing Slash
Just like with the WWW example, some prefer to remove the trailing slash. It's a commonly debated question that you'll find around the Internet, but it just depends on what you prefer.
Remember, though, that your browser and even your server add a trailing slash to a directory by default. This is done for a reason. If you must strip the trailing slash, this is how you would do it:
<code class="hljs apache">RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_URI} !(.*)$ RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]</code>
For Nginx
# nginx configuration ("permanent" issues a 301)
location / {
    if (!-e $request_filename) {
        rewrite ^/(.*)/$ http://www.domain.com/$1 permanent;
    }
}
The explanation for this rule is the same as for adding a trailing slash, just in reverse. We can also specify directories that we don't want to apply this rule to.
<code class="hljs apache">RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_URI} !directory/(.*)$ RewriteCond %{REQUEST_URI} !(.*)$ RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]</code>
For Nginx
# nginx configuration; the empty block keeps /directory/ URLs out of the rewrite below
location ~ ^/directory/ {
}
location / {
    if (!-e $request_filename) {
        rewrite ^/(.*)/$ http://www.domain.com/$1 permanent;
    }
}
Please see the note about mod_dir and the DirectorySlash directive in the previous example. You might need to turn this directive off.
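If mod_dir keeps re-adding the slash to real directories (which can cause a redirect loop with the rule above), a minimal sketch of turning it off in .htaccess looks like this; use it with care, since it changes how directory URLs are served:

# Stop mod_dir from appending a trailing slash to directory requests
DirectorySlash Off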
.htaccess converter for Apache to Nginx configuration:
http://winginx.com/en/htaccess
https://www.maxcdn.com/one/tutorial/remove-trailing-slash/
https://www.crucialhosting.com/knowledgebase/htaccess-apache-rewrites-examples
https://moz.com/community/q/how-to-remove-trailing-slashes-in-urls-using-htaccess-apache