Duplicate content on URL trailing slash
-
Hello,
Some time ago, we accidentally made changes to our site that modified the way URLs in links are generated. All at once, trailing slashes were added to many URLs (only in links).
Links that used to point to example.com/webpage.html were now linking to example.com/webpage.html/. URLs in the XML sitemap remained unchanged (no trailing slash).
We started noticing duplicate content (because our site renders the same page with or without the trailing slash). We corrected the problematic PHP URL function, so now all links on the site point to URLs without a trailing slash.
However, Google had time to index these pages. Is implementing 301 redirects required in this case?
-
Yes, you want the URLs to match the canonical tag, so the most effective method is a 301 redirect that brings them in line with the canonical tag, sitemap, robots.txt, etc. You can use a regex pattern like /?$ at the end of the URL; in the case of category URLs, it will allow the trailing slash when needed.
If you use the proper 301s, you will not have to deal with the category issue anyway.
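As a rough sketch of that kind of rule (assuming Apache with mod_rewrite; /category/ is just a hypothetical example of a section where the trailing slash should be kept):

RewriteEngine On
# Skip real directories and the hypothetical /category/ section,
# where the trailing slash is legitimate.
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_URI} !^/category/
# 301 any other URL ending in a slash to the same URL without it.
RewriteRule ^(.+)/$ /$1 [R=301,L]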
rel="canonical" href="https://moz.com/community/q/duplicate-content-on-url-trailing-slash" />
I hope this is able to shed more light on the issue. Great answer, Eric.
Hope I was of help,
Tom
-
Hi Eric,
I was at step 3 of your 3-step plan, looking for confirmation as to whether or not the 301 redirects were required in this situation.
Thanks!
-
Hi yacpro13! Did Eric or Thomas answer your question, and if so, would you mind marking one or both responses as a "Good Answer?"

Otherwise, what questions do you still have?
-
If your URLs were changed to versions with trailing slashes, then there are a few things you'll want to do:
-
make sure all the internal links on your site are updated to point to the proper version.
-
make sure that the sitemap.xml file(s) are correct, pointing to the proper version.
-
set up 301 permanent redirects so that the URLs with the slash redirect to the original URLs without it.
As long as you have corrected the links internally, updated the sitemap file, and set up the 301 redirects, everything should go "back to normal" within a fairly short period of time. You will need to give it time, though, as Google will need to re-crawl all of those URLs and get it all ironed out.
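For the specific page in the question, a minimal sketch of step 3 (assuming Apache with mod_alias, and using example.com/webpage.html as a stand-in for the canonical URL) might look like this:

# Hypothetical single-URL redirect: send the accidental trailing-slash
# version back to the canonical page with a 301.
RedirectMatch 301 ^/webpage\.html/$ http://example.com/webpage.html

A site-wide rule is usually more practical, though; the next answer covers that in detail.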
-
I have provided the Apache and Nginx configurations you would need, in addition to a URL for a tool that will convert Apache htaccess rules to Nginx. The instructions are right here:
Remove Trailing Slash
Just like with the WWW example, some prefer to remove the trailing slash. It's a commonly debated question around the Internet, and it comes down to preference.
Remember, though, that your browser and even your server add a trailing slash to a directory by default; it is done for a reason. If you must strip the trailing slash anyway, this is how you would do it:
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} (.+)/$
RewriteRule ^(.+)/$ http://www.domain.com/$1 [R=301,L]

For Nginx:

location / {
    if (!-e $request_filename) {
        # "permanent" issues a 301, matching the Apache rule above.
        rewrite ^/(.+)/$ http://www.domain.com/$1 permanent;
    }
}
The explanation for this rule is the same as for adding a trailing slash, just in reverse. We can also exclude specific directories that we don't want this rule applied to.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !directory/(.*)$
RewriteCond %{REQUEST_URI} (.+)/$
RewriteRule ^(.+)/$ http://www.domain.com/$1 [R=301,L]

For Nginx:

# The empty block exempts anything under /directory/ from the rewrite below.
location ~ /directory/ { }

location / {
    if (!-e $request_filename) {
        rewrite ^/(.+)/$ http://www.domain.com/$1 permanent;
    }
}
Please see the note about mod_dir and the DirectorySlash directive in the previous example; you might need to turn that directive off.
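For reference, disabling it would look like this (a sketch; only turn it off if you understand how mod_dir handles directory URLs):

# Stop mod_dir from automatically redirecting directory URLs to add a slash.
DirectorySlash Off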
Htaccess converter for Apache to Nginx configuration:
http://winginx.com/en/htaccess
https://www.maxcdn.com/one/tutorial/remove-trailing-slash/
https://www.crucialhosting.com/knowledgebase/htaccess-apache-rewrites-examples
https://moz.com/community/q/how-to-remove-trailing-slashes-in-urls-using-htaccess-apache