Having a hard time with duplicate page content
-
I'm having a hard time redirecting website.com/ to website.com
The crawl report shows both versions as duplicate content.
Here is my .htaccess:
RewriteEngine On
RewriteBase /
#Rewrite bare to www
RewriteCond %{HTTP_HOST} ^mywebsite.com
RewriteRule ^(([^/]+/)*)index.php$ http://www.mywebsite.com/$1 [R=301,L]
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.*)$ $1.php [NC,L]
RewriteCond %{HTTP_HOST} !^.localhost$ [NC]
RewriteRule ^(.+)/$ http://%{HTTP_HOST}$1 [R=301,L]
I added the last 2 lines after seeing a Q&A here, but I don't think it has helped.
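For anyone comparing notes, a minimal sketch of the trailing-slash removal those last two lines seem to be aiming for. Two things stand out in the snippet above: in a per-directory .htaccess the captured path has no leading slash, so the substitution needs an explicit / before $1, and a !-d check keeps real directories (which mod_dir immediately re-slashes) out of a redirect loop:
RewriteEngine On
# 301 any trailing-slash URL to the slashless version,
# skipping real directories so mod_dir doesn't loop it back
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)/$ http://%{HTTP_HOST}/$1 [R=301,L]
One caveat on the homepage itself: website.com/ and website.com are the same URL (browsers always request /), so a duplicate flagged at the root usually comes from something else, such as an index.php variant.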
-
Does the URL redirect in the browser, or do you still need to get the redirect itself set up correctly?
-
Tried this, ran new crawl, still says duplicate pages.
-
Thanks, I'll give it a shot and recrawl.
-
This is what we use
Replace html with php
RewriteEngine on
RewriteCond %{THE_REQUEST} ^.*/index\.html?\ HTTP/
RewriteRule ^(.*)index\.html?$ "/$1" [R=301,L]
I'd remove the last 2 lines from your post and add that in...
Hope it helps.
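Spelled out with the swap suggested above, a sketch of the php version (assuming the aim is to 301 index.php URLs back to their directory):
RewriteEngine on
# 301 any URL ending in index.php back to its directory URL;
# matching THE_REQUEST means only direct client requests trigger
# the redirect, not internal rewrites
RewriteCond %{THE_REQUEST} ^.*/index\.php\ HTTP/
RewriteRule ^(.*)index\.php$ /$1 [R=301,L]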
Related Questions
-
Same site serving multiple countries and duplicated content
Hello! Though I browse Moz resources every day, I've decided to ask you a question directly, despite the numerous questions (and answers!) about this topic, as there are a few specific variants each time: I have a site serving content (and products) to different countries, built using subfolders (one subfolder per country). Basically, it looks like this:
site.com/us/
site.com/gb/
site.com/fr/
site.com/it/
etc. The first problem was fairly easy to solve:
Avoid duplicated content issues across the board, considering that both the ecommerce part of the site and the blog are replicated for each subfolder in its own language. Correct me if I'm wrong, but using our copywriters to translate the content and adding the right hreflang tags should do it. But then comes the second problem: how to deal with duplicated content when it's written in the same language, e.g. /us/, /gb/, /au/ and so on?
Given the following requirements/constraints, I can't see any positive resolution to this issue:
1. The structure needs to be maintained (it's not possible to consolidate the same language within one single subfolder, for example),
2. Articles can't be canonicalized from one subfolder to another, as it would mess up our internal tracking tools,
3. The amount of content being published prevents us from getting bespoke content for each region of the world with the same spoken language. Given those constraints, I can't see a way to sort it out, and it seems I'm cursed to live with those duplicated content red flags right up my nose.
Am I right or can you think about anything to sort that out? Many thanks,
Ghill
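On the first problem, hreflang alternates can be sent as HTTP Link headers as well as in-page tags; a sketch for the homepage set, assuming Apache 2.4 with mod_headers and hypothetical site.com URLs (in practice a CMS usually emits these per page):
# hreflang alternates for the /us/ homepage; every variant,
# including the page itself, must list the full set
<If "%{REQUEST_URI} == '/us/'">
  Header add Link '<https://site.com/us/>; rel="alternate"; hreflang="en-us"'
  Header add Link '<https://site.com/gb/>; rel="alternate"; hreflang="en-gb"'
  Header add Link '<https://site.com/fr/>; rel="alternate"; hreflang="fr-fr"'
</If>
Region-qualified codes like en-us versus en-gb are also the usual answer to the same-language problem: properly cross-annotated regional variants are treated as alternates rather than ordinary duplicates, with no canonical required.
-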
Problem with Duplicate Page Wordpress
Hi all, my name is Riccardo and I work for a web agency. I'm working on a new client website and I have found these kinds of errors through Moz (Image 1). I checked all the URLs; they work, and they all point to the homepage.
The website is made with WordPress. I have already tried to solve this problem with a 301 redirect but, as I suspected, it didn't work.
I think it is a problem related to the WordPress URL in the WordPress settings (Image 2). However, I would like to know if anybody has had the same problem or if there are other possible causes. Thank you in advance!
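If the flagged URLs are homepage variants (a common WordPress pattern), a hedged .htaccess sketch that 301s direct index.php requests back to the root; the actual duplicate URLs aren't visible here, so treat this as illustrative:
RewriteEngine On
# 301 direct requests for /index.php to the bare root;
# checking THE_REQUEST avoids looping on WordPress's own
# internal rewrite of pretty permalinks to index.php
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php[?\ ]
RewriteRule ^index\.php$ / [R=301,L]
If the duplicates are http/https or www/non-www homepage versions instead, the fix is a host-level 301 plus matching WordPress Address and Site Address values in Settings > General.
-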
Content Publishing Volume/Timing
I am working with a company that has a bi-monthly print magazine that has several years' worth of back issues. We're working on building a digital platform, and the majority of articles from the print mag - tips, how-tos, reviews, recipes, interviews, etc - will be published online. Much of the content is not date-sensitive except for the occasional news article. Some content is semi-date-sensitive, such as articles focusing on seasonality (e.g. winter activities vs. summer activities). My concern is whether, once we prepare to go live, we should ensure that ALL historical content is published at once, and if so, whether back-dates should be applied to each content piece (even if dating isn't relevant), or whether we should have a strategy in place in terms of creating a publishing schedule and releasing content over time - albeit content that is older but isn't necessarily time-sensitive (e.g. a drink recipe). Going forward, all newly-created content will be published around the print issue release. Are there pitfalls I should avoid in terms of pushing out so much back content at once?
-
Bigcommerce & Blog Tags causing Duplicate Content?
Curious why Moz would pick up our blog tags as causing duplicate content, when each blog post has a rel=canonical tag pointing to the post itself, and each tag page has one pointing to the blog as a whole. I kinda want to get rid of the tags in general now, but I also feel they can add some extra value to UX later on when we have many more blog posts. Curious if anyone knows a way around this, or a best-practice solution when faced with such odd issues? I can see why the duplicate content would happen, but when grouping content into categories?
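One thing worth checking first: a canonical is an indexing hint, not a crawl block, so a crawler may still fetch the tag pages and report them as near-duplicates even when the tags are correct. For completeness, canonicals can also be sent at the server level; a sketch with a hypothetical /blog/tag/ path, assuming Apache 2.4 with mod_headers:
# Canonicalize all tag archive pages to the blog index
# (hypothetical URL pattern; adjust to the store's real paths)
<If "%{REQUEST_URI} =~ m#^/blog/tag/#">
  Header set Link '<https://www.example.com/blog/>; rel="canonical"'
</If>
-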
Is legacy duplicate content an issue?
I am looking for some proof, or at least evidence, as to whether or not sites are being hurt by duplicate content. The situation is that there were 4 content-rich newspaper/magazine-style sites that were basically just reskins of each other [a tactic used under a previous regime 😉]. The least busy of the sites has since been discontinued and 301d to one of the others, but the traffic on the discontinued site was so low as to be lost in noise, so it is unclear if that was any benefit. For the last ~2 years all the sites have had unique content going up, but the archives of articles are still on all 3 remaining sites. Now I would like to know whether to redirect, remove, or rewrite the content, but it is a big decision: the number of duplicate articles is 263,114! Is there a chance this is hurting one or more of the sites? Is there any way to prove it, short of actually doing the work?
-
Duplicate Content on Press Release?
Hi, we recently held a charity night in store and had a few local celebs turn up. We created a press release to send out to various media outlets; within the press release were hyperlinks to our site and links on certain keywords to specific brands on our site. My question is, should we be sending a different press release to each outlet to avoid the duplicate content issue, or is sending the same release out to everyone OK? We will be sending approx 20 of these out, some going online and some not. So far we've had one local paper website, a massive football website, and a local magazine site, all with pretty much the same content and a few pics. Any help, hints, or tips on how to go about this if I am going to be sending out to a load of other sites/blogs? Cheers
-
Should we deindex duplicate pages?
I work on an education website. We offer programs that run up to 6 times per year. At the moment we have a webpage for each instance of the program, but that's causing duplicate content issues. We're reworking the pages so the majority of the content will be on one page, but we'll still have to keep the application details as separate pages. 90% of the time the application details are going to be nearly identical, so I'm worried that these pages will still be seen as duplicate content. My question is, should we deindex these pages? We don't particularly want people landing on our application page without seeing the other details of the program anyway. But is there a problem with deindexing such a large chunk of your site that I'm not thinking of? Thanks, everyone!
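A middle ground between leaving the application pages indexed and deleting them is noindex,follow, which drops them from the index while keeping their links crawlable; a sketch using an X-Robots-Tag header, with a hypothetical URL pattern and assuming mod_headers:
# Keep near-identical application pages out of the index
# (hypothetical /programs/.../apply path; adjust to the real structure)
<If "%{REQUEST_URI} =~ m#^/programs/.+/apply#">
  Header set X-Robots-Tag "noindex, follow"
</If>
The same header pattern covers the affiliate question below, which asks about publishing unrewritten product pages as noindex,follow.
-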
Affiliate Site Duplicate Content Question
Hi guys, I have been unable to find a definitive answer to this on various forums; your views on this will be very valuable. I am doing a few Amazon affiliate sites and will be pulling in product data from Amazon via a WordPress plugin. The plugin pulls in titles, descriptions, images, prices, etc. However, this presents a duplicate content issue, and hence I cannot publish the product pages with Amazon descriptions. Due to the large number of products, it is not feasible to rewrite all descriptions, but I plan to rewrite descriptions and titles for 50% of the products and publish them with the "index, follow" attribute. However, for the other 50%, what would be the best way to handle them? Should I publish them as "noindex, follow"? Or is there another solution? Many thanks for your time.