Solution for duplicate content not working
-
I'm getting a duplicate content error for:
http://www.website.com/default.htm
I searched the Q&A for a solution and found:
Access the .htaccess file and add this line:
redirect 301 /default.htm http://www.website.com
I added the redirect to my .htaccess, and then got the following error in Chrome when trying to access the http://www.website.com/default.htm page:
"This webpage has a redirect loop
The webpage at http://www.webpage.com/ has resulted in too many redirects. Clearing your cookies for this site or allowing third-party cookies may fix the problem. If not, it is possibly a server configuration issue and not a problem with your computer."
"Error 310 (net::ERR_TOO_MANY_REDIRECTS): There were too many redirects."
How can I correct this?
Thanks
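The loop in that error can be reproduced outside the browser. Below is a minimal sketch (the URLs and rule table are hypothetical, not your actual server config) of how a 301 from /default.htm to /, combined with the root redirecting back to /default.htm, trips the browser's redirect cap:

```python
def follow_redirects(start, rules, max_hops=20):
    """Follow redirect rules from `start`; return the chain of URLs visited.

    Raises RuntimeError if more than `max_hops` redirects occur, mirroring
    the browser's ERR_TOO_MANY_REDIRECTS behaviour.
    """
    chain = [start]
    url = start
    for _ in range(max_hops):
        target = rules.get(url)
        if target is None:
            return chain  # reached a page that serves content (200 OK)
        chain.append(target)
        url = target
    raise RuntimeError("too many redirects: " + " -> ".join(chain[:4]) + " ...")

# Broken setup: /default.htm 301s to /, and / redirects back to /default.htm
# (instead of serving it internally as a DirectoryIndex).
broken = {"/default.htm": "/", "/": "/default.htm"}
# follow_redirects("/default.htm", broken)  # raises RuntimeError

# Fixed setup: / serves default.htm internally, so there is no rule for "/".
fixed = {"/default.htm": "/"}
# follow_redirects("/default.htm", fixed)   # returns ["/default.htm", "/"]
```

If your root URL serves default.htm via another redirect rather than internally, the single `Redirect 301` line is enough to produce exactly this cycle.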
-
Hi Joseph
The info you sent me looked fine. Have you checked whether users on other browsers or operating systems are having this issue? Also, have you cleared your cookies as the warning message advises?
If you upload an .htaccess file containing just the one redirect, do you still get the same issue? If so, that would suggest a server problem. In that case it may be worth talking to whoever in your organisation manages the servers, or, if there is no one, contacting your web host.
-
It won't hurt to try this; add it before all your other redirects. The condition tests %{THE_REQUEST}, the original request line from the client, so the rule fires only on direct requests for /default.htm and not on the internal rewrite that serves the page for the root URL, which is what breaks the loop:
<code>Options +FollowSymLinks
RewriteEngine On
RewriteBase /
RewriteCond %{THE_REQUEST} ^[A-Z]{3,}\s/default\.htm\sHTTP/
RewriteRule ^default\.htm$ http://www.otherdomain.com/ [R=301,L]</code>
(Note that THE_REQUEST starts with the HTTP method, e.g. "GET /default.htm HTTP/1.1", so the pattern has to allow for it; a pattern starting with ^/default.htm would never match.)
-
PM sent...
-
Hi Lewis,
I tried recreating the file as you said, but no luck yet... I sent you a PM with my .htaccess info; maybe you can see something I don't.
-
Do you mind posting the URL, or PMing it to me?
-
Ok, tried it and got a 200 OK
-
Please check your HTTP headers: request http://www.website.com
and see what it serves you, a 200 OK or a 301 to /default.htm.
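One way to do that check, as a rough Python sketch (http.client does not follow redirects automatically, so the raw status code and Location header stay visible; the host and path below are placeholders, not your real server):

```python
import http.client

def head_status(host, path="/"):
    """Send a HEAD request and return (status, Location header or None)."""
    conn = http.client.HTTPConnection(host, timeout=10)
    conn.request("HEAD", path)
    resp = conn.getresponse()
    location = resp.getheader("Location")
    conn.close()
    return resp.status, location

def describe(status, location):
    """Human-readable summary of what the server served."""
    if 300 <= status < 400 and location:
        return f"{status} redirect -> {location}"
    return f"{status} (served directly)"

# Example usage (requires network access):
# print(describe(*head_status("www.website.com", "/")))
```

If the root URL reports a 301 pointing at /default.htm, that redirect plus the one you added in .htaccess is the loop.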
-
Is there anything else in the .htaccess file that may be causing the loop?
If the file looks correct, I would try copying and pasting the text into a new .htaccess file, deleting the original, and seeing if the problem remains.
I have fixed a random issue with htaccess before by effectively recreating the file.
-
Hi Wissam,
I'm not sure; there is no redirect set up for /default.htm in my .htaccess.
I tried adding the RedirectMatch 301 and got back the same error message as before...
-
Hi Joseph,
First, investigate what is redirecting the root to /default.htm: is it .htaccess or a script?
Try using:
redirectmatch 301 /default.htm http://www.website.com
Please follow up.
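One caveat with RedirectMatch: its first argument is a regular expression, and the unanchored pattern above matches more than just the exact page. A quick pure-regex illustration of the difference (Python's `re`, not Apache itself):

```python
import re

# Unanchored, as in the suggestion above: "." matches any character,
# and the pattern matches anywhere in the URL path.
pattern_loose = re.compile(r"/default.htm")

# Anchored and escaped: matches only the exact top-level page.
pattern_strict = re.compile(r"^/default\.htm$")

assert pattern_loose.search("/sub/default.htm")   # also matches deeper paths
assert pattern_loose.search("/defaultxhtm")       # "." matches the "x" here
assert pattern_strict.search("/default.htm")
assert not pattern_strict.search("/sub/default.htm")
```

Using the anchored form in RedirectMatch avoids accidentally redirecting other URLs that happen to contain "default.htm".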
Related Questions
-
Do mobile and desktop sites that pull content from the same source count as duplicate content?
We are about to launch a mobile site that pulls content from the same CMS, including metadata. They both have different top-level domains, however (www.abcd.com and www.m.abcd.com). How will this affect us in terms of search engine ranking?
Technical SEO | ovenbird
-
Is this duplicate content?
All the pages have the same information but the content is a little bit different. Is this low quality and considered duplicate content? I am only trying to make service pages for each city; is there any other way of doing this?
http://www.progressivehealthofpa.com/brain-injury-rehabilitation-pennsylvania/
http://www.progressivehealthofpa.com/brain-injury-rehabilitation-new-york/
http://www.progressivehealthofpa.com/brain-injury-rehabilitation-new-jersey/
http://www.progressivehealthofpa.com/brain-injury-rehabilitation-connecticut/
http://www.progressivehealthofpa.com/brain-injury-rehabilitation-maryland/
http://www.progressivehealthofpa.com/brain-injury-rehabilitation-massachusetts/
http://www.progressivehealthofpa.com/brain-injury-rehabilitation-philadelphia/
http://www.progressivehealthofpa.com/brain-injury-rehabilitation-new-york-city/
http://www.progressivehealthofpa.com/brain-injury-rehabilitation-baltimore/
http://www.progressivehealthofpa.com/brain-injury-rehabilitation-boston/
Technical SEO | JordanBrown
-
Duplicate Page Content / rel=canonical
My SEOmoz crawl is showing duplicate content on my site. What is showing up are two articles I submitted to Submit Your Article (an article submission service). I put their code into my pages, i.e. "<noscript><b>This article will only display in JavaScript enabled browsers.</b></noscript>". So do I need to delete these blog posts since they are showing up as duplicate content? I am having a difficult time understanding rel=canonical. Isn't this for duplicate content within one site, so I could not use rel="canonical" in this instance? What is the best way to feature an article or press release written for another site, but that you want your clients to see? Rewriting seems ridiculous for a small business like ours. Can we just present the link? Thank you.
Technical SEO | RoxBrock
-
Multiple Sites Duplicate Content Best Practice
Hi there, I have one client (atlantawidgets.com) who has a main site, but also has duplicate sites with different URLs targeting specific geo areas, i.e. (widgetmakersinmarietta.com). 1) Would it be best to create a static home page at these additional sites and make the rest of each site noindexed? 2) Or should I allow more pages to be indexed and change the content? If so, how many: 3, 5, 8? I don't have tons of time at this point. 3) If I change content within the duplicate sites, what % do I need to change? Does switching the order of the sentences of the content count, or does it need to be 100% fresh? Thanks everyone.
Technical SEO | greenhornet77
-
Duplicate Page Content for sorted archives?
Experienced backend dev, but SEO newbie here 🙂 When SEOmoz crawls my site, I get notified of DPC errors on some list/archive sorted pages (appending ?sort=X to the url). The pages all have rel=canonical to the archive home. Some of the pages are shorter (have only one or two entries). Is there a way to resolve this error? Perhaps add rel=nofollow to the sorting menu? Or perhaps find a method that utilizes a non-link navigation method to sort / switch sorted pages? No issues with duplicate content are showing up on google webmaster tools. Thanks for your help!
Technical SEO | jwondrusch
-
How to avoid duplicate content penalty when our content is posted on other sites too ?
For recruitment company sites, job ads are posted multiple times on their own sites and on other sites too. The same ads (with the same job description) are posted on different sites. How do we avoid a duplicate content penalty in this case?
Technical SEO | Personnel_Concept
-
Duplicate Page Titles and Content
I have a site that has a lot of contact modules. So basically each section/page has a contact person and when you click the contact button it brings up a new window with form to submit and then ends with a thank you page. All of the contact and thank you pages are showing up as duplicate page titles and content. Is this something that needs to be fixed even if I am not using them to target keywords?
Technical SEO | AlightAnalytics
-
Why are my pages getting duplicate content errors?
Studying the Duplicate Page Content report reveals that all (or many) of my pages are getting flagged as having duplicate content because the crawler thinks there are two versions of the same page: http://www.mapsalive.com/Features/audio.aspx http://www.mapsalive.com/Features/Audio.aspx The only difference is the capitalization. We don't have two versions of the page so I don't understand what I'm missing or how to correct this. Anyone have any thoughts for what to look for?
Technical SEO | jkenyon