Solution for duplicate content not working
-
I'm getting a duplicate content error for:
http://www.website.com/default.htm
I searched for the Q&A for the solution and found:
Open the .htaccess file and add this line:
redirect 301 /default.htm http://www.website.com
I added the redirect to my .htaccess, and then got the following error in Google Chrome when trying to access the http://www.website.com/default.htm page:
"This webpage has a redirect loop. The webpage at http://www.webpage.com/ has resulted in too many redirects. Clearing your cookies for this site or allowing third-party cookies may fix the problem. If not, it is possibly a server configuration issue and not a problem with your computer. Error 310 (net::ERR_TOO_MANY_REDIRECTS): There were too many redirects."
How can I correct this?
Thanks
-
Hi Joseph
The info you sent me looks fine. Have you checked whether users on other browsers or operating systems are having the same issue? Also, have you cleared your cookies, as the warning message advises?
If you upload an .htaccess file containing just that one redirect, do you still get the same issue? If so, it is probably a server configuration problem. It may be worth talking to whoever manages the server within your organisation, or, if there is no one, contacting your web host.
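To make that test concrete, a minimal .htaccess for isolating the problem might look like this (a sketch, assuming the single redirect from the question is the only rule you need to test):

```apache
# Minimal test .htaccess: one redirect and nothing else.
# If this alone still produces the loop, something server-side
# (e.g. a directory index or a script serving default.htm for "/")
# is bouncing the request back, not another rule in this file.
Redirect 301 /default.htm http://www.website.com/
```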
-
It won't hurt to try this; add it before all your other redirects:
<code>Options +FollowSymLinks
RewriteEngine On
RewriteBase /
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /default\.htm[\ ?]
RewriteRule ^default\.htm$ http://www.otherdomain.com/ [R=301,L]</code>
-
PM sent...
-
Hi Lewis,
I tried recreating the file as you said, but no luck yet... I sent you a PM with my .htaccess info; maybe you can see something I don't.
-
Do you mind posting the URL, or PMing it to me?
-
OK, I tried it and got a 200 OK.
-
Please check your HTTP headers: request http://www.website.com and see what it serves you, a 200 OK or a 301 to /default.htm.
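One way to reason about what the headers are telling you is to follow the Location hops one at a time and stop as soon as a URL repeats. Here is a small sketch of that check in Python, with the HTTP responses stubbed out as a dictionary rather than real requests (the URLs and status codes below are assumptions standing in for the situation described in this thread):

```python
def follow_redirects(url, responses, max_hops=10):
    """Follow 301/302 Location hops; return (chain, looped).

    `responses` maps a URL to (status, location_or_None), standing in
    for real HTTP HEAD requests against the server.
    """
    chain = [url]
    seen = {url}
    for _ in range(max_hops):
        status, location = responses[url]
        if status not in (301, 302) or location is None:
            return chain, False          # terminal response, no loop
        url = location
        chain.append(url)
        if url in seen:                  # revisited a URL: redirect loop
            return chain, True
        seen.add(url)
    return chain, True                   # hop budget exhausted: treat as a loop

# The behaviour described above: "/" bounces back to default.htm,
# while .htaccess bounces /default.htm back to "/".
responses = {
    "http://www.website.com/default.htm": (301, "http://www.website.com/"),
    "http://www.website.com/": (301, "http://www.website.com/default.htm"),
}
chain, looped = follow_redirects("http://www.website.com/default.htm", responses)
print(looped)  # True: the loop is detected after two hops
```

If the real server returns a 200 OK for the root URL, the chain ends there and no loop is reported; a loop means two URLs are each redirecting to the other.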
-
Is there anything else in the .htaccess file that may be causing the loop?
If the file looks correct, I would try copying and pasting the text into a new .htaccess file, deleting the original, and seeing if the problem remains.
I have fixed a random issue with htaccess before by effectively recreating the file.
-
Hi Wissam,
I'm not sure; there is no redirect set up for /default.htm in my .htaccess.
I tried adding the RedirectMatch 301 and got back the same error message as before...
-
Hi Joseph,
First, investigate what is redirecting the root to /default.htm: is it the .htaccess or a script?
Then try:
RedirectMatch 301 ^/default\.htm$ http://www.website.com/
Please follow up.
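If it turns out to be the server rather than a script, one common culprit (an assumption worth checking, not a confirmed diagnosis) is that default.htm is the directory index. In that case every request for "/" is internally mapped back to /default.htm, where a plain mod_alias redirect matches again and sends the visitor back to "/", producing the loop:

```apache
# Hypothetical loop: default.htm is the directory index, so "/" is
# internally served as /default.htm...
DirectoryIndex default.htm

# ...and mod_alias sees that internal subrequest too, redirecting it
# straight back to "/". Client-visible result: /default.htm -> / ->
# /default.htm -> ... until the browser gives up.
Redirect 301 /default.htm http://www.website.com/
```

The mod_rewrite rule suggested earlier in the thread avoids this because THE_REQUEST contains only the request line the client literally sent, so internal subrequests for the directory index never match it.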