HTTPS Duplicate Content
-
My previous host was using shared SSL, so my site was also accessible over https, which I hadn't noticed previously.
I have now moved to a new server where I don't have SSL, and my websites no longer work over https.
The problem is that I have found Google has indexed one of my blogs, http://www.codefear.com, with the https version too. My blog traffic is continuously dropping, I think due to this duplicate content. There are now two results: one for the http version and another for the https version.
I searched the internet and found three possible solutions:
1. No-index the https version
2. Use rel=canonical
3. Redirect the https versions with a 301 redirect
Now I don't know which solution is best for me, since the https version is no longer working.
One more thing: I don't know how to implement any of these solutions. My blog runs on WordPress.
Please help me overcome this problem. Also, after solving this duplicate-content issue, do I need to send a reconsideration request to Google?
Thank you
-
Unfortunately, I think the 301 redirect may not only be the best option, but the only option. You can't NOINDEX or use rel=canonical if the https pages aren't resolving, so you would have to do the redirects in .htaccess.
Unfortunately, messing with .htaccess for anything sitewide can get pretty tricky, and a little dangerous. I believe the code in this thread should work, but I'm not sure about any quirks in WordPress or the specifics of your site's implementation:
http://www.highrankings.com/forum/index.php/topic/35833-how-do-i-redirect-https-to-http-on-my-site/
You may want a WordPress specialist to take a look.
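For reference, here is a minimal sketch of the kind of rule that thread describes: a sitewide 301 from https to http, assuming Apache with mod_rewrite enabled. This is an illustration, not a drop-in fix; test it carefully on your own setup, since WordPress maintains its own rewrite block in the same .htaccess file:

```apache
# Hypothetical sitewide https -> http 301 redirect.
# Place this ABOVE the "# BEGIN WordPress" block in .htaccess.
# Assumes Apache with mod_rewrite enabled.
RewriteEngine On
# %{HTTPS} is "on" when the request arrived over SSL
RewriteCond %{HTTPS} on
# Permanently redirect to the same host and path on plain http
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
```

Note that for this to take effect, requests to the https URLs still have to reach your server in the first place; if they simply fail to connect, the redirect never fires, which is part of why a specialist should sanity-check your specific setup.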
-
A redirect is the best option for you, I believe: you will pass the majority of the link juice if anybody has linked to the https version.
You will need to contact your hosting provider to get this implemented. Alternatively, you can simply resubmit your http site for indexing in Google via Google Webmaster Tools and see how that goes.