What is the best way to fix HTTP/HTTPS duplication?
-
Hi All!
I have a doubt about how to fix the HTTP/HTTPS duplication problem. What is the best way to fix it, in your opinion/experience? I, for instance, have chosen to put "noindex, nofollow" on the HTTPS version. Each page of my site has an HTTPS version, so I put this meta robots tag into it. But I'm not sure what happens to all the backlinks pointing at HTTPS URLs — I've just checked, and I have some. What do you think?
Thanks in advance for helping!
-
You bet - please mark my answer as Good response!
-
Hi Wissam and CleverPhD!
I totally agree with you; I'm actually going to implement this solution.
A "noindex, nofollow" meta robots tag isn't effective at all if you already have a lot of HTTPS URL versions indexed.
Thanks!!!
Francesca
-
The best way to approach this is to 301 redirect HTTPS to HTTP via .htaccess (Apache) or web.config (IIS).
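For reference, a minimal .htaccess sketch of that redirect (assuming Apache with mod_rewrite enabled; adjust the host and path handling to your own setup):

```apache
# Send every HTTPS request to the HTTP version with a 301 (permanent) redirect.
RewriteEngine On
# Only act when the request arrived over HTTPS
RewriteCond %{HTTPS} on
# Redirect to the same host and path on plain HTTP
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
```

On IIS, the web.config equivalent is a URL Rewrite rule whose action uses type="Redirect" with redirectType="Permanent".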
-
301 redirects are the way to go. If a user lands on the wrong version, they are forwarded to the correct one, and link equity is passed as well.
The noindex, nofollow tag will help get whichever version you want out of the SERPs, but it does not prevent the duplication from happening again.
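For clarity, the meta robots tag being discussed looks like this — a hypothetical snippet, placed in the `<head>` of each page you want dropped from the index:

```html
<!-- Tells crawlers not to index this page and not to follow its links -->
<meta name="robots" content="noindex, nofollow">
```

Note that unlike a 301, this only removes the page from the index; it passes no link equity to the version you want to keep.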
Related Questions
-
Duplicate Version of My Website
Hello again. I'm looking for a little help understanding what exactly is going on here. I've taken over maintenance of a website and have so far fixed a lot of issues. Ahrefs has shown me that a second version of my company's website exists at a second URL, and this second website is linked to the actual company website in a way I haven't seen before. www(dot)#(dot)co(dot)uk is the main company website, but a second accessible version exists at www(dot)#(dot)co(dot)uk. The instruments version is a direct copy, and all of its links point directly to my main site. Any changes I make on the main version are automatically applied to the other version. It shows up as a spam backlink on Moz, since all of its links point to my website. Ideally, in my mind, the instruments version's homepage should simply redirect to the main homepage to solve this "duplicate content and spammy backlink" issue; however, the instruments version uses the same suffix that all our company emails work with. Basically, HELP, lol. I have no understanding of how this is set up, the best way to deal with it, or whether it could affect anything such as company emails.
Technical SEO | ATP
-
Which URL structure is better?
Quick question: I have a real estate site focused on "apartments", but "apartments" is not part of my company name. That being said, which of the following URL structures should I use? http://website.com/city/neighborhood/property-name OR http://website.com/city-apartments/neighborhood/property-name
Technical SEO | ChaseH
-
Duplicate content for vehicle inventory.
Hey all, I'm in the automotive industry. When uploading vehicle inventory to a website, I'm concerned about duplicate content issues. For example, one vehicle is uploaded to the main manufacturer's website, then again to the actual dealership's website, then again to Craigslist, and sometimes even to a group site. The information is all the same: description, notes, car details, and images. What would you all recommend for alleviating duplicate content issues? Should I be using rel=canonical back to the manufacturer's website? Once the vehicle is sold, all pages disappear. Thanks so much for any advice.
Technical SEO | DCochrane
-
How to protect against duplicate content?
I just discovered that my company's dev website (which mirrors our actual website, and is where we stage content before we publish it to the actual website) is being indexed by Google. My first thought is that I should add a rel=canonical tag pointing to the actual website, so that Google knows this duplicate content from the dev site is to be ignored. Is that the right move? Are there other things I should do? Thanks!
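For illustration, a rel=canonical tag on a dev page pointing at its live counterpart would look like this (the URL is a placeholder, and a canonical alone may not be enough — dev sites are usually also blocked from crawling or indexing entirely):

```html
<!-- On the dev copy of a page, pointing search engines at the live version -->
<link rel="canonical" href="https://www.example.com/some-page/">
```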
Technical SEO | williammarlow
-
Duplicate Content
Hi, we need some help resolving this duplicate content issue. We have redirected both domains to this Magento website, and I guess Google now considers this duplicate content. Our client wants both domain names to go to the same Magento store. What is the safe way of letting Google know these are the same company? Or is this not an ideal thing to do? Thanks
Technical SEO | solution.advisor
-
Fixing Crawl Errors
Hi! I moved my WordPress blog back in August and lost much of my site traffic. I recently found over 1,000 crawl errors in Webmaster Tools because some of my redirects weren't transferred, so we are working on fixing the errors and letting Google know. I'm wondering how long I should expect it to take for Google to recognize that the errors have been fixed and for the traffic to start returning? Thanks! Jodi - momsfavoritestuff.com
Technical SEO | JodiFTM
-
Duplicate homepage content
Hi, I recently did a site crawl using the SEOmoz crawl test. My homepage seems to have 3 cases of duplicate content. These are the URLs: www.example.ie/ www.example.ie/%5B%7E19%7E%5D www.example.ie/index.htm Does anyone have any advice on this? What impact does this have on my SEO?
Technical SEO | Socialdude
-
Duplicate content conundrum
Hey Mozzers - I have a tricky situation with one of my clients. They're a reputable organization and have been mentioned in several major news articles. They want to create a Press page on their site with links to each article, but they want viewers to remain within the site and not be redirected to the press sites themselves. The other issue is that some of the articles have been removed from the original press sites where they were first posted. I want to avoid duplicate content issues, but I don't see how to repost the articles within the client's site. I figure I have 3 options: 1. Create PDFs (w/ SEO-friendly URLs) with the articles embedded in them that open in a new window. 2. Post an image with a screenshot of the article on a unique URL w/ brief content. 3. Copy and paste the article to a unique URL. If anyone has experience with this issue or any suggestions, I would greatly appreciate it. Jaime Brown
Technical SEO | JamesBSEO