How to Resolve Duplication of HTTPS & HTTP URLs?
-
Right now, I am working on an eCommerce website, Lamps Lighting and More. The site is accessible at both of the following URLs.
HTTP Version:
http://www.lampslightingandmore.com/
HTTPS Version:
https://www.lampslightingandmore.com/
I checked one of my competitors, who has implemented the following canonical on both pages. Please view the source code of both URLs.
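For illustration (this is hypothetical markup, not copied from the competitor's source), a canonical of this kind would sit in the `<head>` of each page and be served identically on the HTTP and HTTPS URL:

```html
<!-- Same tag on both http:// and https:// versions of the page,
     pointing search engines at the one preferred (HTTP) URL -->
<link rel="canonical" href="http://www.lampslightingandmore.com/" />
```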
Then I checked the same thing on the SEOmoz website. Why not check SEOmoz? They provide the best SEO information, so they may be using best practices to deal with HTTPS & HTTP. LOL
When I tried to load the following URL, it redirected to the home page.
https://www.seomoz.org is redirecting to http://www.seomoz.org
But the following URL neither redirects anywhere nor sets a canonical.
https://www.seomoz.org/users/settings
I found the following code at http://www.seomoz.org/robots.txt
User-agent: *
Disallow: /api/user?
So I am quite confused about how to solve this issue. Which one is best: a 301 redirect or a canonical tag? A live example to look at would help me feel more confident.
-
I have set up robots.txt files for both the HTTP and HTTPS versions. You can find both files above your response. Thanks for your answer.
-
Our solution to this was to make sure we had a canonical on each and every page pointing to the http:// version.
Secondly, https:// was only made available after logging in.
-
Yep
-
Now it looks fine... right??
-
You are right. I was seriously confused after reading an article about duplication. I checked my website, found both HTTPS and HTTP pages, and started asking questions in that direction.
-
So, what about the canonical tag? I am still confused about it. What is the ultimate conclusion? I have already made it live on one website after getting the suggestion here.
Does anyone have eCommerce experience that would help me understand this better? What is the best solution in my case? My goal is to remove duplication on the website and improve the crawl rate.
-
Honestly, I believe you're mixing things up. 1st > choose a canonical version for your site (www or not). Sometimes absolute URLs can cause problems for the HTTPS version of a site. 2nd > consider whether you really want the HTTPS version indexed... If not, add noindex or block it via robots.txt. If yes, use the HTTP URL of the page as the canonical on its HTTPS counterpart.
-
I would use noindex for the HTTPS version of the site, or block it via robots.txt, if I didn't want it to be indexed.
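Since a single robots.txt file can't distinguish protocols by itself, one way to block only the HTTPS version is to serve a separate, more restrictive robots.txt to HTTPS requests. A sketch for Apache with mod_rewrite (the `robots_https.txt` file name is just an example):

```apacheconf
# .htaccess: answer requests for robots.txt with a different file
# when the request arrives over HTTPS
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^robots\.txt$ robots_https.txt [L]
```

The `robots_https.txt` served to the HTTPS hostname would then disallow everything:

```
User-agent: *
Disallow: /
```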
-
I want to add a similar thought bubble to this question.
http://www.lampslightingandmore.com/
https://www.lampslightingandmore.com/
I made the canonical tag live after the discussion here. But I am confused about relative vs. absolute URLs.
I am using absolute URLs in the canonical tag, but the website's internal links are relative URLs.
So, does that create any issue or stop the canonical tag from delivering its benefit?
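For what it's worth, the canonical href itself is normally written as an absolute URL regardless of whether the site's internal links are relative; the two can coexist. A hypothetical product page (the path here is made up) might look like:

```html
<!-- Internal links can stay relative... -->
<a href="/table-lamps">Table Lamps</a>

<!-- ...while the canonical names one absolute, preferred URL -->
<link rel="canonical" href="http://www.lampslightingandmore.com/some-product" />
```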
-
Yes, I don't want my HTTPS pages crawled, and I don't want the HTTPS and HTTP pages to create duplication.
-
My question is along the same lines. So why has Wayfair set a canonical on their website?
-
But you don't want your HTTPS pages crawled if the same version is available over HTTP. This is mostly a technical issue, but crawling an HTTPS site is considerably more expensive for both the bot and the server.
-
How to Resolve Duplication of HTTPS & HTTP URLs?
Neither a redirect nor a canonical tag is necessary.
HTTP, HTTPS, FTP, etc. are various protocols used to access information contained on your web server. The data itself is only instanced once, but you can access it via these various protocols. It is not a duplication of data and will not cause any SEO issues.
-
A 301 redirect doesn't exclude a canonical. If you just want to use one solution, use the 301. There was a YouMoz post about exactly this topic a while ago; have a look at it.
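As a sketch of that 301 approach (Apache .htaccess with mod_rewrite assumed; swap in your own hostname), every HTTPS URL can be redirected to its HTTP counterpart:

```apacheconf
# 301-redirect any HTTPS request to the same path over HTTP
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://www.lampslightingandmore.com/$1 [R=301,L]
```

You would still want to exempt any pages that genuinely need HTTPS (login, checkout) from this rule.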