Our site is on a secure server (https). Will a link to http:// be of less value?
-
Our site is hosted on a secure server (i.e. our web address is https://www.workbooks.com).
Will a backlink pointing to http://www.workbooks.com provide less value than a link pointing to https://www.workbooks.com?
Many thanks,
Sam
-
I've looked into the issue and it seems there are no duplication issues in Moz, so I'm going to consider everything okay.
-
Thanks will look at getting this sorted.
Sam
-
It doesn't really matter what Moz shows; it matters what Google does. Google typically treats http and https as separate sites, and the same goes for www and non-www. As mentioned above, pick one and build authority around it. Since HTTPS is now a ranking signal, I would stick with that one.
-
Thanks,
However, Moz doesn't show any duplicate content issues with my site, so I assume we don't have a problem here.
Do you still consider the http links to be an issue? I'm sure the links are benefiting us.
Sam
-
Hi,
You can have either http or https, but not both, because having both versions indexed would be treated as duplicate content. I recommend keeping the https version because Google has confirmed HTTPS is a ranking signal.
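For example, while the migration is in progress, each page can declare the https URL as its preferred version with a canonical tag (a minimal sketch; the page path here is illustrative):

```html
<!-- Placed in the <head> of every page: tells search engines that
     the https URL is the canonical version of this page -->
<link rel="canonical" href="https://www.workbooks.com/example-page" />
```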
Please also read this @ https://moz.com/ugc/solving-duplicate-content-issues-with-http-and-https
Hope this helps.
Thanks
-
Thanks,
What references/sources do you have backing up the claim that we should keep only https rather than both http and https?
Would love to read more.
Sam
-
Hi,
Your website opens on both http and https. You should keep only one version, in your case https, and redirect (301) the http version to https.
Since you intend to keep the https version, acquire new links pointing to the https URLs, not the http ones.
As for old links that point to the http version: those links will pass link equity only if you 301-redirect the http version to https; otherwise their value will be lost.
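On an Apache server, the 301 redirect described above is commonly set up in `.htaccess` (a sketch assuming mod_rewrite is enabled; the exact rules depend on your hosting configuration):

```apache
# Force all http:// requests to the https:// equivalent
# with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

After enabling the redirect, a request to the http URL should return a 301 status with a `Location` header pointing at the https URL, which is what allows existing backlinks to pass their equity to the secure version.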
Hope this helps.
Thanks