Linkbuilding to http/https
-
Hello,
Last year we moved our site from http to https. In the .htaccess we included a 301 redirect from http to https.
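The rule looks roughly like this (simplified sketch; it assumes Apache with mod_rewrite enabled, and www.domain.ext is a placeholder for our real domain):

```apache
# Simplified blanket http -> https 301 (assumes mod_rewrite is enabled;
# www.domain.ext stands in for the real domain)
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.domain.ext/$1 [R=301,L]
```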
Nevertheless, some pages do not let you link to https, so for link building we sometimes need to use http://www.domain.ext instead of https://www.domain.ext.
Would this be a problem? I would expect that the 301 redirect covers us here, but I would like to be sure.
Alternatively, I could make the site accessible over plain http as well, but I don't think that is a good idea with regard to duplicate content, is it?
Thanks.
Tom
-
No, you can do it, but you'll run into some technical issues. For instance, try running this tool on one of your URLs and you'll see it gets rejected.
If you're only making the site https because it was technically easier to implement, then I would suggest putting in the time to set the site up right: force https only where secure pages are needed, and force http when coming off of those secure pages.
http://www.internetofficer.com/seo-tool/redirect-check/
Response
Checked link: https://site.com
This link does not work. Error message: Invalid protocol (URL must start with http://)
-
True. Nevertheless, is there any problem with having the whole site run under https, with inbound links pointing to both http and https, as long as http has a permanent redirect to https?
There is no way Google can index pages on http, as it will always redirect to https.
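As a quick sanity check of what the blanket redirect does, here is a small self-contained Python sketch: it spins up a local server that mimics our site-wide 301 (www.domain.ext is a placeholder for the real https target) and confirms that a request for an http URL comes back as a 301 pointing at the https counterpart:

```python
# Sketch: simulate a site-wide 301 from http to https locally and verify
# that an http URL answers with a 301 pointing at its https counterpart.
# (www.domain.ext is a placeholder; the local server stands in for the real site.)
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class BlanketRedirect(BaseHTTPRequestHandler):
    def do_GET(self):
        # Redirect every path to the same path on the https domain
        self.send_response(301)
        self.send_header("Location", "https://www.domain.ext" + self.path)
        self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), BlanketRedirect)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# Don't follow the redirect -- we want to inspect the 301 itself.
class NoFollow(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args, **kwargs):
        return None

opener = urllib.request.build_opener(NoFollow)
try:
    opener.open(f"http://127.0.0.1:{port}/some-page")
    code, location = None, None
except urllib.error.HTTPError as err:
    code, location = err.code, err.headers["Location"]

server.shutdown()
print(code, location)
```

Any crawler asking for an http URL sees exactly this: a 301 with an https Location header, so there is nothing left on http to index.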
-
Why was the decision made to move from http to https? Most sites only use https for their secure pages, such as logins. I'm guessing it was "easier to implement technically", which is why it was set up that way.
-
Thanks for your answer!
We currently have a 301 permanent redirect for the entire http domain, so the site is only accessible over https. Is it therefore an option to include a canonical tag that points to https?
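I mean something like this in the head of every page, pointing at the page's https URL (a sketch; the domain and path are placeholders):

```html
<!-- Self-referencing canonical on the https version (placeholder domain/path) -->
<link rel="canonical" href="https://www.domain.ext/some-page" />
```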
-
Hi Tom, you definitely have a duplicate content issue. It has always been a struggle to optimize an https site, but the canonical feature search engines now support can solve it.
Duplication can be solved using canonicals. The canonical tag helps bots understand which version of a page should rank. I recommend using the http URL, which is the friendlier one and the easiest to build links to; you can leave the https URL for the transactional portions of the website, which is where https is really needed.
Read this article, though; it helped me a lot while optimizing such a website.