Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Do I lose link juice if I have an HTTPS site and someone links to me using HTTP instead?
-
We have recently launched an HTTPS site which is getting some organic links, some of which use https and some http. Am I losing link juice on the links that use http, even though I am redirecting, or does Google view them the same way? As most people still naturally link with http, will it look strange to Google if I contact anyone who has given us a link and ask them to change it to https?
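On the redirect mentioned above: equity passes cleanly only when the redirect is permanent (301 or 308) and lands on the exact https equivalent of the linked URL, not a 302/307 or a hop somewhere else. A minimal Python sketch of that check (the helper names and inputs are illustrative, not from this thread):

```python
from urllib.parse import urlsplit, urlunsplit

def https_equivalent(url: str) -> str:
    """Return the same URL with the scheme swapped to https."""
    parts = urlsplit(url)
    return urlunsplit(("https",) + tuple(parts[1:]))

def redirect_preserves_equity(src: str, status: int, location: str) -> bool:
    """A permanent redirect (301/308) straight to the https twin is the
    safe case; a temporary redirect or a hop elsewhere is not."""
    return status in (301, 308) and location == https_equivalent(src)
```

You would feed this the status code and Location header observed for each http backlink target.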
-
Thanks, that was my impression of it as well. Just wanted to check I wasn't overlooking something.
-
Hi Jonathan
Not really. Some say it shortens load times because it shortens the code on the page, but that's minimal. I am looking at it from the standpoint that if you ever change your domain or URL structure, you don't have to go through the tedious task of manually updating internal links.
Does that make sense?
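The point above about not having to rewrite internal links can be sketched in Python: convert absolute internal hrefs to root-relative ones so they survive a domain or protocol change. The function name and host are made up for illustration:

```python
from urllib.parse import urlsplit, urlunsplit

def make_relative(href: str, site_host: str) -> str:
    """Strip scheme and host from internal links; leave external links alone."""
    parts = urlsplit(href)
    if parts.netloc and parts.netloc != site_host:
        return href  # external link: keep as-is
    # internal link: keep only path, query and fragment
    return urlunsplit(("", "", parts.path or "/", parts.query, parts.fragment))
```

Run over a page's hrefs, `https://example.com/about?x=1` becomes `/about?x=1`, while links to other hosts pass through untouched.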
-
Is there any reason to use relative URLs from an SEO perspective?
-
Hi Hector
This is something I wouldn't be too concerned about. Do the following and you will be fine:
- Ensure all pages follow the new https structure
- Serve no http version
- Make sure your site is consistently www. or non-www.
- Make sure you have no redirect chains
- Update any old redirect files
- Make sure resources are secure as well (CSS files / JavaScript / widgets / images / etc.)
- Make sure canonical tags reflect the URL change
- Update your sitemap.xml
- Resubmit sitemap.xml in Google Webmaster Tools / Bing Webmaster Tools
- Correct all of your internal links to reflect the new structure
- Consider using relative URLs
- If you want, find your high-quality backlinks using Moz or Majestic and reach out to ask that they be updated to the https structure
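The "no redirect chains" item above can be checked offline if you can export your redirect rules as a source-to-target map. A hedged sketch, assuming a simple `{source: target}` dict format:

```python
def find_chains(redirects: dict) -> list:
    """Return sources whose target is itself redirected again,
    i.e. chains of length > 1."""
    return sorted(src for src, dst in redirects.items() if dst in redirects)

def flatten(redirects: dict) -> dict:
    """Collapse each chain so every source points at its final target."""
    flat = {}
    for src, dst in redirects.items():
        seen = {src}
        while dst in redirects and dst not in seen:  # guard against loops
            seen.add(dst)
            dst = redirects[dst]
        flat[src] = dst
    return flat
```

Anything `find_chains` reports should be rewritten to redirect straight to the final https URL.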
Again, this isn't a huge issue, as Google will pass 90-99% of link equity through a 301 redirect, and https is now a ranking factor. You should be all good, but run through the list above and you can tie up any loose ends. Here's a great resource from Moz.
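For the sitemap items in the list, here is a minimal sketch that emits a sitemap.xml containing only the new https URLs (the URL set is illustrative):

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Render a minimal sitemap.xml for the given absolute https URLs."""
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )
```

Generate it from your canonical https URL list, then resubmit in Google/Bing's webmaster tools as noted above.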
Hope this helps! Good luck!