Do backlinks need to be clicked to pass link juice?
-
Hi all:
Do backlinks need to be clicked to pass link juice? If so, can someone explain how much traffic a backlink needs to send before it counts as link juice?
Thanks for the help.
Audrey.
-
Backlinks do not have to be clicked in order to pass link juice. Recently my org (missionquest.org) joined Moz, and it helped us with our backlinks and improved our SEO.
-
I would be surprised.
Google knows a lot, but not everything. Unless GA tracking code is installed, Google has no way of knowing about things such as a user click.
If they passed link juice only for clicked backlinks, they would be ruling out too big a chunk of the web. It doesn't sound logical to me.
It also doesn't sound realistic to analyze every user click in the world when refreshing the Google index. They do have a lot of metal, but not that much.
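For what it's worth, the original PageRank formulation supports this: rank is computed purely from the link graph, and no click or traffic data appears anywhere in the math. Here's a minimal sketch (hypothetical page names, simplified iterative model):

```python
# Minimal PageRank sketch -- illustrative only, with a made-up graph.
# Note that rank flows entirely from the link structure; clicks and
# traffic never enter the computation.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly across all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

graph = {
    "popular-blog": ["your-site", "other-site"],
    "other-site": ["your-site"],
    "your-site": ["popular-blog"],
}
ranks = pagerank(graph)
# "your-site" has two inbound links, "other-site" only one:
print(ranks["your-site"] > ranks["other-site"])  # True
```

The page names and graph are invented for illustration, but the structure of the calculation is the point: a backlink contributes rank the moment it exists in the crawled graph, clicked or not.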
-
So, are you saying that a link having traffic kind of disqualifies it as spammy, at least in Google's eyes?
-
Absolutely not. Spam links still work fantastically well for ranking a site (temporarily). Those are links that never get seen or clicked; they pretty much just get crawled. Don't go the spam route, but also don't worry too much about people clicking links. I've gotten a ton of great links that have sent very, very little referral traffic, meaning even links on popular posts don't guarantee any/many clicks.
-
I don't think so. I usually fetch and render, then submit my pages any time I add one to my site or make a significant change, like adding content or changing images. Nothing unnatural about it.
-
Good idea. I wonder if it would seem "unnatural," however?
-
Submitting the page to Google for Indexing doesn't guarantee that the backlinks will be crawled, but it can be a good way to try to force them to be crawled.
-
In that case, wouldn't it be ideal to submit the page to Google for indexing right after it's published?
-
I think it's about page popularity and user engagement. Popularity in search results means a lot of spider activity on the page. And when a user clicks the link, a spider follows them to the new page. In the end, it's all about the spider discovering your page and your link (as I understand it).
-
In fact, it's not like that.
I will tell you a very important rule about backlinks that is really hard to find. The main point is that the link needs to be discovered by Google. Also, the page that contains the link must be popular in Google search results, meaning a lot of people enter the page through search results. This is what we call "the quality of the link."
Keep up with your link building journey.
-
The way I understand it is that a click helps the link get found faster than if it had not been clicked. It might have equity and pass link juice beforehand, but until Google finds it, it might not be counted as a link to your site. Does that make sense? The link needs to be discovered before the link juice is actually counted. At least that is the way I understand it.
I do know a few professionals who believe that if a link isn't clicked, link juice is never passed. I don't know if that is necessarily true. It makes sense that a link could be discovered but not carry any equity because it isn't being used. I wonder if someone has a better idea of whether that is true, or if it's another secret Google keeps.
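To illustrate the discovery point above: in a simple crawler model, a link can only be counted once the crawler has actually reached the page that contains it; clicks never enter into it. A rough sketch, with made-up page names:

```python
# Hypothetical crawl sketch -- discovery, not clicking, is the gate.
# A link contributes nothing until the crawler reaches the page
# that contains it.

from collections import deque

def discovered_links(seed_pages, links):
    """BFS from seed pages; returns the set of (source, target)
    links the crawler actually sees."""
    seen_pages = set(seed_pages)
    found_links = set()
    queue = deque(seed_pages)
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            found_links.add((page, target))
            if target not in seen_pages:
                seen_pages.add(target)
                queue.append(target)
    return found_links

web = {
    "homepage": ["blog-post"],
    "blog-post": ["your-site"],
    "orphan-page": ["your-site"],  # nothing crawled ever links here
}
found = discovered_links(["homepage"], web)
print(("blog-post", "your-site") in found)    # True: crawler reached it
print(("orphan-page", "your-site") in found)  # False: link exists but is never found
```

This is only a toy model, but it captures why "fetch and render, then submit" can matter: it seeds discovery of the page so the links on it get seen at all.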