How not to lose link juice when linking to thousands of PDF guides?
-
Hi All,
I run an e-commerce website with thousands of products.
On each product page I have a link to a PDF guide for that product. Currently we link to it with a "nofollow" tag.
Should we change it to window.open so that we don't lose link juice?
Thanks
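For context, here is a minimal sketch of the two linking approaches being weighed in this thread. The URL, class name, and function names are illustrative placeholders, not anything from the site in question:

```javascript
// Sketch: two ways to link a product PDF (placeholder URL).
// 1. A normal anchor with rel="nofollow" -- crawlers still see the URL,
//    they just don't pass link equity through it.
// 2. Markup with no href at all; the URL lives in a data attribute and
//    would be opened by script (window.open) on click, so there is no
//    crawlable link to the PDF in the page source.

function nofollowLink(url, text) {
  return `<a href="${url}" rel="nofollow">${text}</a>`;
}

function scriptOnlyLink(url, text) {
  // In the browser, a click handler on .pdf-link would call
  // window.open(el.dataset.pdf).
  return `<span class="pdf-link" data-pdf="${url}">${text}</span>`;
}

const url = "/guides/product-123.pdf"; // placeholder path

console.log(nofollowLink(url, "Product guide"));
console.log(scriptOnlyLink(url, "Product guide"));
```

The trade-off: the nofollow version still exposes a crawlable URL, while the script-only version hides the URL from crawlers entirely, but also from users with JavaScript disabled.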
-
Dear Egol,
I'm assuming from your answer that PDFs take link juice and JPGs don't (please correct me if I'm wrong).
Is that the case even if I have to enlarge the image? (I use href with jQuery rather than a regular target="_blank".)
Also, what should I do about the certificates that are on other sites? (There are too many of them, and they change too rapidly, for me to bring them onto my site.)
-
Thanks,
I was assuming that these had product dimensions, usage instructions, and similar information. I would want all of that content on my site.
If these are certificates then I would link to them as .jpg images and that eliminates concerns about link juice.
-
Dear Egol,
Thanks for the reply. I'm guessing my product-guide example was not exact.
These are not guides but rather certificates of authentication, which customers expect to see. Some I have as PDFs, some as JPGs, and for some I refer to a certificate page on another site. The question is what to do... I currently use "nofollow" links on all types of certificates (JPG, PDF, and links to other sites). What do you suggest?
I was considering either window.open (which I fear would look spammy) or leaving it as is...
Thanks
-
Just saying what I would do if this was my website.
My employees would be told that getting that PDF data onto the product sales page is a top-priority job... and I would call the car dealer and order my new Jaguar.
-
I don't think having them indexed would do me any good.
The important page is the product page itself - the way I see it, there is no good reason to lose link juice for these guides...
I'm trying to improve my site's overall performance.
One important note: I also have some cases where the manuals are pages on the manufacturer's site. Should I change these links from followed links to JS links so that I don't leak juice from my site to other sites?
-
Are you just trying to streamline your product pages, or are you experiencing a specific issue with your product pages that you're hoping this will solve?
Linking to the PDFs via a JavaScript function should help preserve link juice and crawl budget.
You should only add the already-indexed PDFs to robots.txt if you absolutely do not want them in the index; otherwise it won't really do anything to help.
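For reference, a minimal robots.txt sketch for blocking the PDFs from crawling, assuming a hypothetical /guides/ directory. Note that the `*` and `$` wildcards shown are honored by Google but are not part of the original robots.txt standard, and that Disallow blocks crawling rather than guaranteeing removal of already-indexed URLs:

```text
User-agent: *
# Placeholder directory holding the PDF guides/certificates
Disallow: /guides/
# Google-supported wildcard: any URL ending in .pdf
Disallow: /*.pdf$
```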
-
Prompted by your answer, I checked and see that some of the PDFs are indexed.
I didn't know that Google does that.
It is probably from the time before the links were "nofollow". Should I exclude them using robots.txt?
-
Are the PDFs currently being crawled and indexed?
If you want to hide the PDFs from search engines and preserve link juice, then a JavaScript method like the one you've mentioned ought to keep things wrapped up.
You could also consider some data capture, such as requiring an e-mail address to be entered before the download link is shown.
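As a rough sketch of the data-capture idea, assuming a hypothetical validate-then-reveal flow; the function names, regex, and paths below are placeholders, not a specific implementation:

```javascript
// Sketch of an email gate for a PDF download link (hypothetical flow):
// the href is only produced after a plausible e-mail address is entered,
// so the PDF URL never appears in the initial, crawlable markup.

function looksLikeEmail(value) {
  // Deliberately loose check: something@something.tld
  return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(value);
}

function revealDownloadLink(email, pdfUrl) {
  if (!looksLikeEmail(email)) {
    return null; // keep the link hidden until we have a usable address
  }
  // In the browser, this markup would be injected into the page on submit.
  return `<a href="${pdfUrl}">Download certificate</a>`;
}

console.log(revealDownloadLink("not-an-email", "/certs/123.pdf"));
console.log(revealDownloadLink("user@example.com", "/certs/123.pdf"));
```

This keeps the PDF URL out of the crawlable source while also capturing leads, at the cost of some friction for users.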