Will Google Count Links Loaded from JavaScript Files After the Page Loads?
-
Hi,
I have a simple question. If I want to put an image with a link to another site (like a banner ad) on my page, but don't want the link counted by Google, can I simply load the link and banner with jQuery on page load from a separate .js file?
The ideal result would be for Google to index a script tag instead of a link.
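A minimal sketch of what I have in mind (the file and element names are just for illustration):

```html
<!-- Page source: a basic scrape would see only a placeholder and a script tag -->
<div id="banner-slot"></div>
<script src="/js/banner.js"></script>

<script>
// Contents of /js/banner.js: injects the banner link only after the DOM is ready,
// so the href never appears in the initial page source.
$(function () {
  $('#banner-slot').html(
    '<a href="https://directory.example.com/">' +
    '<img src="/img/banner.png" alt="Directory banner"></a>'
  );
});
</script>
```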
-
Good answer. I completely abandoned the banner I was thinking of using. It came from one of those directories that will list your site for free if you display their banner on your site. Their code, of course, included a link back to them with some optimized anchor text. I was looking for a way to display the banner without becoming a link farm for them.
Then I decided I didn't want that kind of thing on my site at all, even inside a JavaScript onload event, if Google is going to crawl it anyway, so I simply left it out.
Then I started thinking about user-generated links. How could I let people cite a source that users can click without exposing my site to hosting spammy links? I originally used an ASP.NET LinkButton with a ConfirmButtonExtender from the AJAX Control Toolkit: it displayed the URL and asked the user whether they wanted to go there, and clicking the confirm button redirected them. The problem was that the URL still appeared in the head section of the DOM.
I replaced that with a modal popup that calls a JavaScript function when the link button is clicked. That function makes an AJAX call to a web service that fetches the link from the database, and the JavaScript then writes an iframe into a div in the modal's panel. The result should be that users can view the source without leaving the site, but a lot of sites block framing with headers like X-Frame-Options, so I'll probably switch to a solution that uses the modal without the iframe. I'm thinking of using something like cURL to grab content from the target page and write it into the modal panel along with a clickable link. All of this happens only after the user clicks the link button, so none of it appears in the source code when the page loads.
-
I think what we really need to understand is: what is the purpose of hiding the link from Google? If it's to prevent the discovery of a URL, or to prevent the indexation of a certain page (or set of pages), it's easier to achieve the same thing with meta noindex directives, wildcard-based robots.txt rules, or by simply denying Googlebot's user agent access to certain pages entirely.
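For reference, the per-page noindex directive mentioned above is just a meta tag in the head of the page you want kept out of the index:

```html
<!-- Tells compliant crawlers not to index this page -->
<meta name="robots" content="noindex">
```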
Is it that important to hide the link, or is it that you want to prevent access to certain URLs from within Google's SERPs? Another option, obviously, is to block users/sessions referred from Google (specifically) from accessing the pages. There's a lot that can be done, but a bit of context would be helpful.
By the way, nofollow does not prevent Google from following links. It actually just stops PageRank from passing across. I know, it was named badly.
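The nofollow attribute in question is applied per link (the URL here is a placeholder):

```html
<!-- PageRank does not pass across this link, but Google may still discover the URL -->
<a href="https://directory.example.com/" rel="nofollow">Directory listing</a>
```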
-
What about a form action? Instead of an a element with an href attribute, you add a form element whose action attribute points to what the href would have been.
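A sketch of that substitution (the URL is a placeholder):

```html
<!-- Instead of: <a href="https://partner.example.com/">Visit partner</a> -->
<form method="get" action="https://partner.example.com/">
  <button type="submit">Visit partner</button>
</form>
```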
-
Thanks for that answer. You obviously know a lot about this issue. I guess they would be able to tell if the .js file creates an a element with a specific href attribute and then appends that element to a specific div after the page loads.
It sounds like it might be easier just to nofollow those links instead of going to the trouble of redirecting the .js file whenever Googlebot crawls the page. I fear that could be considered cloaking.
Another possibility would be an alert that requires user interaction before grabbing a URL from the database. The user clicks a link without an href, the JavaScript onclick fires, the script fetches the URL from the database, the user is asked to click a button if they want to proceed, and then the user is redirected to the external URL. That should keep the external URL out of the script code.
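Roughly, the flow I'm imagining (the endpoint and element IDs are made up):

```html
<a id="cite-link" href="#">View source</a>

<script>
$('#cite-link').on('click', function (e) {
  e.preventDefault();
  // Fetch the external URL from the server only after the user clicks,
  // so it never appears in the page source or the script file.
  $.getJSON('/api/getLink?id=42', function (data) {
    if (confirm('Go to ' + data.url + '?')) {
      window.location.href = data.url;
    }
  });
});
</script>
```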
-
Google can crawl JavaScript and its contents, but most of the time it is unlikely to do so. To do this, Google has to do more than a basic source-code scrape. Like everyone else seeking to scrape data from generated elements, Google has to check the modified source code after all of the scripts have run (the render), rather than the base, unmodified source before any scripts fire.
Google's mission is to index the web, and there's no doubt that non-rendered crawls (which do not contain the generated HTML output of scripts) can be done in a fraction of the time it takes to get a rendered snapshot of the page code. On average, I have found rendered crawling takes 7x to 10x longer than basic source scraping.
What we have found is that Google is indeed capable of crawling generated text and links, but it won't do this all the time, or for everyone. Those resources are more precious to Google, and it crawls in that manner more sparingly.
If you deployed the link in the manner you've described, my expectation is that Google would not notice or evaluate the link for a month or two (if you're not super popular). Eventually, they would detect the presence of the link, at which point it would be factored in and/or evaluated.
I suppose you could embed the script as a link to a .js module, and then use robots.txt to ban Google from crawling that particular JavaScript file. If they chose to obey that directive, the link would remain effectively hidden from them. But remember, it's only a directive!
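That directive would look something like this (the file path is hypothetical):

```
# robots.txt
User-agent: Googlebot
Disallow: /js/banner.js
```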
If you wanted to be super harsh, you could block Googlebot (by user agent) from that JS file and do something like 301 it to the homepage whenever it tried to access the file (instead of letting it open and read the JS). That would be pretty hardcore but would stand a higher chance of actually working.
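A minimal sketch of the check such a rule would need, shown as a standalone function (wiring it into your server's rewrite rule or request handler is left out, and note that matching on user agent alone is trivial to spoof):

```javascript
// Decide whether a request for the hidden script file should be
// 301-redirected to the homepage instead of being served.
function shouldRedirectToHome(userAgent) {
  // Googlebot identifies itself in the User-Agent header
  return /Googlebot/i.test(userAgent || "");
}

console.log(shouldRedirectToHome(
  "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)); // true
console.log(shouldRedirectToHome("Mozilla/5.0 (Windows NT 10.0) Chrome/120.0")); // false
```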
Think carefully about this kind of thing, though. It would be pretty irregular to go to such extremes, and I'm not certain what the consequences of such actions would be.