JavaScript encoded links on an AngularJS framework...bad idea for Google?
-
Hi Guys,
I have a site where we're currently deploying code in AngularJS. As part of this, on the page we sometimes have links to 3rd party websites.
We do not want followed links to these third-party sites, as we may be perceived as a link farm: we have more than 1 million pages, and a lot of them carry external third-party links.
My question is: if we've got JavaScript to fire off the link to the third party, is that enough to prevent Google from seeing that link? We do not currently have a NOFOLLOW on it.
The link anchor text simply says "Visit website" and the link is fired using JavaScript.
Here's a snapshot of the code we're using:
Visit website
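To illustrate the pattern (this is a hypothetical sketch, not our exact code - the helper name and URL are made up): the anchor carries no crawlable href, and the third-party URL is only resolved when the click handler fires.

```javascript
// Hypothetical sketch of a JS-fired outbound link (not the exact site code).
// The destination URL never appears in the anchor's href attribute;
// it is resolved only inside the click handler.
function buildExternalLink(targetUrl) {
  return {
    anchorText: "Visit website",
    href: "javascript:void(0)", // nothing crawlable in the markup
    onClick: function () {
      // In the browser this would run: window.location.href = targetUrl;
      return targetUrl;
    },
  };
}
```

Whether Google can still resolve the target depends on how it renders the handler, which is exactly the uncertainty here.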
Does anyone have experience with anything like this on their own site or a client's site that we can learn from, just to make sure we avoid any chance of being flagged as a link farm?
Thank you
-
Hm, I'd be a little concerned if GSC can see it. Maybe GSC can see that the JS turns it into a link, but can't figure out what that link is?
Anyway, it sounds like your hands are kind of tied until you can get those nofollows in! Definitely make a note in your analytics platform when you implement them - it'll be interesting to see what effect they have on your rankings.
Good luck!
Kristina
-
Hi Kristina,
First of all, thank you for taking the time out to respond.
Very valid rationale. I did look at the cached version before I posted here, and it didn't show the link I was looking for; however, the GSC screen showed it highlighted as a link.
That's what got me confused. I guess it's safe to assume in that case that it won't be seen by Google, considering it's not in the text version of the cached page.
I'll work on getting a NOFOLLOW in there, since there are no guarantees with Google when they change things around. But it's great to know that it isn't an immediate requirement at the moment.
Thank you again Kristina!
-
Hi Kavit,
The short answer is no. Google can render some JS - possibly even AngularJS - so never assume that something rendered in JS is invisible to Google. You should assume that Google can see all links visitors can, and really push for a nofollow tag.
I usually check what Google can render by loading Google's cache of the page (go to Google.com and type in "cache:" in front of the exact URL of one of your pages). Look at the text-only version of the cache, and see if Google puts a link there. If they do, it's safe to assume that they can see that link. Another option is to use GSC to Fetch as Google; Google claims this is exactly what they're seeing.
If both the cache and GSC show that Google can't see a link, Google's probably not crawling it. But, Google's always getting better, and could suddenly see the links any day now. If these links are really a concern to you, I'd strongly suggest that you push your dev team to add nofollow tags to these outgoing links.
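For what it's worth, the nofollow itself is a tiny change once your dev team gets to it. A minimal sketch (the helper name is mine; your templates will differ) of making sure "nofollow" ends up in a link's rel attribute:

```javascript
// Minimal sketch (hypothetical helper): ensure "nofollow" is present in a
// link's rel value without clobbering tokens that are already there.
function withNofollow(relValue) {
  var tokens = relValue ? relValue.trim().split(/\s+/) : [];
  if (tokens.indexOf("nofollow") === -1) {
    tokens.push("nofollow");
  }
  return tokens.join(" ");
}
```

In a template you'd apply the result to each outbound anchor, e.g. `link.rel = withNofollow(link.rel)`.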
Best,
Kristina