For Google to recognize a hyperlink on your website, does it have to be written in a specific JavaScript?
-
Does it have to read as the following script?
-
Not a problem
I find that, all too often, if a question is a bit ambiguous, people will ignore it. If there are only a handful of interpretations, I will still try to answer.
-
Thank you, that was extremely insightful and helpful.
-
Just so you are aware, the code sample which you supplied is HTML and not JavaScript (or, for that matter, any type of script; scripting languages include JavaScript, Python, Ruby, Perl, etc.).
You may be asking one of two things (I think!):
1) Is there a set HTML format for hyperlinks which Google knows how to read?
Yes, and you can find information all about conventional use of the `<a>` (HTML) tag in the W3C / W3Schools documentation.

HTML is a static language and is not (unlike many scripting languages) 'object oriented'. You don't define `<a>`, and as such `<a>` is not interpreted based upon your programmed parameters; `<a>` always means the same thing (to a web browser). Sure, stuff like CSS can style links in different ways, and JavaScript can modify `<a>` tags by injecting event-tracking attributes etc. (a common use of jQuery), but fundamentally the usage of `<a>` is (mostly) universally agreed. So yes - links are coded according to conventions, and Google will interpret those widely accepted conventional use-cases, as well as a few more experimental deployments (possibly through error handling in Google's algorithms). In general, you should follow W3C / W3Schools guidelines. There are many forms of link (no-followed links, text links, image links) and all are valid, but yes - they are predetermined.
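A minimal, hypothetical HTML sketch of the conventional link forms mentioned above (text links, no-followed links, image links) - the URLs, filenames and alt text here are placeholders:

```html
<!-- Plain text link: the anchor text tells crawlers what the target is about -->
<a href="https://example.com/blue-widgets">Blue widgets</a>

<!-- No-followed link: rel="nofollow" asks search engines not to pass link equity -->
<a href="https://example.com/sponsor" rel="nofollow">Our sponsor</a>

<!-- Image link: the image's alt text acts roughly like anchor text -->
<a href="https://example.com/gallery">
  <img src="/images/thumb.jpg" alt="Photo gallery">
</a>
```

All three follow the same `<a href="...">` convention, which is exactly why a crawler can interpret them without any scripting involved.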
2) This is the HTML which my JavaScript will output - is it OK?

Yeah, it's fine, dude. If you can handle JS, you can handle HTML (it's way simpler). One thing, though: although Google can deploy rendered (JS-enabled) crawling, that involves using headless browsers and such to render the 'modified' source code. So what you see in 'inspect element' is the modified source, while what you see in 'view page source' is the pre-modified, or base, source code.

Generally speaking, rendered crawling takes around 10x longer than simple DOM / base-source scrapes. As such, if Google were to deploy that tech on every crawl for every page on the web, the efficiency hit to their 'index the web' mission would be colossal. Many studies show that Google will not render JS on all sites (especially ones perceived to be low value), and even on sites where they will use this tech, they won't deploy it all of the time. There really is no substitute for making your links and content readable in the un-modified base-source code: it's way better for crawlers and way more efficient for them to work with. Just because Google 'can' do something, it doesn't mean they always will, and it doesn't mean it's a good idea to ignore basic SEO principles!
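To illustrate the base-source point: a simple non-rendering crawl only sees links present in the raw HTML, not links that a script injects afterwards. The snippet below is a hypothetical sketch (the URLs and the `extractLinks` helper are made up, and the regex is deliberately naive):

```javascript
// Hypothetical base source, as a non-rendering crawler would fetch it.
// The /pricing link only exists after a browser executes the script.
const baseSource = `
  <p>Read our <a href="/guide">guide</a>.</p>
  <div id="nav"></div>
  <script>
    var a = document.createElement('a');
    a.href = '/pricing';
    a.textContent = 'Pricing';
    document.getElementById('nav').appendChild(a);
  </script>`;

// Naive base-source scrape: pull href values out of the raw HTML
// without executing any JavaScript.
function extractLinks(html) {
  const links = [];
  const re = /<a\s+href="([^"]+)"/g;
  let match;
  while ((match = re.exec(html)) !== null) {
    links.push(match[1]);
  }
  return links;
}

// Only the link hard-coded in the base source is found;
// the JS-injected /pricing link is invisible to this crawl.
console.log(extractLinks(baseSource));
```

Only `/guide` comes back here; a rendered (headless-browser) crawl would also see `/pricing`, and that extra rendering step is exactly the expensive work described above.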
Hope that helps
Related Questions
-
How can a slight brochureware website with little content or keywords be getting DA 96, PA 59 and 50.5 million linking domains?
How can a Facebook page get a domain authority of 96, a page authority of 59 and 50 million linking domains? I come across websites that feature crappy video clips in SERP features in Google searches. It's surely black hat tactics. How do they get away with this? The website has zero content beyond light brochureware.
Link Building | DavidLandManager
What does Google think about legit link exchanges where one link is follow and one is nofollow?
Hi Experts! Here is my question for you. I am doing a link exchange in a legit way to increase sales for my site and my associate's site. My associate just wants a sales increase and no link juice. He has a very low DA, so I want to give him a nofollow link. Is it suspicious or fishy that I give a nofollow link and receive a followed link in return? Please let me know how to proceed; I don't want to take any chances. Can you tell me the best way to proceed with this link exchange? Thanks
Link Building | Ruchy
Linking strategy between my websites
Dear all, I am quite new to SEO and I am very confused about linking strategy. I have several websites offering proofreading services in French, and I don't know how to link them together so they get the best chance of ranking well. What strategy would you suggest? Thanks a lot for your help, Yafiz
Link Building | Yafiz
I want to design an SEO link building strategy for my website. Are wordpress.com, Squidoo, Tumblr, Blogger and Typepad good options for link building?
I am currently concentrating on 8 keywords, e.g. A, B, C, D, E, F, G, H. I will be writing blogs with any 2 related keywords present in each, and I am thinking of posting 5 blogs on 5 different platforms (wordpress.com, Squidoo, Typepad, Tumblr, Blogger) respectively. The strategy would be - Monday: keywords A,B on Wordpress / C,D on Squidoo / E,F on Typepad / G,H on Blogger / A,C on Tumblr. Tuesday: C,D on Wordpress / E,F on Squidoo / etc., rotating these keywords throughout the week, with the cycle restarting on Monday. The URL for every keyword will be different and relevant to that keyword. I need a quick suggestion on this topic, please.
Link Building | Christain
Multiple Links from a High Ranking Site vs. Links from Multiple Domains - What's More Important?
I understand it is important to get links from many quality domains. Currently, I have links from top domains (PR, Trust) and I can get more from (high-ranking) pages on these same domains. Would it be better to focus on expanding my reach (finding additional domains to link from) or to continue building links from the domains I already have a connection with? Which is weighted more heavily? I realize doing both is important, but I'm trying to figure out how to best use my time. Thanks! David
Link Building | DWill
Does the ratio of external nofollow links to external dofollow links matter in terms of SERP rankings?
My site has an external link nofollow:dofollow ratio of approximately 1:1. That is, there are about as many nofollow external links as dofollow external links. I have the impression that the ratio of nofollow to dofollow links is a factor in the way our website shows up in SERPs. I get that impression from reading a variety of sources, and from looking at SEOmoz, which calculates 'trust' factors as if they mattered (in SERPs) and seems to value a relatively low nofollow:dofollow ratio. Am I correct about that? Thanks, Tim
PS - I don't know whether or not this matters, but our website is at: www.trustworthycare.com - Tim
Link Building | tcolling
How many nofollow links should we build to have a natural link profile?
Hi guys, As with many link builders, we have been building lots of dofollow links for our site - so many that our incoming links are mainly dofollow. Some pages have 99.5% dofollow external links, which I know is not very natural. In your experience, what percentage of nofollow external links should a link profile have? I noticed SEOmoz has about 12-13% nofollow external links; shall I go with this figure? Thanks guys. David
Link Building | sssrpm