Can div tags within links affect Google's perception of them?
-
Hi, All!
This might be really obvious, but I have little coding experience, so when in doubt - ask...
One of our client site's has navigation that looks (in part) like this:
<a href="http://www.mysite.com/section1"><div><img src="images/arrow6.gif" width="13" height="7" alt="Section 1">Section 1</div></a>
W3C told us the div tags invalidate, and while I ignored most of their comments because I didn't think they would impact what search engines saw, because these div tags are right in the links, it raised a question.
Anyone know if this is for sure a problem/not a problem?
Thanks in advance!
Aviva B
-
Thanks, Ryan. Good ideas, and we'll see what "the authorities" choose to do.
-
If they would have to pay a significant amount of money to have it redone, though, would it be worth it in this kind of case? What would the odds be?
Without having any information about the site, it's not possible to offer any credible details, odds, or measurements of worth. If you are asking for a guess, I would say it is very unlikely that the div tags cause any SEO problems, but that's the problem with invalid code: you don't know how it will be handled.
The bigger concern I have is that if that line of code was written so poorly, there are likely other coding issues on the site.
May I suggest asking a couple of developers for an estimate of how much it would cost to adjust the site's code so it validates?
-
Thanks, Ryan. Point well taken. I think I may copy and paste this for the client in question. If they would have to pay a significant amount of money to have it redone, though, would it be worth it in this kind of case? What would the odds be?
Aviva
-
Thanks, Kyle. We're not the design/webmaster team, so while it might not have been a good idea to do that in the first place, our job here is just to tell our client what MUST change for SEO and what doesn't need to change, even though it might not have been ideal. The challenges of not having unlimited budget...
Thanks,
Aviva
-
Simply from a front-end development perspective, why would you place a div inside of an <a>? If you are trying to force a block-element style, why not simply apply it through the CSS stylesheet to the <a> tag?
If you supply a URL, I can give more specific coding advice.
Thanks - Kyle
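A minimal sketch of what Kyle is describing, assuming the div was only there to make the link behave as a block. The class name and styles below are illustrative, not taken from the actual site:

```html
<!-- Instead of wrapping a div (a block element) inside the link,
     keep only inline content inside the <a> and make the link
     itself render as a block via CSS. -->
<style>
  .nav-link {        /* "nav-link" is an illustrative class name */
    display: block;  /* gives the link the block behavior the div provided */
  }
</style>

<a class="nav-link" href="http://www.mysite.com/section1">
  <img src="images/arrow6.gif" width="13" height="7" alt="Section 1">
  Section 1
</a>
```

This keeps the anchor text and image identical for search engines while producing markup that validates.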
-
The problem with using invalid code is every browser may handle it differently. Even if your current browser handles it fine today, the next time it updates the results may change.
Code validation is representatives from all the major browsers getting together and agreeing on coding rules. The biggest problem with invalid code is people thinking their site is fine and then later finding out (or worse, not finding out) that their site does not appear correctly in various browsers.
You have IE6, IE7, IE8, IE9, IE10, Chrome, Firefox, Opera, Safari and other browsers on the market. You have a variety of phones, iPads and other devices. It is more important than ever to use valid code. If your page doesn't fully validate, it should still be almost valid, and the few errors that remain should have been thoroughly researched so that you are consciously choosing not to validate on those particular items. An example would be if you are using HTML5 and the validation tool has not yet been fully updated for all the latest changes.
With the above noted, I am not aware of any problem with your code. The challenge is since it is not valid, you cannot predict how it will be handled by Google. Even if it is handled correctly today, a change can be made at any time which can impact you.
-
Thanks, Andy. You've seen sites that have used div tags the same way?
-
To be honest, I can't see, from an SEO perspective, how Google would view these in a negative way. I can only tell you that across all of the sites I have looked at, I have never seen this cause a problem.
Someone else might come up with a definitive answer, but I would say that there is nothing wrong with div tags for SEO.
Cheers,
Andy