Can <div> tags within links affect Google's perception of them?
-
Hi, All!
This might be really obvious, but I have little coding experience, so when in doubt - ask...
One of our client sites has navigation that looks (in part) like this:
<a href="http://www.mysite.com/section1"><div><img src="images/arrow6.gif" width="13" height="7" alt="Section 1">Section 1</div></a>
The W3C validator told us the <div> tags are invalid, and while I ignored most of its comments because I didn't think they would impact what search engines saw, these <div> tags are right in the links, so it raised a question.
Anyone know if this is for sure a problem/not a problem?
Thanks in advance!
Aviva B
-
Thanks, Ryan. Good ideas, and we'll see what "the authorities" choose to do.
-
"If they would have to pay a significant amount of money to have it redone, though, would it be worth it in this kind of case? What would the odds be?"
Without any information about the site, it's not possible to offer credible details, odds, or measurements of worth. If you are asking for a guess, I would say it is very unlikely for the div tags to cause any SEO problems, but that's the problem with invalid code: you don't know how it will be handled.
The bigger concern I have is if that line of code was coded so poorly, there are likely other coding issues with the site.
May I suggest asking a couple of developers for an estimate of how much it would cost to adjust the site's code so it validates?
-
Thanks, Ryan. Point well taken. I think I may copy and paste this for the client in question. If they would have to pay a significant amount of money to have it redone, though, would it be worth it in this kind of case? What would the odds be?
Aviva
-
Thanks, Kyle. We're not the design/webmaster team, so while it might not have been a good idea to do that in the first place, our job here is just to tell our client what MUST change for SEO and what doesn't need to change, even though it might not have been ideal. The challenges of not having unlimited budget...
Thanks,
Aviva
-
Simply from a front-end development perspective, why would you place a <div> inside of an <a>? If you are trying to force a block-element style, why not simply apply it through CSS to the <a> tag?
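For example (a minimal sketch, assuming the <div> was only there to make the link render as a block — the "nav-link" class name here is just an illustration), the style can live in the stylesheet instead of the markup:

```html
<style>
  /* Style the link itself as a block; no <div> inside the <a> needed */
  a.nav-link { display: block; }
</style>
<a class="nav-link" href="http://www.mysite.com/section1">
  <img src="images/arrow6.gif" width="13" height="7" alt="Section 1">Section 1
</a>
```

This keeps the markup valid while preserving the same clickable block area.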
If you supply a URL, I can give more specific coding advice.
Thanks - Kyle
-
The problem with using invalid code is every browser may handle it differently. Even if your current browser handles it fine today, the next time it updates the results may change.
Code validation reflects representatives from all the major browsers getting together and agreeing on coding rules. The biggest problem with invalid code is people thinking their site is fine, then later finding out (or worse, not finding out) that their site does not appear correctly in various browsers.
You have IE6, IE7, IE8, IE9, IE10, Chrome, Firefox, Opera, Safari, and other browsers on the market, plus a variety of phones, iPads, and other devices. It is more important than ever to use valid code. If your page doesn't fully validate, it should still be almost valid: the few errors that remain should be thoroughly researched, so that you are consciously choosing not to validate on those particular items. An example would be if you are using HTML5 and the validation tool has not yet been updated for all the latest changes.
With the above noted, I am not aware of any problem with your code. The challenge is since it is not valid, you cannot predict how it will be handled by Google. Even if it is handled correctly today, a change can be made at any time which can impact you.
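If the client does decide to clean it up, one valid approach (a sketch, assuming the <div> is only providing layout, and using a hypothetical "nav-item" class) is to put the block element outside the link rather than inside it:

```html
<!-- Valid in HTML 4.01/XHTML: the block-level <div> contains
     the inline <a>, not the other way around -->
<div class="nav-item">
  <a href="http://www.mysite.com/section1">
    <img src="images/arrow6.gif" width="13" height="7" alt="Section 1">Section 1
  </a>
</div>
```

Note that HTML5's content model does allow flow content such as a <div> inside an <a>, so under an HTML5 doctype the original markup may actually validate; the restructuring above applies to HTML 4.01/XHTML doctypes.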
-
Thanks, Andy. You've seen sites that have used <div> tags within links the same way?
-
To be honest, I can't see, from an SEO perspective, how Google would view these in a negative way. I can only tell you that from all of the sites that I have seen, I have never seen this as a problem.
Someone else might come up with a definitive answer, but I would say that there is nothing wrong with these <div> tags for SEO.
Cheers,
Andy