Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will remain viewable), we have locked both new posts and new replies.
The W3C Markup Validation Service - Good, Bad or Impartial?
-
Hi guys,
It seems that nowadays it is almost impossible to achieve zero errors when testing a site with the W3C Markup Validation Service (https://validator.w3.org). With analytics codes, pixels, and all kinds of tracking and social media scripts running, it seems to be an unachievable task.
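(For anyone who prefers numbers over eyeballing the web UI: the checker can also emit its results as JSON via its `out=json` output mode, and once you have the message list, tallying real errors versus warnings is a few lines. A rough sketch below, using a hand-made sample payload shaped like the checker's output, since the actual messages depend on your site:)

```python
import json
from collections import Counter

def tally_messages(nu_json):
    """Count W3C Nu checker messages by type ('error', 'info', ...)."""
    return Counter(msg.get("type", "unknown") for msg in nu_json.get("messages", []))

# Hand-made sample shaped like the checker's out=json response.
sample = json.loads("""
{"messages": [
  {"type": "error", "message": "Stray end tag div."},
  {"type": "info", "subType": "warning", "message": "Consider adding a lang attribute."},
  {"type": "error", "message": "Duplicate ID nav."}
]}
""")

counts = tally_messages(sample)
print(counts["error"])  # 2 errors, 1 info in the sample
```

Running that against a real response at least gives you a concrete error count to discuss with a client, rather than a scary-looking wall of red.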
I have two questions for my fellow SEOs out there:
1. How important is validation to you, and how much effort do you put in when you technically review a site and decide what needs to be fixed and what isn't worth bothering with?
2. How do you argue your corner when explaining to your clients that it's impossible to achieve 100% validation?
*As a note, I'm mostly referring to WordPress-driven sites.
Would love to hear your take.
Daniel.
-
I am my own client, so I can be as picky as I want, and I take care of the details that I feel are important.
I pay close attention to how the site is responding and rendering when I pretend that I am a visitor. I pay even more attention when a customer or visitor writes to me with a complaint. In my opinion, if the site is working great then all is good.
W3C validation seems to be of jugular importance to W3C evangelists. They will tell you that you will burn in Hell if you don't achieve it with flying colors. People who want to sell you their services will point at any fault that can be detected.
Practical people have a different opinion. I try to be as practical as possible.
-
I agree with Andy,
I use it as a guidance tool on any website I build. It serves a purpose: to check that things are understood the way they should be, against a predetermined standard. But like any other automated tool, it compares against set requirements that cannot always be met, and it cannot identify and approve those exceptions.
As long as you understand the error it's pointing out and why, and you know that despite this the code is rendering correctly and everything is working as expected, then there is no problem.
From an SEO standpoint, as long as Google sees your site the way you want it to, I think it is a very, very minor factor. Hell, all of Google's own pages return errors of some variety.
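In practice, that "understand the error and know it's harmless" step can be roughed out in code: filter the checker's JSON messages down to true errors, and skip any whose source extract comes from a known tracking snippet. The tracker patterns below are purely illustrative (my own guesses at GA/Facebook pixel markers); extend them to match your stack.

```python
import re

# Patterns treated as third-party noise (gtag/GA, Facebook pixel, etc.).
# Illustrative only -- adjust to whatever scripts your site actually loads.
TRACKER_HINTS = re.compile(r"gtag\(|googletagmanager|fbq\(|connect\.facebook\.net")

def actionable_errors(messages):
    """Keep only true errors that don't originate from known tracking snippets."""
    return [
        m for m in messages
        if m.get("type") == "error"
        and not TRACKER_HINTS.search(m.get("extract", ""))
    ]

sample = [
    {"type": "error", "extract": "<script>gtag('config','G-XXXX');</script>"},
    {"type": "error", "extract": '<div id="nav"><div id="nav">'},
    {"type": "info",  "extract": "<html>"},
]

print(len(actionable_errors(sample)))  # only the duplicate-ID error survives
```

What's left after the filter is the short list actually worth raising with a developer.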
-
Hi Yiannis,
I tend to add these in as an advisory to my clients because for the most part, and unless I see something specific, the results have absolutely no effect on SEO. If they wish to act on them, it is for their developers to handle.
I don't really argue my corner - I've never had to. I just tell it like it is: the site is rendering fine everywhere with no issues, so fix the errors if you have the time and resources.
As I said, unless I spot something that is an actual problem, then it tends to just get bypassed.
-Andy