Do you validate your websites?
-
Do you consider the guidelines from http://validator.w3.org/ when setting up a new website?
As far as I know they don't influence rankings ... What is your opinion about that topic?
-
I am with you on this. It's good to check for any issues. Before focusing on SEO, functionality is my main concern.
-
I always validate the HTML of sites I'm working on, particularly if it has been coded by a third party. My reasons for doing so are a careful balance between ensuring spiders can crawl the page without bumping into hideous HTML errors and ensuring the website is accessible on as many devices and browsers as possible.
If a webpage doesn't adhere to standards, it could indicate problems with viewing the pages correctly in the myriad of browsers and devices out there. So there's a user-experience issue to consider as well.
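This kind of check is easy to automate before handing a third-party build over. The W3C's Nu HTML Checker exposes a JSON API; a minimal Python sketch is below. The endpoint and payload format follow the checker's public API, but treat the details as an assumption to verify against the validator.w3.org/nu documentation before relying on it:

```python
import json
import urllib.request

# Public Nu HTML Checker endpoint (assumed from the checker's documented API).
CHECKER_URL = "https://validator.w3.org/nu/?out=json"

def build_validation_request(html: str) -> urllib.request.Request:
    """Build a POST request that submits raw HTML to the checker."""
    return urllib.request.Request(
        CHECKER_URL,
        data=html.encode("utf-8"),
        headers={"Content-Type": "text/html; charset=utf-8",
                 "User-Agent": "validation-script"},
        method="POST",
    )

def error_count(checker_response: str) -> int:
    """Count hard errors (not warnings) in the checker's JSON response."""
    messages = json.loads(checker_response).get("messages", [])
    return sum(1 for m in messages if m.get("type") == "error")

# Offline example using the checker's documented response shape:
sample = ('{"messages": [{"type": "error", "message": "Stray end tag."},'
          ' {"type": "info", "subType": "warning", "message": "Consider a lang attribute."}]}')
print(error_count(sample))  # -> 1
```

Sending `build_validation_request(...)` through `urllib.request.urlopen` and feeding the body to `error_count` would give a pass/fail signal you could wire into a build step.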
-
It depends on the project. I find that it is sometimes plugins that stop my code from validating. If the plugin is useful enough and the site renders fine in all the major browsers, I stick with what I have, even if it doesn't validate.
-
We don't bother. I know we probably should, but half of the sites we work on are CMS-based and just don't validate well anyway. Plus it takes time, which could be spent on more SEO.
-
Like I said... Google doesn't validate their own website. Of course, Danny answered this question for Matt, sooooo... there is no official statement from Google on this one.
-
New webmaster video from Matt Cutts about that topic:
-
I find the W3C validator to be more of an accolade than anything else. You're right about it not influencing rankings - there are so many practices that don't validate but actually lead to an unchanged or even improved UX.
IMO, getting W3C validation is like getting MozPoints, except MozPoints are worth something. But that's not to say I'm knocking anyone who does follow validator guidelines - fair play to them!
-
Sure.
We do it because it's a great sales tool. Rarely do we ever find a competitor that builds W3C-valid websites. In our sales pitch we talk about how our websites are W3C valid: it means adhering to a set of rules and guidelines, and it's generally cleaner code, which can improve load times.
We tell them they can display a W3C valid button on their site, most of them like that.
It's also a matter of doing things the right way... you can build a frame out of anything but there is a right way and a wrong way to build a door frame. We choose to do it all according to standards and best practices.
It's almost like a commitment to excellence type of thing.
-
Hi David, thank you for your reply.
Would you mind sharing your arguments for why you find it important? I'd be curious how many pros you can list - I like your point of view.
-
It's very important to my company that all websites for our clients validate. Why? Because we feel they pay for a service and we want to provide the highest quality service.
It's like building a house and not sticking to code. We'd rather stick to code and do it the "right" way, rather than just have something that "works".
It's also a sales tool! Because none of our competitors build sites that are compliant, our sales guys use this and it works well. We explain what W3C is, why it's important, and although it doesn't help rankings, we feel it's important because it's simply a matter of doing it the right way. They like that!
-
I don't validate my website... but neither does Google.
-
I don't think it affects rankings, but perhaps the ability to be crawled. It is also good practice for the user visiting the site. As with most SEOs today, we are not just responsible for getting visitors to the page, but for making sure they stay on the site and convert. : )
-
I have one guy in the company who is obsessed with it, so no matter what I do he will go back and ensure we comply! I've seen at least one W3C zealot in each web company I have had a chance to work with.
-
Even though W3C errors won't influence SEO directly, there could be instances where CSS issues impact page speed, resulting in slower spider crawls and a page-speed ranking influence. We do tend to look at these reports once every quarter.
-
Using Google or any of its websites as an SEO example is a mistake in itself.
-
lol - yes, the resemblance is remarkable! That's the name of my boss :-).
It would be interesting if there were two otherwise identical websites, with just minor differences causing some validation issues, to see whether the one without "faults" would rank better.
I think I even remember Matt Cutts once saying that this is not a ranking factor. Even if you put google.com into the validator, you get several errors.
The "normal" person looking at the webpage doesn't care which faults are flagged in the background either. So whom should I please with a W3C-clean website? I suppose it's "just" about having a proper webpage...
-
Personally, it is not my first worry.
But running a validation check doesn't cost a lot of time, so I usually do it. If it finds red-flagged problems, I solve them. But I don't get crazy about the many less important ones.
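A full validator run isn't the only way to catch the "red" problems between checks: even a few lines of stdlib Python can flag one of the common hard errors, duplicate `id` attributes. A minimal sketch (the helper names and the choice of check are my own illustration, not something from this thread):

```python
from collections import Counter
from html.parser import HTMLParser

class IdCollector(HTMLParser):
    """Collect every id= attribute so duplicates (a hard validator error) stand out."""
    def __init__(self):
        super().__init__()
        self.ids = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name == "id" and value is not None:
                self.ids.append(value)

def duplicate_ids(html: str) -> list:
    """Return ids that appear more than once in the document."""
    collector = IdCollector()
    collector.feed(html)
    return [i for i, n in Counter(collector.ids).items() if n > 1]

page = '<div id="nav"></div><div id="main"></div><span id="nav"></span>'
print(duplicate_ids(page))  # -> ['nav']
```

It's no substitute for the real validator, but a cheap check like this can run on every deploy while the full report waits for the quarterly review.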
-
Hehehe... this old profile database gives weird results.
-
Hansj, you look remarkably like Petra!
As a former designer wannabe, I would always shoot for validation if possible. But since concentrating more on SEO issues these days, like you, I personally don't think it affects rankings.