Do you validate your websites?
-
Do you consider the guidelines from http://validator.w3.org/ when setting up a new website?
As far as I know they don't influence rankings ... What is your opinion about that topic?
-
I am with you on this. Good to check for any issues. Before focusing on SEO, functionality is my main concern.
-
I always validate HTML on sites I'm working on, particularly if it has been coded by a third party. My reasons for doing so are a careful balance between ensuring spiders can crawl the page without bumping into hideous HTML errors and ensuring the website is accessible on as many devices and browsers as possible.
If the webpage doesn't adhere to standards, it could indicate issues with viewing the pages correctly in the myriad of browsers and devices out there. So there's a user experience issue to consider.
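If you want to automate that kind of check, here is a minimal sketch that sends a page to the W3C Nu HTML Checker and separates errors from warnings. It assumes the public endpoint at https://validator.w3.org/nu/ accepts a "doc" URL and can return JSON; the page URL below is just a placeholder.

```python
import requests

# A minimal sketch, assuming the public W3C Nu HTML Checker endpoint
# (https://validator.w3.org/nu/) accepts a "doc" URL and returns JSON.
# Replace the example URL with the page you want to check.
VALIDATOR = "https://validator.w3.org/nu/"
PAGE = "https://www.example.com/"

response = requests.get(
    VALIDATOR,
    params={"doc": PAGE, "out": "json"},
    headers={"User-Agent": "validation-check-script"},
    timeout=30,
)
response.raise_for_status()

messages = response.json().get("messages", [])
errors = [m for m in messages if m.get("type") == "error"]
other = [m for m in messages if m.get("type") != "error"]

print(f"{PAGE}: {len(errors)} errors, {len(other)} warnings/info")
for m in errors:
    print(f"  line {m.get('lastLine', '?')}: {m.get('message')}")
```

Running this against a handful of key templates (homepage, category page, product page) is usually enough to tell whether a third-party build has systemic markup problems.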
-
It depends on the project. I find that it is sometimes plugins that make my code not validate. If the plugin is useful enough and the site renders fine in all the major browsers, I stick with what I have, even if it doesn't validate.
-
We don't bother. I know we probably should, but half of the sites we work on are CMS-based and just don't validate well anyway. Plus it takes time, which could be spent on more SEO.
-
Like I said.... Google doesn't validate their website... Of course, Danny answered this question for Matt, sooooo.... there is no official statement from Google on this one.
-
New webmaster video from Matt Cutts about that topic:
-
I find the W3C validator to be more of an accolade than anything else. You're right about it not influencing rankings - there are so many practices that don't validate but actually lead to an unchanged or even improved UX.
IMO, getting W3C validation is like getting MozPoints, except MozPoints are worth something. But that's not to say I'm knocking anyone who does follow validator guidelines - fair play to them!
-
Sure.
We do it because it's a great sales tool. Rarely do we find a competitor that builds W3C-valid websites. In our sales pitch we talk about how our websites are W3C valid: they adhere to a set of rules and guidelines, and the code is generally cleaner, which can improve load times.
We tell them they can display a W3C valid button on their site; most of them like that.
It's also a matter of doing things the right way... you can build a frame out of anything, but there is a right way and a wrong way to build a door frame. We choose to do it all according to standards and best practices.
It's almost like a commitment-to-excellence type of thing.
-
Hi David, thank you for your reply.
Would you mind sharing your arguments for why you find it important? I would be curious how many pros you find - I like your point of view.
-
It's very important to my company that all websites for our clients validate. Why? Because we feel they pay for a service and we want to provide the highest quality service.
It's like building a house and not sticking to code. We'd rather stick to code and do it the "right" way, rather than just have something that "works".
It's also a sales tool! Because none of our competitors build sites that are compliant, our sales guys use this and it works well. We explain what W3C is, why it's important, and although it doesn't help rankings, we feel it's important because it's simply a matter of doing it the right way. They like that!
-
I don't validate my website... but neither does Google.
-
I don't think it affects rankings, but perhaps the ability to be crawled. It is also good practice for the user visiting the site. As with most SEOs today, we are not just responsible for getting visitors to the page, but for making sure they stay on the site and convert. :)
-
I have one guy in the company who is obsessed with it, so no matter what I do he will go back and ensure we comply! I've seen at least one W3C nazi in each web company I have had a chance to work with.
-
Even though W3C errors will not influence SEO directly, there could be instances where CSS issues impact page speed, resulting in slower spider crawls and, in turn, a page-speed-related ranking influence. We do tend to look at these reports once every quarter.
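If you want to fold that quarterly page-speed check into a script, here is a rough sketch against the PageSpeed Insights v5 API as I understand it; the page URL is a placeholder and the optional API key is omitted.

```python
import requests

# A minimal sketch, assuming the PageSpeed Insights v5 REST endpoint.
# The page URL is a placeholder; add "key": "<API_KEY>" for higher quotas.
ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
PAGE = "https://www.example.com/"

resp = requests.get(
    ENDPOINT,
    params={"url": PAGE, "strategy": "mobile"},
    timeout=60,
)
resp.raise_for_status()
data = resp.json()

# The Lighthouse performance score is reported on a 0-1 scale.
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"{PAGE}: mobile performance score {score * 100:.0f}/100")
```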
-
Using Google or any of its websites as an SEO example is in itself a mistake.
-
lol - yes, the resemblance is remarkable! That's the name of my boss :-).
It would be interesting if there were two otherwise identical websites, differing only in minor details that cause some validation issues... whether the one without "faults" would rank better.
I think I even remember that Matt Cutts once said this is not a ranking factor. Even if you put google.com into the validator, you get several faults.
The "normal" person who looks at the webpage doesn't care which faults are flagged in the background either. So whom should I please with a w3c.org-clean website? I suppose it's "just" about having a proper webpage...
-
Personally, it is not my first worry.
But running a validation check doesn't take a lot of time, so I usually do it. If it finds red-marked problems, I solve them, but I don't go crazy over the many less important ones.
-
Hehehe... this old profiles database gives weird results.
-
Hansj, you look remarkably like Petra!
As a former designer wannabe, I would always shoot for validation if possible. But since concentrating more on SEO issues these days, like you, I personally don't think it affects rankings.
Related Questions
-
Moz crawler is not able to crawl my website
Hi, I need help regarding "Moz Can't Crawl Your Site". I'm also sharing a screenshot showing that Moz was unable to crawl my site on Mar 26, 2022: "Our crawler was not able to access the robots.txt file on your site. This often occurs because of a server error from the robots.txt. Although this may have been caused by a temporary outage, we recommend making sure your robots.txt file is accessible and that your network and server are working correctly. Typically errors like this should be investigated and fixed by the site webmaster." My robots.txt is also OK, I checked it. Here is my website: https://whiskcreative.com.au. Just check it please, as soon as possible.
Technical SEO | JasonTorney
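One quick way to reproduce what the crawler sees for a question like this is to request robots.txt directly and confirm it is reachable and parseable. This is only a sketch using Python's standard library; the site URL is taken from the question above.

```python
import urllib.request
from urllib.robotparser import RobotFileParser

# A minimal sketch: fetch robots.txt the way a crawler would and check
# that it is reachable and parseable. Site URL taken from the question.
SITE = "https://whiskcreative.com.au"
robots_url = SITE.rstrip("/") + "/robots.txt"

# 1) Confirm the file is reachable and returns HTTP 200 (not a 5xx error).
with urllib.request.urlopen(robots_url, timeout=30) as resp:
    print(f"{robots_url} -> HTTP {resp.status}")

# 2) Confirm it parses and does not block a generic crawler from the homepage.
parser = RobotFileParser()
parser.set_url(robots_url)
parser.read()
print("Generic crawler allowed on '/':", parser.can_fetch("*", "/"))
```
-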
Does Index Status Go Down If a Website Is Penalized by Google?
Hi Friends, a few of my friends told me that the index status of a website goes down if it is penalized by Google. I can see my organic traffic went down drastically after the Panda update on July 17th, 2015, but the index status still remains the same. So I am a bit confused. Any advice on this?
Technical SEO | Prabhu.Sundar
-
Traffic on my website hasn't gone up since
Anyone, please, I am looking for some help!! My website used to get around 40 to 50 visitors a day; as soon as I created the new website and put it live, traffic dropped by 25%. Page authority for the new and some of the old URLs is only 1, but my keywords are still doing well. I have made sure that I have redirected all the old URLs to the new ones, and the tracking code is in the end section of the head. Any ideas, anyone?
Technical SEO | One2OneDigital
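Since a question like this hinges on whether the old URLs really redirect to the new ones, here is a hedged sketch that walks a list of old URLs and reports the status code and Location header of the first hop; the URLs are placeholders, not the asker's real pages.

```python
import requests

# A minimal sketch: check that each old URL answers with a 301/308
# permanent redirect and note where it points. URLs are placeholders.
OLD_URLS = [
    "https://www.example.com/old-page-1",
    "https://www.example.com/old-page-2",
]

for url in OLD_URLS:
    resp = requests.head(url, allow_redirects=False, timeout=30)
    location = resp.headers.get("Location", "(no Location header)")
    status = resp.status_code
    flag = "OK" if status in (301, 308) else "CHECK"
    print(f"[{flag}] {url} -> {status} {location}")
```
-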
How to handle pagination for a large website?
I am currently doing a site audit on a large website that just went through a redesign. Looking through their Webmaster Tools, they have about 3,000 duplicate title tags. This is due to the way their pagination is set up on the site. For example: domain.com/books-in-english?page=1 // domain.com/books-in-english?page=4. What is the best way to handle these? According to Google Webmaster Tools, a viable solution is to do nothing, because Google is good at distinguishing these. That said, it seems like there could be a better solution to help prevent duplicate content issues. Any advice would be much welcomed. 🙂
Technical SEO | J-Banz
-
Building a new website post penalty and redirects
A website I'm working on is clearly algorithmically penalised. I've spent a lot of time mass-disavowing spammy links, but it doesn't seem to make a difference. We have been planning to build a new website anyway, since we are rebranding.
1. Is it possible to tell which pages are most likely to have a penalty applied?
2. If the website as a whole has a penalty, will redirecting certain pages to the new website carry the penalty over?
3. Our website is structured as sales pages and blog content. It is the sales pages that have the spammy links, yet most of the blog content does not rank either. Would it be a good strategy to redirect only the blog posts (which have natural links pointing to them) to the new website and not the sales pages?
4. The homepage has a mix of spam and very good editorial links. If I have disavowed links and domains, can I safely redirect this page?
Technical SEO | designquotes
-
Launching Website
We are developing a new website and thought Google would not find it because of the directory we put it in (no homepage yet) and because there are no links to it. For example, we are building it in the directory example.com/wordpress/, but somehow Google found it and indexed pages that are not ready to be indexed. What should we do to stop this until we are ready to launch? Should we just use a robots.txt file with this in it?
User-agent: *
Disallow: /
Will this create repercussions when we officially launch?
Technical SEO | QuickLearner
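To see concretely what those two robots.txt lines would do, here is a hedged sketch that feeds them into Python's standard-library robots parser; the staging paths are just examples based on the question.

```python
from urllib.robotparser import RobotFileParser

# A minimal sketch: parse the proposed robots.txt rules and confirm they
# block crawlers from the staging directory. Note that robots.txt only
# discourages crawling; pages Google already knows about may stay indexed.
rules = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

for path in ["/", "/wordpress/", "/wordpress/sample-page/"]:
    print(f"Googlebot allowed on {path!r}:", parser.can_fetch("Googlebot", path))
```
-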
Reusing content owned by the client on websites for other locations?
Hello All! Newbie here, so I'm working through some of my questions 🙂 I have two major questions regarding duplicate content. Say a medical hospital has 4 locations and chooses to create 4 separate websites. Each website would have the same design but different NAP, contact info, etc. Essentially, we'd be looking at creating their own branded template.
My questions: 1) If the hospitals all offer similar services, with roughly the same nav, does it make sense to have multiple websites? I figure this makes the most sense in terms of optimizing for their differing locations. 2) If the hospital owns the content on the first site, I'm assuming it is still necessary to change it for the other properties to avoid duplicates? Or is it possible to differentiate between the duplication of owned content and other instances of content duplication? Everyone has been fantastic here so far, looking forward to some feedback!
Technical SEO | kbaltzell
-
Do You Have To Have Access to Website Code to Use Open Graph
I am not a website programmer, and all of our websites are in WordPress. I never change the code on the backend. Is this a necessity if one wants to use Open Graph?
Technical SEO | dahnyogaworks