Do you validate your websites?
-
Do you consider the guidelines from http://validator.w3.org/ when setting up a new website?
As far as I know they don't influence rankings... What is your opinion on that topic?
-
I am with you on this. It's good to check for any issues. Before focusing on SEO, functionality is my main concern.
-
I always validate the HTML of sites I'm working on, particularly if they have been coded by a third party. My reasons for doing so are a careful balance between ensuring spiders can crawl the page without hitting hideous HTML errors and ensuring the website is accessible on as many devices and browsers as possible.
If a webpage doesn't adhere to standards, it can indicate problems with the pages rendering correctly in the myriad of browsers and devices out there. So there's a user-experience issue to consider as well.
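For the "hideous HTML errors" part, a quick tag-balance pass catches the grossest problems before a spider does. This is a rough, token-level sketch using Python's stdlib `html.parser`, not a substitute for the full W3C validator (it knows nothing about the spec's implicit-close rules):

```python
from html.parser import HTMLParser

# Elements that never take a closing tag, per the HTML spec.
VOID_ELEMENTS = {"area", "base", "br", "col", "embed", "hr", "img",
                 "input", "link", "meta", "param", "source", "track", "wbr"}

class TagBalanceChecker(HTMLParser):
    """Collect mismatched or unclosed tags -- a rough local sanity
    check only, no substitute for the full W3C validator."""

    def __init__(self):
        super().__init__()
        self.stack = []      # currently open tags
        self.problems = []   # human-readable findings

    def handle_starttag(self, tag, attrs):
        if tag not in VOID_ELEMENTS:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if tag in VOID_ELEMENTS:
            return
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.problems.append(f"unexpected </{tag}>")

def check(html):
    checker = TagBalanceChecker()
    checker.feed(html)
    checker.close()
    return checker.problems + [f"unclosed <{t}>" for t in checker.stack]

print(check("<div><p>ok</p></div>"))   # -> []
print(check("<div><p>broken</div>"))   # flags the stray </div> and the unclosed tags
```

Anything this flags would certainly trip up a strict parser; anything it misses is what the real validator is for.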
-
It depends on the project. I find that it's sometimes plugins that stop my code from validating. If the plugin is useful enough and the site renders fine in all the major browsers, I stick with what I have, even if it doesn't validate.
-
We don't bother. I know we probably should, but half the sites we work on are built on CMSs that just don't validate well anyway. Plus it takes time, which could be spent on more SEO.
-
Like I said.... Google doesn't validate their website... Of course, Danny answered this question for Matt, sooooo.... there is no official statement from Google on this one.
-
New webmaster video from Matt Cutts about that topic:
-
I find the W3C validator to be more of an accolade than anything else. You're right about it not influencing rankings - there are so many practices that don't validate but actually lead to an unchanged or even improved UX.
IMO, getting W3C validation is like getting MozPoints, except MozPoints are worth something. But that's not to say I'm knocking anyone who does follow the validator guidelines - fair play to them!
-
Sure.
We do it because it's a great sales tool. Rarely do we find a competitor that builds W3C-valid websites. In our sales pitch we talk about how our websites are W3C valid: it means adhering to a set of rules and guidelines, and it's generally cleaner code, which can improve load times.
We tell them they can display a W3C-valid button on their site, and most of them like that.
It's also a matter of doing things the right way... you can build a door frame out of anything, but there is a right way and a wrong way to build one. We choose to do it all according to standards and best practices.
It's almost like a commitment to excellence.
-
Hi David, thank you for your reply.
Would you mind sharing your arguments for why you find it important? I'd be curious how many pros you come up with - I like your point of view.
-
It's very important to my company that all websites for our clients validate. Why? Because we feel they pay for a service and we want to provide the highest quality service.
It's like building a house and not sticking to code. We'd rather stick to code and do it the "right" way, rather than just have something that "works".
It's also a sales tool! Because none of our competitors build sites that are compliant, our sales guys use this and it works well. We explain what W3C is, why it's important, and although it doesn't help rankings, we feel it's important because it's simply a matter of doing it the right way. They like that!
-
I don't validate my website... but neither does Google.
-
I don't think it affects rankings, but it may affect the ability to be crawled. It's also good practice for the user visiting the site. As with most SEOs today, we're not just responsible for getting users to the page, but for making sure they stay on the site and convert. : )
-
I have one guy in the company who is obsessed with it, so no matter what I do, he'll go back and make sure we comply! I've seen at least one W3C nazi in every web company I've had a chance to work with.
-
Even though W3C errors won't influence SEO directly, there are cases where CSS issues can hurt page speed, slowing spider crawls and feeding into the page-speed ranking factor. We do tend to look at these reports once a quarter.
-
To use Google or any of its websites as an SEO example is itself a mistake.
-
lol - yes, the resemblance is remarkable! That's the name of my boss :-).
It would be interesting to test two otherwise identical websites, one with minor differences that cause validation issues, to see whether the one without "faults" ranked better.
I think I even remember Matt Cutts once saying that this is not a ranking factor. Even if you put google.com into the validator, you get several faults.
The "normal" person looking at the webpage doesn't care either way about whatever faults are flagged in the background. So whom should I please with a w3c.org-clean website? I suppose "just" having a proper webpage is the point...
-
Personally, it's not my first worry.
But running a validation check doesn't take much time, so I usually do it. If it flags red-marked problems, I fix them, but I don't go crazy over the many less important ones.
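That triage (fix the red errors, skip the minor notes) is easy to script. The W3C Nu checker can emit JSON, and in that report hard errors carry `"type": "error"` while warnings come through as info messages; a minimal sketch of the filtering step, with the sample report below invented for illustration:

```python
def hard_errors(messages):
    """Keep only the 'red' problems from a Nu checker JSON report,
    dropping warnings and info messages."""
    return [m for m in messages if m.get("type") == "error"]

# Hypothetical report in the shape of the checker's out=json output.
report = [
    {"type": "error", "message": "Stray end tag div."},
    {"type": "info", "subType": "warning",
     "message": "Consider adding a lang attribute."},
]

for m in hard_errors(report):
    print(m["message"])   # only the error line prints
```

Run that over each page's report and you have a short, actionable list instead of a wall of warnings.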
-
Hehehe... this old profiles database gives weird results.
-
Hansj, you look remarkably like Petra!
As a former designer wannabe, I would always shoot for validation if possible. But since concentrating more on SEO issues these days, like you, I personally don't think it affects rankings.