Do you validate your websites?
-
Do you consider the guidelines from http://validator.w3.org/ when setting up a new website?
As far as I know, they don't influence rankings ... What is your opinion on that topic?
-
I am with you on this. It's good to check for any issues. Before focusing on SEO, functionality is my main concern.
-
I always validate HTML on sites I'm working on, particularly if it has been coded by a third party. My reasons for doing so are a careful balance between ensuring spiders can crawl the page without bumping into hideous HTML errors and ensuring the website is accessible on as many devices and browsers as possible.
If the webpage doesn't adhere to standards, it could indicate issues with viewing the pages correctly in the myriad of browsers and devices out there. So there's a User Experience issue to consider as well.
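As a rough illustration (a sketch added here, not part of the original reply), this is roughly how you could batch-check a handful of URLs and count the HTML errors a spider would bump into. The endpoint, the doc / out=json parameters, and the messages / type fields in the response are assumptions about the public W3C Nu HTML Checker, so verify them against the current service before relying on this.

```python
import requests

# Assumed public endpoint of the W3C Nu HTML Checker (JSON output mode).
NU_CHECKER = "https://validator.w3.org/nu/"

def error_count(page_url: str) -> int:
    """Ask the checker to fetch and validate page_url; return how many error-level messages it reports."""
    response = requests.get(
        NU_CHECKER,
        params={"doc": page_url, "out": "json"},
        headers={"User-Agent": "site-validation-check"},  # some W3C services reject generic default agents
        timeout=30,
    )
    response.raise_for_status()
    messages = response.json().get("messages", [])
    return sum(1 for m in messages if m.get("type") == "error")

if __name__ == "__main__":
    # Hypothetical URLs purely for illustration.
    for url in ["https://example.com/", "https://example.com/contact"]:
        print(url, "->", error_count(url), "HTML errors")
```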
-
It depends on the project. I find that it is sometimes plugins that stop my code from validating. If the plugin is useful enough and the site renders fine in all the major browsers, I stick with what I have, even if it doesn't validate.
-
We don't bother. I know we probably should, but half of the sites we work on are CMS-based and just don't validate well anyway. Plus it takes time, which could be spent on more SEO.
-
Like I said.... Google doesn't validate their website... Of course, Danny answered this question for Matt, sooooo.... there is no official statement from Google on this one.
-
New webmaster video from Matt Cutts about that topic:
-
I find the W3C validator to be more of an accolade than anything else. You're right about it not influencing rankings - there are so many practices that don't validate but actually lead to an unchanged or even improved UX.
IMO, getting W3C validation is like getting MozPoints, except MozPoints are worth something. But that's not to say I'm knocking anyone who does follow the validator guidelines - fair play to them!
-
Sure.
We do it because it's a great sales tool. Rarely do we ever find a competitor that builds W3C-valid websites. In our sales pitch we talk about how our websites are W3C valid: they adhere to a set of rules and guidelines, and the code is generally cleaner, which can improve load times.
We tell them they can display a W3C valid button on their site; most of them like that.
It's also a matter of doing things the right way... you can build a frame out of anything, but there is a right way and a wrong way to build a door frame. We choose to do it all according to standards and best practices.
It's almost like a commitment to excellence.
-
Hi David, thank you for your reply.
Would you mind sharing your reasons why you find it important? I would be curious how many pros you come up with - I like your point of view.
-
It's very important to my company that all the websites we build for our clients validate. Why? Because they pay for a service, and we want to provide the highest-quality service.
It's like building a house and not sticking to code: we'd rather follow the code and do it the "right" way than just end up with something that "works".
It's also a sales tool! Because none of our competitors build compliant sites, our sales guys use this and it works well. We explain what the W3C is, why it matters, and that although it doesn't help rankings, we feel it's important because it's simply a matter of doing things the right way. Clients like that!
-
I don't validate my website... but neither does Google.
-
I don't think it affects rankings, but it may affect the ability to be crawled. It is also good practice for users visiting the site. As with most SEOs today, we are not just responsible for getting people to the page, but also for making sure they stay on the site and convert. :)
-
I have one guy in the company who is obsessed with it, so no matter what I do, he will go back and make sure we comply! I've seen at least one W3C zealot in every web company I have had a chance to work with.
-
Even though W3C errors won't influence SEO directly, there could be instances where CSS issues impact page speed, resulting in slower spider crawls and, in turn, a page-speed ranking influence. We do tend to look at these reports once every quarter.
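For what it's worth, a quarterly page-speed snapshot can be scripted. The sketch below is only an illustration added here: the PageSpeed Insights v5 endpoint, the strategy parameter, and the lighthouseResult field path are assumptions about Google's API rather than anything from this thread, so double-check them against the current documentation.

```python
import requests

# Assumed PageSpeed Insights v5 endpoint (verify against Google's docs).
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def performance_score(page_url: str):
    """Return the Lighthouse performance score (0.0-1.0) reported for page_url, or None if absent."""
    response = requests.get(
        PSI_ENDPOINT,
        params={"url": page_url, "strategy": "mobile"},  # "mobile" vs "desktop" is assumed here
        timeout=60,
    )
    response.raise_for_status()
    data = response.json()
    try:
        # Assumed response structure: lighthouseResult -> categories -> performance -> score
        return data["lighthouseResult"]["categories"]["performance"]["score"]
    except KeyError:
        return None

if __name__ == "__main__":
    print("example.com mobile performance:", performance_score("https://example.com/"))
```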
-
Using Google or any of its websites as an SEO example is in itself a mistake.
-
lol - yes, the resemblance is remarkable! That's the name of my boss :-).
It would be interesting if there were two otherwise identical websites, differing only in some minor validation issues, to see whether the one without "faults" would rank better.
I think I even remember Matt Cutts once saying that this is not a ranking factor. Even if you put google.com into the validator, you get several errors.
The "normal" person who looks at the webpage doesn't care which faults are flagged in the background either. So whom should I please with a W3C-clean website? I suppose the point is "just" to have a proper webpage...
-
Personally, it is not my first worry.
But running a validation check doesn't cost a lot of time, so I usually do it. If it finds red-marked problems, I solve them, but I don't go crazy over the many less important ones.
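If it helps, this is roughly what that quick check looks like as a script that surfaces only the "red-marked" (error-level) problems and ignores the minor warnings. It's just a sketch added for illustration; the validator.w3.org/nu/ endpoint and the messages / type JSON fields are assumptions about the public checker, not something confirmed in this thread.

```python
import requests

def hard_errors(page_url: str) -> list[str]:
    """Return only the error-level validator messages for page_url, skipping warnings and info notes."""
    response = requests.get(
        "https://validator.w3.org/nu/",
        params={"doc": page_url, "out": "json"},
        headers={"User-Agent": "quick-validation-check"},  # generic default agents are sometimes blocked
        timeout=30,
    )
    response.raise_for_status()
    return [
        m.get("message", "")
        for m in response.json().get("messages", [])
        if m.get("type") == "error"  # "info" messages cover the less important warnings and notes
    ]

if __name__ == "__main__":
    for err in hard_errors("https://example.com/"):
        print("ERROR:", err)
```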
-
Hehehe... this old profiles database gives weird results.
-
Hansj, you look remarkably like Petra!
As a former designer wannabe, I would always shoot for validation if possible. But since I've been concentrating more on SEO issues these days, like you, I personally don't think it affects rankings.