W3C: my site has 157 errors and 146 warnings. Is it an issue?
-
Is having this number of W3C errors and warnings an issue, and will it impact my site's performance?
When the site was built 6 months ago my developers told me it "was nothing to worry about", but I have since read that any errors aren't good, let alone the huge number my site has.
Your advice please
Thanks
Ash
-
My website, which is ranking well, has errors too:
Result: 59 errors, 4 warnings
So far the site is still ranking well, so I am not very worried.
That said, I know a friend who engaged an SEO company in India to fix the same issue.
-
"On the other URL checker you gave me the site scored 3 F's & 3 A's but I am not sure whether that is good or bad."
I use that test mainly to check first-load and repeat-load times. Here are your results:
http://www.webpagetest.org/result/140208_3Y_B5M/
This was run on a 1.5 Mbps DSL connection from London, UK. 18 seconds for the first view and 10 seconds for the repeat view is definitely on the slower side, and caching could definitely help; you want to aim for 2-5 second loads.
Roughly looking through the validation errors, I didn't spot anything that would hurt your performance, so in your case the slowness is most likely not due to the validation issues. Again, I only did a rough look, so don't take my word as fact. Maybe others can chime in if they see something out of the ordinary in the errors.
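Since caching came up: a rough way to sanity-check whether a page even allows browser caching is to look at its response headers. Here is a minimal sketch of that check; the header values below are invented examples, not taken from Ash's site, and a real check would inspect the headers your own URL actually returns.

```python
# Minimal sketch: decide whether a set of HTTP response headers
# allows browsers to cache the page between visits.
def is_cacheable(headers):
    """headers: dict of response header name -> value (names lowercased)."""
    cache_control = headers.get("cache-control", "").lower()
    if "no-store" in cache_control or "no-cache" in cache_control:
        return False
    # A positive max-age, or an Expires header, suggests caching is on.
    if "max-age" in cache_control and "max-age=0" not in cache_control:
        return True
    return "expires" in headers

# Example responses (made-up values for illustration):
uncached = {"cache-control": "no-cache, no-store"}
cached = {"cache-control": "public, max-age=86400"}
print(is_cacheable(uncached))  # False
print(is_cacheable(cached))    # True
```

If the repeat view in WebPageTest is nearly as slow as the first view, headers like these are usually the first place to look.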
-
Hi Vadim, here is the URL for the W3C results.
On the other URL checker you gave me, the site scored 3 F's & 3 A's, but I am not sure whether that is good or bad.
I queried these issues with my developer when they launched the site and was told not to worry, but as my knowledge has grown I am no longer sure whether I should be worrying or not!
I have posted other questions on Moz recently about why I can see two menus when looking at the Google cache of the site, and why you can only see the site as text and not the full version.
Also, when I do a keyword search with the SEO Tools crawler, it shows me all the HTML code as well as the body text.
Hopefully the W3C results will help you or someone else shed some light on whether I am OK or not.
Thanks
Ash
-
Hi Ash,
Most would say the validator is at times too strict, or slow to keep up with changing technology. A good example: validating facebook.com still yields 45 errors and 4 warnings.
Now, 157 errors may be significant, depending on what the errors are. But assuming your developers did a good job, they may well be harmless. For example, nytimes.com has 500+ validation errors. Does that mean their site is slow or broken? Not necessarily. It is interesting that you are asking about this six months after the issue was first raised, rather than right away.
The first check I would run is an actual performance test on your site. Here is a good option: http://www.webpagetest.org/
You can test various browsers, connection speeds, and locations to see if the results are reasonable for your situation. Also, posting your error messages here would help; one of us can look at them and tell you whether your errors are a big deal or not!
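If you do post the errors, it helps to tally them first, since a long validator report is often a handful of messages repeated many times. The W3C Nu checker can return results as JSON with a `messages` array, and a short script can group them. This is a minimal sketch; the sample messages below are invented for illustration, not taken from Ash's site.

```python
from collections import Counter

# Sample data shaped like the Nu checker's JSON "messages" array
# (the individual messages are made up for illustration).
report = {
    "messages": [
        {"type": "error", "message": "Stray end tag div."},
        {"type": "error", "message": "Stray end tag div."},
        {"type": "error", "message": "Duplicate ID nav."},
        {"type": "info", "subType": "warning", "message": "Section lacks heading."},
    ]
}

def summarize(report):
    """Count how often each distinct error and warning message appears."""
    errors = Counter(
        m["message"] for m in report["messages"] if m["type"] == "error"
    )
    warnings = Counter(
        m["message"]
        for m in report["messages"]
        if m["type"] == "info" and m.get("subType") == "warning"
    )
    return errors, warnings

errors, warnings = summarize(report)
# The most common errors are usually the ones worth fixing first.
print(errors.most_common())
```

A list of "150 errors" often collapses into three or four distinct problems this way, which makes it much easier for someone here to tell you whether they matter.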
Hope this helps!