Errors in my coding: how significant is this for rankings?
-
I posted a question on here yesterday about my homepage, asking for advice regarding the content. Two people who were very helpful moved the conversation away from the content to say that the major problem was that the coding on my website has too many errors, which would result in lower rankings in the search engines. I realise this website is an old-fashioned Dreamweaver template which was constructed several years ago and which I've updated since, and I'm certainly not a professional, but I watch my Google Analytics and there doesn't seem to be any significant change in the stats from this time last year.
This is the site http://www.endeavourcottage.co.uk/
I realise the site is an old format and has been around for several years. From customer feedback, visitors seem to think it looks okay for the product (old cottages), but I guess technically it's not the best now.
I have run a test using Silktide Nibbler, a free online service that gives you a complete overview of your website with an overall score. It gave my website a good overall score but did point out errors in the coding. However, when I checked some of my competitors near the top of Google for the short-tail keywords, some of them also have errors in their coding, with error scores very similar to my own. I then went to Google Webmaster Tools and there were no warning messages.
So the big question is: how important are these error scores when it appears that most of the top competition is in the same situation?
I think it's quite possible I could do with a redesign using responsive design.
Best, Alan
-
H tags are also how people with screen readers often navigate. Not including them makes your site even harder to use for someone on such a device than it needs to be. Screen readers read out all the text on a site: navigation, links, etc. (all super fast). Many people using screen readers will navigate by heading tags because it's easier to get to the content.
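As a rough sketch of what that means in practice (the page outline and names here are made up for illustration, not taken from the actual site), a screen-reader-friendly page marks its real topic hierarchy with heading tags:

    <h1>Endeavour Cottage Holidays</h1>
    <h2>Our Cottages</h2>
    <h3>Harbour View Cottage</h3>
    <h3>Garden Cottage</h3>
    <h2>About the Area</h2>
    <!-- A screen reader user can jump from one <h2> to the next -->
    <!-- instead of listening to every line of navigation and body text. -->

The point is that the headings describe the structure of the content, so assistive technology (and search engines) can skip straight to the section a visitor wants.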
-
Yes, that's exactly what I'm looking for: as much competitive advantage as possible.
I totally understand you've got to get a good person for the web design and coding, but I do have such a person, and I spoke to them today.
182 pages have been placed on the sitemap for the site, and Google has indexed 158 of these. The majority are articles about the location which I've written; they are not duplicate content, and I obviously want to keep them online. I will have the property pages redesigned first. The only thing I'm a little lacking in understanding is this: I do use the basic H1, H2, H3 tags in all my articles, including the homepage (I've just checked the HTML), so hopefully I'm not doing too much wrong in that direction. The site does come up with coding errors, but I'm not sure how important they are. I ran the same test on the competition at the very top of Google for most of the short-tail keywords, and they also came up with a very similar number of errors, most of which, unfortunately, I don't really understand.
I'm following suit with you: I will pay to have the coding and design overhauled, as it's definitely worth the money.
-
So the big question is: how important are these error scores when it appears that most of the top competition is in the same situation?
My response to this is a question: are you looking for a competitive advantage?
Nobody knows exactly how Google or any other search engine assesses the code on a page.
I personally hold the belief that my use of <h1>, <h2>, and <h3> is valuable for informing search engines about the primary, secondary, and tertiary topics of importance on my webpage. That is what I believe, and I make a planned effort to present the hierarchy of my document's topics in a way that is marked with <h> designations. So, on that basis alone, I would be doing major work on your website, because there are hundreds of <h> tags in the code of every page on your website: <h> has been used as formatting instead of marking the important elements of each page.
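To make the distinction concrete, here is a sketch of the misuse I am describing versus the fix (the snippets and the class name are illustrative, not copied from your pages):

    <!-- Misuse: a heading tag chosen just to make text big and bold -->
    <h3>Prices from £350 per week</h3>

    <!-- Better: reserve headings for structure; style text with CSS -->
    <p class="price-highlight">Prices from £350 per week</p>
    <style>.price-highlight { font-size: 1.3em; font-weight: bold; }</style>

When headings appear only where a topic genuinely begins, the handful that remain carry real meaning for search engines and screen readers.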
You agree that you want a responsive site, and that involves working in the code. That makes it a good time to improve other parts of the code and the design as well.
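For context, the core of a responsive recode looks roughly like the following (a minimal sketch; the class name is hypothetical, and your developer's actual approach will differ):

    <meta name="viewport" content="width=device-width, initial-scale=1">
    <style>
      /* Single column by default, i.e. on phones */
      .cottage-list { display: block; }
      /* Side-by-side layout once the screen is wide enough */
      @media (min-width: 768px) {
        .cottage-list { display: flex; }
      }
    </style>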
Finally, you want a site that is quick to deliver and easy for browsers to render properly. For that reason you want code that is as light and as simple as possible. All the more reason to have the site recoded.
I am not a code evangelist. My site isn't perfect. But I am just saying that if I were the owner of your site, I would want it recoded from scratch to be as light and as simple as possible. At this moment, I am paying someone to completely redo my main site to improve elements that are less motivating to me than what I see in your code. So, I have put my money where my mouth is on this.
One risk you take in getting the site recoded is that you could hire someone who will not do the job properly or well. So it is really important to get the right person to do the job, or to learn all that is needed to do it right and well yourself.