Adding Meta Language tag to XHTML site - coding help needed
-
I've had my site dinged by Google and believe it's likely due to several quality issues, which I'm hunting down.
One of Bing's Webmaster SEO tools said my XHTML pages (which were built in 2007) are missing a Meta Language declaration, and suggested adding a tag in the head or an attribute on the html tag.
Wanting to "not mess anything up" and validate correctly, I read on W3C's site, which said: "Always add a lang attribute to the html tag to set the default language of your page. If this is XHTML 1.x you should also use the xml:lang attribute (with the same value). Do not use the meta element with http-equiv set to Content-Language."
My current html reads like:
QUESTION:
I'm confused about how to add the Meta Language to my website given my current coding, as I'm not a coder. Can you suggest whether I should add this content-language info, and if so, what is the best way to do so, considering valid W3C markup for my document type?
Thank you!!!
Michelle -
thank you!
-
Yeah, I don't think you'll go wrong with "en". Glad to help - hope that answers your question!
-
Thanks again George. So, I guess "en" or "en-us" is ok. Most of our customers are in the US by far, but we also have a smaller percent in Australia, Canada and the UK. But they all speak English.
That being said, maybe "en" is best?
Michelle
-
At least with Google, I doubt it makes a difference unless there are multiple languages on a page. If you use Chrome you'll see it auto-detects the language and offers to translate. It may only rank the page in a specific country or locale though. If you're aiming at Spanish speakers in the UK, it may be a little different.
-
Hi Michelle, "ll-cc" stands for language code plus country code. So in the case of English, you can use "en-us" for US English or "en-gb" for British English. I don't believe case matters (I have seen "en-US" and "en-GB" used too).
For your question, yes you can use: <html xmlns="http://www.w3.org/1999/xhtml" lang="en">
You could also use:
Either one will work fine :). Which language are you targeting?
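For reference, the two approaches being compared here are presumably along these lines (the first is the html-tag attribute Michelle quotes back later in the thread, shown with the xml:lang addition W3C recommends for XHTML 1.x; the second is the meta element Bing's tool refers to, which the W3C guidance quoted in the question advises against):

```html
<!-- Option 1: declare the language on the html tag (W3C-recommended for XHTML 1.x) -->
<html xmlns="http://www.w3.org/1999/xhtml" lang="en" xml:lang="en">

<!-- Option 2: the meta element alternative (valid, but discouraged by W3C) -->
<meta http-equiv="Content-Language" content="en-us" />
```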
Here is some more reading from w3.org that seems more up-to-date, though I think you would be fine using one of the above methods.
-
The Bing Webmaster Central article where they discuss how to set the language for your pages is here.
-
Hi George,
Thanks for your prompt reply - and I agree, I'm sure this isn't a big factor, but when reports say things are "wrong", I'm trying to fix them for overall improvement. I noticed W3C says: "Always add a lang attribute to the html tag to set the default language of your page. If this is XHTML 1.x you should also use the xml:lang attribute (with the same value)."
So, is it best practice to add it (xml:lang) to this tag you suggested (if applicable to my document)?
<html xmlns="http://www.w3.org/1999/xhtml" lang="en">
And is "en" preferred over "ll-cc"?
Thanks again,
Michelle -
This is likely what you are looking for, but I don't think this is causing you any SEO problems.
Related Questions
-
Null Alt Image Tags vs Missing Alt Image Tags
Hi, would it be better for organic search to have a null alt image tag programmatically added to thousands of images without alt image tags, or just leave them as is? The option of adding tailored alt image tags to thousands of images is not possible. Is having sitewide alt image tags really important to organic search overall, or what? Right now, probably 10% of the site's images have alt img tags. A huge number of those images are pages that aren Thanks!
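For reference, the "null alt" option being weighed here is an empty alt attribute rather than a missing one (the filename below is just a placeholder):

```html
<!-- Null (empty) alt: signals a decorative image to crawlers and screen readers -->
<img src="product-photo.jpg" alt="" />

<!-- Missing alt entirely: fails validation, and assistive tech may fall back to the filename -->
<img src="product-photo.jpg" />
```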
Intermediate & Advanced SEO | | 945010 -
Part of my site does not show the correct Meta title
Hi, our website meta title in the directory section is showing the same title; it does not show the page title. We have tried turning off all plugins, reinstalling the theme, creating a new .htaccess file, installing Yoast, and testing with All in One SEO, but the same thing still happens. We tried different themes with the same results, but when we test with Twenty Thirteen it is OK. Completely lost and would love some help. Thanks in advance
Intermediate & Advanced SEO | | Taiger0 -
Open Site Explorer - Spam analysis: need help with inbound links... from my site!
Hello, reading my spam analysis report from Open Site Explorer, I found something I don't understand (please see attached image): the long list of links inside the red rectangle are inbound links with a spam score of 5 coming from my own site. How is that possible? Should I remove those links? Also, I see that many of those links appear in the top navigation bar (about page, home page, service description, etc.) or in the sidebar section of the website (categories, recent posts, recent comments). Should I treat them differently? Thank you for your time.
Intermediate & Advanced SEO | | micvitale0 -
Spammy sites that link to a site
Hello, what is the best and quickest way to identify spammy sites that link to a website, and then remove them (Google disavow?)? Thank you, dear Moz community - I appreciate your help 🙂 Sincerely, Vijay
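For context, the disavow option mentioned above uses a plain-text file uploaded through Google's disavow tool; a minimal sketch, with placeholder domains:

```text
# Disavow an entire spammy domain
domain:spammy-directory.example

# Disavow a single page rather than the whole domain
http://forum.example/profile/spam-link
```

The usual advice is to attempt removal by contacting site owners first and treat disavow as a last resort.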
Intermediate & Advanced SEO | | vijayvasu0 -
Consistent Ranking Jumps Page 1 to Page 5 for months - help needed
Hi guys and gals, I have a really tricky client who I just can't seem to gain consistency with in their SERP results. The keywords are competitive, but the main issue I have is the big page jumps that happen pretty much on a weekly basis. We go up and down 40 positions, and this behaviour has been going on for nearly 6 months.
Intermediate & Advanced SEO | | Jon_bangonline
I felt it would resolve itself in time but it has not. The website is a large ecommerce website. Their link profile is OK: several high-quality newspaper publication links, majority brand-related anchor texts, and the link building we have engaged in has all been very good, i.e. content-relevant, high-quality places. See below for some potential causes I think could be the reason:
1. The on-page SEO is good, however the way their ecommerce website is set up, they have formed a substantial amount of duplicate title tags. So in my opinion this is a potential cause.
2. The previous web developer set up 301 redirects to the home page for any 404 errors. I know best practice is to redirect to the most relevant pages; could this be a potential issue?
3. We had some server connectivity issues show up in Webmaster Tools, but that was for 1 day about 4 months ago. Since then, no issues.
4. They have quite a few blocked URLs in their robots.txt file, e.g. Disallow: /login, Disallow: /checkout/, but to me these seem normal and not a big issue.
5. We have seen a decrease over the last 12 months in Webmaster Tools of total indexed web pages from 5000 to 2000, which is quite an odd statistic.
Summary: So all in all I am a tad stumped. We have some duplicate content issues in title tags, and perhaps are not following best practice with the 301 redirects, but other than that I don't see any major on-page issues, unless I am missing something in the seriousness of what I have listed.
Finally, we have also done a bit of a cull of poor-quality links, requesting links to be removed and also submitting a disavow of some really bad links. We do not have a manual penalty though. Thoughts, feedback or comments VERY welcome.
-
Site Search Results in Index -- Help
Hi, I made a mistake on my site. Long story short, I have a bunch of search results pages in the Google index. (I made a navigation page full of common search terms, with internal links to a respective search results page for each common search term.) Google crawled the site, saw the links, and now those search results pages are indexed. I made versions of the indexed search results pages into proper category pages with good URLs and am ready to go live / replace the pages and links. But I am a little unsure how to do it and what the effects can be:
1. Will there be duplicate content issues if I just replace the bad search results links/URLs with the good category page links/URLs on the navigation page? (Is a short-term risk worth it?)
2. Should I get the search results pages de-indexed first and then relaunch the navigation page with the correct category URLs?
3. Should I add a robots.txt disallow directive for search results?
4. Should I use Google's URL removal tool to remove those indexed search results pages for a quick fix, or will this cause more harm than good?
Time is not the biggest issue; I want to do it right, because those indexed search results pages do attract traffic and the navigation page has been great for usability. Any suggestions would be great. I have been reading a ton on this topic, but maybe someone can give me more specific advice. Thanks in advance; hopefully this all makes sense.
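On the robots.txt option raised above, a disallow rule for internal search results might look like this (the /search/ path and s= parameter are assumptions; they depend on how the site's search URLs are actually structured):

```text
User-agent: *
# Block crawling of internal search results pages
Disallow: /search/
Disallow: /*?s=
```

Note that robots.txt only blocks crawling; URLs that are already indexed will not necessarily drop out just because of a new disallow rule.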
Intermediate & Advanced SEO | | IOSC1 -
PDF on financial site that duplicates ~50% of site content
I have a financial advisor client who has a downloadable PDF on his site that contains about 9 pages of good info. The problem is, much of the content can also be found on individual pages of his site. Is it best to noindex/follow the PDF? It would be great to let the few pages of original content be crawlable, but I'm concerned about the duplicate content aspect. Thanks --
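Since a PDF has no HTML head to hold a meta robots tag, the usual way to apply noindex/follow to it is the X-Robots-Tag HTTP header; an Apache .htaccess sketch, assuming mod_headers is enabled:

```apache
<FilesMatch "\.pdf$">
  # Keep the PDF out of the index but let link equity flow
  Header set X-Robots-Tag "noindex, follow"
</FilesMatch>
```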
Intermediate & Advanced SEO | | 540SEO0 -
Have completed keyword analysis and on page optimization. What else can I do to help improve SERP ranking besides adding authoritative links?
Looking for concrete ways to continue to improve SERP results. Thanks!
Intermediate & Advanced SEO | | casper4340