OnPage Issues with UTF-8 and ISO-8859-1
-
Hi guys,
I hope somebody can help me figure this out. On one of my sites I set the charset to UTF-8 in the Content-Type meta tag. The file itself is also saved as UTF-8. But when I type German special characters like ä, ö, ß and the like, they get displayed as a square with a question mark inside (the replacement character).
If I change the charset to ISO-8859-1, they display properly in the browser, but services like Twitter still have issues and stop "importing" content once they reach one of those special characters.
I would like to avoid having to HTML-encode all on-page content, so my preference would be to use UTF-8.
You can see it in action when you visit this URL for example: http://www.skgbickenbach.de/aktive/1b/artikel/40-minuten-fußball-reichen-nicht_1045?charset=utf-8
Remove the ?charset parameter and the charset is set to ISO-8859-1.
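The symptom described (replacement characters under a UTF-8 declaration, correct display under ISO-8859-1) usually means the bytes being served are actually ISO-8859-1, whatever the file was supposed to be. A minimal Python sketch, illustrating the mismatch rather than anything from the site itself:

```python
# "ä" stored as ISO-8859-1 is the single byte 0xE4.
latin1_bytes = "ä".encode("iso-8859-1")

# A browser told to decode those bytes as UTF-8 hits an invalid
# sequence and shows the replacement character U+FFFD -- the
# "square with a question mark" from the question.
print(latin1_bytes.decode("utf-8", errors="replace"))  # �

# The reverse mismatch (real UTF-8 bytes declared as ISO-8859-1)
# produces classic mojibake instead:
utf8_bytes = "ä".encode("utf-8")
print(utf8_bytes.decode("iso-8859-1"))  # Ã¤
```

So the fix is not just flipping the declared charset back and forth, but making the declaration and the actual bytes agree.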
Hope somebody has an answer or can push me into the right direction.
Thanks in advance and have a great day all.
Jan
-
Hi Jan,
I'm following up on old questions. Did Theo's answer solve your problem for you?
-
Wow good answer!
-
Getting a web page to display your content as true UTF-8 requires everything to be set to the UTF-8 encoding. "Everything" includes: your database, your database tables, your database fields, the connection from PHP to your database, your header as set by PHP, your header as set by HTML, your content itself, etc.
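When one of those layers is suspect, a quick sanity check is to verify that the raw bytes it produces actually decode cleanly as UTF-8. A small Python sketch of that check (illustrative only; the answer itself names no tooling):

```python
def is_valid_utf8(data: bytes) -> bool:
    """Return True if the byte string decodes cleanly as UTF-8."""
    try:
        data.decode("utf-8")
        return True
    except UnicodeDecodeError:
        return False

# Bytes that came out of a Latin-1 layer fail the check, because
# e.g. 0xDF ("ß" in ISO-8859-1) is an invalid lead byte position here:
assert not is_valid_utf8("Fußball".encode("iso-8859-1"))
assert is_valid_utf8("Fußball".encode("utf-8"))
```

Running the equivalent check against the database dump, the HTTP response body, and the template files separately narrows down which layer is still on the old encoding.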
The following resources were extremely helpful to me when I was switching to UTF-8 (which is by far the better encoding compared to ISO-8859-1):
http://www.phpwact.org/php/i18n/utf-8
http://www.phpwact.org/php/i18n/utf-8/mysql
Bonus tip: make sure your content and files are saved as UTF-8 without a BOM (Byte Order Mark); this will save you lots of trouble later!
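The BOM problem is easy to reproduce. A hedged Python sketch of what "with BOM" actually does to a file, and why it bites later:

```python
# Saving a file "as UTF-8 with BOM" prepends the bytes EF BB BF.
text = "\ufeff" + "Hallo"          # U+FEFF is the BOM code point
data = text.encode("utf-8")
print(data[:3])                     # b'\xef\xbb\xbf'

# A consumer reading the file as plain UTF-8 keeps the BOM as an
# invisible leading character -- which, in a PHP context like the
# one in this thread, counts as output sent before any header() call.
decoded = data.decode("utf-8")
print(repr(decoded[0]))             # '\ufeff'

# Python's 'utf-8-sig' codec strips the BOM on read:
print(data.decode("utf-8-sig"))     # Hallo
```

Saving without the BOM in the first place, as the answer recommends, avoids the whole class of problems.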