On-Page Issues with UTF-8 and ISO-8859-1
-
Hi guys,
I hope somebody can help me figure this out. On one of my sites I set the charset to UTF-8 in the Content-Type meta tag. The file itself is also UTF-8 encoded. If I type German special characters like ä, ö, ß and the like, they are displayed as a tilted square with a question mark inside.
If I change the charset to ISO-8859-1 they are displayed properly in the browser, but services like Twitter still have issues and stop "importing" content once they reach one of those special characters.
I would like to avoid having to HTML-encode all on-page content, so my preference would be to use UTF-8.
You can see it in action when you visit this URL for example: http://www.skgbickenbach.de/aktive/1b/artikel/40-minuten-fußball-reichen-nicht_1045?charset=utf-8
Remove the ?charset parameter and the charset is set to ISO-8859-1.
I hope somebody has an answer or can point me in the right direction.
Thanks in advance and have a great day all.
Jan
-
Hi Jan,
I'm following up on old questions. Did Theo's answer solve your problem for you?
-
Wow, good answer!
-
Getting a web page to display your content as true UTF-8 requires everything to be set to the UTF-8 encoding. 'Everything' includes: your database, your database tables, your database fields, your connection from PHP to your database, your header as set by PHP, your header as set by HTML, your content itself, etc.
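As a rough illustration (a minimal sketch assuming PHP with the mysqli extension and a MySQL database; the host, credentials and database name are placeholders), these are the places in code where the encoding is typically declared:
<?php
// 1. HTTP header sent by PHP - must be called before any output.
header('Content-Type: text/html; charset=utf-8');

// 2. Connection from PHP to MySQL - ask MySQL to exchange data in UTF-8.
$db = new mysqli('localhost', 'user', 'password', 'database');
$db->set_charset('utf8');

// 3. HTML header - keep the meta tag consistent with the HTTP header.
echo '<meta http-equiv="Content-Type" content="text/html; charset=utf-8">';
?>
If the HTTP header and the meta tag disagree, browsers follow the HTTP header, so the server/PHP side usually has to be fixed first.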
The following resources were extremely helpful to me when I was switching to UTF-8 (which is by far the better encoding compared to ISO-8859-1):
http://www.phpwact.org/php/i18n/utf-8
http://www.phpwact.org/php/i18n/utf-8/mysql
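On the database side, the tables and text columns themselves also need a UTF-8 character set. A rough sketch of converting an existing table (the table name 'articles' and the connection details are placeholders) looks like this:
<?php
// Convert an existing table's default charset and its text columns to UTF-8.
$db = new mysqli('localhost', 'user', 'password', 'database');
$db->set_charset('utf8');
$db->query('ALTER TABLE articles CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci');
?>
Note that ALTER TABLE ... CONVERT TO also converts the existing data, so take a backup first.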
Bonus tip: make sure your content and files are saved as UTF-8 without a BOM (Byte Order Mark); this will save you lots of trouble later!
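If you are not sure whether a file was saved with a BOM, a quick check along these lines (a sketch; the file path is a placeholder) shows the idea; the BOM is simply the three bytes EF BB BF at the very start of the file:
<?php
// Check a file for a UTF-8 BOM and strip it if present.
$path = 'index.php'; // placeholder path
$content = file_get_contents($path);
if (substr($content, 0, 3) === "\xEF\xBB\xBF") {
    file_put_contents($path, substr($content, 3));
    echo "BOM removed from $path\n";
} else {
    echo "No BOM found in $path\n";
}
?>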
-
Related Questions
-
Redirect 301 issue. I changed my domain name and Google is killing me.
Hello SEO community: I have this problem, and I don't know exactly what to do. I recently changed my domain from uclasificados.net to uclasificados.com. uclasificados.net was a free classified ads site for the USA in Spanish, and it was my most profitable site, so I wanted to convert it to .com because I thought it could get more popular with the .com domain. uclasificados.com was previously a free classified ads website for Colombia, but it was not very popular and had poor traffic, so I moved the Colombian content to uclasificados.co.
Since I changed my domain from uclasificados.net to uclasificados.com I have lost a lot of ranking, and my traffic is getting lower every day. I have already checked the 301 redirections and they are working correctly, but even so I keep getting less traffic and less money. I have also checked both sites' link juice with the Moz tools, and it says that uclasificados.net has the better reputation. So I was wondering if I should change it back and redirect uclasificados.com to uclasificados.net, but I worry that if I do that, I might make things worse. What do you recommend?
Technical SEO | capmartin850
-
Home page canonical issues
Hi, I've noticed I can access/view a client's site's home page using the following URL variations:
http://example.com/
http://example.com/index.html
http://www.example.com/
http://www.example.com/index.html
There's been no preference set in Google WMT, but Google has indexed and features this URL: http://example.com/. However, just to complicate matters, the vast majority of external links point to the 'www' version. Obviously I would like to tidy this up and have asked the client's web development company if they can place 301 redirects on the domains we no longer want to work. I received this reply, but I'm not sure whether it takes care of the duplicate issue: "Understand what you're saying, but this shouldn't be an issue regarding SEO. Essentially all the domains listed are linking to the same index.html page hosted at one location." My question is: do I need to place 301 redirects on the domains we don't want to work, and do I stick with the 'non-www' version Google has indexed and try to change the external links so they point to the 'non-www' version, or go with the 'www' version and set this as the preferred domain in Google WMT? My technical knowledge in this area is limited, so any help would be most appreciated. Regards,
Simon
Technical SEO | simon-145328
-
Too many on-page links vs. UX issue
I am having an issue with many of our pages having too many on-page links. I have gotten many of them below the suggested limit of 100 links per page, and I understand this is not a critical factor for SEO, but my issue is this: many important pages I am trying to optimize are buried at a third level, which is not actually accessible from the home page navigation dropdown due to our outdated CMS. I am trying to decide if we should develop our site to display these pages on hover from the main navigation. This would make a lot of sense since users would find these pages more easily; however, adding this functionality would increase the number of on-page links by a lot more. So in your opinion, would it be worth it to spend the money to have this functionality developed? Or would it end up hurting our SEO standing?
Technical SEO | isret_efront0
-
Development Website Duplicate Content Issue
Hi, We launched a client's website around 7th January 2013 (http://rollerbannerscheap.co.uk). We originally constructed the website on a development domain (http://dev.rollerbannerscheap.co.uk) which was active for around 6-8 months (the dev site was unblocked from search engines for the first 3-4 months, but then blocked again) before we migrated dev --> live.
In late January 2013 we changed the robots.txt file to allow search engines to index the website. A week later I accidentally logged into the DEV website and also changed the robots.txt file to allow the search engines to index it. This obviously caused a duplicate content issue as both sites were identical. I realised what I had done a couple of days later and blocked the dev site from the search engines with the robots.txt file. Most of the pages from the dev site had been de-indexed from Google apart from 3: the home page (dev.rollerbannerscheap.co.uk) and two blog pages. The live site has 184 pages indexed in Google. So I thought the last 3 dev pages would disappear after a few weeks.
I checked back in late February and the 3 dev site pages were still indexed in Google. I decided to 301 redirect the dev site to the live site to tell Google to rank the live site and to ignore the dev site content. I also checked the robots.txt file on the dev site and this was blocking search engines too. But the dev site is still being found in Google wherever the live site should be found. When I do find the dev site in Google it displays this: Roller Banners Cheap » admin dev.rollerbannerscheap.co.uk/ A description for this result is not available because of this site's robots.txt – learn more.
This is really affecting our client's SEO plan and we can't seem to remove the dev site or rank the live site in Google. In GWT I have tried to remove the sub domain. When I visit Remove URLs, I enter dev.rollerbannerscheap.co.uk but it then displays the URL as http://www.rollerbannerscheap.co.uk/dev.rollerbannerscheap.co.uk. I want to remove a sub domain, not a page. Can anyone help please?
Technical SEO | SO_UK0
-
XML Sitemap Issue or not?
Hi Everyone, I submitted a sitemap within Google Webmaster Tools and I got a warning message with 38 issues. Issue: URL blocked by robots.txt. Description: Sitemap contains URLs which are blocked by robots.txt. Example: the ones that were given were URLs that we don't want to be indexed: Sitemap: www.example.org/author.xml Value: http://www.example.org/author/admin/ My issue here is that the number of URLs indexed is pretty low, and I know for a fact that robots.txt blocks aren't good, especially if they block URLs that need to be indexed. Apparently the URLs that are blocked are ones that we don't want indexed, but it doesn't display all the URLs that are blocked. Do you think I have a major problem, or is everything fine? What should I do? How can I fix it? FYI: WordPress is what we use for our website. Thanks
Technical SEO | Tay19860
-
Duplicate content issue with trailing / ?
Hi, I did an SEOmoz Crawl Test and found most pages show up twice, for example: A: www.website.com/index.php/dog/walk B: www.website.com/index.php/dog/walk/ I've checked Google Analytics and 90% of organic search traffic arrives on the URLs with the trailing slash (B). Question 1: Can I assume I have a duplicate content problem? Question 2: Is it best to do 301 redirects from the 'non trailing slash' pages to the 'trailing slash' pages? Question 3: For some reason every web page has '/index.php' in it (see A & B above). No idea why. Should it be an SEO concern? Kind regards and thank you in advance, Nigel
Technical SEO | Richard5550
-
Need help with Joomla duplicate content issues
One of my campaigns is for a Joomla site (http://genesisstudios.com) and when my full crawl was done and I reviewed the report, I had significant duplicate content issues. They seem to come from the automatic creation of /rss pages. For example, http://www.genesisstudios.com/loose is the page, but the duplicate content shows up as http://www.genesisstudios.com/loose/rss. It appears that Joomla creates feeds for every page automatically and I'm not sure how to address the problem they create. I have been chasing down duplicate content issues for some time and thought they were gone, but now I have about 40 more instances of this type. It also appears that even though there is a canonicalization plugin present and enabled, the crawl report shows 'false' for rel=canonical tags. Anyone got any ideas? Thanks so much... Scott
Technical SEO | sdennison0
-
Duplicate Page Issue
Dear All, I am facing a stupid duplicate page issue. My whole site is built with dynamic scripts and all the URLs were dynamic, so I asked my programmer to make the URLs user-friendly using URL Rewrite, but he converted the aspx pages to htm, and the whole mess began. Now we have 3 different URLs for a single page, such as:
http://www.site.com/CityTour.aspx?nodeid=4&type=4&id=47&order=0&pagesize=4&pagenum=4&val=Multi-Day+City+Tours
http://www.tsite.com/CityTour.aspx?nodeid=4&type=4&id=47&order=0&pagesize=4&pagenum=4&val=multi-day-city-tours
http://www.site.com/city-tour/multi-day-city-tours/page4-0.htm
I think my programmer messed up the URL Rewrite in ASP.NET (Nginx), or didn't use it at all. So how do I overcome this problem? Should I add a canonical tag on both dynamic URLs pointing to page4-0.htm? Will it help? Thanks!
Technical SEO | DigitalJungle0