Increase in 404 errors after change of encoding
-
Hello,
We have just launched a new version of our website with UTF-8 encoding.
The thing is, we use commas as separators in our URLs, and since the new website went live, I have seen a massive increase in 404 errors on comma-encoded URLs.
Here is an example:
http://web.bons-de-reduction.com/annuaire%2C321-sticker%2Csite%2Cpromotions%2C5941.html
instead of:
http://web.bons-de-reduction.com/annuaire,321-sticker,site,promotions,5941.html
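For reference, `%2C` is simply the percent-encoded form of a comma (0x2C), so the failing URLs are the same paths with the separator escaped. A minimal check in Python (standard library only, using the two URLs from the example above):

```python
from urllib.parse import unquote

# The 404ing URL: the commas in the path were percent-encoded somewhere
# in the template or link-generation code.
bad = "http://web.bons-de-reduction.com/annuaire%2C321-sticker%2Csite%2Cpromotions%2C5941.html"
good = "http://web.bons-de-reduction.com/annuaire,321-sticker,site,promotions,5941.html"

# %2C is just the hex character code of ","
assert unquote("%2C") == ","

# Decoding the percent-escapes recovers the canonical URL, which is what
# a 301 redirect rule on the server would need to target.
assert unquote(bad) == good
```

This also suggests a stopgap while the templates are fixed: redirect any request whose decoded path matches an existing page to the literal-comma URL.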
I checked with Screaming Frog SEO Spider and Xenu, and I can't find any encoded URLs on the site itself.
Does anyone have a clue on how to fix this?
Thanks
-
I will take a look at it, but it's not the issue SEOMoz is reporting to me, as this format only concerns images. It's actually a little trick we use to lazy-load images.
The link you pointed out in your example is good ("/annuaire,amkashop,site,promotion...) as the commas are not encoded.
And in your example I see no issue except capitalization.
I bet this is a Moz problem, because when I fetch as Googlebot, I don't find any encoded URLs...
-
Just wanted to give you one more thing that I think would help: http://www.w3schools.com/html5/att_meta_charset.asp
I believe you should clean up your encoding, and that it will not be a big deal.
Sincerely,
Tom
-
I thought this might help as well, because you do have to clean up your source code.
The online quoted-printable encoder tool first encodes the input text in either UTF-8 or ISO-8859-1. The characters are then output according to this schema:
| Character | Result | Comment |
| --- | --- | --- |
| "=" (0x3D) | =3D | Special handling of the equal sign |
| " " (0x20) to "~" (0x7E) | Unmodified | Printable ASCII (7 bits) |
| Any other | =XX | Hexadecimal char code |

Since quoted-printable does not in itself specify the text character encoding, it is important to specify this correctly when used. The online quoted-printable decoder tool attempts to auto-detect the text encoding.
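The schema above is exactly what Python's standard `quopri` module implements, so you can reproduce the tool's behaviour locally if you want to sanity-check a string. A small sketch (standard library only):

```python
import quopri

# "=" (0x3D) gets the special =3D treatment; printable ASCII passes through.
print(quopri.encodestring(b"a=b"))

# Any byte outside 0x20-0x7E becomes =XX. Here "é" in ISO-8859-1
# is the single byte 0xE9.
print(quopri.encodestring("café".encode("latin-1")))

# Decoding reverses the =XX escapes, but the text encoding itself must
# still be supplied separately, exactly as the note above says.
assert quopri.decodestring(b"a=3Db") == b"a=b"
assert quopri.decodestring(b"caf=E9").decode("latin-1") == "café"
```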
See the Wikipedia article on quoted-printable for more info.
-
I would use a tool similar to this: http://www.percederberg.net/tools/text_converter.html
As you can see, the links for your GIF images are encoded as "data:image/gif;base64".
Please give it a try and tell me if that helps.
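(Worth noting: those base64 `src` attributes are not mangled text, they are valid inline placeholder images. You can confirm what the string actually contains with a couple of lines of Python, using the payload quoted from the page source:)

```python
import base64

# The placeholder from the site's lazy-loading <img> tags.
payload = "R0lGODlhAQABAIAAAP///////yH+A1BTQQAsAAAAAAEAAQAAAgJEAQA7"

# Pad defensively to a multiple of 4 chars before decoding.
data = base64.b64decode(payload + "=" * (-len(payload) % 4))

# It is a perfectly valid tiny GIF header, not broken encoding.
assert data[:6] == b"GIF89a"
print(len(data))  # size in bytes of the placeholder image
```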
Sincerely,
Thomas
-
Hello and thanks for your answer.
No, Word isn't involved here.
We moved from:
<meta http-equiv="content-type" content="text/html; charset=iso-8859-1" />
to:
<meta charset="utf-8">
Everything is fine except for Mozbot.
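For anyone reading later: after a template change like this, the quickest way to see whether the encoded commas come from your own markup (rather than from a crawler quirk) is to scan the rendered HTML for `%2C` inside hrefs. A rough sketch, where the HTML sample is made up for illustration:

```python
import re

# Hypothetical rendered page, standing in for the real site's HTML.
html = '''
<meta charset="utf-8">
<a href="/annuaire,321-sticker,site,promotions,5941.html">ok</a>
<a href="/annuaire%2C321-sticker%2Csite%2Cpromotions%2C5941.html">bad</a>
'''

# Any href containing a percent-encoded comma is a link that will 404
# if the server only routes the literal-comma form.
suspect = re.findall(r'href="([^"]*%2[Cc][^"]*)"', html)
print(suspect)
```

If this comes back empty on the live pages, the encoded URLs were most likely produced by the crawler, not by the site.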
-
What you need to do is go into your site and clean up the links that were converted and messed up because of the change. Once you clean them up, you will have no problem. This is what your links look like:
data:image/gif;base64,R0lGODlhAQABAIAAAP///////yH+A1BTQQAsAAAAAAEAAQAAAgJEAQA7
UTF-8 is definitely the right encoding, and that's very good; you just have to go in and clean things up by looking at your source code:
<img data-merchant="2571" class="merchantLogo lazy" data-out='{"m":2571,"a":"wrap"}' width="108" height="65" data-original="/upload/merchants_logo/108-65/amkashop.jpeg" src="data:image/gif;base64,R0lGODlhAQABAIAAAP///////yH+A1BTQQAsAAAAAAEAAQAAAgJEAQA7">
<noscript><span><img data-merchant="2571" class="merchantLogo lazy" data-out='{"m":2571,"a":"wrap"}' width="108" height="65" src="/upload/merchants_logo/108-65/amkashop.jpeg" alt="Amkashop"></span></noscript>
<a href="/annuaire,amkashop,site,promotions,2685.html" title="Amkashop">Code promo Amkashop</a>
I hope I've been of help to you.
Thomas
-
Did you happen to write it using Microsoft Word and paste the content in? Or are you saying you converted the website from another encoding to UTF-8?
Sincerely,
Thomas