Increase in 404 errors after changing encoding
-
Hello,
We just launched a new version of our website with a new UTF-8 encoding.
The thing is, we use commas as separators in our URLs, and since the new website went live I've seen a massive increase in 404 errors on comma-encoded URLs.
Here is an example :
http://web.bons-de-reduction.com/annuaire%2C321-sticker%2Csite%2Cpromotions%2C5941.html
instead of :
http://web.bons-de-reduction.com/annuaire,321-sticker,site,promotions,5941.html
I checked with Screaming Frog SEO Spider and Xenu, but I can't manage to find any encoded URLs.
Does anyone have a clue how to fix this?
Thanks
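For context, `%2C` is simply the percent-encoded form of a comma. A minimal sketch with Python's standard library, using the example path above, showing how the two URL forms relate:

```python
from urllib.parse import quote, unquote

path = "/annuaire,321-sticker,site,promotions,5941.html"

# quote() percent-encodes reserved characters; commas become %2C
encoded = quote(path, safe="/")
print(encoded)  # /annuaire%2C321-sticker%2Csite%2Cpromotions%2C5941.html

# unquote() reverses it, so a crawler or template that encodes an
# already-usable path produces the %2C variant seen in the 404 logs
assert unquote(encoded) == path
```

If the 404s show the `%2C` form, something between URL generation and the crawler is encoding the commas a second time.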
-
I will take a look at it, but that isn't the issue SEOmoz is reporting to me, since that format only concerns images. It's actually a little trick we use to lazy-load images.
The link you pointed out in your example is fine ("/annuaire,amkashop,site,promotion...") as the commas are not encoded.
And in your example I see no issue except capitalization.
I bet this is a Moz problem, because when I fetch as Googlebot, I don't find any encoded URLs...
-
Just wanted to give you one more thing that I think would help: http://www.w3schools.com/html5/att_meta_charset.asp
I believe you should clean up your encoding, and that it will not be a big deal.
Sincerely,
Tom
-
I thought this might help as well, because you do have to clean up your source code.
The online quoted-printable encoder tool first encodes the input text in either UTF-8 or ISO-8859-1. The characters are then output according to this schema:
| Character | Result | Comment |
| --- | --- | --- |
| "=" (0x3D) | =3D | Special handling of the equal sign |
| " " (0x20) to "~" (0x7E) | Unmodified | Printable ASCII (7 bits) |
| Any other | =XX | Hexadecimal character code |

Since quoted-printable does not in itself specify the text character encoding, it is important to specify this correctly when it is used. The online quoted-printable decoder tool attempts to auto-detect the text encoding.
See the Wikipedia article on quoted-printable for more info.
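That schema can be reproduced with Python's standard `quopri` module (a sketch of quoted-printable in general, not of the specific online tool):

```python
import quopri

# A string containing an equal sign and a non-ASCII character
text = "café = coffee"

# Encode the UTF-8 bytes as quoted-printable:
# '=' becomes =3D, and 'é' becomes its UTF-8 bytes =C3=A9
encoded = quopri.encodestring(text.encode("utf-8"))

# Decoding requires knowing the original charset (UTF-8 here);
# quoted-printable itself does not record it
decoded = quopri.decodestring(encoded).decode("utf-8")
assert decoded == text
```

Note that decoding the same bytes as ISO-8859-1 instead of UTF-8 would produce mojibake, which is why declaring the charset correctly matters.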
-
I would use a tool similar to this: http://www.percederberg.net/tools/text_converter.html
As you can see, the links for your GIF photos are encoded as "data:image/gif;base64".
Please give it a try and tell me if that helps.
Sincerely,
Thomas
-
Hello, and thanks for your answer.
No Word involved here.
We moved from:
<meta http-equiv="content-type" content="text/html; charset=iso-8859-1" />
to:
<meta charset="utf-8">
Everything is fine except for Mozbot.
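Worth noting as an aside: a comma is plain ASCII (byte 0x2C), which is identical in ISO-8859-1 and UTF-8, so the charset migration by itself cannot turn `,` into `%2C`. That substitution is percent-encoding applied when the URL is written out:

```python
from urllib.parse import quote

# The comma (and the whole ASCII range) decodes identically
# under both the old and the new charset
raw = b"/annuaire,321-sticker,site,promotions,5941.html"
assert raw.decode("iso-8859-1") == raw.decode("utf-8")

# %2C can therefore only come from percent-encoding applied
# at URL-generation (or crawling) time, not from the charset
assert quote(",") == "%2C"
```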
-
What you need to do is go into your site and clean up the links that were converted and messed up because of the change. Once you clean them up, you will have no problem. This is what your links look like:
data:image/gif;base64,R0lGODlhAQABAIAAAP///////yH+A1BTQQAsAAAAAAEAAQAAAgJEAQA7
UTF-8 is definitely the right encoding, and that's very good; you just have to go in and clean things up by looking at your source code:
"
| {"m":2571,"a":"wrap"}" width="108" height="65" data-original="/upload/merchants_logo/108-65/amkashop.jpeg" src="data:image/gif;base64,R0lGODlhAQABAIAAAP///////yH+A1BTQQAsAAAAAAEAAQAAAgJEAQA7"> <noscript></span><img data-merchant="2571" class="merchantLogo lazy" data-out="{"m":2571,"a":"wrap"}" width="108" height="65" src="/upload/merchants_logo/108-65/amkashop.jpeg" alt="Amkashop"><span></noscript> |
| |<a <span="">href</a><a <span="">="</a>/annuaire,amkashop,site,promotions,2685.html" title="Amkashop">Code promo Amkashop"
|
I hope I've been of help to you.
Thomas
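Those `data:image/gif;base64` values are not corruption from the charset change; they are the inline placeholder images used by the lazy-loading trick mentioned earlier. A quick sketch decoding the placeholder above with Python's standard library shows it is just a 1×1 GIF:

```python
import base64
import struct

data_uri = ("data:image/gif;base64,"
            "R0lGODlhAQABAIAAAP///////yH+A1BTQQAsAAAAAAEAAQAAAgJEAQA7")

# Split the data URI into its media-type header and base64 payload
header, payload = data_uri.split(",", 1)
gif = base64.b64decode(payload)

# GIF signature, then width and height as little-endian 16-bit ints
assert gif[:6] == b"GIF89a"
width, height = struct.unpack("<HH", gif[6:10])
print(width, height)  # 1 1
```

So the `src` holds a tiny transparent placeholder and the real image path sits in `data-original`; cleaning these up is a design decision, not an encoding fix.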
-
Did you happen to write it using Microsoft Word and paste the content in? Or are you talking about a website that you converted from another encoding to UTF-8?
Sincerely,
Thomas