How do I fix 608s, please?
-
Hi,
I'm on the free trial and finding it very useful. I've fixed all my 301s, but now I have a load of 608s, and I don't know what they are!
I feel like I've cured herpes only to get gonorrhea! Can anyone help? I have 41 608s, which is more than the 301s I had. I hope they're unrelated!
I won't bore you with the whole list, but some of the URLs are:
Error Code 608: Page not Decodable as Specified Content Encoding
http://sussexchef.com/catering-at-mr-mrs-currys-50th-wedding-anniversary/guestsarrive----608
Error Code 608: Page not Decodable as Specified Content Encoding
http://sussexchef.com/funeral-catering/picture4-2----608
-
Weird... yeah, under normal operations, output buffering shouldn't be on. There probably are legitimate uses for it for more complex sites, but not as a default option.
-
Hello folks
I read this post when I had this 608 issue, but I couldn't figure out what the problem was.
So I'm here to share what my problem was. My site runs on PHP, and output buffering was turned on with ob_start().
Since I wasn't really using it, I removed it (and any other output buffering) and the problem stopped. Hope it helps!
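For anyone hitting the same thing, here's a hypothetical sketch of the kind of setup being described. The exact cause will vary by host, but if both PHP and the web server try to handle compression, or a plugin strips or rewrites the header, the body can end up not matching the Content-Encoding header the crawler sees:

```php
<?php
// Illustrative only -- the problematic line and the fix described above.

// Problematic: PHP-level gzip (ob_gzhandler) on top of, or out of sync
// with, server-level compression (Apache mod_deflate, nginx gzip, etc.):
// ob_start('ob_gzhandler');

// If you don't actually need buffering, simply delete the ob_start() call.
// If you do need it, buffer without compressing and let the web server
// handle gzip on its own:
ob_start();

echo '<html><body>page content...</body></html>';

ob_end_flush();
```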
-
Hi All
So after calling GoDaddy a second time, I got through to a guy who suggested I go into WordPress, open the Permalinks settings, select a setting other than the one that was already selected and save, then switch back to the previous setting and save again.
It fixed all the problems instantly.
I just thought I'd share this in case anyone else has this problem and searches the forum for 608s in the future.
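(If you're curious why that works: re-saving permalinks makes WordPress rebuild its rewrite rules and, on Apache, rewrite its rules block in .htaccess. Here's a purely illustrative sketch of triggering the same flush in code; don't leave it active, since flushing on every request is expensive.)

```php
<?php
/**
 * Plugin Name: One-off rewrite flush (illustrative only)
 *
 * Dropped temporarily into wp-content/mu-plugins/, this does roughly what
 * re-saving the permalink settings does: it rebuilds WordPress's rewrite
 * rules and, on Apache, rewrites the rules block in .htaccess. Remove the
 * file after one page load.
 */
add_action('init', function () {
    flush_rewrite_rules();
});
```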
All the best and thanks for your help
Ben
-
Hi
Yes, the site is behaving really badly in Internet Explorer and Firefox; Chrome seems to sort it out almost instantly, though.
-
Thanks Dr Pete.
I'm toying with the idea of moving to a new theme, so I may leave it for now. Another issue is my sitemaps: I'm using plugins to generate them, but Google Webmaster Tools doesn't like them. I guess that's something for me to research further.
Thanks again
-
I will add that it's entirely possible that this is a minor, if odd, problem, and Google is crawling the pages fine. You seem to be indexed properly. Fixing it is a nice-to-have, but I doubt it would be worth a big investment unless you've got other issues that need fixing.
-
Do you know what kind of hosting you're running with GoDaddy? Is it Apache, Windows, etc.? I used to do some hosting with them, and I'm trying to remember where that would be set. It depends completely on the web server, though.
-
Hi Dr Pete
It's amazing how much advice you get on the Moz forum. I basically ditched my developer and subscribed to Moz instead. I called GoDaddy, but they couldn't recreate the problem at their end; I even emailed them your reply and they still couldn't help.
I'll take a look around the server settings soon and see if I can figure it out. If I can't, can anyone recommend a web developer? The last two I've had have moved on to other things.
Thank you all for your help so far, it's most kind of you!
Ben
-
Unfortunately, this is a server-side issue, so the fix is completely different depending on your setup. Basically, the server is most likely trying to compress your pages (using something like gzip), and the settings are probably wrong, so the final encoding isn't quite right.
At first, I was going to say that our crawler might just be finicky on this one, but when I try to load these pages on Google Chrome, I get a temporary error, after which the page loads. This definitely could be causing you some problems.
I tried to check out your setup with BuiltWith, but it's actually choking on the Gzip errors, too:
http://builtwith.com/sussexchef.com
Step 1 might be to just shut the compression/encoding off, and then try to work out the settings. You're probably going to have to pull in your hosting company and/or developer.
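If it helps to see the problem directly, here's a rough diagnostic sketch (PHP with the cURL extension, URL taken from the report above) that does essentially what our crawler does: request the page with compression allowed, keep the raw body, and check whether it actually decodes as the Content-Encoding header claims:

```php
<?php
// Rough diagnostic: request the page with gzip/deflate allowed, keep the
// raw (still-encoded) body, and check whether it decodes as the
// Content-Encoding header claims -- roughly the check behind a 608.
$url = 'http://sussexchef.com/';

$ch = curl_init($url);
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HEADER         => true,
    // Ask for compression ourselves so cURL does NOT auto-decompress.
    CURLOPT_HTTPHEADER     => ['Accept-Encoding: gzip, deflate'],
]);
$response   = curl_exec($ch);
$headerSize = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
curl_close($ch);

$headers = substr($response, 0, $headerSize);
$body    = substr($response, $headerSize);

if (preg_match('/^Content-Encoding:\s*(\S+)/mi', $headers, $m)) {
    $encoding = strtolower(trim($m[1]));
    if ($encoding === 'gzip') {
        $decoded = @gzdecode($body);
    } elseif ($encoding === 'deflate') {
        // Some servers send raw deflate, others zlib-wrapped data.
        $decoded = @gzinflate($body);
        if ($decoded === false) {
            $decoded = @gzuncompress($body);
        }
    } else {
        $decoded = $body;
    }
    echo $decoded === false
        ? "Body does NOT decode as the declared '$encoding' encoding (608-style failure)\n"
        : "Body decodes fine as '$encoding'.\n";
} else {
    echo "Response was not compressed at all.\n";
}
```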
-
Hi. Yeah, us Brits have a way of getting our point across.
Thanks for your response. Moz says:
608: Home page not decodable as specified Content-Encoding
The server response headers indicated the response used gzip or deflate encoding, but our crawler could not understand the encoding used. To resolve 608 errors, fix your site server so that it properly encodes the responses it sends.
The problem is that I don't understand the issue, so I have no idea how to fix it!
-
Hey SussexChef83!!
LOL "I feel like I've cured herpes only to get gonorrhea!" THAT is some funny stuff!!
Check out this article from Moz about HTTP errors in Crawl Reports.
http://moz.com/help/guides/search-overview/crawl-diagnostics/errors-in-crawl-reports
Not sure if this provides any real help or just diagnoses the gonorrhea in detail. Basically, what I gather is that your site needs to encode the responses it sends differently from the way it's currently set up. Hope this helps!!
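To make that a little more concrete: whichever layer does the compressing (PHP, Apache's mod_deflate, or a caching plugin) has to compress the body exactly once and label it with a matching Content-Encoding header. Here's a hand-rolled sketch of that contract, purely for illustration; normally the web server or ob_gzhandler takes care of it for you:

```php
<?php
// Illustration of the contract behind the 608 message: the bytes sent must
// really be in the encoding the Content-Encoding header declares, and that
// encoding should only be used when the client asked for it.
$html = '<html><body>Hello</body></html>';

$acceptsGzip = isset($_SERVER['HTTP_ACCEPT_ENCODING'])
    && strpos($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip') !== false;

if ($acceptsGzip) {
    $body = gzencode($html);            // actually gzip the body...
    header('Content-Encoding: gzip');   // ...and declare it, exactly once
    header('Vary: Accept-Encoding');
} else {
    $body = $html;                      // otherwise send it as plain HTML
}

header('Content-Length: ' . strlen($body));
echo $body;
```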