4xx fix
-
Hi
I have quite a lot of 4xx errors on our site. The 4xx errors occurred because I cleaned up poor URLs that had commas etc. in them, so it's the old URLs that now return a 4xx. There are no links to the URLs that 4xx. What is the best way of rectifying this issue of my own making?!
Thanks
Gavin -
OK, thanks Dean. I'll update the sitemap and look into rectifying the errors identified by Screaming Frog.
Thanks for your assistance!
-
No, I would recommend that you fix the underlying issue. I can see from your sitemap that you still have the URLs with commas in them.
Personally, I would use Screaming Frog (screamingfrog.co.uk) to find your crawl errors, as you will not need to wait a week for the next report.
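For a quick self-serve check, a short script can list which sitemap entries still contain commas. This is a sketch: the inline sample sitemap and the example.com URLs are placeholders, and in practice you would fetch the site's real /sitemap.xml instead.

```python
import xml.etree.ElementTree as ET

# Namespace used by standard sitemaps.org sitemap files
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def urls_with_commas(sitemap_xml: str) -> list[str]:
    """Return every <loc> URL in a sitemap document that still contains a comma."""
    root = ET.fromstring(sitemap_xml)
    locs = (loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc") if loc.text)
    return [url for url in locs if "," in url]

# Inline sample for illustration; in practice, fetch the live /sitemap.xml
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/widgets/red,blue</loc></url>
  <url><loc>https://example.com/widgets/red-blue</loc></url>
</urlset>"""

print(urls_with_commas(sample))  # only the first URL still has a comma
```

Running this against the real sitemap would show at a glance whether the comma URLs have actually been removed, without waiting for the next crawl report.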
-
I was waiting for the next crawl, as I thought the 4xx errors would be removed from the crawl diagnostics; however, I received a new crawl report today and they are still listed in the report.
I think the simplest way to remove the 4xx errors would be to create 301s for the URLs. Would you agree? -
So since you tidied the URLs, has Moz crawled your site again, or are you waiting for the next crawl?
-
Where are you seeing the errors being reported? If you have corrected the problem with the error URLs and there are no links to these URLs, then there should not be a problem.
If, however, you are seeing these URLs in the search results, then yes, a 301 redirect would be appropriate.
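A minimal sketch of such a 301 in an Apache .htaccess file, using hypothetical paths (the real old/new URL pairs would come from the sitemap or a Screaming Frog export, and the generic rule assumes the cleanup simply removed the commas):

```apache
# Sketch (hypothetical paths): map an old comma URL to its cleaned replacement
Redirect 301 "/products/red,-blue-widgets" "/products/red-blue-widgets"

# Or strip commas generically. Each pass removes one comma, so a URL with
# several commas will hop through a short chain of 301s before settling.
RewriteEngine On
RewriteRule "^(.*),(.*)$" "/$1$2" [R=301,L]
```

Either approach returns a 301 instead of a 4xx, so crawlers and any stray visitors land on the clean URL.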
Related Questions
-
Keyword research, creating copy, fixing on-page optimisation - what next?
Hello - Wondered if I could get people's thoughts. We/I have started working on a client's website to improve everything - a general overhaul across SEO, on-page optimisation etc. I'm relatively new to this, although picking things up and learning on the job, which is great, and Moz is so helpful! So far we have conducted a review of the website, created a large list of keywords and analysed these, started overhauling the copy and adding the new keywords within this, and have plans to overhaul the other elements of the site (headings, tags etc.) and improve the design, functionality and customer journey through the website. My question is: where do I go from here in terms of keywords and SEO? Is it a case of plugging in the keywords we've researched, watching how they perform, and then switching things up with different keywords if they aren't performing as well as we expected? Is it really a lot of trial and error, or is there an exact science behind it that I'm missing? I just feel a little as though we've pulled these keywords out of thin air to a degree, and are adding them into our copy because the numbers on Moz show they should perform well, and they are what we are trying to promote on the website. But I don't know if this is right?! Perhaps I'm over-thinking it...
Technical SEO | WhitewallGlasgow -
My site was hacked and spammy URLs were injected that pointed outward. The issue was fixed, but GWT is still reporting more of these links.
Excuse me for posting this here; I wasn't having much luck going through GWT support. We recently moved our eCommerce site to a new server, and in the process the site was hacked. Spammy URLs were injected, all of which were pointing outwards to some spammy eCommerce retail stores. I removed ~4,000 of these links, but more continue to pile in; there are now over 20,000 of these links. Note that our server support team does not see these links anywhere. I understand that Google doesn't generally view this as a problem, but is that true given my circumstance? I cannot imagine that 20,000 new, senseless 404s can be healthy for my website. If I can't get a good response here, would anyone know of a direct Google support email or number I can use for this issue?
Technical SEO | jampaper -
WordPress 4xx errors from comment redirect
About a month ago, we had a massive jump in 4xx errors. It seems the majority are being caused by the comment tool on WordPress, which is generating a link that looks like this on every single post: "http://www.turnerpr.com/blog/wp-login.php?redirect_to=http%3A%2F%2Fwww.turnerpr.com%2Fblog%2F2013%2F09%2Fturners-crew-royal-treatment-well-sort-of%2Fphoto-2-2%2F". We're using Akismet and haven't had issues in the past, and I can't figure out the fix. I've tried turning it off and back on; I'm reluctant to completely switch commenting systems because we'd lose so much history. Has anyone seen this particular redirect behaviour before? Angela
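If the immediate goal is simply to stop crawlers from following these auto-generated login links, one common stopgap is a robots.txt rule (a sketch, assuming the blog lives at /blog/). Note this only blocks crawling of wp-login.php; it does not remove the links from the pages themselves:

```
User-agent: *
Disallow: /blog/wp-login.php
```

Crawlers that honour robots.txt will then skip the login URL, so it stops surfacing as a crawl error, while the underlying link generation can be investigated separately.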
Technical SEO | TurnerPR -
4XX (Client Error)
Hello there, please help! I am getting this kind of error across the whole site: http://www.mileycyrus-online.co.uk/leaked-hannah-montana-the-movie-pictures.html/comments. The site is running on WordPress, and I have changed the template a few times. Most of the error URLs end with /comments; in fact, all my posts have the same issue: http://www.mileycyrus-online.co.uk/miley-cyrus-at-golden-globes-ceremony.html/comments http://www.mileycyrus-online.co.uk/miley-cyrus-at-president-obamas-inauguration-concert.html/comments 404 Error.
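Assuming the stray /comments suffix always follows the post's .html URL, as in the examples above, a hedged .htaccess sketch would 301 those back to the post itself:

```apache
# Sketch: 301 the stray "/comments" suffix back to the post URL
RewriteEngine On
RewriteRule "^(.+\.html)/comments/?$" "/$1" [R=301,L]
```

This turns each 404 into a redirect to the real post while the template issue that produced the links is tracked down.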
Technical SEO | ExpertSolutions -
Fix duplicate content caused by tags
Hi everyone, TGIF. We are getting hundreds of duplicate content errors on our WP site from what appear to be our tags: for each tag and each post, we are seeing a duplicate content error. I thought I had this fixed, but apparently I do not. We are using the Genesis theme with Yoast's SEO plugin. Does anyone have the solution to what I imagine is an easy fix? Thanks in advance.
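One common remedy with Yoast (a sketch; this assumes the tag archives themselves carry no search value in this setup) is to set tag archives to noindex in Yoast's taxonomy settings, so that each /tag/ page emits:

```html
<!-- Emitted in the <head> of tag archive pages once they are set to noindex -->
<meta name="robots" content="noindex,follow" />
```

Search engines then drop the tag archives from the index while still following their links, which removes the tag-versus-post duplication without deleting the tags.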
Technical SEO | okuma -
How can we fix duplicate title tags like these being reported in GWT?
Hi all, I posted this in the GWT Forum on Monday and still no answers, so I will try here. Our URL is http://www.ccisolutions.com. We have over 200 pages on our site being flagged by GWT as having duplicate title tags. The majority of them look similar to this: Title: JBL EON MusicMix 16 | Mixer | CCI Solutions. GWT is reporting these URLs as all having the same title:
/StoreFront/product/R-JBL-MUSICMIX.prod
/StoreFront/product/R-JBL-MUSICMIX.prod?Origin=Category
/StoreFront/product/R-JBL-MUSICMIX.prod?Origin=Footer
/StoreFront/product/R-JBL-MUSICMIX.prod?Origin=Header
/StoreFront/product/R-JBL-MUSICMIX.prod?origin=..
/StoreFront/product/R-JBL-MUSICMIX.prod?origin=GoogleBase
These are all the same page. There was a time when we used these origin codes, but we stopped using them over a year ago. We also added canonical tags to every page to prevent us from having duplicate content issues. However, these origin codes are still showing up in GWT. Is there anything we can do to fix this problem? Do we have a technical issue with our site code and the way Google is seeing our dynamic URLs? Any suggestions on how we can fix this problem? The same is true in our report for Meta descriptions. Thank you,
Dana Tan
Technical SEO | danatanseo -
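Since the Origin codes are no longer used, one hedged option alongside the existing canonical tags is to 301 any parameterised variant back to the clean URL, for example in .htaccess (a sketch; the trailing ? in the substitution drops the query string):

```apache
RewriteEngine On
# Match either Origin= or origin= at the start of the query string
RewriteCond %{QUERY_STRING} ^[Oo]rigin=
# Redirect to the same path; the trailing ? discards the query string
RewriteRule "^(StoreFront/product/.+)$" "/$1?" [R=301,L]
```

With the parameter variants redirecting, Google eventually consolidates the duplicates onto the single clean URL rather than relying on the canonical hint alone.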
Development site accidentally got indexed and now appears in SERPs. How to fix?
I work at a design firm, and we just redesigned a website for a client. When it came time for the coding, we initially built a development site to work out all the kinks before going live, then relaunched the actual site about a week ago. Here's the problem: somehow, the developer who coded the site for us (a freelancer) allowed the development site to be indexed by Google. Now, when you enter the client's name into Google, the development site appears higher in the results pages than the real site! In fact, the real site isn't even in the top 50 search results. The client is understandably angry about this for multiple reasons. We quickly added a robots.txt file to the development site and a 301 redirect to the real site. However, that seemed to have no effect on the problem. Any ideas on how to fix this mess? Thank you in advance!
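For reference, a sitewide 301 on the development host might look like the following .htaccess sketch (the live domain is a placeholder). One caveat worth flagging: if the dev site's robots.txt blocks crawling, Google may never recrawl those pages and so never see the redirects, which is why crawling usually has to stay allowed while the 301s take effect:

```apache
RewriteEngine On
# Send every path on the dev host to the same path on the live site (placeholder domain)
RewriteRule "^(.*)$" "https://www.example-live-site.com/$1" [R=301,L]
```

Once Google has followed the redirects, the dev URLs drop out of the results and their signals pass to the live site.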
Technical SEO | matt-14567 -
Fixing Missing Meta Tag Errors
Hey all, I just had a crawl test done on my site (created using WordPress) and I received a ton of missing meta description errors to fix. The odd thing is, I use the "All in One SEO" tool and the actual pages and posts on the site do have meta descriptions. However, I noticed that for every post an RSS feed is being automatically generated, and this feed is the link missing the meta description. Most of the errors display "Comments on" with a /feed at the end of the URL. I am totally clueless on how to resolve these errors, as I haven't installed any WP plugins that generate feeds automatically. Has anyone encountered this problem before, or does anyone know how to fix this? The site URL is http:// GovernmentGrantsAustralia . org (I have left spaces above to avoid being a link dropper 🙂). Would really appreciate it if anyone can help! FYI: I just found this link after digging through the Q&A history; however, I tried it and am not sure if it has worked, as I still see the errors on my SEOmoz report. The link is: http://www.seomoz.org/qa/view/41413/wordpress-missing-meta-description-tag-comments. Hope someone can help me figure this one out! Thanks, Justin
Technical SEO | justin99