Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
403 Forbidden errors - how to solve them
-
Hi, I have been using a great tool today called Screaming Frog, which was shown to me by Thomas Zickell.
When I used the tool I found some worrying things for my site www.in2town.co.uk. What I have found is that I have a large number of 403 Forbidden statuses on my home page, and I do not know why.
Here is an example:
http://www.in2town.co.uk/emmerdale/emmerdale-debbie-hits-rock-bottom
It loads fine in the browser, but the tool reports it as an error and shows it as having no meta tags or anything, even though the meta tags are there.
Can anyone please let me know how to solve this and why it has happened?
Many thanks
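One way to narrow this down is to check whether the page itself is blocked or only the way a crawler requests it. Below is a rough sketch, assuming Python with the requests library installed; the user-agent strings are illustrative placeholders, not Screaming Frog's real one.

```python
import requests

URL = "http://www.in2town.co.uk/emmerdale/emmerdale-debbie-hits-rock-bottom"

# Illustrative user agents: one browser-like, one crawler-like (placeholders).
USER_AGENTS = {
    "browser-like": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "crawler-like": "ExampleSEOCrawler/1.0",
}

for label, ua in USER_AGENTS.items():
    resp = requests.get(URL, headers={"User-Agent": ua}, timeout=10)
    # If a single request returns 200 either way, the 403s reported during a
    # crawl are probably triggered by request volume rather than by the page.
    print(f"{label}: HTTP {resp.status_code}, {len(resp.text)} bytes")
```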
-
Hi Tim,
Glad it helped. It might be worth asking your host what kind of features they have for preventing flooding attacks; there are various ways of addressing them on the server side that most hosts will have enabled in one way or another. Unless you have a specific issue with these kinds of attacks, it seems to me that this part of the module is causing more harm than good as it is now.
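For context on what those server-side features typically do: they count requests per client over a short window and start refusing the client once a threshold is crossed. The sketch below is only a generic illustration of that idea in Python - not sh404SEF's or any host's actual implementation - and the window and threshold values are arbitrary.

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 10   # arbitrary example window
MAX_REQUESTS = 20     # arbitrary example threshold

_recent_requests = defaultdict(deque)  # client IP -> timestamps of recent hits


def allow_request(client_ip, now=None):
    """Return True if the client is still under the per-window request limit."""
    now = time.time() if now is None else now
    hits = _recent_requests[client_ip]
    # Discard timestamps that have aged out of the window.
    while hits and now - hits[0] > WINDOW_SECONDS:
        hits.popleft()
    if len(hits) >= MAX_REQUESTS:
        return False  # a real anti-flooding module would answer with 403 Forbidden
    hits.append(now)
    return True


if __name__ == "__main__":
    # A crawler requesting pages every 0.1s trips the limit after 20 requests,
    # even though a person clicking around the site never would.
    for i in range(25):
        print(i + 1, allow_request("203.0.113.10", now=1000.0 + i * 0.1))
```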
-
Thank you for this. I have turned it off and will speak to sh404SEF to find out what they can do about it, as I am worried about going without the security feature, but as you said, that was the problem, and now the site is showing fine - there are no errors showing.
Many thanks for this. I hope other people who are having this problem get to read this post, as they must be going through what I am going through. Many thanks for all your help and the solution.
-
Hi Tim,
Did you ever get to the bottom of the issue mentioned in this question? It is almost certainly the same problem.
Have a look at this page and try either turning off the sh404SEF anti-flooding feature or boosting the maximum number of requests allowed: http://forum.joomla.org/viewtopic.php?p=1368937
The anti-flooding part of this component basically blocks requests for pages if it thinks someone is trying to run a denial-of-service attack on your site. The current setup seems to be too sensitive: it is blocking Screaming Frog after the first few requests, quite possibly blocking the Google bots, and maybe the Moz crawler as well, so it is certainly something you should address.
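If you want to confirm that the block is rate-based before changing settings, you can send a short burst of requests to one of the affected URLs and watch for the status code to flip to 403. A rough sketch, assuming Python and the requests library; the burst size and delay are arbitrary and deliberately gentle so you don't actually flood your own site.

```python
import time

import requests

URL = "http://www.in2town.co.uk/emmerdale/emmerdale-debbie-hits-rock-bottom"

# Deliberately gentle: ten requests with a short pause, enough to trip an
# over-sensitive anti-flooding rule but not enough to stress the server.
for i in range(1, 11):
    status = requests.get(URL, timeout=10).status_code
    print(f"request {i}: HTTP {status}")
    if status == 403:
        print("403 started appearing - the block is rate-based, not page-based")
        break
    time.sleep(0.2)
```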
Related Questions
-
Sitemap error in Webmaster tools - 409 error (conflict)
Hey guys, I'm getting this weird error when I submit my sitemap to Google. It says I'm getting a 409 error in my post-sitemap.xml file (https://cleargear.com/post-sitemap.xml), but when I check it, it looks totally fine. I am using Yoast SEO to generate the sitemap.xml file. Has anyone else experienced this? Is this a big deal? If so, does anyone know how to fix it? Thanks
Technical SEO | Extima-Christian
-
?_escaped_fragment_= Duplicate error in Webmaster
Hi, I am not sure where this came from ... ?_escaped_fragment_= But in Webmaster we are seeing hundreds of pages with this, and thus Webmaster is saying that we have pages with duplicate title tags. How do I fix this, or remove it? Regards, T
Technical SEO | Taiger
-
Hundreds of 404 errors are showing up for pages that never existed
For our site, Google is suddenly reporting hundreds of 404 errors, but the pages they are reporting never existed. The links Google shows are clearly spam-style, but the website hasn't been hacked. This happened a few weeks ago, and after a couple of days they disappeared from WMT. What's the deal?
Technical SEO | MichaelGregory
-
Schema Markup Errors - Priority or Not?
Greetings All... I've been digging through the search console on a few of my sites and I've been noticing quite a few structured data errors. Most of the errors are related to: hcard, hentry and hatom. Most of them are missing author & entry-title, while the other one is missing: fn. I recently saw an article on SEL about Google's focus on spammy mark-up. The sites I use are built and managed by vendors, so I would have to impress upon them the impact of these errors and have them prioritize, then fix them. My question is whether or not this should be prioritized? Should I have them correct these errors sooner than later or can I take a phased approach? I haven't noticed any loss in traffic or anything like that, I'm more focused on what negative impact a "phased approach" could have. Any thoughts?
Technical SEO | AfroSEO
-
Error report in Bing Evaluated size of HTML....
Hi, Whilst checking Bing's SEO analyser I got this error message for our page www.tidy-books.co.uk/childrens-bookcases: "Evaluated size of HTML is estimated to be over 125 KB and risks not being fully cached. (Issue marker for this rule is not visible in the current view)" Just wondering what needs to be done about it and what it actually means? Thanks
Technical SEO | tidybooks
-
Will deleting Wordpress tags result in 404 errors or anything?
I want to clean up my tags, and I'm worried I'm going to look in my Webmaster Tools the next day and see hundreds of errors. What's the best way of doing this?
Technical SEO | howlusa
-
404 errors on non-existent URLs
Hey guys and gals, First Moz Q&A for me and really looking forward to being part of the community. I hope as my first question this isn't a stupid one, but I was just struggling to find any resource that dealt with the issue and am just looking for some general advice. Basically a client has raised a problem with 404 error pages - or the lack thereof - on non-existent URLs on their site; let's say for example: 'greatbeachtowels.com/beach-towels/asdfas'. Obviously content never existed on this page, so it's not like you're saying 'hey, sorry this isn't here anymore'; it's more like 'there was never anything here in the first place'. Currently in this fictitious example, typing in 'greatbeachtowels.com/beach-towels/asdfas' returns the same content as the 'greatbeachtowels.com/beach-towels' page, which I appreciate isn't ideal. What I was wondering is how far do you take this issue - I've seen examples here on the seomoz site where you can edit the URI in a similar manner and it returns the same content as the parent page but with the alternate address. Should 404s be added across all folders on a site in a similar way? How often would this scenario be an issue, particularly for internal pages two or three clicks down? I suppose unless someone linked to a page with a misspelled URL... Also, would it be worth placing 301 redirects on a small number of common misspellings or typos, e.g. 'greatbeachtowels.com/beach-towles', to the correct URLs as opposed to just 404s? Many thanks in advance.
Technical SEO | AJ234
-
Robots.txt file getting a 500 error - is this a problem?
Hello all! While doing some routine health checks on a few of our client sites, I spotted that a new client of ours - whose website was not designed or built by us - is returning a 500 internal server error when I try to look at the robots.txt file. As we don't host or maintain their site, I would have to go through their head office to get this changed, which isn't a problem, but I just wanted to check whether this error will actually be having a negative effect on their site / whether there's a benefit to getting this changed? Thanks in advance!
Technical SEO | themegroup