403 Forbidden: are these a problem?
-
Hi,
I have just run a crawl test in Screaming Frog and it is showing quite a few 403 Forbidden status codes.
We are seeing none of these in Webmaster Tools; is this an issue?
-
You're welcome! Anytime.
-
Followed the advice and ran the report again; now 100% 200 OK. Really chuffed, thanks!
-
Thanks guys I will get that sorted.
Still very much a beginner and part-time (late nights on our own sites), but this Pro membership is proving priceless.
Really appreciate the help!
-
The SEO Executive is correct: you have some flooding protection turned on.
I doubt very much that you are a target for DoS attacks, so get rid of it.
This is how crawlers currently see your page:
Forbidden access
(Flooding)
-
Yes, it's an issue, and I will help you fix it:
I noticed you are using sh404SEF.
From the back end of your site you will need to go to sh404SEF,
then to the sh404SEF configuration Security tab.
At the bottom, under "Anti-flood configuration", you will need to set "Activate anti-flood" to "No".
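If you want to verify the fix without waiting for a full re-crawl, a rapid burst of requests should now all come back 200 instead of tripping a 403. Here's a rough, self-contained sketch of that check in Python; the anti-flood server here is simulated locally (threshold of 5 is made up) purely so the pattern is runnable, and you would point `burst_check` at your own URL instead:

```python
import http.server
import threading
import urllib.request
import urllib.error

class AntiFloodHandler(http.server.BaseHTTPRequestHandler):
    """Simulates sh404SEF-style anti-flood: allows a few requests,
    then answers 403 Forbidden to anything that looks like flooding."""
    hits = 0
    threshold = 5  # hypothetical limit, for illustration only

    def do_GET(self):
        AntiFloodHandler.hits += 1
        code = 200 if AntiFloodHandler.hits <= self.threshold else 403
        self.send_response(code)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo output quiet

def burst_check(url, attempts=20):
    """Fire a rapid burst of requests, the way a crawler does,
    and count how many come back 403."""
    blocked = 0
    for _ in range(attempts):
        try:
            urllib.request.urlopen(url, timeout=5)
        except urllib.error.HTTPError as e:
            if e.code == 403:
                blocked += 1
    return blocked

server = http.server.HTTPServer(("127.0.0.1", 0), AntiFloodHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
blocked = burst_check(f"http://127.0.0.1:{server.server_port}/")
server.shutdown()
print(blocked)  # 15: every request past the threshold was blocked
```

Against your own site with anti-flood switched off, you would expect `burst_check` to return 0.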
-
Yes sure.
-
Yes, it means the crawler has no permission to crawl the page.
Can we get the domain so we can check? Use a TinyURL if you don't want it shown.
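If you'd rather not post the domain, you can also check yourself whether the 403s are user-agent based: request the same URL with a browser user agent and with a crawler-like one, and compare the status codes. A self-contained sketch (the blocking server here is simulated, since we don't have your domain, and the "Screaming Frog" substring match is just an illustrative blocking rule):

```python
import http.server
import threading
import urllib.request
import urllib.error

class UABlockHandler(http.server.BaseHTTPRequestHandler):
    """Simulates a server that 403s crawler user agents but not browsers."""
    def do_GET(self):
        ua = self.headers.get("User-Agent", "")
        code = 403 if "Screaming Frog" in ua else 200
        self.send_response(code)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo output quiet

def status_for(url, user_agent):
    """Fetch a URL with a specific User-Agent and return the HTTP status."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        return urllib.request.urlopen(req, timeout=5).status
    except urllib.error.HTTPError as e:
        return e.code

server = http.server.HTTPServer(("127.0.0.1", 0), UABlockHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"
browser = status_for(url, "Mozilla/5.0")
crawler = status_for(url, "Screaming Frog SEO Spider")
server.shutdown()
print(browser, crawler)  # 200 403
```

If your real site shows the same split (200 for a browser UA, 403 for a crawler UA), that points at firewall or anti-flood rules rather than a page-level problem.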
Related Questions
-
Do I have a robots.txt problem?
I have the little yellow exclamation point under my robots.txt fetch, as you can see here: http://imgur.com/wuWdtvO. This version shows no errors or warnings: http://imgur.com/uqbmbug. Under the tester I can currently see the latest version. This site hasn't changed URLs recently, and we haven't made any changes to the robots.txt file for two years. This problem just started in the last month. Should I worry?
Technical SEO | EcommerceSite
-
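For anyone hitting the same robots.txt warning, one offline sanity check besides the GWT tester is Python's built-in robots.txt parser: feed it your file's contents and ask whether specific URLs are allowed. The rules and URLs below are made up for illustration; paste in your own file to check the paths you care about:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt, parsed from a string rather than fetched,
# so the check works offline against exactly the file you intend to serve.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

print(parser.can_fetch("Googlebot", "https://www.example.com/page"))       # True
print(parser.can_fetch("Googlebot", "https://www.example.com/private/x"))  # False
```

If the parser agrees with what you expect but GWT still shows the warning, the issue is more likely intermittent fetch errors (server timeouts, firewall rules) than the file itself.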
ECommerce Problem with canonicol , rel next , rel prev
Hi, I was wondering if anyone is willing to share their experience implementing pagination and canonical tags when there are multiple sort options. Let's look at an example. I have a site, example.com (I share ownership of that one with the rest of the world 😉), and I sell stuff on it:

example.com/for-sale/stuff1
example.com/for-sale/stuff2
example.com/for-sale/stuff3
etc.

I allow users to sort by date_added, price, a-z, z-a, umph-value, and so on. So now we have:

example.com/for-sale/stuff1?sortby=date_added
example.com/for-sale/stuff1?sortby=price
example.com/for-sale/stuff1?sortby=a-z
example.com/for-sale/stuff1?sortby=z-a
example.com/for-sale/stuff1?sortby=umph-value
etc.

example.com/for-sale/stuff1 gives the same result as example.com/for-sale/stuff1?sortby=date_added (that is the default sort option), and similarly for stuff2, stuff3, and so on. I can't 301 these because they are relevant for users who come to buy from the site. I could add a view-all page and rel canonical to that, but let's assume that isn't technically possible for the site and there are tens of thousands of items on each of the for-sale pages. So I split each listing into pages of x items; let's assume we have 50 pages to sort through:

example.com/for-sale/stuff1?sortby=date_added&page=2 to ...page=50
example.com/for-sale/stuff1?sortby=price&page=2 to ...page=50
example.com/for-sale/stuff1?sortby=a-z&page=2 to ...page=50
example.com/for-sale/stuff1?sortby=z-a&page=2 to ...page=50
example.com/for-sale/stuff1?sortby=umph-value&page=2 to ...page=50
etc.

This is where the shit hits the fan. To avoid duplicate-content issues, on page 30 of stuff1 sorted by date, do I use:

rel canonical = example.com/for-sale/stuff1
rel next = example.com/for-sale/stuff1?sortby=date_added&page=31
rel prev = example.com/for-sale/stuff1?sortby=date_added&page=29

or

rel canonical = example.com/for-sale/stuff1?sortby=date_added
rel next = example.com/for-sale/stuff1?sortby=date_added&page=31
rel prev = example.com/for-sale/stuff1?sortby=date_added&page=29

or

rel canonical = example.com/for-sale/stuff1
rel next = example.com/for-sale/stuff1?page=31
rel prev = example.com/for-sale/stuff1?page=29

or

rel canonical = example.com/for-sale/stuff1?page=30
rel next = example.com/for-sale/stuff1?sortby=date_added&page=31
rel prev = example.com/for-sale/stuff1?sortby=date_added&page=29

or

rel canonical = example.com/for-sale/stuff1?page=30
rel next = example.com/for-sale/stuff1?page=31
rel prev = example.com/for-sale/stuff1?page=29

None of this feels right to me. I am thinking of using GWT to ask Googlebot not to crawl any of the sort parameters (date_added, price, a-z, z-a, umph-value, and so on) and to use:

rel canonical = example.com/for-sale/stuff1?sortby=date_added&page=30
rel next = example.com/for-sale/stuff1?sortby=date_added&page=31
rel prev = example.com/for-sale/stuff1?sortby=date_added&page=29

My doubt about this: will the link value that goes into the pages with parameters be consolidated when I choose to ignore them via URL Parameters in GWT? What do you guys think?
Technical SEO | Saijo.George
-
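For what it's worth, one pattern often suggested for this situation (a sketch, not the one true answer) is: canonical points at the default-sort version of the same page number, so the sort variants collapse into one indexable series, while rel next/prev keep the user's chosen sort so each pagination chain stays internally consistent. A small Python sketch of the URL logic, using the example URLs from the question:

```python
def pagination_links(base, sort, page, last_page, default_sort="date_added"):
    """Build canonical / rel-next / rel-prev URLs for a sorted, paginated
    listing. Canonical drops the sort parameter (collapsing sort variants
    onto the default-sort series); next/prev preserve the current sort."""
    def url(s, p):
        params = []
        if s != default_sort:
            params.append(f"sortby={s}")
        if p > 1:
            params.append(f"page={p}")
        return base + ("?" + "&".join(params) if params else "")

    links = {"canonical": url(default_sort, page)}
    if page > 1:
        links["prev"] = url(sort, page - 1)
    if page < last_page:
        links["next"] = url(sort, page + 1)
    return links

links = pagination_links("example.com/for-sale/stuff1", "price", 30, 50)
print(links["canonical"])  # example.com/for-sale/stuff1?page=30
print(links["next"])       # example.com/for-sale/stuff1?sortby=price&page=31
print(links["prev"])       # example.com/for-sale/stuff1?sortby=price&page=29
```

Treat the choice of canonical target as the real decision here; the helper just keeps whichever policy you pick consistent across every sorted page.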
Duplicate Version of Home Page Causing Problems?
Hello, I have a .php-based site, and I'm curious whether the way we split traffic is negatively affecting our rankings. Currently, visitors to Lipozene.com are split 50/50 between two pages, indexa.php and indexb.php. These have identical content right now, and I'm curious whether this has negatively affected our rankings. We've dropped off the SERPs for our brand term "lipozene" even though we are the official site and own www.lipozene.com. Any thoughts are greatly appreciated.
Technical SEO | lipoweb
-
Penalities in a brand new site, Sandbox Time or rather a problem of the site?
Hi guys, 4 weeks ago we launched www.adsl-test.it. We did some article marketing and developed a lot of functionality to test and share the results of the speed tests run through the site. We spent weeks on Google's 9th SERP page, then suddenly, for one day (the 29th of February), we were on the second page; the next day the site's home page disappeared, even for brand searches like "adsl-test". The current situation is:
- It looks like we are not banned (site:www.adsl-test.it is still listed).
- GWT doesn't show any warnings, and everything looks good there.
- We rank quite high on bing.it and yahoo.it (4th place on the first page) for an "adsl test" search.
Can anybody help us understand? Another thought: we create a unique ID for each test that is run, and these tests are indexed by Google. Ex:
www.adsl-test.it/speedtest/w08ZMPKl3R or
www.adsl-test.it/speedtest/P87t7Z7cd9
The content of these URLs is actually somewhat different (because the measured speed is different), but apart from the badge, the other content on each page is pretty much the same. Could that be the reason? I mean, does Google think we are creating duplicate content, even if it isn't strictly duplicate content but just the result of a speed test?
Technical SEO | codicemigrazione
-
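If the per-test result pages turn out to be the problem, one common approach (just a sketch, not a guaranteed fix for the ranking drop) is to keep crawlers out of the test-result paths entirely via robots.txt:

```
User-agent: *
Disallow: /speedtest/
```

Note that robots.txt only blocks crawling: URLs that are already indexed can linger in the index, so adding a meta robots noindex tag to the individual result pages, and only then blocking the path once they drop out, is often the more thorough route.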
URL rewriting causing problems
Hi, I am having problems with my URL rewriting, which creates SEO-friendly / user-friendly URLs. I hope you can follow me as I try to explain what is happening... Since the creation of my rewrite rule I am getting lots of errors in my SEOmoz and Google WMT reports due to duplicate content, titles, descriptions, etc. For example, for a product detail page it creates, instead of a URL parameter, a user-friendly URL such as mydomain.com/games-playstation-vita-psp/B0054QAS. However, the Google index also contains the following friendly URL, which is the same page and which I would like to remove: domain.com/games-playstation-vita/B0054QAS. The key to the rewrite in the above URLs is the /B0054QAS appended at the end; this tells the script which product to load. The text preceding it could in effect be rubbish, i.e. domain.com/a-load-of-rubbish/B0054QAS would still bring back the same page as above. What is the best way of resolving the duplicate URLs currently in the Google index that are causing problems? The same issue is causing a quite serious 5XX error on one of the generated URLs, http://www.mydomain.com/retailersname/1; if I click on the link it does work (it takes you to the retailer's site), but again the number appended at the end is the key; the retailer's name is just there for user-friendly search reasons. How can I block this or remove it from the results? Hope you are still with me and can shed some light on these issues. Many thanks
Technical SEO | ocelot
-
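One standard pattern for exactly this "the slug is decorative, the ID is the key" setup is to check the slug on every request and 301 anything that doesn't match the canonical one, so rubbish slugs and stale slugs all consolidate onto one URL. A rough Python sketch of the routing logic; the `PRODUCTS` lookup and the URL shape are hypothetical, standing in for whatever your script uses to load the product:

```python
import re

PRODUCTS = {  # hypothetical lookup: product ID -> canonical slug
    "B0054QAS": "games-playstation-vita-psp",
}

def resolve(path):
    """Slug-checked routing: the trailing ID is the real key, and any
    other slug gets a 301 to the canonical URL instead of serving a
    duplicate page."""
    m = re.fullmatch(r"/(?P<slug>[^/]+)/(?P<pid>[A-Z0-9]+)", path)
    if not m:
        return 404, None
    canonical_slug = PRODUCTS.get(m.group("pid"))
    if canonical_slug is None:
        return 404, None
    if m.group("slug") != canonical_slug:
        # redirect rubbish or outdated slugs to the one true URL
        return 301, f"/{canonical_slug}/{m.group('pid')}"
    return 200, path

print(resolve("/a-load-of-rubbish/B0054QAS"))
# (301, '/games-playstation-vita-psp/B0054QAS')
print(resolve("/games-playstation-vita-psp/B0054QAS"))
# (200, '/games-playstation-vita-psp/B0054QAS')
```

The 301s clean the old variants out of the index over time; a rel canonical tag on the product page pointing at the canonical URL is a sensible belt-and-braces addition.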
Problem with indexed files before domain was purchased
Hello everybody,
We bought this domain a few months back, and we're trying to figure out how to get rid of indexed pages that (I assume) existed before we bought it; the domain was registered in 2001 and had a few owners. I attached 3 files from my Webmaster Tools. Can anyone tell me how to get rid of those "pages"? And more importantly: aren't these kinds of "pages" the result of some kind of sabotage? Looking forward to hearing your thoughts on this. Thank you, Alex
Picture-5.png Picture-6.png Picture-7.png
Technical SEO | pwpaneuro
-
Wordpress Problems.. SEO-Yoast is Toast?
Hello;
I have installed the Yoast SEO plugin on my blog, and 2 weeks after my issues went away, they came back times 300! lol. So I uninstalled it, and my issues obviously got worse; then I re-activated it, reset everything, and still got the 300+ issues. Is there a secondary plugin you would suggest running at the same time as Yoast, or one that will fix all the issues? Ever think of making an SEOmoz plugin for WP, since it is gaining so much popularity? Thank you. Great work, by the way! Loved the webinar today!
Technical SEO | smstv
-
Problem with my site
The site is casino.pt. We created the site 7-8 months ago and started to push it with good, natural links (http://www.opensiteexplorer.org/www.casino.pt/a!links!!filter!all!!source!external!!target!page), links on content-rich sites, most of them related to gambling and sports topics. During the first 3-5 months the rankings got better and better; after 6 months the site lost all of its rankings. Additional details:
http://www.casino.pt/robots.txt
http://www.google.pt/#hl=pt-PT&source=hp&biw=1280&bih=805&q=site:http%3A%2F%2Fwww.casino.pt&aq=f&aqi=&aql=&oq=&fp=2651649a33cd228
No critical errors in Google Webmaster Tools. Any idea how I can fix it? Thanks
Technical SEO | Yaron53