Subdomain 403 error
-
Hi Everyone,
A crawler from our SEO tool detects a 403 error on a link from our main domain to a couple of subdomains. However, these subdomains are perfectly accessible. What could be the problem? Is this error caused by the server, the crawl bot, or something else?
I would love to hear your thoughts.
Jens
-
Not at all.
-
Hi Roman,
Thanks for your answer!
It's a commercial tool.
I checked the robots.txt file and the .htaccess, but didn't see any problems.
As you say, the problem may just be caused by the user-agent. If so, this won't affect my SEO efforts, right?
-
Which tool are you using? Is it a custom tool or a commercial one such as Screaming Frog?
-
Status codes in the 4xx range are client errors: the request itself is being rejected, so the issue is typically on the request side rather than with the page.
403 means Forbidden. In your case, the first place to check is your .htaccess and your robots.txt file; make sure they are not blocking crawlers, or at least not the crawler your tool uses.
For example, some hosting providers block all crawlers other than Google or Bing to save resources, so it is common for Roger (the Moz crawler) to have trouble crawling a page that is blocked on the server side. Tools like Moz, Ahrefs and Semrush regularly run into this kind of problem. In summary:
- Make sure your .htaccess and your robots.txt are not blocking your crawler
- Make sure your hosting provider is not blocking your crawler
- If neither of the above is the cause, try modifying the user-agent of your tool (see the sketch below)
Hope this info helps you with your problem.
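A quick way to test the user-agent theory is to request the same URL with different User-Agent headers and compare the status codes. Below is a minimal Python sketch using the requests library; the URL and the user-agent strings are placeholders for illustration, not values taken from this thread:

```python
# Minimal sketch: fetch one URL with several User-Agent headers and compare
# the HTTP status codes. Requires the third-party "requests" package.
import requests

URL = "https://subdomain.example.com/"  # placeholder: the subdomain returning 403

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "rogerbot": "rogerbot/1.2",           # Moz's crawler identifies itself as rogerbot; string is illustrative
    "generic bot": "ExampleCrawler/1.0",  # an arbitrary non-browser user agent
}

for name, ua in USER_AGENTS.items():
    response = requests.get(URL, headers={"User-Agent": ua}, timeout=10)
    print(f"{name:12} -> HTTP {response.status_code}")

# If the browser user agent returns 200 while the bot user agents return 403,
# the block is server-side (hosting firewall or .htaccess rules) and the pages
# themselves are fine, so regular visitors are unaffected.
```

If only the crawler user agents are blocked, that confirms a server-side block rather than a problem with the pages, and it should not affect how visitors or unblocked search engine bots see the site.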
Related Questions
-
Unsolved Google Search Console Still Reporting Errors After Fixes
Hello, I'm working on a website that was too bloated with content. We deleted many pages and set up redirects to newer pages. We also resolved an unreasonable number of 400 errors on the site, and I removed several ancient sitemaps that listed content deleted years ago that Google was still crawling. According to Moz and Screaming Frog, these errors have been resolved. We've submitted the fixes for validation in GSC, but the validation repeatedly fails. What could be going on here? How can we resolve these errors in GSC?
Technical SEO | tif-swedensky
-
Best strategy to handle over 100,000 404 errors.
I was recently given a site that has over one hundred thousand 404 error codes listed in Google Webmasters. It is really odd, because according to Google Webmasters, the pages linking to these 404 pages are also pages that no longer exist (they are 404 pages themselves). These errors were the result of a site migration. I'd appreciate any input on how one might go about auditing and repairing this many 404 errors. Thank you.
Technical SEO | SEO_Promenade
-
Reverse proxy a successful blog from subdomain to subfolder?
I have an ecommerce site that we'll call confusedseo.com. I created a WordPress blog and CNAME'd it to blog.confusedseo.com. Since then, the blog has earned a PageRank of 3 and a decent amount of organic traffic. I am considering a reverse proxy to forward blog.confusedseo.com to confusedseo.com/blog/. As I understand it, this will greatly help the "link juice" of the root domain. However, I'm concerned about any potential harm done to the existing SEO value of the blog. What, if anything, should I be doing to ensure that the reverse proxy doesn't hurt my "juice" rather than help it?
Technical SEO | bedbugsupply
-
Blogs are best when hosted on domain, subdomain, or...?
I’ve heard that it is a best practice to host your blog within your site. I’ve also heard it’s best to put it on a subdomain. What do you believe is the best home for your blog, and why?
Technical SEO | vernonmack
-
Will errors on a subdomain effect the overall health of the root domain?
As stated in the question, we have 2 subdomains that contain over 2000 reported errors from SEOMOZ. The root domain has a clean bill of health, and I was just wondering if these errors on the subdomains could have a negative effect on the root domain in the eyes of Google. Your comments will be appreciated. Regards, Greg
Technical SEO | AndreVanKets
-
Subdomain Removal in Robots.txt with Conditional Logic??
I would like to see if there is a way to add conditional logic to the robots.txt file, so that when we push from DEV to PRODUCTION and the robots.txt file is pushed with it, we don't have to remember NOT to push the robots.txt file or to edit it when it goes live. My specific situation is this: I have www.website.com, dev.website.com and new.website.com, and somehow Google has indexed DEV.website.com and NEW.website.com. I'd like these removed from Google's index, as they are causing duplicate content. Should I: a) add 2 new GWT entries for DEV.website.com and NEW.website.com and VERIFY ownership? If I do this, then when the files are pushed to LIVE, won't the files contain the VERIFY META CODE for the DEV version even though it's now LIVE? (Hope that makes sense.) b) write a robots.txt file that specifies "DISALLOW: DEV.website.com/"? Is that possible? I have only seen examples of DISALLOW with a "/" at the beginning... Hope this makes sense, I can really use the help! I'm on a Windows Server 2008 box running ColdFusion websites.
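For context, robots.txt itself has no conditional syntax and its Disallow rules cannot reference hostnames; each hostname serves its own file. A common workaround is to have the application return a different robots.txt depending on the Host header. The sketch below illustrates the idea in Python/Flask purely as an assumed example stack; the question above describes a ColdFusion/IIS setup, where the same branching would be implemented in that environment instead:

```python
# Illustrative sketch (not the setup described above): serve a blocking
# robots.txt on dev/new hostnames and a permissive one on the live hostname,
# so the same codebase can be deployed everywhere without editing the file.
from flask import Flask, Response, request

app = Flask(__name__)

BLOCK_ALL = "User-agent: *\nDisallow: /\n"   # dev.website.com, new.website.com
ALLOW_ALL = "User-agent: *\nDisallow:\n"     # www.website.com

@app.route("/robots.txt")
def robots_txt():
    host = request.host.lower()
    body = BLOCK_ALL if host.startswith(("dev.", "new.")) else ALLOW_ALL
    return Response(body, mimetype="text/plain")
```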
Technical SEO | ErnieB
-
Robots.txt file getting a 500 error - is this a problem?
Hello all! While doing some routine health checks on a few of our client sites, I spotted that a new client of ours, whose website was not designed or built by us, is returning a 500 internal server error when I try to look at the robots.txt file. As we don't host or maintain their site, I would have to go through their head office to get this changed, which isn't a problem, but I just wanted to check whether this error will actually be having a negative effect on their site and whether there's a benefit to getting it changed? Thanks in advance!
Technical SEO | themegroup
-
Microsite on subdomain vs. subdirectory
Based on this post from 2009, it's recommended in most situations to set up a microsite as a subdirectory as opposed to a subdomain: http://www.seomoz.org/blog/understanding-root-domains-subdomains-vs-subfolders-microsites. The primary argument seems to be that the search engines view the subdomain as a separate entity from the domain, and therefore the subdomain doesn't benefit from any of the trust rank, quality scores, etc. Rand made a comment that seemed like the subdomain could SOMETIMES inherit some of these factors, but didn't expound on those instances. What determines whether the search engine will view your subdomain-hosted microsite as part of the main domain vs. a completely separate site? I read it has to do with the interlinking between the two.
Technical SEO | ryanwats