SEOMOZ crawler is still crawling a subdomain despite disallow
-
This is for a client with a subdomain. We only want to analyze their main website, since that is the one we are optimizing for SEO. The subdomain is not optimized, so we know it's bound to have lots of errors.
We added the disallow code when we started and it was working fine. We only saw the errors for the main domain and we were able to fix them. However, about a month ago the errors and warnings spiked, and the errors we saw were for the subdomain.
As far as our web guys are concerned, the disallow code is still there and was not touched:
User-agent: rogerbot
Disallow: /
We would like to know if there's anything we might have unintentionally changed or something we need to do so that the SEOMOZ crawler will stop going through the subdomain.
Any help is greatly appreciated!
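One technical point worth checking here: robots.txt rules apply per host, so a Disallow in the main domain's robots.txt does not block a crawler on a subdomain; each subdomain needs its own /robots.txt at its root. A minimal sketch with Python's standard `urllib.robotparser` (the rule set is the one quoted above; it only governs the host that serves it) shows how such rules are evaluated:

```python
from urllib.robotparser import RobotFileParser

# The robots.txt content from the question. Rules in the main domain's
# robots.txt do NOT carry over to subdomains: each host needs its own
# /robots.txt serving its own rules.
rules = """\
User-agent: rogerbot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# rogerbot is blocked everywhere on the host serving these rules...
print(parser.can_fetch("rogerbot", "/some-page"))   # False
# ...but other user agents are unaffected.
print(parser.can_fetch("googlebot", "/some-page"))  # True
```

So if the subdomain serves no robots.txt of its own (or a different one), rogerbot is free to crawl it no matter what the main domain's file says.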
-
Thanks Peter for your assistance.
Hope to hear from the SEOMOZ team soon with regards to this issue.
-
John,
Thanks for writing in! I would like to take a look at the project where this is happening. I will go ahead and start a ticket so we can better answer your questions. You should hear from me soon!
Best,
Peter
Moz Help Team
-
I have heard of this recently. I think the Moz crawler may now ignore the disallow, either always or in some cases, because it is not a usual search engine crawler.
Hopefully one of the staff can provide some insight in this for you.
All the best.
Related Questions
-
Google Search Console Crawl Errors?
We are using Google Search Console to monitor Crawl Errors. It seems Google is listing errors that are not actual errors. For instance, it shows this as "Not found": https://tapgoods.com/products/tapgoods__8_ft_plastic_tables_11_available So the page does not exist, but we cannot find any pages linking to it. It has a tab that shows Linked From, but if I look at the source of those pages, the link is not there. In this case, it is showing the front page (listed twice, both for http and https). Also, one of the pages it shows as linking to the non-existent page above is itself a non-existent page. We marked all the errors as fixed last week, and this week they came up again; 2/3 are the same pages we marked as fixed last week. Is this an issue with Google Search Console? Are we getting penalized for a non-existent issue?
Intermediate & Advanced SEO | | TapGoods0 -
Disallow duplicate URLs?
Hi community, thanks for answering my question. I have a problem with a website. My website is: http://example.examples.com/brand/brand1 (good URL), but I have 2 filters that generate 2 more URLs: http://example.examples.com/brand/brand1?show=true (if we apply one filter) and http://example.examples.com/brand/brand1?show=false (if we apply the other filter). My question is: should I disallow these filters in robots.txt like this: **Disallow: /*?show=***
Intermediate & Advanced SEO | | thekiller990 -
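As a side note on how a wildcard rule like **Disallow: /*?show=*** behaves: major crawlers treat `*` as matching any run of characters and everything else in the pattern, including `?`, as a literal character. A simplified sketch of that matching logic (an illustration, not a full robots.txt parser; the function name is ours):

```python
import re
from urllib.parse import urlsplit

def blocked_by_rule(url: str, rule: str) -> bool:
    # Translate a Google-style Disallow pattern into a regex:
    # '*' matches any run of characters, a trailing '$' anchors the
    # end of the URL, and everything else (including '?') is literal.
    anchored = rule.endswith("$")
    body = rule[:-1] if anchored else rule
    pattern = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    pattern = "^" + pattern + ("$" if anchored else "")
    # Rules are matched against the path plus query string.
    parts = urlsplit(url)
    target = parts.path + ("?" + parts.query if parts.query else "")
    return re.search(pattern, target) is not None

rule = "/*?show=*"
print(blocked_by_rule("http://example.examples.com/brand/brand1?show=true", rule))  # True
print(blocked_by_rule("http://example.examples.com/brand/brand1", rule))            # False
```

So the proposed rule would block both filter URLs while leaving the clean brand URL crawlable.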
Changing domain names but still ranking as old one
Hi there, I have a client who changed domain names back in November 2015 but is still coming up in search engines under their old domain name, not their new one. For example, I search for my client's name; let's call them Example B. So I search for "Example B" and they come up at the top of the results, and the title tag is correct, saying something along the lines of "Welcome to Example B". However, the URL underneath is actually their old name, Example A. When you click the link it redirects to the new name, so that's fine, but it's annoying that Example A still appears when it should be Example B now. I don't think they have a new Webmaster Tools account set up for their new domain (I still need to check), but they do still have their old one. Is there something I can do within Webmaster Tools to tell it that Example A is gone and to start indexing and referring to them as Example B? What else should I do to make sure their new name comes up instead of their old one?
Intermediate & Advanced SEO | | Virginia-Girtz1 -
Should I use change of address when moving to subdomain
Hi guys, so we had a domain that was only for one country, www.example.com. A year later we decided to expand to another country, so the current website will move under a country subdomain like ae.example.com. We did a 301 redirect. Should I also perform a change of address from www.example.com to ae.example.com? Please help. Thanks
Intermediate & Advanced SEO | | awrikat0 -
URL Capitalization Inconsistencies Registering Duplicate Content Crawl Errors
Hello, I have a very large website that has a good number of "Duplicate Content" issues according to MOZ. In reality, though, it is not a problem with duplicate content but rather a problem with URLs. For example: http://acme.com/product/features and http://acme.com/Product/Features both land on the same page, but MOZ sees them as separate pages and therefore assumes they are duplicates. We recently implemented a solution to automatically de-capitalize all characters in the URL, so when you type acme.com/Products, the URL automatically changes to acme.com/products – but MOZ continues to flag multiple "Duplicate Content" issues. I noticed that many of the links on the website still have the uppercase letters in the URL, even though the URL changes to all lowercase when clicked. Could this be causing the issue? What is the best way to remove the "Duplicate Content" issues that are not actually duplicate content?
Intermediate & Advanced SEO | | Scratch_MM0 -
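On the capitalization question: crawlers compare the literal URL strings found in links, so uppercase hrefs keep registering as separate pages until the links themselves are lowercased (or the variants 301-redirect and/or carry a canonical tag). A minimal sketch of the normalization described above, assuming the server 301-redirects any mixed-case path to its lowercase form (helper name and URLs are illustrative):

```python
from urllib.parse import urlsplit, urlunsplit

def lowercase_redirect_target(url: str):
    """Return the lowercase URL to 301-redirect to, or None if already canonical.

    Only the host and path are lowercased; query strings can be
    case-sensitive, so they are left untouched.
    """
    parts = urlsplit(url)
    canonical = parts._replace(netloc=parts.netloc.lower(),
                               path=parts.path.lower())
    new_url = urlunsplit(canonical)
    return new_url if new_url != url else None

print(lowercase_redirect_target("http://acme.com/Product/Features"))
# http://acme.com/product/features
print(lowercase_redirect_target("http://acme.com/product/features"))
# None (already canonical, no redirect needed)
```

Even with such a redirect in place, fixing the uppercase hrefs in the site's own links is what ultimately stops crawlers from reporting the variants.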
Our website scores A but on google we are still on 7th page
Hi all, I have run on-page keyword optimizations using the exact terminology people use to find our company's service (or our competition) on Google. We scored an A, with almost all points complete. I did the same for our main competitor and they scored an F. Then I checked our positions on Google: they are on page 1, fifth result, and we are on page 7. We have plenty of unique content and an extensive website. Could there be any reason for this other than backlinks? Many thanks for your help.
Intermediate & Advanced SEO | | EMGCSR0 -
Is it bad to host an XML sitemap in a different subdomain?
Example: sitemap.example.com/sitemap.xml for pages on www.example.com.
Intermediate & Advanced SEO | | SEOTGT0 -
Subdomains for niche related keywords
I wanted to know how effective using a subdomain is, taking into consideration all the updates Google has made lately. I am looking to use a subdomain on a well-branded website for a niche-specific part of the site. The subdomain will end up having more than 100 pages. In what cases do you recommend using a subdomain, and how can I get the same benefit out of a subdomain as I am getting from the main domain?
Intermediate & Advanced SEO | | CMTM0