How do you tell SEOmoz to ignore a subdomain?
-
We have a subdomain that I don't want to show up in our root SEOmoz campaign. How do I tell SEOmoz to ignore it?
-
Hi John,
You'll want to enter www.example.com in the campaign, so we crawl JUST the www subdomain. We don't often think of it this way, but www is a subdomain.
-
That is actually what I want. How is that done?
-
Hi John,
Thanks for the question.
Unfortunately, there is no way to have one campaign specifically ignore a single subdomain and still crawl all other subdomains. It would be possible to set up two subdomain campaigns if your site were set up like this:
www.example.com
blog.example.com
But anything outside of either of those subdomains would not be touched.
Thanks,
Joel.
-
Ah, misunderstood you. I don't think what you are trying to do is possible but you should ask someone who works for SEOMoz to confirm.
-
But I want the subdomain to be crawled, just under a different SEOmoz campaign. Blocking it in the robots.txt file will prevent that.
-
Well, the subdomain should have its own folder where you can create your own robots.txt file. That would block only the subdomain from being crawled.
If you don't have a separate folder, check out this Q&A.
-
That would prevent SEOmoz from crawling it completely. I don't want that. I want the subdomain to be crawled under a different campaign. I want the data for the subdomain, but the number of pages is much greater than the root domain, and is just a lot of noise when looking at the root domain. It's drowning it out. SEOmoz has no option to control this?
-
You can block the SEOmoz spider via robots.txt on the subdomain:
<code>User-agent: rogerbot
Disallow: /</code>
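If you want to sanity-check a rule like this before deploying it, one option is Python's standard `urllib.robotparser`. This is just a sketch: the rules are inlined rather than fetched, and the hostname and path are placeholders, not real URLs from this thread.

```python
from urllib import robotparser

# The rules a subdomain's robots.txt might serve (inlined here for illustration)
rules = """\
User-agent: rogerbot
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# rogerbot is blocked from every path on this subdomain...
print(rp.can_fetch("rogerbot", "https://blog.example.com/any-page"))   # False
# ...while other crawlers, such as Googlebot, are unaffected
print(rp.can_fetch("Googlebot", "https://blog.example.com/any-page"))  # True
```

In a real check you would point `rp.set_url()` at the subdomain's live robots.txt and call `rp.read()` instead of parsing an inline string.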