What to do with a site of >50,000 pages vs. crawl limit?
-
What happens if you have a site in your Moz Pro campaign that has more than 50,000 pages?
Would it be better to choose a sub-folder of the site to get a thorough look at that sub-folder?
I have a few different large government websites that I'm tracking to see how they are faring in rankings and SEO. They are not my own websites. I want to see how these agencies are doing compared to what the public searches for on the technical topics and social issues that the agencies manage. I'm an academic looking at science communication. I am in the process of re-setting up my campaigns to get better data than I have been getting -- I am a newbie to SEO, and the campaigns I slapped together a few months ago need to be set up better: all starting on the same day, making sure I've set them to track the www or non-www version correctly, refining my keywords, etc.
I am stumped on what to do about the agency websites being really huge, and what all the options are to get good data in light of the 50,000 page crawl limit. Here is an example of what I mean:
To see how EPA is doing in searches related to air quality, ideally I'd track all of EPA's web presence.
www.epa.gov has 560,000 pages -- if I put in www.epa.gov for a campaign, what happens with the site having so many more pages than the 50,000 crawl limit? What do I miss out on? Can I "trust" what I get?
www.epa.gov/air has only 1,450 pages, so if I choose this for what I track in a campaign, the crawl will cover that sub-folder completely and I'll get a complete picture of this air-focused sub-folder ... but (1) I'll miss out on air-related pages in other sub-folders of www.epa.gov, and (2) it seems like I'd be leaving most of the 50,000-page crawl limit unused. (However, maybe that's not quite true -- I'd also be tracking other sites as competitors, e.g. non-profits that advocate on air quality and industry air-quality sites, and maybe those competitors count towards the 50,000-page crawl limit and would get me up to it? How do the competitors you choose figure into the crawl limit?)
Any opinions on which I should do in general on this kind of situation? The small sub-folder vs. the full humongous site vs. is there some other way to go here that I'm not thinking of?
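In case it helps anyone weighing the same trade-off: one rough way to see how many pages fall under a given sub-folder before committing a campaign to it is to count URLs in the site's XML sitemap. Here's a minimal Python sketch -- the inline sample sitemap is a made-up stand-in for illustration, not EPA's real sitemap (in practice you'd fetch the site's actual sitemap file first):

```python
import xml.etree.ElementTree as ET

def count_urls_with_prefix(sitemap_xml: str, prefix: str) -> int:
    """Count <loc> entries in a sitemap that start with the given URL prefix."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(sitemap_xml)
    locs = (loc.text.strip() for loc in root.findall(".//sm:loc", ns))
    return sum(1 for url in locs if url.startswith(prefix))

# Tiny inline sample standing in for a real fetched sitemap:
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.epa.gov/air/quality</loc></url>
  <url><loc>https://www.epa.gov/air/monitoring</loc></url>
  <url><loc>https://www.epa.gov/water/rules</loc></url>
</urlset>"""

print(count_urls_with_prefix(sample, "https://www.epa.gov/air"))  # 2
```

Counting per-prefix like this would at least tell you how much of a 50,000-page budget each candidate sub-folder would consume.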
-
Hi Sean -- Can you clarify for me how competitors in a campaign figure into the 50,000-page limit? Does the main site in the campaign get thoroughly crawled first, and then competitors are crawled up to the limit?
Some examples:
If the main site is 100 pages, and I pick 2 competitors that are 100 to 1000 pages and a 3rd gargantuan competitor of 300,000 pages, what happens? Does it matter in what order I enter competitors in this situation as to whether the 100-page and 1000-page competitors get crawled vs. whether the limit maxes out on the 300K competitor before crawling the smaller competitors?
If the main site is 300,000 pages, do any competitors in the campaign just not get crawled at all because the 50,000 limit gets all used up on the main site?
What if the main site is 20,000 pages and a competitor is 45,000 pages? Thorough crawl of main site and then partial crawl of competitor?
I feel like I have a direction to go in based on our previous discussion for the main site in the campaign, but now I'm still a little stumped and confused about how competitors operate within the crawl limit.
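Just to make my mental model concrete, here's the arithmetic I'm imagining for the scenarios above -- purely an illustration of an assumed "main site first, then competitors in entry order" policy, which I don't know is how Moz actually allocates the budget (that's exactly what I'm asking):

```python
def allocate_crawl_budget(site_sizes, budget=50_000):
    """Illustrative only: assume sites are crawled in the order given,
    each consuming from a shared page budget until it runs out.
    NOT confirmed Moz behavior."""
    crawled = []
    remaining = budget
    for size in site_sizes:
        pages = min(size, remaining)  # crawl the whole site or whatever budget is left
        crawled.append(pages)
        remaining -= pages
    return crawled

# Main site first, then competitors in entry order:
print(allocate_crawl_budget([100, 100, 1000, 300_000]))  # [100, 100, 1000, 48800]
print(allocate_crawl_budget([300_000, 5_000]))           # [50000, 0]
print(allocate_crawl_budget([20_000, 45_000]))           # [20000, 30000]
```

If the allocation really works this way, the entry order of competitors would matter a lot whenever one of them is gargantuan.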
-
Hi There,
Thanks for writing in! This is a tricky one, because it's difficult to say whether there is an objectively right answer.
In this case, your best bet would be to set the campaign to a sub-folder that is under the standard subscription crawl limit, and then pick up what you miss using our other research tools. Although the research tools are predominantly designed for one-off lookups, you could use them to capture information that falls a bit outside the campaign's purview. Here is a link to our research tools for your reference: moz.com/researchtools/ose/
If you do decide to enter a website that far surpasses the crawl limit, then what gets cut off is determined by the existing site structure.
The way our crawler works is that it starts from the link provided and follows the existing link structure, continuing to crawl the site until it reaches the crawl limit or runs into a dead end.
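In rough terms, you can picture it as a breadth-first traversal with a page budget. This is a simplified illustration, not our actual crawler code; `get_links` stands in for fetching a page and extracting its links, and the toy link graph is made up:

```python
from collections import deque

def crawl(start, get_links, limit=50_000):
    """Breadth-first crawl sketch: follow links outward from the start page
    until the page limit is hit or no uncrawled links remain."""
    seen = {start}
    queue = deque([start])
    crawled = []
    while queue and len(crawled) < limit:
        url = queue.popleft()
        crawled.append(url)
        for link in get_links(url):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return crawled

# Toy link graph standing in for a real site:
graph = {"/": ["/air", "/water"], "/air": ["/air/quality"],
         "/water": [], "/air/quality": []}
print(crawl("/", lambda u: graph[u], limit=3))  # ['/', '/air', '/water']
```

This is why, on a site far over the limit, the pages you get are the ones closest (by links) to the starting URL.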
Both approaches may present issues, so it will be more of a judgment call. One thing I will say is that we have a much easier time crawling fewer pages, so that may be something to keep in mind.
Hope this helps and if you have any questions for me please let me know.
Have a fantastic day!
-
Thanks Patrick for the tip about ScreamingFrog! I checked out the link you shared, and it looks like a powerful tool. I'm going to put it on my list of additional tools I need to get going on using.
In the meantime, though, I still need a strategy for what to do in Moz. Any opinions on whether I should set my Moz campaigns to the smaller sub-folders of a few thousand pages vs. the humongous full sites of 100,000+ pages? I guess I'm leaning towards setting them to the smaller sub-folders. Or maybe I should do a small sub-folder for one of the huge sites and do the full site for another campaign, and see what kind of results I get.
-
Hi there
I would look into ScreamingFrog -- you can crawl 500 URIs for free; with a license, you can crawl as many pages as you'd like.
Let me know if this helps! Good luck!