Sitemaps and Indexed Pages
-
Hi guys,
I created an XML sitemap and submitted it for my client last month.
Now the developer of the site has also been messing around with a few things.
I've noticed on my Moz site crawl that indexed pages have dropped significantly.
Before I put my foot in it, I need to figure out whether submitting the sitemap caused this. Can a sitemap reduce the number of pages indexed?
Thanks
David.
-
Sorry - I missed the part about you looking specifically at the Moz crawler. While useful, it's only a stand-in for what actually determines rankings: the crawls by the search engines themselves. If you're concerned there's an issue, I'd go straight to the source rather than trusting Mozbot alone. You can find the search engines' own crawl data in Google Search Console and Bing Webmaster Tools. Look for trends and patterns there, especially around the sitemap report.
The challenge with a Screaming Frog-generated sitemap is that it can only include what the crawler finds by following links. If the site has orphaned pages or an ineffective internal linking structure, a crawl can easily miss pages. It's certainly better than no sitemap, but a map generated by the site's own platform (usually from the database) is safer.
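To illustrate what I mean by generating from the data source: a database-driven map just writes out every live URL the system knows about, whether or not anything links to it. A minimal sketch (the function, file name, and example URLs here are purely illustrative, and in practice the URL list would come from the site's database):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls, outfile="sitemap.xml"):
    """Write a minimal sitemap.xml from a list of canonical URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    ET.ElementTree(urlset).write(outfile, encoding="utf-8", xml_declaration=True)

# In practice these would be pulled from the site's database
# (e.g. every published page, including orphaned-but-live ones),
# rather than from a link-following crawl.
build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/about/",
])
```

Because the list comes straight from the database, an orphaned page still ends up in the map, whereas a link-following crawler would never reach it.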
P.
-
Thanks Paul,
Yes, there has been a big clean-up of pages. There were over 80,000 to begin with. I managed to get that down to about 14k, but then last month the Moz bot crawled only about 4,000 pages.
I was just a bit worried that the sitemap generated by Screaming Frog was incorrect and that this was the reason for the drop.
I was referring mainly to the MOZ site crawl. I guess I was worried that the MOZ bot only followed the sitemap!
There were loads of filter URLs and all sorts going on, so it's a bit of a spider's web!
-
No - submitting a sitemap won't reduce the crawl of a site. The search engines will crawl the URLs in the sitemap and add them to the index if they consider them worthy, but they'll still crawl any other links/pages they can find in other ways and index those too, if they consider them worthy.
Note, though, that a drop in the number of indexed pages is not necessarily a bad thing. If removing a large number of worthless/duplicate/canonicalised/no-indexed pages cleans up the site, that will also be reflected in fewer crawled pages - an indication that the quality-improvement work was effective.
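One quick sanity check on the Screaming Frog file itself is to count how many URLs it actually contains and compare that to the crawl numbers you're seeing. A minimal sketch, assuming a standard single sitemap.xml rather than a sitemap index file (the function name is just illustrative):

```python
import xml.etree.ElementTree as ET

def count_sitemap_urls(path):
    """Count <loc> entries in a standard (non-index) sitemap file."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.parse(path).getroot()
    return len(root.findall("sm:url/sm:loc", ns))
```

If the file really does contain ~14k URLs but only ~4k pages are being crawled, the gap is a crawling/indexing question to investigate in Search Console, not an error in the sitemap file itself.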
That help?
Paul