Sitemap generator only finding part of the website's URLs
-
Hi everyone,
When creating my XML sitemap, the generator is only able to detect a portion of the website. I am missing at least 20 URLs (blog pages plus newly created resource pages). I have checked those missing URLs: all of them are indexable and none are blocked by robots.txt.
Any idea why this is happening? I need to make sure all of the URLs I want are included in the XML sitemap.
Thanks!
-
Gaston,
Interestingly enough, by default the generator only located half of the URLs. I hope that one of those two fields will do the trick.
-
Hi Taysir,
I've never used that service, but I suspect that the section you refer to should do the trick.
I assume you know how many URLs the whole site has, so you can compare how many pro-sitemaps.com finds against your own numbers. Best of luck!
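If you want to automate that comparison, one option is to parse the generated sitemap and diff its `<loc>` entries against your own list of known pages. A rough Python sketch using only the standard library (all URLs here are made-up examples):

```python
import xml.etree.ElementTree as ET

# Namespace used by the sitemaps.org protocol.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return the set of <loc> URLs found in a sitemap document."""
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.findall(".//sm:loc", NS)}

# The generator's output (in practice you would read your sitemap.xml file).
generated = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/post-1</loc></url>
</urlset>"""

# The pages you know exist on the site.
known_pages = {
    "https://example.com/",
    "https://example.com/blog/post-1",
    "https://example.com/resources/new-page",  # not picked up by the generator
}

# Any URL in this set is one the generator failed to find.
missing = known_pages - sitemap_urls(generated)
print(missing)
```

The list of missing URLs is usually a good starting point for spotting a pattern, e.g. all of them sitting in one section that has no internal links pointing to it.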
GR -
Thanks for your response, Gaston. These pages are definitely not blocked by the robots.txt file, so I think it is an internal linking problem. I actually subscribed to pro-sitemap.com and was wondering if I should use this section to add the remaining sitemap URLs that are missing: https://cl.ly/0k0t093f0Y1T
Do you think this would do the trick?
-
Google provides a basic template, so you could build the sitemap manually if you wished, and this link has Google listing several dozen open-source sitemap generators.
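To illustrate how little that template requires, here is a minimal Python sketch that builds a sitemaps.org-format file from a list of pages (the URLs are placeholders):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemaps.org-compliant sitemap as an XML string."""
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = page
    return ET.tostring(urlset, encoding="unicode", xml_declaration=True)

xml = build_sitemap([
    "https://example.com/",
    "https://example.com/blog/post-1",
])
print(xml)
```

You would save the output as sitemap.xml at the site root and submit it in Webmaster Tools; the protocol also allows optional per-URL tags such as lastmod, but only loc is required.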
If Google Webmaster Tools can't fully read the one you generated, then an alternative generator should fix that for you. Good luck!
-
Hi Taysir!
Have you tried any other crawler to check whether those pages can be found?
I'd strongly suggest the Screaming Frog SEO Spider; the free version allows up to 500 URLs. It also has a feature to create sitemaps from the crawled URLs, though I don't know whether that is available in the free version.
Here is some info about that feature: XML Sitemap Generator - Screaming Frog
The usual reasons pages are not findable are:
- Poor internal linking
- Not having a sitemap (which is how you found this out)
- Blocked resources in robots.txt
- Blocked pages with robots meta tag
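For the robots.txt points on that list, Python's standard-library robotparser can confirm whether specific URLs are actually blocked. A small sketch using made-up rules and URLs (in practice you would point RobotFileParser at the live robots.txt and call read()):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt, parsed from a string for illustration.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Check each URL you expected to see in the sitemap.
for url in ["https://example.com/blog/post-1",
            "https://example.com/private/draft"]:
    status = "allowed" if rp.can_fetch("*", url) else "blocked"
    print(url, status)
```

Running this over your list of missing URLs quickly rules the robots.txt explanation in or out.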
That said, it's completely normal for Google to have indexed pages that you can't find in an ad-hoc crawl; Googlebot could have discovered those pages through external links.
Also keep in mind that blocking pages with robots.txt will not prevent them from being indexed, nor will adding blocking rules deindex pages that are already indexed (a robots meta noindex tag, by contrast, can remove a page once it is recrawled). Hope it helps.
Best luck
GR