Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Sitemap indexed pages dropping
-
About a month ago I noticed the pages indexed from my sitemap are dropping. There are 134 pages in my sitemap and only 11 are indexed. It used to be 117 pages and then dropped off quickly. I still seem to be getting consistent search traffic, but I'm just not sure what's causing this. There are no warnings or manual actions required in GWT that I can find.
-
Just wanted to update this. It took a month, but after I decided to completely remove the canonical tags and handle duplicate content with URL rewrites and 301 redirects instead, I now have 114 out of 149 pages indexed from my sitemap, which is much better. It had dropped to 5 out of 149 at one point.
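For readers wondering what "handle duplicate content with URL rewrites and 301 redirects" looks like in practice, here is a minimal sketch of the idea: pick one preferred form for every URL and 301 anything else to it, instead of relying on canonical tags. The normalisation rules and example.com URLs below are hypothetical, not the poster's actual setup.

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical rules: collapse the duplicate URL variants that a canonical
# tag would otherwise have to paper over, so each page has exactly one
# address and every other variant gets a 301 to it.
DEFAULT_DOCS = ("index.php", "index.html", "home.aspx")

def canonical_url(url: str) -> str:
    """Return the single preferred form of a URL (lowercase host,
    default documents stripped so directories end in a slash)."""
    parts = urlsplit(url)
    path = parts.path or "/"
    for doc in DEFAULT_DOCS:
        if path.lower().endswith("/" + doc):
            path = path[: -len(doc)]  # /shop/home.aspx -> /shop/
            break
    return urlunsplit((parts.scheme, parts.netloc.lower(), path, parts.query, ""))

def redirect_for(url: str):
    """If the requested URL is not in canonical form, return the 301 target."""
    target = canonical_url(url)
    return None if target == url else target
```

A request handler would then issue a `301` with `Location: redirect_for(url)` whenever the function returns a non-None value, and serve the page normally otherwise.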
-
Hi Stephen,
Great that you've probably found the cause - this will absolutely cause mass de-indexation. I had a client a year ago canonicalise their entire site (two sites, actually) to the home page. All their rankings and indexed pages dropped off over a matter of about six days (we spotted the tag immediately but the fix went into a "queue" - ugh!).
The bad news is that it took them a long time to get properly re-indexed and regain their rankings (I am talking months, not weeks). Having said this, the sites were nearly brand new - they had very few backlinks and were both less than six months old. I do not believe that an older site would have had as much problem regaining rankings, but I can't be sure and I have only seen that situation take place first-hand once.
-
I may have found the issue today. Most of the articles are pulled from a database and I think I placed a wrong canonical tag on the page which screwed up everything. Does anyone know how long it takes before a fix like this will show?
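A wrong canonical tag on a templated page is easy to audit for. The sketch below checks one page's markup for a missing, duplicated, or mismatched canonical tag; fetching each sitemap URL and feeding its HTML through this check is left out, and the example URLs are hypothetical.

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect every <link rel="canonical"> href in a page's HTML."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonicals.append(a.get("href"))

def check_canonical(page_url: str, html: str):
    """Return a list of problems with the page's canonical tag, if any."""
    finder = CanonicalFinder()
    finder.feed(html)
    tags = finder.canonicals
    if len(tags) > 1:
        return ["multiple canonical tags: " + ", ".join(tags)]
    if tags and tags[0] != page_url:
        return ["canonical points elsewhere: " + tags[0]]
    return []
```

Run against every URL in the sitemap, this would immediately flag the database-driven template pointing all articles at one URL, which is the failure mode described in this thread.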
-
That's a good catch; I fixed that. I do use that in WMT and it has been fine for the longest time. I guess it's not that big of an issue; my main concern was the pages being indexed. I was reading another Q&A thread and used the info: qualifier to check some of the pages, and all the ones I checked are indexed, which is more than the 11. I just don't understand why it dropped all of a sudden, and whether that number really means anything.
-
How are the indexed numbers looking in WMT today? I see 3,370 results for a site: search on the domain, but those can be iffy in terms of up-to-date accuracy: https://www.google.co.uk/search?q=site%3Agoautohub.com&oq=site%3Agoautohub.com&aqs=chrome..69i57j69i58.798j0j4&sourceid=chrome&es_sm=119&ie=UTF-8
Not that this should matter too much if you are submitting a sitemap through WMT, but your robots.txt file specifies sitemap.xml. There is a duplicate sitemap at that URL (http://goautohub.com/sitemap.xml) - are you using sitemap.php, which you mention here, in WMT? .php can be used for sitemaps, but I would update the robots.txt file to reflect the correct URL - http://i.imgur.com/uSB1P1g.png - whichever is meant to be right. I am not aware of problems with having duplicate sitemaps as long as they are identical, but I'd use just one if it were me.
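The robots.txt mismatch described above (robots.txt pointing at sitemap.xml while sitemap.php is submitted in WMT) is simple to check for programmatically. A minimal sketch, operating on the robots.txt text rather than fetching it, with the goautohub URLs used purely as sample data:

```python
def sitemap_urls(robots_txt: str) -> list:
    """Extract every Sitemap: directive from a robots.txt body."""
    urls = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if line.lower().startswith("sitemap:"):
            urls.append(line.split(":", 1)[1].strip())
    return urls

def robots_matches_submitted(robots_txt: str, submitted_url: str) -> bool:
    """Does robots.txt advertise the same sitemap URL submitted to WMT?"""
    return submitted_url in sitemap_urls(robots_txt)
```

In the situation in this thread, `robots_matches_submitted` would return False for the submitted sitemap.php URL, flagging exactly the inconsistency William points out.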
-
Thanks for checking, I haven't found anything yet. The site is goautohub.com. It's a custom site and the sitemap file is auto-generated: goautohub.com/sitemap.php. I've done it like that for over a year. I did start seeing an error message about high response times, and I've been working on improving that. It makes sense, because we have been advertising more to get the site seen. In regards to the rest of William's points, I have checked those, but no improvement yet. Thank you.
-
Hi Stephen,
Checking in to see if you had checked the points William has raised above. Do you see anything that could have resulted in the drop? Also, are you comfortable sharing the site here? We might be able to have a look too (feel free to PM if you are not comfortable sharing publicly).
Cheers,
Jane
-
Try to determine when the drop off started, and try to remember what kinds of changes the website was going through during that time. That could help point to the reason for the drop in indexing.
There are plenty of reasons why Google may choose not to index pages, so this will take some digging. Here are some places to start the search:
-
Check your robots.txt to ensure those pages are still crawlable
-
Check to make sure the content on those pages isn't duplicated somewhere else on the Web.
-
Check to see if there were any changes to canonical tags on the site around when the drop started
-
Check to make sure the sitemap currently on the site matches the one you submitted to Webmasters, and that your CMS didn't auto-generate a new one
-
Make sure the quality of the pages is worth indexing. You said your traffic didn't really take a hit, so it's not de-indexing your quality stuff.
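The sitemap-matching check in the list above (live sitemap vs. the copy submitted to Webmasters) can be automated with a short diff of the two files' `<loc>` entries. A sketch, assuming both documents are already fetched as strings; the example.com URLs in the usage are hypothetical:

```python
import xml.etree.ElementTree as ET

# Namespace used by the sitemaps.org protocol.
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_locs(xml_text: str) -> set:
    """Return the set of <loc> URLs in a sitemap document."""
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.iter(NS + "loc")}

def sitemap_diff(live_xml: str, submitted_xml: str) -> dict:
    """URLs that differ between the live sitemap and the submitted copy."""
    live, submitted = sitemap_locs(live_xml), sitemap_locs(submitted_xml)
    return {"only_live": live - submitted, "only_submitted": submitted - live}
```

If a CMS has silently regenerated the sitemap, both sets in the result will be non-empty, which is the signal this checklist item is after.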