Drop in Indexed Pages + Organic Traffic
-
Hey Moz Community,
I've been seeing a steady decrease in Search Console in the number of pages being indexed by Google for our eCommerce site. This corresponds to lower impressions and traffic in general this year. We started with around a million pages indexed in Nov of 2015, down to 18,000 pages this Nov. Since we only carry around 3,000 or so products year round, I realized this is most likely a good thing.
I've checked to make sure our main landing pages are being indexed (they are), and our sitemap was updated several times this year, although we're in the process of updating it again to resubmit. I also checked our robots.txt and there's nothing out of the ordinary. In the last month we've gotten rid of some duplicate content issues caused by pagination by using canonical tags, but that's all we've done to reduce the number of pages crawled. We have also seen some soft 404s and server errors in our crawl error report that we've either fixed or are trying to fix.
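For anyone double-checking robots.txt the same way, a quick sketch with Python's standard `urllib.robotparser` can confirm whether specific landing pages are actually crawlable. The rules and example.com URLs below are placeholders, not the site's real robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for illustration; in practice call
# parser.set_url("https://example.com/robots.txt") and parser.read().
rules = """
User-agent: *
Disallow: /cart/
Disallow: /search/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Check a few key landing pages against the rules.
for url in ["https://example.com/products/widget",
            "https://example.com/search/widget"]:
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "->", "crawlable" if allowed else "blocked")
```

This only tests robots.txt blocking; it won't catch noindex tags or canonicalization, which are separate checks.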
I'm not really sure where to start looking for a solution, or whether it's even a huge issue, but the drop in traffic is also not great. The drop in traffic corresponded to a loss in rankings as well, so there may or may not be a correlation.
Any ideas here?
-
Oops, I missed that part. Have you checked Google Search Console to see if someone set any URL parameters?
The first thing I would do is determine how many pages actually should be indexed to see if there's a large discrepancy between that and the number Google shows. A crawler like Screaming Frog can help with this. If you export the crawl to Excel, you can easily remove duplicates in the canonical URL column and filter out the noindexed pages.
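If you'd rather script that deduplication than do it by hand in Excel, here's a minimal sketch of the same idea run over a crawl export CSV. The column names ("Address", "Canonical Link Element 1", "Meta Robots 1") are assumptions based on a typical Screaming Frog export and may differ in your version:

```python
import csv

def count_indexable(path):
    """Count unique indexable URLs in a crawl export:
    drop noindexed rows, then dedupe on the canonical URL."""
    canonicals = set()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if "noindex" in row.get("Meta Robots 1", "").lower():
                continue  # filter out noindexed pages
            # Fall back to the page's own URL when no canonical is set.
            canonical = row.get("Canonical Link Element 1") or row["Address"]
            canonicals.add(canonical)
    return len(canonicals)
```

The number this returns is your "pages that actually should be indexed" figure to compare against the Search Console index count.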
If you find there's no real discrepancy, Google may have simply been cleaning house of some really old links in the index that hadn't been crawled in a while.
Beyond that, if you can pinpoint any specific URLs that have been deindexed, use the "Fetch as Google" tool to help diagnose, or post them here so the community can take a look.
-
Hi Laura,
Thanks for your response. I know that adding canonical tags can lower our indexed page count. The only thing I'm worried about is that the drop in indexed pages has been steady since Nov of last year, and we just implemented the tags last month.
We have been seeing lower conversions with less traffic, some of which has to do with us not ranking as well for a few key terms as we did last year.
-
I suspect the canonical tag implementation is the culprit for the loss in indexed pages, but that's just a strong suspicion. When you implement a canonical tag, Google will fold together all of the pages pointing to the same canonical URL so that they are treated as one URL for the purposes of indexation. This will naturally cause a reduction in the overall number of pages indexed. You won't see 404 errors as a result of this either.
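To make that folding behaviour concrete, here's a hypothetical sketch that groups a set of crawled URLs by their declared canonical, analogous to how Google consolidates them for indexation. The URLs and rel=canonical snippets are made up, and the regex is a simplification of real HTML parsing:

```python
from collections import defaultdict
import re

# Hypothetical pages: (URL, head snippet containing its canonical tag).
pages = [
    ("https://example.com/shoes?page=1",
     '<link rel="canonical" href="https://example.com/shoes">'),
    ("https://example.com/shoes?page=2",
     '<link rel="canonical" href="https://example.com/shoes">'),
    ("https://example.com/hats",
     '<link rel="canonical" href="https://example.com/hats">'),
]

def canonical_of(html):
    """Pull the href out of a rel=canonical link tag, if present."""
    match = re.search(r'rel="canonical"\s+href="([^"]+)"', html)
    return match.group(1) if match else None

# Fold URLs that share a canonical into one indexable group.
index = defaultdict(list)
for url, html in pages:
    index[canonical_of(html) or url].append(url)

print(len(pages), "crawled URLs ->", len(index), "indexable URLs")
# prints: 3 crawled URLs -> 2 indexable URLs
```

The two paginated shoe URLs collapse into one indexable URL, which is exactly the kind of drop you'd see in the indexed-pages count after rolling out canonical tags.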
One important point to remember is that a decrease in indexed pages is not necessarily a bad thing that needs to be fixed. In fact, if you were having problems with duplicate content or crawl issues, canonicalizing those URLs should be beneficial if properly implemented.
As for the loss in traffic, it could be related or not. It may just be an adjustment period from the canonicalization, or it could be a separate issue. Have you seen a corresponding loss in conversions?