Google Index Status Falling Fast - What should I be considering?
-
Hi Folks,
I'm working on an ecommerce site and have found a month-on-month fall in the Index Status that has continued since late 2015. According to Google Webmaster Tools, only around 80% of our pages are now indexed.
I do not seem to have any bad links or server issues. I am in the early stages of working through the site, updating content and tags, but have yet to see the fall slow down.
If anybody has tips on where to look for issues, or insight into how to resolve this, I would really appreciate it.
Thanks everybody!
Tim
-
Hi dude, thank you so much for taking the time to look at this site. It is really kind of you. I will be taking a look at all the points raised over the next week to see what we can achieve. Thanks, Tim
-
Thank you for taking so much time to look at our site. I really appreciate it. I will dig in to the points to see what we can achieve. Thanks again, Tim
-
Thanks dude, I will take a look at this. Really appreciate you taking the time to respond.
-
Hi Tim,
I agree with Laura on the canonical tags. I've worked on several large Magento sites and I've never seen any issue with the way Magento handles it - by canonicalizing product URLs to the root directory.
In fact, I actually prefer this way over assigning a product to a 'primary' category and using that as the canonical.
As Laura said, a reduction in the total number of indexed pages might actually be a big positive here! Having more pages indexed is not necessarily better. If low-quality or duplicate pages have been removed from the index, that's a really good thing.
I did find some issues with your robots.txt file:
- Disallow: /media/ - should be removed because it's blocking images from being crawled (this is a default Magento thing and they should remove it!)
- Disallow: /? - this basically means that any URLs containing a ? will not be crawled, and with the way pagination is set up on the site, this means that any pages after page 1 are not being crawled.
This could be impacting how many product pages you have indexed - which would definitely be a bad thing! You would obviously want your product pages to be crawled and indexed.
Solution: I would leave Disallow: /? in robots.txt because it stops product filter URLs from being crawled, but I would add the following line:
Allow: */?p=
This line will allow your paginated pages to be crawled, which will also allow products linked from those pages to be crawled.
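For what it's worth, Google resolves competing Allow/Disallow rules by taking the longest (most specific) matching pattern, with Allow winning ties. Here's a minimal sketch of that matching logic in Python (my own illustration, not anything from Magento or Google's code), showing why `Allow: */?p=` would let a paginated URL through while `Disallow: /?` still blocks filter URLs:

```python
import re

def pattern_to_regex(pattern):
    """Translate a robots.txt path pattern to a regex.
    '*' matches any run of characters; a trailing '$' anchors the end."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    body = ".*".join(re.escape(part) for part in pattern.split("*"))
    return body + ("$" if anchored else "")

def is_allowed(path, rules):
    """rules: list of (directive, pattern) pairs, e.g. ("Disallow", "/?").
    Google-style precedence: the longest matching pattern wins, Allow
    beats Disallow on ties, and an unmatched path defaults to allowed."""
    best = None
    for directive, pattern in rules:
        if pattern and re.match(pattern_to_regex(pattern), path):
            candidate = (len(pattern), directive.lower() == "allow")
            if best is None or candidate > best:
                best = candidate
    return True if best is None else best[1]

rules = [("Disallow", "/?"), ("Allow", "*/?p=")]
print(is_allowed("/?p=2", rules))        # paginated URL: allowed
print(is_allowed("/?color=red", rules))  # filter URL: still blocked
```

This is a simplified sketch; as far as I know, Python's built-in `urllib.robotparser` doesn't implement Google's wildcard extensions, so for wildcard rules like these it's safer to hand-check them or use the robots.txt tester in Search Console.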
Hope this helps!
Cheers,
David
-
I would be interested in seeing examples of where this has happened. Were the canonical tags added after the URLs were already indexed or were the canonicals in place when the site launched?
-
However, the canonical is only an advisory tag. I've seen a few cases where people relied on their canonical tags on sites with multiple product URL types (as above, with the category in the URL and the plain product URL) and many references to those different URLs elsewhere (onsite and offsite), and both versions ended up indexed, which is not always ideal. Using a single URL per product also means that reporting tools such as Screaming Frog show only the true URLs on the site, and it saves crawl budget because Google doesn't have to crawl both the category-based URL and the canonical URL.
Whilst it's not a major issue, it's something I would look at changing.
-
If I understand you correctly, you are referring to the following two URLs:
https://www.symectech.com/epos-systems/customer-displays/pole-mounting-kit-94591.html
https://www.symectech.com/pole-mounting-kit-94614.html
Both of these have the same canonical referenced, which is https://www.symectech.com/pole-mounting-kit-94614.html.
It doesn't matter what actually shows in the address box. For the purposes of indexation, what matters is what is referenced in the canonical tag.
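To make that concrete, both pages would carry an identical tag in their `<head>` (illustrative markup based on the URLs above, not copied from the site's source):

```html
<!-- In the <head> of BOTH
     /epos-systems/customer-displays/pole-mounting-kit-94591.html
     and /pole-mounting-kit-94614.html -->
<link rel="canonical" href="https://www.symectech.com/pole-mounting-kit-94614.html" />
```

Google consolidates indexing signals onto the href target, so the category-path version should drop out of the index over time.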
-
What I've suggested would avoid these duplicate URLs. Here are some actual examples: going via a tier-two category I get the following product URL:
https://www.symectech.com/epos-systems/customer-displays/pole-mounting-kit-94591.html
With a canonical of:
https://www.symectech.com/pole-mounting-kit-94614.html
Yet when going from https://www.symectech.com/epos-systems/?limit=32&p=2 (a tier-one category) I get the canonical URL.
So if a product is listed in multiple tier-two categories, that's multiple URLs for the same product. With the suggestion I made, there would be only one variation of this product URL (the canonical).
-
A reduction in the number of pages indexed does not necessarily mean something is wrong. In fact, it could mean that something is right, especially if your rankings are improving.
How are you determining that only 80% of pages are indexed? Can you provide a specific URL that is not being indexed?
If you made changes to your canonical tags, robots.txt, or meta robots tags, these could all cause a reduction in the number of pages being indexed.
-
The canonicals appear to be set up correctly, and I would not advise listing the product URLs as their canonicals in the category as suggested above. That will create duplicate URLs with the same content, which is exactly what canonical tags are designed to avoid.
-
Just going through Laura's list as a checklist for ones that are applicable:
- Have you checked your robots.txt file or page-level meta robots tag to see if you are blocking or noindexing anything?
Nothing that I can see is causing a major issue.
- Is it a large site? If so, check for issues that may affect crawl budget.
The main thing I can see is that the product URLs and their canonicals are different. Is there any way of listing the product URLs as their canonical versions in the category?
-
Sorry for the delay in response. The website is symectech.com
We have fixed various issues, including a noindex issue, earlier this year, but our index status is continuing to fall. However, our rankings seem to be improving week on week according to Moz. Thanks.
Tim
-
Just to echo what Laura has said, if you can share a URL that would be great so we can help you get to the source of the problem.
Try running a tool like Screaming Frog (https://www.screamingfrog.co.uk/seo-spider/) to check the issues Laura has mentioned above, as doing a lot of those by hand can be quite time-consuming.
Also, do you have a drop in rankings alongside your pages falling out of the index?
-
Any chance you can share the URL? That would make it much easier for someone to help in this forum. Without the URL, I can offer a few diagnostic questions.
- Have the number of pages on the site remained the same and pages are being removed from the index? Or have you added more content, but the percentage in the index has decreased?
- Have you checked your robots.txt file or page-level meta robots tag to see if you are blocking or noindexing anything?
- Have you submitted an XML sitemap? If so, check the XML sitemap to make sure what's being submitted should be indexed. It's possible to submit a sitemap that includes noindexed pages, especially with some automated tools.
- Is it a large site? If so, check for issues that may affect crawl budget.
- Have you changed any canonical tags?
- Have you used the Fetch as Google tool to diagnose a specific URL?
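On the sitemap point, a quick offline sanity check is to pull every `<loc>` out of the submitted sitemap and compare it against the URLs your crawler reports as noindexed. A rough sketch (the example.com URLs are hypothetical; the namespace is the standard sitemaps.org one):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Extract every <loc> from a standard XML sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall("sm:url/sm:loc", SITEMAP_NS)]

def submitted_but_noindexed(xml_text, noindexed):
    """URLs that are in the sitemap but carry a noindex directive --
    these send Google conflicting signals and should be removed from
    the sitemap (or have the noindex lifted)."""
    return [u for u in sitemap_urls(xml_text) if u in noindexed]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/product-1.html</loc></url>
  <url><loc>https://www.example.com/checkout/</loc></url>
</urlset>"""

# Suppose a crawl found /checkout/ is noindexed:
print(submitted_but_noindexed(sample, {"https://www.example.com/checkout/"}))
```

You can feed the noindexed set from a Screaming Frog export; anything this flags is worth fixing before worrying about deeper crawl-budget issues.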