New SEO manager needs help! Only about 10% of our live sitemap (we're a ~4 million URL e-commerce site) is actually indexed in Google. What are best practices for sitemaps on big sites with a lot of changing content?
-
In Google Search Console:
- 4,218,017 URLs submitted
- 402,035 URLs indexed
What is the best way to troubleshoot this, and what is the best guidance for sitemap indexation on large sites with a lot of changing content?
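For context, a first sanity check when submitted and indexed counts diverge this much is to confirm what the sitemap files actually contain. Here is a minimal sketch using only Python's standard library, run against a made-up sample document (a real audit would fetch each child file listed in your sitemap index):

```python
# Count <loc> entries in a sitemap (or sitemap index) document using only
# the standard library -- a quick check that the "submitted" figure in
# Search Console matches what the files actually contain.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def count_locs(xml_text: str) -> int:
    """Return the number of <loc> entries in a sitemap XML document."""
    root = ET.fromstring(xml_text)
    return len(root.findall(f".//{SITEMAP_NS}loc"))

# Hypothetical two-URL sample for illustration.
sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/product/1</loc></url>
  <url><loc>https://example.com/product/2</loc></url>
</urlset>"""

print(count_locs(sample))  # 2
```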
-
Hi Hamish
I'm not sure how many products you have listed on your website, but I'm guessing it is not 4m, or even 400,000. I think the question you should be asking yourself is: 'Do I really need so many URLs?'
If you have 50,000 products on your site, then frankly you only need maybe 51,000 pages in total (including support pages, brands (maybe), categories and sub-categories). I'm only guessing, but I would suggest the other pages are being created by tags or other attributes, and that these elements are generating acres of duplicate and very thin content.
My usual question is: 'So you have 400,000 (never mind 4m) pages in Google. Did you write or generate 400,000 pages of useful, interesting, non-duplicate and shareable content?' The answer, of course, is usually no.
Try switching off sets of tags and canonicalizing very similar content, and you'll be amazed how much it helps rankings!
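To make that concrete, here's a rough sketch (the URLs and parameter names are made up for illustration) of how tag and sort variants all collapse onto one canonical target once you strip the query string:

```python
# Sketch: map faceted/tag URL variants back to one canonical product URL
# by dropping the query string and fragment. Parameter names here are
# illustrative only, not from any particular platform.
from urllib.parse import urlsplit, urlunsplit

def canonical_for(url: str) -> str:
    """Strip query and fragment so every variant points at the base page."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

variants = [
    "https://example.com/passports/black?tag=leather&sort=price",
    "https://example.com/passports/black?colour=black",
    "https://example.com/passports/black",
]

# All three variants collapse to a single canonical target.
targets = {canonical_for(u) for u in variants}
print(targets)  # {'https://example.com/passports/black'}
```

In practice each variant page would then carry a rel="canonical" tag pointing at that target.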
Just a thought
Regards Nigel
Carousel Projects.
-
This post from Search Engine Journal (https://www.searchenginejournal.com/definitive-list-reasons-google-isnt-indexing-site/118245/) is helpful for troubleshooting.
This Moz post (https://moz.com/blog/8-reasons-why-your-site-might-not-get-indexed) has some additional considerations. The sixth point the author raises is one you should pay particular attention to, given you're asking about a large e-commerce site: you might not have enough PageRank, because "the number of pages Google crawls is roughly proportional to your pagerank".
As you probably know, Google has said it no longer publishes PageRank scores, but the essence of the issue is still solid. Google sets a crawl budget for every website, and large e-commerce sites often exhaust it before the entire site is crawled. Look at your site structure, robots tagging and, as Jason McMahon says, internal linking, to make sure you are directing Google to the most important pages on your site first, and that all redundant content is canonicalized or noindexed.
I'd start with that.
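As an illustration of that triage, a sketch like the following (the URL patterns and rules are invented for the example, not a drop-in solution) can help decide which URL classes belong in the sitemap, which should get a canonical tag, and which should be noindexed:

```python
# Illustrative triage of URL classes on a large e-commerce site. The
# patterns below are hypothetical; the point is that only "index" URLs
# belong in the sitemap, while the rest get a canonical tag or a noindex.
from urllib.parse import urlsplit, parse_qs

def classify(url: str) -> str:
    parts = urlsplit(url)
    params = parse_qs(parts.query)
    if "q" in params or parts.path.startswith("/search"):
        return "noindex"       # internal search results waste crawl budget
    if params:
        return "canonicalize"  # faceted/sorted variants point at the base page
    return "index"             # clean URLs go in the sitemap

assert classify("https://example.com/search?q=boots") == "noindex"
assert classify("https://example.com/boots?sort=price") == "canonicalize"
assert classify("https://example.com/boots") == "index"
```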
-
Hi Hamish_TM,
It is hard to say without knowing the exact URL, but here are some things to consider:
- **Indexing Lag** - How long ago did you submit the sitemaps? We usually find there can be at least a few weeks' lag between when sitemaps are submitted and when all the URLs are indexed.
- **Internal Linking** - What does your site's internal linking structure look like? Good internal linking, such as breadcrumbs, in-text links, sidebar links and a siloed URL structure, can help the indexation process.
- **Sitemap Errors** - Are there currently any sitemap errors listed in Google Search Console, either on the dashboard or in the sitemaps section? Any issues here could be adding to your problem.
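On the sitemap errors point, one hard limit worth ruling out first: the sitemap protocol allows at most 50,000 URLs (and 50MB uncompressed) per file, so a 4m-URL site needs a sitemap index referencing roughly 85 child files. A minimal sketch of the split (the filenames and hostname are illustrative):

```python
# Split a URL list into sitemap files of at most 50,000 URLs each and
# build a sitemap index referencing them. Filenames are illustrative.
MAX_URLS_PER_SITEMAP = 50_000  # sitemap protocol limit per file

def chunk(urls, size=MAX_URLS_PER_SITEMAP):
    """Break a flat URL list into sitemap-sized batches."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

def sitemap_index(n_files: int, base="https://example.com/sitemaps") -> str:
    """Render a sitemap index that points at n_files child sitemaps."""
    entries = "\n".join(
        f"  <sitemap><loc>{base}/sitemap-{i}.xml</loc></sitemap>"
        for i in range(n_files)
    )
    return (
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</sitemapindex>"
    )

urls = [f"https://example.com/product/{i}" for i in range(120_001)]
files = chunk(urls)
print(len(files))  # 3 (50,000 + 50,000 + 20,001)
```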
Hopefully this is of some help. Let me know how you go.
Regards,
Jason.