Does a sitemap override Google parameter handling?
-
This question might seem silly, but I'll ask anyway.
We have an eCommerce site with a ton of duplicate content, mostly caused by faceted navigation. In researching ways to reduce the clutter, I've decided to use Google parameter handling to stop Googlebot from crawling pages with certain parameters, like: sort order, page #, etc...
Now my question:
If I set all of these parameters so that Googlebot doesn't crawl the grids, how will they ever find the individual product pages? We do upload a sitemap with all of the product pages. Does this solve my issue? Or, should I handle the duplicate content with noindex, follow tag?
Or, is there an even better way?
Thanks
-
Hello John,
This is a very good question, and something people don't often think about when blocking the navigational paths on their site from being crawled.
Depending on how fast your category pages load and how many products are on each of them, you may want to consider a "View All" page with rel=canonical: http://googlewebmastercentral.blogspot.com/2011/09/view-all-in-search-results.html
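As a rough illustration of that pattern (the URLs here are hypothetical), each paginated or sorted view of a category points its canonical at the View All version:

```html
<!-- On a paginated/sorted view such as http://www.example.com/widgets?page=2&sort=price
     (hypothetical URL), consolidate indexing signals onto the View All page -->
<link rel="canonical" href="http://www.example.com/widgets/view-all" />
```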
There are many different ways to handle faceted navigation problems, including JavaScript, GWT parameter handling, robots meta tags, robots.txt, rel=canonical... and combinations of these. The right approach should be customized to your specific needs. When possible, I prefer to allow Google to crawl and index down to a certain level of faceting, similar to allowing them into sub-categories (though it depends entirely on your taxonomy) but not tertiary (i.e. sub-sub) categories. For the next couple of levels I might allow them to crawl, but not index. And once it gets down to 4 or 5 levels deep (e.g. /?category=1&size=5&color=blue&price=low&this=that&so-on=so-forth...) I just block them from being both indexed and crawled (i.e. meta NOINDEX,NOFOLLOW or a robots.txt block) to save crawl budget by avoiding spider traps.
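To make that layered approach concrete, here's a minimal sketch; the parameter names and paths are made up, so adapt them to your own URL structure. Mid-level facet pages get a robots meta tag that allows crawling but blocks indexing:

```html
<!-- On 2nd/3rd-level facet pages (e.g. /?category=1&size=5&color=blue):
     let Googlebot follow the links, but keep the page out of the index -->
<meta name="robots" content="noindex, follow">
```

while the deepest combinations are blocked from crawling entirely:

```
# robots.txt - block crawling of the deepest facet combinations
# (hypothetical parameters; list whichever ones mark 4th/5th-level facets)
User-agent: *
Disallow: /*&price=
Disallow: /*&this=
```

One caveat worth remembering: robots.txt blocks crawling but doesn't guarantee de-indexing, and a page must be crawlable for its noindex tag to be seen, so don't combine both methods on the same URL.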
With all of that said, if you are giving Google an XML sitemap that contains the indexable URLs for all of your products, they should have no problem indexing them, regardless of whether or not they can crawl all the way through your faceted navigation.
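For example, a product-only XML sitemap along these lines (the URLs are placeholders) gives Google a direct path to each product page, no matter how deeply it sits in the navigation:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List only the canonical, indexable product URLs -
       not the faceted/parameterized variants -->
  <url>
    <loc>http://www.example.com/products/blue-widget</loc>
  </url>
  <url>
    <loc>http://www.example.com/products/red-widget</loc>
  </url>
</urlset>
```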
-
I would recommend using a canonical link (rel=canonical).
You can find more here:
Related Questions
-
Google not taking Meta...
Hello all, So I understand that Google may sometimes take content from the page as a snippet to display on SERPs rather than the meta description, but my problem goes a little beyond that. I have a section on my site which updates every day, so a lot of the content is dynamic (products for a shop; every morning unique stock is added or removed), and despite having a meta description, a title, and receiving an 'A' grade in the Moz on-page grader, these pages never show up in Google. After a little research I did a 'site:www.mysite.com/productpage' search in Google, and this indeed listed all my products, but interestingly, for every single one Google had taken the copyright notice at the bottom of the page as the snippet instead of the meta description or any H1, H2 or P text on the page... Does anyone have any idea why Google is doing this? It would explain a lot to me in terms of overall traffic, I'm just out of ideas... Thanks!
Intermediate & Advanced SEO | HB170 -
Big discrepancies between pages in Google's index and pages in sitemap
Hi, I'm noticing a huge difference between the number of pages in Google's index (using a 'site:' search) and the number of pages indexed by Google in Webmaster Tools (i.e. 20,600 in the 'site:' search vs 5,100 submitted via the dynamic sitemap). Does anyone know possible causes for this and how I can fix it? It's an ecommerce site, but I can't see any issues with duplicate content - they employ a very good canonical tag strategy. Could it be that Google has decided to ignore the canonical tag? Any help appreciated, Karen
Intermediate & Advanced SEO | Digirank0 -
Google drop down - keyword gone, why?
Hi guys, I used to receive traffic from a yearly term; this year, for '2013', I noticed it is nowhere near what the yearly term brought the year before. I believe that Google has stopped the yearly term appearing in the drop-down menu for a big-volume related term. My question is: how do they determine what goes in the drop-down menu for related/relevant searches?
Intermediate & Advanced SEO | pauledwards0 -
DMCA Complaint to Google - HELP
I have several sites copying my content, which I found out via Copyscape.com. Unfortunately, this is giving me duplicate content. I filed a DMCA complaint through Google and the infringing pages were approved, but the pages still remain. Can someone please help me understand this better? I thought Google was supposed to remove these pages? Am I supposed to contact the site owner to get the content removed, or are their pages simply de-indexed?
Intermediate & Advanced SEO | tutugirl0 -
Google badge extracted to SERPs
A while ago I read (or thought I read) the following information on the Google badge. Here https://developers.google.com/+/plugins/badge/ we have the implementation guide; however, I was under the impression the Google badge could thereafter be extracted into SERPs so the user could follow etc. directly from SERPs. I can't find anything confirming this. I think that it might clash with authorship data, which does a similar job, but where a site page is not relevant to authorship at all, I would have thought linking back to the G+ page from SERPs was a sensible option. Can anyone confirm the Google badge can appear in SERPs?
Intermediate & Advanced SEO | richcowley0 -
Googlebot vs. Google mobile bot
Hi everyone 🙂 I seriously hope you can come up with an idea for a solution to the problem below, because I am kinda stuck 😕

Situation: A client of mine has a webshop located on a hosted server. The shop is made in a closed CMS, meaning that I have very limited options for changing the code: limited access to the page head, and within the CMS I can only use JavaScript and HTML. The only place I have access to a server-side language is in the root, where a Default.asp file redirects the visitor to the specific folder where the webshop is located. The webshop has 2 "languages"/store views: one for normal browsers and Googlebot, and one for mobile browsers and Google's mobile bot. In Default.asp (classic ASP) I do a test for user agent and redirect the user to either the main domain or the mobile sub-domain. All good, right? Unfortunately not. Now we arrive at the core of the problem.

Since the mobile shop was added at a later date, Google already had most of the pages from the shop in its index, and apparently uses them as entrance pages to crawl the site with the mobile bot. Hence it never sees Default.asp (or outright ignores it), and this causes, as you might have guessed, a huge pile of duplicate content. Normally you would just place some user-agent detection in the page head and throw Google either a 301 or a rel=canonical. But since I only have access to JavaScript and HTML in the page head, this cannot be done. I'm kinda running out of options quickly, so if anyone has an idea as to how the BEEP I get Google to index the right domains for the right devices, please feel free to comment. 🙂 Any and all ideas are more than welcome.
Intermediate & Advanced SEO | ReneReinholdt0 -
How do I index these parameter generated pages?
Hey guys, I've got an issue with a site I'm working on. A big chunk of the content (roughly 500 pages) is delivered using parameters on a dynamically generated page. For example: www.domain.com/specs/product?=example - where "example" is the product name. Currently there is no way to get to these pages unless you enter the product name into the search box and access them from there. Correct me if I'm wrong, but unless we find some other way to link to these pages, they're basically invisible to search engines, right? What I'm struggling with is a method to get them indexed without doing something like creating a directory-map-type page of all of the links, which I guess wouldn't be a terrible idea as long as it was done well. I've not encountered a situation like this before. Does anyone have any recommendations?
Intermediate & Advanced SEO | CodyWheeler0 -
What's the best way to handle product microformats such as hProduct and GoodRelations on ecommerce sites for Google?
With web 3.0 results with microformatting showing in Google, Yahoo, etc. through reviews, in-stock status, events, sales, pricing, etc.
Intermediate & Advanced SEO | RampUpInteractive0