Overly-Dynamic URL
-
Hi,
We have over 5,000 pages showing under the Overly-Dynamic URL error.
Our ecommerce site uses Ajax and we have several different filters, like Size, Color, and Brand, and we therefore have many different URLs, like:
http://www.dellamoda.com/Designer-Pumps.html?sort=price&sort_direction=1&use_selected_filter=Y
http://www.dellamoda.com/Designer-Accessories.html?sort=title&use_selected_filter=Y&view=all
http://www.dellamoda.com/designer-handbags.html?use_selected_filter=Y&option=manufacturer%3A&page3
Could we use the robots.txt file to disallow these from showing as duplicate content? And do we need to put the whole URL in there? For example:
Disallow: /*?sort=price&sort_direction=1&use_selected_filter=Y
If not, how far into the URL should the disallow go?
So far we have added the following to our robots.txt:
Disallow: /?sort=title
Disallow: /?use_selected_filter=Y
Disallow: /?sort=price
Disallow: /?clearall=Y
Just not sure if they are correct.
Any help would be greatly appreciated.
Thank you,
Kami
-
Hi Kami,
It's unfortunate, but a number of modern-day e-commerce platforms still suffer from poor canonicalisation and multiple URLs for the same content.
If possible, rather than blocking off access to those queries via robots.txt or a robots meta tag, I'd start by trying to specify a canonical URL whenever a query-string version of a page is created.
E.g.
Query: http://www.dellamoda.com/Designer-Accessories.html?sort=title&use_selected_filter=Y&view=all
Canonical: http://www.dellamoda.com/Designer-Accessories.html
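To make that concrete, the canonical is just a link element in the head of every sorted/filtered variant of the page. A minimal sketch (the exact template hook depends on your cart platform):
<!-- in the <head> of the filtered/sorted variant -->
<link rel="canonical" href="http://www.dellamoda.com/Designer-Accessories.html" />
With that in place, Google should consolidate the filter and sort variations onto the clean category URL rather than treating each query string as a separate page.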
Failing that, I'd try to implement a "follow,noindex" robots meta tag on those filtered pages, or send the equivalent via an X-Robots-Tag HTTP header if you're any good with Apache.
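For reference, the meta-tag version of that "follow,noindex" suggestion sits in the head of the filtered pages, and the header version can be set in Apache. The Apache block below is only a sketch - it assumes Apache 2.4+ with mod_headers enabled, and that the filtered URLs can be recognised by the use_selected_filter parameter:
<!-- meta tag on the filtered/sorted pages -->
<meta name="robots" content="follow,noindex">
# or via the X-Robots-Tag header in httpd.conf / .htaccess (Apache 2.4+, mod_headers):
<If "%{QUERY_STRING} =~ /use_selected_filter=Y/">
    Header set X-Robots-Tag "noindex, follow"
</If>
Either way the pages stay crawlable, so link equity still flows, but they drop out of the index over time.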
If that's still a no-go, then try GWT (Google Webmaster Tools): Google is getting much better at handling dynamic URLs within e-commerce platforms, and you can specify which query parameters Google should ignore directly within GWT.
There's a great post on Moz that deals with e-commerce and canonicalisation - http://www.seomoz.org/blog/qa-from-ecommerce-seo-fix-and-avoid-common-issues-webinar - I'd suggest starting there!
As a last resort I'd look to block the URLs within robots.txt, but this prevents crawlers from flowing freely within your site and can result in poor indexation.
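On the patterns you've already added: a rule like Disallow: /?sort=title is a prefix match, so it only blocks URLs whose path is the site root (e.g. dellamoda.com/?sort=title...) and won't touch /Designer-Pumps.html?sort=price. If you do end up blocking via robots.txt, you'd need wildcard patterns along these lines - a sketch only, and note that * wildcards are an extension honoured by Google and Bing rather than every crawler, so test them in GWT's robots.txt testing tool first:
User-agent: *
# block any URL carrying the sort/filter parameters, wherever they appear in the query string
Disallow: /*?*sort=
Disallow: /*?*use_selected_filter=Y
Disallow: /*?*clearall=Y
Also bear in mind that URLs blocked this way can still appear in the index as URL-only listings if they're linked to, which is another reason the canonical route above is the better first step.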
Related Questions
-
How do I treat URLs with bookmarks when migrating a site?
I'm migrating an old website into a new one, and have several pages that have bookmarks on them. Do I need to redirect those? Or how should they be treated? For example, both https://www.tnscanada.ca/our-expertise.html and https://www.tnscanada.ca/our-expertise.html#auto resolve.
Intermediate & Advanced SEO | NatalieB_Kantar
-
Sanity Check: NoIndexing a Boatload of URLs
Hi, I'm working with a Shopify site that has about 10x more URLs in Google's index than it really ought to. This equals thousands of URLs bloating the index. Shopify makes it super easy to make endless new collections of products, where none of the new collections has any new content... just a new mix of products. Over time, this makes for a ton of duplicate content. My response, aside from making other new/unique content, is to select some choice collections with KW/topic opportunities in organic and add unique content to those pages. At the same time, noindexing the other 90% of excess collections pages. The thing is, there's evidently no method that I could find of just uploading a list of URLs to Shopify to tag noindex. And it's too time-consuming to do this one URL at a time, so I wrote a little script to add a noindex tag (not nofollow) to pages that share various identical title tags, since many of them do. This saves some time, but I have to be careful not to inadvertently noindex a page I want to keep. Here are my questions: Is this what you would do? To me it seems a little crazy that I have to do this by title tag, although faster than one at a time. Would you follow it up with a deindex request (one URL at a time) with Google, or just let Google figure it out over time? Are there any potential negative side effects from noindexing 90% of what Google is already aware of? Any additional ideas? Thanks! Best... Mike
Intermediate & Advanced SEO | 945010
-
Inactive Products - Inactive URLs
Hi, On our website www.viatrading.com we have many products that might be in stock or not depending on availability. Until now, when a product was not available anymore, we took the page down (and redirected to its product category page). And only if the product was available again did we re-activate the URL - this might be days, months or even years later. To make this more SEO-friendly, we have now decided that while a product is not available, instead of deactivating/redirecting the page, we will leave it online and just add a message saying "This product is currently not available". If we do this, we will automatically re-activate about 500 product pages at once. 1. Just to make sure, is it harmful for SEO to keep activating/deactivating URLs this way? 2. Since most of these pages have been deindexed for a long time due to being redirected - have they lost all their SEO juice? 3. How can we best activate these old 500 pages - is it OK to activate them all at once? Thank you,
Intermediate & Advanced SEO | viatrading11
-
Are these URL hashtags an SEO issue?
Hi guys - I'm looking at a website which uses hashtags to reveal the relevant content. So there's page intro text which stays the same... then you can click a button and the text below that changes. So www.blablabla.com/packages is the main page - and www.blablabla.com/packages#firstpackage reveals the first package text on this page - www.blablabla.com/packages#secondpackage reveals the second package text on this same page - and so on. What's the best way to deal with this? My understanding is the URLs after # will not be indexed very easily/at all by Google - what is best practice in this situation?
Intermediate & Advanced SEO | McTaggart
-
How to deal with URLs and tabbed content
Hi All, We're currently redesigning a website for a new home developer and we're trying to figure out the best way to deal with tabbed content in the URL structure. The design of the site at the moment will have a page for a development, and within that you can select your house type; then when on the house type page there will be tabs displayed for the user to see things like the plot map, availability and pricing, specifications, etc. The way our development team are looking at handling this is for the URL to use a hashtag or a query string at the end of it so we can still land users on these specific tabs, for PPC for example. My question is really, has anyone had any experience with this? Any recommendations on how to best display the URLs for SEO? Thanks
Intermediate & Advanced SEO | J_Sinclair
-
Dynamic pages - ecommerce product pages
Hi guys, Before I dive into my question, let me give you some background... I manage an ecommerce site and we've got thousands of product pages. The pages contain dynamic blocks, and the information in these blocks is fed by another system. So in a nutshell, our product team enters the data into the software and, boom, the information is generated in these page blocks. But that's not all: these pages then redirect to a duplicate version with a custom URL. This is cached, and this is what the end user sees. This was done to speed up load - rather than the system generating a dynamic page on the fly, the cached page is loaded and the user sees it super fast. Another benefit happened as well: after going live with the cached pages, they started getting indexed and ranking in Google. The problem is that the redirect to the duplicate cached page isn't a permanent one, it's a meta refresh, a 302 that happens in a second. So yeah, I've got 302s kicking about. The development team can set up a 301, but then there won't be any caching; pages will just load dynamically. Google records pages that are cached, but does it cache a dynamic page though? Without a cached page, I'm wondering if I would drop in traffic. The view source might just show a list of dynamic blocks, no content! How would you tackle this? I've already set up canonical tags on the cached pages, but removing cache.. Thanks
Intermediate & Advanced SEO | Bio-RadAbs
-
Exact keyword URL or not?
Hi all, I have a quick question about the proper use of permalinks. Let's say that I have a website about sports and I want to create an internal page dedicated to shoes. I know that the keyword "shoe" has 15,000 monthly visits, while the keyword "shoes" has 1,000 monthly visits. How should I name the internal page? http://www.example.com/shoe or http://www.example.com/shoes (with a final 's')? I would think that by naming the URL http://www.example.com/shoes, the search engine would consider that page for the keywords "shoe" and "shoes", but I am not sure about it. Should I create a URL that only focuses on one specific keyword ("shoe", in this example) or a URL that may encompass more than one keyword ("shoe" and "shoes")? I hope this is clear. Thank you for your time and help. All best, Sal
Intermediate & Advanced SEO | salvyy
-
Should I Use City Name in URL?
Having a website designed for a car dealership and deciding what attributes to use in the URL. Should I include the city name in the URL? Or does that help for SEO purposes? Other ideas of what to research or try are appreciated too. Thanks 🙂
Intermediate & Advanced SEO | kylesuss