Handling long URLs and overly-dynamic URLs on eCommerce site
-
Hello Forum,
I've been optimizing an eCommerce site and our SEOmoz crawls are favorable for the most part, except for long URLs and overly-dynamic URLs. These issues stem from two URL types: Layered navigation (faceted search) and non-Google internal search results. I outline the issues for each below.
We use an SEO-friendly URL structure for our product category pages, but once bots start "clicking" our layered navigation options, all the parameters are appended to our SEO-friendly URLs, causing the SEOmoz crawl warnings.
Layered Navigation :
SEO-Friendly Category Page: oursite.com/shop/meditation-cushions.html
Effects of layered navigation: oursite.com/shop/meditation-cushions.html?bolster_material_quality=414&bolsters_appearance=206&color=12&dir=asc&height=291&order=name
As you can see, the parameters include product attributes and page sorts. I should note that all pages generated by these parameters use the canonical element to point back to the SEO-friendly URL. We have also set up Google's Webmaster Tools to handle these parameters.
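For anyone following along, a minimal sketch of what that canonical element looks like in the head of a filtered page (the URLs are illustrative, taken from the example above):

```html
<!-- On oursite.com/shop/meditation-cushions.html?color=12&dir=asc&order=name -->
<link rel="canonical" href="http://oursite.com/shop/meditation-cushions.html" />
```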
Internal Search Function:
Our URLs start off simple: oursite.com/catalogsearch/result/?q=brown. Then the bot clicks all the layered navigation options, yielding oursite.com/catalogsearch/result/index/?appearance=54&cat=67&clothing_material=83&color=12&product_color=559&q=brown. Also, all search results are set to noindex,follow.
My question is: Should we worry about these overly-dynamic and long URL warnings? We have set up canonical elements, "noindex,follow" directives, and configured Webmaster Tools to handle our parameters. If these are a concern, how would you resolve them?
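Assuming the search results pages use the standard robots meta tag, that noindex,follow directive would look something like this in each results page's head:

```html
<meta name="robots" content="noindex, follow" />
```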
-
I see this thread is from last year, so I am hoping that between then and now you have determined an answer and would be able to advise. I am having the same issue with our consumer site.
-
If you make them friendly, it will shorten them:
x=y can become y
But if, having done that, they are still too long, I would ignore them, as they are noindexed.
-
There's another company handling the server side of things. All I know is that we're using PHP and MySQL for Magento.
Even if we did a friendly URL rewrite, wouldn't we still get long URLs? We would just have each parameter become words separated by slashes, e.g.:
/shop/meditation-cushions.html/high-quality/patterned/green/10inches/sortedbyname/
I suppose these URLs are shorter. Is something like this better?
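For what it's worth, a rewrite like that is usually done at the server level. Here is a hypothetical Apache .htaccess sketch of the idea; the attribute names are made up for illustration, and in practice Magento's own URL rewriting or a layered-navigation extension would handle this rather than hand-written rules:

```apache
# Hypothetical sketch: map a friendly path segment like
# /shop/meditation-cushions/color/green
# to the underlying parameterized URL. Attribute names are illustrative.
RewriteEngine On
RewriteRule ^shop/([a-z-]+)/color/([a-z]+)/?$ /shop/$1.html?color=$2 [L,QSA]
```

Note that this only changes how the URL looks; the page still carries every filter the visitor applied, so very deep filter combinations will still produce long URLs.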
-
Marc
The crawl warnings are those found in SEOmoz's crawl diagnostics: "Overly-Dynamic URL" and "Long URL." These are not duplicate content issues and the URLs resolve properly.
I just want to make sure we're not getting dinged for having URLs that are too long. If we are, what are some ways to go about shortening them?
-Aaron
-
What kind of "crawl warnings" are we talking about here? Duplicate content? Do the URLs resolve properly when the additional parameters are appended to the SEO-friendly URLs?
"I should note that all pages generated by these parameters use the canonical element to point back to the SEO-friendly URL. We have also set up Google's Webmaster Tools to handle these parameters."
Keep in mind, using canonical tags is like setting up 301 redirects on all those pages. Some people don't know that, so I thought I'd just throw it out there. So, if any of those additional pages with the host of parameters contain unique/different content than the SEO-friendly versions, using canonical tags is not a good move, as those pages will get no attention from search engines that respect the canonical tag.
For example, do not use a canonical tag on a 'Page 2' to point back to page 1. Each page will contain different information/products/whatever, and you want search engines (SE) to see and index those pages, regardless of what the URL looks like (as long as it works and your Title/META/H1-H6 tags are all in order to reflect the different content on each page).
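One conventional way to signal that relationship to search engines (rather than canonicalizing page 2 back to page 1) is pagination link elements. A sketch, assuming a simple ?p= page parameter on the category URL:

```html
<!-- In the <head> of oursite.com/shop/meditation-cushions.html?p=2 -->
<link rel="prev" href="http://oursite.com/shop/meditation-cushions.html" />
<link rel="next" href="http://oursite.com/shop/meditation-cushions.html?p=3" />
```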
I'm not sure I'm following your concern 100%, so I hope I was on the right path with what I said. Can you please be more specific about your concern with the "overly-dynamic and long URL warnings," and I'll be happy to help you out some more.
- Marc
-
The easy fix is the canonical, yet Bing suggests not using the canonical on the true page, only on the duplicates. Best if you can handle that in code, but it's not a big worry if you can't.
Faceted navigation is a big problem, with no easy answers.
What sort of server are you using? On a Windows server it is very easy to set up friendly URLs for your dynamic URLs.