URL structure for multiple search filters applied to products
-
We have a product catalog with several hundred similar products. Our product list lets you apply filters to hone your search; in fact, there are over 150,000 different individual searches you could come up with on this page. Some of these searches are relevant to our SEO strategy, but most are not.
Right now (for the most part) we save the state of each search in the URL fragment, in other words in a way that isn't indexed by the search engines. The URL (without hashes) ranks very well in Google for our one main keyword, and at the moment Google doesn't recognize the variety of content possible on this page. An example is:
http://www.example.com/main-keyword.html#style=vintage&color=blue&season=spring
We're moving toward a more indexable URL structure, one that could potentially save the state of all 150,000 searches in a way Google can read. An example would be:
http://www.example.com/main-keyword/vintage/blue/spring/
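For illustration, here is a minimal sketch of how the current fragment parameters could be mapped onto that path-based scheme. The filter names and their ordering are hypothetical; the point is that a fixed, canonical ordering prevents the same filter combination from producing several different URLs:

```python
# Canonical ordering for filter segments (hypothetical filter names).
FILTER_ORDER = ["style", "color", "season"]

def build_search_url(base, filters):
    """Append applied filter values to the base path in a fixed order.

    Using one canonical order means {"color": "blue", "style": "vintage"}
    and {"style": "vintage", "color": "blue"} yield the same URL.
    """
    segments = [filters[name] for name in FILTER_ORDER if name in filters]
    if not segments:
        return base
    return base.rstrip("/") + "/" + "/".join(segments) + "/"

print(build_search_url("http://www.example.com/main-keyword",
                       {"color": "blue", "style": "vintage", "season": "spring"}))
# → http://www.example.com/main-keyword/vintage/blue/spring/
```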
I worry, though, that exposing so many options in our URLs will confuse Google and create a lot of duplicate content. After all, we only have a few hundred products, and inevitably many of the searches will look pretty similar. I also worry about losing ground on the main http://www.example.com/main-keyword.html page while it's ranking so well.
So I guess the questions are:
-
Is there such a thing as having URLs be too specific? Should we noindex or set rel=canonical on the pages whose keywords are nested too deeply?
-
Will our main keyword's page suffer when it has to share all the inbound links with these other, more specific searches?
-
-
Hey, that sounds fairly solid. Let me know how you get on.
-
Thanks for the links and the advice, Marcus.
I think after reading through the material I will meta noindex any search that has more than one search filter applied. So I'll index "blue" or "vintage" but not "vintage/blue" for instance. The most important top level search filters will become category pages, more or less. I'll try to tailor their content to reflect their importance. Thanks for your input!
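That plan reduces to a simple rule in the page template. A sketch of what it might look like, assuming the template knows which filters are applied to the current search (the tag strings below are standard robots meta values, not anything site-specific):

```python
def robots_meta(applied_filters):
    """Return the robots meta tag for a search results page.

    Zero or one applied filter: indexable (these become the
    category-style pages). Two or more: noindex, but still 'follow'
    so crawlers can pass through links on the page.
    """
    if len(applied_filters) <= 1:
        return '<meta name="robots" content="index,follow">'
    return '<meta name="robots" content="noindex,follow">'

print(robots_meta(["vintage"]))          # single filter: indexable
print(robots_meta(["vintage", "blue"]))  # combination: noindexed
```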
-
Hey,
Certainly, if you could potentially create 150,000 search result pages from only 200 or so products, then you are straying into the territory of near-duplicate pages and what is often known as 'search within search'. As you suggested, chances are these pages could not only be problematic in themselves, they may also drag down other pages.
My advice here would be to tie this to your search marketing and keyword research. Look at the terms that actually get searched for and identify the pages that would be useful. Then, where you don't already have a page for a given term, consider creating tags or categories for those few pages (certainly far fewer than 150,000), and supplement them with additional unique content wherever there is duplication with other categories.
In fact, try to keep the duplication as low as possible, and stick to best practice on those search category pages (canonical, rel=prev/next, a view-all page, etc.).
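Those best practices amount to a few tags in the document head of each paginated category page. A rough sketch of how they might be generated (the URL and the query-string pagination style are assumptions for illustration):

```python
def head_tags(category_url, page, last_page):
    """Build canonical and rel prev/next link tags for a paginated
    category page. Each page canonicalizes to itself; prev/next
    describe the pagination sequence to crawlers."""
    tags = [f'<link rel="canonical" href="{category_url}?page={page}">']
    if page > 1:
        tags.append(f'<link rel="prev" href="{category_url}?page={page - 1}">')
    if page < last_page:
        tags.append(f'<link rel="next" href="{category_url}?page={page + 1}">')
    return "\n".join(tags)

# Page 2 of 5 gets canonical, prev, and next tags:
print(head_tags("http://www.example.com/main-keyword/vintage/", 2, 5))
```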
By all means keep the search itself, but I would most likely hide it from search engines: noindex the deep search pages, and supplement them with category and/or content pages tied to your keyword strategy.
Some interesting reading:
http://www.mattcutts.com/blog/search-results-in-search-results/
http://www.seomoz.org/blog/fat-pandas-and-thin-content
Alternatively, you could always tinker, have a go, and then put things back. But the odds are this approach would just create nearly 150,000 near-duplicate pages, exactly the kind of pages Google is currently trying to remove from the index, so your main landing pages may end up as collateral damage.
Hope this helps!
Marcus