How can I best handle parameters?
-
Thank you for your help in advance! I've read a ton of posts on this forum on this subject, and while they've been super helpful, I still don't feel entirely confident in what the right approach is. Forgive my very obvious noob questions - I'm still learning!
The problem: I am launching a site (coursereport.com) which will feature a directory of schools. The URL for the directory will be coursereport.com/schools, and it can be filtered by a handful of fields listed here:
- Focus (ex: “Data Science”)
- Cost (ex: “$<5000”)
- City (ex: “Chicago”)
- State/Province (ex: “Illinois”)
- Country (ex: “Canada”)
When a filter is applied to the directory page, the CMS produces a new page with URLs like these:
- coursereport.com/schools?focus=datascience&cost=$<5000&city=chicago
- coursereport.com/schools?cost=$>5000&city=buffalo&state=newyork
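(As a side note, characters like "$" and "<" aren't actually valid unencoded in a query string; a browser or framework would percent-encode them. A quick sketch of what the first example URL really serializes to, using the standard URLSearchParams API:)

```typescript
// Building a filtered directory URL with properly encoded query values.
// URLSearchParams percent-encodes reserved characters like "$" and "<".
const params = new URLSearchParams({
  focus: "datascience",
  cost: "$<5000",
  city: "chicago",
});

const url = `https://coursereport.com/schools?${params.toString()}`;
console.log(url);
// "$" becomes %24 and "<" becomes %3C in the cost value
```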
My questions:
1) Is the above parameter-based approach appropriate? I've seen other directory sites take a different approach (below) that would transform my examples into more "normal" URLs.
coursereport.com/schools?focus=datascience&cost=$<5000&city=chicago
VERSUS
coursereport.com/schools/focus/datascience/cost/$<5000/city/chicago (no params at all)
2) Assuming I use either approach above, isn't it likely that I will have duplicate content issues? Each filter does change on-page content, but there could be instances where two different URLs with different filters applied produce identical content (ex: focus=datascience&city=chicago OR focus=datascience&state=illinois). Do I need to specify a canonical URL to solve for that case? I understand at a high level how rel=canonical works, but I am having a hard time wrapping my head around which versions of the filtered results ought to be specified as the preferred versions. For example, would I just take all of the /schools?focus=X combinations and call those the canonical versions within any filtered page that contained additional parameters like cost or city?
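(For reference, the canonical tag itself is just one line in the `<head>` of the filtered page; a sketch, assuming /schools?focus=datascience were chosen as the preferred version:)

```html
<!-- In the <head> of /schools?focus=datascience&city=chicago -->
<link rel="canonical" href="https://coursereport.com/schools?focus=datascience" />
```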
-
Should I be changing page titles for the unique filtered URLs?
-
I read through a few Google resources to try to better understand how best to configure URL parameters via Webmaster Tools. Is my best bet just to follow the advice in the article below, define the rules for each parameter there, and not worry about using rel=canonical?
https://support.google.com/webmasters/answer/1235687
An assortment of the other stuff I’ve read for reference:
http://www.wordtracker.com/academy/seo-clean-urls
http://www.practicalecommerce.com/articles/3857-SEO-When-Product-Facets-and-Filters-Fail
http://www.searchenginejournal.com/five-steps-to-seo-friendly-site-url-structure/59813/
http://googlewebmastercentral.blogspot.com/2011/07/improved-handling-of-urls-with.html
-
I think you have your answer then on how you want to focus your URLs and your site!
-
Absolutely helpful. I really appreciate it. I think one real use case that I may want to solve for is the "focus" plus "city" combo, i.e. "data science schools in chicago". Based on the research I've done thus far, I think that may be the only permutation really worth worrying about. Again - thanks a lot!
-
I am not going to be very helpful here.
Looking at those parameters and all the options you would have for URLs: yes, you are ripe for duplicate content issues and a whole mess of search engine problems/confusion.
I read this the other day in the Q&A forum here at Moz, and I wish I could remember who said it so I could give them credit: "Don't submit search results to the search engines" - so true, so true...
Why? You end up with an almost infinite number of thin, duplicate pages, and Google then does not know which ones to rank. Even if you put all the parameters into a static URL, you still have the same problem.
I think you need to step back for a sec.
Are people searching for "data science schools in Chicago Illinois that cost less than $5000"?
Why would you even want to attempt to setup pages that could potentially rank for those terms based on the URL?
Launch the search function on the site, but hide all the search URLs behind robots.txt
Just set up things like:
/search/?focus=datascience&cost=$<5000&city=chicago
/search/focus/datascience/cost/$<5000/city/chicago
Put /search/ in robots.txt and you are set.
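(That rule is a two-liner; a sketch, assuming the search pages all live under /search/:)

```
User-agent: *
Disallow: /search/
```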
Another option (from one of my favorite Whiteboard Fridays: http://moz.com/blog/whiteboard-friday-using-the-hash):
Hide all the parameters behind the hash and they stay hidden from the search engines
/schools#?focus=datascience&cost=$<5000&city=chicago
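(The trick here is that everything after "#" never reaches the server and is ignored by crawlers fetching /schools - your client-side script reads the filters out of the fragment instead. A minimal sketch, with the function name being my own illustrative choice:)

```typescript
// Parse directory filters out of the URL fragment (location.hash).
// Crawlers requesting /schools#?focus=... only ever see /schools.
function parseHashFilters(hash: string): Record<string, string> {
  // Strip the leading "#?" (or bare "#") before handing off to URLSearchParams.
  const query = hash.replace(/^#\??/, "");
  const filters: Record<string, string> = {};
  for (const [key, value] of new URLSearchParams(query)) {
    filters[key] = value;
  }
  return filters;
}

// e.g. in the browser: parseHashFilters(window.location.hash)
const filters = parseHashFilters("#?focus=datascience&cost=$<5000&city=chicago");
// filters now holds the focus, cost, and city values for client-side filtering
```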
Then go back, do your keyword research, and build helpful static URL pages around what your users are searching for, and get those pages to rank. If that ultimately is the type of page above, I would bet you $3,141 plus an apple pie that you need to set up a simpler organization of pages and URLs around location, say /il/chicago/school-name, or type, say /data-science/school-name - and then hide all the other iterations behind a hash, etc.
Maybe this did help - I hope so.