Should we use URL parameters or plain URLs?
-
Hi,
The development team and I are having a heated discussion about one of the more important things in life, i.e. URL structures on our site.
Let's say we are creating an Airbnb clone, and we want to be found when people search for
apartments new york.
As we have both houses and apartments in every city in the U.S., it would make sense for our URL to at least include these, so
clone.com/Apartments/New-York
but users are also able to filter on price and size. This isn't really relevant for Google, and we all agree that clone.com/Apartments/New-York should be canonical for all apartment/New York searches. But what should the URL look like for people with a max price of $300 and 100 sq ft?
clone.com/Apartments/New-York?price=30&size=100
or (we are using Node.js, so no problem)
clone.com/Apartments/New-York/Price/30/Size/100
The developers hate URL parameters with a vengeance; they think the second version is preferable and the most user-readable, and say that as long as we use a canonical pointing everything to clone.com/Apartments/New-York it won't matter to good old Google.
I think URL parameters are the way to go for two reasons. One is that Google might figure out on its own that the price parameter doesn't matter (https://support.google.com/webmasters/answer/1235687?hl=en), and the other is that it's possible in Webmaster Tools to actually tell Google not to worry about a parameter.
We have agreed to disagree on this point, and to let the wisdom of Moz decide what we ought to do. What do you all think?
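For what it's worth, both shapes are easy to support side by side in Node. Here's a minimal sketch (the function name and the idea of normalising both shapes into one filter object are purely illustrative, not anything we've actually built) showing that either URL style can resolve to the same filters and the same canonical:

```javascript
// Illustrative sketch: normalise either URL shape to one filter object.
// Function and property names are made up for this example.
function parseListingUrl(pathname, search = '') {
  const parts = pathname.split('/').filter(Boolean); // ['Apartments', 'New-York', ...]
  const filters = { city: parts[1], price: null, size: null };

  if (parts.length > 2) {
    // Path-segment style: /Apartments/New-York/Price/30/Size/100
    for (let i = 2; i + 1 < parts.length; i += 2) {
      filters[parts[i].toLowerCase()] = parts[i + 1];
    }
  } else {
    // Query-string style: /Apartments/New-York?price=30&size=100
    for (const [key, value] of new URLSearchParams(search)) {
      filters[key.toLowerCase()] = value;
    }
  }

  // Whichever shape came in, the canonical stays the bare city page.
  filters.canonical = `https://clone.com/Apartments/${filters.city}`;
  return filters;
}
```

Either way the routing is trivial, so the question really is only about which form Google crawls and where the canonical points.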
-
Personally, I would agree with you and opt for the following option:
clone.com/Apartments/New-York?price=30&size=100
I don't think it matters whether that section of the URL is readable to everyone. I would actually say that anyone with a technical background would find the URL above easier to change than the other one, as slashes in a URL almost symbolise different directories rather than parameters (that's how I would generally interpret it, anyway).
I think in the grand scheme of things it's going to make little difference, as you don't want the additional sections to actually be indexed in the search engines. As Gary correctly pointed out, you can set up 'URL Parameters' in GWT, and I think that's your best option. There's more information about that here: http://googlewebmastercentral.blogspot.co.uk/2011/07/improved-handling-of-urls-with.html
You could also use robots.txt to block the parameterised URLs, though whether that's honoured depends on the search engine crawling your site.
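The robots.txt route would look something like this (a sketch; the patterns are illustrative, and bear in mind Google treats a Disallow as a crawl block, not a guarantee the URL stays out of the index):

```
User-agent: *
# Block crawling of filtered variants carrying these query parameters
Disallow: /*?price=
Disallow: /*?size=
```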
Hope this helps!
Lewis -
Good example of a site that does show up in the SERPs for all things related
-
OK, not to sit on the fence here but both are good options.
However, when it comes to URL parameters, there is a section in Webmaster Tools where you can tell Google to ignore certain parameters. So that's always an option.
I like to look at sites like oodle in cases like this.
Here is an example
They spent a lot of time working out the best process, and they use the path-segment style of URL.
However, Google has been said to prefer shorter URLs recently.
Hope my sitting on the fence did not make things worse, LOL.
-
Personally, I would just send price and size via $_POST and be done with it (as opposed to $_GET, which shows the parameters in the URL). No need to overthink it by creating more URLs and complicating life.
If anything, you can define in WMT what price and size are, but just keep it clean. Also, remember that the fragment (#) part of a URL doesn't get crawled by Google, so clone.com/Apartments/New-York#price=30&size=100 could work too.
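One caveat on the # idea: everything after the hash never reaches the server at all, so the server can't filter on it — only client-side JavaScript can read it. A quick Node illustration (using the standard WHATWG `URL` class):

```javascript
// The fragment lives only in the browser: it is part of the URL object,
// but it is never part of the path or query string a server receives.
const url = new URL('https://clone.com/Apartments/New-York#price=30&size=100');

console.log(url.pathname); // '/Apartments/New-York'
console.log(url.search);   // ''  (no query string at all)
console.log(url.hash);     // '#price=30&size=100' (client-side only)
```

So the fragment approach only works if the filtering itself happens in the browser.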