Which Is the Best Way to Handle Query Parameters?
-
Hi mozzers,
I would like to know the best way to handle query parameters.
Say my site is example.com. Here are two scenarios.
Scenario #1: Duplicate content
example.com/category?page=1
example.com/category?order=updated_at+DESC
example.com/category
example.com/category?page=1&sr=blog-header

All have the same content.
Scenario #2: Pagination
example.com/category?page=1
example.com/category?page=2 and so on.

What is the best way to solve both?
Do I need to use rel=next and rel=prev, or is it better to use Google Webmaster Tools parameter handling? Right now I am concerned about Google traffic only.
For solving the duplicate content issue, do we need to use canonical tags on each such URL?
I am not using WordPress. My site is built on the Ruby on Rails platform.
Thanks!
-
The new pagination advice is really tough to navigate. I have mixed feelings about rel=prev/next (hard to implement, doesn't work on Bing, etc.) but it seems generally reliable. If you have pagination AND parameters that impact pagination (like sorts), then you need to use prev/next and canonical tags. See the post Alan cited.
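To make the combination concrete, here is a hypothetical sketch of a helper that emits the prev/next and canonical tags for a sorted, paginated category page. The method name and parameters (`base_url`, `page`, `total_pages`) are illustrative assumptions, not the poster's code or a Rails API:

```ruby
# Sketch only: builds the <head> link tags for a paginated category URL.
# Page 1 canonicalises to the bare category URL; pages 2+ self-canonicalise
# (dropping sort/tracking parameters), and prev/next point along the series.
def pagination_link_tags(base_url, page, total_pages)
  tags = []
  canonical = page == 1 ? base_url : "#{base_url}?page=#{page}"
  tags << %(<link rel="canonical" href="#{canonical}">)
  # Page 2's "prev" is the clean base URL, matching its canonical form
  tags << %(<link rel="prev" href="#{page == 2 ? base_url : "#{base_url}?page=#{page - 1}"}">) if page > 1
  tags << %(<link rel="next" href="#{base_url}?page=#{page + 1}">) if page < total_pages
  tags.join("\n")
end
```

In a Rails view this would be rendered into the layout's `<head>`; the point is only that every sorted/parameterised variant of a page emits the same canonical, while prev/next describe the pagination series itself.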
I actually do think NOINDEX works fine in many cases, if the paginated search (pages 2+) have little or no search value. It really depends on the situation and the scope, though. This can range from no big deal at all to a huge problem, depending on the site in question, so it's tough to give general advice.
I'm not having great luck with GWT parameter handling lately (as Alan said), especially on big sites. It just doesn't seem to work in certain situations, and I have no idea why Google ignores some settings and honors others. That one's driving me crazy, actually. It's easy to set up and you can try it, but I wouldn't count on it working.
-
No, don't de-index them; just use rel=prev/next.
Yes, you're right that it's only for Google. I really can't give you one answer that covers both engines; you could use a canonical for Bing only. It's a hard one.
See this page for more info: http://googlewebmastercentral.blogspot.com.au/2011/09/pagination-with-relnext-and-relprev.html
-
Which do you think is ideal?
De-indexing pages 2+ or simply using rel=next and rel=prev? That's also only for Google, right?
-
For the first scenario, use a canonical tag.
For the second, use the rel=prev/next tags; to Google this makes page one look like one big page containing the content of all the pages.
Don't use parameter handling. It's a last resort, it's only for Google (though Bing has its own version), and its effectiveness has been questioned.
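For the first scenario, the canonical logic can be sketched as a small URL-normalising function. This is a hypothetical illustration (the parameter names `order`, `sr`, and `page=1` come from the question; the function itself is not anyone's actual code):

```ruby
require "uri"
require "cgi"

# Sketch only: every duplicate-content variant of a category URL maps to
# one canonical URL. Parameters that don't change the content (order, sr,
# and page=1) are dropped; pages 2+ keep their page parameter.
def canonical_for(url)
  uri = URI.parse(url)
  params = uri.query ? CGI.parse(uri.query) : {}
  params.delete("order")
  params.delete("sr")
  params.delete("page") if params["page"] == ["1"]
  uri.query = params.empty? ? nil : URI.encode_www_form(params.flat_map { |k, vs| vs.map { |v| [k, v] } })
  uri.to_s
end
```

The output of a function like this would be what goes into each page's `<link rel="canonical" href="...">` tag, so all four URLs in scenario #1 declare the same canonical.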
-
The problem is that we are talking about thousands of pages, and doing it manually is close to impossible. Even if it can be engineered, it will take a lot of time. Unless Webmaster Tools can't effectively handle this situation, it doesn't make sense to go and change the site code.
-
Hi Mohit,
Seems like a waste of time to me when you can put a simple meta tag in there.
-
How about using Google Webmaster Tools parameter handling to ignore ?page=1, ?order=updated_at+DESC and so on? Does that work instead of including canonical tags on all such pages?
-
I can speak to the first scenario: that is exactly what rel="canonical" is for, dynamic pages that have a purpose for URL appendages, or the rare case where you can't control your server (.htaccess) for 301 redirects.
As for pagination, I may not have the best answer, as I have also been using rel="canonical" in those cases as well.
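Since the asker is on Rails rather than Apache, the 301 alternative mentioned above would live in application code instead of .htaccess. A minimal Rack-style sketch (purely illustrative; the function and paths are assumptions, not a real app's routing):

```ruby
# Sketch only: if the requested path isn't the canonical form, answer with a
# 301 redirect to it; otherwise serve the page. Rack responses are a
# [status, headers, body] triple.
def redirect_response(requested_path, canonical_path)
  if requested_path != canonical_path
    [301, { "Location" => canonical_path }, []]
  else
    [200, { "Content-Type" => "text/html" }, ["category page"]]
  end
end
```

A 301 consolidates the duplicate URLs outright for every search engine, whereas rel="canonical" is a hint; the trade-off is that the redirect removes the parameterised URL from use entirely, which isn't acceptable when the parameter (a sort order, say) still serves visitors.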