Parameter Strings & Duplicate Page Content
-
I'm managing a site that has thousands of pages due to all of the dynamic parameter strings that are being generated. It's a real estate listing site that allows people to create a listing, and it generates lots of new listings every day. The Moz crawl report is continually flagging A LOT (25k+) of the site's pages for duplicate content due to all of these parameter string URLs.
Example: `sitename.com/listings` & `sitename.com/listings/?addr=street name`
Do I really need to do anything about those pages? I have researched the topic quite a bit, but can't seem to find anything too concrete as to what the best course of action is. My original thinking was to add the rel=canonical tag to each of the main URLs that have parameters attached. I have also read that you can bypass that by telling Google what parameters to ignore in Webmaster tools.
We want these listings to show up in search results, though, so I don't know if either of these options is ideal, since each would cause the listing pages (pages with parameter strings) to stop being indexed, right? Which is why I'm wondering if doing nothing at all will hurt the site?
I should also mention that I originally recommended the rel=canonical option to the web developer, who has pushed back, saying that "search engines ignore parameter strings." Naturally, he doesn't want the extra workload of setting up the canonical tags, which I can understand, but I want to make sure I'm giving him both the most feasible option to implement and the best option to fix the issue.
-
You started by saying the problem is duplicate content. Are those pages with the various parameter strings basically duplicate content? Because if they are, no matter what you do you will probably not get them all to rank; the URL is not your main problem in that case. (Though you still should do something about those parameter strings.)
-
Thanks for the quick response, EGOL. Very helpful.
I'm not at all familiar with the 3rd suggestion in your response. If we were to strip them off at the server level, what would that actually look like, both in terms of the code we would need in .htaccess and the resulting change to the URL?
Would that affect the pages and their ability to be indexed? Any potential negative SEO effects from doing this?
Just trying to make sure it's what we need and figure out the best way to relay this to the web developer. Thanks!
-
Do I really need to do anything about those pages?
**In my opinion, YES, absolutely.** Allowing lots of parameter URLs to persist on your site increases the crawl burden and dilutes the power of your pages. I believe that your site's rankings will decline over time if these parameters are not killed.
There are three methods to handle it: redirects, parameter settings in webmaster tools, and rel=canonical. These three methods are not equivalent, and each works in a very different way.
-
The parameter controls in Google Webmaster Tools are unreliable. They did not work for me, and they do nothing for any other search engine. Find a different solution is what I recommend.
-
Using rel=canonical relies on Google to obey it. In my experience it works well at the present time, but we know that Google announces how it is going to do things and then changes its mind without telling anybody. I would not rely on this.
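For reference, if you did go this route, the canonical tag on one of the parameterized URLs from the example above might look like the following. This is only a sketch: the `href` must point at each page's own clean URL, so it has to be generated per page, not hard-coded site-wide.

```html
<!-- Placed in the <head> of sitename.com/listings/?addr=street%20name,
     telling search engines the clean URL is the one to index.
     The domain and path here are taken from the question's example. -->
<link rel="canonical" href="https://sitename.com/listings/" />
```

This is also why the developer's "search engines ignore parameter strings" claim doesn't hold up: if they were ignored, the crawl report would not be flagging 25k+ duplicates.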
-
If you really want to control these parameters, use .htaccess to strip them off at the server level. That way you control it completely instead of relying on what anybody else says they are going to do. Take control.
The only reservation about #3 is that you might need parameters for on-site search or category page sorting on your own site. These can be excluded from being stripped in your htaccess file.
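To answer the earlier question about what this looks like in practice, here is a minimal sketch of such an .htaccess rule set. It assumes Apache with mod_rewrite enabled, and the parameter names `sort` and `q` are hypothetical placeholders for whatever your own site search and sorting actually use:

```apache
<IfModule mod_rewrite.c>
  RewriteEngine On
  # Leave query strings alone when they contain a parameter you still need.
  # ("sort" and "q" are placeholders -- substitute your real parameter names.)
  RewriteCond %{QUERY_STRING} !(^|&)(sort|q)= [NC]
  # Any other non-empty query string gets 301-redirected to the bare URL;
  # the trailing "?" in the substitution strips the query string.
  RewriteCond %{QUERY_STRING} .
  RewriteRule ^(.*)$ /$1? [R=301,L]
</IfModule>
```

With this in place, `sitename.com/listings/?addr=street name` would 301 to `sitename.com/listings/`. That also addresses the indexing worry raised above: the parameter URLs drop out of the index, but their signals consolidate on the clean URLs, which remain indexed. Note that this simple sketch leaves mixed query strings (e.g. `?sort=price&addr=x`) untouched, so test it against your real URL patterns before deploying.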
Don't allow search engines to do anything for you that you can do for yourself. They can screw it up or quit doing it at any time and not say anything about it.