Duplicate Content Issue from using filters on a directory listing site
-
I have a directory listing site for harpists, and a lot of issues are coming up that say:
Content that is identical (or nearly identical) to content on other pages of your site forces your pages to unnecessarily compete with each other for rankings.
Because this is a directory listing site, the content is quite generic. The main issue appears to come from the page's filtering functionality: the crawler seems to be picking up each different filter choice as a new page. If you have a look at this link you will see what I mean.
People searching the site can filter the songs played by a harpist by changing the dropdowns, but for some reason the filter arguments are being picked up as separate URLs. Do you have any good approaches to solving this issue?
A similar issue comes from the video pages for each harpist. They are being flagged as identical content, as there are currently no videos on those pages:
http://www.find-a-harpist.co.uk/user/39/videos
http://www.find-a-harpist.co.uk/user/37/videos
Do you have any suggestions?
Many thanks for taking the time to read this and respond.
-
Thank you both for your responses. Yes, the site is relatively new. I shall implement your suggestions and hopefully they will do the trick.
-
Is your site relatively new? I currently see no pages in the Google index at all, which makes the duplicate content issue a bit moot (at least in the short term).

The search filters and pagination are somewhat different issues. You could META NOINDEX any pages with the filter parameters active, or rel-canonical them to the unfiltered version (as @Steve25 said). Since no pages are indexed yet, you could also just "nofollow" the filter links ("Title", etc.), which should help prevent those filtered versions from getting crawled.

Pagination (pages 2+ of search results) is a trickier issue, but it might be best to just NOINDEX, FOLLOW those. You could also tell Google in Google Webmaster Tools that the page= parameter is for pagination (I've found that hit-or-miss, but it is easy relative to other solutions).

For the empty profiles, it really depends on the scope. If you have a lot of them, I'd ideally want to code them to emit META NOINDEX while they're empty, then lift the NOINDEX once content is posted. You'd have to do that dynamically, but it shouldn't be too tricky. That way, Google would only see those pages once they have some content in place.
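The dynamic NOINDEX/canonical logic described above can be sketched server-side. This is a minimal Python illustration, not the site's actual code (the stack behind find-a-harpist.co.uk is unknown, and the `FILTER_PARAMS` names are hypothetical): a helper decides which robots meta and canonical tags to emit, based on whether filter parameters are present in the URL and whether the page has any content yet.

```python
from urllib.parse import urlsplit, parse_qs, urlencode, urlunsplit

# Hypothetical filter/pagination parameter names for this site.
FILTER_PARAMS = {"title", "composer", "genre", "page"}

def head_tags(url, has_content=True):
    """Return robots meta / canonical tags to emit for a request URL.

    Filtered or paginated URLs get noindex,follow plus a canonical
    pointing at the unfiltered version; empty pages (e.g. a videos
    page with no videos) get noindex until content exists.
    """
    scheme, netloc, path, query, _ = urlsplit(url)
    params = parse_qs(query)
    filtered = bool(FILTER_PARAMS & set(params))

    tags = []
    if filtered or not has_content:
        tags.append('<meta name="robots" content="noindex, follow">')
    if filtered:
        # Rebuild the URL with all filter parameters stripped out.
        clean_query = urlencode(
            {k: v for k, v in params.items() if k not in FILTER_PARAMS},
            doseq=True,
        )
        canonical = urlunsplit((scheme, netloc, path, clean_query, ""))
        tags.append(f'<link rel="canonical" href="{canonical}">')
    return tags
```

The same helper covers both problems in the question: filtered song listings get canonicalised to the parent page, and empty video pages stay out of the index until videos are added.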
-
Could you set up canonical tags so that when users select certain filter criteria, the canonical points to the parent page?
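For example, a filtered listing could carry a canonical pointing at its unfiltered parent page. The URL below is illustrative only, based on the site structure mentioned in the question:

```html
<!-- In the <head> of a filtered page such as /user/39/songs?title=greensleeves -->
<link rel="canonical" href="http://www.find-a-harpist.co.uk/user/39/songs" />
```

Search engines would then consolidate ranking signals from all the filtered variations onto that one parent URL.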