Handling similar page content on a directory site
-
Hi All,
SEOmoz is telling me I have a lot of duplicate content on my site. The pages are not duplicates, but they are very similar, because the site is a directory with a page for each city across multiple US states.
I do not want these pages indexed, and I want to know the best way to go about this.
I was thinking I could add rel="nofollow" to all the links pointing to those pages, but I'm not sure that is the correct approach.
Since the folders are deep within the site and not under one main folder, blocking them through robots.txt would mean writing a Disallow rule for many separate folders.
The other thing I am considering is a meta "noindex, follow" tag, but I would have to get my programmer to add the meta tag just for this section of the site.
Any thoughts on the best way to achieve this, so I can eliminate these duplicate pages from my SEO report and from the search engine index?
Thanks!
-
Thanks Kane!
Meta-robots it is!
I will apply it and see how I go with it.
Cheers
-
The best solution is to use a meta robots "noindex, follow" tag on those pages.
I believe that URLs blocked via robots.txt can still be shown as bare URLs in search results, so that approach is less ideal. I'm not certain that's still the case, but it used to be that way.
I personally would not nofollow the links to those pages. If you use "noindex, follow", the noindexed pages will still pass value through to the other indexed pages they link to, whereas nofollowing links to a noindex page isn't supposed to pass PageRank to the links on that page at all.
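For reference, a minimal sketch of the meta robots tag discussed above, as it might appear in the head of each city page (the page title here is a made-up example, not from the site in question):

```html
<head>
  <!-- Keep this page out of the index while still letting crawlers
       follow its links and pass value to the pages they point at -->
  <meta name="robots" content="noindex, follow">
  <title>Plumbers in Springfield, IL</title>
</head>
```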
Related Questions
-
Similar content, targeting different states
I have read many answers about not having duplicate pages target different states (cities). Here is the problem: we have the same content serving different pages for several provinces in Canada, and we intentionally can't change it much. We don't want these pages to compete within the same province. What would be the best approach to avoid being penalized and to keep our SERP positions? Initially we thought about hreflang, but we can't really apply it to the province/state attributes. Thanks in advance!
Intermediate & Advanced SEO | | MSaffou20180 -
Site Merge Strategy: Choosing Target Pages for 301 Redirects
I am going to be merging two sites. One is a niche site, and it is being merged into the main site with 301 redirects. My question is: what is the best way to redirect section/category pages in order to maximize SEO benefit? I will be redirecting product pages to product pages; this question only concerns sections/categories.

Option 1: Redirect each section/category to the most closely matched category on the main site. For example, vintage-t-shirts would go to vintage-t-shirts on the main site.

Option 2: Point as many section/category pages as possible to a larger category on the main site with filters pre-selected. We have filtered navigation on our site, so if you wanted to see vintage t-shirts, you could go to the vintage t-shirt category, OR you could go to t-shirts and select "vintage" under the style filter. In the example above, the vintage-t-shirt section from the niche site would point to the t-shirts page with the vintage filter selected (something like t-shirts/#/?_=1&filter.style=vintage). With option 2, I would be pointing more links at one main category page, which would likely rank higher because more links point to it. The user experience may also be better: if the customer decides to browse another style of t-shirt, they can simply unselect the filter and make other selections.

Questions: Which of these options is better for (1) SEO and (2) user experience?

If I go with option 2, the drawback is that the page titles will all be the same (i.e. vintage-t-shirts pointing to the filtered page would have the title "t-shirts" instead of the more targeted "vintage t-shirts"). I believe a workaround would be to pull the filter values from the URL and append them to the page title, so that the title for t-shirts/#/?_=1&filter.style=vintage would be something like "vintage t-shirts". Is this the appropriate way to deal with it?
Any thoughts, suggestions, shared experiences would be appreciated.
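For what it's worth, the "pull filter values from the URL and append them to the page title" workaround could be sketched in a few lines of client-side JavaScript. The URL shape and the filter.style parameter name are assumptions taken from the example URL in the question, not a known implementation:

```javascript
// Hypothetical sketch: derive a page title from the selected filter values
// in a hash-based filtered-navigation URL like t-shirts/#/?_=1&filter.style=vintage
function titleFromFilteredUrl(url, baseTitle) {
  // Grab the query-style portion after "#/", e.g. "?_=1&filter.style=vintage"
  var hash = url.split('#/')[1] || '';
  var query = hash.replace(/^\?/, '');
  var filterValues = [];
  query.split('&').forEach(function (pair) {
    var parts = pair.split('=');
    // Collect only "filter.*" parameters, ignoring state params like "_"
    if (parts[0].indexOf('filter.') === 0 && parts[1]) {
      filterValues.push(decodeURIComponent(parts[1]));
    }
  });
  // Prepend the filter values: "vintage t-shirts" instead of just "t-shirts"
  return filterValues.length
    ? filterValues.join(' ') + ' ' + baseTitle
    : baseTitle;
}
```

On a page this could feed document.title, e.g. `document.title = titleFromFilteredUrl(location.href, 't-shirts');`. One caveat: content behind a `#` fragment is not sent to the server, so search engines may not see these filtered titles at all.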
Intermediate & Advanced SEO | | inhouseseo0 -
Two sites with same content in different countries. How does it effect SEO?
Let's say, for example, that we have two sites, example.com and example.co.uk. The sites have the same content in the same language. Can each site rank well in its own country? Of course all the content could be rewritten, but that is very time consuming. Any suggestions? Has anyone done this before, or does anyone know a site that has?
Intermediate & Advanced SEO | | fredrikahlen0 -
Should all pages on a site be included in either your sitemap or robots.txt?
I don't have a specific scenario here, but I'm curious, as I fairly often come across sites that have, for example, 20,000 pages but only 1,000 in their sitemap. If only 1,000 of their URLs are ones they want included in the sitemap and indexed, should the others be excluded using robots.txt or a page-level exclusion? Is there a point to having pages that are included in neither and leaving it up to Google to decide?
Intermediate & Advanced SEO | | RossFruin1 -
Why are these pages considered duplicate content?
I have a duplicate content warning in our PRO account (well, several really), but I can't figure out WHY these pages are considered duplicate content. They have different H1 headers and different sidebar links, and while a couple are relatively scant as far as content goes (so I might believe those could be seen as duplicates), the others seem to have a substantial amount of content that is different. It is a little perplexing. Can anyone help me figure this out? Here are some of the pages that are showing as duplicates:

http://www.downpour.com/catalogsearch/advanced/byNarrator/narrator/Seth+Green/?bioid=5554
http://www.downpour.com/catalogsearch/advanced/byAuthor/author/Solomon+Northup/?bioid=11758
http://www.downpour.com/catalogsearch/advanced/byNarrator/?mediatype=audio+books&bioid=3665
http://www.downpour.com/catalogsearch/advanced/byAuthor/author/Marcus+Rediker/?bioid=10145
http://www.downpour.com/catalogsearch/advanced/byNarrator/narrator/Robin+Miles/?bioid=2075
Intermediate & Advanced SEO | | DownPour0 -
Optimize the category page or a content page?
Hi,
We wish to start ranking for a specific keyword ("log house prices" in Italian). We have two options for which page to optimize for this keyword:

1. A long content page (1,000+ words with images).
2. The log houses category page, optimized for the keyword (we have 50+ houses on this page, along with a short price summary).

I would think we have a better chance of ranking with option 2, but then we can't use that page to rank for a more short-tail keyword (like "log houses"). What would you suggest? Is there maybe a third option?
Intermediate & Advanced SEO | | JohanMattisson0 -
How to avoid content cannibalization? How do I control which page is the landing page?
Hi All,
To clarify my question, I will give an example. Let's assume that I have a laptop e-commerce site and that one of my main categories is Samsung Laptops. The category page shows lots of laptops and a small section of text. On the other hand, in my article section I have a HUGE article about Samsung laptops. If we consider the two-word phrase each page is targeting, the answer is the same: Samsung laptops. In the article I point to the category page using anchors such as "buy samsung laptops" or "samsung laptops", and on the category page (my intended landing page) I point to the article with "learn about samsung laptops" or "samsung laptops pros and cons". Thanks
Intermediate & Advanced SEO | | BeytzNet0 -
3 Sites Covering Similar Topics & Panda
My question will take a bit of explaining, so here goes. I have 3 blogs on the same server: 1. a personal finance blog; 2. a credit card blog; 3. a prepaid credit card blog. The personal finance blog is my flagship site, started in 2007, which feeds my family and pays the mortgage. By contrast, the other two sites (started in 2008 and 2010) I would gladly kill if the result would help my personal finance blog.

In the fall of 2010 (before Panda) the prepaid card blog was penalized by Google. This has been confirmed by Google in response to a reconsideration request; of course, they don't say why. I've tried a number of things and resubmitted the site, but with no luck. Both the personal finance blog and the credit card blog were hit by Panda 2 (April 11, 2011) and have not recovered.

While the personal finance site covers many topics (e.g., investing, credit, debt, money management), its income comes largely from credit cards. We review individual credit cards and have pages that list cards by category (e.g., balance transfer, cash back, travel). The credit card blog does the same thing, but of course covers credit cards in more depth. There is a similar overlap between the prepaid card blog on the one hand, and the credit card and personal finance blogs on the other. However, all content is unique. I do not currently link between the sites, although until a few months ago I had blogroll links between them and a few (fewer than 10) content links.

If you've made it this far (and I hope you have), here are my questions:

1. Could the existence of the credit card and prepaid credit card sites be hurting my personal finance blog's rankings in Google, whether via Panda or otherwise?

2. If there is a reasonable chance that the answer to question 1 is yes, what would you suggest I do? Of course, I could just take down the sites, but I wonder if there are other options.

One thought I had was to deindex the two card sites (I assume I can do this by disallowing googlebot via robots.txt) and give it time. Would Google treat this as if the sites did not exist? Both sites get a fair amount of traffic from Bing and Yahoo, so this option appeals to me. Of course, for all I know the existence of the two card sites is hurting my personal finance blog's rankings in Bing and Yahoo, too.

I thought about selling the sites, but if they are hurting my personal finance site, I'm concerned about how Google distinguishes between a site being sold and a webmaster just trying to make the sites look like they are owned by different people. In this regard, I've never tried to hide the common ownership of the sites and have no intention of doing that now.

If I kill the sites, should I redirect them to my personal finance site? For the penalized prepaid card site, this seems both risky and unhelpful, but perhaps redirecting the credit card site is an option.

Given that the personal finance site is my livelihood, I greatly appreciate your thoughts on my dilemma.
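As an aside on the "disallow googlebot" idea, a robots.txt that blocks only Google's crawler while leaving other crawlers alone could look something like the sketch below. Note, though, per the earlier answer in this thread, that robots.txt blocks crawling rather than indexing, and blocked URLs can still appear as bare URLs in results; a meta "noindex" tag is the more reliable way to actually remove pages from the index.

```
# Sketch: block only Google's crawler; all other bots may crawl everything
User-agent: Googlebot
Disallow: /

User-agent: *
Disallow:
```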
Intermediate & Advanced SEO | | Bergerlaw0