Handling similar page content on a directory site
-
Hi All,
SEOmoz is telling me I have a lot of duplicate content on my site. The pages are not duplicates, but they are very similar, because the site is a directory with a page for each city across multiple US states.
I do not want these pages indexed, and I want to know the best way to go about this.
I was thinking I could put a rel="nofollow" on all the links to those pages, but I am not sure that is the correct way to do it.
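For reference, that would mean marking up every internal link to those city pages something like this (the URL is just a placeholder):

    <a href="/directory/tx/austin/" rel="nofollow">Austin, TX</a>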
Since the folders are deep within the site and not under one main folder, handling this through robots.txt would mean writing a disallow rule for many folders.
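To illustrate the overhead, the robots.txt route needs one Disallow rule per folder; the paths below are hypothetical:

    User-agent: *
    Disallow: /directory/alabama/
    Disallow: /directory/alaska/
    Disallow: /directory/arizona/
    # ...one rule for each remaining state folder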
The other option I am considering is a meta noindex, follow, but I would have to get my programmer to add the meta tag just for this section of the site.
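For what it's worth, that tag is a single line in the <head> of each affected page:

    <meta name="robots" content="noindex, follow">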
Any thoughts on the best way to achieve this, so I can eliminate these duplicate pages from my SEO report and from the search engine index?
Thanks!
-
Thanks Kane!
Meta-robots it is!
I will apply it and see how I go with it.
Cheers
-
The best solution is to use a meta robots "noindex, follow" tag on those pages.
I believe that URLs blocked with robots.txt can still show up as bare URLs in search results, so that approach is less ideal. I'm not certain that's still the case, but it used to be that way.
I personally would not nofollow the links to those pages: if you use "noindex, follow", the pages will still pass value on to other indexed pages, and nofollowing links to a noindexed page isn't supposed to funnel extra PageRank to the other links on the page anyway.
Related Questions
-
Adding hreflang tags - better on each page, or in the sitemap?
Hello, I am wondering if there seems to be a preference for where to add hreflang tags (from this article). My client just changed their site from gTLDs to ccTLDs, and a few sites have taken a pretty big traffic hit. One issue is definitely the number of redirects to the page, but I am also going to work with the developer to add hreflang tags. My question is: is it better to add them to the header of each page, to the sitemap, both, or something else (see the sketch below)? Any other thoughts are appreciated. Our Australia site, which was at least findable on Australian Google before this relaunch, is not showing up, even when you search the company name directly. Thanks! Lauryn
Intermediate & Advanced SEO | john_marketade
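For illustration, the two placements the question asks about look roughly like this; all URLs are placeholders. Page-level tags go in the <head> of each page, one line per language/region variant, including the page itself:

    <link rel="alternate" hreflang="en-us" href="https://www.example.com/page/" />
    <link rel="alternate" hreflang="en-au" href="https://www.example.com.au/page/" />

The sitemap alternative attaches the same annotations to each <url> entry via the xhtml namespace (declared on the <urlset> element):

    <url>
      <loc>https://www.example.com/page/</loc>
      <xhtml:link rel="alternate" hreflang="en-us" href="https://www.example.com/page/" />
      <xhtml:link rel="alternate" hreflang="en-au" href="https://www.example.com.au/page/" />
    </url>

Either placement is valid; the sitemap keeps markup out of page templates, while page-level tags are often easier to spot-check.
-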
I'm updating content that is out of date. What is the best way to handle it if I want to keep the old content as well?
So here is the situation: I'm working on a site that offers "Best Of" top-ten-list content. They have a list that ranks very well but is out of date. They'd like to create a new list for 2014 but keep the old list live. Ideally, the new list would replace the old list in search results. Here's what I'm thinking, but let me know if you think there's a better way to handle it: put a "View New List" banner on the old page; make sure all internal links point to the new page; add a rel=canonical tag on the old list pointing to the new list (see the sketch below). Does this seem like a reasonable way to handle this?
Intermediate & Advanced SEO | jim_shook
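For reference, the canonical step of that plan is a single tag in the <head> of the old list, pointing at the new one (URLs are placeholders):

    <link rel="canonical" href="https://www.example.com/best-of-2014/" />

That consolidates ranking signals onto the new URL, which matches the goal of having the new list replace the old one in search results.
-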
Would a network sitemap page be considered link spam?
In the course of the last 18 months my sites have lost 50 to 70 percent of their traffic. I have never used any tricks, just simple white-hat SEO. Anyway, I am now trying to fix things that hadn't been a problem before all those Google updates but apparently now are, and I would appreciate any help. I used to have a network sitemap page on every one of my sites (about 30 sites). It was basically a page called "our network" showing a list of links to all of my other sites. These pages were indexed, had decent PR, and didn't seem to cause any problems. Here's an example of one of them: http://www.psoriasisguide.ca/psoriasis_scg.html In the light of Panda and Penguin and all these "bad links" I decided to get rid of most of them. My traffic didn't recover at all; it actually went further down. I'm not sure if there is any connection to what I did. So, the question is: in your opinion/experience, do you think such network sitemap pages could be causing penalties for link spam?
Intermediate & Advanced SEO | romanbond
-
Penalized for Duplicate Page Content?
I have some high-priority notices regarding duplicate page content on my website, www.3000doorhangers.com. Most of the pages listed here are our sample pages: http://www.3000doorhangers.com/home/door-hanger-pricing/door-hanger-design-samples/ On the left side of the page you can go through the different categories. Most of the category pages have similar text; we mainly just changed the industry on each page. Is this something Google would penalize us for? Should I go through all the pages and use completely unique text for each one? Any suggestions would be helpful. Thanks! Andrea
Intermediate & Advanced SEO | JimDirectMailCoach
-
Redirecting Pages from site A to site B
Hi, I have a client who has a solid, high-ranking, content-based site (site A). They have now created an ecommerce site in addition (site B). To give site B a boost in search engine visibility at launch, they wish to redirect approximately 90% of site A's pages to site B (see the sketch below). What would be the implications of this? Apart from customers being automatically redirected away from the pages they thought they were landing on, how would Google now view site A? What are your thoughts on their idea? I am trying to talk them out of it, as I think it's a poor one.
Intermediate & Advanced SEO | Webrevolve
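For context, this kind of move is normally implemented as server-side 301 redirects; a minimal sketch, assuming site A runs Apache and using placeholder paths and a placeholder domain:

    # In site A's .htaccess: permanently redirect a page to its site B equivalent
    Redirect 301 /guides/widgets/ https://www.siteb-example.com/shop/widgets/

The mechanics are the easy part; the strategic concern raised above is the real question.
-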
K2 duplicate page content and title tags
I'm running a Joomla site and have just installed K2 as our blogging platform. Our crawl report with SEOmoz shows a good bit of duplicate content and duplicate title tags on our K2 blog. We've installed sh404SEF. Will I need to go into sh404SEF each time we publish a blog entry to point the titles to one URL? If there is something simpler, please advise. Thank you, Don
Intermediate & Advanced SEO | donaldmoore
-
Duplicate Page Title/Content Issues on Product Review Submission Pages
Hi Everyone, I'm very green to SEO. I have a Volusion-based storefront and recently decided to dedicate more time and effort to improving my online presence. Admittedly, I'm mostly a lurker in the Q&A forum, but I couldn't find any pre-existing info regarding my situation. It could be out there, but again, I'm a noob... So, in my recent SEOmoz report I noticed that over 1,000 duplicate content errors and duplicate page title errors have been found since my last crawl. I can see that every error is tied to a product in my inventory; specifically, each product page has an option to write a review, and the subsequent page where a visitor can fill out their review appears to be the stem of the problem. All of my products are shown to have the same issue: a duplicate page title ("Review:New") and duplicate page content (the form is already partially filled out with the corresponding product). My first question: it makes sense that pages containing a submission form would have the same title and content, but why is the form being indexed, or crawled (or both, for that matter), under every parameter through which it can be accessed (product A, B, C, etc.)? My second question (an obvious one): what can I do to begin to resolve this? As far as I know, I haven't touched this option in Volusion other than to simply implement it. If I'm missing any key information, please point me in the right direction and I'll respond with any additional relevant information on my end. Many thanks in advance!
Intermediate & Advanced SEO | DakotahW
-
Optimize the root domain or a page in a subdirectory?
Hi, my root domain is already optimized for keywords, I would say branded keywords, which I do not really need, as the traffic from them does not give me any revenue (it mostly consists of our employees and returning visitors). I have now run on-page optimization on the root domain for a set of keywords I like and got good grades (hurray!), but my website still does not show up in search engines for those keywords. I have had pretty good link building done for my root domain, but not for all keywords (only for the branded ones). It just happened; please do not ask why. So I decided to optimize inside pages in a subdirectory for a new set of keywords I like, starting with link building: getting anchor text on various other websites linking to each particular page. These pages are not ranked in the top 50 in Google. Is that a good practice? Or, since I do not need those branded keywords, should I re-optimize my root domain to suit my new keywords by giving less preference to the branded ones? Is this a good practice?
Intermediate & Advanced SEO | MiddleEastSeo