Handling similar page content on a directory site
-
Hi All,
SEOmoz is telling me I have a lot of duplicate content on my site. The pages are not duplicates, just very similar, because the site is a directory with a page for each city across multiple US states.
I do not want these pages indexed, and I'd like to know the best way to go about this.
I was thinking I could add rel="nofollow" to all the links pointing to those pages, but I'm not sure that is the correct approach.
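For reference, a nofollowed link would look something like this (the URL here is just a hypothetical example of one of your city pages, not an actual path on the site):

```html
<!-- rel="nofollow" asks search engines not to pass link equity through this link;
     it does not, by itself, keep the target page out of the index -->
<a href="/directory/ohio/columbus/" rel="nofollow">Columbus, OH</a>
```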
Since the folders are spread deep within the site rather than under one main folder, handling this through robots.txt would mean adding a Disallow rule for many separate folders.
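To illustrate the problem, the robots.txt approach would end up looking something like this (the folder names are made up for the example):

```text
# robots.txt — one Disallow rule per folder, repeated for every state/section
User-agent: *
Disallow: /services/ohio/cities/
Disallow: /services/texas/cities/
Disallow: /listings/florida/cities/
# ...and so on for every deep folder
```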
The other option I am considering is a meta "noindex, follow" tag, but I would have to get my programmer to add the meta tag just for this section of the site.
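If you go that route, the tag is a one-liner in the <head> of each city page (a sketch; how you scope it to just this section depends on your CMS or templates):

```html
<!-- Keep the page out of the index, but let crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```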
Any thoughts on the best way to achieve this, so I can eliminate these duplicate pages from my SEO report and from the search engine index?
Thanks!
-
Thanks Kane!
Meta-robots it is!
I will apply it and see how it goes.
Cheers
-
The best solution is to use a meta robots "noindex, follow" tag on those pages.
I believe that URLs blocked via robots.txt can still be shown as bare URLs in search results, so that approach is less ideal. I'm not certain that's still the case, but it used to be.
I personally would not nofollow links to those pages. With "noindex, follow", the pages stay out of the index but still pass value through to your other indexed pages, whereas nofollowing links to a noindex page isn't supposed to redistribute PageRank to the other links on the page.