Duplicate Titles for Large Lists
-
Our blog (www.cowleyweb.com/blog) has recently been given topic categories so we can utilize our old blogs. Otherwise, users would only see what's new and never look back (our blogs are organized by the month they were published) and all that hard work would kind of be a waste after a while.
So we came up with a few topics (e.g. social media, internet marketing, etc.) and added those as tags to blogs. Now, users can click a topic and get a results page on our blog of all the previously published blogs related to that topic. Sounds great.
BUT, it's hurting our SEO crawl report. If the list goes beyond one page of results, the 2nd and subsequent pages get dinged as "duplicate title" because they share the same title (i.e. "Social Media"). How can I fix this?
I'm not the web designer, but something tells me maybe some sort of tag that says "Page 2" or something would do the trick. We use Drupal, which is good for customization.
I assume tons of bloggers and websites have dealt with this problem.
Please help. Want to give the web guy some solutions.
Thank you.
-
I have never used it myself, but try implementing the Smart Paging module + Tokens: http://drupal.org/project/smart_paging
-
I would suggest going to Analytics, segmenting by organic search traffic, and seeing if anyone has landed on those pages from search results in the last 2-3 months. If Google is not returning them in search results and they are not bringing traffic, it's usually best to clean pages out of the index that don't need to be there.
If you don't want to noindex them, you can add "Page 2" etc to the title tags to eliminate the duplicate title errors in the crawl report.
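To illustrate what that fix looks like in the page source (hypothetical titles, assuming the blog keeps its existing title pattern), only the paginated pages get the page number appended:

```html
<!-- page 1 of the "Social Media" tag archive -->
<title>Social Media | Cowley Web Blog</title>

<!-- page 2 and beyond -->
<title>Social Media - Page 2 | Cowley Web Blog</title>
```

Once each paginated URL has a distinct title, the crawl report no longer flags them as duplicates.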
-Dan
-
Talked to our web designer. He said he's nervous about noindexing, worried that Google will get suspicious and it will hurt more than help. Don't know what to say.
-
Hey There
Sounds like you are all set - just want to add that the type of page you're referring to (page/2, etc.) is a "subpage", and you'll also want to look into noindexing those, in addition to "tags" and "categories". That should also fix the errors you're seeing in the Moz report.
-Dan
-
Thanks guys! That's awesome. Forwarding to our developer.
-
It's usually suggested that tag archives and category archives be set to noindex, which will help alleviate this issue.
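Concretely, a noindex on those archive pages is just a meta tag in the page's head (in Drupal this is typically handled by a metatag module rather than hand-edited templates, so the exact setup depends on your version):

```html
<meta name="robots" content="noindex, follow">
```

Using "noindex, follow" rather than "noindex, nofollow" keeps the archive pages out of the index while still letting crawlers follow the links through to the individual blog posts.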
-
This sounds like a Drupal/CMS issue. If Drupal doesn't have a fix natively, I'm sure a third-party developer has a solution.
Related Questions
-
Duplicate Content Issue: Mobile vs. Desktop View
Setting aside my personal issue with Google's favoritism for Responsive websites, which I believe doesn't always provide the best user experience, I have a question regarding duplicate content... I created a section of a Wordpress web page (using Visual Composer) that shows differently on mobile than it does on desktop view. This section has the same content for both views, but is formatted differently to give a better user experience on mobile devices. I did this by creating two different text elements, formatted differently, but containing the same content. The problem is that both sections appear in the source code of the page. According to Google, does that mean I have duplicate content on this page?
Web Design | | Dino640 -
Fixing my sites problem with duplicate page content
My site has a problem with duplicate page content. SEOmoz is telling me 725 pages' worth. I have looked a lot into the 301 redirect and the rel=canonical tag, and I have a few questions: First of all, I'm not sure which one I should use in this case. I have read that the 301 redirect is the most popular path to take. If I take this path, do I need to go in and change the URL of each of these pages, or does it change automatically when I plug in the old URL and the new one? Also, do I just need to go to each page that SEOmoz is flagging as a duplicate and make a redirect for that page? One thing that I am very confused about is the fact that some of these duplicates listed are actually different pages on my site. Does this just mean the URLs are too similar to each other, and therefore need the redirect to fix them? Then on the other hand, I have a login page that says it has 50 duplicates. Would this be a case in which I would use the canonical tag, putting it into each duplicate so that the search engine knew to go to the original file? Sorry for all of the questions. Thank you for any responses.
Web Design | | JoshMaxAmps0 -
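For the login-page case described above, a canonical tag is the usual fit: each duplicate URL declares the one original in its head, so the duplicates consolidate without redirecting users away. A sketch (the URL here is hypothetical):

```html
<!-- placed in the <head> of every duplicate login URL -->
<link rel="canonical" href="https://www.example.com/login">
```

A 301, by contrast, is for pages that should no longer be reachable at the old URL at all.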
Can anyone recommend a tool that will identify unused and duplicate CSS across an entire site?
Hi all, So far I have found this one: http://unused-css.com/ It looks like it identifies unused CSS, but perhaps not duplicates? It also has a 5,000-page limit and our site is 8,000+ pages, so we really need something that can handle a site larger than their limit. I do have Screaming Frog. Is there a way to use Screaming Frog to locate unused and duplicate CSS? Any recommendations and/or tips would be great. I am also aware of the Firefox extensions, but to my knowledge they will only do one page at a time? Thanks!
Web Design | | danatanseo0 -
Using content from other sites without duplicate content penalties?
Hi there, I am setting up a website where I believe it would substantially benefit the user experience if I set up a database of information on artists. I am torn, because to feasibly do this correctly, I would have content built from multiple sources with no real unique content of its own. It would have parts from Wikipedia, parts from other websites, etc. All would be sourced, of course. My concern is that if I do this, am I risking devaluing my website? Is there a way I can handle this without taking a hit?
Web Design | | BorisD0 -
Duplicate Page Content mysite.com and mysite.com/index.html MOZ Dashboard
According to the Moz Dashboard, my site shows duplicate page content for mysite.com and mysite.com/index.html. What can I do about that? I want to redirect mysite.com/index.html to mysite.com - how can I do that using the .htaccess file?
Web Design | | innofidelity0 -
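A minimal .htaccess sketch for the redirect asked about above, assuming Apache with mod_rewrite enabled (test on a staging copy first):

```apache
RewriteEngine On
# Only match when /index.html was requested directly by the client,
# not when Apache serves index.html internally as the DirectoryIndex
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.html[?\ ]
# Permanently redirect to the bare root URL
RewriteRule ^index\.html$ / [R=301,L]
```

The `THE_REQUEST` condition matters: without it, Apache's internal DirectoryIndex lookup can retrigger the rule and cause a redirect loop.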
Homepage and Category pages rank for article/post titles after HTML5 Redesign
My site's URL (web address) is: http://bit.ly/g2fhhC Timeline:
Web Design | | mcluna
At the end of March we released a site redesign in HTML5
As part of the redesign we used multiple H1s, both for nested articles on the homepage and for content sections other than articles on a page. In summary, our pages have many, many H1s compared to other notable sites that use HTML5 with only one H1 (some of these are the biggest sites on the web) - yet I don't want to say this is the culprit, because the HTML5 document outline (page sections) creates the equivalent of H1-H6 tags. We have also been having Google cache snapshot issues due to Modernizr, for which we are working to apply the patch: https://github.com/h5bp/html5-boilerplate/issues/1086 - not sure if this is driving our indexing issues below. Situation:
Since the redesign, when we query one of our article titles, Google lists the homepage, category page, or tag page that the article resides on. Most of the time the homepage ranks for the article query.
If we link directly to the article pages from a relevant internal page, it does not help Google index the correct page. If we link to an article from an external site, it does not help Google index the correct page. Here are some images of some example query results for our article titles: Homepage ranks for article title aged 5 hours
http://imgur.com/yNVU2 Homepage ranks for article title aged 36 min.
http://imgur.com/5RZgB Homepage at uncategorized page listed instead of article for exact match article query
http://imgur.com/MddcE Article aged over 10 day indexing correctly. Yes it's possible for Google index our article pages but again.
http://imgur.com/mZhmd What we have done so far:
-Removed the H1 tag from the site wide domain link
-Made the article title a link, replicating how it was on the old version
-Applying the Modernizr patch today to correct the blank caching issue. We are hoping you can assess the number of H1s we are using on our homepage (I think over 40) and on our article pages (I believe over 25 H1s) and let us know if this may be sending a confusing signal to Google. Or if you see something else we're missing. All HTML5 and Google documentation makes clear that Google can parse multiple H1s, understands headers and sub-headers, and that multiple H1s are okay, etc... but it seems possible that algorithmic weighting may not have caught up with HTML5. Look forward to your thoughts. Thanks!
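For comparison, the conservative heading structure the question contrasts itself against keeps a single h1 per page and demotes nested article headings (hypothetical markup, not taken from the site above):

```html
<body>
  <!-- one h1 for the whole page -->
  <h1>Site Name - Blog</h1>
  <main>
    <!-- nested article titles use h2, even inside <article> elements -->
    <article>
      <h2><a href="/first-article">First Article Title</a></h2>
      <p>Teaser text for the first article...</p>
    </article>
    <article>
      <h2><a href="/second-article">Second Article Title</a></h2>
      <p>Teaser text for the second article...</p>
    </article>
  </main>
</body>
```

With this structure, the strongest heading signal on the homepage is the site name, while each article's strongest heading lives on its own permalink page.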
How serious is duplicate page content?
We just launched our site on a new platform - Magento Enterprise. We have a wholesale catalog and a retail catalog. We have up to 3 domains pointing to each product. We are getting tons of duplicate content errors. What are the best practices for dealing with this? Here is an example: mysite.com/product.html mysite.com/category/product.html mysite.com/dynamic-url
Web Design | | devonkrusich0 -
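For the multi-URL product case above, the common approach (a sketch, not Magento-specific configuration) is to have every URL variant of a product declare the same canonical URL:

```html
<!-- emitted on mysite.com/product.html,
     mysite.com/category/product.html, and the dynamic URL alike -->
<link rel="canonical" href="http://mysite.com/product.html">
```

Magento also ships a "Use Canonical Link Meta Tag" option under its catalog SEO configuration that can emit these automatically, though the exact location of that setting depends on your version.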
Duplicate content on mobile sites
Hi Guys, We are launching a mobile webshop later this year and have decided to use a subdomain for this (m.domainname.xx). The content will be more or less identical to that on the standard desktop site (domainname.xx), but I'm struggling to find out if this will create duplicate content between the mobile and desktop sites. Does anyone have a solid answer for this one?
Web Design | | AndersDK0
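For reference, the pattern Google documents for separate mobile URLs avoids the duplicate content problem with a pair of annotations: the desktop page points to the mobile version with rel="alternate", and the mobile page points back with rel="canonical" (sketch below, using the question's placeholder domains):

```html
<!-- on http://domainname.xx/page -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="http://m.domainname.xx/page">

<!-- on http://m.domainname.xx/page -->
<link rel="canonical" href="http://domainname.xx/page">
```

This tells search engines the two URLs are the same content in two presentations, so the mobile subdomain is not treated as a duplicate of the desktop site.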