Not sure if I need to be concerned with duplicate content plus too many links
-
Someone else supports this site and makes the changes, so I want to make sure that I know what I am talking about before I speak to them about changes.
We seem to have a lot of duplicate content and duplicate titles.
This is an example of a duplicate: http://www.commonwealthcontractors.com/tag/big-data-scientists/. Do I need to get things changed?
The other problem that crops up on reports is too many on-page links. I am going to get rid of the block of tags, but I need to keep the news. Is there much else I can do?
Many thanks.
-
Thank you Robert for your advice.
Much appreciated.
Niamh
-
You're welcome, Niamh.
David
-
David
I appreciate you taking the time for my query. It is really helpful and I will get it sorted.
Many thanks
Niamh
-
Niamh
With the URL above, there is some duplicate content on the site, but given the kind of blog it is, I am not sure whether that is helpful or a problem. I would suggest you nofollow the archives, because of the way they paginate and show the same posts as separate pages over and over.
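A crawl report's duplicate-title warnings can also be reproduced with a few lines of scripting, which makes it easy to see exactly which pages collide. A minimal sketch, assuming you already have a URL-to-title mapping from a crawl (the URLs and titles below are made up for illustration):

```python
from collections import defaultdict

# Hypothetical crawl output: URL -> <title>. Pages that share a title
# are the ones a crawl report flags as duplicate titles.
titles = {
    "/tag/big-data-scientists/": "Big Data Scientists",
    "/tag/big-data-scientists/page/2/": "Big Data Scientists",
    "/about/": "About Us",
}

# Group URLs by title, then keep only titles used on more than one page.
by_title = defaultdict(list)
for url, title in titles.items():
    by_title[title].append(url)
duplicates = {t: urls for t, urls in by_title.items() if len(urls) > 1}
print(duplicates)
```

Running this against real crawl data shows at a glance whether the duplicates are all archive/tag pages (probably harmless) or substantive pages (worth fixing).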
As to the internal linking, from what I see there are not a ton of internal links here. Typically, the advice you hear is to keep it under 100, or sometimes under 200; to me, a page with over 100 links is a bit busy. You have fewer than 100 on the homepage, so I do not think you have a real issue there.
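If you want to sanity-check the "under 100" guideline yourself rather than relying on a report, a link count is easy to script. A minimal sketch using only Python's standard library; the HTML snippet and domain here are placeholder values, and in practice you would feed in a fetched page:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a href="..."> links, split into internal vs external."""

    def __init__(self, domain):
        super().__init__()
        self.domain = domain
        self.internal = 0
        self.external = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        # Treat absolute links to other domains as external;
        # everything else (relative or same-domain) as internal.
        if href.startswith("http") and self.domain not in href:
            self.external += 1
        else:
            self.internal += 1

html = '<a href="/tag/big-data/">tag</a> <a href="http://example.org/">out</a>'
counter = LinkCounter("commonwealthcontractors.com")
counter.feed(html)
print(counter.internal, counter.external)
```

A total (internal plus external) creeping past 100 on a single page is the point at which the crawl-report warning fires.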
Hope this helps,
Robert
-
Simply put, imagine your site is a book. A user comes along and finds the book has two chapter ones with the same title and the same page content: what value is that to the reader? None.
Search engines are all about search quality and content. If they see duplicate titles, descriptions and content, how are they going to know which content on your pages is relevant to the user and delivers value to the user's need?
I would make it a best practice for your business to ensure site content is rich, valuable and not duplicated across your site's domain. Also make sure descriptions and titles are short but descriptive. Maybe you can get your site's content manager to use a great little tool like SEOMofo for this: http://www.seomofo.com/snippet-optimizer.html
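The "short but descriptive" rule can also be checked in bulk with a small script. A rough sketch with assumed cut-offs of about 60 characters for titles and 155 for descriptions; Google actually truncates by pixel width, so these numbers are approximations, not official limits:

```python
def snippet_check(title, description):
    """Rough SERP-snippet sanity check with approximate length limits."""
    issues = []
    if not title:
        issues.append("missing title")
    elif len(title) > 60:
        issues.append("title may be truncated in search results")
    if len(description) > 155:
        issues.append("description may be truncated")
    return issues

# Placeholder title/description for illustration.
print(snippet_check(
    "Big Data Scientists | Commonwealth Contractors",
    "Short description of the page.",
))
```

Run this over every page's title and meta description and you get a quick list of candidates to rewrite before worrying about anything fancier.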
Also get them looking at schema.org and developing rich snippets for your page metadata.
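As a concrete illustration of the schema.org suggestion, structured data is usually embedded as a JSON-LD script block in the page head. A minimal sketch that builds Organization markup with Python's json module; the name and URL come from the thread, and the choice of type and properties is a hypothetical minimal example, not a full markup plan:

```python
import json

# Minimal schema.org Organization markup (illustrative only).
org = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Commonwealth Contractors",
    "url": "http://www.commonwealthcontractors.com/",
}
jsonld = json.dumps(org, indent=2)
jsonld_script = '<script type="application/ld+json">\n' + jsonld + "\n</script>"
print(jsonld_script)
```

Whoever maintains the site templates would paste the resulting script tag into the page head; validating the output with a structured-data testing tool before deploying is a good habit.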
Others can maybe come back with a better answer on that one - (read Robert's comments below regarding link quantities).
David