Not sure if I need to be concerned with duplicate content plus too many links
-
Someone else supports this site in terms of making changes so I want to make sure that I know what I am talking about before I speak to them about changes.
We seem to have a lot of duplicate content and duplicate titles.
This is an example http://www.commonwealthcontractors.com/tag/big-data-scientists/ of a duplicate. Do I need to get things changed?
The other problem that crops up on reports is too many on-page links. I am going to get shot of the block of tags but need to keep the news. Is there much else I can do?
Many thanks.
-
Thank you Robert for your advice.
Much appreciated.
Niamh
-
You're welcome, Niamh.
David
-
David
I appreciate you taking the time to answer my query. It is really helpful and I will get it sorted.
Many thanks
Niamh
-
Niamh
Looking at the URL above, there is some duplicate content on the site, but given that it is a blog, I am not sure it is either helpful or a real problem. I would suggest you nofollow the archives, because of the way they paginate and show the same posts as separate pages over and over.
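If it helps when you speak to the person who maintains the site, the usual mechanics for this look something like the following. This is only a sketch; where exactly it goes depends on your CMS templates, and the link URL is just an example taken from your question:

```html
<!-- On the archive/tag page template: keep the page out of the index
     but still let crawlers follow the links on it -->
<meta name="robots" content="noindex, follow">

<!-- Or, on links pointing into the archives, mark them nofollow -->
<a href="/tag/big-data-scientists/" rel="nofollow">Big Data Scientists</a>
```

The `noindex, follow` meta tag is generally the safer option for duplicate archive pages, since it removes the duplicates from the index without orphaning the posts they link to.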
As to the internal linking, from what I see there are not a ton of internal links here. The usual advice you hear is to keep it under 100, or at most under 200. To me, a page with over 100 links is a bit busy. You have fewer than 100 on the homepage, so I do not think you have a real issue there.
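If you want to sanity-check the count yourself rather than relying on the report, a short script can tally the anchor tags on a page. This is a sketch using only the Python standard library; the sample HTML is made up for illustration, and in practice you would feed it the homepage source fetched with `urllib.request`:

```python
from html.parser import HTMLParser


class LinkCounter(HTMLParser):
    """Counts <a href="..."> tags, which are the links the
    'too many on-page links' warning is based on."""

    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # Only count anchors that actually link somewhere
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1


def count_links(html: str) -> int:
    parser = LinkCounter()
    parser.feed(html)
    return parser.count


# A small illustrative snippet; note the name-only anchor is not counted.
sample = '<p><a href="/a">one</a> <a href="/b">two</a> <a name="x">anchor</a></p>'
print(count_links(sample))  # 2
```

Comparing the script's number against the tool's report can also reveal hidden links (navigation, footers, tag blocks) that you did not count by hand.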
Hope this helps,
Robert
-
Simply put, imagine your site is a book. A user comes along and finds the book has two chapter ones with the same title and the same page content. What value is that to the reader? None.
Search engines are all about search quality and content. If they see duplicate titles, descriptions, and content, how are they going to know which content on your pages is relevant and delivers value to the user's need?
I would make it a best practice for your business to ensure site content is rich, valuable, and not duplicated across your site's domain. Also make sure descriptions and titles are short but descriptive. Maybe you can get your site's content manager to use a great little tool like the SEOMofo snippet optimizer for this: http://www.seomofo.com/snippet-optimizer.html
Also get them looking at schema.org markup and developing rich snippets for your page metadata.
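For the schema.org side, structured data usually goes in the page head as a JSON-LD block. Here is a minimal sketch for a blog post like the one linked above; every value shown (headline, date, author name) is illustrative, not taken from your actual site:

```json
{
  "@context": "https://schema.org",
  "@type": "BlogPosting",
  "headline": "Big Data Scientists",
  "datePublished": "2013-01-15",
  "author": { "@type": "Organization", "name": "Commonwealth Contractors" },
  "description": "A short, unique description of this post."
}
```

Keeping the `headline` and `description` unique per page also directly addresses the duplicate title and description warnings in your reports.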
Others can maybe come back with a better answer on that one (read Robert's comments below regarding link quantities).
David