How do I identify what is causing my Duplicate Page Content problem?
-
Hello,
I'm trying to put my finger on exactly what is causing my duplicate page content problem. For example, SEOmoz is picking up these four pages as having the same content:
http://www.penncare.net/ambulancedivision/braunambulances/express.aspx
http://www.penncare.net/ambulancedivision/recentdeliveries/millcreekparamedicservice.aspx
http://www.penncare.net/ambulancedivision/recentdeliveries/monongaliaems.aspx
http://www.penncare.net/softwaredivision/emschartssoftware/emschartsvideos.aspx
As you can tell, they really aren't serving the same content in the body of the page. Does anybody have an idea what might be causing these pages to show up as Duplicate Page Content? At first I thought it was the photo gallery module, but that only exists on two of the pages...
Thanks in advance!
-
Ah, right. OK then.
As for the data coming back from SEOmoz's crawler, I'd be tempted to ask their support what the crawler is actually seeing. I should have a look at this myself, as I haven't yet.
-
I'm currently getting that information from Moz's own web crawler, which tells me which pages have Duplicate Page Content and lists the other URLs that the duplicate content exists on.
With regard to the 301s: I have rewrite rules set up to 1) lowercase all requests, 2) trim off home.aspx, and 3) prepend www. to the request, etc. When processed, these should function as a single redirect/rewrite.
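For reference, the three rules described above amount to a single URL canonicalization step. Here is a minimal Python sketch of that logic (the example hosts, paths, and rule order are assumptions based on the description, not the site's actual rewrite config):

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url: str) -> str:
    """Collapse the rewrite rules described above into one normalization
    step: lowercase the URL, strip a trailing home.aspx, and force www."""
    scheme, host, path, query, frag = urlsplit(url.lower())
    if not host.startswith("www."):
        host = "www." + host                  # rule 3: prepend www.
    if path.endswith("/home.aspx"):
        path = path[: -len("home.aspx")]      # rule 2: trim home.aspx
    return urlunsplit((scheme, host, path, query, frag))

# Each variant should collapse to the same canonical URL in one step,
# i.e. a single redirect rather than a chain of redirects:
variants = [
    "http://penncare.net/AmbulanceDivision/Home.aspx",
    "http://www.penncare.net/ambulancedivision/",
]
print({canonicalize(u) for u in variants})
```

If the live rules instead fire one at a time (lowercase redirect, then www redirect, then home.aspx redirect), crawlers see a redirect chain rather than a single hop, which is worth verifying with a header-checking tool.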
-
Before looking at the duplicate content (what did you use to find that there is duplicate content?), a quick question: you have a lot of 301s. Just want to check, are these single redirects, or a redirect of a redirect, etc.?
-
I would add some content to these pages to help differentiate them. None of them are text-heavy, so it may be hard for spiders to see a difference. Add a summary, maybe a text transcription of what is in the videos, etc.
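One way to see what a crawler may be reacting to: with so little unique body text, the shared template (navigation, footer, boilerplate) dominates any page-to-page comparison. A rough illustration using Jaccard similarity over word shingles (the sample texts and shingle size are made-up stand-ins for illustration, not Moz's actual duplicate-detection algorithm):

```python
def jaccard_similarity(text_a: str, text_b: str, shingle_len: int = 3) -> float:
    """Compare two pages by the overlap of their word shingles
    (sliding windows of consecutive words)."""
    def shingles(text: str) -> set:
        words = text.lower().split()
        return {tuple(words[i:i + shingle_len])
                for i in range(len(words) - shingle_len + 1)}
    a, b = shingles(text_a), shingles(text_b)
    return len(a & b) / len(a | b) if a | b else 1.0

# Two pages that share a large nav/footer template but have only a
# short unique body end up with a high proportion of shared shingles:
template = "home ambulance division software division recent deliveries contact us " * 5
page1 = template + "express type iii ambulance photo gallery"
page2 = template + "monongalia ems recent delivery photos"
print(round(jaccard_similarity(page1, page2), 2))
```

Adding a paragraph of genuinely unique text per page shrinks that overlap, which is exactly what the suggestion above accomplishes.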
-
Thanks for your reply... I guess more specifically I was wondering what it is about these particular page elements that makes search engines unable to distinguish the pages from one another.
-
- Search engines don't know which version(s) to include/exclude from their indices
- Search engines don't know whether to direct the link metrics (trust, authority, anchor text, link juice, etc.) to one page, or keep it separated between multiple versions
- Search engines don't know which version(s) to rank for query results
When duplicate content is present, site owners suffer rankings and traffic losses and search engines provide less relevant results.
Hope this helps!
Resources: http://www.seomoz.org/learn-seo/duplicate-content
Related Questions
-
19 Hours Excessive to Code Single Wordpress Page?
My developer says that it will take 19 hours to modify a listing page of the wpcasa London real estate theme because the existing template is difficult to customize. I am attaching an image of the existing page before customization and an image of a final mock-up. Is 19 hours a reasonable amount of time to customize this page? Look forward to feedback. New design is visible at: https://imgur.com/a/42XBqDD Alan
Web Design | | Kingalan10 -
Must internally linked pages from a different subdomain be well optimised?
Hi all, We have guide/help pages on a different subdomain (help.website.com), and we have linked to these from third-level pages of our website (website.com/folder1/topic2). But the help subdomain and its pages are not well optimised, so I am not sure whether linking to these subdomain pages from our website pages hurts our rankings? Thanks,
Web Design | | vtmoz0 -
Is this spammy/panda problem?
We have a property site and have many area pages with listings of properties. For example, one page may have up to 30 properties listed, with 100 words of description per property on that listing page, and then you click 'more info' to get to that property's own page with maybe 200 words in total. I want to add bullet points for each property on the area page, but I'm worried that it may be seen by Google as spammy even though it's useful to the client. For example, if I had 30 properties on a page, and 28 of them said next to each picture: Property Type: Shared, Open Plan, Single Bed. Would that be a problem for Google?
Web Design | | nick-name1230 -
Avg Page Load Time Increase After Responsive Web Design
The Avg. Page Load Time has been steadily increasing since our website went responsive. What could have caused this?
Web Design | | JMSCC1 -
Is it important to keep your website home index page simple to rank better?
My website http://www.endeavourcottage.co.uk/ markets holiday cottages, and it's grown from my own single cottage into a small letting agency. I used to rank at best number 3 on Google.co.uk for short-tail keywords like "Whitby holiday cottages", but it has since dropped to position 10. So this week I was looking for a UK business to help me improve my rankings, and the first thing they said was that my home page is detrimental because it lists too much conflicting info, advertising all 12 properties on it. They suggested a simple doorway page into the site, but when I ran it through the analysis tool here on Moz for "Whitby holiday cottages" as an example, it came out looking okay. I do the usual things of title tags and meta descriptions for my keywords, etc. Any suggestions or advice would be very welcome. Thank you, Alan
Web Design | | WhitbyHolidayCottages0 -
How to add SEO Content to this site
Hi, great community, and I hope you guys can help! I have just started on an SEO project for http://bit.ly/clientsite ; the client's required initial KPI is search engine rankings, at a fairly low budget. The term I use for the site is a "blurb site": the content is thin, and the initial strategy I want to employ to get the keyword rankings is to utilize content. The plan is to add targeted, quality (user experience & useful) SEO content on the page itself by adding a "read more" link/button to the "blurb" on the right of the page (see pink text in image); when someone clicks on the "read more", a box of content will slide out, styled much the same as the blurb itself, and appear next to and/or overlay the blurb and most of the page (see pink rectangle in image). Question: Is this layer of targeted, quality (user experience & useful) SEO content (which requires an extra click to get to it) going to get the same SEO power/value as if it were displayed traditionally on the initial display? If not, would it be better to create a second page (2nd layer), have the "read more" link to that, and then rel-canonical the blurb to that 2nd page, so that all the SEO value passes to the expanded content and the second page/layer is what shows up in the rankings? Thanks in advance
Web Design | | Torean0 -
Custom 404 Page Indexing
Hi - We created a custom 404 page based on SEOmoz recommendations. But... the page seems to be receiving traffic via organic search. Does it make more sense to set this page to "noindex" via its meta tag?
Web Design | | sftravel0 -
Could Website redesign be a cause of drop in rankings?
We had a complete redesign of our website and moved it over to WordPress several months ago. As URLs changed, we had the appropriate 301 redirects put in place. Rankings for our top keywords dropped, but others remained intact. Our SEO company told us rankings drop when a redesign is done, but I thought if we did all the redirects properly (which they approved), it wouldn't be much of a problem. Additionally, we've been steadily adding good new content. Any advice?
Web Design | | rdreich490