How can I see the pages that cause duplicate content?
-
SEOmoz PRO is giving me back duplicate content errors. However, I don't see how I can get a list of the pages that are duplicates of the one shown. If I don't know which pages/URLs cause the issue, I can't really fix it. The only workaround would be placing canonical tags, but that's not always the best solution.
Is there a way to see the actual duplicate pages?
-
The only other thing I can think of is that there's duplicate page content and duplicate title content. If it says true in either of those columns but there are no URLs in the columns to the right of it (headed duplicate_page_content or duplicate_title), then I'd contact Moz and work with them. Mine populate fine.
-
That surely makes sense! But when I look at the column that says duplicate_page_content, there is nothing shown, even when the pages are marked as true. I must be missing something...
-
OK, within that Excel file there's a column headed "duplicate page content". The URL in question will be in the far-left (URL) column, then there's a column that says "duplicate page" (with true/false as the options), and if it's true, there's another column headed "duplicate page content" with URLs in it. Those should be the ones that Moz caught duplicating the URL in the "URL" column - if that makes any sense at all!
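If you'd rather not hunt through the spreadsheet by hand, the same mapping can be pulled out of the export with a few lines of script. This is a minimal sketch only: the column headers ("URL", "duplicate page", "duplicate_page_content") are assumptions based on this thread, and the sample data is made up, so check them against your actual export before adapting it.

```python
import csv
import io

def load_duplicates(csv_text):
    """Map each URL flagged as a duplicate to the URLs Moz paired it with.

    Assumes a true/false flag column and a sibling column holding the
    duplicate URLs, as described above; rename to match your export.
    """
    duplicates = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        if row.get("duplicate page", "").strip().lower() == "true":
            urls = row.get("duplicate_page_content", "").split()
            duplicates[row["URL"]] = urls
    return duplicates

# A tiny made-up sample in the same shape as the export:
sample = """URL,duplicate page,duplicate_page_content
http://example.com/a,true,http://example.com/a2 http://example.com/a3
http://example.com/b,false,
"""

print(load_duplicates(sample))
```

Running it on the sample prints only the flagged URL with its two duplicates; the false row is skipped.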
-
True! It's really helpful! I might have one more question about this. When I export to CSV I get a ton of data. I open the file in Excel and separate the data into columns. The pages that have duplicate content issues are marked as "true". But how can I see within this document which pages are duplicates of another specific page?
-
No shame! There's a ton of data here, and it can be a bit of a needle in a haystack at first to figure out.
That's why these forums are so helpful!
-
Exactly. The download gives much deeper data; however, with the few clicks that Netlogiq suggested you can find it without downloading.
-
Ummm... I just found it. Not having bright moments today. Shame. You must click on the number in the column "Other URLs". I was clicking on the page title shown in the column "Page title URL".
It didn't really occur to me to click on the number.
Everything's in order! Thanks for responding, everyone!
-
Hmmm, not quite clear yet...
When I click on the issue in the overview, a list of pages with a duplicate content issue opens. Then when I click on one of those links, the only thing I see is a bold URL and some information about the duplicate content. But I don't see the URL that is a duplicate of the one displayed in bold.
-
Now, I'll preface this by saying I don't know what documents you may be looking at vs. what I have access to. I see duplicate links from SEOmoz, so you can get to them.
For example, when I log into my SEOmoz campaign information and click on the red errors box, then the duplicate content box, there's a selection of duplicate URLs right below the chart. My current one indicates it caught 29 duplicate pages of content for my Spanish signs product section, and I can see all the URLs listed out that it sees as duplicates.
Granted, SEOmoz only crawls 10,000 URLs at a time, so for a major site like mine that's only part of what we have, but it's an indicator of stuff we need to fix. I download my campaign report into a CSV file, and there are columns in that identifying what's duplicate, too.
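On a large site, that campaign CSV can run to thousands of rows, so it helps to filter it down to just the flagged rows before working through the fixes. A small sketch, assuming a true/false flag column headed "duplicate page" as discussed above (the header name and sample URLs are assumptions, not Moz's documented schema):

```python
import csv
import io

def duplicate_rows(csv_text, flag_column="duplicate page"):
    """Return only the export rows flagged as duplicate content."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row for row in reader
            if row.get(flag_column, "").strip().lower() == "true"]

# Made-up sample rows in the shape of a campaign export:
sample = """URL,duplicate page
http://example.com/signs/es/1,true
http://example.com/about,false
http://example.com/signs/es/2,true
"""

flagged = duplicate_rows(sample)
print(len(flagged))  # 2 pages flagged
```

From there the flagged list can be sorted or grouped by URL path, which makes it easier to spot whole sections (like a product category) duplicating each other.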
-
You can also export the document:
Crawl Diagnostics - Duplicate page content - export to CSV. Or, click on the +x number of duplicate pages, and you will see all the duplicate pages for that URL.
-
Yes, you can click on the error/duplicate content link and the pages will be listed. It will list the other pages below the bolded listing. Hope that helps.