Duplicate content pages
-
Crawl Diagnostics Summary shows around 15,000 duplicate content errors for one of my projects. It lists each page along with how many duplicate pages exist for it, but I don't have a way of seeing what the duplicate page URLs are for a specific page without clicking on each page link and checking it manually, which is going to take forever to sort out.
When I export the list as a CSV, the duplicate_page_content column doesn't show any data.
Can anyone please advise on this?
Thanks
-
Hey there!
Thanks for writing in.
I downloaded the CSV from your Travel Pack campaign. It looks like all of the duplicate content pages are in the CSV that I exported. I found them by sorting the rows in Excel. Here is a good guide on how to get started with sorting in Excel: http://office.microsoft.com/en-us/excel-help/sort-data-in-a-range-or-table-HP010073947.aspx
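If sorting roughly 15,000 rows by hand in Excel is impractical, a short script can do the same grouping by stripping the query string from each URL. This is just a sketch: the "url" column name and the export filename are assumptions, so check the header of your own CSV first.

```python
import csv
from collections import defaultdict
from urllib.parse import urlsplit

def group_by_base(urls):
    """Group URLs that differ only in their query string."""
    groups = defaultdict(list)
    for url in urls:
        # Drop the query string so ?session=... variants collapse together.
        base = urlsplit(url)._replace(query="").geturl()
        groups[base].append(url)
    return groups

def duplicates_from_csv(path, column="url"):
    """Read an exported crawl CSV and keep only bases with more than one URL.

    The column name "url" is an assumption -- check your CSV header.
    """
    with open(path, newline="") as f:
        urls = [row[column] for row in csv.DictReader(f)]
    return {base: dupes for base, dupes in group_by_base(urls).items()
            if len(dupes) > 1}
```

Calling `duplicates_from_csv("crawl_export.csv")` would then return each base page mapped to the list of its duplicate URLs, which you can write back out or paste into Excel.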
Thanks!
Nick
-
Sorry if my English was not clear; it's not my first language. My issue is that I can't get the list of duplicate URLs for my site...
-
If the duplicates are attached to specific query strings (after the URL, it looks like this: /?alwer.ei.we), you can block the string(s) in your robots.txt file.
Let's say there are 100 duplicates that start with "/?osifos.sdjvnksdj"; block out "?osifos" in your robots.txt.
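As a sketch, a rule like the following would keep compliant crawlers off any URL whose query string starts with that prefix (the ?osifos string is just the hypothetical example above; note that wildcard support in Disallow rules varies by crawler, though major ones such as Googlebot and Bingbot honor it):

```
User-agent: *
Disallow: /*?osifos
```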
Related Questions
-
How to find websites that are using our content
I'm trying to figure out how, using SEOmoz, I can find all of the websites that are using our content.
Moz Pro | | Showhow20 -
How to remove 404 pages in WordPress
I used the crawl tool and it returned a 404 error for several pages that I no longer have published in WordPress. They must still be on the server somewhere? Do you know how to remove them? I think they are not files on the server, like HTML files would be, since WordPress uses a database. I figure that getting rid of the 404 errors will improve SEO; is this correct? Thanks, David
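One common way to deal with removed pages that now return 404 is a 301 redirect to a relevant live page, which satisfies both visitors and crawlers. A minimal sketch for an Apache .htaccess file; the /old-page/ and /new-page/ paths are hypothetical placeholders, not paths from this site:

```
# Permanently redirect a removed page to a live replacement (Apache mod_alias)
Redirect 301 /old-page/ /new-page/
```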
Moz Pro | | DJDavid0 -
Duplicate page titles in SEOMoz
My on-page reports are showing a good number of duplicate title tags, but they are all because of a URL tracking parameter that tells us which link the visitor clicked on. For example, http://www.example.com/example-product.htm?ref=navside and http://www.example.com/example-product.htm are the same page, but are treated as two different URLs in SEOmoz. This is creating a "fake" number of duplicate page titles in my reports. This has not been a problem with Google, but SEOmoz is treating it like this and it's confusing my data. Is there a way to specify this as a URL parameter in the Moz software? Or does anybody have another suggestion? Should I specify this in GWT and BWT?
Moz Pro | | InetAll0 -
Page Authority Bulk Check?
Hi all, I was wondering if, perhaps using the SEOmoz API, there is a way to check the page authority of 500 or more URLs in Excel? Or is there a different way to do this? Thanks, Carlos
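As a sketch of the bulk approach: the SEOmoz (Mozscape) API authenticates each request with an HMAC-SHA1 signature built from your Access ID, an expiration timestamp, and your secret key, and you can then call the url-metrics endpoint once per URL and write the results to a CSV for Excel. The endpoint path, the Cols bit flag, and the field names here are assumptions to verify against the current API documentation:

```python
import base64
import hashlib
import hmac
import time
from urllib.parse import quote

def mozscape_signature(access_id: str, secret_key: str, expires: int) -> str:
    """Build the URL-encoded HMAC-SHA1 signature the Mozscape API expects."""
    message = f"{access_id}\n{expires}"
    digest = hmac.new(secret_key.encode(), message.encode(),
                      hashlib.sha1).digest()
    return quote(base64.b64encode(digest))

def url_metrics_request(access_id: str, secret_key: str, url: str) -> str:
    """Compose a signed url-metrics request URL.

    The endpoint and the Page Authority column flag are assumptions --
    check the current Mozscape API docs before relying on them.
    """
    expires = int(time.time()) + 300  # signature valid for five minutes
    sig = mozscape_signature(access_id, secret_key, expires)
    cols = 34359738368  # assumed bit flag requesting Page Authority
    return (
        "http://lsapi.seomoz.com/linkscape/url-metrics/"
        f"{quote(url, safe='')}?Cols={cols}"
        f"&AccessID={access_id}&Expires={expires}&Signature={sig}"
    )
```

Looping this over 500 URLs works, but mind the API's rate limit between requests, then paste or import the collected numbers into Excel.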
Moz Pro | | Carlos-R0 -
Crawl Diagnostics returning duplicate content based on session id
I'm just starting to dig into Crawl Diagnostics and it is returning quite a few errors. Primarily, the crawl is indicating duplicate content (page titles, meta tags, etc.) because of a session ID in the URL. I have set up a URL parameter in Google Webmaster Tools to help Google recognize the existence of this session ID. Is there any way to tell the SEOmoz spider the same thing? I'd like to get rid of these errors since I've already handled them for the most part.
Moz Pro | | csingsaas0 -
What is the difference between the Rank Tracker and the On-page Optimization page?
Both of them track keywords. In the Rank Tracker, you add each keyword manually and associate it with a URL. For the On-page Optimization page, are the URLs generated automatically based on searches and traffic?
Moz Pro | | ehabd0 -
Truncate page URLs
We have some pages (for example, a contact-us form) for which the URL is modified by the CMS depending on the referring page (this helps to put the form submission in context for the sales reps who get the contact submission). The SEOmoz crawler considers each URL a new page, so numbers in diagnostics are all inflated, as the same page is listed multiple times (e.g. for too many links). Is there a setting to change what the crawler considers to be the same page? Here are two URLs for the same page that the reports treat as separate pages: http://www.spirent.com/About-Us/Contact_us.aspx?referurl=0F528F4D703D8BB3523738D6373AA8AD http://www.spirent.com/About-Us/Contact_us.aspx?referurl=10ACDA6055244E369395223437FDCF30 The page is actually: http://www.spirent.com/About-Us/Contact_us.aspx Thanks Ken
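One general way to collapse such parameterized URLs, independent of any crawler setting, is a rel=canonical tag in the page's head, so every referurl variant points back to the one true URL. A sketch using the contact page above (whether a given crawler honors it is for that crawler's docs to say, though Google does):

```
<!-- In the <head> of Contact_us.aspx, emitted regardless of ?referurl= -->
<link rel="canonical" href="http://www.spirent.com/About-Us/Contact_us.aspx" />
```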
Moz Pro | | spirent.marcom0 -
Domain vs. page authority?
Hey, I've been told that page authority is more important than domain authority in Open Site Explorer. Why is that?
Moz Pro | | daxvirgo0