What is the best duplicate content checker that will check by phrase?
-
I create a lot of landing pages for individual keywords on my site. An issue that I've run into is I unknowingly use some common phrases repeatedly on different pages and therefore sometimes get dinged by Google.
I'm basically looking for a tool that would check the content of a new page against all the other pages on my site and check it phrase by phrase. Most of the tools I've found make you put in two URLs to check against - I need it to check against hundreds.
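In case no off-the-shelf tool fits, the phrase-by-phrase check described above can be sketched in Python using word "shingles" (sliding n-word windows). This is a rough DIY illustration, not any specific tool's behavior; the function names and the default window size are assumptions for the example:

```python
import re

def shingles(text, n=6):
    """Return the set of n-word phrases ("shingles") in a page's text."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def shared_phrases(new_page, existing_pages, n=6):
    """Compare a new page's text against many existing pages at once.

    existing_pages maps a page identifier (e.g. its URL) to its text.
    Returns {page_id: set_of_shared_phrases} for every existing page
    that overlaps the new page by at least one n-word phrase.
    """
    new_shingles = shingles(new_page, n)
    overlaps = {}
    for page_id, text in existing_pages.items():
        common = new_shingles & shingles(text, n)
        if common:
            overlaps[page_id] = common
    return overlaps
```

Feeding it the text of hundreds of existing landing pages at once is exactly the "check against my whole site" use case; lowering `n` makes the check stricter about short repeated phrases.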
-
Looks pretty good - thanks!
-
Dave, you might want to check out UN.CO.VER, from Textbroker: http://www.textbroker.com/uncover/
I use it a lot, and it's customizable to the level of checking you desire. Nice, because it'll give you a live link to the location where it's detected duplicate content. Free, by the way.
-
Thanks for the suggestion. I've found Copyscape to be okay if there is overt plagiarism but not particularly effective at the phrase-level so am hoping for something else.
Related Questions
-
I am trying to better understand solving the duplicate content issues highlighted in your recent crawl report of our site - www.thehomesites.com. Below are some of the URLs flagged as having duplicate content:
http://www.thehomesites.com/zip_details/76105
http://www.thehomesites.com/zip_details/44135
http://www.thehomesites.com/zip_details/75227
http://www.thehomesites.com/zip_details/94501
These are neighborhood reports generated for 4 different zip codes. We use a standard template to create these reports. What are some of the steps we can take to avoid these pages being categorized as duplicate content?
On-Page Optimization | | urahul
Moz Crawl Shows Duplicate Content Which Doesn't Seem To Appear In Google?
Morning all - first post, be gentle! Moz crawled our website and flagged 2,500 high-priority duplicate content issues, which is not good. However, if I just do a simple site:www.myurl.com search in Google, I cannot see these duplicate pages... very odd. Here is an example:
http://goo.gl/GXTE0I
http://goo.gl/dcAqdU
The same page has a different URL, and Moz flags this as an issue; I would agree with that. However, if I search for both URLs in Google, they both bring up the same page under the original URL of http://goo.gl/zDzI7j. In other words, two different URLs bring up the same indexed page in Google... weird. I thought about using a wildcard in robots.txt to disallow these duplicate pages with poor URLs, something like:
Disallow: /*display.php?product_id
However, I've read various posts suggesting it might not help, and I don't want to make things worse. On another note, my colleague paid for an "SEO service" that just dumped thousands of backlinks on our website, and of course that's come back to bite us. Does anyone have recommendations for a good service to remove these backlinks? Thanks in advance!
On-Page Optimization | | scottiedog
Where to add new content
I run a vBulletin website, and vBulletin isn't very SEO friendly. I do fairly well in Google for most of my keywords, but forums don't necessarily build strong page authority. My site deals with fishing reports across the state of VA and drives 15-18k sessions and close to 100,000 page views a month, based on Google Analytics. I want to start targeting new keywords, and I'm concerned about vBulletin's SEO limitations. Many of my new keywords aren't dynamic like the fishing reports added by members daily; they're more like campgrounds, marinas, etc. My thought is to install a WordPress blog and build out this content there so I can handle on-page SEO efficiently. The vBulletin software is installed in the root, so I would install WordPress in something like mydomain/lake123/. Is this the right thing to do, and will Google see multiple sitemaps (one for vBulletin and another for WordPress) and index them appropriately? Am I missing something major here? Thanks ~ Brian
On-Page Optimization | | FCBCO
New Client Wants to Keep Duplicate Content Targeting Different Cities
We've got a new client who has about 300 pages on their website that are the same except the cities that are being targeted. Thus far the website has not been affected by penguin or panda updates, and the client wants to keep the pages because they are bringing in a lot of traffic for those cities. We are concerned about duplicate content penalties; do you think we should get rid of these pages or keep them?
On-Page Optimization | | waqid
High Volume Duplicate Title and Content Errors: Scale of 1-10 How bad is this?
I have 15k pages, and 1.5k of them have duplicate title and content errors. The reason is that I have a ring parent page with child pages for each of the size variations. On a scale of 1-10, how big an issue is this, and does it need fixing?
On-Page Optimization | | Tippman
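For the parent/child variation pattern described above, a common approach is a rel=canonical link on each size-variation page pointing at the parent product page, so the duplicates consolidate rather than compete. A hedged sketch - the URLs here are invented placeholders, not the poster's actual site structure:

```html
<!-- In the <head> of each size-variation child page,
     pointing at the ring's parent page -->
<link rel="canonical" href="https://example.com/rings/classic-band/">
```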
Tags creating duplicate content issues?
Hello. I believe a lot of us use tags in our blogs as a way to categorize content and make it easily searchable, but this usually (at least in my case) causes duplicate content. For example, if one article has two tags like "SEO" and "Marketing", then this article will be visible and listed at two URLs inside the blog, like domain.com/blog/seo and domain.com/blog/marketing. In the case of a blog with 300+ posts and dozens of different tags, this creates a huge issue. My questions are: 1. Is this really bad? 2. If yes, how do I fix it without removing tags?
On-Page Optimization | | Lakiscy
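One common way to keep tags useful for visitors while keeping the tag archives out of the index is a noindex, follow robots meta tag on the tag listing pages. A sketch only - the URLs are the poster's own examples, and whether this fits depends on whether those archive pages earn search traffic themselves:

```html
<!-- On tag archive pages such as domain.com/blog/seo
     and domain.com/blog/marketing: keep crawling the links,
     but leave the archive page itself out of the index -->
<meta name="robots" content="noindex, follow">
```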
Duplicate Page Content and Duplicate Page Title
Hi all, I'm new to SEOmoz and have some questions after spending 2-3 days trying to resolve the problems identified from crawling one of my clients' websites. I get quite a lot of Duplicate Page Content and Duplicate Page Title warnings and have been searching the forums and posts for a workaround. I continuously get this error on most of my pages: the URL http://domain.com/benefits serves the same page as the URL with WWW in front, http://www.domain.com/benefits. Any advice will be highly appreciated. Thanks, Athos
On-Page Optimization | | athosk
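For www vs. non-www duplication like the case above, the usual fix is a site-wide 301 redirect to one canonical hostname. A minimal Apache sketch, assuming mod_rewrite is available and using domain.com as a stand-in for the client's real domain (swap the direction if the bare domain is preferred):

```
RewriteEngine On
# Redirect bare-domain requests to the www hostname with a 301
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
```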
Can duplicate content issues be solved with a noindex robot metatag?
Hi all, I have a number of duplicate content issues arising from a recent crawl diagnostics report. Would using a robots meta tag (like below) on the pages I don't mind being excluded from the index be an effective way to solve the problem? Thanks for any / all replies.
On-Page Optimization | | joeprice
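The tag the poster refers to appears to have been stripped when the post was rendered; a standard noindex robots meta tag would presumably look like this, placed in the <head> of each affected page (this is a reconstruction, not the poster's original snippet):

```html
<meta name="robots" content="noindex">
```

Using content="noindex, follow" instead keeps crawlers following the page's links even though the page itself is dropped from the index.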