Duplicate Content Report: Duplicate URLs being crawled with "++" at the end
-
Hi,
In our Moz report over the past few weeks I've noticed some duplicate URLs appearing like the following:
Original (valid) URL:
http://www.paperstone.co.uk/cat_553-616_Office-Pins-Clips-and-Bands.aspx?filter_colour=Green
Duplicate URL:
http://www.paperstone.co.uk/cat_553-616_Office-Pins-Clips-and-Bands.aspx?filter_colour=Green++
These aren't appearing in Webmaster Tools or in a Screaming Frog crawl of our site, so I'm wondering if this is a bug with the Moz crawler. I realise it could be resolved with a canonical reference, or with a 301 from the duplicate to the canonical URL, but I'd like to find out what's causing it and whether anyone else is experiencing the same problem.
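To make the 301 option concrete, this is the sort of normalisation I'd apply before redirecting (just a rough sketch in Python; the helper name is my own and not from any particular framework):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url):
    """Strip trailing '+' characters from the query string so the
    duplicate variant maps back to the original URL. If the result
    differs from the request URL, issue a 301 to it."""
    parts = urlsplit(url)
    return urlunsplit(parts._replace(query=parts.query.rstrip("+")))

dup = "http://www.paperstone.co.uk/cat_553-616_Office-Pins-Clips-and-Bands.aspx?filter_colour=Green++"
print(canonical_url(dup))
# -> http://www.paperstone.co.uk/cat_553-616_Office-Pins-Clips-and-Bands.aspx?filter_colour=Green
```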
Thanks,
George
-
So glad to help, George!
-
Hi Chiaryn,
Thanks - you've been really helpful! I had assumed that as the referrer wasn't in the Web UI (per WMT), it wasn't available anywhere. I'd also assumed it was a copywriting issue and not a product data issue.
I need to revisit my assumptions.
George
-
Hey George,
Thanks for writing in.
I looked into the pages with the ++ in the URL, and it seems that they do actually exist on the site, so it isn't our crawler causing these crawl errors. For example, a link to the URL http://www.paperstone.co.uk/cat_553_Desktop-Essentials.aspx?filter_colour=Green++ can be found in the source code of the page http://www.paperstone.co.uk/cat_553_Desktop-Essentials.aspx here: http://screencast.com/t/HpHTlSs5gH8H
You can find the referral pages for the ++ URLs by downloading the Full Crawl Diagnostics CSV. In the first column, search for ++. When you find the correct row, look in the column labeled referrer (column AM in the spreadsheet). This tells you the referral URL of the page where our crawlers first found the URLs that include ++. You can then visit that URL to find the links to those pages.
Since these URLs with the ++ do resolve with a 200 HTTP status, and they have the same code and content as the pages without the ++, our crawler counts them as duplicate content. I'm not certain why Screaming Frog and GWT aren't finding or reporting these pages; it may be that they parse the + signs in the URL differently than our crawler does.
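One plausible explanation for the different behaviour (an assumption on my part, not something either tool has confirmed): in form-encoded query strings a '+' decodes to a space, so a tool that decodes values before comparing URLs sees only trailing whitespace, which it may trim away. Python's standard library illustrates the decoding:

```python
from urllib.parse import parse_qs

# '+' in a query string decodes to a space under the
# application/x-www-form-urlencoded convention.
print(parse_qs("filter_colour=Green++"))  # {'filter_colour': ['Green  ']}
print(parse_qs("filter_colour=Green"))    # {'filter_colour': ['Green']}
```

A crawler comparing the raw strings sees two distinct URLs; one comparing decoded, whitespace-trimmed values could treat them as the same page and report nothing.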
As Keri and bishop23 mentioned, this is most likely not a major issue if GWT isn't reporting the errors, but we prefer to report the issues because we would rather be safe than sorry.
I hope this helps. Please let me know if you have any other questions.
Chiaryn
-
I'm not seeing an answer that jumps out at me for this one. For the immediate future, don't sweat it if you're not seeing it in GWT. This is assigned to our help desk, and we'll have someone from there investigate more and get back to you, though it might be a few days because of the Thanksgiving holiday (if you don't get an answer today, it may be Monday before we have a chance to respond).
-
If they're not appearing in WMT, you can safely ignore them, unless it's exact duplicate content, in which case delete the duplicates.