Duplicate Page Due To Website Display Function
-
Hi
Can anyone help with how I can rectify a duplicate issue? A high-priority item on my Moz report shows a duplicate issue; however, this is due to the way the website is structured. For example, the duplicate below is created because the website has a function to display all trips, so customers do not need to search page by page, i.e.:
http://www.bikecation.co.uk/categories/cycling-climbs
http://www.bikecation.co.uk/categories/cycling-climbs/page/2?showall=1
My question is: will this format damage the SEO for this page? Is there a way to rectify it? Would a canonical tag work in this case?
Many Thanks
Claire
-
Hi Jordan
Thank you this has completely cleared things up. Thanks for your help.
Claire
-
Hi there!
While our crawler doesn't support rel="prev" or rel="next" tags as a means of resolving duplicate content, we do recognize canonical tags. In this case, it looks like we're reporting these as duplicates because the canonical tag on http://www.bikecation.co.uk/categories/cycling-climbs/page/2?showall=1 is pointing to http://www.bikecation.co.uk/categories/cycling-climbs/page/2 rather than http://www.bikecation.co.uk/categories/cycling-climbs. I hope this helps!
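For illustration, the fix described above would mean pointing the canonical tag on the showall URL back at the main category page. A sketch of what that `<head>` markup might look like (illustrative only, not the site's actual code):

```html
<!-- On http://www.bikecation.co.uk/categories/cycling-climbs/page/2?showall=1 -->
<!-- Point the canonical at the main category page, not the paginated URL -->
<link rel="canonical" href="http://www.bikecation.co.uk/categories/cycling-climbs" />
```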
-
Hi Chris
This does help, thank you. I have downloaded a CSV, and under the Rel Canonical column it lists http://www.bikecation.co.uk/categories/cycling-climbs
Is this how it should look if the tag is implemented correctly?
I need to understand this report and canonical tags/duplicates in more detail. I will see if Moz has any learning guides.
Thanks for your help - Much appreciated.
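If you want to double-check what a page's canonical tag actually says, rather than relying only on the CSV export, a small stdlib-only Python sketch like this can extract it. This is a hypothetical helper (the sample HTML is illustrative, and fetching the live page with urllib plus error handling is left out):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link" and self.canonical is None:
            attrs = dict(attrs)
            if attrs.get("rel") == "canonical":
                self.canonical = attrs.get("href")

# Illustrative page source; in practice you would fetch the page itself.
sample = """
<html><head>
<link rel="canonical" href="http://www.bikecation.co.uk/categories/cycling-climbs">
</head><body></body></html>
"""

finder = CanonicalFinder()
finder.feed(sample)
print(finder.canonical)
```

If the printed URL matches the page you want search engines to treat as the original, the tag is doing its job.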
-
As far as I remember, Moz isn't so hot on detecting canonical links and/or pagination, both of which are on your site. These essentially stop the duplicate content that you think you are seeing, so don't panic. To get more technical, canonical and pagination tags shouldn't really be on the same page, but in the worst case one is ignored, so there's no need to panic.
TL;DR: Moz doesn't dismiss errors like the above based on pagination/canonical tags, so you can ignore them since you have these implemented (can you even see them if you export a report as a CSV?).
Hope that helps.
Related Questions
-
How to index your website pages on Google 2020 ?
Hey! Hopefully, everyone is fine here I tell you some step how you are index your all website pages on Google 2020. I'm already implementing these same steps for my site Boxes Maker. Now Below I'm giving you some steps for indexing your website pages. These are the most important ways to help Google find your pages: Add a sitemap. ... Make sure people know your site. ... Ensure full navigation on your site. ... Apply the indexing application to your homepage. ... Sites that use URL parameters other than URLs or page names may be more difficult to broadcast.
Intermediate & Advanced SEO | | fbowable0 -
Our client's web property recently switched over to secure pages (https) however there non secure pages (http) are still being indexed in Google. Should we request in GWMT to have the non secure pages deindexed?
Our client recently switched over to https via new SSL. They have also implemented rel canonicals for most of their internal webpages (that point to the https). However many of their non secure webpages are still being indexed by Google. We have access to their GWMT for both the secure and non secure pages.
Intermediate & Advanced SEO | | RosemaryB
Should we just let Google figure out what to do with the non secure pages? We would like to setup 301 redirects from the old non secure pages to the new secure pages, but were not sure if this is going to happen. We thought about requesting in GWMT for Google to remove the non secure pages. However we felt this was pretty drastic. Any recommendations would be much appreciated.0 -
Internal Duplicate Pages causing dip in rankings
Hi Guys, Need help in understanding whether having duplicate pages on your site push you down in rankings. Our all product pages getting indexed by Google with different parameters i.e. filters, affiliate id, utm_source etc. and then we have 10-15 duplicate for one product page. I am observing dip in rankings whenever Google starts indexing these duplicate but when I asked this question to John Muller and other Google team they said if you set up canonical then you don't have to worry about having different urls for same page but we are not ranking on Google and if we do then we dropped from page 1 to page 2 or sometimes page 3. Example - http://goo.gl/G5p3X5 Any suggestions.
Intermediate & Advanced SEO | | Webmaster_SEO0 -
Just found a wordpress blog duplicating main website blog - what to do?
Hello Mozzers, I am working on a website and found the social media agency, employed by the website owner, was running a parallel wordpress blog which duplicates the content on the main website's blog (200 odd pages of this duplicating wordpress blog are indexed - the duplication is exact other than for non-blog content pages - around 60 category, date pages, homepage, etc. I am planning to 301 redirect the wordpress blog pages to equivalent pages on website blog, and then 301 redirect the homepage, category and date pages, etc. to the website blog homepage, so all the blog pages redirect to somewhere on main website. _Does this make sense, or should I only worry about redirecting the blog content pages? _ Also, the main website is new and there are redirects coming in to pages from old website already. _Is there anything to be cautious about when redirecting to a main website from multiple old websites? _ Thanks in advance, Luke
Intermediate & Advanced SEO | | McTaggart0 -
How long takes to a page show up in Google results after removing noindex from a page?
Hi folks, A client of mine created a new page and used meta robots noindex to not show the page while they are not ready to launch it. The problem is that somehow Google "crawled" the page and now, after removing the meta robots noindex, the page does not show up in the results. We've tried to crawl it using Fetch as Googlebot, and then submit it using the button that appears. We've included the page in sitemap.xml and also used the old Google submit new page URL https://www.google.com/webmasters/tools/submit-url Does anyone know how long will it take for Google to show the page AFTER removing meta robots noindex from the page? Any reliable references of the statement? I did not find any Google video/post about this. I know that in some days it will appear but I'd like to have a good reference for the future. Thanks.
Intermediate & Advanced SEO | | fabioricotta-840380 -
Artist Bios on Multiple Pages: Duplicate Content or not?
I am currently working on an eComm site for a company that sells art prints. On each print's page, there is a bio about the artist followed by a couple of paragraphs about the print. My concern is that some artists have hundreds of prints on this site, and the bio is reprinted on every page,which makes sense from a usability standpoint, but I am concerned that it will trigger a duplicate content penalty from Google. Some people are trying to convince me that Google won't penalize for this content, since the intent is not to game the SERPs. However, I'm not confident that this isn't being penalized already, or that it won't be in the near future. Because it is just a section of text that is duplicated, but the rest of the text on each page is original, I can't use the rel=canonical tag. I've thought about putting each artist bio into a graphic, but that is a huge undertaking, and not the most elegant solution. Could I put the bio on a separate page with only the artist's info and then place that data on each print page using an <iframe>and then put a noindex,nofollow in the robots.txt file?</p> <p>Is there a better solution? Is this effort even necessary?</p> <p>Thoughts?</p></iframe>
Intermediate & Advanced SEO | | sbaylor0 -
Content Marketing: Should we build a separate website or built in site within the Website itself?
Hi Mozzers, Client: Big carpet cleaner player in the carpet cleaning industry Main Goal: Creating good content to Get more organic traffic to our main site Structure of the extra content: It will act like a blog but will be differentiated from the regular site by not selling anything but just creating good content. The look and design will be different from the client's site. SEO question: In terms of SEO, what would be the most beneficial for us to do, should we built in this new section/site outside or inside the client's site? I personally think that it should be separated from the main site because of the main reasons: A followed link to the main site Anchor texts implementation linking back to our service pages If we would to choose to build in this content, it would be highly beneficial for getting organic traffic within the main site but I am afraid this will not provide us any link juice since anchor texts won't be accounted the same since all of those would be located in the Nav bar of the main site. Can someone tell me what would be the best in terms of SEO? P.S: My boss doesn't agree with me and would rather go the second option (build in within the main site) that's why i am asking you guys what would be the most beneficial? Thank you Guys
Intermediate & Advanced SEO | | Ideas-Money-Art0 -
Coupon Website Has Tons of Duplicate Content, How do I fix it?
Ok, so I just got done running my campaign on SEOMOZ for a client of mine who owns a Coupon Magazine company. They upload thousands of ads into their website which gives similar looking duplicate content ... like http://coupon.com/mom-pop-shop/100 and
Intermediate & Advanced SEO | | Keith-Eneix
http://coupon.com/mom-pop-shop/101. There's about 3200 duplicates right now on the website like this. The client wants the coupon pages to be indexed and followed by search engines so how would I fix the duplicate content but still maintain search-ability of these coupon landing pages?0