Duplicate content from pagination and categories found in multiple locations
-
Hey Moz community,
Really need help resolving duplicate content issues for an eCommerce site built on Magento.
We have duplicate content issues with category pagination and categories found in multiple locations.
Here's an example:
"www.website.com/style/sequin-dresses" is also found at "www.website.com/features/sequin-dresses"
To resolve this issue, do we just need to place a canonical tag on "www.website.com/features/sequin-dresses" pointing to "www.website.com/style/sequin-dresses"?
In addition, the category "Sequin Dresses" also has pagination. To resolve duplicate content issues with pagination, do we need to implement rel=next/prev tags? (We do not have a view-all page due to the number of products featured.)
If anyone has experience with this or any insights on how to resolve these issues please let me know.
Thanks!
-
You should be able to use canonicalisation here, but for a more in-depth guide to pagination including rel="next", "prev", etc., check out this blog post by my former agency. It's a great resource on the subject.
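For the asker's example, canonicalisation would just mean adding one line to the <head> of the duplicate URL, something along these lines (using the example URLs from the question; swap in your real protocol and domain):

<link rel="canonical" href="http://www.website.com/style/sequin-dresses" />

That line would sit on www.website.com/features/sequin-dresses, pointing at the /style/ version you want indexed.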
-
Actually, the rel=canonical would solve both issues. I am guessing that, because of the way the site is built, 80% of the content from page to page within a category is the same. It might be using the same title tag (perhaps with "page 2" appended, perhaps not), and the meta description is more than likely the same as well. It is a given that all of the template code is the same, so unless you are using long, unique descriptions on the category pages, that is the issue.
I would also implement rel=next and rel=prev. You might have someone code in a way to vary the title tag based on the page number, and if you are feeling spendy, have someone code in a unique meta description for each category page too.
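To make that concrete, here is a rough sketch of what the <head> of page 2 of the category could contain (I'm assuming Magento's default ?p= page parameter here; adjust to however your pagination URLs are actually built):

<title>Sequin Dresses - Page 2</title>
<link rel="prev" href="http://www.website.com/style/sequin-dresses" />
<link rel="next" href="http://www.website.com/style/sequin-dresses?p=3" />

Page 1 keeps the plain URL and only carries rel="next"; the last page in the series only carries rel="prev".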
Related Questions
-
How do I create multiple page URLs that are optimized for location and keywords that may be overlapping or the same?
Hi guys, I am attempting to create unique URLs for several different pages on a website. Let's say hypothetically that this is a website for a chain of Ice Cream Shops in Missouri, and they have 15 locations in Springfield, Missouri. I would ideally like to optimize our Ice Cream Shop pages in Springfield, Missouri for the main keyword (ice cream) but also the geo-specific location (Springfield); we obviously can't have duplicate URLs for these 15 locations, though. We also have several secondary keywords, think things like frozen yogurt or waffle cone, that we can also use, although it would most likely be more powerful if we use the primary keyword. Any suggestions for how to go about doing this most effectively? Thanks!
On-Page Optimization | GreenStone
-
Duplicate content on events site
I have an event website, and for every day the event occurs, the event has a page. For example: the Oktoberfest in Germany runs for 16 days, so my site would have 16 (almost) identical pages about the Oktoberfest (same text, address, photos, contact info). The only difference between the pages is the date mentioned on the page. I use rich snippets. How does Google treat my pages, and what is the best practice?
On-Page Optimization | dragonflo
-
Solve duplicate content issues by using robots.txt
Hi, I have a primary website, and besides that I also have some secondary websites that have the same content as the primary website. This leads to duplicate content errors. Because there are many duplicate-content URLs, I want to use the robots.txt file to prevent Google from indexing the secondary websites and fix the duplicate content issue. Is that OK? Thanks for any help!
On-Page Optimization | JohnHuynh
-
Duplicate page content
What is duplicate page content? I have a dating site, and it's got a groups area where the members can base their discussions in a category, for example night life, health and beauty, and such. Why would this cause a problem of duplicate page content, and how would I fix it? Explained in the terms of a dummy, please.
On-Page Optimization | clickit2getwithit
-
Duplicate content harms individual pages or whole site?
Hi, One section of my site is a selection of Art and Design books. I have about 200 individual posts, each with a book image and a description retrieved from Amazon (using their API). Due to several reasons not worth mentioning, I decided to use the Amazon description. I don't mind if those pages rank well or not, but I need them as additional content for my visitors as they browse my site. The value lies in the selection of books. My question is whether the duplicate content taken from Amazon harms only each book page or the whole site. The rest of the site has unique content. Thanks! Enrique
On-Page Optimization | enriquef
-
Duplicate Product BUT Unique Content -- any issues?
We have the situation where a group of products fit into 2 different categories and also serve different purposes (to the customer). Essentially, we want to have the same product duplicated on the site, but with unique content and it would even have a slightly different product name. Some specifications would be redundant, but the core content would be different. Any issues?
On-Page Optimization | SEOPA
-
Tools for finding duplicate content offsite?
Hi, is there a tool that will spider my site and then find similar text on external sites?
On-Page Optimization | adamzski
-
How do we handle sitemaps in robots.txt when multiple domains point to same physical location?
We have www.mysite.net, www.mysite.se, www.mysite.fi and so on. All of these domains point to the same physical location on our webserver, and we replace the text returned to the client depending on which domain was requested. My problem is this: how do I configure sitemaps in robots.txt when robots.txt is used by multiple domains? If I, for instance, put the rows Sitemap: http://www.mysite.net/sitemapNet.xml and Sitemap: http://www.mysite.net/sitemapSe.xml in robots.txt, would that result in some cross-submission error?
On-Page Optimization | nordicnetproducts