ECommerce site - Duplicate pages problem.
-
We have an eCommerce site with multiple products being displayed on a number of pages.
We use rel="next" and rel="prev" and have a "View All" page, which I understand Google should be able to find automatically.
-
Should we also be using a canonical tag to tell Google to give authority to the first page or the View All page? Or is our current use of the rel=next and rel=prev tags adequate?
-
We currently display 20 products per page. We were thinking of increasing this to create fewer pages, which would make some of the later product pages redundant. If we add 301 redirects on the redundant pages, does anyone know what sort of impact this might have on traffic and SEO?
General thoughts are welcome if anyone has had similar problems.
-
-
Many thanks, you have been most helpful.
Yes, I see your point. I think we will have a look at implementing this on a couple of categories where we can monitor traffic and rankings. Then, if it looks good, we will roll it out to the rest of the site.
Thank you.
Sarah
-
Essentially yes - pages 2+ of search just look "thin" to Google. They tend to have similar title tags, META descriptions, etc., and Google honestly isn't all that fond of indexing search pages in the first place (they don't want their search to land on your search). Those pages also don't tend to attract links or make a lot of sense for someone landing on them. With META NOINDEX,FOLLOW, Google can still crawl through those search pages to deeper pages, but the search pages themselves don't dilute your overall site in the index.
Google's preferred method (or so they say) in 2012 is rel=prev/next, but I find that implementation can be much trickier than META NOINDEX. It's a difficult topic, and I honestly find that the ideal approach varies wildly from site to site. It's important to plan well, implement carefully, and measure the results.
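For what it's worth, the META NOINDEX approach boils down to a one-line template rule. A minimal sketch (the helper name here is hypothetical, not from any particular CMS):

```python
def robots_meta_for_page(page_number):
    """Robots META tag a paginated search template might emit.

    Pages 2+ get NOINDEX,FOLLOW: Google can still crawl through
    them to deeper product pages, but they stay out of the index
    and stop diluting the site.
    """
    if page_number >= 2:
        return '<meta name="robots" content="noindex,follow">'
    # Page 1 stays fully indexable.
    return '<meta name="robots" content="index,follow">'
```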
-
Hi Peter,
Many thanks for your answer - very comprehensive and much appreciated. There are certainly some good suggestions here.
Just quickly - you mention putting a NOINDEX,FOLLOW on every page from 2 or 3 onwards. I take it that's because later pages don't rank too well?
Is the idea behind it that the link juice is being diluted too much? By keeping only the first two pages indexed, would I stand a better chance of ranking higher?
I will pass your suggestions on to my developer and see what we can come up with. Will monitor and report back, hopefully with a sorted solution.
Once again, many thanks for the sound advice.
Sarah.
-
Unfortunately, pagination + sorts gets ugly fast. Technically, the rel=prev/next tags should contain the sort parameter, AND then you should canonical to the main pagination page. So, for example, if you had a page like:
www.example.com/search.php?page=2&sort=asc
You should have tags like:
- Rel=Prev: http://www.example.com/search.php?page=1&sort=asc
- Rel=Next: http://www.example.com/search.php?page=3&sort=asc
- Canonical: http://www.example.com/search.php?page=2
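A sketch of how a template might build that trio of tags - rel=prev/next keeping the sort parameter, canonical dropping it (the function and parameter names here are illustrative, not from any real codebase):

```python
from urllib.parse import urlencode

BASE = "http://www.example.com/search.php"

def pagination_tags(page, total_pages, sort=None):
    """Build rel=prev/next tags that keep the sort parameter,
    plus a canonical that drops it (pointing at the unsorted page)."""
    def url(p, with_sort):
        params = {"page": p}
        if with_sort and sort:
            params["sort"] = sort
        return BASE + "?" + urlencode(params)

    tags = []
    if page > 1:
        tags.append('<link rel="prev" href="%s">' % url(page - 1, True))
    if page < total_pages:
        tags.append('<link rel="next" href="%s">' % url(page + 1, True))
    # The canonical strips the sort, so all sorted variants
    # consolidate onto the plain paginated URL.
    tags.append('<link rel="canonical" href="%s">' % url(page, False))
    return tags
```

For `?page=2&sort=asc` this yields exactly the three tags listed above.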
In practice, it's incredibly hard to implement. So, you could do a couple of things:
(1) Block the sort_by parameter with Google Webmaster Tools parameter handling
(2) Use META NOINDEX, FOLLOW on all pages 2+ of search and sort URLs
I don't find Robots.txt works that well, in practice, and 800K blocked URLs can make Google jump. I'm actually confused by how Google is crawling the sorts at all (since they're form-driven). It looks like you put the sorts in your pagination links. Would it be possible to store any sorts in a cookie or session variable and not add those to links?
Given your current situation, and that Google has indexed thousands of sort URLs (from what I can see), I think the Google Webmaster Tools approach might be the safest. This is a complex problem, though, and you may need to consult someone.
-
Hi ,
In answer to your point on question 2: currently the maximum number of pages we have is 4 plus a View All for a few of our products, but most products are split over 2 pages plus a View All.
For the largest example we have 83 products, broken down so that pages 1 to 4 have 20 products each, page 5 has 3, and View All shows all 83. rel=prev and rel=next are on the pages, and View All has nothing on it (is that okay?). The title tags are duplicated across the numbered pages, so I was going to add "page 2", "page 3", etc. to sort that.
I was going to increase the number of products per page to 30, which would in effect take me down to 3 pages plus View All. More importantly, I thought I would also get stronger link value and less dilution, hence better SEO.
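For what it's worth, the page-count arithmetic on that change works out like this (a quick illustration, not site code):

```python
import math

def page_count(total_products, per_page):
    """Number of numbered listing pages needed (View All excluded)."""
    return math.ceil(total_products / per_page)

# 83 products at 20 per page -> 5 pages (pages 1-4 full, page 5 has 3).
# 83 products at 30 per page -> 3 pages, so old pages 4 and 5 disappear.
```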
The pages don't rank particularly well at all, but on Google's speed test I think we score 85/100, so from a speed point of view it shouldn't be a problem. I was just worried that big changes like this could have a dramatic effect.
The URL, in case you're interested, is http://www.bestathire.co.uk/rent/Scaffold_towers/266
Many thanks
Very much appreciated.
Sarah.
-
Hi ,
Many thanks for your reply.
We do have pagination and sorts - listing products A-Z, Z-A, price low to high and high to low, etc. - which all generate different URLs, but we have entries in the robots.txt file telling Google not to spider them. See below.
Also, looking at WMT, it says it has blocked 886,996 URLs in the past 90 days. Our site has approx. 54,000 indexed pages.
Disallow: */sort_by:Product.price%20ASC
Disallow: */sort_by:Product.price%20DESC
Disallow: */sort_by:Product.title%20ASC
Disallow: */sort_by:Product.title%20DESC
Disallow: */sort_by:Product.distance%20ASC
Disallow: */sort_by:Product.distance%20DESC
Disallow: */stealth:on
Are you suggesting we canonical the sorts as well for safety, in case we have missed anything?
Sarah
-
(1) DON'T canonical to the first page of results - Google definitely has issues with that. If you've got rel=prev/next in place, then I wouldn't canonical to "View All", either. They're kind of competing signals. You can use rel=prev/next with rel=canonical, but it's a bit complicated. Basically, it's for situations where you have pagination AND some other parameter, like a sort.
(2) If you increase it, just make sure it doesn't negatively impact users or load-times (might be worth A/B testing, honestly). Are you saying that you might end up with a URL like "?page=7" that basically doesn't exist, because now you'll have fewer pages? I think you might be safer just letting that 404 and having Google recrawl the new structure. The odds of having any inbound links to page 7 of search results are very low, and just letting those pages die off may be safer.
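A sketch of the "let it 404" approach - the names are hypothetical, and any framework's status handling works the same way:

```python
import math

def listing_page_status(requested_page, total_products, per_page=30):
    """Return an HTTP status code for a paginated listing request.

    Pages beyond the last real page (e.g. ?page=7 after a page-size
    increase) get a plain 404 instead of a 301, letting Google
    recrawl the new structure and drop the dead URLs naturally.
    """
    last_page = math.ceil(total_products / per_page)
    if 1 <= requested_page <= last_page:
        return 200
    return 404
```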
-
One more thing worth getting properly sorted on your website: if you're displaying a page with 20 products (by default) and it takes a complicated URL extension to see the next one (e.g. domain.com/?abc=123etc#321), that's a more significant problem to be concerned about - cleaner URLs like domain.com/category/page/1/ and page/2/ are preferable.
In theory, page/1/ (blog style) contains the same content as the home page (/1/ vs /). A common practice is noindex,follow for any page from 2 onwards. You should definitely consider rel=canonical across the site, though - it's essential - as well as rel="next" and rel="prev".