ECommerce site - Duplicate pages problem.
-
We have an eCommerce site with multiple products being displayed on a number of pages.
We use rel="next" and rel="prev" and have a View All page, which I understand Google should be able to find automatically.
-
Should we also be using a canonical tag to tell Google to give authority to the first page or the View All page? Or is our current use of the rel next/prev tags adequate?
-
We currently display 20 products per page. We were thinking of increasing this to produce fewer pages, which would make some of the later product pages redundant. If we add 301 redirects on the redundant pages, does anyone know what sort of impact this might have on traffic and SEO?
General thoughts from anyone who has had similar problems are welcome.
-
-
Many thanks, you have been most helpful.
Yes, I see your point. I think we will have a look at implementing this on a couple of categories where we can monitor traffic and rankings. Then, if it looks good, we will roll it out to the rest of the site.
Thank you.
Sarah
-
Essentially yes - pages 2+ of search just look "thin" to Google. They tend to have similar title tags, META descriptions, etc., and Google honestly isn't all that fond of indexing search pages in the first place (they don't want their search to land on your search). Those pages 2+ also don't tend to attract links or make a lot of sense for someone landing on them. By using META NOINDEX,FOLLOW, Google can crawl through those search pages to deeper pages, but the actual search pages don't dilute your overall site and search index.
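For reference, that NOINDEX,FOLLOW directive is just a single meta tag in each paginated page's head - a minimal sketch (the surrounding markup is illustrative, not your actual template):

```html
<head>
  <!-- Keep pages 2+ of search out of the index, but let crawlers
       follow the links on them through to the deeper product pages -->
  <meta name="robots" content="noindex,follow">
</head>
```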
Google's preferred method (or so they say) in 2012 is rel=prev/next, but I find that implementation can be much trickier than META NOINDEX. It's a difficult topic, and I honestly find that the ideal approach varies wildly from site to site. It's important to plan well, implement carefully, and measure the results.
-
Hi Peter,
Many thanks for your answer. Very comprehensive and much appreciated. There are certainly some good suggestions here.
Just quickly, you mention putting a NOINDEX,FOLLOW on every page from 2 or 3 onwards. I take it that's because the later pages don't rank too well?
Is the idea behind it that the link juice is being diluted too much? By keeping only the first two pages indexed, say, would I stand a better chance of ranking higher?
I will pass your suggestions on to my developer and see what we can come up with. Will monitor and report back, hopefully with a sorted solution.
Once again, many thanks for the sound advice.
Sarah.
-
Unfortunately, pagination + sorts gets ugly fast. Technically, the rel=prev/next tags should contain the sort parameter AND then you should canonical to the main pagination page. So, for example, if you had a page like:
www.example.com/search.php?page=2&sort=asc
You should have tags like:
- Rel=Prev: http://www.example.com/search.php?page=1&sort=asc
- Rel=Next: http://www.example.com/search.php?page=3&sort=asc
- Canonical: http://www.example.com/search.php?page=2
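Put together, the head of that page-2 sort URL would carry all three tags at once - a sketch based on the example URLs above (note the ampersands are entity-encoded in HTML attributes):

```html
<head>
  <!-- Canonical strips the sort parameter, pointing at the plain paginated URL -->
  <link rel="canonical" href="http://www.example.com/search.php?page=2">
  <!-- prev/next keep the sort parameter so the paginated series stays consistent -->
  <link rel="prev" href="http://www.example.com/search.php?page=1&amp;sort=asc">
  <link rel="next" href="http://www.example.com/search.php?page=3&amp;sort=asc">
</head>
```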
In practice, it's incredibly hard to implement. So, you could do a couple of things:
(1) Block the sort_by parameter with Google Webmaster Tools parameter handling
(2) Use META NOINDEX, FOLLOW on all pages 2+ of search and sort URLs
I don't find robots.txt works that well in practice, and 800K blocked URLs can make Google jump. I'm actually confused by how Google is crawling the sorts at all (since they're form-driven). It looks like you put the sorts in your pagination links. Would it be possible to store any sorts in a cookie or session variable and not add them to links?
Given your current situation, and that Google has indexed thousands of sort URLs (from what I can see), I think the Google Webmaster Tools approach might be the safest. This is a complex problem, though, and you may need to consult someone.
-
Hi ,
In answer to your point on question 2: currently the maximum number of pages we have is four pages plus a View All for a few of our products, but most products are split across two pages plus a View All.
For our largest product example we have 83 products, broken down as follows: pages 1 to 4 have 20 products each, page 5 has 3, and View All shows all 83. rel=prev and rel=next are on the pages, and View All has nothing on it (is that okay?). The title tags are duplicated across the numerous pages, so I was going to add "Page 2", "Page 3", "Page 4", etc. to sort that out.
I was going to increase the number of products per page to 30, which would in effect take me down to 3 pages plus View All. More importantly, I thought I would also get stronger link value and less dilution, hence better SEO.
The pages don't rank particularly well at all, but on Google's speed test I think we score 85/100 anyway, so from a speed point of view it shouldn't be a problem. I was just worried that big changes like this could have a dramatic effect.
The URL, in case you're interested, is http://www.bestathire.co.uk/rent/Scaffold_towers/266
Many thanks
Very much appreciated.
Sarah.
-
Hi ,
Many Thanks for your reply,
We do have pagination and sorts, like listing products a-z, z-a, price low to high and high to low, etc., which all generate different URLs, but we have put rules in the robots.txt file telling Google not to spider them. See below.
Also, from looking at WMT, it says it has blocked 886,996 URLs in the past 90 days. Our site has approx 54,000 indexed pages.
Disallow: */sort_by:Product.price%20ASC
Disallow: */sort_by:Product.price%20DESC
Disallow: */sort_by:Product.title%20ASC
Disallow: */sort_by:Product.title%20DESC
Disallow: */sort_by:Product.distance%20ASC
Disallow: */sort_by:Product.distance%20DESC
Disallow: */stealth:on
Are you suggesting we canonical the sorts as well for safety, in case we have missed anything?
Sarah
-
(1) DON'T canonical to the first page of results - Google definitely has issues with that. If you've got rel=prev/next in place, then I wouldn't canonical to "View All", either. They're kind of competing signals. You can use rel=prev/next with rel=canonical, but it's a bit complicated. Basically, it's for situations where you have pagination AND some other parameter, like a sort.
(2) If you increase it, just make sure it doesn't negatively impact users or load times (might be worth A/B testing, honestly). Are you saying that you might end up with a URL like "?page=7" which basically doesn't exist, because now you'll have fewer pages? I think you might be safer just letting that 404 and having Google recrawl the new structure. The odds of having any inbound links to page 7 of search results are very low, and just letting those pages die off may be safer.
-
To get something properly done on your website: if you're displaying a page with 20 products by default and the URL to reach the next page has a complicated extension (domain.com/?abc=123etc#321), that's the more significant problem to be concerned about - clean URLs like domain.com/category/page/1/ and page/2/ are better.
In theory, page/1/ and page/2/ (blog style) contain the same content as the home page (/1/ or /). One common practice is noindex,follow for any page [2-∞). You should definitely consider rel=canonical across the site, though - it's essential - as well as rel="next" and rel="prev".