Indexing an e-commerce site
-
Hi all,
My client runs babyblingstreet.com. She sells baby and toddler clothing. A lot of the links on her site point to the same products. For instance: if you go to "What's New," you can find those same products under, say, her "Sale Items" category.
Here's the real problem. Say my client sells a green dress, and someone reaches it through the "Baby and Toddler Dresses" category; that URL picks up 10 links. Someone else reaches the same green dress through the "What's New" category, and that URL also picks up 10 links. Instead of 20 links pointing to one URL for the green dress, I now have 10 links pointing to each of two URLs, even though both URLs feature the exact same dress.
In this example I would want to make the green dress URL in the "Baby and Toddler Dresses" section the canonical URL. That means placing a canonical tag pointing to it on the green dress URL in the "What's New" category, and, say, the "Sale Items" category too. This could get very tedious if my client has 200+ products. So I am wondering: do I have to place a canonical tag on every URL that displays the green dress?
More importantly, I would like to know other people's strategies for indexing e-commerce sites that have the same product featured in multiple categories throughout the site.
I hope this makes sense. Thanks for your time.
-
I think you're right. Again, thanks for the well-informed response. I will take a lot of what you have just said into consideration. I also side with you about the duplicate issues. I may be a bit cynical here, but I have always found it hard to believe that Google will ever give us the complete truth.
-
This is one area where I am 99.99999% confident saying that Google's past statements are incorrect and even irresponsible. Panda is, in many ways, an assault on thin content, and duplicates are worse than thin. I've seen many large-scale sites take massive hits from duplicates (as much as 80% traffic loss).
The "myth" is that duplicate content causes a Capital-P Penalty, but Google uses a very narrow and self-serving definition of that term. Duplicate content does not cause a manual penalty and they probably don't consider Panda to be a penalty internally at Google. However, the consequences are very severe.
Even before Panda, I saw case studies where reducing duplicate content greatly improved rankings. I had a client whose "product" pages (it was an event site) were being filtered out due to massive duplication. Once we fixed the problem, their search traffic tripled over the course of 3 months. This was well before May Day and Panda (2007, if memory serves). Today, it's 10X worse.
When you get into e-commerce, the problem is almost inevitable and needs to be managed. Now, does that mean that you're currently facing ranking issues, Panda, etc.? No, not necessarily. You have less than 2K indexed pages, which is hardly excessive. If each product page has one duplicate, and you know that can't spin out of control, the consequences are limited. Still, you're diluting your ranking ability to some extent. I think it's well worth addressing the problem and being proactive.
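Dilution like this is easy to spot in a crawl export. As a rough illustration (the URL pattern and the `idproduct` query parameter here are hypothetical, not necessarily what your cart emits), you can group crawled URLs by product identifier and flag any product that resolves at more than one URL:

```python
from collections import defaultdict
from urllib.parse import urlparse, parse_qs

def find_duplicate_products(urls):
    """Group crawled URLs by product ID; return products reachable at >1 URL."""
    by_product = defaultdict(list)
    for url in urls:
        # Hypothetical pattern: product ID carried in an "idproduct" query param
        pid = parse_qs(urlparse(url).query).get("idproduct", [None])[0]
        if pid:
            by_product[pid].append(url)
    return {pid: paths for pid, paths in by_product.items() if len(paths) > 1}

crawl = [
    "http://example.com/viewPrd.asp?idcategory=12&idproduct=77",  # dresses
    "http://example.com/viewPrd.asp?idcategory=3&idproduct=77",   # what's new
    "http://example.com/viewPrd.asp?idcategory=5&idproduct=42",   # unique
]
dupes = find_duplicate_products(crawl)
print(dupes)  # product 77 appears under two category paths
```

Run against a full crawl, the size of the returned dict tells you how far the duplication has spread before you decide how aggressively to fix it.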
-
Thank you for your well-informed response, Peter. You are right: though it is tedious, I still have to do it.
As for product duplicates severely harming my client's ability to rank, I am not quite sure that's true. Google has written extensive material about duplicate content and how it's a myth that it affects ranking. I am not quite sure how truthful that is, but here's a link to one of those articles:
http://www.spottedpanda.com/2011/seo-news/confirmed-seo-facts-matt-cutts/
As for not seeing the duplicate product URLs in action, that's simply because the site is ground-floor. I inherited this project about a month and a half ago from a design company that only built her a beautiful site. They did not optimize one thing for her. What's worse, they used a heavyweight, technically involved cart package called ProductCart. The cart is built on classic ASP, and (I'm not sure if you're aware of this) many servers are no longer set up to handle that legacy format.
The real problem I am facing, per @activitysuper's response, is that the link in the category is the same as the link in the products section. What I am saying is that there's a completely different category that also has that same product, but under a different URL.
You are both right; this is probably something I can remedy on the server side. I was merely throwing this out there to see how other SEOs deal with having the same product in multiple categories.
Thanks for your time.
-
How do you claim to be part of a community when all you offer is criticism and, for that matter, complete ignorance? Do me a favor and never waste my time with such an ignorant response again. You know nothing about me, my client, or the background of the situation.
-
It may be tedious, but you need to do it, one way or another. Theoretically, these product duplicates could be severely harming your client's ranking ability.
Practically, though, I'm not seeing much evidence of these duplicate paths or duplicate products in the Google index. I am seeing other duplicate pages, like search results and https:// versions of your product pages. You have a few canonicalization issues going on.
Ideally, no matter what category path, you'll land on one URL. The very small usability consequences of the path change (in my experience, at least) are far outweighed by the risks of spinning off dozens of duplicates. As @activitysuper said, there should be a way to do this dynamically - you're changing a couple of templates, not individual product pages.
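Generating the tag dynamically in the template is the key point. A minimal sketch of the idea (the URL scheme and field names are illustrative, not ProductCart's actual API): the product template computes one canonical URL from the product record alone, so whatever category path the visitor arrived through, the page emits the identical tag.

```python
BASE = "http://www.example.com"  # hypothetical store domain

def canonical_tag(product_id, slug):
    """Build the canonical <link> tag from the product record alone.
    The category the shopper browsed through never enters the URL,
    so every category path renders the identical tag."""
    return f'<link rel="canonical" href="{BASE}/product/{slug}-{product_id}" />'

# Rendered from the "What's New" template and the "Sale Items" template alike:
print(canonical_tag(77, "green-dress"))
```

One template change covers all 200+ products; nobody touches individual product pages.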
I would have to see the duplicate product URLs in action, though. I'm not finding that specific problem.
-
You must be able to dynamically code the canonical tags into those 'new products'.
The real question is why you have 2 pages at all. Surely you could have the link in the category and the link in the new-products section both point to the same page.
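The stronger fix along these lines is to make the duplicate impossible: every category listing links to one category-free product URL. A sketch of that linking logic (the `/products/` route and field names are hypothetical):

```python
def product_link(product):
    """One URL per product, no matter which category listing renders it."""
    return f"/products/{product['slug']}"

# The same product featured in two category listings:
categories = {
    "whats-new": [{"slug": "green-dress"}],
    "sale-items": [{"slug": "green-dress"}],
}

# Every listing emits the same href, so all links consolidate on one URL.
links = {cat: [product_link(p) for p in prods] for cat, prods in categories.items()}
print(links)
```

With this scheme the canonical tag becomes a safety net rather than the primary defense, since there is only one URL for link equity to land on.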
-
How do you manage to pick up clients without an understanding of how to optimise their site? Seems a bit odd.
-
If you are using Magento Commerce, just select the option in Config.
If you are using something else, then you may need a plugin.
Any eCommerce software worth using ran into (and solved) this problem a couple of years ago.