Noindex a large number of product pages on a webshop to counter Panda
-
A Dutch webshop with 10,000 product pages is experiencing lower rankings and reduced indexation. The problems started last October, shortly after the Panda and Penguin updates.
One of the problems diagnosed is a lack of unique content. Many of the product pages have no description, and some are variants of each other (color, size, etc.). A solution could be to write unique descriptions and use rel=canonical to consolidate the color/size variations onto one product page.
There is, however, no capacity to do this on short notice, so I'm wondering whether the following approach would be effective.
Exclude all product pages via noindex (or robots.txt), in the same way you would with internal search pages. The only pages left for indexation would be the homepage and 200-300 category pages. We then write unique content for the category pages and work on their rankings. Once that works, the product pages are rewritten and slowly re-included, category by category.
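A minimal sketch of what this could look like in the head of each product page, assuming the meta-robots route rather than robots.txt (a robots.txt block would stop crawlers from ever seeing the tag):

<!-- hypothetical product page: keep it out of the index, but let crawlers follow its links so internal link equity still flows -->
<meta name="robots" content="noindex, follow">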
My first worry is the loss of rankings for the product pages, although their rankings are minimal at the moment. My second worry is the high number of links on the category pages that lead to product pages excluded from Google. Thirdly, I wonder whether this works at all: applying noindex to 10,000 product pages consumes crawl budget and dilutes the internal link structure.
What do you think?
-
I see. There's a pretty thorough discussion of a very similar situation here: http://moz.com/community/q/can-i-use-nofollow-tag-on-product-page-duplicated-content. Everett endorsed Monica's answer with, "... you might consider putting a Robots Noindex,Follow meta tag on the product pages. You'll need to rely on category pages for rankings in that case, which makes sense for a site like this." Monica's long-term solution was also to work on getting specific user-generated content on as many product pages as possible. Cheers!
-
@Ryan, thanks for your answer. PageRank flow is indeed one of the things I worry about when deindexing large parts of the site, especially since the category pages will be full of internal links to product pages that are excluded from indexation by robots.txt or a robots meta tag.
The problem I am trying to solve, however, has nothing to do with PageRank sculpting. I suspect an algorithmic drop due to thin, duplicate, and syndicated content, and the drop is sitewide. Assuming the drop is caused by Panda, I suspect the percentage of low-quality pages needs to be reduced. Would outbound linking and better DA really be enough to counter a suspected Panda problem, or is it necessary to improve the quality of the 10,000 product pages? I would think the latter. Since there is no budget to do so, I wonder whether it is possible to drop these low-quality pages from the index (but keep them on the website). Would this strengthen the remaining pages enough to bounce back up, assuming those remaining pages are of good quality, of course?
Since SEO is not the only factor to take into account, I'd rather not delete these pages from the website.
-
Matt Cutts speaks to part of what you're thinking about doing here: https://www.mattcutts.com/blog/pagerank-sculpting/, and it's important to note that this kind of sculpting is not nearly as effective as it once was. What I would focus on more is the DA and the quality of referrals to your site. Secondly, linking out from your pages is actually a positive signal when done the right way; per Cutts in the same article, "In the same way that Google trusts sites less when they link to spammy sites or bad neighborhoods, parts of our system encourage links to good sites." Perhaps your product pages could be strengthened further by this as well.
Related Questions
-
Blog archive pages are meta noindexed but still flagged as duplicate
Hi all. I know there are several threads related to noindexing blog archives and category pages, so if this has already been answered, please direct me to that post. My blog archive pages show preview text from the posts. Each time I publish a new post, the last post on any given archive page shifts to the first spot on the next archive page, and Moz seems to report these as new duplicate content issues each week. I have my archive pages set to meta noindex, so can I feel good about continuing to ignore these duplicate content issues, or is there something else I should be doing to prevent penalties? TIA!
Technical SEO | | mkupfer1 -
Nofollow/Noindex Category Listing Pages with Filters
Our e-commerce site currently has thousands of duplicate pages indexed because category listing pages with all the different filters selected get indexed. So, for example, you would see indexed: example.com/boots, example.com/boots/black, example.com/boots/black-size-small, etc. There is logic in place so that when more than one filter is selected, all the links on the page are nofollowed, but Googlebot is still getting to them and the variations are being indexed. At this point I'd like to add noindex or canonical tags to the filtered versions of the category pages, but many of these filtered pages are driving traffic. Any suggestions? Thanks!
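As a rough illustration of the canonical option mentioned above (URLs are the example.com placeholders from the question, scheme assumed): each filtered variation would point back to the unfiltered category page.

<!-- hypothetical head of example.com/boots/black-size-small -->
<link rel="canonical" href="https://example.com/boots">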
Technical SEO | | fayfr0 -
Panda Cleanup - Removing Old Blog Posts, Let Them 404 or 301 to Main Blog Page?
tl;dr: When removing old blog posts that may be affected by Panda, should we let them 404 or 301 them to the blog? We have been managing a corporate blog since 2011. The content is OK, but we've recently hired a new blogger who is doing an outstanding job, creating content that is very useful to site visitors and is just on a higher level than what we've had previously. The old posts mostly have no comments and don't get much user engagement. I know Google recommends creating great new content rather than removing old content due to Panda concerns, but I'm confident we're doing the former and I still want to purge the old stuff that's not doing anyone any good. So let's pretend we're being dinged by Panda for having a large amount of content that doesn't get much user engagement (not sure if that's actually the case; rankings remain good, though we have been passed on a couple of key rankings recently). I've gone through Analytics and noted any blog posts that have generated at least one lead or had at least 20 unique visits all time. I think that's a pretty low barrier, and everything else really can be safely removed. So for the remaining posts (I'm guessing there are hundreds of them, but I haven't compiled the specific list yet), should we just let them 404, or should we 301 redirect them to the main blog page? The underlying question is: if our primary purpose is cleaning things up for Panda specifically, does placing a 301 make sense, or would Google see those "low quality" pages being redirected to a new place and pass on some of that "low quality" signal to the new page? Is it better for that content just to go away completely (404)?
Technical SEO | | eBoost-Consulting0 -
Launching large content project - date-stamp question
Hello mozzers! My company is about to launch a large-scale content project with over 100 pieces of newly published content. I'm being asked what the date stamp for each article should be. Two questions:
1. Does it hurt an article's SEO juice to have a lot of content with the same "published on" date?
2. I have the ability to manually update each article's date stamp. Is there a recommended best practice? P.S. Google has not crawled any of these pages yet.
Technical SEO | | Vacatia_SEO1 -
Noindex Success?
Has anyone had success applying noindex,follow to pages on a site that has been hit by a Panda penalty? Our site has a lot of duplicate content from product descriptions that we had permission to use from our distributor (who is also online). We went ahead and applied noindex,follow to those pages in the hope that Google will focus on the products we carry that do have original descriptions (about 1/3 of our products). We didn't want to simply remove those products, since they are actually beneficial to our customers. Most of the duplicated content is in the form of ingredient lists.
Technical SEO | | dustyabe0 -
Implementing Canonical & Alternate tags on a large website
Hi there, our brochureware website consists of a desktop site (www.oursite.com) and a mobile site (m.oursite.com). I know I need to implement the alternate tags on the desktop pages and the canonical tags on the mobile versions. However, we have a huge site: is there any dynamic way, through JavaScript, to have the code generated, or is it something that should be done manually page by page? Below is sample JavaScript a colleague put together to attempt to generate the snippet dynamically, but I am unsure whether bots will be able to interpret it. Alternate version: Thanks in advance, Phil
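For reference, a static (non-JavaScript) sketch of the tag pair being discussed, following the commonly documented desktop/mobile annotation pattern; the /page path is a hypothetical placeholder based on the domains in the question:

<!-- on the desktop page, e.g. www.oursite.com/page -->
<link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.oursite.com/page">
<!-- on the corresponding mobile page, e.g. m.oursite.com/page -->
<link rel="canonical" href="http://www.oursite.com/page">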
Technical SEO | | Phily0 -
Will Google Continue to Index the Page with NoIndex Tag Upon Google +1 Button Impression or Click?
The FAQ for the Google +1 button says the following: "+1 is a public action, so you should add the button only to public, crawlable pages on your site. Once you add the button, Google may crawl or recrawl the page, and store the page title and other content, in response to a +1 button impression or click." If my page has a noindex tag and also carries the Google +1 button, will Google still recognise the noindex tag (and not index the page), even though the +1 button's impressions or clicks send signals to Google's spiders?
Technical SEO | | globalsources.com0 -
Xenu Alternative for Large Sites
We're launching a new site and we're trying to crawl it to check for any problems. It's millions of pages and Xenu seems to start encountering errors as the numbers mount past 500,000. Does anyone know of an alternative, free or paid, that could handle the size better?
Technical SEO | | eLocalusa0