20-30% of our ecommerce categories contain no extra content. Could this be a problem?
-
Hello,
About 20-30% of our ecommerce categories have no content beyond the products that are in them. Could this be a problem with Panda?
Thanks!
-
It's not an exact science with regard to any one signal. However, yes: the more you can strengthen topical focus, the less likely Panda is to find your category pages weak.
-
No worries, Bob. Ignore my original suggestion, then.
Alan has some good suggestions for you to follow.
-Andy
-
Thanks Alan, this is perfect.
So if we had at least a couple of good paragraphs on every category page, and a few highly relevant internal links pointing to each of those category pages, then we would be in good shape as far as Panda and category strength are concerned. Correct?
-
Hi Andy,
Sorry for the confusion. This is an ecommerce site. I edited the original question to be clear.
-
I'm assuming that this is a WordPress site (more info would be useful). A common issue is category pages causing problems because they show the same post excerpts over and over; noindexing them gets around this.
If I have misread the type of issue, then of course this doesn't apply. Since this was posted in Blogging and Content, that was my assumption.
A URL to look at would, I'm sure, confirm more of the problem.
-Andy
-
Andy,
Why would you noindex/follow category pages? That's like saying, "Hey, we have X products for this category, so it's really a high-value and important page we deserve to rank for. Except, hey, we don't have the willingness to boost the trust signals on the category page itself, so don't bother."
That in turn negatively impacts the site's ability to gain maximum ranking signals for any products in those categories (at least in highly competitive fields).
So I'm curious why you'd take that path.
-
It could be, Bob. I always advise that category pages are noindex/follow to avoid issues.
If you are using WordPress and Yoast, this is just a setting.
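For anyone unfamiliar with that setting, the end result is a robots meta tag in the head of each category archive, roughly like the sketch below (the exact markup depends on your Yoast version and configuration):

```html
<!-- Emitted in the <head> of category archive pages when
     "noindex" is enabled for categories in Yoast SEO.
     Crawlers skip indexing the page but still follow its links. -->
<meta name="robots" content="noindex, follow">
```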
-Andy
-
If a category page has almost no content (other than photos and product names), then that's a potential "thin content" issue, though the way your question is worded, I'm not confident my interpretation is actually what you meant by "no content beyond".
If product names don't reference the category name, and if there's a lack of any descriptive content on the category page, that's likely even more of a problem - thin content and lack of topical reinforcement of the category itself.
A general rule (barring other issues or considerations) is to have at least a couple of paragraphs of unique, descriptive text that reinforces the topical focus of each category page. There are numerous ways to split that content out across a category page, and in highly competitive categories, more content may be needed if not enough products exist in the category.
Other factors that can help mitigate this to a certain degree include (but aren't necessarily limited to):
- Hierarchical URL structure (nested URLs, so product detail pages are seen at the URL level as being "beneath" their category)
- Proper nested breadcrumbs to reinforce that hierarchical structure
- Strong internal linking: a) within a category, this includes pagination code (rel="next"/rel="prev"); b) outside a category, this includes links from highly refined, relevant content elsewhere on the site pointing to the category page.
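To make that checklist concrete, here is a skeletal category page pulling the pieces together. All URLs, names, and copy are hypothetical; the point is the nested URL, the breadcrumb trail, the pagination link tags, and a couple of paragraphs of unique descriptive text:

```html
<!-- Hypothetical category page at /garden/planters/page/2/,
     nested beneath the /garden/ section of the site -->
<head>
  <title>Garden Planters | Example Store</title>
  <!-- Pagination hints for a multi-page category archive -->
  <link rel="prev" href="https://www.example.com/garden/planters/">
  <link rel="next" href="https://www.example.com/garden/planters/page/3/">
</head>
<body>
  <!-- Nested breadcrumbs reinforcing the URL hierarchy -->
  <nav aria-label="breadcrumb">
    <a href="/">Home</a> &gt;
    <a href="/garden/">Garden</a> &gt;
    <span>Planters</span>
  </nav>

  <h1>Garden Planters</h1>
  <!-- A couple of paragraphs of unique, descriptive text
       reinforcing the topical focus of the category -->
  <p>Our garden planters range from compact window boxes to large
     self-watering troughs, in terracotta, resin, and cedar.</p>
  <p>Every planter below lists drainage and frost guidance, so you can
     match the pot to the plant before you buy.</p>

  <!-- Product detail URLs sit "beneath" the category URL -->
  <a href="/garden/planters/cedar-trough-24in/">Cedar Trough Planter, 24"</a>
</body>
```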