Dealing with thin content
-
Hi again! I've got a site where around 30% of URLs have less than 250 words of copy. It's big though, so that is roughly 5,000 pages. It's an ecommerce site and not feasible to bulk up each one. I'm wondering if noindexing them is a good idea, and then measuring if this has an effect on organic search?
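As a side note, triaging which of the ~5,000 pages really fall under 250 words can be scripted rather than checked by hand. A minimal sketch using only the Python standard library; the sample markup and the 250-word threshold are illustrative, and in practice you would feed it each page's fetched HTML:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, ignoring anything inside <script> or <style>."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.chunks.append(data)

def word_count(html: str) -> int:
    """Rough word count of the visible copy in an HTML document."""
    parser = TextExtractor()
    parser.feed(html)
    return len(" ".join(parser.chunks).split())

# Hypothetical thin product page
page = "<html><body><h1>Widget</h1><p>Only a short description here.</p></body></html>"
if word_count(page) < 250:
    print("thin page")
```

Running this over a crawl export gives a concrete list of thin URLs to prioritise instead of treating all 5,000 the same.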
-
Thanks guys! We're starting to add more content to each page; looks like that's the only way!
-
Does your competition have more content for these products?
If so, you need to ramp it up.
Either way, no-indexing them is not going to do any good.
-
Hi Blink
What would you be hoping to gain by de-indexing these pages?
-
The size of your site is important, as is the value these pages have in bulking your site up. If you noindex them, you will significantly reduce the indexed size of your site, which can affect your ability to rank on other pages as well. Noindexing them is not best practice and will cause more harm than good.
These pages aren't hurting your site by not ranking. They might also rank for terms you aren't tracking. The pages probably have some authority and links; getting rid of that will definitely be detrimental.
-
Hi! I agree that they won't rank, but most aren't ranking now anyway. I'm more concerned that they are pulling everything else down. By noindexing them, I can at least see if that is the problem.
-
If you noindex the pages they will never rank. Google can still crawl them, but it drops them from the index, so they will essentially be invisible in search results.
Are these product pages? The best way to get content onto them is through user-generated reviews, which add copy without you spending a ton of time writing it.
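For clarity on the mechanics being debated above: a noindex is a per-page robots meta tag (Google can still crawl the page and follow its links; it just drops the page from results):

```html
<!-- In the <head> of a thin product page -->
<meta name="robots" content="noindex, follow" />
```

Using "noindex, follow" keeps link equity flowing through the page even while it is excluded from the index; an equivalent `X-Robots-Tag: noindex` HTTP header works for non-HTML resources.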
Related Questions
-
Dealing with negative SEO
Interested to know people's strategies for detecting and mitigating negative SEO. Previously I've used a link monitoring tool and kept an eye on all new backlinks coming in to any page on the site. I have then manually assessed each one, using some tools and by actually visiting the website. However, this always leaves me with one dilemma: regardless of my assessment, how do search engines see that link? I run three lists: a whitelist, a greylist and a blacklist.
Whitelist - very relevant sites with a lot of authority, i.e. leading industry blogs and forums.
Greylist - out of topic/industry, directories.
Blacklist - sites de-indexed by Google, illegal content or absolute spam (i.e. one page filled with hundreds of links to different domains).
Do you have any thoughts? How do you assess whether a link is bad?
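For the blacklist bucket specifically, the standard mitigation once links have been judged toxic is Google's disavow file: a plain-text list uploaded via Search Console. A sketch (the domains and URL are placeholders, not real assessments):

```
# Whole domains judged as absolute spam
domain:spam-example.com
domain:link-farm-example.net
# Individual bad pages can also be listed by full URL
http://another-example.org/hundreds-of-links.html
```

Lines starting with # are comments; the domain: prefix disavows every link from that domain.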
Intermediate & Advanced SEO | seoman100
-
[Advice] Dealing with an immense URl structure full of canonicals with Budget & Time constraint
Good day to you Mozers, I have a website that sells a certain product online which, once bought, is delivered to a point of sale where the client's car gets serviced. This website has a shop, products and informational pages that are duplicated by the number of physical PoS. The organizational decision was that every PoS was supposed to have its own little site that could be managed and modified. Examples: every PoS could have a different price on its products, and some of them have more services available than others, but the content on these service pages doesn't change. I get over a million URLs that are, supposedly, all treated with canonical tags to their respective main page. I say "supposedly" because verifying the logic they used behind the canonicals is proving to be a headache, but I know and I've seen a lot of these pages using the tag. E.g.:
https://mysite.com/shop/ <-- https://mysite.com/pointofsale-b/shop
https://mysite.com/shop/productA <-- https://mysite.com/pointofsale-b/shop/productA
The problem is that I have over a million URLs being crawled when really fewer than a tenth of them have organic traffic potential. Question is: for products, I know I should tell them to put the URL as close to the root as possible and dynamically change the price according to the PoS the end-user chooses, or even redirect all the shops to the main one and only use that one. But I need a short-term solution to test/show whether it is worth investing in development to correct all these useless duplicate pages. Should I use robots.txt to block off the parts of the site I do not want Google to waste its time on? I am worried about indexation, accessibility and crawl budget being wasted. Thank you in advance.
Intermediate & Advanced SEO | Charles-O
-
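If you do trial the robots.txt route as a short-term test, a minimal sketch would look like this (assuming all PoS mirrors share the /pointofsale- path prefix shown in the example URLs above):

```
User-agent: *
# Block every point-of-sale mirror; the main /shop/ stays crawlable
Disallow: /pointofsale-
```

One caveat: robots.txt stops crawling, not indexing. Already-indexed URLs can linger in results, and Google can no longer read the canonical tags on the blocked pages.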
Dealing with 404s during site migration
Hi everyone - What is the best way to deal with 404s on an old site when you're migrating to a new website? Thanks, Luke
Intermediate & Advanced SEO | McTaggart
-
Dealing with past events
Hi! We have a website which lists both upcoming and past events. Currently everything is indexed by Google with no real issues (it usually finds the most up-to-date events), and we have deprioritised the past events in the sitemap. Do I need to go one step further and noindex events which are past, or just leave it as is? They don't really hold much value, but sometimes they have a number of incoming links and social media shares pointing to them. We want to keep the pages active for visitors; I'm just wondering about Google (there's no real link between past and future events either, so it's difficult to 'point' to a newer version of an event). We have approx 1M 'past' events and growing, so it's a big change. Also, would you keep them in the sitemap with lower priority, or just remove them? EDIT: Just seen a Matt Cutts post from 2014 which indicates that an 'unavailable_after' meta tag might be best?
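For reference, the tag mentioned in that edit is a per-page meta directive telling Google to stop showing the page after a given date. A sketch (the date shown is illustrative):

```html
<!-- On each event page, set to just after the event ends -->
<meta name="googlebot" content="unavailable_after: 25-Jun-2015 15:00:00 GMT" />
```

This sidesteps bulk-noindexing a million pages at once, since each page expires on its own schedule.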
Intermediate & Advanced SEO | benseb
-
Anyone deal with WebSynthesis as a WordPress host?
Curious to get feedback from users who have used or are currently using WebSynthesis for their WordPress web hosting. I'm also open to hearing about what you are doing if not using WebSynthesis, like WPEngine or GoDaddy's new Managed WP solution, etc. Thanks!!
Intermediate & Advanced SEO | WhiteboardCreations
-
Best way to deal with multiple languages
Hey guys, I've been trying to read up on this and have found that answers vary greatly, so I figured I'd seek your expertise. When dealing with the URL structure of a site that is translated into multiple languages, is it better SEO-wise to structure the site like this:
domain.com/en
domain.com/it
etc., or to simply add URL parameters like
domain.com/?lang=en
domain.com/?lang=it
In the first example, I'm afraid Google might see my content as duplicate even though it's in a different language.
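On the duplicate-content worry: whichever URL scheme is chosen, hreflang annotations on each page tell Google the versions are language alternates rather than duplicates. A sketch for the subdirectory layout, using domain.com as the placeholder from the question:

```html
<link rel="alternate" hreflang="en" href="https://domain.com/en/" />
<link rel="alternate" hreflang="it" href="https://domain.com/it/" />
<link rel="alternate" hreflang="x-default" href="https://domain.com/" />
```

Each language version must carry the full reciprocal set of annotations for them to be honoured.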
Intermediate & Advanced SEO | CrakJason
-
Using comment boxes for building links (the right way)
Some people see this kind of link building as spammy, mainly, I guess, because automated systems have made it spammy. But what if you use your company name linking to your site to indicate who has posted it, and then actually contribute some good discussion? A lot of these links are nofollow (although I've got it into my head that even though nofollow links don't pass juice, Google still counts the link and it does something). So I want to start doing some of this, for example on Squidoo: lots of lenses with great content that I could quite easily comment on with 50+ words.
Intermediate & Advanced SEO | activitysuper
-
How to deal with 1 product in 1 country and 3 languages?
After reading multiple posts on dealing with multilanguage sites (I also checked http://www.google.com/support/forum/p/Webmasters/thread?tid=12a5507889c20461&hl=en), I still haven't got an answer to a very specific question I have. Please allow me to give some background:
I'm working for the official Belgian Yellow Pages (part of Truvo), and as you might know, in Belgium we have to deal with 3 official languages (BE-nl, BE-fr, BE-de | the latter is out of scope for this question) and on top of that we also have a large international audience (BE-en). Furthermore, Belgium is very small, meaning that someone living in the French part of Belgium (e.g. Liège) might easily look for information in the Dutch part of Belgium (e.g. Antwerpen) without having to switch websites/language. Since 1968 (http://info.truvo.be/en/our-company/) we have established 3 different brands, each adapted to a language and each with a clear language-specific connotation:
for the BE-nl market we have the brand "gouden gids"
for the BE-fr market we have the brand "pages d'or"
for the BE-en market we have the brand "golden pages"
Logically, this results in 3 websites: www.goudengids.be, www.pagesdor.be, www.goldenpages.be, each serving a specific language and containing language-specific messages and functionality, but, of course, serving a part of the content that is similar across all websites regardless of the language. So we have, for example, the following links:
http://www.goudengids.be/united-consultants-nv-antwerpen-2000/
http://www.pagesdor.be/united-consultants-nv-antwerpen-2000/
http://www.goldenpages.be/united-consultants-nv-antwerpen-2000/
When I want to stick with the separate brands for the same content, how do I make sure that Google shows the desired URL when searching in resp. google.be (Dutch), google.be (French) and google.be (English)? Kind regards
Intermediate & Advanced SEO | TruvoDirectories
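This is the textbook case for cross-domain hreflang, where each brand's page lists all three language equivalents so Google can surface the right domain per language. A sketch using the listing URLs above (the same set must appear, reciprocally, on all three pages):

```html
<!-- On http://www.goudengids.be/united-consultants-nv-antwerpen-2000/ -->
<link rel="alternate" hreflang="nl-BE" href="http://www.goudengids.be/united-consultants-nv-antwerpen-2000/" />
<link rel="alternate" hreflang="fr-BE" href="http://www.pagesdor.be/united-consultants-nv-antwerpen-2000/" />
<link rel="alternate" hreflang="en-BE" href="http://www.goldenpages.be/united-consultants-nv-antwerpen-2000/" />
```

The annotations work across different domains, so the three brands can be kept as-is.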