Dealing with thin content
-
Hi again! I've got a site where around 30% of URLs have fewer than 250 words of copy. It's big, though, so that's roughly 5,000 pages. It's an ecommerce site, so it's not feasible to bulk up each one. I'm wondering if noindexing them is a good idea, with a view to measuring whether it has an effect on organic search?
-
Thanks guys! We're starting to add more content to each page; looks like that's the only way!
-
Does your competition have more content for these products?
If so, you need to ramp it up.
Either way, noindexing them is not going to do any good.
-
Hi Blink
What would you be hoping to gain by deindexing these pages?
-
The size of your site is important, and the value these pages have in bulking your site up is important. If you noindex them, you will significantly reduce the size of your site, which can affect your ability to rank on other pages as well. Noindexing them is not best practice and will cause more harm than good.
These pages aren't hurting your site by not ranking, and they might also rank for terms you aren't tracking. The pages probably have some authority and links; getting rid of that will definitely be detrimental.
-
Hi! I agree that they won't rank, but most aren't ranking now anyway. I'm more concerned that they are pulling everything else down. By noindexing them, I can at least see if that is the problem.
-
If you noindex the pages, they will never rank. Search engines will drop them from their index, so they will essentially be invisible in search results.
Are these product pages? The best way to get content onto them is through user-generated reviews, which adds copy without you having to spend a ton of time writing it.
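For reference, a page-level noindex is just a robots meta tag in the head of each page; a minimal example ("follow" is optional and shown only to make explicit that links on the page are still followed):

<meta name="robots" content="noindex, follow">

Keep in mind that Google has to crawl a page to see this tag, so don't also block those URLs in robots.txt.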
Related Questions
-
I am really surprised to see this page ranking like crazy even though the content is very thin
https://www.hackerearth.com/blog/artificial-intelligence/artificial-intelligence-101-how-to-get-started/ We are ranking for 121 keywords with this page, and 22 of them are in positions 1-3. I am not able to understand why it ranks so well, considering that it has just 4 inbound links. Will someone help me understand this mystery? When we write good, in-depth content we do not rank, but with thin content like this we are doing fairly well.
Intermediate & Advanced SEO | Rajnish_HE
-
Is Paginating Comments SEO-Friendly? Implications?
So glad to be here, and amazed to see so many discussions over here. I have a quick query. One of my blogs has more than 500 comments on almost 50+ posts (and on some posts it's even 1,000+ comments). This impacts load time as well as the experience on mobile. So I wanted to understand: if I enable pagination of comments, is it SEO-friendly? Does it negatively impact SEO? I do not want to take the route of migrating to Disqus, FB comments, etc.
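What I would be enabling looks roughly like this: WordPress-style comment pages at their own URLs, annotated with the rel prev/next links historically recommended for paginated series (a sketch; the URLs are hypothetical). On https://example.com/my-post/comment-page-2/ the head would carry:

<link rel="prev" href="https://example.com/my-post/" />
<link rel="next" href="https://example.com/my-post/comment-page-3/" />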
Intermediate & Advanced SEO | flmgo82
-
Site not showing up in search - was hacked - huge comment spam - cannot connect Webmaster Tools
Hi Moz Community. A new client approached me yesterday for help with their site, which used to rank well for its designated keywords but now is not doing well. Actually, it is not on Google at all; it's like it was removed by Google. There is no reference to it when searching with "site:url". I investigated further and discovered the likely problem... 26,000 spam comments! All of these comments have been removed now, and I cleaned up the WordPress site pretty well. However, I now want to connect it to Google Webmaster Tools. I have admin access to the WP site, but not FTP, so I tried using Yoast to connect. Google failed to verify the site. I then used a file-uploading console to upload the Google verification HTML instead. I checked that the code is there, and Google still fails to verify the site. It is as if Google is so angry with this domain that it has wiped it completely from search and refuses to have any dealings with it at all. That said, I did run their "malware"/"dangerous content" check, which did not bring back any problems. I'm leaning towards the idea that this is a "cursed" domain in Google and that my client's best course of action is to rebuild her business on another domain instead, then point the old domain to the new one, hopefully without attracting any bad karma in that process (advice on that step would be appreciated). Does anyone have an idea as to what is going on here?
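For clarity, the two verification methods I tried look like this (the token and filename here are made-up placeholders, not the real values). The meta tag Yoast places in the head:

<meta name="google-site-verification" content="EXAMPLE_TOKEN" />

and the verification file uploaded to the site root:

https://example-client-site.com/google0123456789abcdef.html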
Intermediate & Advanced SEO | AlistairC
-
Noindexing Thin News Content for Panda
We've been suffering under a Panda penalty since Oct 2014. We've completely revamped the site, but with this new "slow roll out" nonsense it's incredibly hard to know at what point you have to accept that you haven't done enough yet. We have thousands of news stories going back to 2001, some of which are probably thin and some of which are probably close to other news stories on the internet, being articles based on press releases. I'm considering noindexing everything older than a year just in case; however, that seems like overkill. The question is: if I mine the log files and only deindex stories that Google has sent no traffic to in over a year, could this be seen as trying to game the algorithm or similar? Also, if the articles are noindexed but still exist, is that enough to escape a Panda penalty, or does the page need to be physically gone?
Intermediate & Advanced SEO | AlfredPennyworth
-
Dealing with past events
Hi. We have a website which lists both upcoming and past events. Currently everything is indexed by Google with no real issues (it usually finds the most up-to-date events), and we have deprioritised the past events in the sitemap. Do I need to go one step further and noindex events which are past, or just leave it as is? They don't really hold much value, but they sometimes have a number of incoming links and social media shares pointing to them. We want to keep the pages active for visitors; I'm just wondering about Google (there's no real link between past and future events either, so it's difficult to 'point' to a newer version of an event). We have approx 1M 'past' events and growing, so it's a big change. Also, would you keep them in the sitemap with lower priority, or just remove them? EDIT: I've just seen a Matt Cutts post from 2014 which indicates that an 'unavailable_after' meta tag might be best?
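For reference, that tag would look something like this (a sketch; the date here is a placeholder, in the RFC 850-style format used in Google's examples):

<meta name="googlebot" content="unavailable_after: 31-Dec-2015 23:59:59 EST">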
Intermediate & Advanced SEO | benseb
-
User generated content (Comments) - What impact do they have?
Hello Moz stars! I have a question regarding user comments on article pages. I know that user-generated content is good for SEO, but how much impact does it really have? For your information:
1 - All comments appear in the source code and are crawled by spiders.
2 - A visitor can comment on a page for up to 60 days.
3 - The number of comments depends on the topic; we usually get between 3 and 40 comments.
My questions:
1 - If we were to remove comments completely, what impact would it have from an SEO perspective? (I know you can't be certain, but please make an educated guess if possible.)
2 - If it has a negative and/or positive impact, please specify why! 🙂
If anything is unclear or you want certain information, don't hesitate to ask and I'll try to specify. Best regards, Danne
Intermediate & Advanced SEO | idg-sweden
-
Dealing with non-canonical http vs https?
We're working on a complete rebuild of a client's site. The existing version of the site is in WordPress, and I've noticed that it is accessible via both http and https. The new version of the site will have mostly or entirely different URLs. Both the http and https versions of a page will resolve, but all of the rel-canonical tags I've seen point to the https version. Sometimes image tags and stylesheets are https, sometimes they aren't, and there are both http and https pages in Google's index. Having looked at other community posts about http/https, I've gathered the following: http and https are treated like two different domains; the http and https versions need to be verified in Google Webmaster Tools separately; set up the preferred domain properly; and rel-canonicals and internal links should have matching protocols. My thought is that we will add .htaccess rules that redirect old URLs, regardless of protocol, to the new pages on a single protocol. I would probably let the .css and image files from the current site 404. When we develop and launch the new site, does it make sense to force everything to https? Are there any particular SEO issues I should be aware of in a scenario like this? Thanks!
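For the force-to-https part, I'm picturing something like this in .htaccess (a sketch, assuming Apache with mod_rewrite; the old-URL-to-new-URL 301 mappings would sit above it as separate rules):

RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]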
Intermediate & Advanced SEO | GOODSIR
-
How to deal with 1 product in 1 country and 3 languages?
After reading multiple posts on dealing with multilanguage sites (I also checked http://www.google.com/support/forum/p/Webmasters/thread?tid=12a5507889c20461&hl=en), I still haven't got an answer to a very specific question. Please allow me to give some background.
I'm working for the official Belgian Yellow Pages (part of Truvo), and as you might know, in Belgium we have to deal with 3 official languages (BE-nl, BE-fr, BE-de; the latter is out of scope for this question), and on top of that we also have a large international audience (BE-en). Furthermore, Belgium is very small, meaning that someone living in the French part of Belgium (e.g. Liège) might easily look for information in the Dutch part (e.g. Antwerpen) without switching websites or languages. Since 1968 (http://info.truvo.be/en/our-company/) we have established 3 different brands, each adapted to a language and each with a clear language-specific connotation:
for the BE-nl market we have the brand "gouden gids"
for the BE-fr market we have the brand "pages dor"
for the BE-en market we have the brand "golden pages"
Logically, this results in 3 websites (www.goudengids.be, www.pagesdor.be, www.goldenpages.be), each serving a specific language and containing language-specific messaging and functionality, but of course serving content that is largely the same across all three websites regardless of the language. So we have links such as:
http://www.goudengids.be/united-consultants-nv-antwerpen-2000/
http://www.pagesdor.be/united-consultants-nv-antwerpen-2000/
http://www.goldenpages.be/united-consultants-nv-antwerpen-2000/
When I want to stick with the separate brands for the same content, how do I make sure that Google shows the desired URL when searching google.be in, respectively, Dutch, French, and English? (I'm guessing hreflang annotations, as sketched below, but I'd welcome confirmation.) Kind regards
Intermediate & Advanced SEO | TruvoDirectories
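A minimal sketch of those hreflang annotations, built from the three example URLs above (the nl-be/fr-be/en-be codes are my assumption for this Belgian setup; each of the three pages would carry the same block, alongside a self-referencing rel-canonical on its own domain):

<link rel="alternate" hreflang="nl-be" href="http://www.goudengids.be/united-consultants-nv-antwerpen-2000/" />
<link rel="alternate" hreflang="fr-be" href="http://www.pagesdor.be/united-consultants-nv-antwerpen-2000/" />
<link rel="alternate" hreflang="en-be" href="http://www.goldenpages.be/united-consultants-nv-antwerpen-2000/" />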