Index or No Index (Panda Issue)
-
Hi,
I believe our website has been penalized by the Panda update. We have over 9,000 pages, and around 4,000 of them are currently indexed. I believe that more than half of the indexed pages have thin content. Should we stop indexing those pages until we have quality content on them? That would leave us with very few pages indexed by Google (roughly 1,000 of our 9,000 pages have quality content). I am worried that we would hurt our organic traffic more by noindexing the pages than by leaving them indexed for Google to read. Any help would be greatly appreciated.
Thanks,
Jim Rodriguez
-
Firstly, please don't assume that you've been hit by Panda. Find out. Indexation count is generally not a good basis for assuming a penalty.
- Was there a traffic drop around the date of a known Panda update? Check this list: https://moz.com/google-algorithm-change. If the date of the traffic drop lines up, you might have a Panda problem; otherwise it could easily be something else.
- How many links does your site have? Google crawls and indexes based on your site's authority. It's one area where it doesn't really matter where the links point: just having more links seems to increase how much of your site gets crawled. Obviously the links should be non-spammy.
- Do you have a sitemap (a minimal example is below)? Are you linking to all of these pages? It could be an architecture issue unrelated to any penalty.
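For the sitemap question, here's a minimal sketch of an XML sitemap, with a placeholder domain - list one <url> entry per page you want crawled, and submit the file in Google Search Console:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page Google should discover -->
  <url>
    <loc>https://www.example.com/a-quality-page/</loc>
  </url>
  <url>
    <loc>https://www.example.com/another-quality-page/</loc>
  </url>
</urlset>
```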
If it is a Panda issue: generally, I think people take the wrong approach to Panda. It's NOT a matter of page count. I run sites with hundreds of thousands of URLs indexed (useful pages with relatively few links) and have no problems. It's a matter of usefulness. So you can decrease your Panda risk by cutting out useless pages, or you can increase the usefulness of those pages.
When consulting, I had good luck helping people recover from penalties. With Panda, I'd go through a whole process of figuring out what the user wanted (surveys, interviews, user testing, click maps, etc.), looking at what the competition was doing through that lens, and then re-ordering pages, adjusting layout, adding content, and improving functionality toward that end.
Hope that helps.
-
Every case is different; what works for someone else may not work for you. It depends on the content you're calling thin: unless it has caused a penalty, I would leave it indexed and focus on writing more quality content.
-
I think this is a critical issue: you have thin content on most of your pages. If Googlebot can access those thin pages, you may not recover from Panda until you add quality content to all of them and Google indexes the new content (which can take a very long time).
If you add noindex, you have only told Google not to index the pages; Googlebot can still crawl them, so it can still read the thin content, and you may not recover that way.
So my advice is to either remove all the thin content from your pages, add quality content as fast as you can, and ask Google to index the new content (using the Fetch option in Google Search Console), which I recommend, or add both noindex and nofollow to the thin-content pages, which I don't recommend: you may lose a huge amount of traffic, and you may not recover from Panda anyway (I am not sure about that last point).
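For reference, both directives are single tags in the page's <head>. These are the standard snippets, nothing platform-specific:

```html
<!-- Keeps the page out of Google's index; links on it can still be followed -->
<meta name="robots" content="noindex">

<!-- Keeps the page out of the index AND asks crawlers not to follow its links -->
<meta name="robots" content="noindex, nofollow">
```

Note that neither tag stops Googlebot from crawling the page - that is the point the answer above is making.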
-
Hi Jim,
From my own experience with Panda-impacted sites, I've seen good results from applying meta robots "noindex" to URLs with thin content. The trick is finding the right pages to noindex. Be diligent in your analytics up front!
We had a large site (~800K URLs) with a large amount of content we suspected would look "thin" to Panda (~30%). We applied noindex to the pages that didn't meet our threshold value for content and watched traffic slowly drop as Google re-crawled the pages and honored the noindex.
It turned out that our analytics up front hadn't captured just how much long-tail traffic the noindexed URLs were getting, and we lost too much traffic. After about three weeks, we reset the noindex threshold to win some of those pages back, which had a meaningful impact on our monetization.
So my recommendation is to do rigorous web analytics up front, decide how much traffic you can afford to lose (you will lose some), and begin the process of setting your thresholds for noindex. It takes a few tries - there's a sketch of the idea at the end of this answer.
Especially if you value the earning potential of your site over the long term, I would be much more inclined to noindex deeply up front. As long as your business can survive on the traffic generated by those 1,000 pages, noindex the rest and begin a long-term plan for improving the content on the other 8,000 pages.
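For what it's worth, here's a minimal sketch of the kind of threshold pass described above. It assumes you've exported per-URL word counts and monthly entrances from your analytics to a CSV; the file name, column names, and threshold values are all hypothetical placeholders to tune against your own data:

```python
import csv

# Hypothetical thresholds - tune these against your own analytics.
MIN_WORD_COUNT = 250       # pages below this are noindex candidates...
MIN_MONTHLY_ENTRANCES = 5  # ...unless they still earn long-tail traffic

def pages_to_noindex(csv_path):
    """Read a per-URL export (url, word_count, monthly_entrances) and
    return the URLs that fall below both thresholds."""
    candidates = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            thin = int(row["word_count"]) < MIN_WORD_COUNT
            quiet = int(row["monthly_entrances"]) < MIN_MONTHLY_ENTRANCES
            if thin and quiet:
                candidates.append(row["url"])
    return candidates

if __name__ == "__main__":
    for url in pages_to_noindex("pages.csv"):
        print(url)
```

Eyeball the list before rolling anything out, and expect to adjust the thresholds once you see what traffic you actually lose.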
Related Questions
-
Noindex vs removal - Subdomains
One of my clients has a subdomain, docushare.***.edu (vs ***.edu), that they would like not to influence SEO. The question is: should they noindex these pages or remove the subdomain entirely? Thank you! Jeremy
On-Page Optimization | Crescent_Sense
-
Language redirection causing 302 issues
Hi, I am getting an error in my Moz campaign for 302 redirects. The reason we have this redirection is that the website is available in different languages, and it redirects visitors to the language for their location, e.g. www.keyhub.com/en or www.keyhub.com/es. I am guessing this might be a common issue for all websites with multiple languages. Any solutions for this, or should we just ignore the error? Thanks, Priyam
On-Page Optimization | kh-priyam
-
How to fix a thin content issue?
Hello! I've checked my website via Moz and received a "thin content" issue: "Your page is considered to have 'thin content' if it has less than 50 words." But I know for certain that we have 5 text blocks of unique content, and each block consists of more than 50 words. Do you have any ideas what may cause this issue? Thanks in advance, Yana
On-Page Optimization | yanamazault
-
Solve duplicate content issues by using robots.txt
Hi, I have a primary website, and beside it I also have some secondary websites that share the same content as the primary website. This leads to duplicate content errors. Because so many URLs have duplicate content, I want to use the robots.txt file to prevent Google from indexing the secondary websites and fix the duplicate content issue. Is that OK? Thanks for any help!
On-Page Optimization | JohnHuynh
-
Latent semantic indexing - Does this help rankings/relevance?
Hi, do semantically related words to the target term on a page help with rankings/relevance? If you're after the term "PC screen" and you also use the term "PC monitor", will Google make the connection and reward you for the added relevance? Has anyone tried this, and have you seen any positives? I've just started to try this out lately and have been combining it with Wordle.net to get an indication of where a content piece is heading and how heavily it leans toward certain words (it makes things a little more interesting than calculating densities).
On-Page Optimization | Bondara
-
How to fix a duplicate content issue among multiple root domains
Hello, I'm doing SEO for one e-commerce website, Lily Ann Cabinets, and I have around 300 different root domains with the same linking structure, same design, and even the same product database across all 300 websites. Currently I'm focusing only on the Lily Ann Cabinets website and trying to rank for some targeted keywords, but the site is not performing well in Google.com. For example:
http://www.lilyanncabinets.com/ (main website)
http://www.orlandocabinets.com/
http://www.chicagocabinets.org/
http://www.miamicabinets.org/
http://www.newyorkcabinets.org/
http://www.renocabinets.org/
So can anyone tell me: will this create a duplicate content issue in search engines, and might this be the reason the website doesn't rank well? How can I fix this issue? Do I have to use a different structure for Lily Ann Cabinets?
On-Page Optimization | CommercePundit
-
Google Will Now Start Indexing Facebook Comments
Interesting article: http://www.telegraph.co.uk/technology/google/8863354/Google-to-index-Facebook-comments.html
On-Page Optimization | TheVolkinator
-
Are a lot of tag pages in the index a bad signal for low quality? (Panda update)
Hello everybody, first of all please excuse my bad English. I'm from Germany - I try my best. 😉 The case: I have a WordPress SEO project that ranks very well. At the moment I have all "archive pages" (archives, categories, and tags) indexed. I use the more tag on every archive/category/tag page, so duplicate content is not really a problem. But in view of the Panda update, which will surely arrive in Germany soon, I wonder whether all these tag/archive/category pages in the index may be seen as low quality and could hurt the ranking of my whole site. Low quality because: with the more tag, the pages are just lists of internal links with content snippets. I have 500 articles and 700 tag pages (all in the index). So my fear is that when Google (with the Panda update) looks at my site and sees all these (maybe) low-quality tag pages in the index, I'll get penalized, because there isn't a good proportion between my normal (good quality) articles and the archive/tag pages. I hope you guys can understand my thoughts. Is my fear legitimate that the mass of tag pages in the index could be a problem? Is there any data from the USA on how blogs with tag pages in the index rank after the Panda update, or on whether pages consisting of internal links with content snippets - like these tag pages - count as low quality in Google's eyes? Or am I worrying too much? Thank you very much! Oliver
On-Page Optimization | channelplus