Index or No Index (Panda Issue)
-
Hi,
I believe our website has been penalized by the Panda update. We have over 9,000 pages, and around 4,000 of those are currently indexed. I believe that more than half of the indexed pages have thin content. Should we stop indexing those pages until we have quality page content? That would leave us with very few pages indexed by Google (roughly 1,000 of our 9,000 pages have quality content). I am worried that we would hurt our organic traffic more by noindexing those pages than by leaving them indexed for Google to read. Any help would be greatly appreciated.
Thanks,
Jim Rodriguez
-
Firstly, please don't assume that you've been hit by Panda. Find out. Indexation count is generally not a good basis for assuming a penalty.
- Was there a traffic drop around the date of a known Panda update? Check this list: https://moz.com/google-algorithm-change . If the date of the traffic drop lines up, you might have a problem. Otherwise it could easily be something else.
- How many links does your site have? Google crawls and indexes based on your authority. It's one area where it doesn't really matter where the links point: just having more links seems to increase how much of your site gets crawled. Obviously the links should be non-spammy.
- Do you have a site map? Are you linking to all of these pages? It could be an architecture issue unrelated to penalty.
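To illustrate the architecture point above: one quick diagnostic is to compare the URLs in your XML sitemap against the URLs actually reachable through internal links. This is only a sketch with made-up URL sets (a real check would parse sitemap.xml and crawl your own navigation), but the set arithmetic is the whole idea:

```python
# Pages listed in the sitemap but never linked to internally are "orphans":
# an architecture problem that can depress indexation with no penalty involved.

sitemap_urls = {
    "/page-a", "/page-b", "/page-c", "/page-d",  # parsed from sitemap.xml (illustrative)
}
internally_linked_urls = {
    "/page-a", "/page-b",  # discovered by crawling your own nav and body links (illustrative)
}

orphans = sitemap_urls - internally_linked_urls
print(sorted(orphans))  # pages Google may struggle to discover organically
```

If the orphan set is large, fix the internal linking before assuming a Panda problem.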
If it is a Panda issue: generally I think people take the wrong approach to Panda. It's NOT a matter of page count. I run sites with hundreds of thousands of URLs indexed, many of them useful pages with relatively few links, and have no problems. It's a matter of usefulness. So you can decrease your Panda risk by cutting out useless pages, or you can increase the usefulness of those pages.
When consulting I had good luck helping people recover from penalties, and with Panda I'd go through a whole process of figuring out what the user wanted (surveys, interviews, user testing, click maps, etc.), looking at what the competition was doing through that lens, and then re-ordering pages, adjusting layout, adding content, and improving functionality toward that end.
Hope that helps.
-
Every case is different; what works for someone else may not work for you. It depends on the content you're calling thin: unless it has actually caused a penalty, I would leave it indexed and focus on writing more quality content.
-
I think this is a critical issue: you have thin content on most of your pages. If Googlebot can access those thin pages, you may not recover from Panda until you add quality content to all of them and Google re-indexes it (which can take a very long time).
Keep in mind that adding noindex only tells Google not to index the pages; Googlebot can still crawl them and read the thin content, so noindex alone may not get you a recovery.
So my advice is to do one of two things. Either remove the thin content, add quality content as fast as you can, and ask Google to recrawl it (using the Fetch option in Google Search Console), which is what I'd recommend; or add both noindex and nofollow to the thin pages, which I would not recommend, because you may lose a huge amount of traffic (and I'm not sure you would recover from Panda that way).
-
Hi Jim,
From my own experience with Panda-impacted sites, I've seen good results from applying meta robots "noindex" to URLs with thin content. The trick is finding the right pages to noindex. Be diligent in your analytics up front!
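For reference, the meta robots directive mentioned here is a one-line tag in each page's `<head>`. The "noindex, follow" combination is the common choice for this situation, since it removes the page from the index while still letting crawlers follow its links:

```html
<!-- Placed in the <head> of each thin page. "noindex" drops the page from
     Google's index; "follow" still lets link equity flow through its links. -->
<meta name="robots" content="noindex, follow">
```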
We had a large site (~800K URLs), with a large amount of content we suspected would look "thin" to Panda (~30%). We applied the noindex to pages that didn't meet our threshold value for content, and watched the traffic slowly drop as Google re-crawled the pages and honored the noindex.
It turned out that our analytics on the front end hadn't recognized just how much long-tail traffic the noindexed URLs were getting. We lost too much traffic. After about 3 weeks, we essentially reset the noindex threshold to get some of those pages back earning some traffic, which had a meaningful impact on our monetization.
So my recommendation is to do rigorous web analytics up front, decide how much traffic you can afford to lose (you will lose some) and begin the process of setting your thresholds for noindex. It takes a few tries.
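The threshold approach described above can be sketched in a few lines. This is purely hypothetical (the field names, cutoffs, and page data are all invented for illustration), but it captures the key lesson from our experience: score each URL on content depth AND existing search traffic, and only noindex pages that fail on both counts.

```python
# Illustrative noindex-threshold sketch. A page is only a noindex candidate
# if it is both thin AND earning negligible traffic -- this is what protects
# the long-tail pages we nearly lost.

pages = [
    {"url": "/guide-a", "word_count": 1200, "monthly_visits": 300},
    {"url": "/stub-b",  "word_count": 90,   "monthly_visits": 2},
    {"url": "/stub-c",  "word_count": 110,  "monthly_visits": 150},  # thin, but earns long-tail traffic
]

MIN_WORDS = 250   # below this, content is treated as "thin" (illustrative cutoff)
MIN_VISITS = 20   # above this, the page earns too much traffic to sacrifice

to_noindex = [
    p["url"] for p in pages
    if p["word_count"] < MIN_WORDS and p["monthly_visits"] < MIN_VISITS
]
print(to_noindex)
```

Expect to re-run this with adjusted cutoffs after each round of re-crawling, as described above; it takes a few tries to find thresholds you can live with.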
Especially if you value the earning potential of your site over the long term, I would be much more inclined to noindex deeply up front. As long as your business can survive on the traffic generated by those 1000 pages, noindex the rest, and begin a long-term plan for improving content on the other 8000 pages.