Index or No Index (Panda Issue)
-
Hi,
I believe our website has been penalized by the Panda update. We have over 9,000 pages, and around 4,000 of them are currently indexed. I believe that more than half of the indexed pages have thin content. Should we stop indexing those pages until we have quality content on them? That would leave very few pages indexed by Google (roughly 1,000 of our 9,000 pages have quality content). I am worried that we would hurt our organic traffic more by deindexing the pages than by leaving them indexed for Google to read. Any help would be greatly appreciated.
Thanks,
Jim Rodriguez
-
Firstly, please don't assume that you've been hit by Panda. Find out. Indexation count is generally not a good basis for assuming a penalty.
- Was there a traffic drop around the date of a known Panda update? Check this list: https://moz.com/google-algorithm-change. If the date of the traffic drop lines up, you might have a problem; otherwise it could easily be something else.
- How many links does your site have? Google indexes and crawls based on your authority. It's one area where it doesn't really matter where the links go: just having more links seems to increase the amount your site is crawled. Obviously the links should be non-spammy.
- Do you have a site map? Are you linking to all of these pages? It could be an architecture issue unrelated to penalty.
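If a sitemap is the missing piece, the format is simple. Below is a minimal entry following the sitemaps.org protocol; `example.com` and the path are placeholders for your own URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> block per page you want Google to discover -->
  <url>
    <loc>https://www.example.com/quality-page/</loc>
  </url>
</urlset>
```

Submit the file in Google Search Console so you can compare submitted vs. indexed counts.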
If it is a Panda issue: generally, I think people take the wrong approach to Panda. It's NOT a matter of page count. I run sites with hundreds of thousands of URLs indexed (useful pages with relatively few links) and no problems. It's a matter of usefulness. So you can decrease your Panda risk by cutting out useless pages, or you can increase the usefulness of those pages.
When consulting I had good luck helping people recover from penalties, and with Panda I'd go through a whole process of figuring out what the user wanted (surveys, interviews, user testing, click maps, etc.), looking at what the competition was doing through that lens, and then re-ordering pages, adjusting layout, adding content, and improving functionality toward that end.
Hope that helps.
-
Every case is different; what might work for someone else may not work for you. It depends on the content you're calling thin: unless it has caused a penalty, I would leave it indexed and focus on writing more quality content.
-
I think this is a critical issue: you have thin content on most of your pages. If Googlebot can access those thin pages, you may not recover from Panda until you add quality content to all of them and Google reindexes them, which can take a very long time.
Keep in mind that adding noindex only tells Google not to index the pages; Google can still crawl them, so it can still read the thin content, and noindex alone may not get you a recovery.
So my advice is to do one of two things. Either remove the thin content, add quality content as fast as you can, and ask Google to reindex it (using the Fetch option in Google Search Console), which I recommend; or add both noindex and nofollow to the thin-content pages, which I don't recommend, because you could lose a huge amount of traffic and may still not recover from Panda (I'm not certain of that last point).
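For reference, the noindex and nofollow directives discussed above are applied with a meta robots tag in each page's `<head>`. A minimal example of the two variants:

```html
<!-- Keep the page crawlable but out of Google's index -->
<meta name="robots" content="noindex">

<!-- Keep the page out of the index AND tell crawlers not to follow its links -->
<meta name="robots" content="noindex, nofollow">
```

Note that Google must recrawl a page before the directive takes effect, which is why recovery timelines stretch out on large sites.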
-
Hi Jim,
From my own experience with Panda-impacted sites, I've seen good results from applying meta robots "noindex" to URLs with thin content. The trick is finding the right pages to noindex. Be diligent in your analytics up front!
We had a large site (~800K URLs), with a large amount of content we suspected would look "thin" to Panda (~30%). We applied the noindex to pages that didn't meet our threshold value for content, and watched the traffic slowly drop as Google re-crawled the pages and honored the noindex.
It turned out that our analytics on the front end hadn't recognized just how much long-tail traffic the noindexed URLs were getting. We lost too much traffic. After about 3 weeks, we essentially reset the noindex threshold to get some of those pages back earning some traffic, which had a meaningful impact on our monetization.
So my recommendation is to do rigorous web analytics up front, decide how much traffic you can afford to lose (you will lose some) and begin the process of setting your thresholds for noindex. It takes a few tries.
Especially if you value the earning potential of your site over the long term, I would be much more inclined to noindex deeply up front. As long as your business can survive on the traffic generated by those 1,000 quality pages, noindex the rest and begin a long-term plan for improving the content on the other 8,000 pages.
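The threshold approach above can be sketched in a few lines. This is a minimal illustration, not the poster's actual tooling: the URLs, word counts, and the 300-word cutoff are all hypothetical, and in practice you would tune the threshold against your own analytics rather than word count alone.

```python
# Flag pages as noindex candidates when their body copy falls below a
# word-count threshold. The cutoff is an assumption to tune, not a rule.

def should_noindex(word_count: int, threshold: int = 300) -> bool:
    """Return True if a page's content is thin enough to noindex."""
    return word_count < threshold

# Hypothetical URL -> word-count data pulled from a site crawl.
pages = {
    "/guides/widget-repair": 950,
    "/tag/widgets": 40,
    "/about": 310,
}

noindex_candidates = [url for url, words in pages.items() if should_noindex(words)]
print(noindex_candidates)  # only the 40-word tag page falls below the cutoff
```

Rerunning a script like this after each threshold adjustment is a cheap way to see how many pages each setting would pull from the index before you commit.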