Do Panda ranking factors still apply when Google deindexes a page?
-
Here are two scenarios.
Scenario 1
Let's say I have a site with a ton of pages (100,000+) that all have off-site duplicate content, and let's say that those pages do not carry any noindex tags.
Google then decides to de-index all those pages because of the duplicate content issue and slaps me with a Panda penalty.
Since all those pages have been deindexed by Google, does the Panda penalty still apply?
Scenario 2
I add a noindex tag to all those 100,000+ off-site duplicate content pages. Since Google sees that I have decided not to index them, does the Panda penalty come off?
What I am getting at is this: I have realized that I have a ton of pages with off-site duplicate content. Even though those pages are already not indexed by Google, does simply adding the noindex tag to them tell Google that I am trying to get rid of the duplicate content, so that the Panda penalty is lifted?
The pages are useful to my users, so I need them to stay.
Since in both scenarios the pages are not indexed anyway, will Google acknowledge the difference, namely that I am removing them myself, and lift the Panda penalty?
Hope this makes sense
-
I have over 800,000 pages total that contain duplicate content, "if" that is in fact the issue with my definitions. I would assume that Panda would slap me hard for that, again "if" that is the issue. Since I have never tried to deindex this many pages, I am hoping this works; I will take a few coffee breaks waiting because it's going to be a while, lol.
I have nothing to lose, and I feel like I have tried a ton. Thanks so much.
-
"Google then decides to de-index all those pages because of the duplicate content issue and slaps me with a Panda penalty."
Panda will not deindex pages. It might move them to the supplemental index, but they're not deindexed. Technically, Panda is not a penalty. It's an algorithmic demotion. If you've got a bunch of duplicate content, Google may choose not to index some of that content, or, more likely, to just show users the most appropriate page of that content.
Now, if Panda has affected your site because Google feels that the site consists mostly of duplicate or thin content, then you'll need to noindex, significantly change, or remove that content in order for Google to see that the quality has improved. You can't say that the content is essentially gone just because Google is not showing it; that wouldn't change the factors that caused you to be affected by Panda. (This is assuming that this is what the problem is, because we don't know that.)
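As a side note, on a site where those pages are generated dynamically, the noindex signal does not have to be pasted into every template by hand; it can also be sent as an X-Robots-Tag HTTP response header for whole URL patterns. Below is a minimal sketch of that idea, using Flask purely for illustration (the real site's stack will differ) and hypothetical URL prefixes:

```python
# Minimal sketch only: Flask and the URL prefixes below are illustrative
# assumptions, not the site's actual stack or structure. The point is that a
# single X-Robots-Tag header can noindex whole sections of generated pages.
from flask import Flask, request

app = Flask(__name__)

# Hypothetical prefixes for the duplicate/thin sections of the site.
NOINDEX_PREFIXES = ("/dictionary/word/", "/sentence-examples/", "/quotes/")

@app.after_request
def add_noindex_header(response):
    # Equivalent to <meta name="robots" content="noindex"> on each page,
    # but applied in one place for every matching URL.
    if request.path.startswith(NOINDEX_PREFIXES):
        response.headers["X-Robots-Tag"] = "noindex"
    return response

@app.route("/dictionary/word/<word>/")
def definition(word):
    # Placeholder handler standing in for the real definition pages.
    return f"Definitions for {word}"

if __name__ == "__main__":
    app.run()
```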
"I add a rel="noindex" to all those 100,000+ off site duplicate content pages. Since Google sees that I have decided to not index them does the Panda penalty come off?"
If these pages were the primary reason for Panda to visit your site, then what would happen is that as Google recrawls your site they will start to recognize that the quality is improved. Then, at some point with a future Panda refresh (it may take several if there is a lot of content to crawl), you should see an increase in traffic. If the duplication was the only factor that Panda was concerned about then you'd likely see a dramatic improvement. If it was just one of the factors, you might see a smaller improvement. If you had a lot of factors, you may see very little or just some improvement.
If I understand the question right, I would say that the answer is to go ahead and add the noindex tag to these pages.
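If you do noindex that many pages, it is also worth spot-checking that the directive is actually being served before waiting on a refresh. A rough sketch, assuming the third-party requests package is installed and using placeholder sample URLs:

```python
# Rough sketch: spot-check that sample pages serve a noindex directive, either
# as a robots meta tag in the HTML or as an X-Robots-Tag response header.
# Assumes the "requests" package is installed; the URLs are placeholders.
import re
import requests

SAMPLE_URLS = [
    "http://www.freescrabbledictionary.com/dictionary/word/testing/",
    "http://www.freescrabbledictionary.com/dictionary/word/example/",
]

# Crude check for <meta name="robots" ... content="...noindex...">;
# it does not handle every attribute ordering, which is fine for a spot check.
META_NOINDEX = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex', re.I
)

for url in SAMPLE_URLS:
    resp = requests.get(url, timeout=10)
    header_hit = "noindex" in resp.headers.get("X-Robots-Tag", "").lower()
    meta_hit = bool(META_NOINDEX.search(resp.text))
    verdict = "noindex found" if (header_hit or meta_hit) else "NO noindex found"
    print(f"{url}: {verdict}")
```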
-
No offense, man. I really want to figure out what the heck happened with my site; I really feel like I was hit by unfortunate circumstances.
My website is http://www.freescrabbledictionary.com/
The duplicate content I am referring to is that I generate my definitions for words from an API provided by https://www.wordnik.com/
I do cite the resource at the bottom of every definition page (which was required by https://github.com/wordnik); an example is http://www.freescrabbledictionary.com/dictionary/word/testing/
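For anyone unfamiliar with the setup being described, here is a minimal sketch of that kind of API call. The endpoint shape, parameters, and response field names are my assumptions based on Wordnik's v4 REST API, and the API key is a placeholder; check Wordnik's current documentation before relying on it:

```python
# Minimal sketch of pulling definitions from the Wordnik API as described above.
# The endpoint, parameters, and response field names are assumptions; the key
# is a placeholder. Requires the third-party "requests" package.
import requests

API_KEY = "YOUR_WORDNIK_API_KEY"  # placeholder

def fetch_definitions(word, limit=5):
    url = f"https://api.wordnik.com/v4/word.json/{word}/definitions"
    resp = requests.get(url, params={"limit": limit, "api_key": API_KEY}, timeout=10)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    for definition in fetch_definitions("testing"):
        # The attribution fields are what the required citation at the bottom
        # of each page would be built from (field names assumed).
        print(definition.get("text"), "|", definition.get("attributionText"))
```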
I have never had a manual penalty from Google; I check Google Webmaster Tools all the time. I also use tools like Google Analytics, as well as moz.com, ahrefs.com, and monitorbacklinks.com.
I used to rank for the keyword "scrabble dictionary" in the top 4 spots on average. For a long time I was #2, and that keyword was my biggest source of traffic.
I remember that when the first Panda update came out I was not hit. I noticed the negative changes in my rankings after the second Panda update and so on. Since Penguin was in the mix as well, I can't even tell if I was hit with Penguin.
I never paid for or did any black-hat backlinking.
Again, I was never hit with a manual penalty; this is 100% algorithmic.
If you search the keyword "scrabble dictionary", you will notice my homepage does not rank for it at all, not anywhere in the search results, where I used to hold one of the top 4 spots.
Since I have been hit so hard, I have nothing to lose, so I have noindexed 100% of the word definition, sentence example, and quote pages; even though those are not copied (except for the definitions), I did that just in case. This accounts for about 90% of my site's pages indexed by Google.
I have changed my site design to account for the "refresh" ranking factor, I have desperately combed through my site 1,000,000 times trying to figure out what happened, and I have disavowed links tenfold, yet nothing seems to affect my rankings. At this point I will try anything... I have nothing to lose.
-
Can you describe what happened to your site and why you believe you got a penalty?
To see what is indexed, type site:www.example.com into Google.
Be certain that you do not have a robots.txt file or something similar blocking your website. Go to https://www.feedthebot.com/tools/, type your domain in, and it will tell you whether you are blocking anything with your robots.txt. Do this for the URLs that you think are not indexed.
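If you would rather check a batch of URLs than paste them into a tool one at a time, here is a minimal sketch using Python's standard-library robots.txt parser; the sample URLs are placeholders for the pages you suspect are not indexed:

```python
# Minimal sketch: check whether robots.txt blocks Googlebot from a list of URLs.
# Uses only the Python standard library; the sample URLs are placeholders.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("http://www.freescrabbledictionary.com/robots.txt")
rp.read()

urls_to_check = [
    "http://www.freescrabbledictionary.com/dictionary/word/testing/",
    "http://www.freescrabbledictionary.com/",
]

for url in urls_to_check:
    # can_fetch() reports whether robots.txt allows the given user agent to crawl.
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{url} -> {'crawlable' if allowed else 'blocked by robots.txt'}")
```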
Because if you were affected by a true Panda penalty, it would be a manual penalty, and you would receive word inside your Google Webmaster Tools account. If you do not have an account, set one up:
https://www.google.com/webmasters/tools/home?hl=en
If you think you've been hit by an algorithmic penalty rather than a manual one, you can check by using the tools listed at this URL:
http://www.iacquire.com/blog/5-tools-to-help-you-identify-a-google-slap
Now, obviously, you're talking about duplicate content that it seems you may have known existed somewhere else. Is it possible, and please don't take offense, that you copied it?
In that case, Google takes the domain with the most authority and credits the content to that domain.
So time.com could probably take your entire site, and you would be the one who looked like they stole the content.
Remember, Google does also consider when the content was first indexed; however, site authority trumps that.
Google will only acknowledge a difference if you actually have a manual penalty, and if you received a manual penalty it would come with instructions on what to do next.
My advice to you: if you have duplicate content that is taken from another website and is not yours, please remove that content; the second choice is to noindex it.
It could be that you have the misfortune of somebody finding out that you took their content and filing a Digital Millennium Copyright Act (DMCA) takedown, which in many cases would damage a domain beyond repair. You would know if this had occurred as well. I'm just letting you know it's not smart to have someone else's content on your site. You should write it uniquely to meet your end users' needs, and if the current content is very helpful to them, I recommend you use it as the basis for creating your own unique content, not by spinning it, but genuinely unique.
Please know that if you tell me you didn't take the content, I will apologize right away. I do not mean to imply anything.
Respectfully,
Tom