How long for Panda 4.1 fixes to take effect?
-
Hi,
If you have been hit by Panda 4.1 and are now putting fixes in place (for this example, let's say you removed a load of duplicate content, and that's what caused the problem), how long would it take for that fix to take effect?
Do you have to wait for the next Panda update, or will the change be noticed on the next crawl?
Thanks.
-
Cheers Dennis,
I see, so the changes may be picked up in the near future rather than the long run.
-
Back then, a major Panda update was needed before you would finally see some changes. Personally, I think this is still the safer bet.
Now that it is refreshing monthly, I've noticed the timing is still a bit tricky to figure out. I've had sites recover after 3 refreshes, and I've seen sites recover after a full year. For what it's worth, I've never experienced a recovery right on the next minor refresh, possibly because Google still needs time to fully crawl and reindex the pages.
Related Questions
-
Fetch and render partial result: could this affect SERP rankings? [NSFW URL]
Moderator's Note: URL NSFW. We have been desperately trying to understand, over the last 10 days, why our homepage disappears from the SERPs for a few days for our most important keywords, before reappearing for a few more days and then going again! We have tried everything. We checked Google Webmaster Tools: no manual actions, no crawl errors, no messages. The site is being indexed even when it disappears, but when it's gone it will not even appear in the search results for our business name; other internal pages come up instead. We have searched for bad backlinks and duplicate content. We put a 301 redirect on the non-www version of the site. We added an H1 tag that was missing. Still, after fetching as Google and requesting reindexing, we were going through this cycle of disappearing in the rankings (an internal page would actually come in at 6th position, as opposed to our homepage, which had previously spent years in the number 2 spot) and then coming back for a few days. Today I tried fetch and render as Google and was only getting a partial result: it was saying the video we have embedded on our homepage was temporarily unavailable. Could this have been causing the issue? We have removed the video for now, fetched and rendered again, and got a complete status. I've now requested reindexing and am crossing everything that this fixes the problem. Do you think this could have been at the root of the problem? If anyone has any other suggestions, the address is NSFW https://goo.gl/dwA8YB
Intermediate & Advanced SEO | GemmaApril2 -
Panda, rankings, and other nonsense issues
Hello everyone, I have a problem here. My website has been hit by Panda several times in the past: the first time back in 2011 (the first Panda ever), then another couple of times since, and most recently in June 2016 (either Panda or Phantom, not clear yet). In other words, it looks like my website is very prone to "quality" updates by big G: http://www.virtualsheetmusic.com/ I am still trying to understand how to get rid of Panda-related issues once and for all after so many years of tweaking and cleaning my website of possible duplicate or thin content (301 redirects, noindexed pages, canonicals, etc.), and I have tried everything, believe me. You name it. We recovered several times, but once in a while we are still hit by that damn animal. It really looks like we are in the so-called "grey area" of Panda, where we are "randomly" hit by it every now and then. Interestingly enough, some of our competitors live joyful lives at the top of the rankings without caring at all about Panda and such, and I can't make sense of it. Take for example this competitor of ours: http://8notes.com They have a much smaller catalog than ours and worse-quality music, with thousands of duplicate pages and ads everywhere, and yet... they are able to rank 1st on the 1st page of Google for most of our keywords. And by most, I mean 99.99% of them. Take for example "violin sheet music", "piano sheet music", "classical sheet music", "free sheet music", etc.: they are always first. As I said, they have a much smaller website than ours, with a much smaller offering, and their content quality is questionable (not curated by professional musicians, and sloppily done content as well as design), and yet they have over 480,000 pages indexed on Google, mostly duplicate pages. They don't care about canonicals to avoid duplicate content, 301s, noindex, robots tags, etc., nor about adding text or user reviews to avoid "thin content" penalties... they really don't care about any of that, and yet they rank 1st. So, to all the experts out there, my question is: why is that? What's the sense or the logic behind it? And please, don't tell me they have stronger domain authority, more linking root domains, etc., because given the duplicate and thin-content issues I see on that site, nothing can justify their positions in my opinion. Mostly, I can't find a reason why we are so heavily penalized by Panda and similar "quality" updates when they are released, whereas websites like 8notes.com rank 1st, making fun of the mighty Panda all year round. Thoughts???!!!
Intermediate & Advanced SEO | fablau0 -
Tips for optimizing a website for one long-tail keyword
Hello, I have a quite specific long-tail keyword (a 4-part keyword) for which I would like to rank as high as possible; other keywords would then come automatically. I know there's a lot to doing it properly, but are there any good tips you could help me out with? I have 4-5 different pages with products related to the keyword. Would it be smart to optimize them all for the one keyword, or to optimize just one of those pages and leave the others for other information? I believe this is an important decision. I know I could add the exact long-tail keyword to titles, H1 headers, alt tags, file names, and URLs, since it's related to the content, but would it be smart to use that exact keyword on all those pages or just one? This is a very important subject for my business, and any advice will be highly valued. Many thanks
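To make the on-page part of this concrete: a quick way to see which candidate pages currently target the phrase is to scan each page's title, H1, and URL for the exact keyword. A minimal sketch, using a hypothetical four-word keyword and page (none of these names come from the question):

```python
from html.parser import HTMLParser

class KeywordAudit(HTMLParser):
    """Collects the <title> and <h1> text from an HTML page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.h1 = ""
        self._current = None

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1"):
            self._current = tag

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current == "h1":
            self.h1 += data

def keyword_placement(html, url, keyword):
    """Report whether the exact phrase appears in the title, H1, and URL slug."""
    parser = KeywordAudit()
    parser.feed(html)
    kw = keyword.lower()
    return {
        "title": kw in parser.title.lower(),
        "h1": kw in parser.h1.lower(),
        "url": kw.replace(" ", "-") in url.lower(),
    }

# Hypothetical page targeting a four-word phrase:
page = ("<html><head><title>Blue Widget Repair Kit | Shop</title></head>"
        "<body><h1>Blue Widget Repair Kit</h1></body></html>")
print(keyword_placement(page, "https://example.com/blue-widget-repair-kit",
                        "blue widget repair kit"))
```

Running the same check over all 4-5 candidate pages would show at a glance which ones currently target the phrase and which are left for other information.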
Intermediate & Advanced SEO | bidilover0 -
Two websites (domains) with the same content for more than 4 years: which one to choose now?
Hi, I need help with this decision, thanks in advance. My client has 2 websites with the same content: one is 4 years old, http://radiocolombia.com.co/, and the other is 7 years old: http://radiocolombiainternacional.com/web/ The content has been duplicated for years. How do I know which website is more relevant for Google? We have to pick one. Any advice? Thanks, David
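Whichever domain is kept, the usual approach is a page-for-page 301 from the retired domain to the surviving one. A rough sketch of generating the Apache rules; only the kept domain name comes from the question, and the path mapping is a hypothetical example (a real one would come from a full crawl of the old site):

```python
# Sketch: generate per-path 301 rules redirecting the retired domain
# to the matching page on the domain you keep.
# KEEP comes from the question; the paths below are hypothetical.
KEEP = "https://radiocolombia.com.co"
path_map = {
    "/": "/",
    "/web/emisoras": "/emisoras",   # hypothetical old -> new paths
    "/web/contacto": "/contacto",
}

def htaccess_rules(mapping, target):
    """Emit Apache 'Redirect 301' lines for the old domain's vhost."""
    return [f"Redirect 301 {old} {target}{new}" for old, new in sorted(mapping.items())]

for rule in htaccess_rules(path_map, KEEP):
    print(rule)
```

The generated lines would go in the retired domain's .htaccess (or vhost config), so each old URL passes its link equity to its counterpart rather than to a generic homepage.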
Intermediate & Advanced SEO | seoweb330 -
How does having multiple pages on similar topics affect SEO?
Hey everyone, On our site we have multiple pages with similar content. As an example, we have a section on Cars (in general) and then specific pages for Used Cars, European Cars, Remodeled Cars, etc. Much of the content is similar on these pages, and the only differences are some content and the additional term in the URL (for example, car.com/remodeled-cars and /european-cars). In the past few months, we've noticed a dip in our organic rankings and started doing research. We also noticed that Google, in SERPs, shows the general page (cars.com/cars) and not the specific page (/european-cars), even if the specific page has more content. Can having multiple pages with similar content hurt SEO? If so, what is the best way to remedy it? We can consolidate some of the pages and make the differences between them a little clearer, but does that make much of a difference for rankings? Thanks in advance!
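One common remedy for near-duplicate variants, short of full consolidation, is a rel=canonical tag pointing each thin variant at the strongest page. A minimal sketch, using hypothetical URLs loosely modeled on the examples in the question:

```python
# Sketch: point near-duplicate variant pages at a canonical hub page
# via rel=canonical. All URLs here are hypothetical.
def canonical_tag(url):
    """Build the <link> element each variant's <head> would carry."""
    return f'<link rel="canonical" href="{url}" />'

canonical = "https://cars.example.com/cars"
variants = [
    "https://cars.example.com/european-cars",
    "https://cars.example.com/remodeled-cars",
]

# Every variant gets the same tag, pointing at the hub page:
tags = {v: canonical_tag(canonical) for v in variants}
for page, tag in tags.items():
    print(page, "->", tag)
```

Note the trade-off: canonicalized variants generally drop out of the index in favor of the hub, so this fits pages that are genuinely thin duplicates, not pages you still want ranking on their own.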
Intermediate & Advanced SEO | JonathonOhayon0 -
Are Incorrectly Set-Up URL Rewrites a Possible Cause of Panda?
On a .NET site, a URL rewrite was done about 2 years ago. From a visitor's perspective it seems fine, as the URLs look clean. But Webmaster Tools reports 500 errors from time to time showing /modules/categories... and /modules/products..., which are templates reflecting how the original URLs were structured. While the developer made the URLs look clean, I am concerned that he could have set things up incorrectly. He acknowledged that IIS 7 on a Windows server allows URL rewrites to be set up, but the site was done another way that forces the URLs to change to their product names, so he has believed it to be okay. However, the site dropped significantly in its rankings in July 2013, which appears to be a Panda penalty. In trying to figure out whether this could be a factor in why the site has suffered, I would like to hear other webmasters' opinions. We have already killed many pages, removed 2/3 of the index Google had, and are trying to understand what else it could be. Also, in doing a header check, I see that the /modules/products... page returns a 301 status. I assume this is okay, but wanted to see what others had to say about it. When I look at the source code of a product page, I see a reference to /modules/products... I'm not sure if any of this pertains, but wanted to mention it in case you have insight. I hope to get good feedback and direction from SEOs and technical folks.
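Before blaming the rewrite, it can help to tally a crawl-error export by URL template and status code, to see whether the 500s really cluster on the /modules/... template URLs rather than the clean product URLs. A small sketch over hypothetical log rows (the URLs and statuses below are invented for illustration):

```python
from collections import Counter

# Sketch: group a (hypothetical) crawl-error export by template prefix
# and HTTP status, to see where the 500s actually cluster.
rows = [
    ("/modules/products/show.aspx?id=12", 500),
    ("/modules/products/show.aspx?id=19", 500),
    ("/modules/categories/list.aspx?c=3", 500),
    ("/blue-widget-repair-kit", 301),
    ("/red-widget", 200),
]

def tally(rows):
    """Count (prefix, status) pairs across the export."""
    counts = Counter()
    for url, status in rows:
        prefix = "/modules/" if url.startswith("/modules/") else "clean"
        counts[(prefix, status)] += 1
    return dict(counts)

print(tally(rows))
```

If all the 500s land on /modules/... while the clean URLs return 200 or 301, that points at the rewrite's internal templates rather than the public URLs, which narrows what to show the developer.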
Intermediate & Advanced SEO | ABK7170 -
What are the best ways to fix 404 errors?
I recently changed the URL of my main blog and now have about 100 404 errors. I did a redirect from the old URL to the new one; however, I still have errors. 1. Should I do a 301 redirect from each old blog post URL to the new blog post URL? 2. Or should I just delete the old blog posts (URLs) and rewrite them? I'm not concerned about links to the old posts, as most of them do not have many links.
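If the move only changed a path prefix and the post slugs survived, the per-post 301 map (option 1) can be generated rather than written by hand, even for 100 posts. A sketch with hypothetical prefixes and slugs (nothing here is from the actual site):

```python
# Sketch: if the blog moved from /old-blog/<slug> to /blog/<slug>,
# generate the per-post 301 map from the slug alone.
# The prefixes and slugs below are hypothetical examples.
old_urls = [
    "/old-blog/panda-recovery-tips",
    "/old-blog/link-building-basics",
]

def redirect_map(urls, old_prefix="/old-blog/", new_prefix="/blog/"):
    """Map each old post URL to its new home, preserving the slug."""
    return {u: new_prefix + u[len(old_prefix):] for u in urls if u.startswith(old_prefix)}

print(redirect_map(old_urls))
```

Each old/new pair then becomes one 301 rule, so every old post URL lands on its exact new counterpart instead of 404ing or hitting a blanket homepage redirect.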
Intermediate & Advanced SEO | webestate0 -
Consolidate 150 domains to 1
Hi! Just as the question tells, we are looking at a project where we might have to consolidate 150 different domains into 1 (of course, with a corresponding page on the new domain for each). We aim to preserve as much of the link juice as possible from each domain. Any advice on doing this properly? I, of course, see a risk in opening the new domain and just redirecting (301) the old domains to specific pages on the new domain, but is there a right or wrong way of doing this? I might add that each domain has a more or less unique link profile in terms of linking domains, number of linking domains, and such. Our dear friend Cutts has some information on this topic, http://www.youtube.com/watch?v=l7M22teF3Ho but he only talks about 4 domains, which of course seems like more of a naturally occurring phenomenon. But what about 150 of them? Anyone got any advice? Is this as much of a no-go as I feel it is? Thanks! Edit: The domains are all owned by the same entity, share the same GWT, and such.
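If the 301 route is taken, one practical concern with 150 domains is producing the redirect config itself without errors. A sketch that emits one nginx-style server block per retired domain, mapping it to its corresponding section on the consolidated site (all domain names and paths here are hypothetical):

```python
# Sketch: emit one nginx server block per retired domain, 301ing every
# request on that domain to its corresponding section on the new site.
# Domain names and paths are hypothetical.
NEW = "https://example-main.com"
domains = {
    "old-brand-a.com": "/brand-a",
    "old-brand-b.com": "/brand-b",
    # ... 148 more entries in the real mapping
}

def server_block(domain, section):
    """Build a catch-all 301 block for one retired domain."""
    return (
        f"server {{\n"
        f"    server_name {domain} www.{domain};\n"
        f"    return 301 {NEW}{section}$request_uri;\n"
        f"}}"
    )

for d, s in sorted(domains.items()):
    print(server_block(d, s))
```

Appending `$request_uri` keeps deep links on each old domain pointing at matching paths under its section, rather than dumping all 150 domains onto one homepage, which is where most of the link equity would otherwise be wasted.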
Intermediate & Advanced SEO | bebetteronline0