Reality of Panda 3.9 Refresh
-
I have had a 10-page website (registered in 1999) rank in the top 5 for my top keywords for over 4 years. No changes have been made to the website (it is a static site).
On July 11, 2012, most of the keywords, including all the major ones, were dropped from Google. They remain steady in Bing and Yahoo.
I saw that some people referred to a Panda 3.9 refresh on that day, but also saw that Google (Matt Cutts) denied the refresh.
Given the simplicity of the website and the strong backlinks, which remain, what other reasons could explain a drastic drop in one day?
Any ideas on where to target my search for solving this very serious issue? Any thoughts would be appreciated.
-
Thanks for all the feedback. After some serious review, I am convinced that Google somehow began indexing our HTTPS pages and dropped all our HTTP pages. As this is a .NET website with a web.config file, what would you all recommend I do to make Googlebot index the HTTP pages instead of the HTTPS pages?
Would you handle it with a robots.txt file, a rule in the web.config file, or another way?
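For reference, one common approach here is a rewrite rule in web.config rather than robots.txt, since robots.txt can only block crawling, not consolidate signals. A minimal sketch, assuming the IIS URL Rewrite module is installed and that no pages need to stay on HTTPS:

```xml
<!-- Sketch only: assumes the IIS URL Rewrite module is installed. -->
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- 301 any HTTPS request back to its HTTP equivalent -->
        <rule name="HTTPS to HTTP" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTPS}" pattern="^ON$" />
          </conditions>
          <action type="Redirect" url="http://{HTTP_HOST}/{R:1}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

The 301 tells Googlebot which version is canonical; a rel="canonical" tag pointing at the HTTP URLs would be an alternative if some pages (e.g. a checkout) had to remain on HTTPS.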
Again, thanks for all the assistance.
-
Check WMT for any notices too. Check for any new spammy links pointing to/from the site.
Negative SEO has been talked about a lot lately.
-
Nothing "naughty". I have done some guest articles on various industry-related blogs over the past 4 months, but they are all legit, unique articles on sites with a domain authority of 30 or higher, so that should not have been the issue. Also, they were not paid articles; they were free.
Our chief competitors have been actively promoting their sites and increasing the size of their websites. We have been limited. We also do not do any PPC, but all our top competitors do. That should not be an issue, but maybe things are changing; I am not sure on that point.
Thanks for the feedback.
-
I have only been with this company since July 2011, but I believe they were hit by the Penguin update of March 2011. Since then it has been steady. During that update, they lost their local listings but retained their national rankings. Since that time, the rankings have remained in the top 5 for our major keywords.
-
Hi,
I guess updates are happening all the time with Google's algorithms, as they're always trying to improve the quality of results.
So my questions to you are:
Are you sure that you've not been doing what Google might consider "naughty tactics"?
Have you had anything flagged in your Webmaster Tools account?
How far down the listings did you drop, and how are the competitors ranking? Are there any similarities between your site and the main competitors? I.e., if they didn't suffer, what are you doing differently?
I know it's not really an answer for you, but some food for thought that I hope helps.
Best of luck
Steve
-
Did any of your rankings drop during the Penguin update? The Panda update?
Related Questions
-
How to best add affiliate links in a way that minimizes Panda risk?
We have a site of about 100,000 pages that is getting several million visitors per year via organic search. We plan to add about 50,000 new pages gradually over the next couple of months and would like to add affiliate links to those new pages. All these 50,000 new pages will have unique, quality data that a team has been researching for a while. I would like to add, below the fold or towards the end of the pages in an unobtrusive way, affiliate links to about 5 different affiliate programs, customized to page content and of real value to visitors. Since affiliate links are one of the factors that may trigger Panda, I am a bit nervous about whether we should add the affiliate links, and whether there is any way of implementing them so that they are less likely to trigger Panda. E.g. would you consider hiding affiliate links from Google by linking to an intermediate URL (which I would mark as noindex, nofollow) on our domain, which then redirects to the final affiliate landing page (though Google may notice via Chrome or Android data)? Any other ideas?
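For what it's worth, the intermediate-URL setup described in the question is usually wired together like this (a sketch only; the /out/ path and the target URL are hypothetical):

```
# robots.txt at the site root: keep crawlers out of the redirect directory
User-agent: *
Disallow: /out/

# .htaccess (Apache with mod_rewrite enabled): one hypothetical mapping
RewriteEngine On
RewriteRule ^out/widget-program$ https://affiliate.example.com/landing?ref=SITE_ID [R=302,L]
```

Pages would then link to /out/widget-program instead of the affiliate URL directly; whether this actually reduces Panda risk is, as the question itself notes, uncertain.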
Intermediate & Advanced SEO | lcourse
-
Can You Help Confirm That 1stWebDesigner Was Hit by Panda 4.0 and PayDay Loans 2.0/3.0?
Hello, I really hoped for your feedback here. My site is http://www.1stwebdesigner.com. I just used http://barracuda.digital/panguin-tool/ and came to the realization that we might indeed have been hit by the famous Panda 4.0 update. Here is a screenshot from the Barracuda tool - https://www.evernote.com/l/AELGaZ6nyxBE1aK7oQJVHBPKJuwjUh5JWk8 - and in the attachment there is a Google Analytics screenshot for that time, especially May 2014. Can you please help me confirm that we have indeed been hit with the penalty? It has been 1.5 years already, and since then the traffic has never grown back. Before May 2014 our site received 1.5 million pageviews/mo; since then, traffic has never been more than 600,000 pageviews/mo. If I look back even further with the Barracuda tool, I can see the site was also affected by the Top Heavy 1.0 update, Thu Jan 19 2012. Even back then, before all these updates, the site grew to 2.5 million pageviews/mo. The painful part is that I always focused on time-consuming link building: submitting links to design news sites, leaving quality comments under related articles, and always writing unique, quality content on the site for years. Can you tell, based on the screenshots, whether we indeed have a penalty? And maybe suggest what actions to take now? I have an Ahrefs Pro subscription and have started using the Disavow tool to prepare a submission to Google. Would love to hear feedback on this; it has been painful throughout the years. Update: here is one more screenshot from Barracuda looking at traffic from 2012 onwards - https://www.evernote.com/l/AEIX6iP4USBGtbyRLZ1oTmcACgBtXHBP2rw SpRPIyY
Intermediate & Advanced SEO | researchninja
-
With the new Panda update supposedly only weeks away, is it wise to noindex the products whose descriptions I have not had time to rewrite?
Hi Mozzers, I read on SEJ yesterday that the Panda update is apparently due in the next 2-4 weeks. I still have a large number of products for which I have not got around to writing unique product descriptions. I know these product descriptions are duplicated on other affiliate sites, so in light of the Panda update coming, would it be wise to put a noindex meta tag on those product pages until I get around to rewriting the descriptions? That way, I may not get hit by Panda, and it will buy me a bit more time. Just an idea, but I thought I'd run it by you. Thanks, Pete
Intermediate & Advanced SEO | PeteC12
-
2.3 million 404s in GWT - learn to live with 'em?
So I’m working on optimizing a directory site. Total size: 12.5 million pages in the XML sitemap. This is orders of magnitude larger than any site I’ve ever worked on – heck, every other site I’ve ever worked on combined would be a rounding error compared to this. Before I was hired, the company brought in an outside consultant to iron out some of the technical issues on the site. To his credit, he was worth the money: indexation and organic Google traffic have steadily increased over the last six months. However, some issues remain. The company has access to a quality (i.e. paid) source of data for directory listing pages, but the last time the data was refreshed some months back, it threw 1.8 million 404s in GWT. That has since started to grow progressively higher; now we have 2.3 million 404s in GWT. Based on what I’ve been able to determine, links on this particular site relative to the data feed are broken generally due to one of two reasons: the page just doesn’t exist anymore (i.e. wasn’t found in the data refresh, so the page was simply deleted), or the URL had to change due to some technical issue (page still exists, just now under a different link). With other sites I’ve worked on, 404s aren’t that big a deal: set up a 301 redirect in htaccess and problem solved. In this instance, setting up that many 301 redirects, even if it could somehow be automated, just isn’t an option due to the potential bloat in the htaccess file. Based on what I’ve read here and here, 404s in and of themselves don’t really hurt the site indexation or ranking. And the more I consider it, the really big sites – the Amazons and eBays of the world – have to contend with broken links all the time due to product pages coming and going. 
Bottom line, it looks like if we really want to refresh the data on the site on a regular basis – and I believe that is priority one if we want the bot to come back more frequently – we’ll just have to put up with broken links on the site on a more regular basis. So here’s where my thought process is leading:

1. Go ahead and refresh the data. Make sure the XML sitemaps are refreshed as well – hopefully this will help the site stay current in the index.
2. Keep an eye on broken links in GWT. Implement 301s for really important pages (i.e. content-rich stuff that is really mission-critical). Otherwise, just learn to live with a certain number of 404s being reported in GWT on more or less an ongoing basis.
3. Watch the overall trend of 404s in GWT. At least make sure they don’t increase. Hopefully, if we can make sure that the sitemap is updated when we refresh the data, the 404s reported will decrease over time.

We do have an issue with the site creating some weird pages with content that lives within tabs on specific pages. Once we can clamp down on those and a few other technical issues, I think keeping the data refreshed should help with our indexation and crawl rates. Thoughts? If you think I’m off base, please set me straight. 🙂
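On the htaccess-bloat point raised above: when the old-to-new URL pairs are known, Apache can serve very large redirect sets from a single external lookup table via RewriteMap (allowed in server/vhost config only, not in .htaccess), instead of millions of individual rules. A sketch, with hypothetical paths:

```
# httpd.conf / vhost config (RewriteMap is not allowed in .htaccess) - sketch only
RewriteEngine On
RewriteMap redirects "dbm:/etc/apache2/redirects.map"

# If the requested listing URL has an entry in the map, 301 to the new URL
RewriteCond ${redirects:$1} !=""
RewriteRule ^(/listing/.*)$ ${redirects:$1} [R=301,L]

# redirects.map is compiled from a plain-text file of "old new" pairs:
#   httxt2dbm -in redirects.txt -out redirects.map
# where redirects.txt lines look like (hypothetical):
#   /listing/old-page  /listing/new-page
```

A dbm map is hashed, so lookups stay fast even with millions of entries, which sidesteps the per-rule bloat problem for the "really important pages" 301s.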
Intermediate & Advanced SEO | ufmedia
-
Moving Part of a Website to a Subdomain to Remove Panda Penalty?
I have lots of news on my website, and unlike other types of content, news posts quickly become obsolete and get a high bounce rate. I have reasons to think that the news on my website might be partly responsible for a Panda penalty, but I'm not sure. There are over 400 news posts on the blog from the last 4 years, so that's still a lot of content. I was thinking of isolating the news articles on a subdomain (news.mywebsite.com). If the news plays a part in the Panda penalty, would that remove it from the main domain?
Intermediate & Advanced SEO | sbrault74
-
Panda Updates - robots.txt or noindex?
Hi, I have a site that I believe has been impacted by the recent Panda updates. Assuming that Google has crawled and indexed several thousand pages that are essentially the same, and the site has now passed the threshold to be picked out by the Panda update, what is the best way to proceed? Is it enough to block the pages from being crawled in the future using robots.txt, or would I need to remove the pages from the index using the meta noindex tag? Of course, if I block the URLs with robots.txt then Googlebot won't be able to access the pages in order to see the noindex tag. Does anyone have any previous experience of doing something similar? Thanks very much.
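For reference, the two noindex mechanisms look like this; as the question notes, either one only works while the pages stay crawlable, i.e. are not blocked in robots.txt (the file pattern here is hypothetical):

```
<!-- Option 1: meta tag in the <head> of each near-duplicate page -->
<meta name="robots" content="noindex, follow">

# Option 2: X-Robots-Tag HTTP header, e.g. via Apache config for a whole file pattern
<FilesMatch "^variant-.*\.html$">
  Header set X-Robots-Tag "noindex, follow"
</FilesMatch>
```

Once the pages have dropped out of the index, a robots.txt disallow could be added afterwards to save crawl budget.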
Intermediate & Advanced SEO | ianmcintosh
-
Article Submissions - Still Worth it After Panda Update?
Are article submissions still relevant after the Panda update? Many of these sites (e.g. EzineArticles) are still hurting from the Panda update.
Intermediate & Advanced SEO | qlkasdjfw