Reality of Panda 3.9 Refresh
-
I have had a 10-page website (registered in 1999) rank in the top 5 for my top keywords for over 4 years. No changes have been made to the website (it is a static site).
On July 11, 2012, most of the keywords, including all the major ones, were dropped from Google. They remain steady in Bing and Yahoo.
I saw that some people referred to a Panda 3.9 refresh on that day, but also saw that Google (Matt Cutts) denied the refresh.
Given the simplicity of the website and the strong backlinks, which remain, what other reasons could explain a drastic drop in a single day?
Any ideas on where to focus my search for solving this very serious issue? Any thoughts would be appreciated.
-
Thanks for all the feedback. After some serious review, I am convinced that Google somehow began indexing our HTTPS pages and dropped all our HTTP pages. As this is a .NET website with a web.config file, what would you all recommend I do to make Googlebot index the HTTP pages instead of the HTTPS pages?
Would you add a robots.txt file, handle it in web.config, or do it another way?
Again, thanks for all the assistance.
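If the HTTPS pages really have displaced the HTTP ones, one common approach on IIS 7+ (rather than robots.txt, which only blocks crawling of the HTTPS URLs without consolidating them) is a sitewide 301 from HTTPS to HTTP via the URL Rewrite module in web.config. A minimal sketch, assuming the URL Rewrite module is installed and no pages genuinely need to stay on HTTPS:

```xml
<!-- Inside <configuration> in web.config; requires the IIS URL Rewrite module. -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="Redirect HTTPS to HTTP" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <add input="{HTTPS}" pattern="^ON$" />
        </conditions>
        <action type="Redirect" url="http://{HTTP_HOST}/{R:1}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

If some pages (e.g. a checkout) must remain on HTTPS, a rel="canonical" tag on each page pointing at its HTTP URL would achieve a similar consolidation without a blanket redirect.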
-
Check WMT for any notices too, and check for any new spammy links pointing to or from the site.
Negative SEO has been talked about a lot lately.
-
Nothing "naughty". I have done some guest articles on various blogs related to the industry over the past 4 months, but they are all legitimate, unique articles on sites with a domain authority of 30 or higher, so that should not have been the issue. Also, they were not paid articles; they were free.
Our chief competitors have been actively promoting their sites and increasing the size of their websites, while we have been limited. We also do not do any PPC, but all our top competitors do. That should not be an issue, but maybe things are changing; I am not sure on that point.
Thanks for the feedback.
-
I have only been with this company since July 2011, but I believe they were hit by the Penguin update of March 2011. Since then it has been steady. During that update, they lost their local listings but retained their national rankings. Since that time, the rankings have remained in the top 5 for our major keywords.
-
Hi,
I guess there are updates happening all of the time with Google's algorithms, where they're always trying to improve the quality of results.
So my questions to you are:
Are you sure that you've not been doing anything Google might consider "naughty tactics"?
Have you had anything flagged in your Webmaster Tools account?
How far down the listings did you drop, and how are the competitors ranking? Are there any similarities between your site and the main competitors? I.e., if they didn't suffer, what are you doing differently?
I know it's not really an answer for you, but some food for thought that I hope helps.
Best of luck
Steve
-
Did any of your rankings drop during the Penguin update? The Panda update?
Related Questions
-
Current Sitelinks Refresh/Removal timeframe?
Hi All, I did a good chunk of research here prior to posting on sitelinks. I tried the Webmaster Console to remove the links, tried removing via sitemap and the WP SEO plugin, have now removed unnecessary pages and redirected them to the homepage, and re-petitioned for removal via the Google Webmaster Console. What's been the recent conservative, realistic estimate of when sitelinks will be removed/refreshed? Also, any tricks to expedite a recrawl or rebuild of sitelinks? The site is a very low traffic SMB site used purely as a portfolio, with no ranking/traffic purposes, so maybe the priority is very low for this site. Back in the day sitelinks were a lot easier to manage, or so it seems. Thoughts?
Intermediate & Advanced SEO | vmialik1
-
How safe is it to use a meta-refresh to hide the referrer?
Hi guys, So I have a review site and I'm affiliated with several partnership programs whose products I advertise on my site. I don't want these affiliate programs to see the source of my traffic (my site), so I'm looking for a safe solution to hide the referrer URL. I have recently added a rel="noreferrer" tag to all my affiliate links, but this method isn't perfect, as not all browsers respect that rule. After doing some research and checking my competitors, I noticed that some of them use a meta refresh, which seems more reliable in this regard. So, how safe is it to use a meta refresh as a means of hiding the referrer URL? I'm worried that implementing a meta-refresh redirect might negatively affect my SEO. Does anybody have any suggestions on how to hide the referrer URL without damaging SEO? Thank you.
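For reference, the meta-refresh approach usually routes the affiliate link through a small intermediate page on your own domain; browsers often do not send a referrer across a meta refresh. A sketch with a hypothetical partner URL; the noindex keeps the redirect page itself out of the index, so it shouldn't affect rankings:

```html
<!-- /go/partner.html: link to this page instead of the affiliate URL directly -->
<!DOCTYPE html>
<html>
<head>
  <meta name="robots" content="noindex, nofollow">
  <meta http-equiv="refresh" content="0; url=https://partner.example.com/offer?aff=123">
  <title>Redirecting</title>
</head>
<body>
  <a href="https://partner.example.com/offer?aff=123">Continue to the offer</a>
</body>
</html>
```

The visible fallback link covers browsers with the refresh disabled; keeping all such pages under one noindexed directory makes them easy to audit.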
Intermediate & Advanced SEO | Ibis150
-
Top 3 concepts in modern SEO - best resources
Hello, What are the top 3 concepts in modern SEO, in your honest opinion, and what are your best sources for learning about them? For example: #1, 10X Content, and this Whiteboard Friday.
Intermediate & Advanced SEO | BobGW0
-
Are Incorrectly Set Up URL Rewrites a Possible Cause of Panda?
On a .NET site, a URL rewrite was done about 2 years ago. From a visitor's perspective it seems fine, as the URLs look clean. But Webmaster Tools reports 500 errors from time to time showing /modules/categories... and /modules/products..., which are templates and how the original URLs were structured. While the developer made the URLs look clean, I am concerned that he could have set it up incorrectly. He acknowledged that IIS 7 on a Windows server allows URL rewrites to be set up, but the site was done another way that forces the URLs to change to their product name, so he has believed it to be okay. However, the site dropped significantly in its ranking in July 2013, which appears to be a Panda penalty. In trying to figure out if this could be a factor in why the site has suffered, I would like to know other webmasters' opinions. We have already killed many pages, removed 2/3 of the index that Google had, and are trying to understand what else it could be. Also, in doing a header check, I see that the /modules/products... page returns a 301 status. I assume that this is okay, but wanted to see what others have to say about it. When I look at the source code of a product page, I see a reference to /modules/products... I'm not sure if any of this pertains, but wanted to mention it in case you have insight. I hope to get good feedback and direction from SEOs and technical folks.
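For comparison, a conventional IIS 7 setup pairs an internal rewrite (clean URL mapped onto the template that actually serves the page) with a 301 for anyone requesting the old template URL directly. The paths and parameter name here are hypothetical, just to show the shape:

```xml
<!-- In web.config under <system.webServer>; requires the URL Rewrite module. -->
<rewrite>
  <rules>
    <!-- 301 old template URLs to their clean equivalents -->
    <rule name="Redirect product template" stopProcessing="true">
      <match url="^modules/products/([^/]+)$" />
      <action type="Redirect" url="/products/{R:1}" redirectType="Permanent" />
    </rule>
    <!-- Silently map clean URLs back onto the template that renders them -->
    <rule name="Rewrite clean product URL" stopProcessing="true">
      <match url="^products/([^/]+)$" />
      <action type="Rewrite" url="/modules/products/default.aspx?name={R:1}" />
    </rule>
  </rules>
</rewrite>
```

The intermittent 500 errors are worth chasing separately: if the internal Rewrite target fails, the clean URL itself returns the 500, and Google associates that error with the page.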
Intermediate & Advanced SEO | ABK7170
-
Panda 4.0 Update Affected Site - What is the minimum code-to-text ratio we should aim for?
Hi All, My eCommerce site got hit badly by the Panda 4.0 update, so we have been doing some site auditing and analysis, identifying issues which need addressing. We have thin/duplicate content issues which I am quite sure were part of the reason we were affected, even though we use rel=next and rel=prev along with having a separate view-all page (although we don't canonical-tag to this page, as I don't think users would benefit from seeing too many items on one page). This led me to look at our code-to-content ratio. We have now managed to increase it from 9% to approx 18-22% on popular pages by getting rid of unnecessary code etc. My question is, is there an ideal percentage the code-to-content ratio should be, and what should I be aiming for? Also, any other Panda 4.0 advice would be appreciated. Thanks, Sarah
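Google has published no threshold for text-to-HTML ratio; it is generally treated as a rough diagnostic rather than a ranking factor, but it is easy to measure. A minimal sketch in Python; a real audit tool would parse the DOM properly, and the sample page here is made up:

```python
import re

def text_to_html_ratio(html: str) -> float:
    """Visible-text bytes as a fraction of total page bytes
    (the metric usually quoted as a code-to-text percentage)."""
    if not html:
        return 0.0
    # Drop <script> and <style> blocks entirely, then strip remaining tags.
    no_code = re.sub(r"(?is)<(script|style)\b.*?</\1\s*>", "", html)
    text = re.sub(r"(?s)<[^>]+>", " ", no_code)
    text = " ".join(text.split())  # collapse whitespace runs
    return len(text) / len(html)

page = ("<html><head><style>body{margin:0}</style></head>"
        "<body><p>Hello world</p></body></html>")
print(f"{text_to_html_ratio(page):.0%}")  # prints "13%"
```

Run against a few top-ranking competitors' pages, this gives a baseline far more useful than any fixed percentage.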
Intermediate & Advanced SEO | SarahCollins0
-
Do you know a case where product variations caused Panda?
I would like to add 300 products to an ecommerce site, of which 150 products are just variations in different colors. In this particular case there are some reasons for not writing partially unique product descriptions for each product variation, and for not setting them up as variations on one product page. So I would have several product pages with nearly identical product descriptions (proprietary descriptions written by us); only the name of the color in the title and description and the EAN in the description would differ, as well as, over time, different user-generated content showing up. Different product images would also be used. I would not mind if Google did not index some product variations. Do you think I should be concerned about Panda? Do you know any website which had a Panda problem caused by product variations? Thanks
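One common pattern when color variants must stay on separate URLs is to pick one variant (or a generic product page) as the canonical, so the near-duplicate descriptions consolidate rather than compete. A sketch with hypothetical URLs:

```html
<!-- In the <head> of /widget-blue, /widget-red, /widget-green, ... -->
<link rel="canonical" href="https://www.example.com/widget">
```

Note that Google treats rel="canonical" as a hint, not a directive, so it may still index variants it considers distinct enough.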
Intermediate & Advanced SEO | lcourse0
-
Panda recovery. Is it possible?
Dear all,
To begin, English is not my native language, so I'm very sorry if I make some mistakes. On the 23rd of March 2012, Panda penalized my website (a coupon website), Avec-Reduction (dot com for the URL). At that date, I lost more than 70% of my traffic. The structure of the website was like an e-commerce website: categories -> merchant page -> coupon page. The content was too thin for Google; I agree with that. So, in May, I made a new version. Here are the most important modifications:
- A smaller header (-100px height).
- A 2-column layout (the old website had 3 columns).
- I deleted the category menu with the list of all categories, and the alphabetical menu.
- Fewer ads on the website (a few days ago I also deleted the 2 AdSense blocks).
- The coupons used to be promoted with the merchant's thumbnails in the home listing; now I have a few top lists in text only.
- I deleted all the category pages (one page per category of merchant, with a listing of all the merchants in the category). Now I have only one page for this. Same thing with the alphabetical pages. All these deleted pages have a 301 redirect. The 2 new pages (category page and alphabetical page) are noindexed.
- I deleted all the promo code pages; all the coupons are now on the merchant page (301 redirects used).
- I created an anti-spam system for the code review forms (I had a lot of spam on these forms, even when cleaning every day or two). Now I have no spam.
- Visitors now have the possibility to leave a rating and a review for each merchant. This functionality is new, so there are not a lot of reviews for the moment.
- All the merchant pages without promo codes have noindex in the robots tag.
- Since July, I have the possibility to personalize the "title" of each promo code.
- At the same time, to have more content, I can add sales and great promos for each merchant, not only promo codes.
- Affiliate links are created in JS and open a new window (a redirect page with noindex).
That's the end of the most important changes on my website. Page speed is better (halved since July) because I optimized my images, CSS, JS... At the end of July, I had a health problem and the website had no updates until the first days of October. Now the website is updated every day, but between July and October I had no Panda recovery. I have no duplicate content, and I try to add as much content as I can. So I don't understand why Google Panda penalizes me again and again. Some of my competitors have a lot of keyword stuffing (4, 5, 6, ... 10 lines of KS on each merchant page). Some of them have only affiliated merchants, automatic scripts to put coupons on websites, a few "same" websites... I have less than 30% affiliated merchants, I validate all the coupons and promos manually, and I personalize all my coupons... So I don't understand what to do. I will appreciate all help. If you see problems on my website, or if you know tips for a Panda recovery, I will be very happy to have the information. Many thanks for all. Sincerely, Florent
Intermediate & Advanced SEO | Floroger0
-
Panda Prevention Plan (PPP)
Hi SEOMOzers, I'm planning to prepare Panda deployment, by creating a check-list from thinks to do in SEO to prevent mass trafic pert. I would like to spread these ideas with SEOMoz community and SEOMoz staff in order to build help ressources for other marketers. Here are some ideas for content website : the main one is to block duplicate content (robots.txt, noindex tag, according to the different canonical case) same issue on very low quality content (questions / answers, forums), by inserting canonical redirect or noindex on threads with few answers
Intermediate & Advanced SEO | Palbertus1