Reality of Panda 3.9 Refresh
-
I have had a 10-page website (registered in 1999) rank in the top 5 for my main keywords for over 4 years. No changes have been made to the website (it is a static site).
On July 11, 2012, most of the keywords, including all the major ones, were dropped from Google. They remain steady in Bing and Yahoo.
I saw that some people referred to a Panda 3.9 refresh on that day, but I also saw that Google (Matt Cutts) denied the refresh.
Given the simplicity of the website and the strong backlinks, which remain, what other reasons could explain a drastic drop in a single day?
Any ideas on where to focus my search for solving this very serious issue? Any thoughts would be appreciated.
-
Thanks for all the feedback. After some serious review, I am convinced that Google somehow began indexing our HTTPS pages and dropped all our HTTP pages. As this is a .NET website with a web.config file, what would you all recommend I do to make Googlebot index the HTTP pages instead of the HTTPS pages?
Would you add a robots.txt rule for the HTTPS version, handle it in the web.config, or take another approach?
Again, thanks for all the assistance.
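For the web.config question, here is a minimal sketch of one common approach, assuming an IIS 7+ server with the URL Rewrite module installed (the rule name is illustrative, not from the original post): a site-wide 301 redirect from HTTPS to its HTTP equivalent, so crawlers are sent back to the HTTP pages and consolidate on them.

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Hypothetical rule: permanently redirect every HTTPS request
             to its HTTP equivalent (requires the IIS URL Rewrite module). -->
        <rule name="HttpsToHttp" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTPS}" pattern="on" />
          </conditions>
          <action type="Redirect" url="http://{HTTP_HOST}/{R:1}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

Note that robots.txt cannot live inside web.config; it is a separate file at the site root. If any pages (e.g. checkout) must stay on HTTPS, the rule's pattern would need to be scoped accordingly.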
-
Check WMT (Webmaster Tools) for any notices too, and check for any new spammy links pointing to or from the site.
Negative SEO has been talked about a lot lately.
-
Nothing "naughty". I have done some guest articles on various industry-related blogs over the past 4 months, but they are all legitimate, unique articles on sites with a domain authority of 30 or higher, so that should not have been the issue. Also, they were not paid articles; they were free.
Our chief competitors have been actively promoting their sites and increasing the size of their websites, while we have been limited. We also do not do any PPC, but all our top competitors do. That should not be an issue, but maybe things are changing; I am not sure on that point.
Thanks for the feedback.
-
I have only been with this company since July 2011, but I believe they were hit by an algorithm update in early 2011 (likely Panda, which rolled out in February 2011; Penguin did not launch until April 2012). Since then it has been steady. During that update, they lost their local listings but retained their national rankings. Since that time, the rankings have remained in the top 5 for our major keywords.
-
Hi,
There are updates happening all the time with Google's algorithms, as they are always trying to improve the quality of their results.
So my questions to you are:
Are you sure that you've not been doing anything Google might consider "naughty tactics"?
Have you had anything flagged in your Webmaster Tools account?
How far down the listings did you drop, and how are the competitors ranking? Are there any similarities between your site and the main competitors? I.e., if they didn't suffer, what are you doing differently?
I know it's not really an answer for you, but some food for thought that I hope helps.
Best of luck
Steve
-
Did any of your rankings drop during the Penguin update? The Panda update?
Related Questions
-
Shall I hide short product review texts from customers (to avoid Google Panda/quality issues)?
About 30% of the product reviews that clients of our ecommerce store have submitted in the last 10 years are 3 words or less (we did not require any minimum length). Would you recommend hiding those very short review texts? Where should we draw the line? Numeric star ratings would still go into our accumulated product rating. My only concern is what impact this may have on Google rankings. To give some context, the site has had long-standing Panda/Phantom-related issues with no obvious causes that we could point to.
Intermediate & Advanced SEO | lcourse
How long for Panda 4.1 fixes to take affect?
Hi, if you have been hit by Panda 4.1 and are now putting fixes in place (for this example, let's say you remove a load of duplicate content, and that's what caused the problem), how long would it take for that fix to take effect? Do you have to wait for the next Panda update, or will it be noticed on the next crawl? Thanks.
Intermediate & Advanced SEO | followuk
-
Ranking Google News - 3 digits in URL?
Do we need to have a unique 3-digit number in the URL, as stated in the technical guidelines from Google (https://support.google.com/news/publisher/answer/40787?hl=en&ref_topic=4359866)? Or is having and submitting a Google News XML sitemap a way around that (https://support.google.com/news/publisher/answer/68323?hl=en)?
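For reference, a minimal sketch of a Google News sitemap entry (the URL, publication name, date, and title below are illustrative placeholders, not from the question):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <!-- One <url> entry per recent news article -->
    <loc>http://www.example.com/business/article-slug.html</loc>
    <news:news>
      <news:publication>
        <news:name>Example Times</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2014-07-01</news:publication_date>
      <news:title>Illustrative article title</news:title>
    </news:news>
  </url>
</urlset>
```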
Intermediate & Advanced SEO | bonnierSEO
-
Backlinking 3 sites from same domain and backlinking main site too
Hello, we have 4 sites, of which 1 is the main site and the other 3 are niche sites. All 3 niche sites have dofollow links to the main site from their home pages. We got a high-quality backlink, which all 3 niche sites received from the same domain. Is it worth adding a backlink from that domain to the main site too, despite the fact that the 3 niche sites have already received it and they all link to the main site? Many thanks.
Intermediate & Advanced SEO | Modi
-
Meta Refresh tag on cached pages - GRRR!
Hi guys, all of our product pages originate at a URL with a unique number, which redirects to an SEO-friendly URL for the user. These product pages have blocks that are automatically populated from our content database. Here's an example of the redirect in place: www.example.com/45643/xxxx.html redirects to www.example.com/seo-friendly-url.html. The development team did this for two reasons: 1) our internal search needs the unique numbered URLs, and 2) it allows quick redirects because the pages are cached. The problem I face is this: the redirects from the cache are tagged with a meta refresh, so yup, they are 302s. The development team said they could stop caching and respond dynamically with a 301, but this would introduce a delay. Speed-wise, cached pages load within 22 ms and dynamic ones in 530 ms, so about half a second more. Currently, cached pages just do a meta-refresh redirect, and I want to move away from this. What would you guys recommend in this situation? I feel that unless I use a 301, I'll be losing out on link juice.
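Assuming the goal is simply to replace the meta refresh with a cache-friendly permanent redirect, here is a hypothetical sketch for an IIS site using the URL Rewrite module (the rule name is illustrative; Apache and nginx have equivalent directives). It maps the numbered URL from the example to its SEO-friendly counterpart:

```xml
<rewrite>
  <rules>
    <!-- Illustrative rule: 301 the numbered product URL to the SEO URL.
         A 301 response is itself cacheable by default, so the server can
         cache the redirect instead of serving a meta-refresh page. -->
    <rule name="NumberedToSeoUrl" stopProcessing="true">
      <match url="^45643/.*\.html$" />
      <action type="Redirect" url="/seo-friendly-url.html" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>
```

In practice you would drive this from a rewrite map or generate the rules from the product database rather than writing one rule per product.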
Intermediate & Advanced SEO | Bio-RadAbs
-
Recovery Steps For Panda 3.5 (Rel. Apr. 19, 2012)?
I'm asking people who have recovered from Panda to share what criteria they used, especially on sites that are not large-scale ecommerce sites.
My blog was hit by Panda 3.5. It has approximately 250 posts. Some of the posts are the most thorough on their subject and regained traffic despite a Penguin mauling a few days after the Panda attack. (The site has probably regained 80% of the traffic it lost since Penguin hit, without any link removal or link building and with minimal new content.) Bounce rate is 80% and average time on page is 2:00 minutes. (Even my most productive pages tend to have very high bounce rates, BUT those pages maintain time on page in the 4 to 12 minute range.)
The Panda discussions I've read on these boards seem to focus on ecommerce sites with extremely thin content. I assume that Google views much of my content as "thin" too. But my site seems to need a pruning, instead of just combining the blue, white, and red models all on one page like most of the ecommerce sites we've discussed.
So, I'm asking people who have recovered from Panda to share what criteria they used to decide whether to combine a page, prune a page, etc. After I combine any series articles into one long post (driving the time on page to nice levels), I plan to prune the remaining pages that have poor time on page and/or bounce rates. Regardless of the analytics, I plan to keep the "thin" pages that are essential for readers to understand the subject matter of the blog. (I'll work on fleshing out the content or producing videos for those pages.)
How deep should I prune on the first cut? 5%? 10%? Even more? Should I focus on the pages with the worst bounce rates, the worst time on page, or try some of both? If I post unique and informative video content (hosted on-site using Wistia), what range of decrease in bounce rate should I expect? Thanks for reading this long post.
Intermediate & Advanced SEO | JustDucky
-
Duplicate Content - Panda Question
Question: Will duplicate informational content at the bottom of indexed pages violate the Panda update?
Total page ratio: 1/50 of the total pages will have duplicate content at the bottom of the page. For example, on 20 pages in 50 different instances there would be common information at the bottom of a page (out of a total of 1,000 pages). Basically, I just wanted to add informational data to help clients get a broader perspective when making a decision, alongside the "specific and unique" information at the top of the page.
Content ratio per page: What percentage of duplicate content is allowed per page before you are dinged or penalized?
Thank you, Utah Tiger
Intermediate & Advanced SEO | Boodreaux
-
How to compete with duplicate content in a post-Panda world?
I want to fix the duplicate content issues on my eCommerce website. I have read a very valuable blog post on SEOmoz regarding duplicate content in a post-Panda world and applied every strategy to my website. Let me give an example: http://www.vistastores.com/outdoor-umbrellas
Non-WWW version: http://vistastores.com/outdoor-umbrellas redirects to the home page.
HTTPS pages: for https://www.vistastores.com/outdoor-umbrellas I have created a robots.txt file covering all HTTPS pages (https://www.vistastores.com/robots.txt) and set rel=canonical to the HTTP page, http://www.vistastores.com/outdoor-umbrellas.
Narrow-by search: my website has narrow-by search, which generates pages with the same meta info, such as http://www.vistastores.com/outdoor-umbrellas?cat=7, http://www.vistastores.com/outdoor-umbrellas?manufacturer=Bond+MFG, and http://www.vistastores.com/outdoor-umbrellas?finish_search=Aluminum. I have blocked all of these dynamic pages with robots.txt (http://www.vistastores.com/robots.txt) and set rel=canonical to the base URL on each of them.
Order-by pages (http://www.vistastores.com/outdoor-umbrellas?dir=asc&order=name): blocked with robots.txt, with rel=canonical set to the base URL.
Pagination pages (http://www.vistastores.com/outdoor-umbrellas?dir=asc&order=name&p=2): blocked with robots.txt, with rel=next & rel=prev on all paginated pages and rel=canonical set to the base URL.
I have applied all of these SEO suggestions to my website, but Google is crawling and indexing 21K+ pages even though the site has only 9K product pages. Google search result: https://www.google.com/search?num=100&hl=en&safe=off&pws=0&gl=US&q=site:www.vistastores.com&biw=1366&bih=520
Over the last 7 days, my website's impressions & CTR have dropped by 75%. I want to recover and perform as well as before.
I have explained my question at length because I want to recover my traffic as soon as possible.
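One pattern worth double-checking in a setup like this: robots.txt blocking and rel=canonical work against each other. When a URL is disallowed in robots.txt, Googlebot never fetches it, so it never sees the canonical tag, and the blocked URL can still end up indexed (URL-only) via links. A sketch of relying on the canonical tag alone for one of the filtered URLs above:

```html
<!-- In the <head> of http://www.vistastores.com/outdoor-umbrellas?cat=7 -->
<!-- The URL must remain crawlable (no robots.txt Disallow) for this tag to be read. -->
<link rel="canonical" href="http://www.vistastores.com/outdoor-umbrellas" />
```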
Intermediate & Advanced SEO | CommercePundit