Has My Site Been Hit by Panda 4.0?
-
I operate a New York City commercial real estate web site (www.nyc-officespace-leader.com). Ranking and traffic have dropped steeply since early June.
Around May 20th Google launched a new Panda update, and I wonder if that could partially explain the drop. My site contains the following:
- 300 listing pages. These are product pages and often contain fewer than 100 words. Many have not been changed in two years.
- 150 building pages. These contain fewer than 220 words. Many have not been changed in two years.
- 40 blog pages. We have been adding 1 or 2 per month.
- 50 or 60 neighborhood and type-of-space pages. These contain 200-600 words.
Could our drop in traffic be due to Panda? I might add that an upgraded version of the site, with new forms, a modified right rail, and a new header, was launched on June 6th. Also, on April 20th we submitted a disavow file to Google covering about 100 toxic domains, one third of the 300 domains that link to us.
In order to take remedial action we need to understand what has happened. Any ideas???
Thanks, Alan
-
Were the 300 listing pages ever receiving traffic? If the answer is no, or if a significant number of them never received any traffic, it might change what I'd recommend.

Poor title tags will hurt your visibility in lots of ways, but I would not personally tie the title tag strategy to Panda. Panda is a content algo. It seems to look for duplicate, near-duplicate, thin, or poor quality content, and then makes sure that sites meeting those criteria are not ranking.

If you think about it more broadly, you might ask why Google would want to take a whole site down in the rankings over thin content on a few (or many) pages with little or no traffic. I think the reason they penalize the whole site is that they don't want webmasters producing this type of content at all. If they can get content creators to think twice before creating another 20 URLs about x topic, then over the long haul their job becomes much easier: they can fight off spam more easily because it won't work.

I was very angry when Panda 4 rolled out and some sites I own got hit. However, I now feel empowered to correct the issue. My suggestion for you is to compare your URLs with their links and traffic. There should be some clear-cut low quality stuff that you can noindex. On the pages that drive traffic, make sure you are providing deep, helpful content.

It's hard to discuss everything you may need to do over email, but I think you're getting the idea. PM me if you want to chat more.
-
Hi Brad:
I have had a very similar experience: reduced ranking across the board for any competitive term, but not eliminated entirely.
My site contains 300 listing pages (product pages). Those listings are usually short (less than 125 words). Would beefing them up to 200-300 words help? Also, the title tags for these pages are not that well written. Some look like this:
"242 W. 38th St. - Manhattan, New York", "12 W. 37th St. - Manhattan, New York,"
Some are more elaborate like: "14 Penn Plaza | Great Midtown Office Rental | 2,313 SF Negotiable", "Lease 14 Penn Plaza Executive Offices | 34th Street | 4836 SF"
Could poor title tags contribute to the penalty?
Also, many listings have "Manhattan, New York" added at the end of the title tag, really impoverishing it, like: "Park Avenue South 27th & 28th Street - Manhattan, New York" or
"44th Street - Eighth & Ninth Avenues - Manhattan, New York". These title tags seem really poor. Could that trigger the Panda penalty?
Also, many title tags for other page categories have "Metro Manhattan Office Space" at the end of them. Could this trigger the penalty? I think the title tags are really bad.
-
Hi Egol:
Will Google remove Panda penalties once content is improved? Or does an improvement take months (like waiting for a Penguin update)?
I can potentially correct two out of the three issues you mention without enormous SEO costs. I am not sure how to address the structural problems.
Thanks,
Alan
-
Google has been assessing sites with the Panda algo for over two years. They are continuously changing the algo to promote sites that fit its requirements and demote sites whose content does not.
In my opinion, your site has technical problems, thin content and duplicate content that could trigger Panda problems. Read my replies to your previous questions to find them.
-
If you saw a drop around May 19-20, then I would say it is almost certain that this was Panda. I've had a similar experience: I own a network of content sites that had never been clearly hit by a Panda launch until Panda 4, when they got hit very hard.

When you consider Panda, think of it less as an algo change and more as a filter. Panda seems to crawl the site looking for certain criteria, and if your site meets them, all your results get filtered. Remember that Google has clearly stated that their goal with Panda is for sites with thin or duplicated content not to rank high on the page. In my case, I still rank for everything that I used to rank 1, 2, 3 for, but now I rank 8, 9, 10.

I have no proof yet that I can get out from under it, but I am making a lot of changes now that I believe will fix my own traffic issues. I found this article very helpful as I considered what to do next:
http://cognitiveseo.com/blog/5536/google-panda-4-0-topical-authority-content-update-2014-case-study/
If you want to reach out to me privately I'd be happy to discuss in more detail.
Brad
-
Hi,
Did you get any penalty or webspam notification email? I would go through your links in Webmaster Tools and Moz to see if your site appears on any spammy sites, forums, or blogs. If so, ask for removal or disavow them. I would also add some good, rich content to the pages that may have "thin content".
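For reference, a disavow file is a plain text upload to Google, one entry per line: a `domain:` entry disavows every link from that domain, a bare URL disavows just that link, and lines starting with `#` are comments. The domains and URL below are placeholders:

```text
# Requested removal on 4/20, no response from site owner
domain:spammy-directory.example
domain:link-farm.example

# Disavow a single spammy URL rather than a whole domain
http://forum.example/profile/12345
```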
-
Hi EGOL:
Interesting. But why now? Panda updates have been rolling out frequently with no ill effect. There have been no material changes to the content in the last few months, except for updating many key pages with better content.
How is this latest update of Panda different?
What would you suggest I do to escape from this penalty?
I have about 350 product pages with thin content (less than 120 words each). Would noindexing them or rewriting them help? What if I increased the content by 100-150 words? I would think that Google would understand that product pages can have light content and still add value.
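(For reference, noindexing is typically done with a robots meta tag in each thin page's head; "noindex, follow" drops the page from the index while still letting its links pass equity:)

```html
<!-- In the <head> of each thin listing page -->
<meta name="robots" content="noindex, follow">
```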
Thanks, Alan
-
I believe that some form of Panda has gotten your site.