Panda 2.2 Full Recovery In Action
-
I have had several new clients come to me after Panda and Panda 2, which has meant lots of audits. The client who had the worst problems, and who has since corrected the worst issues based on my audit, just bounced back in an epic way. While it could be a short-term thing, I don't believe that's the case: it's just too big of a jump back. A full recovery.
I'm curious to find out whether anyone is seeing a similar recovery on their sites.
FYI the biggest problems (most of which have been resolved now) include:
- Content organization - it was a mess of a site
- Extreme over-use of ads on the page and in the content
- Topical focus - there was so much going on across every page of the site that it confused Google
- Major site speed issues
-
They were running several ad networks feeding ads everywhere, and the system was choking on it. Then there's the fact that it's a Drupal site, set up with the default native taxonomy, which was causing serious data extraction bottlenecks.
Both issues had to be addressed. It's still slow, but much better.
-
Hi Alan,
Great to hear. Could you expand on your experience with the site speed and how you improved it? Was it server-side or due to site content? The location of the server?
-
It's actually hit a higher level of traffic than pre-May-Day. Hard to tell on that chart, but amazing if it holds.
-
Great news, Alan. Keep us posted if this holds. Looks like traffic is about up to normal.
Related Questions
-
Entire website is duplicated on 2 domains - what to do?
My client's website has 1000+ pages and a Domain Authority of 23. I have just discovered that the entire site is duplicated on a second domain (main URL = companyname.com - duplicate site URL = company-name.com). The home page of the duplicate domain has a 301 redirect going to the main domain. However, none of the 1000+ other pages have any redirect set up, so Google is indexing the entire duplicate site. I'm assuming this is a bad thing for SEO. The duplicate site has a Domain Authority of 4, so I'd like to transfer whatever link juice it has towards the main site. What's the best thing to do? Ultimately I think it would be best to delete the duplicate site. So would it be a case of adding a redirect to the htaccess file along the lines of: redirect company-name.com/?slug? to https://companyname.com/?slug? (I realise this isn't the correct syntax - but is the concept correct?) Has anyone ever dealt with this successfully?
Technical SEO | BottleGreenWebsites
-
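For what it's worth, the concept in the question above is sound. A hypothetical .htaccess sketch for the duplicate domain, assuming both domains point at the same document root and Apache's mod_rewrite is available (domain names are the question's own examples; test on staging first):

```apache
# Hypothetical rule for the duplicate domain (company-name.com).
# Match requests arriving on the duplicate hostname (with or without www)
# and 301-redirect the same path to the main domain, preserving the slug.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?company-name\.com$ [NC]
RewriteRule ^(.*)$ https://companyname.com/$1 [R=301,L]
```

A host-based rule like this catches every path in one pass, so each of the 1000+ duplicate URLs 301s to its counterpart on the main domain without listing them individually.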
Merge 2 websites into one, using a non-existing, new domain.
I need to merge https://www.WebsiteA.com and https://www.WebsiteB.com into a fresh new domain (with no content), https://www.WebsiteC.com. I want to do it the best way to keep existing SEO juice. Website A is the company's home page and built with Wordpress. Website B is the company product page and built with Wordpress. Website C will be the new site containing both Website A and B, also utilizing Wordpress. What is the best way to do this? I have researched a lot and keep hitting walls on how to do it. It's a little trickier because it's two different domains going to a brand new domain. Thanks
Technical SEO | jarydcat1
-
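One common pattern for a merge like this is a blanket 301 on each old domain, sketched here as a hypothetical .htaccess rule (this assumes the new site keeps the same URL paths; if paths change, the important pages should instead be mapped one-to-one):

```apache
# Sketch for Website A after the merge; the same idea applies to Website B.
# 301-redirect every old URL to the matching path on the new domain so
# existing link equity follows the move.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?websitea\.com$ [NC]
RewriteRule ^(.*)$ https://www.WebsiteC.com/$1 [R=301,L]
```
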
Blog Page Titles - Page 1, Page 2 etc.
Hi All, I have a couple of crawl errors coming up in Moz that I am trying to fix. They are duplicate page title issues in my blog area. For example, we have a URL of www.ourwebsite.com/blog/page/1, and as we have quite a few blog posts they spill onto another page, for example www.ourwebsite.com/blog/page/2. Both of these URLs have the same heading, title, meta description, etc. I was just wondering whether this is an actual SEO problem, and if so, whether there is a way to fix it. I am using Wordpress, for reference, but I can't see anywhere to access the settings of these pages. Thanks
Technical SEO | O2C
-
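For context, the usual treatment of paginated archives at the time was a self-referencing canonical on each page plus rel prev/next links, and a page number in the title so the titles are no longer duplicates. A hypothetical head fragment for page 2 (URLs are the question's own examples; SEO plugins such as Yoast can generate this automatically):

```html
<!-- Hypothetical <head> markup for www.ourwebsite.com/blog/page/2 -->
<title>Blog - Page 2 | Our Website</title>
<link rel="canonical" href="https://www.ourwebsite.com/blog/page/2" />
<link rel="prev" href="https://www.ourwebsite.com/blog/page/1" />
<link rel="next" href="https://www.ourwebsite.com/blog/page/3" />
```
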
Blogspot domains - giving me a manual action
So some agency did horrendous article submissions en masse in 08/09. Since then I have been tidying this up by manually getting the domains removed from our backlink profile. Some, however, I just cannot get rid of. The recent Penguin update obviously penalised me for this, so I disavowed the rest I could not remove and filed a reconsideration request. The reply from Google was that the site still violates guidelines, and it used three Blogspot domains (which no crawler I used had previously found) as examples. Now there is no one at Google to contact about this, and the sites are abandoned, so they just sit there doing damage. I will of course add these to the disavow file, but can I disavow the whole of blogspot.com? What if they are all in the disavow file but Google still uses them against us in the reconsideration request, and I cannot remove them because there is no one to contact at Google? I really appreciate the help. Thanks; two years of hell tidying up bad agency work!
Technical SEO | pauledwards
-
Google sees 2 home pages while I only have 1
How do I solve the problem of Google seeing both domain.com and domain.com/index.htm when I only have one file? Will a canonical work? If so, which? Or are there any other solutions for a novice? I learned from previous blogs that it needs to be done by the hosting service, but Yahoo has no solution.
Technical SEO | Kurtyj
-
No manual spam actions found - still my site does not rank
I noticed it on the 1st of October 2012: all my rankings disappeared. I filed a reconsideration request with Google and got this reply: "No manual spam actions found." I have no idea why my site would have been subject to an algorithm change that made my rankings completely go away; I have not used spam or any kind of link building. Can you guys look at my site and see if you have any ideas? http://tinyurl.com/9a5k38u Thank you, Cary
Technical SEO | CMTM
-
Sitemap.xml - autogenerated by CMS is full of crud
Hi all, hope you can help. The Magento ecommerce system I'm working with autogenerates sitemap.xml, and it's well formed, with priority and frequency parameters. However, it has generated lots of URLs that point to broken pages returning fatal errors, duplicate URLs (not canonicals), 404s, etc. I'm thinking of hand-creating sitemap.xml; the site has around 50 main pages including products and categories, and I can get the main page URLs listed by Screaming Frog or Xenu. Then I'll have to get into hand-editing the crud pages with noindex, and the useful duplicates with canonicals. Is this the way to go, or is there another solution? Thanks in advance for any advice
Technical SEO | k3nn3dy3
-
If non-paying customers only get a 2 min snippet of a video, can my video length in sitemap.xml be the full length?
I am working on a website whose primary contents are all videos. They have an assortment of free videos, but the majority are viewable only with a subscription to the site. If you don't have a subscription, you can see a 2-minute clip of the contents of the video, but the full videos can be anywhere from 10 minutes to 1.5 hours. When I am auto-generating the sitemap.xml, can I put the full length of the videos for paying members in the video:duration property of the XML? Or, because only 2 minutes is publicly available (unless you pay for a membership), is that frowned upon?
Technical SEO | nbyloff
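For reference, Google's video sitemap extension has a `<video:requires_subscription>` tag for exactly this situation. A hypothetical entry is sketched below; all URLs are placeholders, and whether `<video:duration>` should reflect the 2-minute preview (120 seconds, shown here) or the full runtime comes down to which version the page actually serves to a non-subscriber:

```xml
<!-- Hypothetical video sitemap entry; every URL is a placeholder. -->
<url>
  <loc>https://www.example.com/videos/sample-video/</loc>
  <video:video>
    <video:thumbnail_loc>https://www.example.com/thumbs/sample-video.jpg</video:thumbnail_loc>
    <video:title>Sample video</video:title>
    <video:description>Two-minute preview; full video for subscribers.</video:description>
    <video:player_loc>https://www.example.com/player/sample-video</video:player_loc>
    <video:duration>120</video:duration>
    <video:requires_subscription>yes</video:requires_subscription>
  </video:video>
</url>
```
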