Penguin Update Issues... What would you recommend?
-
Hi,
We've been pretty badly hit by this Penguin update. Site traffic is down 40-50%.
We suspect it's for a couple of reasons:
1) Google is saying we have duplicate content. For example, for a given category we will have 4-5 pages of content (products), so it's saying pagenum=2, pagenum=3, etc. are duplicate pages. We've implemented rel=canonical so that pagenum=2 points to the original category, e.g. http://mydomain/widgets.aspx
We've even specified pagenum as a URL parameter that paginates. Google still hasn't picked up these changes. How long does it take? It's been about a week.
2) They're saying we have soft 404 errors. For example, when we remove a category or product, we point users to a "category or page not found" page. Is it best to block Googlebot from crawling these pages by specifying them in robots.txt, since we really don't care about these categories or product pages? What's the best way to handle this?
3) There are some bad directories and crawlers that have crawled our website but have posted incorrect links, so we've got something like 1700 "product not found" errors. I'm sure that's taking up a lot of crawl time. How do we tell Google not to bother with links coming from specific sources, e.g. ignore all links coming from xxx.com?
Any help will be much appreciated, as this is killing our business.
Jay
-
Hey Ben,
Thank you so much for your response.
I'm pretty sure it was the Penguin update that brought our rankings down.
We don't participate in any paid linking, blog networks, etc.
The only thing we did was submit to article directories, which I understand are frowned upon now, so we'll move away from that.
We'll try to get all the non-existent pages to return 404 codes, clear up any duplicate page title and page content errors, and hope that we'll get back into Google's good graces.
-
Hi Jay,
Sorry to hear it's hurting your business so much.
Have you double checked the dates of your decrease in traffic against the Penguin update? There were a lot of big changes going on around that time so it's worth being sure it was Penguin.
In answer to question 3 - If they're external sites then I don't think those 1700 404s are having a negative effect on your SEO. If those directories are hurting you at all through the Penguin update then it would be through over-optimised anchor text (although I haven't seen any definitive data on this).
In answer to question 2 - Would I be right in thinking that you're using a 301 or a 302 to send users to a generic error page? However you're generating the soft 404s, the best fix is to make them real 404 errors so the server returns a 404 status code. The details of setting up a custom 404 page are pretty well documented around the web so you shouldn't have much problem with it.
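As a rough sketch (assuming you're on ASP.NET Web Forms, since your URLs end in .aspx; the page name and wording here are placeholders), the important part is that the "not found" page itself sends the 404 status rather than a 200:

```
<%@ Page Language="C#" %>
<script runat="server">
    protected void Page_Load(object sender, EventArgs e)
    {
        // Return a real 404 status instead of a soft 404 (a 200 page that just says "not found")
        Response.StatusCode = 404;
        // On IIS 7+ this stops the server swapping in its own error page over your content
        Response.TrySkipIisCustomErrors = true;
    }
</script>
<html>
<head><title>Page not found</title></head>
<body>
    <h1>Sorry, that product or category is no longer available.</h1>
    <p><a href="/widgets.aspx">Browse our current categories instead.</a></p>
</body>
</html>
```

Serve that from the removed product/category URLs themselves rather than redirecting to it, so each dead URL genuinely returns a 404.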
In answer to question 1 - Have you tried checking to see if Google has re-cached your pages since the change? It's probably worth looking at rel="prev" / rel="next" markup as well. Maile Ohye from Google has released a pretty comprehensive video on the topic of pagination and SEO, so I'd recommend checking that out.
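To give you a rough idea of the rel="prev" / rel="next" pattern (a sketch based on your widgets.aspx example and the pagenum parameter, not markup pulled from your site), page 2 of a category series would carry something like this in its head:

```
<link rel="prev" href="http://mydomain/widgets.aspx" />
<link rel="next" href="http://mydomain/widgets.aspx?pagenum=3" />
```

Page 1 carries only rel="next" and the last page only rel="prev", so Google can treat the pages as one series. Maile's video also covers where the canonical should point in that setup (each paginated page to itself, or every page to a "view all" version if you have one), which is worth checking against your current canonical-to-page-1 approach.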
Related Questions
-
Job Posting Page and Structured Data Issue
We have a website where we do job postings. We manually add the data to our website. The job postings are also covered by various other websites, including the original recruiting organisations, and the details of a posting remain the same, for instance the eligibility criteria, the exam pattern, the syllabus etc. We create pages where we list the jobs and keep the detailed pages, which have the duplicate data, disallowed in robots.txt. Lately, we have been thinking of indexing these pages as well, as the number of these non-indexed pages is very high. Some of our competitors have these pages indexed, but we are not sure whether doing this is going to be the right move or if there is a safe way to deal with this. Additionally, there is the problem that some job posts have very little data, like fees, age limit, salary etc., which is thin content, so that might contribute to a poor-quality issue. Secondly, we wanted to use enriched result snippets for our job postings. Google doesn't want snippets to be used on the listing page: "Put structured data on the most detailed leaf page possible. Don't add structured data to pages intended to present a list of jobs (for example, search result pages). Instead, apply structured data to the most specific page describing a single job with its relevant details." Now, how do we handle this situation? Is it safe to allow, in robots.txt, the detailed pages which have duplicate job data and sometimes not-so-high-quality data?
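(For context, the "most detailed leaf page" markup that Google's guideline refers to is per-job JobPosting structured data along these lines. This is a hypothetical sketch with placeholder values, not markup taken from the site in question:)

```
<script type="application/ld+json">
{
  "@context": "https://schema.org/",
  "@type": "JobPosting",
  "title": "Junior Clerk",
  "description": "<p>Eligibility criteria, exam pattern, syllabus and fee details go here.</p>",
  "datePosted": "2024-01-15",
  "validThrough": "2024-02-15T23:59",
  "hiringOrganization": {
    "@type": "Organization",
    "name": "Example Recruiting Board"
  },
  "jobLocation": {
    "@type": "Place",
    "address": {
      "@type": "PostalAddress",
      "addressLocality": "New Delhi",
      "addressCountry": "IN"
    }
  }
}
</script>
```

It sits on each individual job's detail page, which is why those pages generally need to be crawlable and indexable for the rich results to appear.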
Intermediate & Advanced SEO | dailynaukri
-
Is there a way to noindex/nofollow sections on a page to avoid duplicate text issues?
I'm working on an event-related site where every blog post starts with an introductory header about the event and then a Call To Action at the end which gives info about the Registration Deadline. I'm wondering if there is something we can and should do to avoid duplicate content penalties. Should these go in a widget, or is there some way to noindex/nofollow a section of text? Thanks!
Intermediate & Advanced SEO | Spiral_Marketing
-
How often does the WMT incoming links report get updated?
Hi, We made some drastic changes removing links (mainly from one domain) and wondered when we should expect a change in the incoming links report in Google's WMT...? Thanks
Intermediate & Advanced SEO | BeytzNet
-
WMT Showing Duplicate Meta Description Issues Although Posts Were Redirected
Dear Moz Community, Some time ago we changed the structure of our website and redirected the old URLs to the new ones. About 2,000 posts were redirected at that time. While checking Webmaster Tools a few days ago I discovered that about 500 duplicate meta description issues appear in the "HTML Improvements" area. To my surprise, although the old posts were redirected to the new paths, WMT sees the description of the old posts as similar to that of the new posts. Moreover, after changing the structure all meta descriptions were modified and they weren't the same as those used before the restructure. For example, I redirected /blog/taxi-transfer-from-merton-sw19-to-london-city-airport/ to /destinations/greater-london/merton-sw19/taxi-transfer-to-london-city-airport-from-merton/ Now they are shown as having duplicate content. I've checked the redirects and they are working. I get the same error from the redirected pages for about 150 titles. Did anyone else get these errors, or can you please offer me some suggestions on how I can fix this? Thank you in advance! Tiberiu
Intermediate & Advanced SEO | Tiberiu
-
Was anyone hit by BOTH the 'Phantom' update AND Penguin 2.0?
I'm interested to know if Phantom was just a "pre-Penguin" 2.0 or if it was a completely different update. Thoughts?
Intermediate & Advanced SEO | nicole.healthline
-
What About Google Panda Update 22?
Maybe I haven't found the threads or whatever, but I haven't seen many posts about the latest Google Panda update from November 21-22 on SEOmoz. Panda 22 is not even listed here: http://www.seomoz.org/google-algorithm-change Until November 21st, Google had killed 3 of the 5 websites I own through their Panda updates (I never got hit by Penguin updates, as I only have original content), accounting for about 25% of my income. Fortunately, the 2 remaining websites gained more traffic throughout the summer of 2012, so my income got back to almost 100% even though I got the "Unnatural Links" warning in Google Webmaster Tools in July. Since then, I did a huge link cleanup and, according to the Link Detox Tool (from another SEO service), the number of "toxic links" went from about 350 to 50. The backlink report is as follows: 8% (52) Toxic Links; 57% (382) Suspicious Links; 35% (235) Healthy Links. Of the 382 suspicious links, most are coming from the same domain and they are all directories to which my website has been submitted automatically (not using any specific keyword anchor). In contrast, the healthy links are coming from different domains, so I like to think they have a stronger impact than the suspicious links. That said, my two remaining websites were still doing well until November 21, when they got hit by Panda. Now traffic has dropped by 55% and income has dropped by 75% (yes, I'll have to look for a job within a year if I don't fix this). (I want to add that none of my websites are "thin websites". One has over 1,500 pages of content and the other has about 500 pages. All websites have content added 3 to 5 times a week.) What I don't get is that all my "money keywords" are still ranked in the top 10 results on Google according to the multiple tools/services I use, yet impressions dropped by 50% to 75% for those keywords?!? I have a feeling that this time it's not only a drop in rankings; there's a drop in impressions caused by something else. Is it caused by an emphasis on local search? Are they showing more ads and fewer organic results? But here's the "funny part": for the last 5 years, I was never able to advertise my website on Google AdWords. Each time, I got a quality score of about 4/10, only to see it drop to 1/10 within a few hours of launching the campaign. On November 22nd, I built new PPC campaigns based on the exact same PPC campaigns I had in the past (same keywords, same ads, same landing pages). Guess what? Now the quality score is between 7/10 and 10/10 (most of them are 10/10) for the exact same PPC campaigns! What a "coincidence", huh?
Intermediate & Advanced SEO | sbrault74
-
Indexing issue?
Hey guys, when I do a site:thetechblock.com query in Google I don't seem to see any recent posts (nothing for August). In Google Webmaster Tools I see that the site is being crawled (I think), but I'm not sure. I also see that the sitemaps are being indexed, but again it just seems really odd that I'm not seeing these in Google results. SEO seems all good too according to SEOmoz. Is there something I'm not getting?
Intermediate & Advanced SEO | ttb
-
Penguin Rescue! A lead has been hit and I need to save them!
I had a meeting today with a prospective client who has been hit by Penguin. Their previous SEO company has obviously used some questionable techniques, which is great for me, bad for the client. Their leads have dropped from 10 per day to 1 or 2. Their analytics show a drop after the 25th, and a backlink check shows a lot of low-quality links. Domain metrics are pretty good and they are still ranking OK for some keywords. I have 1 month to turn it around for them. How do you wise people think it can be done? First of all, I will check the on-site optimisation and ensure that the site isn't over-optimised. Secondly, do I try to remove the bad links, or just hit the site with good content and good links to outweigh the bad ones? Also, do you think G is actually dropping rankings for the over-optimisation / bad links, or are the links just being discredited, resulting in the drop in rankings? Two very different things. Any advice is appreciated. Thanks
Intermediate & Advanced SEO | SimpsonGareth