Dropped ranking - Penguin penalty or duplicate content issue?
-
Just this weekend a page that had been ranking well for a competitive term fell completely out of the rankings. There are two possible causes and I'm trying to figure out which it is, so I can take action.
I found out that I had accidentally put a canonical tag on another page that pointed to the same URL as the page that dropped out of the rankings. If two pages with different content carry the same canonical tag, will Google drop both of them from the index?
The other possibility is that this is a result of the recent Penguin update. The page that dropped has a high proportion of exact-match anchor text. As far as I can tell, no other pages on the site were hit by the Penguin update.
One last question: the page completely dropped from the search index. If this were a Penguin issue, would it have dropped out completely, or just been penalized with a drop in position?
If this is a result of the conflicting canonical tags, should I just wait for the page to be reindexed, or should I file a reconsideration request?
-
Yes, I think it was a Penguin drop. There is one other thing about the page that dropped: it sits behind a 301 redirect. I updated the page URL a while ago, but nearly all of the links to the page still point to the old URL. So this penalty might be the result of a combination of signals that collectively flagged that page.
I'm working on cleaning up the link profile right now. I think that Penguin is a very imperfect animal, but I can't change the beast, so I will just have to make some changes here.
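For reference, the 301 situation I described can be sketched as a tiny chain resolver. The URLs below are placeholders, not the actual pages:

```python
def follow_redirects(url, redirect_map, max_hops=5):
    """Resolve a URL through a chain of 301s described by a mapping."""
    chain = [url]
    while url in redirect_map and len(chain) <= max_hops:
        url = redirect_map[url]
        chain.append(url)
    return chain

# Hypothetical setup: the old URL, which holds nearly all the inbound
# links, 301s to the updated URL that was actually ranking.
redirects = {"http://example.com/old-page": "http://example.com/new-page"}

print(follow_redirects("http://example.com/old-page", redirects))
# ['http://example.com/old-page', 'http://example.com/new-page']
```

So the links are hitting one URL while the ranking page lives at another, which is the extra signal layered on top of the anchor-text problem.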
-
It's unlikely the canonical is to blame here, if I'm understanding it correctly. If you tried to canonicalize Page B to Page A, and they were clearly different, one of two things should happen:
(1) Google will just ignore it.
(2) Google will follow it anyway, and drop Page B from the index.
Now, it's theoretically possible that, if Google thought you were using the canonical tag inappropriately to benefit Page A, they could punish Page A, but I've honestly never seen that happen (I've seen it with 301-redirects). Typically, Page B would also have to have a lot of links that you were trying to "clean" (think money laundering). Since Page B is new, this seems very unlikely.
If you're hitting exact-match (or close to it) anchor text hard on Page A, it's certainly possible Penguin came into play, especially if Page A is pushing keywords a bit too hard. It's been tough to confirm Penguin cases, but most of the verified ones I've seen are sudden drops. It's not a subtle, gradual impact.
You could wait for the next Penguin data update, but I suspect you may have to do some link cleanup. If there's anything that's not only exact-match anchor text but also sitewide (especially footer links), I'd start there. Those seem to be major targets of Penguin. Truthfully, though, we're still collecting data on it.
-
Thanks for the reply!
What happened was that I added a new page and accidentally put the canonical tag belonging to the page that was ranking well onto that new page.
So to state it a different way: I added a new Page B to the site, but instead of using the canonical for that page, I accidentally used the canonical for Page A. Page A is the page that had previously ranked well for its search terms. On Saturday night or Sunday, Page A dropped out of the rankings for all of the search terms it had been ranking for. However, I did a little more research, and Page A is still in the index; it just doesn't rank for any of the search terms it used to. Page B is also in the index, but since it is a new page, it doesn't really rank for any terms. Obviously, I have fixed the canonical on Page B, and Google already has the corrected page in its cache.
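To make the mixup concrete, here's roughly what the two pages looked like (with placeholder URLs), plus a quick sketch of a script to check which canonical a page actually declares:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Grabs the href of the first rel=canonical link tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical" and self.canonical is None:
            self.canonical = a.get("href")

def find_canonical(html):
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical

# Placeholder markup: Page B mistakenly declares Page A as its canonical.
page_a = '<head><link rel="canonical" href="http://example.com/page-a"></head>'
page_b = '<head><link rel="canonical" href="http://example.com/page-a"></head>'

print(find_canonical(page_a))  # http://example.com/page-a (correct)
print(find_canonical(page_b))  # http://example.com/page-a (should be /page-b)
```

In other words, both pages told Google that Page A was the canonical version, even though their content was different.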
As far as over-optimization penalties go, nearly all of Page A's inbound links have anchor text that is only a slight variation of the search term. It is the page on the site that I would have expected to get hit by Penguin. Some other pages have lost a little bit of ranking, but nothing drastic.
I am just surprised that, if this is a Penguin penalty, the page would completely lose ranking on those terms in a single day rather than dropping to maybe the third or fourth page. Do you find that Penguin penalties usually result in lower rankings, or in losing rankings completely?
Either way, I'm going to go in and clean up the link profile, but it would be nice to know how aggressive I should be to try to recover that page.
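As a rough way to prioritize that cleanup, here's a sketch (with made-up anchors and URLs) of tallying how concentrated the anchor text is in a backlink export:

```python
from collections import Counter

def anchor_distribution(links):
    """Share of the link profile held by each (normalized) anchor text."""
    counts = Counter(anchor.lower().strip() for anchor, _ in links)
    total = sum(counts.values())
    return {anchor: n / total for anchor, n in counts.items()}

# Hypothetical backlink export: (anchor text, linking page).
links = [
    ("blue widgets", "http://site-a.example/post"),
    ("blue widgets", "http://site-b.example/resources"),
    ("Blue Widgets", "http://site-c.example/links"),
    ("Acme Co", "http://site-d.example/partners"),
]

dist = anchor_distribution(links)
# Crude rule of thumb: flag any anchor holding over half the profile.
flagged = {a: share for a, share in dist.items() if share > 0.5}
print(flagged)  # {'blue widgets': 0.75}
```

The 50% threshold is arbitrary; the point is just to see which anchors dominate so I know where to start pruning.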
-
I've seen some reports of sites being hit by the Penguin data update ("Penguin 1.1") on Friday night, but I'm not clear on the severity. If it's just one page, though, and it was completely de-indexed, that's pretty unlikely.
It is definitely possible for a bad canonical tag to drop a page from the index. I'm a little confused about what you're saying regarding the two pages, though. Are they both canonicalized to a third page, or to each other? Could you give an example (maybe show us two tags similar to what you have, with the exact details changed)?