How can I recover from a duplicate subdomain penalty?
-
Two and a half weeks ago, my site was slapped with a penalty: 60% of organic traffic disappeared over 2-3 days.
After investigating, we discovered that our site was serving the same content on every subdomain, and Google had somehow found two additional subdomains that it was crawling and indexing. We fixed the issue with 301 redirects to our main site (www) a couple of days after the drop, about two weeks ago.
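For anyone implementing a similar blanket redirect, here is a minimal sketch of one way to do it, assuming an Apache setup with mod_rewrite (example.com stands in for the real domain):

```apache
# Send any request on a non-www host to the canonical www host,
# preserving the path, with a permanent (301) redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The exact rule depends on the server and virtual-host layout, so treat this as a starting point rather than a drop-in fix.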
Our rankings have not recovered, and the subdomains are still indexed per Webmaster Tools. Yesterday we submitted a Reconsideration Request. Will that help? Is there any other way to speed up the process of lifting the penalty?
This is the site: http://goo.gl/3DCbl
Thank you!
-
No recovery yet. Quick update: I put in a reconsideration request, and it was denied with the message "No manual spam actions found."
From WMT: the Total Crawled count on the bad subdomains is steady, and there are still no Removed pages, but the Not Selected count is steadily increasing. In fact, the sum of Indexed and Not Selected is greater than the Total Crawled count. How does that make sense?
Thanks.
-
Oh, if the subdomains showed no pages indexed, and then thousands of indexed pages appeared at the exact time your rankings dropped, then you can definitely assume the two are related.
I didn't realize there was such a clear correlation. The suggestions above still stand; you might want to go one further and simply add a noindex directive right in the robots.txt on those subdomains (make sure it's on the subdomains and not the money site!).
Don't forget that in WMT you can also do a Change of Address under Configuration. You've already completed the first two steps, so you can simply tell Google exactly where the subs have moved.
There's no reason at all why these steps would not prompt Google to de-index the subs. The links, by the way, are simply a 'nudge' to get Google to look at the subdomains again and 'discover' the changes.
-
We'll give the links a shot.
We did consider that the high number of similar static pages might be viewed negatively by Google, but we were ranking very well for many long-tail searches before the drop. In WMT, the subdomains showed no pages indexed until the exact date range when our rankings dropped, at which point they spiked into the tens of thousands.
What do you think is the likelihood that the subdomains are the culprit in this case?
Thanks for all of your help.
-
It's definitely hard to say with that many URLs, but I would point a few at each sub's home page. It could be that those subdomains were cached at such long intervals that Google simply hasn't checked the site again.
Sometimes, adding the sub to WMT, submitting an XML sitemap, waiting until Google acknowledges it (and tells you how many pages are indexed), and then removing the sitemap can help.
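For reference, the sitemap in that step can be bare-bones; a minimal sketch, with placeholder URLs standing in for the subdomain's pages, looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://sub.example.com/</loc>
  </url>
  <url>
    <loc>http://sub.example.com/some-page</loc>
  </url>
</urlset>
```

Submit it for the subdomain property in WMT, wait for an indexed count, then remove it as described above.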
If and when the subdomains are de-indexed (and there's no reason to believe they won't be), watch your positioning for a week or two afterward. If it doesn't change, you have to consider that the drop in positioning may have another cause. For example, the way that each sorting variable for the products lands on its own static page can be viewed as good for SEO, but it is slightly risky, since so many of those pages are nearly duplicates.
-
Thanks Jared. The subdomains are www.ww and www.lnirfrx. We configured all subdomains to 301 to www. We did not receive any messages in WMT, just the sudden drop in rankings.
I'm thinking about putting some links on a forum that I know doesn't use nofollow and is crawled several times a day. But we have tens of thousands of these subdomain pages indexed; will posting a couple of links help? I wouldn't want to post more than that, because it would look spammy.
-
Hi tact - what were your subdomains?
You mentioned that you sent in a Recon. Request - did you receive an unnatural links penalty in WMT?
If you have properly 301'd your subs so that NO subdomain page can be accessed, then simply pointing a few links at the redirect, like Ben said, should help it de-index faster. Make sure, though, that the 301s are properly set up (do a header check), and also make sure that no content from the sub is available unless you are certain that the redirect is applied properly (clear the subdomain of files).
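One way to run that header check, sketched in Python with the standard library (the hostnames below are placeholders, not the real subdomains):

```python
import urllib.error
import urllib.parse
import urllib.request

def is_clean_301(status, location, expected_host="www.example.com"):
    """True when a response is a single permanent redirect onto the main host."""
    if status != 301 or not location:
        return False
    return urllib.parse.urlsplit(location).netloc == expected_host

class _NoRedirect(urllib.request.HTTPRedirectHandler):
    # Returning None stops urllib from following the redirect, so the
    # HTTPError below carries the first hop's status and Location header.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def first_hop(url):
    """Fetch a URL without following redirects; return (status, Location)."""
    opener = urllib.request.build_opener(_NoRedirect)
    try:
        resp = opener.open(url)
        return resp.status, None  # no redirect happened at all
    except urllib.error.HTTPError as err:
        return err.code, err.headers.get("Location")

if __name__ == "__main__":
    for url in ("http://sub1.example.com/", "http://sub2.example.com/some-page"):
        status, location = first_hop(url)
        verdict = "OK" if is_clean_301(status, location) else "CHECK THIS"
        print(url, status, location, verdict)
```

Run it against a handful of deep subdomain URLs, not just the home pages, so you catch any path that slips past the redirect.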
-
Do some guest blogging and point links at your 301s from your guest posts. Google will see that you mean business. You'll have new links and the old pages will be deindexed quicker.
-
I would submit a sitemap and keep moving forward with creating valuable content and sharing it with the right people. It can take Google a long time to get to your message.