My attempt to reduce duplicate content got me slapped with a doorway page penalty. Halp!
-
On Friday, 4/29, we noticed that we had suddenly lost all rankings for all of our keywords, including searches like "bbq guys". That indicated to us that we were being penalized for something. We immediately went through the list of things that had changed recently, and the most obvious was that we were migrating domains.
On Thursday, we turned off one of our older sites, http://www.thegrillstoreandmore.com/, and 301 redirected each page on it to the same page on bbqguys.com. Our intent was to eliminate duplicate content issues.
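For context, a page-to-page 301 like the one described above is typically a single rewrite rule on the old domain's server. This is only a sketch, assuming Apache with mod_rewrite enabled (the domain names come from this question; the actual rule depends on your server setup):

```apache
# In the .htaccess (or vhost config) of thegrillstoreandmore.com:
RewriteEngine On
# Match any hostname variant of the old domain...
RewriteCond %{HTTP_HOST} ^(www\.)?thegrillstoreandmore\.com$ [NC]
# ...and 301 each path to the same path on the new domain.
RewriteRule ^(.*)$ http://www.bbqguys.com/$1 [R=301,L]
```

Because `$1` carries the matched path through, `/gas-grills/weber.html` on the old domain redirects to `/gas-grills/weber.html` on the new one rather than dumping everything on the homepage.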
When we realized that something bad was happening, we immediately turned off the redirects and put thegrillstoreandmore.com back online. That did not lift the penalty on bbqguys.com.
We've been looking for the cause for two days and haven't been able to find what we did wrong, at least not until tonight.
I just logged back in to webmaster tools to do some more digging, and I saw that I had a new message. "Google Webmaster Tools notice of detected doorway pages on http://www.bbqguys.com/"
It is my understanding that doorway pages are pages jammed with keywords and links and devoid of any real content. We don't do those pages. The message does link me to Google's definition of doorway pages, but it does not give me a list of pages on my site that it does not like. If I could even see one or two pages, I could probably figure out what I am doing wrong.
I find this most shocking since we go out of our way to try not to do anything spammy or sneaky. Since we try hard not to do anything that is even grey hat, I have no idea what could possibly have triggered this message and the penalty.
Does anyone know how to go about figuring out what pages specifically are causing the problem so I can change them or take them down?
We are slowly canonicalizing URLs and changing the way different parts of the sites build links so that they are all consistent, and I am aware that these things need work. We were in the process of discontinuing some sites and 301 redirecting their pages to a more centralized location to try to stop the duplicate content.
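To illustrate the kind of URL canonicalization being described, here is a minimal sketch (not our actual code; the tracking-parameter list is a hypothetical example) that collapses common URL variants down to one canonical form:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical set of query parameters to strip; adjust for your site.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}

def canonicalize(url: str) -> str:
    """Normalize a URL to one canonical form: lowercase scheme and
    host, drop the default port, strip tracking parameters, sort the
    remaining query string, and remove trailing slashes (except on
    the root path)."""
    parts = urlsplit(url)
    scheme = parts.scheme.lower()
    netloc = parts.netloc.lower()
    if scheme == "http" and netloc.endswith(":80"):
        netloc = netloc[:-3]
    path = parts.path or "/"
    if path != "/" and path.endswith("/"):
        path = path.rstrip("/")
    query = urlencode(sorted(
        (k, v) for k, v in parse_qsl(parts.query)
        if k not in TRACKING_PARAMS
    ))
    return urlunsplit((scheme, netloc, path, query, ""))

print(canonicalize("HTTP://WWW.Example.com:80/Grills/?utm_source=x&b=2&a=1"))
# → http://www.example.com/Grills?a=1&b=2
```

Every variant that normalizes to the same string can then point its `rel="canonical"` tag (or a 301) at that one URL.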
The day after we instituted the 301 redirects, the site we were redirecting all of the traffic to (the main site) got blacklisted. Because of this, we immediately took down the 301 redirects.
Since the Webmaster Tools notifications are different (i.e., "too many URLs" is a notice-level message and "doorway pages" is a separate alert-level message), and the too-many-URLs notice has been triggering for a while now, I am guessing that the doorway pages problem has nothing to do with URL structure. According to the help files, doorway pages are a content problem with specific pages. The architecture suggestions are helpful, and they reassure us that we should be working on them, but they don't help me solve my immediate problem.
I would really be thankful for any help we could get identifying the pages that Google thinks are "doorway pages", since this is what I am getting immediately and severely penalized for. I want to stop doing whatever it is I am doing wrong, I just don't know what it is! Thanks for any help identifying the problem!
It feels like we got penalized for trying to do what we think Google wants. If we could figure out what a "doorway page" is, and how our 301 redirects triggered Googlebot into saying we have them, we could more appropriately reduce duplicate content.
As it stands now, we are not sure what we did wrong. We know we have duplicate content issues, but we also thought we were following webmaster guidelines on how to reduce the problem and we got nailed almost immediately when we instituted the 301 redirects.
-
The domains in question were all previously owned by me in my webmaster tools account long before this happened. I've since gone and put in an address change request for the site that has the 301s on it to point to the new site.
I'm feeling like I got stuck with a false positive here, but it is taking forever to get re-reviewed. Of course, it is grilling season now, so I'm losing tens of thousands of dollars in revenue per day that we are out of the index.
I realize the answer is probably no, but does anyone have any tips on how to speed up the review process? I could lose a quarter million dollars over the course of a week or two.
-
A doorway page is an old school black hat SEO technique. What webmasters would do is buy domains with high PR or buy expired domains that used to be competitors and then 301 redirect them back to their website. This was in essence buying their links, as the links to the old domains now ended up at their domain.
Are your domains all on the same hosting account or the same server C-block? Are they all registered and verified with Google Webmaster Tools? If not, then Google may see them as being owned by different people. In that case, it would look to them like you just bought a bunch of domains and redirected them all to your domain.
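The "same C-block" check just means the first three octets of the two IPv4 addresses match, i.e. the hosts sit in the same /24 network. A quick sketch (in practice you would first resolve each domain to an IP, e.g. with `socket.gethostbyname`; the addresses below are documentation examples, not real hosts):

```python
def same_c_block(ip_a: str, ip_b: str) -> bool:
    """Return True if two IPv4 addresses share their first three
    octets, i.e. fall in the same /24 ("class C") block."""
    return ip_a.split(".")[:3] == ip_b.split(".")[:3]

print(same_c_block("203.0.113.10", "203.0.113.99"))  # → True
print(same_c_block("203.0.113.10", "198.51.100.7"))  # → False
```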
To you, you were simply finding all the duplicate content out there and consolidating it into one domain the way you think you should. It just didn't look that way to Google. I would recommend claiming and verifying every one of the domains you want to 301 in GWT. Once you have them verified, then redirect them all to your new domain. At that point, file a reconsideration request with Google, explain your situation, show how you have all the domains verified and that they belong to you, and you should end up okay.
My best guess, based on what you're saying, is that Google thought all of your domains were under separate ownership, and seeing them all 301 at once looked like you had just bought a bunch of other domains and redirected them to yours.