A doorway-page vendor has made my SEO life a nightmare! Advice anyone!?
-
Hey Everyone,
So I am the SEO at a mid-sized nationwide retailer and have been working there for almost a year and a half. This retailer is an SEO nightmare. Imagine the worst possible SEO nightmare, and that is my unfortunate yet challenging everyday reality.
In light of the new algorithm update that seems to be on the horizon from Google to further crack down on the usage of doorway pages, I am coming to the Moz community for some desperately needed help.
Before I was employed here, the eCommerce director and SEM Manager connected with a vendor who basically pitched a PPC-style version of SEO for long-tail keywords. This vendor sold them on the idea that their pages would never compete with our own organic content and could bring in incremental traffic and revenue, thanks to all of this wonderful technology they have that is essentially just a scraper.
So for the past three years, this vendor has been creating thousands of doorway pages that are hosted on their own server but are masked as our own pages. They have a massive HTML index/directory attached to our website and even upload their own XML sitemaps to our Google Webmaster Tools. So even though they "own" the pages, the pages masquerade as our own organic content.
So what we have today is thousands upon thousands of product and category pages that are essentially built dynamically and regurgitated through their scraper / platform, whatever.
ALL of these pages are incredibly thin in content and it’s beyond me how Panda has not exterminated them.
ALL of these pages are built entirely for search engines, to the point that you would feel like the year was 1998.
All of these pages are incredibly over-optimized with spam that really is equivalent to just stuffing in a ton of meta keywords. (Like I said - 1998.)
Almost ALL of these scraped doorway pages cause an incredible number of duplicate content issues, even though the "account rep" swears up and down to the SEM Manager (who oversees all paid programs) that they do not.
Many of the pages use other shady tactics, such as meta-refresh-style bait and switch.
For example:
The page title in the SERP shows as: Personalized Watch Boxes
When you click the result and land on the doorway page, the title changes to:
Personalized Wrist Watches. Not one actual watch box is listed.
They are ALL simply the most god-awful pages in terms of UX that you will ever come across, BUT because of the sheer volume of these pages spammed deep within the site, they generate revenue just by playing the odds.
Executives LOVE revenue.
Also, one of this vendor's tactics when our budget for this program is reduced is to randomly pull a certain number of their pages, which then return 404 errors until spend bumps back up. This causes a massive nightmare for me.
I can go on and on but I think you get where I am going.
I have spent a year and a half campaigning to get rid of this black-hat vendor, and I am finally right on the brink of making it happen. The only problem is that it will be almost impossible to avoid a drop in revenue for quite some time once these pages are pulled. Even though I have helped create several organic pages and product categories that will pick up the slack when these are pulled, it will still be a while before the dust settles and things stabilize.
I am going to stop here because I could write a novel about the millions of issues I have with this vendor and what they have done. I know this was a very long and open-ended essay of a problem to present to you guys in the Moz community, and I apologize; I would love to clarify anything I can.
My actual questions would be:
Has anyone gone through a situation similar to this, or do you have experience dealing with a vendor that employs this type of black-hat tactic?
Is there any advice at all that you can offer, or experiences you can share, that can help me be as well-armed as possible when I eventually convince the higher-ups they need to pull the plug?
How can I limit the bleeding, and can I even remotely rely on Google LSI to serve my organic pages for terms related to the pages that will be gone?
Thank you guys so much in advance,
-Ben
-
Glad to help.
-
You are a genius.
-
Glad I could be of some help.
If I were you, I'd definitely grab copies of the pages while they're still live. You could even do this from home using some free tools like
http://phpcrawl.cuab.de/about.html
Add a bit of cURL or wget and you've got the pages plus the links and meta. Then, if they do disappear suddenly and the business is stuck, you can hand this to your web people at Oracle and they'll probably try to hire you. Having said that, I'd imagine they've already got a decent contingency plan because they're Oracle, but you never know. It could save the day.
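For example, a single wget command (one of the tools mentioned above) can mirror a vendor folder for safekeeping. This is just a sketch; the "/vendor-index/" path is a placeholder for the actual subfolder:

# Mirror one vendor subfolder, rewriting links so the copy browses offline.
# "/vendor-index/" is a placeholder - substitute the real directory path.
wget --mirror --page-requisites --adjust-extension --convert-links \
     --no-parent --wait=1 https://www.example.com/vendor-index/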
-
Thanks Jamie!
Yeah, we actually partner with Oracle for our web design, engineering, implementation and so on. So when it comes to server-side issues, we have to go through them, and there is always red tape involved.
I really cannot understand how a vendor that does this is even in business, and it is beyond me how they get away with it. The WordPress 404 plug-in is a great idea though, and that will definitely help me in the future with freelancing while I am here full-time.
-
We do self-canonicalize, and that is a very good question. What they do is just keep spitting out dynamically generated URLs. They have absolutely no restrictions on page quality or content; they literally have no rules. This gives them immense flexibility.
And for the contract portion: once the contract ends, all of these pages will in fact disappear, and that is why they house them on their own servers. So that is what we want in the end.
It is dealing with the massive number of 404s that will be an issue for a while.
-
Thanks again!
Yes, that is the conundrum I am in here when it comes to "who actually owns the pages," and honestly, this vendor covered their bases. They house all of the pages on their own servers, basically scrape our site, and then serve the copies through our CDN via a proxy or something like that. So they made sure we are at their mercy; they can pull the pages anytime they want.
So technically, if we were to redirect all of their pages and acquired links, it would actually not be too hard, because each page is so unbelievably identical to our own organic pages. The problem is, I believe we would have to access their server, and that will not happen.
It would also be one hell of a mess of 301s if we were to do that, and someone I am planning with on our site team fears the length of the 301 chain this would create in our htaccess file.
But we are thinking in the same ballpark as you mentioned - trying to find ways to somehow limit the 404 tsunami this would cause and see if we can "take back" some of the value they took from us in link juice.
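One idea we are exploring (just a sketch, assuming the vendor URLs live in predictable subfolders and we stay on Apache): a few pattern-based rules in htaccess instead of thousands of one-off 301 lines. The folder names below are placeholders:

# One regex rule maps an entire (hypothetical) vendor folder to the
# closest real section, instead of one 301 line per page.
RedirectMatch 301 ^/vendor-index/(.*)$ /products/$1
# Anything without a sensible one-to-one counterpart can go to a category root.
RedirectMatch 301 ^/vendor-index-2/.*$ /products/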
-
Yes, redirections are 100% necessary. I agree wholeheartedly.
-
Surely you can block them once the contract has ended? I don't know how the law works where you are, but in the UK, if you sever a contract you are no longer bound by it. But then again, I'm not a lawyer!!! LOL, I'd be earning twice as much if I were!!! I'd look into this, or get your legal team (assuming you have one) to look into it, for after the contract has ended.
If they're scraping, could you put a canonical tag on your pages to self-canonicalise? Only just thought of this!!! Might help, if you've not already done it.
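For reference, a self-referencing canonical is just one tag in the head of each original page pointing at its own preferred URL (the URL below is only an example):

<!-- In the <head> of the original product page; example URL only. -->
<link rel="canonical" href="https://www.example.com/products/personalized-watch-boxes" />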
-
Hey thanks so much for the response!
And there are no stupid questions!
Before I was hired here, the company was incredibly aggressive with PPC and CSEs and spent exorbitant amounts on paid traffic.
The company literally drove twice as much traffic through paid as through organic. That has changed now, even though we still spend pretty aggressively. We have an excellent SEM Digital Marketing Manager who handles all paid campaigns and affiliate programs, and she is run ragged on a daily basis.
I really do think it would be worth taking a look at how we can compensate with PPC on the black-hat vendor's best-performing URLs. Thank you so much; it is an excellent idea.
To your robot blocking question:
I would love nothing more than to add robots.txt rules that disallow Googlebot from crawling the three subfolders that contain all of their doorway pages. Unfortunately, we entered into a legally binding contract with them, and this would be like an act of war against them. I actually dream about doing this to them every night, so that is an awesome point you bring up!
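If that day ever comes, the block itself would only take a few lines of robots.txt. A sketch, with placeholder names standing in for the vendor's three index subfolders:

User-agent: *
# Placeholder paths - substitute the vendor's actual three index subfolders.
Disallow: /vendor-index-1/
Disallow: /vendor-index-2/
Disallow: /vendor-index-3/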
-
Thanks so much for the response.
Your advice on having a battle plan is perfect, and it is something I have had to try, try, and try at; once I am done trying, I try again to find more creative ways to present SEO needs, site fixes, and strategies.
I even went so far as to show them what their page titles look like in search when they are 90 characters long, and compared them to that shady gas station on an isolated highway, when we could be optimizing the titles, increasing CTR, and adding some schema to product-page SERPs to make them look like Sheetz!
Full PowerPoint pictures of gas stations!
The enigma of pushing SEO when nothing is "guaranteed" but the numbers they are seeing from this black-hat vendor are.
Yesterday, digging deeper and deeper with Screaming Frog, I got into one of this vendor's subfolders, which is a giant index. (They have three of these subfolders that they upload to our site.)
I actually found that they are literally copying our product pages outright and making exact duplicates. They then insert meta spam links on those pages, which ensures that their copies almost always outrank the original content that we have three writers working on.
Unbelievable, I know. So with your awesome advice, and the internal reminder of how much more I need to think outside the box when presenting, I am going to build an entire roster of these plagiarized pages and show them that if all of these copied product pages were removed, our own organic product pages would rank as they are meant to.
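For anyone curious, the roster itself can be pulled together with a short script over a Screaming Frog export, grouping URLs that share an identical title. This is a rough sketch; the file name and the "Address"/"Title 1" column headers are assumptions based on a typical Internal > HTML CSV export and may differ by version:

import csv
from collections import defaultdict

# Group crawled URLs by page title to surface the exact copies.
# The file name and column names are assumptions - adjust to your export.
pages_by_title = defaultdict(list)

with open("internal_html.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        title = row.get("Title 1", "").strip().lower()
        if title:
            pages_by_title[title].append(row["Address"])

# Any title attached to more than one URL is a candidate for the roster.
for title, urls in sorted(pages_by_title.items(), key=lambda kv: -len(kv[1])):
    if len(urls) > 1:
        print(f"{len(urls)} copies: {title}")
        for url in urls:
            print(f"  {url}")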
I cannot believe vendors can still get away with this. No one monitored them or had any idea what they were doing until I was hired; it is just beyond belief.
Thank you so much for the advice and inspiration.
-
Nicely put, Amelia!
PPC would definitely be a great alternative to make up any losses from organic search. And PPC Hero is indeed a great resource, as is the AdWords Help Center.
From a technical standpoint, one would still want all of those crappy vendor pages redirected somewhere, which would be a pain to do manually, but a necessary pain. If not, they would be sending a huge number of 404 errors, and that's not going to be a good sign for Google. The pages are already indexed, since they are getting traffic, and you'd want to send that traffic, and any links associated with each page, somewhere - ideally, in your situation, to a much better (and relevant) page from a user and search engine perspective.
-
What a pig-awful situation to be in. I feel for you.
The previous poster has some great suggestions which I would follow.
May I also suggest that you start a PPC campaign of your own to 'pick up the slack' as you put it? Assuming the budget previously allocated to this vendor would cover it? If the vendor was using PPC as the revenue driver to these horrible UX pages, imagine how much better the conversion would be from one of your 'good' pages?
If you've never used AdWords before, then I would look at the AdWords education center for a bit (sorry, I can't remember what it's called). A good site I used when first starting out learning AdWords is PPC Hero - they had some good tips a few years ago, and I have no reason to believe they've gone downhill! I think (and hope I don't inadvertently offend anyone here, but it's my experience) that if you can do SEO, then running PPC (though time-consuming) should be easy enough for you to get your head around.
I don't know if this is a stupid suggestion or not, as I'm not very technical (I rely on the brilliant developers in my team), but could the vendor's dodgy pages be disallowed in your robots file? Could you also remove them from the index via Webmaster Tools (especially if the pages are just PPC landing pages and not built for organic search, which I understand is the case from your post)? Like I say, this may be a stupid suggestion... Please go easy on me if it is!!!
Good luck - and remember, 'what doesn't kill us, makes us stronger'. I bet you're a much better SEO now than you were a year and a half ago!
-
I feel like your best plan of attack will be two-sided:
1.) Education - This is a definite struggle, but helping your higher-ups really understand WHY these practices are an issue and how they could, and eventually will, impact the bottom line might resonate more than just saying there are issues present (which I am sure you have been doing anyway). Perhaps reiterating the amount of revenue that comes from natural search, and how much would be lost if the site were penalized, would paint a clearer picture. Having data to support your arguments is always helpful. Maybe you can even do some research and present a few summarized case studies of other sites that have been penalized and how it impacted their natural search metrics.
2.) Plan - Have a plan of attack ready. OK, so you get rid of these pages... now what? Prepare a very clear, step-by-step plan covering what changes need to be made, what those changes will accomplish and which issues they will address, how you will make them and how long they will take, and what the expected outcome is. This will help them better understand the process and how it will help save, and possibly even improve, revenue.
Hope this is helpful - good luck!