Best way to remove spammy landing pages?
-
Hey Mozzers,
We recently took over a website for a new client of ours and discovered that their previous webmaster had been using a WordPress plugin to generate 5,000+ mostly duplicated local landing pages. The pages are set up more or less as "Best (service) provided in (city)".
I checked Google Webmaster Tools and it looks like Google is ignoring most of these spammy pages already (about 30 pages out of nearly 6,000 are indexed), but it's not reporting any manual webspam actions.
Should we just delete the landing pages all at once or phase them out a few (hundred) at a time?
Even though the landing pages are mostly garbage, I worry that lopping off over 95% of a site's pages in one fell swoop could have other significant consequences.
Thanks!
-
Hi Brian,
Good for you for discovering these. The process I would recommend would look like this:
- Create a strategy for launching a set of new, excellent pages that cover the basics without trying to cover every possible service/city combination, as these duplicate/thin pages do.
- Launch your new pages.
- Delete the old ones and say, "good riddance!"
I would fold the deletion of these pages into any other design and content changes you have planned for the site. Putting 301 redirects in place for all of the old URLs will also minimize the 404s that deleting them would otherwise create. Crawling software like Screaming Frog or Xenu can help you spot the internal links still pointing at these pages.
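As a sketch of the redirect step: if the old URLs follow a predictable pattern, you can generate the 301 rules in bulk rather than by hand. The URL pattern (`/best-{service}-provided-in-{city}/`) and the `/services/{service}/` targets below are hypothetical examples based on the page template described above, not details from this site:

```python
# Sketch: collapse each old city/service landing-page URL down to a single
# high-quality service page and emit Apache "Redirect 301" rules.
# URL pattern and targets are assumed for illustration.

old_urls = [
    "/best-plumbing-provided-in-springfield/",
    "/best-plumbing-provided-in-shelbyville/",
    "/best-roofing-provided-in-springfield/",
]

def new_target(old_url: str) -> str:
    """Map '/best-{service}-provided-in-{city}/' to '/services/{service}/'."""
    slug = old_url.strip("/")                      # best-plumbing-provided-in-springfield
    service = slug.removeprefix("best-").split("-provided-in-")[0]
    return f"/services/{service}/"

rules = [f"Redirect 301 {old} {new_target(old)}" for old in old_urls]
print("\n".join(rules))
```

The generated lines can be pasted into an `.htaccess` file or server config; the point is that thousands of thin pages usually share one template, so one mapping function covers them all.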
Ultimately these pages are going to go away, whether through redesign or outright deletion, so don't let the rate at which you delete them hold up other decisions as you move forward in your work on the site.
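If you do crawl the site as suggested above, sifting the crawler's inlinks export is quick to script. This sketch assumes a CSV with `Source` and `Destination` columns (the shape of Screaming Frog's "All Inlinks" export); the `-provided-in-` marker and the sample data are invented for illustration, so check your own export's columns and URL pattern:

```python
import csv
import io
from collections import defaultdict

def find_internal_linkers(csv_text: str, spam_marker: str = "-provided-in-"):
    """Group the pages that still link internally to the old landing pages,
    keyed by the spam URL they point at."""
    linkers = defaultdict(set)
    for row in csv.DictReader(io.StringIO(csv_text)):
        if spam_marker in row["Destination"]:
            linkers[row["Destination"]].add(row["Source"])
    return linkers

# Hypothetical export data standing in for a real crawl file.
sample = """Source,Destination
/blog/post-1/,/best-plumbing-provided-in-springfield/
/about/,/contact/
/blog/post-2/,/best-plumbing-provided-in-springfield/
"""

for dest, sources in find_internal_linkers(sample).items():
    print(f"{dest} <- {len(sources)} internal link(s)")
```

Pages that surface here are the ones whose templates or body copy need updating before (or alongside) the deletion, so visitors aren't funneled into redirects.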