Remove URLs from App
-
Hi all, our tech team inherited a bit of an SEO pickle. I manage a freemium React JS app built for 80k unique markets worldwide, each with its own dedicated URL. Ex/ https://www.airdna.co/vacation-rental-data/app/us/california/santa-monica/overview
Mistake - The app, in its entirety, was indexed by Google in July 2018, which basically resulted in duplicate-content penalties because the unique on-page content wasn't readable to crawlers.
Partial Solution - We noindexed all app pages until we were able to implement a "pre-render" / HTML-readable solution with associated dynamic meta data for the Overview page in each market. We are now selectively reindexing only the free "Overview" pages that have unique data (with a nofollow on all other page links), but want to keep a noindex on all other pages because the data is not uniquely "readable" before subscribing. We have the technical server-side rules in place and working to ensure this selective indexing.
Question - How can we force Google to abandon the >300k cached URLs from the summer's failed deploy? Ex/ https://screencast.com/t/xPLR78IbOEao would lead you to a live URL such as this, which has limited value to the user: https://www.airdna.co/vacation-rental-data/app/us/arizona/phoenix/revenue (note that Google's cached SERPs also show an old URL structure, which we have since 301ed, because we also updated the page structure in October). Those pages are currently noindexed and will remain so for the foreseeable future. Our sitemap and robots.txt file are up to date, but the old Search Console only offers temporary removals on a one-by-one basis. Is there a way to write a rule-based page removal? Or do we simply render these pages in HTML and remove the nofollow on those links from the Overview page, so a bot can reach them, see the noindex, and remove them from the SERPs?
Thanks for your help and advice!
-
So, you basically can't 'force' Google to do anything, but there may be better ways to encourage them to remove these URLs.
The only way to force Google to remove a URL is to use the URL removal tool in Google Search Console, but this only removes a page temporarily and it's a pain for en-masse submissions. As such, it's not my recommendation.
One thing to keep in mind: you have loads of pages with noindex directives on them, but Google is also blocked from crawling those pages via robots.txt. If Google can't crawl the URLs, how can it find the noindex directives you have given? Robots.txt should be used for this, but your order of deployment is off; it's too early. You should put it on at the very, very end, when Google has 'gotten the message' and de-indexed most of the URLs (makes sense, yes?)
My steps would be:
- Noindex all these URLs, either with the HTML meta robots tag or with an X-Robots-Tag (HTTP header) deployment; there are multiple meta robots deployment options if editing the page code is going to be difficult (see the header sketch after this list)
- Also deploy noarchive in the same way to stop Google caching the URLs, and nosnippet to remove the snippets from Google's results for these pages, which will make them less valuable to Google in terms of ranking them
- For the URLs that you don't want indexed, make the page or screen obviously render content saying the page is not available right now (see the render sketch after this list). This one might be tricky for you, as you can't do it just for Googlebot; that would be considered cloaking under some circumstances
- On the pages which you have noindexed, serve status code 404 to Google only (if it's just a status code, it's not considered cloaking). So for the Googlebot user agent, make the HTTP response on those URLs a 404 (not found; temporarily gone but may come back); see the user-agent sketch after this list. Remember to leave the actual, physical contents of the page the same for both Googlebot and users, though
- If that doesn't work, swap out the 404 (sent only to Googlebot) for a 410 (status code: gone, not coming back) to be more aggressive. Note that it will then be harder to get Google to re-index these URLs later. Not impossible, but harder (so don't open with this)
- Once most URLs have been de-indexed and de-cached by Google, put the robots.txt rule(s) back on to stop Google crawling these URLs again (see the robots.txt example after this list)
- Reverse all of these changes once you want the pages to rank (correct the page contents, remove the nosnippet, noarchive and noindex directives, correct the status code, lift the robots.txt rules, etc.)
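To make the first two steps concrete, here's a minimal sketch of the X-Robots-Tag (header) deployment. It assumes something like an Express/Node layer in front of the app; the path matcher is purely illustrative and would need adapting to your real URL schema:

```javascript
// Sketch only: send noindex/noarchive/nosnippet on every app URL except the
// free Overview pages. Assumes an Express-style server; adjust the matchers.
const express = require('express');
const app = express();

const APP_PREFIX = /^\/vacation-rental-data\/app\//;
const IS_OVERVIEW = /\/overview\/?$/;

app.use((req, res, next) => {
  if (APP_PREFIX.test(req.path) && !IS_OVERVIEW.test(req.path)) {
    // noindex = drop from the index, noarchive = no cached copy,
    // nosnippet = no text snippet for anything that still shows up
    res.set('X-Robots-Tag', 'noindex, noarchive, nosnippet');
  }
  next();
});
```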
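For the third step, the "not available" notice has to render for every visitor, not just Googlebot. A rough React-style sketch (the component and prop names are made up for illustration):

```javascript
// Sketch only: render an explicit "not available" notice on the pages you want
// de-indexed, for all visitors (showing it only to Googlebot risks cloaking).
function MarketDetailPage({ isFreeOverview, children }) {
  if (!isFreeOverview) {
    return (
      <main>
        <h1>This page is not available right now</h1>
        <p>Detailed market data is temporarily unavailable at this URL.</p>
      </main>
    );
  }
  return <main>{children}</main>;
}
```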
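For steps 4 and 5, a sketch of serving the status code to Googlebot only while keeping the page body identical for everyone (again assuming an Express-style server; the user-agent test below is a simple illustration, not robust bot detection):

```javascript
// Sketch only: return 404 (later 410) to Googlebot on the de-indexed URLs,
// while the response body stays exactly what normal users receive.
app.use((req, res, next) => {
  const isGooglebot = /Googlebot/i.test(req.get('User-Agent') || '');
  const isAppUrl = /^\/vacation-rental-data\/app\//.test(req.path);
  const isOverview = /\/overview\/?$/.test(req.path);

  if (isGooglebot && isAppUrl && !isOverview) {
    // Start with 404 ("not found, may come back"); swap in res.status(410)
    // later only if the index isn't clearing fast enough.
    res.status(404);
  }
  next(); // downstream handlers render the same contents as for users
});
```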
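And for step 6, once the URLs have dropped out of the index, the re-applied robots.txt rules might look something like this (paths are illustrative; Google honours wildcards and applies the most specific matching rule, so the Allow line keeps the free Overview pages crawlable):

```
# Re-apply only AFTER Google has de-indexed and de-cached the unwanted URLs
User-agent: *
Allow: /vacation-rental-data/app/*/overview
Disallow: /vacation-rental-data/app/
```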
Most of this hinges on Google agreeing with and following 'directives'. These aren't hard orders, but the status code alterations in particular should be considered much harder signals.
Hope that helps