Old URLs Appearing in SERPs
-
Thirteen months ago we removed a large number of non-corporate URLs from our web server. We created 301 redirects, and in some cases we simply removed the content because there was no place to redirect to.
Unfortunately, all these pages still appear in Google's SERPs (though not Bing's), both the 301'd pages and the pages we removed without redirecting. When you click a redirected page in the SERPs, you do get redirected, so we have ruled out any problems with the 301s.
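For reference, the redirects are ordinary 301s; a minimal sketch of the kind of Apache rule in place (the paths here are made up for illustration, not our real ones):

  # .htaccess (mod_alias): permanently forward an old URL to its replacement
  # (example paths only)
  Redirect 301 /old-page/ https://totheweb.com/new-page/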
We have already resubmitted our XML sitemap, and when we run a crawl using Screaming Frog we do not see any of these old pages being linked to anywhere on our domain.
We have a few different approaches we're considering to get Google to remove these pages from the SERPs and would welcome your input.
- Remove the 301 redirect entirely so that visits to those pages return a 404 (much easier) or a 410 (which would require some setup/configuration via WordPress). This of course means that anyone visiting those URLs won't be forwarded along, but Google may not drop those pages from the SERPs otherwise. (See the sketch after this list.)
- Request that Google temporarily block those pages (done via GWMT), which lasts for 90 days.
- Update robots.txt to block access to the redirecting directories (also sketched below).
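For options 1 and 3, a minimal sketch of what we'd put in place, assuming Apache with .htaccess (the /eichler/ directory is ours; treat the exact rules as illustrative):

  # .htaccess: return 410 Gone for the removed directory (replacing the 301s)
  RewriteEngine On
  RewriteRule ^eichler/ - [G,L]

  # robots.txt: stop crawling of the redirecting directory
  # (note this blocks crawling, not indexing)
  User-agent: *
  Disallow: /eichler/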
Thank you.
Rosemary
One year ago I removed a whole lot of junk that was on my web server, but it is still appearing in the SERPs.
-
You're right - I'm worrying about something that isn't yet a problem.
Thank you
-
In my experience, the best way to absolutely get rid of them is to use the 410 (Gone) status code, then resubmit the URLs for crawling (possibly via an XML sitemap submission; you can also use Google's crawl testing tool in Search Console to double-check). That said, even with a 410, Google can take their time.
The other option is to recreate pages at those URLs that return a 200 status and use the meta robots noindex tag to specifically exclude them (sketched below). The temporary block in Google Search Console can work too, but it's temporary, and I can't say whether it will actually change how long the redirected pages appear in the index via the site: command.
All that said, if the pages only show via a site: command, there's almost no chance anyone will see them.
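To make those concrete: the noindex route means each recreated 200 page carries this tag in its <head>:

  <meta name="robots" content="noindex">

And the resubmission can be a bare-bones XML sitemap listing just the dead URLs, so Google recrawls them and sees the 410 (the URL below is a placeholder):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url><loc>https://totheweb.com/eichler/example-removed-page/</loc></url>
  </urlset>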
-
OK, Rand - one last question.
I do think one year is a long time for old results to linger. If you were going to run a test to get Google to stop showing them in its SERPs, what would you do? Let's say a client asked you to make these URLs disappear.
The 79 pages that appear in the /eichler/ directory are from a personal site, so I don't care what happens to those pages in the SERPs.
My ideas are:
- Remove the 301 redirect entirely so that visits to those pages return a 404 (much easier) or a 410 (would require some setup/configuration via WordPress). This of course means that anyone visiting those URLs won't be forwarded along, but Google may not drop those redirects from the SERPs otherwise.
- Request that Google temporarily block those pages (done via GWMT), which lasts for 90 days.
- Update robots.txt to block access to the redirecting directories.
-
14 months! Wow. That is a long time indeed. Although, now that I look, Moz redirected OpenSiteExplorer just about a year ago, and we still have URLs showing for the site: command in Google too (https://www.google.com/search?q=site%3Aopensiteexplorer.org) so I suppose it's not that uncommon.
Glad to hear traffic and rankings are solid. Let us know if we can help out in the future!
-
Thank you, Rand. It has been 14 months since these pages were moved, and I'd never seen Google retain pages anywhere near this long.
You're right of course, there has been no impact to traffic for our site as these pages weren't about our search business.
Thanks for taking a look at our issue.
Rosemary
-
Oh gosh - it's my pleasure! Thanks for being part of the Moz community.
I'm honored to help out.
As for the URLs - looks like everything's fine. Google often maintains old URLs in a searchable index form long after they've been 301'd, but for every query I tried, they're clearly pulling up the correct/new version of the page, so those redirects seem to be working just great. You're simply seeing the vestigial remnants of them still in Google (which isn't unusual - we had URLs from seomoz.org findable via site: queries for many months after moving to Moz, but the right, new pages were all ranking for normal queries and traffic wasn't being hurt).
Some examples:
- https://www.google.com/search?q=Enter+the+World+of+Eichler+Design
- https://www.google.com/search?q=Eichler+History+flashbacks
- https://www.google.com/search?q=eichler+resources+on+the+web+books
Unless you're also seeing a loss in search traffic/rankings, I wouldn't sweat it much. They'll disappear eventually from the site: query, too. It just takes a while.
-
Wow - do I ever feel privileged to have you respond! Thank you Rand.
You can see a batch of redirected URLs here < site:totheweb.com eichler >
I appreciate any suggestions.
Rosemary
-
Hi Rosemary - can you share some examples of the URLs and the queries that bring them up in search results? If so, we can likely do a diagnosis of what might be going on with Google and why the pages aren't correctly showing the redirected-to URLs.
Related Questions
-
Google is indexing bad URLs
Hi All, The site I am working on is built on WordPress. The plugin Revolution Slider was installed; while no longer utilized, it still remained on the site for some time. This plugin began creating hundreds of URLs containing nothing but code on the page, and I noticed these URLs were being indexed by Google. The URLs follow the structure: www.mysite.com/wp-content/uploads/revslider/templates/this-part-changes/
I have done the following to prevent these URLs from being created and indexed:
1. Added a directive in my .htaccess to 404 all of these URLs
2. Blocked /wp-content/uploads/revslider/ in my robots.txt
3. Manually de-indexed each URL using the GSC tool
4. Deleted the plugin
However, new URLs still appear in Google's index, despite being blocked by robots.txt and resolving to a 404. Can anyone suggest any next steps? Thanks!
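A sketch of the kind of rule step 1 describes, assuming Apache (the directory is from the question; the exact pattern is illustrative):

  # .htaccess: return 404 for anything under the orphaned plugin directory
  RewriteEngine On
  RewriteRule ^wp-content/uploads/revslider/ - [R=404,L]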
Technical SEO | Tom3_150
-
How do I deindex URL parameters?
Google indexed a bunch of our URL parameters. I'm worried about duplicate content. I used the URL parameter tool in Webmaster Tools to set it so future parameters don't get indexed. What can I do to remove the ones that have already been indexed? For example, site.com/products and site.com/products?campaign=email have both been indexed as separate pages even though they are the same page. If I use a noindex I'm worried about de-indexing the product page. What can I do to deindex just the URL parameter version? Thank you!
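One common approach (not mentioned in the question itself): a rel=canonical pointing the parameterized version at the clean URL, for example:

  <!-- in the <head> of the page served at site.com/products?campaign=email -->
  <link rel="canonical" href="https://site.com/products">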
Technical SEO | BT20090
-
Tool to Generate All the URLs on a Domain
Hi all, I've been using xml-sitemaps.com for a while to generate a list of all the URLs that exist on a domain. However, this tool only works for websites with under 500 URLs, and the paid tool doesn't offer what we are looking for either. I'm hoping someone can help with a recommendation. We're looking for a tool that can:
- Crawl, and list, all the indexed URLs on a domain, including .pdf and .doc files (ideally in a .xls or .txt file)
- Crawl multiple domains with unlimited URLs (we have 5 websites with 500+ URLs on them)
Seems pretty simple, but we haven't been able to find something that isn't tailored toward management of a single domain or that can crawl a huge volume of content.
Technical SEO | timfrick0
-
Landing Page URL Structure
We are finally setting up landing pages to support our PPC campaigns. There has been some debate internally about the URL structure. Originally we were planning on URLs like domain.com/california, domain.com/florida, domain.com/ny. I would prefer to have the URLs for each state inside a "state" folder, like domain.com/state/california, domain.com/state/florida, domain.com/state/ny. I like having the folders and pages for each state under a parent folder to keep the root folder as clean as possible; having a folder or file for each state in the root will be very messy. Before you scream URL rewriting :-) - our current site is still running under Classic ASP, which doesn't support URL rewriting. We have tried to use HeliconTech's ISAPI rewrite module for IIS but had to remove it because of too many configuration issues. Next year, when our recoding to MVC is complete, we will use URL rewriting. So the question for now: is there any advantage or disadvantage to one URL structure over the other?
Technical SEO | briankb0
-
Does it really matter to maintain 301 redirects after de-indexing of old URLs?
Today I was reading the latest post on the SEOmoz blog, "Uncrawled 301s - A Quick Fix for When Relaunches Go Too Well." It's a very interesting study of 301s and how useful they are for maintaining traffic. I'm working on an eCommerce website and have done similar stuff on my site, but I have big confusion about managing the 301 redirects. My website generates new URLs due to the following actions:
1. Re-writing dynamic URLs.
2. Re-launching the entire website on a different eCommerce platform [osCommerce to Magento Commerce].
3. Re-naming a category.
4. Transferring a product from one category to another.
I'm managing my 301 redirects the old way: pulling data from Google Webmaster Tools into an Excel sheet and setting specific new URLs to redirect to. Now I have 8.5K redirects in .htaccess, and I'm thinking that's too much. Can we remove old 301 redirects from .htaccess or not? This is a big question for me, because not all of the old pages are linked from external websites. Google has just de-indexed the old URLs and indexed the new ones. So, is it necessary to maintain the 301 redirects after Google has processed them?
Technical SEO | CommercePundit0
-
Cyrillic letter in URL - Encoding
Hi all, we are launching our site in Russia. As far as I can see by searching Google, all sites have URLs in Latin letters. Is there a special reason for this? It seems that Cyrillic letters also work. My technical staff say it might cause some encoding problems. Can anyone give me some insight into this? Thanks in advance. / Kenneth
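For what it's worth, a Cyrillic path is legal but gets percent-encoded as UTF-8 on the wire, which is the encoding issue usually meant; a hypothetical example:

  /контакты  →  /%D0%BA%D0%BE%D0%BD%D1%82%D0%B0%D0%BA%D1%82%D1%8B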
Technical SEO | Kennethskonto0
-
Drupal URL Aliases vs 301 Redirects + Do URL Aliases create duplicates?
Hi all! I have just begun work on a Drupal site which heavily uses the URL Aliases feature, and I fear it is creating duplicate links. For example, we have both http://www.URL.com/index.php and http://www.URL.com/. In addition, we are about to switch a lot of links and want to keep the search engine benefit. Am I right in thinking URL aliases change the URL while leaving the old URL live, without creating search-engine-friendly redirects such as 301s? Thanks for any help! Christian
Technical SEO | ChristianMKTG0
-
Products with discrete URLs for each color
Here is the issue: I have an ecommerce site whose category pages show each individual color of each product sold, and there is a distinct URL for each color. Each product page shares the same content, with the only potentially differentiating factor being customer reviews (not nearly enough of these to differentiate anything). So we have URLs like www.domain.com/product-green, www.domain.com/product-yellow, www.domain.com/product-red, and so on. I am looking for a way to consolidate these URLs while still showing all colors on the category page. The first solution I am considering is using the hash tag, so we would create www.domain.com/product#green, www.domain.com/product#yellow, www.domain.com/product#red; if possible, I would set the canonical tag as www.domain.com/product. The second solution would be to use the canonical tag and keep the URLs as is. The issue I see here is that we would need to create www.domain.com/product and show that page somewhere, since www.domain.com/product would be the URL that the above color URLs would canonicalize to. What would be the preferred solution? Or is there something else?
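For the second solution, the tag in question would look something like this on each color page (domain.com and the product paths are the question's placeholders):

  <!-- in the <head> of www.domain.com/product-green, -yellow, -red -->
  <link rel="canonical" href="https://www.domain.com/product">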
Technical SEO | rakesh_patel0