How to Get Google to Recognize Your Pages Are Gone
-
Here's some quick background on the site and the issue. The site lost half of its traffic over 18 months ago, and it's believed to be a Panda penalty. Many, many items have already been taken care of and crossed off the list, but here's something that was recently brought up.
There are 30,000 pages indexed in Google, but there are only about 12,000 active products. Many of the pages in the index are out-of-stock items. A site visitor cannot find them by browsing the site unless he/she had bookmarked an item before, was given the link by a friend, read about it, etc. Visitors who reach an old product through such a link see an out-of-stock graphic and are not allowed to make the purchase.
So, about a month ago, efforts were made to 301 old products to something similar where possible, or 410 them otherwise. Google has not been removing them from the index. My question is: how do I make sure Google sees that these pages are no longer there and removes them from the index? Some of the items have links pointing to them, which will help Google find them, but what about the items with zero external or internal links?
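To illustrate the 301-vs-410 decision described above, here's a minimal sketch. The `Product` fields and `choose_response()` helper are hypothetical names for illustration, not part of our actual platform:

```python
# Hypothetical sketch of the 301-vs-410 logic: out-of-stock products
# redirect to a similar item when one exists, otherwise return 410 Gone.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Product:
    url: str
    in_stock: bool
    similar_url: Optional[str] = None  # closest active replacement, if any

def choose_response(product: Product) -> Tuple[int, Optional[str]]:
    """Return (status_code, redirect_target) for a product URL."""
    if product.in_stock:
        return 200, None          # active product serves normally
    if product.similar_url:
        return 301, product.similar_url  # permanent redirect to similar item
    return 410, None              # no replacement: tell crawlers it's gone
```

For example, `choose_response(Product("/p/old-widget", False, "/p/new-widget"))` yields `(301, "/p/new-widget")`, while a discontinued item with no replacement yields `(410, None)`.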
Thanks in advance for your assistance.
-
Sure, I can see the issues there. Having a look at the sitemap and submitting that would be my best guess.
-
No problem with stating the obvious... A fetch within GWT was done, but Google would start from the homepage and work its way down, from what I understand. How would they crawl these 'dead' pages which have been 301'd and 410'd?
-
Hi again
Okay - thanks for the clarification.
Now, I have never used this tool myself, but you could try the Remove Outdated Content tool in Google Webmaster Tools. The reason for that disclaimer is that I don't know how long it takes for content to get removed, and I want you to make sure this is a step you want to take, especially for thousands of pages.
Otherwise, your best bet is to just hang tight, regenerate your sitemap, reupload it to WMT, and let the crawls take their course. Good luck!
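Regenerating the sitemap so it contains only active products can be as simple as a small script. A minimal sketch, with made-up URLs and a made-up product list standing in for the real catalog:

```python
# Minimal sketch: rebuild sitemap.xml with only in-stock products,
# so 301'd/410'd URLs never reappear in the submitted sitemap.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Serialize an iterable of URLs into sitemap-protocol XML."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = u
    return ET.tostring(urlset, encoding="unicode")

# Illustrative catalog; in practice this comes from the product database.
products = [
    {"url": "https://example.com/p/widget", "in_stock": True},
    {"url": "https://example.com/p/gadget", "in_stock": False},  # 301'd or 410'd
]
xml = build_sitemap(p["url"] for p in products if p["in_stock"])
```

The resulting `xml` string includes the active widget URL and omits the out-of-stock gadget, which is exactly what you want WMT to see on the next upload.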
-
Hi Patrick,
The sitemap only shows the active products; therefore, the older, out-of-stock items are not in there (definitely a good thing to check).
If you try to go to one of these pages, the response header does show a 301 or 410, respectively. But does Google recrawl all of the pages in its index? How will it see that these pages are gone if there are no links to many of them?
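For spot-checking what a crawler would see on those old URLs, here's a rough stdlib-only sketch. The `classify()` helper and its wording are my own illustration of the usual interpretation (301 consolidates to the target; 410 tends to drop out somewhat faster than 404), not anything Google publishes as an API:

```python
# Rough sketch: fetch a URL without following redirects and map the
# raw status code to the expected indexing outcome.
import urllib.error
import urllib.request

def classify(status: int) -> str:
    """Map an HTTP status to the expected indexing outcome."""
    if status in (301, 308):
        return "redirect: signals consolidate to the target over time"
    if status in (404, 410):
        return "gone: 410 is typically dropped a bit faster than 404"
    if status == 200:
        return "live: page stays in the index"
    return "other: check server config"

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args, **kwargs):
        return None  # surface the 301/302 instead of silently following it

def check(url: str) -> int:
    """Return the raw status code for url, without following redirects."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        return opener.open(url, timeout=10).getcode()
    except urllib.error.HTTPError as e:
        return e.code  # 301/410/etc. arrive here because we refuse to redirect
```

Running `classify(check("https://example.com/old-product"))` over a batch of old product URLs is a quick way to confirm the server really is returning what you think it is.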
All product descriptions are unique, but unfortunately, a large site scraped them for a few years and only recently stopped. That's another big piece of the puzzle, as Google gave them credit when, in fact, the content was copied from the penalized site.
-
I am just stating the obvious, but have you used Google Webmaster Tools and requested reindexing? Do you want to give a visitor a custom 404, or an image of the sold-out product with newer, in-stock alternatives?
That could be an elegant solution, though it could be a technical challenge.
-
Hi there
Did you make sure to remove those pages from your sitemap.xml? It sometimes takes Google a while to see that pages are gone. I still have issues with that as well, but it's just the name of the game.
I would also check your internal links to make sure all links that point to those pages are pointing to their new locations. I would also check your backlink profile to see if any good links are out there that point to those old pages. Reach out and have those corrected.
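A simple way to do that internal-link check is to audit every internal href against the redirect map, so crawlers aren't sent through 301 hops or into 410'd pages. A hypothetical sketch, with illustrative URLs rather than the actual site's:

```python
# Hypothetical sketch: partition internal links into those that need
# updating (target was 301'd), removing (target was 410'd), or are fine.
redirects = {
    "/p/old-widget": "/p/new-widget",  # 301'd: link should point at the target
    "/p/retired-gadget": None,         # 410'd: link should be removed entirely
}

def audit_links(hrefs):
    """Return (update_to, remove, ok) buckets for a list of internal hrefs."""
    update, remove, ok = {}, [], []
    for href in hrefs:
        if href in redirects:
            target = redirects[href]
            if target is None:
                remove.append(href)     # points at a gone page
            else:
                update[href] = target   # points at a redirected page
        else:
            ok.append(href)             # no action needed
    return update, remove, ok
```

Feeding it the hrefs extracted from a crawled page tells you exactly which links to fix, which is the same exercise worth repeating against the backlink profile for outreach.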
Also check the product descriptions on those new pages to make sure they are robust and unique to the product.
Hope this helps - let me know if you have any more questions.