Hundreds of thousands of 404s on expired listings - what should we do?
-
Hey guys,
We have a conundrum with a large e-commerce site we operate. Classified listings older than 45 days are throwing up 404s - hundreds of thousands, maybe millions (note that Webmaster Tools tops out at reporting 100,000).
Many of these listings receive links.
Classified listings that are less than 45 days show other possible products to buy based on an algorithm.
Google can't reach expired listing pages by crawling our site; they are indexed only because they were crawled before they expired, which means many of them still show in search results.
-> My thought at this stage, for usability reasons, is to replace the 404s with content - other product suggestions - and add a meta noindex to protect our crawl equity and get the pages we really want indexed prioritised.
-> Another consideration is to 301 each expired listing to its place in the category hierarchy to pass any link juice. But since many of these listings are still findable in Google, we feel that landing users somewhere other than the listing they clicked is not a great user experience.
-> Or shall we just leave them as 404s? Google sort of says that's OK.
Very curious on your opinions, and how you would handle this.
Cheers,
Croozie.
P.S. I have read other Q&As regarding this, but given our large volumes and situation, I thought it was worth asking, as I'm not satisfied that the solutions offered would match our needs.
-
Wow! Thanks Ryan.
I'm sure it won't surprise you to know that I'm always reading eagerly when I see you respond to a question as well.
-
Thanks Ian, good to know
Again, good confirmation.
-
Hi Sha,
Spot on. Yes, that was my original thinking; then I switched to the school of 200s with meta noindex. But having you guys confirm this makes me realise that 301ing to the parent category is most certainly the way to go.
Permanently redirecting will have the added benefit of effectively de-indexing the original classifieds and, of course, throwing a ton of link juice over to the category levels.
What a wonderful, helpful community!
Many thanks,
Croozie.
-
Sha, your responses are consistently outstanding - actionable, full of great ideas, and clearly backed by a lot of experience. They offer so much value.
-
Hi Croozie,
Awesome work once again from Ryan!
Since your question feels like a request for suggestions on "how" to create a solution, just wanted to add the following.
When you say "classified listings" I hear "one-off content: here for a while, gone in 45 days".
If that is the case, then no individual expired listing will ever be matched identically with another (unless it happens to be a complete duplicate of the original listing).
This would mean it would certainly be relevant to send any expired listing to a higher-order category page. If your site structure has a clear hierarchy, this is very easy to do.
For example:
If your listing URL is something like http://www.mysite.com/listings/home/furniture/couches/couch-i-hate.php, you can use URL rewrites to strip out the file name and 301 the listing to http://www.mysite.com/listings/home/furniture/couches/, which in most cases will offer a perfectly suitable alternative for the user.
There is another alternative you could consider if you have a search program built in - you could send the traffic to a relevant search. In the above example, mysite.com/search.php?s=couch.
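To make the rewrite concrete, here's a minimal sketch (the URL and function name are illustrations, not your actual setup) of deriving the 301 target by stripping the file name from an expired listing URL:

```python
from urllib.parse import urlsplit, urlunsplit

def category_redirect_target(listing_url):
    """Strip the file name from an expired listing URL so the 301
    can point at the parent category page, as suggested above."""
    scheme, netloc, path, query, fragment = urlsplit(listing_url)
    # Drop the last path segment (the listing file) and keep the
    # trailing slash so the category URL stays canonical.
    category_path = path.rsplit("/", 1)[0] + "/"
    return urlunsplit((scheme, netloc, category_path, "", ""))

print(category_redirect_target(
    "http://www.mysite.com/listings/home/furniture/couches/couch-i-hate.php"))
# -> http://www.mysite.com/listings/home/furniture/couches/
```

In practice you'd usually express this in your web server's rewrite rules rather than application code, but the mapping is the same.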
Hope that helps,
Sha
-
We are now doing something similar with our site. We have several thousand products that have been discontinued and didn't think about how much link juice we were throwing away until we got Panda pounded. It's amazing how many things you find to fix when times get tough.
We started with our most popular discontinued products and are 301 redirecting them to either a new equivalent or the main category if no exact match can be found.
We are also going to be reusing the same product pages for annual products instead of creating new pages each year. Why waste all that link juice from past years?
-
If you perform a redirect, I recommend you offer a 301 header response, not a 200. The 301 response will let Google and others know the URL should be updated in their database. Google would then offer the new URL in search results. Additionally any link value can be properly forwarded to the new page.
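A minimal WSGI-style sketch of the difference (the paths and the expired-listing lookup table are hypothetical): the expired URL returns a real 301 with a Location header, while live listings return a 200 as usual.

```python
def app(environ, start_response):
    # Hypothetical lookup - a real site would query the listings
    # database for expiry and the parent category URL.
    expired = {"/listings/couches/old-couch.php": "/listings/couches/"}
    target = expired.get(environ["PATH_INFO"])
    if target:
        # A 301, not a 200: tells Google the URL has moved permanently,
        # so the index is updated and link value is forwarded.
        start_response("301 Moved Permanently", [("Location", target)])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"live listing"]
```

Any WSGI server (or the equivalent in your own stack) would serve this; the key point is the status line and Location header, not the framework.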
-
Thanks Ryan,
Massive response! Awesome!
It's interesting that you talk a lot about the 301's.
Are you suggesting this would be far preferable to simply producing a 200 status code page listing product choices based on an algorithm - which we currently offer our customers for listings expired less than 45 days?
I suppose, to clarify, I'm worried that if we produced 200 status code pages, our crawl equity would suffer - we'd be wasting a lot of Google's bandwidth on 200 status pages when it could be better spent crawling and indexing more recent pages.
Whereas with 301's to relevant products as you suggest, we solve that issue.
BTW, our 404 pages offer the usual navigation and search options.
Cheers,
Croozie.
-
Hi Croozie.
The challenge with your site is the volume of pages. Most large sites with 100k+ pages have huge SEO opportunities. Ideally you need a team which can manually review every page of your site to ensure it is optimized correctly. Such a team would be a large expense which many site owners choose to avoid - but the trade-off is that site quality and SEO suffer.
Whenever a page is removed from your site or otherwise becomes unavailable, a plan should be in place PRIOR to removing the page. The plan should address a simple question: how will we handle traffic to this page, whether it comes from a search engine, a bookmark, or a link? The suggested answer is the same whether your site has 10 pages or a million pages:
- If the product is being replaced with, or you already carry, a very similar product, you can choose to 301 the page to it. If the products are truly similar, the 301 redirect is a win for everyone.
Example A: You offer a Casio watch model X1000. You stop carrying this watch and replace it with Casio watch model X1001. It is the same watch design but the new model has a slight variation such as a larger dial. Most users who were interested in the old page would be interested in the new page.
Example B: You offered the 2011 version of the Miami Dolphins T-shirt. It is now 2012 and you have the 2012 version of the shirt which is a different design. You can use a 301 to direct users to the latest design. Some users may be unhappy and want the old design, but it is still probably the right call for most users.
Example C: You discontinue the Casio X1000 and do not have a very close replacement. You could 301 the page to the Casio category page, or you could let it 404.
The best thing to do in each case is to put on your user hat and ask yourself what would be the most helpful thing you can do to assist a person seeking the old content. There is absolutely nothing wrong with allowing a page to 404. It is a natural part of the internet.
One last point. Be sure your 404 page is optimized, especially considering how many 404s you present. The page should have the normal site navigation along with a search function. Help users find the content they seek.
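The decision process in Examples A-C can be sketched as a simple function (the function and argument names are made up for illustration; the real logic would live wherever you manage your product lifecycle):

```python
def plan_for_removed_page(replacement_url=None, category_url=None):
    """Decide how to handle a removed product page, per the advice above.

    - A very close replacement exists (Examples A and B): 301 to it.
    - No close replacement (Example C): 301 to the category page if one
      fits, otherwise let the URL return a natural 404.
    """
    if replacement_url:
        return ("301", replacement_url)
    if category_url:
        return ("301", category_url)
    return ("404", None)
```

The point of encoding it at all is that the choice is made deliberately, before the page is removed, rather than left to default to a 404.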