Best Way To Handle Expired Content
-
Hi,
I have a client's site that posts job openings. There is a main list of available jobs and each job has an individual page linked to from that main list. However, at some point the job is no longer available. Currently, the job page goes away and returns a status 404 after the job is no longer available.
The good thing is that the job pages get links coming into the site. The bad thing is that as soon as the job is no longer available, those links point to a 404 page. Ouch. Currently Google Webmaster Tools shows 100+ 404 job URLs that have links (maybe 1-3 external links per).
The question is what to do with the job page instead of returning a 404. For business purposes, the client cannot display the content after the job is no longer available. To avoid duplicate content issues, the old job page should have some kind of unique content saying the job is no longer available.
Any thoughts on what to do with those old job pages? Or would you argue that it is appropriate to return a 404 header plus an error page, since the job is truly no longer a valid page on the site?
Thanks for any insights you can offer.
Matthew -
Hey Sebastian -
We already do something similar to know whether a job has expired (instead of the IF condition in MySQL, we query for records where job_closing_date >= CURDATE()). Thankfully they programmed that in to pull old jobs off the list and out of the job search results. (Though up until yesterday the old jobs were still in the XML sitemap... whoops. Guess what I fixed yesterday!)
I like your idea, though, of keeping the content active and the page alive, but with some kind of message at the top. That would definitely keep the page unique. I'm not positive that will fly on the business side, but I'll definitely propose it.
Thanks for the reply!
-
I like that idea of 301 redirecting the page back to the job search page. The search page would certainly be a good introduction and would probably satisfy visitors looking for a job. These aren't high-ranking pages in the SERPs; the traffic is referral traffic from other websites. Given that, Utah Tiger's question about keywords and search engines wouldn't apply in this website's case. Thanks for the idea!
-
Hi Matthew,
What I would do is keep the page accessible through a direct link, but not through the list of jobs displayed on the main site. I would also include a note at the top of the page saying something like 'This job offer has already expired'.
This way you still have a unique page that does not show on the main jobs list and clearly indicates that the job has expired.
I'm not sure how much programming knowledge you have or what technology the site is built in, but a simple IF condition in your SQL statement can add a flag to each record indicating whether it is expired. It would look something like this (this example uses MySQL syntax):
IF(
    CURDATE() BETWEEN date_from AND date_to,
    0,
    1
) AS expired
Then, when you fetch a specific job, you simply check whether the 'expired' field equals 1 and, if so, display the message above the job.
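To make that last step concrete: the application-side check might look something like the sketch below. This is purely illustrative Python (the site's actual stack isn't stated in the thread), and the function and field names are invented for the example; the only thing taken from the thread is the idea of an 'expired' flag driving a notice above the job content.

```python
# Hypothetical sketch of using the `expired` flag from the SQL query.
# `render_job`, the dict shape, and the markup are all invented here.

def render_job(job: dict) -> str:
    """Return the job page body, prepending an expiry notice when needed."""
    parts = []
    if job["expired"] == 1:
        # Unique notice keeps the page live without implying the job is open.
        parts.append("<p class='notice'>This job offer has already expired.</p>")
    parts.append("<h1>" + job["title"] + "</h1>")
    parts.append(job["description"])
    return "\n".join(parts)

active = {"title": "Web Developer", "description": "<p>Great role.</p>", "expired": 0}
closed = {"title": "Designer", "description": "<p>Filled.</p>", "expired": 1}

# The closed job gets the notice line; the active one does not.
print(render_job(closed))
```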
I hope this helps.
-
EGOL, your technical response is way above me. Could you restate it in tyro terms?
Is the expired data hidden? Does the 301 redirect go to the homepage or the job search page, or either? What value does it add? Keywords? I guess the pages would still need to be indexed for value to be created, or does a 301 redirect just pass all the value to the page it redirects to? I will also go look up 301 redirects right now.
Utah Tiger
-
I have expiring content on one of my sites.
I place all of the postings into folders according to date such as...
mysite.com/postings/2012/02/job-at-mcds/
Then on certain dates I add an htaccess file to the /2012/02/ folder that will 301 redirect all items in that folder to the homepage.
You could 301 the old posts to a job search page or some other type of page that will introduce the visitor to your site.
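For anyone following along, a per-folder .htaccess for that scheme might look like the fragment below. This is a sketch using Apache's mod_alias; the folder path and target URL are placeholders based on EGOL's example, not his actual rules.

```apache
# Dropped into /postings/2012/02/.htaccess once every job in the
# folder has expired. RedirectMatch sends anything under this folder
# to the job search page with a permanent (301) redirect.
RedirectMatch 301 ^/postings/2012/02/ http://mysite.com/job-search/
```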