Deals that expire: what should I do?
-
Hey there Awesome team of Webmaster Forums,
Let's assume I have a page that contains deals, and those deals expire after a certain period of time. What should I do with the expired pages? My opinion is one of the following:

The page keeps the same URL, but the content changes to say "Sorry, but this deal has expired..." with some relevant deals listed beneath it;

OR

redirect to a universal "expired" page.

Kind regards
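As a sketch of the first option (every name and URL below is invented for illustration), the page keeps its URL and a 200 status while the body swaps to an expiry notice plus related offers:

```html
<!-- Hypothetical expired-deal page: same URL, honest notice, onward paths -->
<article class="deal deal--expired">
  <h1>Half-price Spa Weekend</h1>
  <p>Sorry, but this deal has expired.</p>
  <section class="related-deals">
    <h2>Deals like this one, still live</h2>
    <ul>
      <li><a href="/deals/spa-day-pass">Spa Day Pass: 30% off</a></li>
      <li><a href="/deals/weekend-getaway">Weekend Getaway: 40% off</a></li>
    </ul>
  </section>
</article>
```

Keeping the URL live this way preserves any links the deal earned, and the related-deals list gives both visitors and crawlers somewhere useful to go next.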
-
Thanks mate. That was a very informative answer.
It's not that I'm making handmade items, as in Matt's example; rather, I have a very small number of deals that come and go every now and then.
I will not redirect or serve a 404. I think I will keep the page but explain that the deal is over and that there are more deals relevant to this one.
Forced redirecting, in my opinion, is the worst option in every situation, except when the intent of the destination page is exactly the same as the previous one, which 99% of the time it is not.
A 404 could be OK, but the deals I offer are hard won and I don't want the traffic to just hit a 404 wall.
Adding the relevant deals seems like the best way to go.
Thanks again
-
Hi there
Here's a great resource from Matt Cutts on the subject. I like the idea of redirecting to a relevant page/category or creating a 404 page that assists the user in finding something related or gives them an opportunity to be notified when the offer/product is back.
You should also update internal links and your sitemap links so that this page isn't being constantly crawled if the offer is over and done, and not coming back.
Hope this helps! Good luck!
Related Questions
-
An immediate and long-term plan for expired Events?
Hello all, I've spent the past day scouring guides, walkthroughs, and Q&As on this (including here), and while I'm pretty confident in my approach to this query, I wanted to crowdsource some advice in case I might be way off base. I'll start by saying that technical SEO is arguably my weakest area, so please bear with me. Anyhoozles, onto the question (and advance apologies for being vague):

PROBLEM: I'm working on a website that, in part, works with providers of a service to open their own programs/centers. Most programs tend to run their own events, which leads to an influx of Event pages, almost all of which are indexed. At my last count, there were approximately 800 indexed Event pages. The problem? Almost all of these have expired, leading to a bit of index bloat.

THINGS TO CONSIDER: A spot check revealed that traffic for each Event occurs for about a two-to-four-week period, then disappears completely once the Event expires. About half of these indexed Event pages redirect to a new page, so the indexed URL will be /events/name-of-event but will redirect to /state/city/events/name-of-event.

QUESTIONS I'M ASKING: How do we address all these old events that provide no real value to the user? What should a future process look like to prevent this from happening?

MY SOLUTION:
Step 1: Add a noindex to each of the currently expired Event pages. Since some of these pages have link equity (one event had 8 unique links pointing to it), I don't want to just 404 all of them, and redirecting them doesn't seem like a good idea since one of the goals is to reduce the number of indexed pages that provide no value to users.
Step 2: Remove all of the expired Event pages from the sitemap and resubmit. This is an ongoing process due to a variety of factors, so we'd wrap it into a complete sitemap overhaul for the client. We would also remove the Events from the website so there are no internal links pointing to them.
Step 3: Write a rule (well, have their developers write a rule) that automatically adds noindex to each Event page once it expires.
Step 4: Wait for Google to re-crawl the site and hopefully remove the expired Events from its index.

Thoughts? I feel like this is the simplest way to get things done quickly while preventing future expired events from being indexed. All of this is part of a bigger project involving an overhaul of the way Events are linked to on the website (since we wouldn't be 404ing them, I would simply suggest they be removed entirely from all navigation), but ultimately, automating the process once this is cleaned up is the direction I want to go. Thanks, eager to hear all your thoughts.
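For what it's worth, the rule in Steps 1 and 3 amounts to emitting one extra tag in the head of an expired Event page. A minimal sketch; the condition is hypothetical, and their developers would wire the expiry check into whatever templating the site actually uses:

```html
<!-- Rendered only when the event's end date is in the past -->
<meta name="robots" content="noindex">
```

The same signal can also be sent as an `X-Robots-Tag: noindex` HTTP response header if editing templates is awkward. Either way, the pages must remain crawlable (not blocked in robots.txt), or Google will never see the noindex and can't drop them from the index.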
Technical SEO | | Alces0 -
How to deal with 80 websites and duplicated content
Consider the following: a client of ours has a job boards website. They then have 80 domains, all in different job sectors. They pull in the jobs based on the sectors they were tagged with on the back end. Everything is identical across these websites apart from the brand name and some content. What's the best way to deal with this?
Technical SEO | | jasondexter0 -
How to deal with high authority but irrelevant external links
Hi, My client has an online ecommerce site where he sells wedding items. His previous SEO company added his URL to many websites that are related to porn. Those are high authority websites but are irrelevant to my client's business. Should I disavow those links?
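If you do go the disavow route, the file Google's disavow tool accepts is plain text: one `domain:` entry or full URL per line, with `#` marking comment lines. A hypothetical sketch (all domains below are invented):

```text
# Irrelevant links pointed at the site by the previous SEO company
domain:spammy-example-one.example
domain:spammy-example-two.example
http://mixed-site.example/some/specific/page.html
```

A `domain:` line covers every page on that host, which is usually safer than chasing individual URLs on sites you don't control.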
Technical SEO | | alexkatalkin0 -
How best to deal with www.home.com and www.home.com/index.html
Firstly, this is for an .asp site, and all my usual ways of fixing this (e.g. via htaccess) don't seem to work. I'm working on a site which has www.home.com and www.home.com/index.html; both URLs resolve to the same page/content. If I simply drop a rel canonical into the page, will this solve my dupe content woes? The canonical tag would then appear in both the www.home.com and www.home.com/index.html cases. If the above is OK, which version should I be going with? - or - Thanks in advance folks,
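For reference, the tag in question would sit in the head of both copies of the page; a sketch assuming you settle on the bare root as the preferred version (www.home.com being the poster's placeholder domain):

```html
<!-- In the <head> of both www.home.com and www.home.com/index.html -->
<link rel="canonical" href="http://www.home.com/">
```

Pointing both at the extensionless root is the conventional pick, since that is the form people naturally link to; the canonical consolidates the duplicate signals even though, unlike a 301, visitors still see whichever URL they requested.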
Technical SEO | | Creatomatic
James @ Creatomatic0 -
Dealing with manual penalty...
I'm in the back-and-forth with Google's Quality Search team at the moment. We discovered a manual penalty on our website and have been trying to get it removed as of late. The problem is tons of spammy incoming links. We did not ask for or purchase any of these links; it just so happens that spammy websites are linking to our site. Regardless, I've done my best to remove quite a few links in the past week or so, responding to the Quality Search team with a spreadsheet of the links in question and the action taken on each link. No luck so far.

I've heard that if I send an email to a website asking for a link removal, I should share that with Google as well. I may try that. Some of the links are posted on websites with no contact info, and a WhoIs search brings up a hidden registrant, so removing these links is far from easy.

My question is: what are some techniques that are proven to be effective when working your way through the removal of a manual penalty? I know Google isn't going to tell me all of the offending links (they've offered a few examples, and we've had those removed; still penalized), so what's the best way for me to find them myself? Also, when I have a link removed, it may stay in Webmaster Tools as an active link for a while even though it no longer exists. Does the Quality Search team use Webmaster Tools to check, or do they use something else?

It's an open-ended question, really. Any help dealing with a manual penalty and what you have done to get that penalty removed is of great help to me. Thanks!
Technical SEO | | ccorlando0 -
Setting up a 301 redirect from expired webpages
Hi Guys, We have recently created a new website for one of our clients and replaced their old website on the same domain. One problem we are having is that all of the old pages are indexed within Google (1000s) and are just getting sent to our custom 404 page. We are finding that there is a large bounce rate from this, and I am also worried, from an SEO point of view, that the site could lose ranking positions through the number of crawl errors Google is getting. What I want is to set up a 301 redirect from these pages to the 'our brands' page. The reason for this is that the majority of the old URLs linked to individual product pages, and one thing to note is that they are all .asp pages. Is there a way of setting up a rule in the htaccess file (or another way) to say that all webpages ending with the suffix .asp will be 301 redirected to the 'our brands' page? (There are no .asp pages on the new site, as it is all done in PHP.) If so, I would love it if someone could post the code snippet. Thanks in advance guys, and if you have any other ideas then be my guest to suggest 🙂 Matt.
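For what it's worth, assuming the new PHP site runs on Apache with mod_rewrite enabled, a minimal sketch of such a rule (the /our-brands/ path is a placeholder; substitute the page's real URL):

```apache
# .htaccess at the site root: 301-redirect any request ending in .asp
# (case-insensitive) to the Our Brands page
RewriteEngine On
RewriteRule \.asp$ /our-brands/ [R=301,NC,L]
```

One caveat: when thousands of old product URLs all redirect to a single generic page, Google tends to treat them as soft 404s, so where an old product URL has a close equivalent on the new site, redirecting it to its nearest match preserves more value than the blanket rule.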
Technical SEO | | MatthewBarby0 -
Is it Panda? How to deal with AP and other newswire articles
A site of mine lost 30% of its traffic in June, then another 10% in July. Is it Panda? The site has tens of thousands of AP or other syndicated articles on it; they are not there for search-engine benefit, they are categorized and relevant to the people who read them, and the site gets half of its traffic from type-ins/bookmarks. Should I nofollow the articles or rel="canonical" them? What can help? Cheers
Technical SEO | | adamzski0