With regard to SEO, is it good or bad to remove all the old events from our website?
-
Our website sells tickets for various events across the UK, and we have a LOT of old event pages which simply say SOLD OUT. What is the best practice?
Should these event pages be removed, with a 301 redirect added to point to the home page?
Or should these pages remain intact, simply showing SOLD OUT on the page?
-
This post may also be of help: http://moz.com/blog/how-should-you-handle-expired-content
-
Could you perhaps use these pages as a way of capturing visitors' email addresses? E.g. 'The above event has now sold out / taken place, but to be the first to hear about the next event (by the same group/artist), submit your email address here.'
I think it is good to keep these pages on your site, as they show visitors you've sold tickets for lots of past events, and if the pages are well optimised they should help you rank for tickets for the particular artist on each page.
Either way I'd keep the pages.
-
Depending on how your site is set up, I think some history of sold-out events gives you credibility from a user perspective.
Having said that, as a user I don't want to see sold-out events older than the last 6 months (if you've got quite a few) - 12 months at the most.
Anything older than that I would delete and redirect to current events.
Related Questions
-
Tens of duplicate homepages indexed and blocked later: how to remove them from the Google cache?
Hi community, Due to a WP plugin issue, many copies of our homepage were indexed in Google under anonymous URLs. We blocked them later, but they still appear in the SERPs. I wonder whether these are causing trouble for our website, especially as exact copies of our homepage are indexed. How can we remove these pages from the Google cache? Is that the right approach? Thanks
Algorithm Updates | vtmoz
What happens when most of the website visitors end up at a "noindex" log-in page?
Hi all, As most of our users visit the website to log in, we are planning to deindex the login page. Since they cannot find it in the SERPs, they will visit our website and then log in; I just wonder what happens when most visitors end up at the homepage and then browse to a "noindex" page. Obviously it increases bounce rate and exit rate, as they effectively disappear. Is this going to push us down in the rankings? What other concerns should we check? Thanks
Algorithm Updates | vtmoz
Do more internal links from sub-domains to the main domain (website) hurt rankings?
Hi, We have nearly 10 sub-domains. A couple of our website's top pages, including the homepage, are linked from every page of these sub-domains, from the footer or top menu. Is this kind of linking bad as per Google? What is the right way of linking between a website and its sub-domains?
Algorithm Updates | vtmoz
SEO Myth-Busters -- Isn't there a "duplicate content" penalty by another name here?
Where is that guy with the moustache in the funny hat, and the geek, when you truly need them? SEL (Search Engine Land) said recently that there's no such thing as a "duplicate content" penalty: http://searchengineland.com/myth-duplicate-content-penalty-259657 By the way, I'd love to get Rand or Eric or other Mozzers (aka TAGFEE'ers) to weigh in here on this if possible. The reason for this question is to double-check a possible "duplicate content"-type penalty (possibly by another name?) that might accrue in the following situation.
1 - Assume a domain has a 30 Domain Authority (per OSE).
2 - The site on the current domain has about 100 pages, all hand coded. Things do very well in SEO because we designed it to do so. The site is about 6 years old in its current incarnation, with a very simple e-commerce cart (again, basically hand coded). I will not name the site for obvious reasons.
3 - Business is good. We're upgrading to a new CMS (hooray!). In doing so we are implementing categories and faceted search, with plans to try to keep the site to under 100 new "pages" using a combination of rel canonical and noindex. I will also not name the CMS for obvious reasons. In simple terms, as the site is built out and launched in the next 60-90 days, if we have 500 products and 100 categories, that yields at least 50,000 pages, and with other aspects of the faceted search it could easily create 10X that many.
4 - In Screaming Frog tests of the DEV site, it is quite evident that there are many tens of thousands of unique URLs that are basically the textbook illustration of a duplicate content nightmare. Screaming Frog has also been known to crash while spidering, and we've discovered thousands of URLs on live sites using the same CMS. There is no question that spiders are somehow triggering some sort of infinite page generation, and we can see that both on our DEV site and out in the wild (in Google's supplemental index).
5 - Since there is no "duplicate content penalty" and there never was, are there other risks here caused by infinite page generation, like burning up a theoretical "crawl budget" or having the bots miss pages, or other negative consequences?
6 - Is it also possible that bumping a site that ranks well with 100 pages up to 10,000 pages or more might suffer a link-juice penalty as a result of all this (honest but inadvertent) duplicate content? In other words, is inbound link juice and ranking power essentially divided by the number of pages on a site? Sure, it may be somewhat mediated by internal-page link juice, but what are the actual big-dog issues here? So has SEL's "duplicate content myth" truly been busted in this particular situation? Thanks a million!
Algorithm Updates | seo_plus
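The rel canonical / noindex combination the question above refers to can be illustrated with a small sketch. This is a hypothetical example (Python) - the facet parameter names and example URLs are invented, and a real CMS would implement this in its templating layer:

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# Assumption: these query parameters only filter/sort an existing
# listing and should never produce separately indexed pages.
FACET_PARAMS = {"color", "size", "sort", "page"}

def canonical_url(url: str) -> str:
    """Strip facet parameters so every filtered variant can point at
    one canonical listing URL via <link rel="canonical">."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in FACET_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))

def robots_meta(url: str) -> str:
    """Faceted variants get noindex,follow; canonical pages stay indexable."""
    return "index,follow" if canonical_url(url) == url else "noindex,follow"

print(canonical_url("https://shop.example/widgets?color=red&sort=price"))
# -> https://shop.example/widgets
```

The point of the sketch is that 50,000 crawlable URLs can all consolidate onto a bounded set of canonical listings, which addresses the crawl-budget concern in point 5 even if no "penalty" as such exists.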
Has anyone else noticed a major increase in Yelp, BBB, etc. results in local SERPs, pushing business websites further down?
Across multiple cities and markets, this seems to be a trend. "Chicago coffee shop" or "Minneapolis hair salon" or "Sacramento car repair" - outside the local 7-pack, virtually every result is Yelp, BBB, Yellowpages, etc. Is this related to algo changes, or simply a result of those national sites pumping major resources into SEO? It just seems to be suddenly far more prevalent than it was even 6 months ago.
Algorithm Updates | kpclaypool
How can we start to improve Domain MozRank & MozTrust for our website?
A simple question, maybe, but how and where do we start if we want to improve our Domain MozRank & MozTrust, assuming of course that by improving both we will improve our rankings with Google, plus sales?
Algorithm Updates | ewanTHH
Physical location of the server vs customer base vs SEO penalty?
Hi All, We are an Australian business with our hosting currently based in Australia. We have recently been considering moving hosts for a few reasons. In particular, when we have analysed hosting in the US, and with Rackspace in Hong Kong, we have found that prices can be significantly cheaper, or come with more bells and whistles, for a dedicated server offshore vs Australia at the same price. From this point of view we would be much better off moving our hosting to the US or HK with Rackspace. There are issues such as latency to take on board, but let's put those to one side for the moment, as we are mostly interested in understanding whether offshore hosting will impact us from an SEO perspective and, if so, how, and whether those impacts can be mitigated. So our questions are: a) if we move our hosting offshore, will this impact our SEO? b) if it does, how (i.e. will we lose rankings for organic pages due to the IP address being offshore)? c) if (a) is an impact, are there ways of eliminating the effects outlined in (b)? d) net: if the SEO impacts can be mitigated, will the net result still be negative, or could we be seen on the same footing as a domain hosted in Australia? Thanks Sean
Algorithm Updates | sbcinv
Site-wide Footer Link on Client/Friend Website - Dangerous?
Hi Guys, I've got a friend / client / business associate whose website I helped develop. It's a three-letter dot-com, so good trust, and an eCommerce site, so lots of pages. When I launched my new site about 6 weeks ago I put "Official IT Partner of MySite.com" in the footer. No keywords in the anchor text, just the domain URL... There are no other external links like that on the site whatsoever, and I haven't been hit by Penguin. I'm ranking well for locally targeted keywords a few weeks after launch, and traffic continues to increase... I am worried that Google will see this as unnatural, but I've received no warning or experienced any decline in rankings. There are about 2,800 pages linking from the site to mine, all in the footer of course. Would it be better to remove the link from the footer and add it just to the home page and a couple of other high-authority pages, or should I leave it be? It's not "unnatural" - I am affiliated with the site and work in partnership with it - but it does fit that profile. I'm thinking about removing the footer link and adding a small graphic on the home page of the linking site which links to my root domain, with a couple of broad keyword-anchored links in a description underneath that also link to relevant pages on my site... What do you think? 2,800 links with my URL as anchor text from high Domain Authority / low Page Authority pages (the homepage and a few other pages have decent authority) to my root domain, OR three different links from one high-DA / high-PA homepage (one image alt, two anchored with broad keywords) to three different pages on my site? Again, there are no other site-wide external links on the domain, and I'm pretty sure I escaped Penguin. Looking forward to hearing the different points of view. Thanks, Anthony
Algorithm Updates | Anthony_NorthSEO