Thoughts on archiving content on an event site?
-
I have a few sites that are used exclusively to promote live events (e.g. trade shows, conferences, etc.). In most cases these sites contain fewer than 100 pages and include information for the upcoming event with links to register. Some time after the event has ended, we redesign the site and start promoting next year's event... essentially starting over with a new site (same domain).
We understand the value that many of these past event pages have for users who are looking for information from a past event, and we're looking for advice on how best to archive this content to preserve its SEO value.
We tend to use concise URLs for pages on these sites, e.g. www.event.com/agenda or www.event.com/speakers. What are your thoughts on archiving the content from these pages so we can reuse the URL with content for the new event? My first thought is to move these pages into an archive, like www.event.com/2015/speakers. Is there a better way to do this that preserves the SEO value of this content?
-
I think Egol covers it very well.
I would not change the pages at all if possible. What I would be considering is how to carry the benefit and traction of the 2015 pages over to the 2016 events. There are several options.
It could be a revamped page, a CTA you can click to go to the 2016 page, etc. If no 2016 page is up yet after a 2015 event, you could put a note on the page: "the 2016 event details are not known yet; leave your email and we will contact you with details of the new event" - that gives you an opportunity to collect contact details. You know your business better than anyone, so you likely have better strategies, but the underlying principles enunciated by Egol are the way to go.
-
I have a few pages that link to content on other websites that are annual events. For many of these events, our website refers more visitors to them than they get from any other source. Sometimes more than from all of their other sources combined.
Some of the people who put on these events maintain the same URL year after year. We really appreciate that because we don't have to edit our page. Others change the year in the URL which is slightly annoying but easy. Others make up a new URL that places the event somewhere on the rump of their website where we have difficulty finding it. Others do insane redirects that often don't work properly.
My message here is really to say that if you value traffic from other websites then it is not a good idea to move your event page to a new URL every year. Doing that will orphan links, annoy the people who link to you, and might be inconsiderate to the people who help you promote the event on their own websites. Moving the URL around is also a really bad idea from an SEO perspective because you divide your link equity.
What to do with the speakers list and other information that represents a single year? If you think this information will be consumed by visitors, linked to by other websites, or pull in traffic from search, then archive it as you proposed. My choice of URL would be www.event.com/speakers/2015/ instead of www.event.com/2015/speakers/. Why? That keeps all of the speaker information in the same folder. When you have the 2016 info, you simply change the URL of the index page and republish the new information on top of the old.
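As a rough sketch of that rollover, assuming a simple static-file site (the `site/` paths here are purely illustrative, not anything from the question), the evergreen URL keeps holding the current year while last year's copy moves into the year folder:

```shell
# Hypothetical static-site layout: the evergreen /speakers/ URL always holds
# the current year, and last year's page is archived under /speakers/2015/.
mkdir -p site/speakers
echo "2015 speakers" > site/speakers/index.html   # the page as it exists today

# 1. Archive the old content into the year folder
mkdir -p site/speakers/2015
mv site/speakers/index.html site/speakers/2015/index.html

# 2. Republish the evergreen URL with the new year's content
echo "2016 speakers" > site/speakers/index.html
```

Inbound links to www.event.com/speakers/ keep working untouched, and the 2015 page stays crawlable at its archive URL, with no redirects required.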
Related Questions
-
IP Canonicalization for HTTPS site?
I received an unsolicited SEO report for one of my sites. My site was faulted for not having IP canonicalization set up. I reviewed this carefully. My site runs on Apache, is HTTPS, and is on a dedicated IP. The mod_rewrite rules for Apache all deal with the HTTP version of the site. When I type my site's IP into a browser, I get the HTTPS version, but with an insecure certificate warning because the certificate does not include the IP. Should I implement the HTTP IP canonicalization rule, with another rewrite rule then redirecting the request to the HTTPS version?
On-Page Optimization | FatRodent2013 -
How to solve duplicate content issue???
I have 5 websites with different domain names; every website has the same content, the same pages, and the same design. Kindly let me know how to solve this issue.
On-Page Optimization | ross254sidney -
Help! A site has copied my blog!
My site tanked on July 21 and I have been working so hard to bring it back up but nothing is working. Today I looked at "Links to Your Site" on Google Webmasters and I see a copy of my site on another URL. mysite.eemovies.org/mycategory/mypost The domain name is eemovies.org and then all my stuff is wrapped around it and all my content is there! How do I stop this?!
On-Page Optimization | 2bloggers -
Mass Duplicate Content
Hi guys, now that the full crawl is complete I've found the following: http://www.trespass.co.uk/mens-onslow-02022 http://www.trespass.co.uk/mens-moora-01816 http://www.trespass.co.uk/site/writeReview?ProductID=1816 http://www.trespass.co.uk/site/writeReview?ProductID=2022 The duplicate content on the first 2 is easily fixed by writing better product descriptions for each product (a lot of hours needed), but still an easy fix. The last 2 are review pages for each product, which are all the same except for the main h1 text. My thinking is to add noindex and nofollow to all of these review pages? The site will be changing to Magento very soon and there's still a lot of work to do. If anyone has any other suggestions or can spot any other issues, it's appreciated. Kind regards Robert
On-Page Optimization | yournetbiz -
Duplicate content list by SEOMOZ
Hi Friends, I am seeing a lot of duplicates (about 10%) in the crawl report from SEOMOZ. The report says "Duplicate Page Content", but the URLs it lists have different titles, different URLs, and also different content. I am not sure how to fix this issue. My site has both Indian cinema news and a photo gallery. The problem mainly comes up in photo gallery posts. For example, this is the main URL of a post: apgossips.com/2012/12/18/telugu-actress-poonam-kaur-photos. But in this post, each image is a link to its enlarged image (default WordPress). The problem is coming with each individual image within this post. Examples from the SEOMOZ report of 3 individual URLs flagged as duplicate content, from the same post above: http://apgossips.com/2012/12/18/telugu-actress-poonam-kaur-photos/poonam-kaur-hot-photo-shoot-stills-4 http://apgossips.com/2012/12/18/telugu-actress-poonam-kaur-photos/poonam-kaur-hot-photo-shoot-stills-3 http://apgossips.com/2012/12/18/telugu-actress-poonam-kaur-photos/poonam-kaur-hot-photo-shoot-stills-2 Somebody please advise me. Appreciate your help.
On-Page Optimization | ksnath -
Events in Wordpress Creating Duplicate Content Canonical Issues
Hi, I have a site which uses Event Manager Pro within WordPress to create events (as custom post types) on my blog. I use it to advertise cookery classes. In a given month I might run one type of class 4 times. The event page I have made for each class is the same; I duplicate it 4 times and just change the dates to promote it. The problem is that with over 10 different classes, each duplicated up to 4 times per month, I get loads of duplicate content errors. How can I fix this without redirecting people away from the correct page for the date they are interested in? Is it best just to use a nofollow for ALL events and rely on the other parts of my site for SEO? Thanks, T23
On-Page Optimization | tekton23 -
Is content aggregation good SEO?
I didn't see this topic specifically addressed here: what's the current thinking on using content aggregation for SEO purposes? I'll use flavors.me as an example. Flavors.me lets you set up a domain that pulls in content from a variety of services (Twitter, YouTube, Flickr, RSS, etc.). There's also a limited ability to publish unique content as well. So let's say that we've got MyDomain.com set up, and most of the content is being drawn in from other services. So there's blog posts from WordPress.com, videos from YouTube, a photo gallery from Flickr, etc. How would Google look at this scenario? Is MyDomain.com simply scraped content from the other (more authoritative) sources? Is the aggregated content perceived to "belong" to MyDomain.com or not? And most importantly, if you're aggregating a lot of content related to Topic X, will this content aggregation help MyDomain.com rank for Topic X? Looking forward to the community's thoughts. Thanks!
On-Page Optimization | GOODSIR -
Is there a way to measure a competitors content growth?
Bonjour from a hot and sultry 23 degrees C Wetherby, UK. I find again and again that clients find jumping into an active volcano easier than adding fresh content. I want to try another tactic to prompt / scare them into adding content, and here is my idea: I want to track a competitor site for a target term and monitor whether they are adding content specific to that term over time. So my question is: "Is there a reliable method to do this?" One idea I had was doing a good old Google title operator search at the beginning of the month and then at monthly intervals to see if anything changes. Any insights welcome 🙂
On-Page Optimization | Nightwing