How to Handle Annual Content - 2018-2019
-
Hello! I was wondering how other SEOs handle their annual content. We do well ranking for our industry keywords with the year in the content, and we have annual changes to publish and talk about each year. What do you do with the previous year's content? Leave it, 301 redirect it, or revamp the same content so it updates to the current year?
-
Try to schedule your content plan. I do the content marketing for a golf wedge review blog, and I have built a content plan for it.
-
Great suggestion. You recently answered a query I had regarding expired/annual content, and it has proven to work quite well.
-
Make a folder with the title... /golf-tournament/
Today the index file in that folder features information about your 2018 Golf Tournament. You keep it up-to-date throughout the year. Before the event it has who, what, where information needed for people who want to enter, photos of the course, short info about past winners, etc.
After the event you fill that page with results about the winners, cameos about the winners, big list of results and lots of photos about the event. You leave that up for a few months.
Then, you move that entire page to a subfolder... /golf-tournament/2018/ which will serve as a scrapbook page for that year. And you put information about the 2019 tournament on the index file. You have obvious links to the 2018 scrapbook page so any interested person can see it.
Over time you grow a big collection of pages for each of your annual tournaments:
/golf-tournament/2018/
/golf-tournament/2019/
/golf-tournament/2020/
/golf-tournament/2021/
Anybody who links to your index page will always send people to fresh information. Anybody who wants the historic information just clicks a link for that year.
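The URL scheme above can be sketched in code. This is only an illustration of the structure the answer describes; the folder name comes from the answer, but the helper names are mine:

```javascript
// Illustrative sketch of the yearly-archive URL scheme described above.
const BASE = '/golf-tournament/';

// The evergreen index always lives at the base URL, so inbound links stay fresh.
function indexUrl() {
  return BASE;
}

// Each past tournament is archived under its year.
function archiveUrl(year) {
  return `${BASE}${year}/`;
}

// Links from the index page to every archived year before the current one.
function archiveLinks(firstYear, currentYear) {
  const links = [];
  for (let y = firstYear; y < currentYear; y++) {
    links.push(archiveUrl(y));
  }
  return links;
}
```

The key property is that the index URL never changes, while every year's page gets a stable, permanent address.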
-
- If the annual content can be defined uniquely enough, create original URLs and write new copy.
- If the copy is close to the same, edit the previous pages to the relevant year and unique information.
- If you prefer to make new pages/posts, 301 the old to the new.
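For the third option, the 301 might look like this minimal .htaccess sketch; the Apache server and both paths are assumptions for illustration, not from the thread:

```apache
# Hypothetical paths: send last year's page to the current one with a permanent redirect
Redirect 301 /annual-report-2018/ /annual-report-2019/
```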
Hope this helps.
KJr
Related Questions
-
How to solve JavaScript paginated content for SEO
On our blog listings page, we limit the number of blogs that can be seen on the page to 10. However, all of the blogs are loaded in the HTML of the page, and page links are added to the bottom. Example page: https://tulanehealthcare.com/about/newsroom/ When a user clicks the next page, it simply filters the content on the same page for the next group of postings and displays these to the user. Nothing in the HTML or URL changes; this is all done via JavaScript. So the question is: does Google consider this hidden content, because all listings are in the HTML but only a handful are shown on the page? Or is Googlebot smart enough to know that the content is being filtered by JavaScript pagination? If this is indeed a problem, we have two possible solutions:
1. Not building the HTML for the next pages until the user clicks the 'next' page.
2. Adding parameters to the URL to show the content has changed.
Any other solutions that would be better for SEO?
Intermediate & Advanced SEO | MJTrevens
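One way to implement the question's second solution (parameterized URLs) is with the History API, so each pagination state gets its own crawlable, linkable URL. A minimal sketch; `renderListings` is a hypothetical function assumed to exist elsewhere on the page:

```javascript
// Build the URL for a given page of the blog listing (page 1 keeps the clean URL).
function pageUrl(basePath, page) {
  return page <= 1 ? basePath : `${basePath}?page=${page}`;
}

// On a pagination click, update the address bar so the state is addressable.
function goToPage(page) {
  const url = pageUrl('/about/newsroom/', page);
  history.pushState({ page }, '', url); // History API: changes the URL without a reload
  renderListings(page);                 // hypothetical: swaps the visible posts
}
```

Server-rendering those same URLs (so each `?page=N` returns the right listings without JavaScript) is the more robust variant of this approach.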
A lot of news / Duplicate Content - what to do?
Hi All, I have a blog with a lot of content (news and PR messages), and I want to move it to a new domain. What is your recommendation?
1. Keep it as is: old articles -> 301 -> same article at a different URL.
2. Remove all the duplicate content and 301 the old URLs to my homepage.
3. Keep it as is, but add a NoIndex meta tag to the duplicate articles.
Thanks!
Intermediate & Advanced SEO | JohnPalmer
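The meta tag for option 3 would sit in the head of each duplicate article; this is the standard robots directive, shown as a sketch:

```html
<!-- Keeps the article out of the index while still letting crawlers follow its links -->
<meta name="robots" content="noindex,follow">
```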
Opinions on Boilerplate Content
Howdy, Ideally, uniqueness for every page's title, description, and content is desired. But when a site is very, very large, it becomes impossible. I don't believe our site can avoid boilerplate content for title tags or meta descriptions. We will, however, mark up the pages with proper microdata so Google can use this information as they please. What I am curious about is boilerplate content repeated throughout the site for the purpose of helping the user, as well as to tell Google what the page is about (rankings). For instance, this page and this page offer the same type of services, but in different areas. Both pages (and millions of others) offer the exact same paragraph on each page. The information is helpful to the user, but it's definitely duplicate content. All they've changed is the city name. I'm curious: what's making this obvious duplicate content issue okay? The additional unique content throughout (in the form of different businesses), the small yet obvious differences in on-site content (title tags clearly represent different locations), or just the fact that the site is hugely authoritative and gets away with it? I'm very curious to hear your opinions on this practice, potential ways to avoid it, and whether or not it's a passable practice for large but new sites. Thanks!
Intermediate & Advanced SEO | kirmeliux
Robots.txt & Duplicate Content
In reviewing my crawl results I have 5666 pages of duplicate content. I believe this is because many of the indexed pages are just different ways to get to the same content. There is one primary culprit: a series of URLs related to CatalogSearch, for example:
http://www.careerbags.com/catalogsearch/result/index/?q=Mobile
I have 10074 of those links indexed according to my Moz crawl. Of those, 5349 are tagged as duplicate content; another 4725 are not. Here are some additional sample links:
http://www.careerbags.com/catalogsearch/result/index/?dir=desc&order=relevance&p=2&q=Amy
http://www.careerbags.com/catalogsearch/result/index/?color=28&q=bellemonde
http://www.careerbags.com/catalogsearch/result/index/?cat=9&color=241&dir=asc&order=relevance&q=baggallini
All of these links are just different ways of searching through our product catalog. My question is: should we disallow catalogsearch via the robots file? Are these links doing more harm than good?
Intermediate & Advanced SEO | Careerbags
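If you do decide to block those catalog-search URLs, the robots.txt rule would look like this sketch. One caveat: Disallow blocks crawling, not indexing, so URLs that are already indexed may linger until handled another way (e.g. noindex):

```text
User-agent: *
Disallow: /catalogsearch/
```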
Duplicate content on yearly product models.
TL;DR - Is creating a page where 80% of the content is duplicated from the past year's product model, and 20% covers the new model's changes, going to cause duplicate content problems? Is there a better way to handle minor yearly model changes without duplicating content? Full Question - We create landing pages for yearly products. Some years the models change drastically, and other years there are only a few minor changes. The years where the product features change significantly are not an issue; it's when there isn't much of a change to the product description and I still want to rank for the new year's searches. Since I don't want duplicate content from just copying last year's model content to a new page and changing the year (2013 to 2014), I thought perhaps we could write a small paragraph describing the changes and then include last year's description of the product. Since 80% of the content on the page will be duplicated from last year's model, how detrimental do you think this would be as a duplicate content issue? The reason I'm leaving the old model up is to maintain the authority that page has and to still rank for the old model, which is still sold. Does anyone have a better idea than re-writing the same information in a different way with the few minor product changes added in?
Intermediate & Advanced SEO | DCochrane
Spellcheck necessary for user generated content?
We have a lot of user-generated reviews on our key landing pages. Matt Cutts recommended using correctly spelled content. Would you spellcheck all already-published user reviews, or would you leave published reviews intact and only spellcheck new reviews before they are published? Since the reviews have been marked up using schema.org, I am not sure whether editing lots of reviews after the fact may raise a flag with Google regarding manipulating reviews. Thanks.
Intermediate & Advanced SEO | lcourse
Content linking ?
If you have links in the left-hand navigation and content at the bottom of the page, and both link to the same page with the same or different anchor text, does the second link help the page (since it is surrounded by similar text), or is only the first link counted?
Intermediate & Advanced SEO | BobAnderson
Expired Content
Hi, We have a listing website with a huge number of listings. These listings change all the time; they become passive or are deleted. We would like to choose the response code for passive or deleted pages. Which response type should we use?
1. Redirect to the last category with a 301
2. Give a 410 Gone response code
3. Give a 404 response code
Which option would you choose, and do you have any other ideas?
Intermediate & Advanced SEO | SEMTurkey
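The decision in the question above can be sketched as simple routing logic; whichever code is chosen, it should be served consistently. The listing IDs here are hypothetical:

```javascript
// Sketch: passive/deleted listings return 410 Gone (signals permanent removal,
// so crawlers can drop the URL faster), unknown URLs return 404, live listings 200.
const activeListings = new Set(['listing-456']);
const deletedListings = new Set(['listing-123']);

// Return the HTTP status code a listing URL should serve.
function statusFor(id) {
  if (activeListings.has(id)) return 200;
  if (deletedListings.has(id)) return 410; // existed before, gone permanently
  return 404;                              // never existed
}
```

A 301 to a category only makes sense when that category genuinely replaces the listing; redirecting everything there tends to look like a soft-404 to search engines.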