Updating Old Content at Scale - Any Danger from a Google Penalty/Spam Perspective?
-
We've read a lot about the power of updating old content (making it more relevant for today, finding other ways to add value to it) and republishing (Here I mean changing the publish date from the original publish date to today's date - not publishing on other sites).
I'm wondering if there is any danger of doing this at scale (designating a few months out of the year where we don't publish brand-new content but instead focus on taking our old blog posts, updating them, and changing the publish date - ~15 posts/month). We have a huge archive of old posts we believe we can add value to and publish anew to benefit our community/organic traffic visitors.
It seems like we could add a lot of value to readers by doing this, but I'm a little worried this might somehow be seen by Google as manipulative/spammy/something that could otherwise get us in trouble.
Does anyone have experience doing this or have thoughts on whether this might somehow be dangerous to do?
Thanks Moz community!
-
Awesome, thank you so much for the detailed response and ideas - this all makes a good deal of sense and we really appreciate it!
-
We have actually been doing this on one of our sites where we have several thousand articles going all the way back to the late 90s. Here is what we do / our process (I am not including how to select articles here, just what to do once they are selected).
1. Really take the time to update the article. Ask the questions: "How can we improve it? Can we give better information? Better graphics? Better references? Can we improve conversion?"
2. Republish with a new date on the page. Sometimes add an editor's note explaining that this is an updated version of the older article.
3. Keep the same URL to preserve link equity, etc., or 301 to a new URL if needed.
4. Mix these in with new articles as part of our publication schedule.
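For the redirect case in step 3, a minimal sketch (assuming an Apache server; both paths are hypothetical placeholders, not from our actual site) might look like:

```apache
# Hypothetical paths. Permanently redirect the old post URL to the updated one
# so existing links and their equity follow the content to its new home.
Redirect 301 /blog/2012/widget-guide /blog/widget-guide-updated
```

Only do this when the URL genuinely has to change; keeping the original URL is the simpler option.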
We have done this for years and have not run into issues. I do not think Google sees this as spammy as long as you really take the time to improve your articles. John Mueller and Gary Illyes have stated unequivocally that Google likes it when you improve your content. We have done the above and it has not been dangerous at all; our content is better overall. In some cases where we really focused on conversion, we not only got more traffic but also converted better. Doing this benefits your visitors, which usually translates into Google liking the result.
I would ask: why designate a few months where you only recycle content, rather than mixing it in all year long? If you were going to dedicate three months of the year to updating content, why not take the third week of each month, or every Wednesday, and do the same thing instead? You accomplish the same thing but spread it out. Make it a feature! Flashback Friday, etc.
Bonus idea - make sure you get the schema right
We have recently changed our process here. Previously, we only marked up the publication date in schema, so when we republished, we would change that schema publication date to the new date. Now that Google wants both a publication date and a last-modified date in schema, we have changed our approach: when we republish content, we leave the original publication date marked up as the publication date in schema, and mark up the new republication date as the last-modified date. This is a much clearer and more accurate representation to Google of what you are doing with the article.
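As a minimal sketch of that markup (the headline and dates below are made-up placeholders, not from our actual site), the JSON-LD for a republished article might look like:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to Improve Widget Conversion (Updated)",
  "datePublished": "2012-06-14",
  "dateModified": "2021-09-30"
}
</script>
```

Note that `datePublished` keeps the original date, while `dateModified` reflects the republish.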
We are also displaying the last-modified date to the user as the primary date, with the publication date made secondary. The intent is to show the user that the article has been recently updated, so they know the information is current.
To get this working properly, we had to rework how our CMS handles both the published date and the last-modified date, but in the end I think we are giving better signals to both Google and users about the status of our articles.
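For the on-page display described above, a rough sketch (class names and dates are hypothetical) using the HTML `<time>` element might be:

```html
<!-- Last-modified date shown as the primary date; original publish date secondary. -->
<p class="article-dates">
  Updated <time datetime="2021-09-30">September 30, 2021</time>
  <small>(originally published <time datetime="2012-06-14">June 14, 2012</time>)</small>
</p>
```

Keeping the visible dates consistent with the schema markup avoids sending Google conflicting signals.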
-
You'll probably experience a dip from not publishing new content but I don't believe there will be any other issues.
Updating old content (drip fed or in bulk) won't trigger any spam/manipulation flags.
Related Questions
-
Content on product category pages - does Google care?
Hi All, I've always been unsure about the importance of content on product category pages. Nobody reads it. If you search for "living room chairs", you're just going to want to see a big list of living room chairs - not read content about living room chairs, how to choose one, etc. On virtually any ecommerce site, category pages have a paragraph or two of total bla-bla. Does this have any impact on search rankings? More specifically, will Googlebot see content on how to choose a living room chair and say "Yes! This is really helpful content"? Or, will it realize that the searcher intent on this keyword is really just to see a list of chairs, and ignore this content - or at least downplay its importance? WDTY?
On-Page Optimization | BarryBuckman
-
Delete or not delete outdated content
Hi there!
We run a website about a region in Italy, the Langhe area, where we write about wine and food and local culture, and we provide tourist information. The website also sports a nice events calendar: in 4 years we (and our users) loaded more than 5,700 events. Now we're starting to have some trouble managing this database. The events database is huge, both in file size and number of rows. There are a lot of images that eat up disk space, and it's becoming difficult to manage all the data in our backend. Also, a lot of users are entering the website by landing on outdated events. I was wondering if it could be a good idea to delete events older than 6 months: the idea is to keep only the most important and yearly recurring events (which we can update each year with fresh information) and trash everything else. This of course means that 404 errors will increase, and that our content will get thinner, but at the same time we'll have a more manageable database, and the content will be more relevant and "clean". What do you think? Thank you 🙂 Best
On-Page Optimization | Enrico_Cassinelli
-
Duplicate content on events site
I have an events website, and for every day an event occurs, the event has a page. For example: Oktoberfest in Germany runs for 16 days, so my site would have 16 (almost) identical pages about Oktoberfest (same text, address, photos, contact info). The only difference between the pages is the date mentioned on the page. I use rich snippets. How does Google treat my pages, and what is the best practice?
On-Page Optimization | dragonflo
-
Old landing page modifications - should I change the content?
One of our most popular landing pages is starting to be a little outdated. Should I keep the old content and update it with newer text, or is it safe to completely replace the old content with new content without losing our organic traffic on this page?
On-Page Optimization | rusted88
-
Should I Remove This Subdirectory From Google?
On my site, I have a subdirectory. It posts articles from a bunch of websites that my readers are interested in and links back to all of those sites. There is no original content in it. There are over 1,700 indexed pages in this subdirectory; the rest of my site has about 500 (all original content). Search engine traffic to this subdirectory only accounts for 3.9% of my site's overall visits. Should I consider removing this subdirectory? Could all the duplicate content be hurting the rankings of my legit pages? What do you all think?
On-Page Optimization | PedroAndJobu
-
Over Optimization Penalty
Hi Guys, I was listening to the coverage of the over-optimization penalty Matt Cutts talked about at SXSW, over at Search Engine Land: http://searchengineland.com/too-much-seo-google%E2%80%99s-working-on-an-%E2%80%9Cover-optimization%E2%80%9D-penalty-for-that-115627# Looking at my on-page reports through SEOmoz, I have "A" report cards for each page. Will we be OK, or do we need to take a look? Thanks, Scott
On-Page Optimization | ScottBaxterWW
-
Page without content
Hey Everyone, I've started an on-page SEO analysis for a website and I've found a lot of duplicate content and useless pages. What should I do: delete these useless pages, redirect them, or add a canonical tag? And if I delete them, what is the best way? Should I use GWT, or just delete them from the server? This URL, for example: http://www.sexshopone.com.br/?1.2.44.0,0,1,13,0,0,aneis-evolved-boss-cock's.html [admin note: NSFW page] has no content and is a duplicate of this one: http://www.sexshopone.com.br/?1.2.44.0,0,1,12,0,0,aneis-evolved-boss-cock's.html [admin note: NSFW page] and the correct product page is: http://www.sexshopone.com.br/?1.2.44.0,423,anel-peniano-evolved-boss-cock's-pleasure-rings-collar-white-reutilizavel-e-a-prova-d'agua-colecao-evolved.html [admin note: NSFW page] What is happening is that we have 8,000 pages like this, useless and without any content. How do I proceed? Thanks!
On-Page Optimization | luf0709
-
Google is indexing spam pages from my site. What is the most effective way to get rid of the search results? The pages are deleted now, but should I do something more?
A long time ago I created a forum (Invision Power Board) and it got full of spam. Massive amounts! /forum/ I've now deleted the forum, but the spam pages are still indexed on Google. Can I do something else to hurry up the process of getting rid of them?
On-Page Optimization | ocarlsson