Updating Old Content at Scale - Any Danger from a Google Penalty/Spam Perspective?
-
We've read a lot about the power of updating old content (making it more relevant for today, finding other ways to add value to it) and republishing (here I mean changing the publish date from the original date to today's date - not publishing on other sites).
I'm wondering if there is any danger of doing this at scale (designating a few months out of the year where we don't publish brand-new content but instead focus on taking our old blog posts, updating them, and changing the publish date - ~15 posts/month). We have a huge archive of old posts we believe we can add value to and publish anew to benefit our community/organic traffic visitors.
It seems like we could add a lot of value to readers by doing this, but I'm a little worried this might somehow be seen by Google as manipulative/spammy/something that could otherwise get us in trouble.
Does anyone have experience doing this or have thoughts on whether this might somehow be dangerous to do?
Thanks Moz community!
-
Awesome, thank you so much for the detailed response and ideas - this all makes a good deal of sense and we really appreciate it!
-
We have actually been doing this on one of our sites where we have several thousand articles going all the way back to the late 90s. Here is what we do / our process (I am not including how to select articles here, just what to do once they are selected).
1. Really take the time to update the article. Ask the questions: "How can we improve it? Can we give better information? Better graphics? Better references? Can we improve conversion?"
2. Republish with a new date on the page. Sometimes add an editor's note explaining that this is an updated version of the older article.
3. Keep the same URL to preserve link equity, or 301 redirect to a new URL if needed.
4. Mix these in with new articles as part of our publication schedule.
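Where a move to a new URL is needed, a permanent (301) redirect is what passes along most of the old URL's link equity. A minimal sketch for an Apache `.htaccess` file (the slugs here are hypothetical placeholders, not from the thread):

```apache
# Permanently redirect the old article URL to the updated version.
# A 301 tells crawlers the move is permanent, so link equity follows.
Redirect 301 /blog/old-article-slug /blog/updated-article-slug
```

Nginx, or a CMS redirect plugin, can accomplish the same thing; the key point is using a 301 (permanent) rather than a 302 (temporary).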
We have done this for years and have not run into issues. I do not think Google sees this as spammy as long as you are really taking the time to improve your articles. John M. and Gary I. have stated unequivocally that Google likes it when you improve your content. We have done the above, it has not been dangerous at all. Our content is better overall. In some cases where we really focused on conversion, we not only got more traffic, but converted better. Doing this will only benefit your visitors, which usually translates into Google liking the result.
I would ask: why designate a few months where you only recycle content, rather than mixing it in all year long? If you were going to set aside three months of the year just for updating content, why not instead take the third week of each month, or every Wednesday, and do the same thing? You accomplish the same amount of work but spread it out. Make it a feature! Flashback Friday, etc.
Bonus idea - make sure you get the schema right
We have recently changed something in our process. Previously, we only marked up the publication date in schema, so when we republished, we would change the publication date in the schema to the new pub date as well. Now that Google wants both a publication date and a last-modified date in schema, we have changed our process: when we republish content, we leave the original publication date marked up as the publication date in schema, and mark up the date the article is republished as the last-modified date. This is a much clearer and more accurate representation to Google of what you are doing with the article.
We are also displaying the last-modified date to the user as the primary date, with the publication date secondary. The intent is to show the user that the article has been recently updated, so they know the information is current.
To get this to work properly, we had to rework how our CMS interacts with content on both published date and last modified date, but in the end, I think we are giving better signals to Google and users on the statuses of our articles.
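As a concrete illustration of the date handling described above, an article's JSON-LD might look something like this (the URL, headline, and dates are hypothetical placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "mainEntityOfPage": "https://example.com/blog/updated-article-slug",
  "headline": "Example Article Title",
  "datePublished": "1999-06-15",
  "dateModified": "2023-03-01"
}
```

Here `datePublished` keeps the article's original date while `dateModified` carries the republish date, matching the dates displayed on the page itself.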
-
You'll probably experience a dip from not publishing new content, but I don't believe there will be any other issues.
Updating old content (drip fed or in bulk) won't trigger any spam/manipulation flags.