Updating Old Content at Scale - Any Danger from a Google Penalty/Spam Perspective?
-
We've read a lot about the power of updating old content (making it more relevant for today, finding other ways to add value to it) and republishing (by "republishing" I mean changing the publish date from the original date to today's date - not publishing on other sites).
I'm wondering if there is any danger of doing this at scale (designating a few months out of the year where we don't publish brand-new content but instead focus on taking our old blog posts, updating them, and changing the publish date - ~15 posts/month). We have a huge archive of old posts we believe we can add value to and publish anew to benefit our community/organic traffic visitors.
It seems like we could add a lot of value to readers by doing this, but I'm a little worried this might somehow be seen by Google as manipulative/spammy/something that could otherwise get us in trouble.
Does anyone have experience doing this or have thoughts on whether this might somehow be dangerous to do?
Thanks Moz community!
-
Awesome, thank you so much for the detailed response and ideas - this all makes a good deal of sense and we really appreciate it!
-
We have actually been doing this on one of our sites where we have several thousand articles going all the way back to the late 90s. Here is what we do / our process (I am not including how to select articles here, just what to do once they are selected).
1. Really take the time to update the article. Ask the questions: "How can we improve it? Can we give better information? Better graphics? Better references? Can we improve conversion?"
2. Republish with a new date on the page. Sometimes add an editor's note explaining that this is an updated version of an older article.
3. Keep the same URL to preserve link equity etc., or 301 to a new URL if needed.
4. Mix these in with new articles as part of our publication schedule.
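For step 3, when a URL genuinely has to change, a permanent (301) redirect tells Google the move is final and passes the old URL's link equity along. A minimal sketch in Apache .htaccess syntax, with hypothetical paths:

```apache
# Hypothetical example: the refreshed article moved to a cleaner, date-free slug.
# "Redirect 301" marks the move as permanent so link equity follows the new URL.
Redirect 301 /blog/2009/old-widget-guide /blog/widget-guide
```

The same idea works in nginx or at the CMS level; the key is that it's a 301 (permanent), not a 302 (temporary).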
We have done this for years and have not run into issues. I do not think Google sees this as spammy as long as you are really taking the time to improve your articles. John M. and Gary I. have stated unequivocally that Google likes it when you improve your content. We have done the above and it has not been dangerous at all; our content is better overall. In some cases where we really focused on conversion, we not only got more traffic but converted better. Doing this benefits your visitors, which usually translates into Google liking the result.
I would ask: why designate a few months where you only recycle content rather than mixing it up all year long? If you were going to dedicate three months of the year to updating content, why not instead take the third week of every month, or every Wednesday, and do the same thing? You accomplish the same amount of work, but spread it out. Make it a feature! Flashback Friday, etc.
Bonus idea - make sure you get the schema right
We have something new in our process. Previously, we only marked up the publication date in schema, so when we republished, we would change that schema date to the new publish date. Now that Google wants both a publication date and a last-modified date in schema, we have changed our process: when we republish content, we leave the original publication date marked up as the publication date and mark the date of the republish as the last-modified date. This is a much clearer and more accurate representation to Google of what you are doing with the article.
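As a sketch, the JSON-LD for a refreshed article might look like this (headline and dates are hypothetical; `datePublished` and `dateModified` are the schema.org property names):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example: Our Updated Widget Guide",
  "datePublished": "2012-04-10",
  "dateModified": "2021-06-15"
}
</script>
```

The original publication date stays put in `datePublished`; only `dateModified` changes on each republish.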
We are also displaying the last-modified date to the user as the primary date, with the publication date made secondary. The intent is to show the user that the article has been recently updated, so they know the information is current.
To get this to work properly, we had to rework how our CMS handles both the published date and the last-modified date, but in the end, I think we are giving better signals to Google and users about the status of our articles.
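On the page itself, a sketch of how the two dates might be displayed (hypothetical markup and class names; the `<time>` element's `datetime` attribute keeps the dates machine-readable):

```html
<p class="article-dates">
  <!-- Last-modified date shown first, so readers see the content is current -->
  Updated <time datetime="2021-06-15">June 15, 2021</time>
  <span class="pub-date">
    (originally published <time datetime="2012-04-10">April 10, 2012</time>)
  </span>
</p>
```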
-
You'll probably experience a dip from not publishing new content but I don't believe there will be any other issues.
Updating old content (drip fed or in bulk) won't trigger any spam/manipulation flags.