Updating existing content - good or bad?
-
Hi All,
There are many situations where I encounter the need (or the wish) to update existing content.
Here are a few reasons:
- An update on the subject turns up that doesn't justify a new post/article, just adding a couple of lines.
- The article was simply poorly written, yet the page has PageRank since it covers a good subject and has been online for quite some time (alternatively, I could create a new and improved article and 301 the old one to the new).
- Improving the titles and subtitles of old existing articles.
I would love to hear your thoughts on each of the reasons...
Thanks
-
Wikipedia updates content all the time, and they seem to rank rather well.
From Google's perspective, they would rather rank up-to-date content, so yes, it's got to be a good idea to update. An old page might have links pointing to it and history with Google, so if it also has up-to-date content, it's got to be better than a brand-new page.
-
In all three cases mentioned in the post, it seems like a good idea to update the existing pages rather than create new ones. Obviously, if the article is poorly written, you should fix its content and update the page instead of creating a new one; the same goes for the other two scenarios.
I think this video by SEOmoz contains your answer: http://www.seomoz.org/blog/whiteboard-interview-googles-matt-cutts-on-redirects-trust-more
Hope this helps!
-
Hi Fernando,
Long time no see.
The site's tool is technically accurate; however, I just want to point out that if your server doesn't send the header, your URL obviously won't pass the check, but you don't need new hosting, despite what the tool's page states.
Here's an example of a URL that handles the header appropriately:
http://www.feedthebot.com/tools/if-modified/
Here's an example of what happens when I put in my homepage, which obviously sends no such header:
Does your webpage support the If Modified Since HTTP header?
Enter URL (example: www.feedthebot.com)
No. This website does not support the If-Modified-Since HTTP header. Scroll down for details.
Technical stuff:
This tool checked your HTTP headers and received this response ...
Server Response:
HTTP/1.1 200 OK
Server: WP Engine/1.2.0
Date: Thu, 02 May 2013 03:57:11 GMT
Content-Type: text/html; charset=UTF-8
Transfer-Encoding: chunked
Connection: keep-alive
Keep-Alive: timeout=20
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Pragma: no-cache
X-Pingback: http://www.blueprintmarketing.com/xmlrpc.php
X-UA-Compatible: IE=Edge,chrome=1
X-Cacheable: SHORT
Vary: Accept-Encoding,Cookie
Cache-Control: max-age=600, must-revalidate
X-Cache: HIT: 13
X-Cache-Group: normal
X-Type: default
There does not appear to be a "Last-Modified" header in the response. Therefore, this tool has determined that this URL does not support If-Modified-Since.
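To make the tool's verdict concrete, here is a small sketch (a hypothetical helper, Python standard library only, not the tool's actual code) of the check it performs: scan the raw response headers for a Last-Modified value, which is the date a spider would replay in an If-Modified-Since request. Without it, there is nothing to replay, so the check fails:

```python
# Hypothetical sketch of the check the tool performs: look for a
# Last-Modified header, since that is the date a spider would replay
# in an If-Modified-Since request.

def find_last_modified(raw_headers):
    """Return the Last-Modified value from a raw header block, or None."""
    for line in raw_headers.splitlines():
        name, _, value = line.partition(":")
        if name.strip().lower() == "last-modified":
            return value.strip()
    return None

# Abbreviated version of the response pasted above: no Last-Modified.
response = """HTTP/1.1 200 OK
Server: WP Engine/1.2.0
Date: Thu, 02 May 2013 03:57:11 GMT
Content-Type: text/html; charset=UTF-8
Cache-Control: max-age=600, must-revalidate"""

print(find_last_modified(response))  # None -> "does not support if modified since"
```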
Web hosts who do support If Modified Since...
We use and recommend using BlueHost for your hosting needs -
Here is some more information on If-Modified-Since:
http://www.seomoz.org/q/is-the-if-modified-since-http-header-still-relevant
It seems you want to pay close attention, when implementing it, to the clock on the server as well as on the actual workstation:
http://redmine.lighttpd.net/boards/2/topics/1999
http://trac.nginx.org/nginx/ticket/93
I hope this is of help,
Tom
-
If you are just updating the title, or rewriting the content, then I would go with the same page instead of creating a new one.
IF-MODIFIED-SINCE is the way of telling spiders whether the content has changed. You can read more here: http://www.feedthebot.com/ifmodified.html
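To show the mechanics end to end, here is a minimal sketch (Python standard library only; the article body, timestamp, and local check are made up for illustration, not anyone's production setup) of a server that honors If-Modified-Since, answering 304 Not Modified when the content hasn't changed since the spider's last visit:

```python
# Hedged sketch: a toy HTTP server (standard library only) that honors
# If-Modified-Since, plus a quick local check. The article body and
# timestamp are made up for illustration.
import threading
import urllib.error
import urllib.request
from email.utils import formatdate, parsedate_to_datetime
from http.server import BaseHTTPRequestHandler, HTTPServer

PAGE_MTIME = 1367467031  # epoch seconds of the page's last update (made up)

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        ims = self.headers.get("If-Modified-Since")
        if ims is not None:
            try:
                since = parsedate_to_datetime(ims).timestamp()
            except (TypeError, ValueError):
                since = None  # unparseable date: fall through to a full 200
            if since is not None and PAGE_MTIME <= since:
                self.send_response(304)  # unchanged since the spider's last visit
                self.end_headers()
                return
        body = b"<html>the updated article</html>"
        self.send_response(200)
        self.send_header("Last-Modified", formatdate(PAGE_MTIME, usegmt=True))
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

# Quick local check: the first fetch returns 200 with a Last-Modified
# header; replaying that date in If-Modified-Since yields 304.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = "http://127.0.0.1:%d/" % server.server_address[1]

with urllib.request.urlopen(url) as resp:
    first_status, last_modified = resp.status, resp.headers["Last-Modified"]

req = urllib.request.Request(url, headers={"If-Modified-Since": last_modified})
try:
    with urllib.request.urlopen(req):
        second_status = 200  # header was ignored
except urllib.error.HTTPError as err:
    second_status = err.code  # 304 is surfaced as an HTTPError by urllib

server.shutdown()
print(first_status, second_status)  # 200 304
```

The 304 response saves the spider from re-downloading an unchanged page, which is the crawl-efficiency benefit the tool above is testing for.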
-
Actually, that does sound familiar somehow, even though I know most people create a new post describing the change and point it to the old one (if there is enough to cover).
What about poorly written articles? Improving titles?
Could you explain what you mean by "IF-MODIFIED-SINCE"?
Thanks
-
Matt Cutts from Google pointed out in a Whiteboard video that you should update existing pages instead of creating new pages containing only the updates.
You can signal on the old page that the content was updated using "IF-MODIFIED-SINCE".
I can't find the video right now, but I am sure he did say that.