Update content
-
y'all,
what is the recommended interval at which content on a website should be refreshed?
TY
-
Refresh content when you have something of value to say. If you are not that creative, as I am not, then have guest bloggers add valuable content. Quality matters more than frequency. It also depends on your visitors' expectations, which depend on what type of site you are running.
Posting too often can also overwhelm visitors and leave them feeling unable to keep up with the site.
Small sites: 2-3 posts per month
Large sites (non-news): 2-3 posts per day (unless your site is broken into categories, then 1-2 per day per category)
-
I believe that having something new on the homepage when returning visitors arrive will make them come back more often.
To accomplish that, we have a script that rotates content through a few of the blocks on our homepage. The script runs every hour, so if a person returns later in the day they will see different items. We have enough content in this rotation that the typical visitor could make many visits before seeing the same item a second time.
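A minimal sketch of that kind of hourly rotation (the poster's actual script isn't shown, so the names and approach here are assumptions): rank the content pool by a hash of the current hour plus each item, so every visitor sees the same selection within an hour and the selection reshuffles when the hour rolls over.

```python
import hashlib
import time

def rotate_blocks(pool, num_blocks, hour_stamp=None):
    """Deterministically pick num_blocks items from pool for a given hour.

    All visitors see the same selection within the hour; the selection
    changes when the hour number changes.
    """
    if hour_stamp is None:
        hour_stamp = int(time.time() // 3600)  # current hour number
    # Rank items by a hash of (hour, item) so the order reshuffles hourly
    ranked = sorted(
        pool,
        key=lambda item: hashlib.md5(f"{hour_stamp}:{item}".encode()).hexdigest(),
    )
    return ranked[:num_blocks]
```

With, say, 50 items in the pool and 3 homepage blocks, a returning visitor would typically see fresh items across many visits before repeats, matching the behavior described above.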
Also, on most pages of our site we have links to our ten most recent blog posts that match the category of the page. These update every hour as we make up to ten blog posts per day. We also have lists of "related articles" that rotate hourly.
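A per-category "most recent posts" list like the one described can be built by filtering posts on category and sorting by publish date; this sketch assumes a hypothetical post structure (dicts with `category` and `published` fields), not the poster's actual setup.

```python
from datetime import date

def recent_posts_for_category(posts, category, limit=10):
    """Return up to `limit` posts in `category`, newest first."""
    matching = [p for p in posts if p["category"] == category]
    matching.sort(key=lambda p: p["published"], reverse=True)
    return matching[:limit]
```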
All of this keeps almost every page of our site constantly changing. I don't know what Google thinks of it, but adding these rotating content features has increased visitor engagement on our site.
-
Great suggestion, thank you!
-
I don't think there is any black-and-white answer here. The correct answer is: refresh content as often as you can while still maintaining a high level of quality. You shouldn't be updating content just for the sake of updating it.
If you are in a specific niche, you might not be able to produce content as frequently as you'd like without some creative thinking, but the same principle applies: as often as possible without sacrificing quality. On my main blog, I try to get something new up every other day, and I update the most popular articles as relevant information becomes available. For example, I update our "Top 10 Workplace Accidents" article every 30 days, because that is when OSHA releases new statistics.