What to do with old, outdated, and light content on a blog?
-
So there's a blog I recently took over that has published great content over the past 2 years. However, of their 800+ published posts, I'd say that 250-300 are light on content - nothing more than a small paragraph with no real specificity about what it covers, more like general updates.
Now, what would best practice be: optimizing all of the posts, or deleting them and 301'ing each URL to another post or the root?
-
Nope - minimal to no traffic
-
#1: Users. Is anyone reading them? If not, there's no point for them to exist.
-
That's what I was leaning towards. A lot of this content links out to support pages on other sites, so I was a bit wary about losing link juice to those pages. However, I think the content and pages aren't authoritative enough to really show any loss from the dropped links.
-
Both of those ideas sound exactly like what I would do. If they have credible posts that are valuable to their users, keep them and add to them - maybe share them on social media once you add more relevant content. If they really aren't anything useful, I would delete them and 301 to something similar.
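The triage described above - keep and improve the valuable posts, delete and 301 the rest - can be sketched in a few lines. This is a minimal, CMS-agnostic sketch: the post data, the 150-word threshold, and the `/` consolidation target are all illustrative assumptions, not details from the site in the question.

```python
THIN_WORD_COUNT = 150  # assumption: anything shorter counts as "light" content

def classify_posts(posts, threshold=THIN_WORD_COUNT):
    """Split posts into (keep, review) URL lists by body word count."""
    keep, review = [], []
    for post in posts:
        words = len(post["body"].split())
        (keep if words >= threshold else review).append(post["url"])
    return keep, review

def redirect_map(thin_urls, target="/"):
    """Build a simple 301 map: thin URL -> consolidation target."""
    return {url: target for url in thin_urls}

# Hypothetical posts for illustration
posts = [
    {"url": "/deep-guide", "body": "word " * 900},
    {"url": "/quick-update", "body": "word " * 40},
]
keep, review = classify_posts(posts)
print(keep)                  # ['/deep-guide']
print(review)                # ['/quick-update']
print(redirect_map(review))  # {'/quick-update': '/'}
```

In practice the "review" bucket would still get a human look before anything is redirected, since a short post that earns traffic or links is worth expanding rather than deleting.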
Related Questions
-
If I have an https page with an http img that redirects to an https img, is it still considered by google to be a mixed content page?
With Google starting to crack down on mixed content, I was wondering: if I have an https page with an http img that redirects to an https img, is it still considered by Google to be a mixed content page? E.g., in an old blog article there are images that weren't updated when the blog migrated to https, but were just 301ed to new https images. Is it still considered a mixed content page?
Algorithm Updates | David-Stern
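A quick way to audit for this is to scan the rendered HTML for subresources still referenced via `http://` - browsers generally evaluate the URL as written in the markup, so an http reference that 301s to https is typically still flagged as mixed content. This is a rough illustrative sketch (the regex and sample page are assumptions, not a complete parser):

```python
import re

# Matches src/href attributes whose URL is plain http:// (not https://)
HTTP_SRC = re.compile(r'(?:src|href)=["\'](http://[^"\']+)["\']', re.IGNORECASE)

def find_http_references(html):
    """Return all non-TLS subresource URLs referenced in the markup."""
    return HTTP_SRC.findall(html)

page = '''
<img src="http://example.com/old-image.jpg">
<img src="https://example.com/new-image.jpg">
'''
print(find_http_references(page))  # ['http://example.com/old-image.jpg']
```

The practical fix is usually to rewrite the image URLs in the old posts themselves rather than relying on the redirect.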
SEO threats of moving from [.com.au] domain to [.com] domain for a 15yr old SAAS company.
Hey guys, I work for a 15-year-old SaaS company which originally started with a country-specific [.com.au] domain and later got a [.com] domain as the business grew. The AU website has a DA of 56, while the [.com] has a DA of 25. Now we are looking to migrate everything to the [.com] domain, but my concern is that we might lose the SEO value of the AU domain. I was wondering if anyone has any experience with this or can recommend a case study on the topic. Thanks! Allan
Algorithm Updates | allanhenryjohn
How often should I update the content on my pages?
I have started dropping in the rankings - due to lack of time after having a baby. I'm still managing to blog, but I'm wondering: if I update the content on my pages, will that help? All my meta tags and page descriptions were updated over a year ago - do I need to update these too? We ranked in the top spots for a good few years, but we're slowly falling 😞 Please give me any advice to keep us from falling even further. I have claimed all my listings and try to add new links once a month. I share my blog to all social sites and work hard to get Google reviews; we have 53, which is more than any of our competitors. Any other ideas? Have I missed something that Google is looking for nowadays? Many thanks 🙂
Algorithm Updates | Lauren1689
SEO Myth-Busters -- Isn't there a "duplicate content" penalty by another name here?
Where is that guy with the mustache in the funny hat and the geek when you truly need them? So SEL (SearchEngineLand) recently said that there's no such thing as "duplicate content" penalties: http://searchengineland.com/myth-duplicate-content-penalty-259657 By the way, I'd love to get Rand or Eric or other Mozzers (aka TAGFEE'ers) to weigh in here if possible. The reason for this question is to double-check a possible "duplicate content"-type penalty (possibly by another name?) that might accrue in the following situation.
1 - Assume a domain has a 30 Domain Authority (per OSE).
2 - The site on the current domain has about 100 pages, all hand-coded. It does very well in SEO because we designed it to do so. The site is about 6 years old in its current incarnation, with a very simple, basically hand-coded e-commerce cart. I will not name the site for obvious reasons.
3 - Business is good. We're upgrading to a new CMS (hooray!). In doing so we are implementing categories and faceted search, with plans to try to keep the site under 100 new "pages" using a combination of rel canonical and noindex. I will also not name the CMS for obvious reasons. In simple terms, as the site is built out and launched in the next 60-90 days, assuming we have 500 products and 100 categories, that yields at least 50,000 pages - and with other aspects of the faceted search, it could easily create 10X that many.
4 - In ScreamingFrog tests of the DEV site, it is quite evident that there are many tens of thousands of unique URLs that are basically the textbook illustration of a duplicate-content nightmare. ScreamingFrog has also been known to crash while spidering, and we've discovered thousands of URLs of live sites using the same CMS. There is no question that spiders are somehow triggering some sort of infinite page generation - we can see that both on our DEV site and in the wild (in Google's Supplemental Index).
5 - Since there is no "duplicate content penalty" and there never was, are there other risks here caused by infinite page generation, like burning up a theoretical "crawl budget" or having the bots miss pages, or other negative consequences?
6 - Is it also possible that bumping a site that ranks well with 100 pages up to 10,000 pages or more might dilute link juice as a result of all this (honest but inadvertent) duplicate content? In other words, is inbound link juice and ranking power essentially divided by the number of pages on a site? Sure, it may be somewhat mediated by internal linking, but what are the actual big-dog issues here?
So has SEL's "duplicate content myth" truly been myth-busted in this particular situation? Thanks a million!
Algorithm Updates | seo_plus
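The URL explosion described in points 3-4 is simple combinatorics: each independently applicable facet multiplies the number of crawlable URLs. A back-of-the-envelope sketch, with facet sizes that are illustrative assumptions rather than figures from the site in question:

```python
# Each facet's option count; a crawler can hit any combination of them.
facets = {
    "category": 100,  # e.g. 100 category pages
    "color": 10,
    "size": 8,
    "sort": 4,
}

def url_count(facets):
    """Every combination of facet values is a distinct crawlable URL."""
    total = 1
    for options in facets.values():
        total *= options + 1  # +1 for "facet not applied"
    return total

print(url_count(facets))  # 101 * 11 * 9 * 5 = 49,995 potential URLs
```

Four modest facets already approach 50,000 URLs, which is why rel=canonical, noindex, and parameter handling matter long before any "penalty" question does: the crawl budget gets spent on near-duplicates either way.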
Site has disappeared since Panda 4 despite quality content, help!
Our site www.physicalwellbeing.co.uk has lost over 20 first-page rankings since the end of May. I assume this is because of Panda 4.0. All content on the site is high quality and 100% unique, so we did not expect to get penalised - although I read somewhere that if Google can't read particular JS anymore, they don't rank you as high. The site has not been blacklisted, as all pages are showing in Google's index and there are no messages in Webmaster Tools. We have not taken part in any link schemes and have disavowed all low-quality links that were pointing there, just in case (after the penalty). Can anybody see anything on www.physicalwellbeing.co.uk that may have caused the Panda update to affect it so negatively? Would really appreciate any help.
Algorithm Updates | search_shop
Moving content in to tabs
Hi, I'm kind of an SEO noobie, so please bear with me 🙂 On one of the sites I'm working on, I got a request to move large blocks of content, currently just placed on the page, into tabs. This makes sense - we tried it, and it makes navigating the information much easier for visitors. My question is: will Google consider this hiding information? It's not loaded dynamically; it's all there in the source when the page loads, but not displayed until the visitor clicks the tab. Will this cause SEO issues? Thank you!
Algorithm Updates | eladlachmi
What is considered duplicate content in an ecommerce website that offers the same product for retail and wholesale purchasing?
I have an ecommerce website that offers retail and wholesale products which are identical, with the exception, of course, of pricing. My concern is duplicate content. If the same product is offered under both the retail and wholesale categories, and described identically except for price, metadata, and a few words, is that considered duplicate content, and would both pages be disregarded by the robots? Is it best to avoid using the same description for that one product under the two separate categories? Thanks for all your help!
Algorithm Updates | flaca
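One way to gauge how close two product descriptions are before publishing both is a rough text-similarity check; Python's stdlib `difflib` is enough for a sketch. The sample descriptions and the 0.9 threshold below are illustrative assumptions:

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Rough similarity ratio between two strings, 0.0 to 1.0."""
    return SequenceMatcher(None, a, b).ratio()

# Hypothetical retail vs wholesale copy, identical except for price
retail = "Organic cotton t-shirt, pre-shrunk, available in six colors. $25."
wholesale = "Organic cotton t-shirt, pre-shrunk, available in six colors. $14."

score = similarity(retail, wholesale)
print(score > 0.9)  # near-identical text scores close to 1.0
```

A score near 1.0 is a signal to either differentiate the copy or point one page at the other with rel=canonical, rather than letting search engines pick which version to keep.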
TOP 3-5 SEO Blogs
I am wondering if you can help me get started with the top three to five SEO blogs. I have really been enjoying learning more about SEO, and it becomes really fun as it gets less overwhelming. A few days ago there was a question about great SEO blogs, and everyone provided a great list. I bookmarked all of them, but in reality I won't be able to go through them all and really absorb what's being presented. My question is: what would be the best 3-5 to start with? Eventually I will go through them all, but your experience can help me get on the right track. Thanks for the suggestions!
Algorithm Updates | fertilityhealth