Recovering an Almost Dead Blog?
-
Hello,
I have wanted to ask this for a long time, but I finally gathered the energy to ask this long question at Moz.
Like almost all newbies with little knowledge of SEO and Google, I started my first blog in 2009. Things were very different back then: by posting more and more, I got good results and built decent traffic, even with poor content (I really didn't care about quality, since I was getting organic traffic anyway).
But things changed completely with Google Panda after 11th April 2011. Since then, traffic has kept falling. I never built backlinks, so the Penguin updates never hit us, but because of poor, thin content the site sank lower and lower.
I took some steps, like increasing the word count of posts and removing some posts, but nothing has worked so far.
The blog has almost 1,200 articles, and most importantly, it was my first blog, so I am a bit attached to it.
The blog is 6 years old now and has received 2 million organic visits to date. (Organic traffic screenshot attached.)
My question is: can something seriously be done for this blog, or should I just let it go?
I would appreciate some genuine advice on that.
Thanks
-
Yep, it's a big, tedious task, but there are no shortcuts if you want to do it right.
-
I will try to do so; thanks for the tip about keeping the posts private.
However, with 1,200 posts, it's a big task.
Can anyone recall something similar with positive results?
-
Honestly, if you're using a CMS like WordPress, all you should need to do is unpublish the posts and let the search engines sort out the rest. If a post returns a 404, it will be dropped from the index naturally. I can't think of any reason you'd need to do more work than that.
Also, a tip: I prefer setting the posts I'm removing to "Privately Published" rather than deleting them entirely. I like to keep removed content as a sort of historical archive, and it returns the same 404 message on the front end.
-
Yes, almost 90% of the posts are not getting traffic. Some are event posts, so they get some traffic during the event and nothing before or after.
What's the best way to remove posts? Delete them and ask Webmaster Tools to deindex them and remove the cached versions, or something else?
-
Andy, you are asking questions and I am looking for answers.
I am OK with removing all the useless posts, which means almost clearing the entire blog. I am also willing to contribute more to this site, but the question is whether it's worth it. Will Google really start picking up my blog again?
What if I remove almost 90% of the posts and just leave the 10% with meaningful content?
Also, should I do some link building?
-
My guess is that most of your blog posts aren't getting any traffic or engagement, but there are probably a few that do. I would start with a content audit, looking at the organic traffic, social engagement, and backlinks for each page. You may not have built any links, but that doesn't mean your work hasn't earned them. Keep anything that draws consistent traffic, has been shared more than a few times, and has good-quality backlinks. Let the rest 404. You'll need to make the determination on a case-by-case basis.
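If it helps at 1,200-post scale, that first pass can be scripted once you export per-URL metrics. A minimal sketch in Python; the field names and thresholds are hypothetical (tune them to your own numbers), and it treats the three signals as keep-if-any rather than all-three:

```python
# Rough content-audit triage: keep a post if it earns consistent organic
# traffic, has been shared more than a few times, or has attracted links.
# Thresholds and field names are illustrative, not Moz or Google values.
MIN_VISITS = 10   # monthly organic visits counted as "consistent traffic"
MIN_SHARES = 3    # "shared more than a few times"
MIN_LINKS = 1     # at least one linking domain

def audit(rows):
    """rows: iterable of dicts with url, organic_visits, shares, linking_domains.
    Returns (keep, drop) lists of URLs."""
    keep, drop = [], []
    for row in rows:
        worth_keeping = (
            int(row["organic_visits"]) >= MIN_VISITS
            or int(row["shares"]) >= MIN_SHARES
            or int(row["linking_domains"]) >= MIN_LINKS
        )
        (keep if worth_keeping else drop).append(row["url"])
    return keep, drop
```

You could feed it rows from `csv.DictReader` over your analytics export; anything in the `drop` list is a candidate to let 404, but still make the final call on each post by hand.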
-
Well, you could decimate most of the site and fix many issues, but would this be enough to pull it back for you?
Of those that would remain, would you consider them to be more authoritative posts? Would they stand up in the face of Panda without issue?
-Andy
-
Andy, the problem is that most of the articles are short news pieces, and I don't know what can be done with them. I may deindex all of those posts; that would be approximately 1,000 posts (almost 85% of the total).
Only a few posts drive traffic. However, I have been posting very little lately, not even 10 posts in the last year.
-
As I said before, I have not built backlinks for this site at all; all the links are natural. I was never hit by Penguin; it was Panda all the time.
-
Hi Ankit,
All is not lost, but it all depends on the time you can put into correcting it.
Have you ever tried to fix the Panda issues? There is a wealth of information available out there - here are a couple of Google ones to read:
Remember that Panda focuses on thin and duplicate content, which translates to low quality, so if you think you have ways to correct this, there is no reason you can't pull the traffic back.
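For a first pass over 1,200 posts, the thin and exact-duplicate cases can be flagged with a short script, assuming you can export the post bodies. This is a rough sketch; the 300-word cutoff is an illustrative choice, not an official threshold:

```python
import hashlib

THIN_WORDS = 300  # illustrative cutoff for "thin"; not a Google number

def triage(posts):
    """posts: dict mapping URL -> body text.
    Returns (thin, duplicates): URLs under the word cutoff, and
    (url, first_seen_url) pairs whose bodies are exact duplicates."""
    thin, seen, duplicates = [], {}, []
    for url, body in posts.items():
        if len(body.split()) < THIN_WORDS:
            thin.append(url)
        # Hash the normalized body to spot exact copies cheaply.
        digest = hashlib.sha256(body.strip().lower().encode()).hexdigest()
        if digest in seen:
            duplicates.append((url, seen[digest]))
        else:
            seen[digest] = url
    return thin, duplicates
```

This only catches exact duplicates; near-duplicates need fuzzier matching, and a human still has to judge whether a short post is genuinely low quality or just concise.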
-Andy
-
That's a disaster. I would suggest you check each and every backlink. Try removing the spammy ones or disavow them. Add new posts and link them to old posts. Keep the site active!
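If the backlink check does turn up spammy links you can't get removed, the file Google's disavow links tool accepts is just a plain-text list, one entry per line (the domains below are placeholders):

```text
# Comment lines start with "#".
# Disavow every link from an entire domain:
domain:spammy-directory.example
# Disavow one specific URL:
http://bad-links.example/page.html
```

Only disavow links you're confident are harmful; as noted earlier in the thread, if all the links are natural, the real problem here is Panda (content), not Penguin (links).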