Removing blog posts with little/thin content
-
We've got quite a lot of blog posts (I'd say 75% of them) that I'd consider low quality: short content (500 words or less) with few shares and no backlinks or comments, most of which get 0-2 unique views a day (although combined this adds up).
Will removing these pages provide an SEO benefit greater than the traffic we'd lose by removing them?
I've heard the likes of Neil Patel and Brian Dean suggest it will, but I'm worried it will do the opposite: with less content indexed, I'll actually see traffic fall.
Sam
-
Sam,
If you can safely assume that the pages are not hurting you, let them stay. It's certainly not ideal to have a website loaded with thin content. But, as is the case with most small sites, the posts are likely to do you more good than harm, provided you're willing to show them some attention.
Here's a good strategy to deploy:
-
Find the top 10 posts, judged both by their GA performance and by how closely they match the topics you hope to rank for, then beef them up with additional text and graphics (one way to pull that shortlist is sketched after this list).
-
Republish the posts, listing them as "updated."
-
Share the posts via social, using a meaningful quote from each piece to draw interest and invite re-shares.
-
Continue sharing the posts in the following weeks, each time with new text.
-
Gauge the performance of each social share, then use that information both to write additional headlines for new posts and to identify which content is likely to draw the most interest.
-
Repeat the process with the next 10 posts.
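If it helps with the first step, here is a minimal sketch (not a prescription) of one way to pull that top-10 shortlist from a Google Analytics "All Pages" CSV export using pandas. The file name, column names, URL pattern, and topic list are all assumptions to adjust for your own site.

```python
# A minimal sketch of one way to do step 1: rank blog posts by unique pageviews
# from a Google Analytics "All Pages" CSV export and flag the ones that match
# the topics you want to rank for. File name, column names, URL pattern, and
# topic list are assumptions -- adjust them to your own export and site.
import pandas as pd

TARGET_TOPICS = ["link building", "local seo", "content audit"]  # hypothetical topics

df = pd.read_csv("all_pages_export.csv", thousands=",")  # assumed GA export

# Keep only blog posts (the /blog/ URL pattern is an assumption).
posts = df[df["Page"].str.contains("/blog/", na=False)].copy()

# Flag posts whose URL mentions one of the target topics.
topic_pattern = "|".join(t.replace(" ", "-") for t in TARGET_TOPICS)
posts["matches_topic"] = posts["Page"].str.contains(topic_pattern, case=False, na=False)

# Top 10 candidates to beef up: topic-relevant posts with the most unique pageviews.
top_10 = posts.sort_values(["matches_topic", "Unique Pageviews"], ascending=False).head(10)
print(top_10[["Page", "Unique Pageviews", "Bounce Rate", "matches_topic"]])
```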
When you have thin, poorly performing content on your site, you can't learn enough about what you're doing right to make a sound call. Creating more content, even "better" content, is likely a mistake at this stage. The wiser approach is to use the content you already have to investigate additional content ideas that would better serve your audience. Through social media and the additional traffic to your site, you should be able to discern which pieces of content will provide the greatest benefit in the future.
As a bonus, the old content is likely to perform much better, too.
RS
-
-
It's difficult to talk in terms of true value. Some of them may provide some value, but they pale in comparison to the new blog posts we have lined up and, in my opinion, bring the blog down; personally I wouldn't be sad to see them go.
I think it's time to exterminate.
Sam
-
Do the contents of these blog posts provide any value at all to the reader? Are they written well, and would you actually be sad to see them go? If yes, then refer to my previous response on re-purposing them to create even better content with more SEO value.
If not, and you're just worried about SEO, I'd say be rid of them, based on those stats.
-
Thanks, all. From my analysis of the last twelve months:
376 pages received traffic (although I'd estimate 70 of these aren't really pages)
104 pages have a bounce rate of 100%
307 pages have fewer than 20 unique views each, but combined they account for 1,374 unique views, which is a sizable sum.
So the question is: is it worth pulling every page with fewer than 20 unique views, and every page with a 100% bounce rate, from the site? Will it actually benefit our SEO, or am I just making work for myself?
I'd love to hear from people who've actually seen positive SEO movements after removing thin pages.
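For context, here's roughly how a candidate list like the one above could be pulled from a GA "All Pages" export. The file name and column names below are assumptions rather than anything specific to our setup.

```python
# Rough sketch of how a candidate list like the one above could be pulled from
# a GA "All Pages" CSV export covering the last 12 months. The file name and
# column names are assumptions; GA exports bounce rate as a string like
# "100.00%", hence the cleanup below.
import pandas as pd

df = pd.read_csv("all_pages_last_12_months.csv", thousands=",")

df["bounce_rate"] = df["Bounce Rate"].str.rstrip("%").astype(float)

low_traffic = df[df["Unique Pageviews"] < 20]
full_bounce = df[df["bounce_rate"] >= 100.0]

print(f"{len(low_traffic)} pages under 20 unique views, "
      f"totalling {low_traffic['Unique Pageviews'].sum()} views")
print(f"{len(full_bounce)} pages with a 100% bounce rate")

# Pages that are both low-traffic and 100% bounce are the strongest candidates
# for pruning or consolidation.
candidates = df[(df["Unique Pageviews"] < 20) & (df["bounce_rate"] >= 100.0)]
candidates.to_csv("thin_page_candidates.csv", index=False)
```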
-
It's a waste of good content to remove it just because it's considered "thin". In your position, I would group these under-performing posts into topical themes, then compile and update them into "epic content": detailed guides, or whatever format suits the material best. Expand the long post so the combined pieces follow a logical structure (and don't just read as if you stuck multiple posts together), then redirect the old post URLs to the newly created relevant posts. Not only will you have fresh content that could provide a ton of value to your readers, but the SEO value of these so-called "epic" posts should, in theory, be more impactful as well.
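To add a hedged sketch of the redirect step: once the old posts point at the new guides (a permanent 301 is the usual choice), a quick script can confirm nothing was missed. The URLs below are placeholders, not real paths from your site.

```python
# A hedged sketch for checking the redirect step: confirm each old post URL
# returns a permanent (301) redirect pointing at the new consolidated guide.
# The URLs below are placeholders, not real paths from the site.
import requests

OLD_TO_NEW = {
    "https://www.example.com/blog/old-thin-post-1/": "https://www.example.com/guides/big-topic-guide/",
    "https://www.example.com/blog/old-thin-post-2/": "https://www.example.com/guides/big-topic-guide/",
}

for old_url, expected in OLD_TO_NEW.items():
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and location.rstrip("/") == expected.rstrip("/")
    print(f"{old_url} -> {resp.status_code} {location} [{'OK' if ok else 'CHECK'}]")
```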
Good luck, whatever you decide to do!
-
My rule of thumb would be:
Take offline all pages that have fewer than 30 organic sessions per month.
Like Dmitrii already mentioned, check your past data for these posts and look at average session duration, bounce rate, and pages per session, which you can use to validate the quality of the traffic. If there are posts with decent stats, don't take them offline; rather, update them, or write a new blog post on the topic and set up a redirect. In that case, have a look in GWT (Google Webmaster Tools, now Search Console) for the actual search queries (you might find some useful new insights).
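As an aside, those search queries can also be pulled in bulk via the Search Console API, the successor to GWT. The sketch below is only illustrative: it assumes the API is enabled, google-api-python-client and google-auth are installed, and a service account has been granted access to the property; the site URL, key file, and dates are placeholders.

```python
# Hedged sketch: pull the actual search queries per page via the Search Console
# API (the successor to GWT). Assumes the API is enabled, google-api-python-client
# and google-auth are installed, and a service account has access to the property.
# The site URL, key file, and dates are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account-key.json", scopes=SCOPES
)
service = build("searchconsole", "v1", credentials=creds)

body = {
    "startDate": "2024-01-01",  # placeholder: roughly the last 12 months
    "endDate": "2024-12-31",
    "dimensions": ["page", "query"],
    "rowLimit": 5000,
}
response = service.searchanalytics().query(
    siteUrl="https://www.example.com/", body=body
).execute()

# Print queries per page so low-traffic posts with promising queries stand out.
for row in response.get("rows", []):
    page, query = row["keys"]
    print(page, query, row["clicks"], row["impressions"], round(row["position"], 1))
```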
-
Hi there.
Are those blogs ranking at all for any related keyphrases? At the same time, how do bounce rate and time on page look for those 2 visits a day? Are you sure those visits aren't bots or crawlers?
We did a similar reduction about 6 months ago and we haven't seen any drop in rankings. The share of traffic going to the thin pages was pretty small, the bounce rate was high, and time on page was very short. So why keep anything that doesn't do you any good?
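If you want to rule out bots before pruning, one rough approach (sketched below, with an assumed log path, log format, and page path) is to scan the raw access log for hits to a suspect page and count how many carry an obvious crawler user agent. Keep in mind GA already filters most well-behaved bots, so this is only a sanity check.

```python
# A rough sanity check, not a definitive method: scan a raw access log for hits
# to one suspect page and count how many carry an obvious crawler user agent.
# The log path, (combined) log format, and page path are assumptions.
import re
from collections import Counter

BOT_PATTERN = re.compile(r"bot|crawl|spider|slurp", re.IGNORECASE)
TARGET_PATH = "/blog/some-thin-post/"  # hypothetical path

counts = Counter()
with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        if TARGET_PATH not in line:
            continue
        # In the Apache/Nginx combined log format the user agent is the last quoted field.
        quoted = re.findall(r'"([^"]*)"', line)
        user_agent = quoted[-1] if quoted else ""
        counts["bot" if BOT_PATTERN.search(user_agent) else "other"] += 1

print(counts)
```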