Should I delete 100s of weak posts from my website?
-
I run this website: http://knowledgeweighsnothing.com/
It was initially built to get traffic from Facebook. The vast majority of the 1,300+ posts are shorter curation-style posts: basically, I would find an excellent source of information, write a short post highlighting it, link to the original source, then post it to FB and, hey presto, thousands of visitors would come through my website. Traffic from FB was so amazing at the time that, really stupidly, these posts were written with no regard for search engine rankings.
When Facebook reach dropped right off, I started writing full, original posts to gain more traffic from search engines. I am now getting more and more traffic from Google, but there's still lots to improve.
I am concerned that the shortest/weakest posts on the website are holding things back to some degree. I am considering going through the website and deleting the very weakest older posts based on their quality, backlinks and PA. This will probably run into the hundreds of posts. Is it detrimental to delete so many weak posts from a website?
Any and all advice on how to proceed would be gratefully received.
-
This is a very valid question, in my opinion, and one that I have thought about a lot. I have even done this before on a site, in a UGC section where there were about 30k empty questions, many of which were a reputation nightmare for the site. We used these parameters:
- Over a year old
- Has not received an organic visit in the past year
We 410d all of them as they did not have any inbound links and we just wanted them out of the index. I believe they were later 301d, and that section of the site has now been killed off.
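If it helps anyone mechanise that step, here's a rough sketch of how you might generate the server rules from a removal list. It assumes Apache and a CSV you've put together yourself with url, has_backlinks and redirect_to columns (those names are just made up for the example): pages with no inbound links get a 410, pages with links get 301'd to a better target.
```python
import csv

# Rough sketch only: assumes a hand-built CSV with columns
# url, has_backlinks, redirect_to (names invented for this example).
def build_rules(csv_path, out_path="prune-rules.conf"):
    rules = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            path = row["url"].strip()
            if row["has_backlinks"].strip().lower() == "yes" and row["redirect_to"].strip():
                # Page has inbound links: 301 it to a stronger, related page.
                rules.append(f"Redirect 301 {path} {row['redirect_to'].strip()}")
            else:
                # No links, no traffic: serve a 410 so it drops out of the index.
                rules.append(f"Redirect gone {path}")
    with open(out_path, "w") as out:
        out.write("\n".join(rules) + "\n")

build_rules("prune_list.csv")
```
The generated lines can be pasted into .htaccess or your vhost config; obviously spot-check a handful of URLs before rolling out hundreds.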
Directly after the pages were removed, we saw a lift of ~20% in organic traffic to that section of the site. That maintained, and over time that section of the site started getting more visits from organic as well.
I saw it as a win and went through with it because:
- They were low quality
- They already didn't receive traffic
- By removing them, we'd get more pages that we wanted crawled, crawled.
I think Gary's answer of "create more high quality content" is too simplistic. Yes, keep moving forward in the direction you are, but if you have the time or can hire someone else to do it, and those pages are not getting traffic, then I'd say remove them. If they are getting traffic, maybe do a test of going back and making them high quality to see if they drive more traffic.
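If it helps with that triage, here's a rough sketch of how you might bucket posts into remove / improve / wait, assuming you export your post list and your organic landing-page sessions to CSV yourself (all the file and column names below are placeholders):
```python
import csv
from datetime import datetime, timedelta

# Rough triage sketch. Assumes two exports you build yourself:
#   posts.csv   -> url, publish_date (YYYY-MM-DD)
#   organic.csv -> landing_page, organic_sessions_last_12_months
posts = {}
with open("posts.csv", newline="") as f:
    for row in csv.DictReader(f):
        posts[row["url"]] = datetime.strptime(row["publish_date"], "%Y-%m-%d")

sessions = {}
with open("organic.csv", newline="") as f:
    for row in csv.DictReader(f):
        sessions[row["landing_page"]] = int(row["organic_sessions_last_12_months"])

one_year_ago = datetime.now() - timedelta(days=365)
for url, published in posts.items():
    visits = sessions.get(url, 0)
    if visits == 0 and published < one_year_ago:
        print(f"REMOVE   {url}")                            # old and invisible in search
    elif visits > 0:
        print(f"IMPROVE  {url} ({visits} organic visits)")  # getting traffic: rework instead
    else:
        print(f"WAIT     {url}")                            # too new to judge
```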
Good luck!
-
Too many people are going to gloss over the "In general" part of what Gary is saying.
Things not addressed in that thread:
- If a URL isn't performing for you but has a few good backlinks, you're probably still better off 301ing the page to better content to lend it additional strength.
- The value of consistency across the site; wildly uneven content can undermine your brand.
- Consolidating information to provide a single authoritative page rather than multiple thin, weak pages (see the sketch after this list).
- The pointlessness of keeping non-performing pages when you don't have the resources to maintain them.
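On the consolidation point, the mechanical part is just a many-to-one redirect map. A minimal sketch, assuming Apache-style redirects and that you've already decided which thin posts roll up into which page (every URL below is an invented placeholder):
```python
import requests

# Sketch of a many-to-one consolidation map. Swap in your own domain and paths.
BASE = "https://example.com"
CONSOLIDATIONS = {
    "/ultimate-fire-starting-guide/": [
        "/5-ways-to-start-a-fire/",
        "/start-a-fire-with-a-battery/",
        "/flint-and-steel-basics/",
    ],
}

for target, old_paths in CONSOLIDATIONS.items():
    # Sanity check: never point a redirect at a page that isn't live.
    status = requests.head(BASE + target, allow_redirects=True, timeout=10).status_code
    if status != 200:
        print(f"# skipping {target}: target returned {status}")
        continue
    for old in old_paths:
        print(f"Redirect 301 {old} {target}")
```
Checking that the target actually returns a 200 first saves you from redirecting old URLs into a 404.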
-
Haha, I read this question earlier, saw the post come across Feedly, and knew what I needed to do with it. Just a matter of minutes.
You're right though - I would've probably said remove earlier as well. It's a toss up but usually when they clarify, I try to follow. (Sometimes they talk nonsense of course, but you just have to filter that out.)
-
Just pipped me to it
-
Hi Xpers.
I was reading a very timely article on this very issue today from Barry Schwartz over at Search Engine Roundtable. He has been following comments from Gary Illyes at Google, who apparently does not recommend removing content from a site to help you recover from a Panda issue, but rather recommends increasing the number of higher-quality pages.
If you are continuing to get more traffic by adding your new, longer, higher-quality articles, I would simply continue in the same vein. There is no reason why you cannot continue to share your content on social platforms too.
In the past I might have suggested removing some thin/outdated content and repointing it to a newer, more relevant piece, but in light of this article I may now start to think a tad differently. Hopefully some of the other Mozzers will have more thoughts on Barry's post too.
Here is the article fresh off the press today - https://www.seroundtable.com/google-panda-fix-content-21006.html
-
Google's Gary Illyes basically just answered this on Twitter: https://www.seroundtable.com/google-panda-fix-content-21006.html
"We don't recommend removing content in general for Panda, rather add more highQ stuff"
So rather than spend a lot of time on old work, move forward and improve. If there's terrible stuff, I'd of course remove it. But if it's just not super-high quality, I would do as Gary says in this instance and work on new things.
Truthfully, getting Google to recrawl stuff that's a year or two (or five) old can be tough. If they don't recrawl it, you don't get the benefit until they do, if there's even a benefit to be had. Moving forward seems to make more sense to me.
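One low-effort nudge, with no guarantees, is a dedicated sitemap listing only the URLs you've changed or removed, with fresh lastmod dates, submitted in Search Console. A rough sketch (the URLs and filename are placeholders):
```python
from datetime import date

# Rough sketch: build a small "recently changed" sitemap to nudge recrawling.
changed_urls = [
    "https://example.com/rewritten-post/",
    "https://example.com/removed-post/",  # listing removed URLs can help the 410/301 get seen sooner
]

today = date.today().isoformat()
entries = "\n".join(
    f"  <url><loc>{u}</loc><lastmod>{today}</lastmod></url>" for u in changed_urls
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("sitemap-changes.xml", "w") as f:
    f.write(sitemap)
```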