Deleting low-quality content
-
Hi there. I have a question about deleting low-quality content pages; I hope someone can share their feedback.
We run a B2C e-commerce store, and product pages are our target landing pages from search. Over the years we've built many informational pages related to different products, each linked to related product pages.
The problem is that many of them lack quality content, in both volume and substance, and they aren't helping. Organic traffic has been declining since early this year, after peaking in February.
So I'm considering deleting the pages that both we and Moz consider low quality and that receive no search traffic.
Firstly, is that a good idea? Secondly, how should I go about it? Should I just delete them and add redirects so that the deleted URLs point to related pages, or even the homepage?
Looking forward to any expert input.
-Yuji -
You do need to get proper SEO advice, but we often advise not deleting a page, but improving it substantially.
If you have duplicated content, remove it and replace it with well-written, white-hat, high-quality content. This is how we've improved many businesses' local SEO: by improving on-page SEO rather than deleting pages outright.
-
It would be best to talk to an SEO agency for advice before you delete any blog posts or main pages.
-
Thanks for your advice. Yes, we will definitely be careful when deleting pages. Thanks a lot!
-
That's a really good idea! Cut what you have to manage down to the essentials, then spend more time on those pages. Make sure you run some kind of ranking or traffic audit across all the pages first, though. You don't want to delete the versions of each page that have some SEO power, even if it is small; you want to target the ones that Google isn't using.
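One lightweight way to run that audit is against a Search Console performance export. This is a minimal sketch, assuming a CSV with "Page" and "Clicks" columns (adjust the column names to whatever your export actually uses):

```python
import csv
import io

def zero_click_pages(report, min_clicks=1):
    """Yield page URLs whose clicks fall below min_clicks.

    `report` is any file-like object containing a performance export
    with 'Page' and 'Clicks' columns (assumed names; adjust to match
    your actual report).
    """
    for row in csv.DictReader(report):
        if int(row["Clicks"]) < min_clicks:
            yield row["Page"]

# Inline CSV standing in for a real export:
sample = io.StringIO("Page,Clicks\n/guides/widget-care,0\n/products/widget,42\n")
print(list(zero_click_pages(sample)))  # prints ['/guides/widget-care']
```

The pages this surfaces are prune-or-improve candidates; cross-check them against ranking data before deleting anything, as the advice above suggests.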
-
Thanks a lot for your feedback; it was helpful. I think we may need to remove the duplicate pages, keep only the unique ones, and update their content to be more valuable. Thanks!
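If pages do get removed, 301-redirecting each old URL to its closest related page (rather than sending everything to the homepage) preserves the most relevance. A minimal Apache `.htaccess` sketch, with hypothetical paths:

```apache
# Hypothetical URLs: map each removed info page to its closest related product page
Redirect 301 /info/widget-care-tips /products/widget
Redirect 301 /info/widget-history   /products/widget

# Pages with no close match can return 410 Gone instead of redirecting
Redirect gone /info/obsolete-page
```

A per-URL map like this keeps any link equity the old pages earned pointing at something relevant.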
-
Generally speaking, this is **not the right mindset** for success.
When Google says (through decreasing ranking positions) that you haven't put in enough effort, deleting a poor attempt usually garners no favour in the ranking results. Think about it. Google are saying "you don't have enough quality content" and your answer is to delete content, leaving less than before. Does that seem like a genuine attempt to comply with the increasing stringency of Google's guidelines?
Deleting stuff is the easy way out. Think of it as if you wrote an essay at college and Google were the examiner. They give you a D- and mark certain areas of your work as needing improvement. If you deleted those paragraphs, did nothing else, and re-submitted the essay, would you honestly expect a better grade?
Google want to see effort, unique content, and value-add for end users. _Real_ hard graft.
If you have high volumes of pages which are identical other than one tiny tab of information or a variable price, then maybe streamlining your architecture by removing pages is the answer. If most of the pages are unique in function (e.g. factually different products, not just parameter-based URL variants), then it's more a comment on the lack of invested effort, and you must tackle your mindset if you want to rank.
N.B.: By effort I don't mean your personal effort; I could also be alluding to the budget being too low when the content was produced. I'm describing the site - not you personally!
Related Questions
-
Should I delete all tags and just use my categories to organize content?
My website NorthernCaliforniaHikingTrails.com/blog has 400 or so tags, and it also has an extensive set of categories. I'm thinking about deleting all the tags, but keeping the categories and consolidating them a bit. Is there a significant SEO advantage to having tags in my case? I've seen a few very high-ranking websites actually rank for a tag, but I doubt my site will reach that level. Any help appreciated!
Intermediate & Advanced SEO | | John88990 -
Content Internal Linking?
Should we internally link new content to old content using anchor-text (keyword) links from every new blog post, or should we rotate, linking from some blog posts and not others? What ratio should we maintain? Right now I keep a maximum of 2 links in a 300-word post, or 3 in a 500-word post. Would linking from every new blog post be good?
Intermediate & Advanced SEO | | welcomecure0 -
Pagination causing duplicate content problems
Hi The pagination on our website www.offonhols.com is causing duplicate content problems. Is the best solution adding rel="prev" / rel="next" to the hrefs? As it is now, the pagination links at the bottom of the page are just http://offonhols.com/default.aspx?dp=1
Intermediate & Advanced SEO | | offonhols
http://offonhols.com/default.aspx?dp=2
http://offonhols.com/default.aspx?dp=3
etc. -
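For reference, rel="prev"/"next" annotations go in the `<head>` of each paginated page; a sketch for page 2 of the series above would look like this (note that Google has since stated it no longer uses these annotations as an indexing signal, so a self-referencing canonical on each page is also worth considering):

```html
<!-- On page 2 (default.aspx?dp=2) of the paginated series -->
<head>
  <link rel="prev" href="http://offonhols.com/default.aspx?dp=1">
  <link rel="next" href="http://offonhols.com/default.aspx?dp=3">
</head>
```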
Please help with some content ideas
I was reading this post http://www.clambr.com/link-building-tools/ about how the author had reached out to experts in the field, and each one had shared the post with their followers. I'm wondering how this could translate to our small-business marketing and design blog. I'm really struggling for content ideas that will work in terms of popularity and link building.
Intermediate & Advanced SEO | | BobAnderson0 -
Duplicate content for hotel websites - the usual nightmare? is there any solution other than producing unique content?
Hiya Mozzers I often work for hotels. A common scenario is that the hotel/resort has worked with their Property Management System to distribute their booking availability around the web to third-party booking sites, and with the inventory go duplicate page descriptions sent to these "partner" websites. I was just checking duplication on one room description: 20 loads of duplicate descriptions for that page alone, and there are 200 rooms, so I'm probably looking at 4,000 loads of duplicate content that need rewriting to prevent duplicate content penalties, which will cost a huge amount of money. Is there any other solution? Perhaps ask the booking sites to block the relevant pages from search engines?
Intermediate & Advanced SEO | | McTaggart0 -
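If the partner sites agree, the usual way for them to keep their copies out of search results is a robots meta tag on each duplicated room page; a sketch:

```html
<!-- On the partner site's copy of the room description -->
<meta name="robots" content="noindex, follow">
```

An alternative, where partners will cooperate, is a cross-domain canonical pointing back at the hotel's own room page; Google treats that as a hint rather than a directive, but it can consolidate the duplicates without rewriting 4,000 descriptions.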
How to Fix Duplicate Page Content?
Our latest SEOmoz crawl reports 1138 instances of "duplicate page content." I have long been aware that our duplicate page content is likely a major reason Google has de-valued our Web store. Our duplicate page content is the result of the following:
1. We sell audio books and use the publisher's description (narrative) of the title. Google is likely recognizing the publisher as the owner/author of the description, and ours as duplicate content.
2. Many audio book titles are published in more than one format (abridged, unabridged CD, and/or unabridged MP3) by the same publisher, so the basic description for each format would be the same at our Web store = more duplicate content.
Here are two examples (one abridged, one unabridged) of one title at our Web store. Kill Shot - abridged Kill Shot - unabridged How much would the body content of one of the above pages have to change so that an SEOmoz crawl does NOT flag the content as duplicate?
Intermediate & Advanced SEO | | lbohen0 -
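Moz doesn't publish the exact similarity threshold its crawler uses, but you can get a rough feel for how close two descriptions are with Python's difflib; a sketch (the sample text is invented, not the real product copy):

```python
import difflib

def description_similarity(a, b):
    """Return a rough similarity ratio (0.0-1.0) between two descriptions.

    This is only a heuristic; pages scoring near 1.0 are near-identical
    and are the likeliest to be flagged as duplicates.
    """
    return difflib.SequenceMatcher(None, a.lower(), b.lower()).ratio()

abridged = "Kill Shot: a thrilling audio book, abridged edition on CD."
unabridged = "Kill Shot: a thrilling audio book, unabridged edition on MP3."
print(description_similarity(abridged, unabridged))
```

Re-running this as you rewrite a page gives a concrete signal of how far the copy has diverged from the publisher's original.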
Sites with dynamic content - GWT redirects and deletions
We have a site that has extremely dynamic content. Every day they publish around 15 news flashes, each of which is set up as a distinct page with around 500 words. The file structure is bluewidget.com/news/long-news-article-name, with no timestamp in the URL. After a year, that's a lot of news flashes. The database was getting inefficient (it's managed by a ColdFusion CMS), so we started automatically deleting old news flashes from the database, which sped things up. The problem is that Google Webmaster Tools is detecting the freshly deleted pages and reporting large numbers of 404s. There are so many that it's hard to see the non-news 404s, and I understand that many missing pages would be a negative quality signal to Google. We were toying with setting up redirects, but the volume would be so large that loading a huge htaccess file for each request would slow the site down again. Because there isn't a datestamp in the URL, we couldn't create a mask in the htaccess file automatically redirecting all bluewidget.com/news/yymm* to bluewidget.com/news. These long-tail pages do send traffic, but for speed we only want to keep the last month of news flashes at most. What would you do to avoid Google thinking it's a poorly maintained site?
Intermediate & Advanced SEO | | ozgeekmum0 -
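For what it's worth, the date-mask idea described in the question would look something like this if the news URLs did carry a yymm prefix (hypothetical URL structure; it won't work with the current undated URLs, and current-month URLs would need excluding from the pattern):

```apache
# Hypothetical: assumes flash URLs carry a yymm prefix, e.g. /news/1405-long-article-name
# One pattern rule replaces thousands of per-URL redirect entries
RedirectMatch 301 ^/news/\d{4}- /news
```

A single compiled pattern like this avoids the per-request cost of a huge flat redirect map.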
Question about copying content
Hi there, I have had a question from a retailer asking if they can take all our content, i.e. blog articles, product pages, etc. What is best practice here for getting SEO value out of this? Here are a few ideas I was thinking of: They put canonical tags on all pages where they have copied our content? They copy the content but leave all anchor text in place? Please let me know your thoughts. Kind Regards
Intermediate & Advanced SEO | | Paul780
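On the canonical-tag idea: the retailer would place a cross-domain canonical in the `<head>` of each copied page, pointing back at the original; a sketch with hypothetical URLs:

```html
<!-- On the retailer's copy of the blog article -->
<link rel="canonical" href="https://www.yourstore.example/blog/original-article/">
```

Google treats cross-domain canonicals as a hint rather than a directive, but when honoured they consolidate ranking signals onto the original page.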