When to delete low-quality content
-
If 75% of a site is poor quality but still accounts for 35% of the site's traffic, should that content be 404ed? Or would it be better to move it to a subdomain and set up 301 redirects? This site was hit hard by Panda.
-
Thank you for the reply. I should add that this 75% of the content is an application (i.e. it is all the same), so traffic is spread roughly evenly across those pages. We need to choose between 404-ing the content or moving it to a subdomain. Do you know if having thin content on a subdomain will affect the original domain? I've been searching for a while but can't find anything concrete about this. Thanks!
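If the subdomain route wins out, the 301s for an application section can be generated in bulk rather than by hand. A minimal sketch, assuming Apache-style redirect rules; the paths and subdomain below are made-up examples, not taken from the site in question:

```python
# Sketch: build Apache-style 301 rules for moving thin application pages
# to a subdomain. All paths and hostnames below are hypothetical.

def make_redirect_rules(paths, new_host):
    """Return one 'Redirect 301' line per old path, pointing at new_host."""
    return [f"Redirect 301 {p} https://{new_host}{p}" for p in paths]

thin_pages = ["/app/page-1", "/app/page-2"]
for rule in make_redirect_rules(thin_pages, "app.example.com"):
    print(rule)
```

The output lines would go into the main site's server config or .htaccess, so every old URL passes its visitors (and most of its link equity) to the subdomain copy.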
-
I am not saying this very loudly... but I sometimes toss out a little ditty of content to see if it pulls any traffic and to see where it positions in the SERPs. I use these as "scouts" to identify easy targets. If it works then I know that a larger investment in content will pay off.
-
Good point. I would go further and explore which pages/phrases have lost the most traffic, work out any common elements among those affected, and define a plan of attack. If you have more than 100 articles it will take some time before it's all fixed; it's all about prioritising.
A good place to take a peek is Google Webmaster Tools, which shows search trends and CTR data as well.
-
How about upgrading the content?
Look at your analytics. Identify the pages that are getting the most traffic. Start with those and work your way down.
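That triage can be sketched in a few lines; the URLs and visit counts below are invented for illustration:

```python
# Sketch: rank pages by traffic so the highest-traffic content gets
# upgraded first. The URLs and visit counts are hypothetical.

def prioritise(pages):
    """Sort (url, visits) pairs by visits, highest first."""
    return sorted(pages, key=lambda p: p[1], reverse=True)

analytics = [("/guide-a", 120), ("/guide-b", 4500), ("/guide-c", 800)]
for url, visits in prioritise(analytics):
    print(f"{url}: {visits} visits")
```

Exported analytics data in this shape gives you an upgrade queue: rewrite the pages at the top of the list first, since that is where improved content pays back fastest.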
Related Questions
-
Penalty for adding too much content too quickly?
Hi there, We released around 4,000 pieces of new content, which all ranked on the first page and did well. We had a database of ~400,000 pieces, so we released the entire library in a couple of days (all remaining 396,000 pages). The pages have been indexed but are not ranking, although the initial batch is still ranking, as are a handful (literally a handful) of the new 396,000. When I say not ranking, I mean not ranking anywhere (I've checked as far as page 20), yet the initial batch ranks for competitive terms on page 1. Does Google penalise you for releasing such a volume of content in such a short space of time? If so, should we deindex all that content and re-release it in slow batches? And finally, if that is the course of action we should take, are there any good articles on deindexing content at scale? Thanks so much for any help you are able to provide. Steve
Intermediate & Advanced SEO | SteveW1987
Different content on different mobile browsers
Is it OK to serve different HTML and different content to different mobile browsers even though the URL is the same, or can the site get penalised?
Intermediate & Advanced SEO | vivekrathore
An unfair content-related penalty :(
Hi Guys, Google.com.au. Website: http://partysuppliesnow.com.au/ We had a massive drop in search queries in WMT around the 11th of September this year. I investigated and it seemed as though there were no updates around that time. Our site is only receiving branded search now, and after investigating I am led to believe that Google has mistakenly caught our website in the Panda algorithm. There are no manual penalties applied to this site, as confirmed by WMT. Our product descriptions are pretty much all unique, but I have noticed that when typing a portion of text from these pages into Google search using quotation marks, the shopping affiliate sites we use are displayed first and our page is nowhere to be seen, or last in the results. This leads me to believe that Google thinks we have scraped the content from these sites when in actual fact they took it from us. We also have G+ authorship set up. Typing a product's full name into Google (I tried a handful), our site is not in the top 100, or 200 at times; I think this further suggests that we are penalised. We would really appreciate some opinions on this. Any course of action would be great. We don't particularly want to invest in writing the content again. From our point of view it looks like Google is stopping our site from ranking because it's getting mixed up about who the originator of our content is. Thanks, and really appreciate it.
Intermediate & Advanced SEO | jarrodb
Does having a page that ends with ? cause duplicate content?
I am working on a site that has lots of dynamic parameters. So let's say we have www.example.com/page?parameter=1. When the page has no parameters you can still end up at www.example.com/page? Should I redirect this to www.example.com/page/ ? I'm not sure if Google ignores this, or if these pages need to be dealt with. Thanks
Intermediate & Advanced SEO | MarloSchneider
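One way to handle the trailing-? question above is to normalise such URLs and 301 anything that differs from its normal form. A sketch using only Python's standard library; the example URLs are hypothetical:

```python
from urllib.parse import urlsplit, urlunsplit

def normalise(url):
    """Drop an empty query string (a bare trailing '?') but keep real parameters."""
    parts = urlsplit(url)
    # urlunsplit omits the '?' when the query component is empty.
    return urlunsplit((parts.scheme, parts.netloc, parts.path, parts.query, parts.fragment))

print(normalise("https://www.example.com/page?"))             # https://www.example.com/page
print(normalise("https://www.example.com/page?parameter=1"))  # real parameters are preserved
```

A server-side rule comparing each requested URL against its normalised form, and 301-ing on mismatch, keeps the "?"-only variant out of the index without touching legitimate parameterised pages.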
404 for duplicate content?
Sorry, I think this is my third question today... but I have a lot of duplicated content on my site. I use Joomla, so there's a lot of unintentional duplication. For example, www.mysite.com/index.php exists, etc. Up till now, I thought I had to 301 redirect or rel=canonical these "duplicated pages." However, can I just 404 them? Is there anything wrong with this practice in regards to SEO?
Intermediate & Advanced SEO | waltergah
Issue with duplicate content in blog
I have a blog where all the pages get indexed, with rich content in them. But the blog's tag and category URLs also get indexed. I have just added my blog to SEOmoz Pro and checked my Crawl Diagnostics Summary, and it's showing me that some of my blog content is the same. For example: www.abcdef.com/watches/cool-watches-of-2012/ is already indexed, but I have assigned some tags and a category to this URL, which have also been indexed with the same content. So how shall I stop search engines from crawling these tag and category pages? If I have many nofollow tags on my blog, does that give a negative impression to search engines? Is there an alternate way to tell search engines to stop crawling these category and tag pages?
Intermediate & Advanced SEO | sumit60
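For archive pages like those tags and categories, the usual fix is a robots meta tag of noindex, follow on the archive templates rather than nofollow links. A sketch of the per-path decision; the path prefixes here are hypothetical, not taken from the blog in question:

```python
# Sketch: choose a robots meta tag per path, noindexing tag and category
# archives while leaving normal posts indexable. Prefixes are hypothetical.

ARCHIVE_PREFIXES = ("/tag/", "/category/")

def robots_meta(path):
    """Return the robots meta tag a template should emit for this path."""
    if path.startswith(ARCHIVE_PREFIXES):
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'

print(robots_meta("/tag/cool-watches/"))
print(robots_meta("/watches/cool-watches-of-2012/"))
```

The follow part matters: crawlers can still pass link equity through the archive pages to the posts, while the thin archive listings themselves stay out of the index.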
Penalised for duplicate content, time to fix?
Ok, I accept this one is my fault, but I'm wondering about timescales to fix... I have a website and I put an affiliate store on it, using merchant datafeeds in a bid to get revenue from the site. This was all good; however, I forgot to put noindex on the datafeed/duplicate content pages, and over a period of a couple of weeks the traffic to the site died. I have since nofollowed or removed the products, but some 3 months later my site still will not rank for the keywords it was ranking for previously. It will not even rank if I type in the site's name (bright tights). I have searched for the name using bright tights, "bright tights" and brighttights but none of them return the site anywhere. I am guessing that I have been hit with a drop-x-places penalty by Google for the duplicate content. What is the easiest way around this? I have had no warning about bad links or the like. Is it worth battling on trying to get the domain back, or should I write off the domain, buy a new one and start again, minus the duplicate content? The goal of having the duplicate content store on the site was to be able to rank the store's category pages, which had unique content on them, so there were no problems that I could foresee. Like Amazon et al, the categories would have lists of products (amongst other content) and you would click through to the individual product description, the duplicate page. Thanks for reading
Intermediate & Advanced SEO | Grumpy_Carl
Duplicate content issue for franchising business
Hi All We are in the process of adding a franchise model to our existing stand-alone business, and part of the package given to each franchisee will be a website with content identical to our existing website apart from some minor details such as contact and address details. This creates a huge duplicate content issue, and even if we implement a canonical approach it will still be unfair to the franchisee in terms of their marketing and own SEO efforts. The URL for each franchise will be unique but the content will be the same to a large extent. The nature of the service we offer (professional qualifications) is such that the "products" can only be described in a certain way, and it will be near on impossible to have a unique set of "product" pages for each franchisee. I hope that some of you have come across a similar problem, or that some of you have suggestions or ideas for us to get round this. Kind regards Peter
Intermediate & Advanced SEO | masterpete
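One common pattern for the franchise setup above is to have every franchisee's duplicated page declare the main site's copy as canonical, so only one version competes in search. A minimal sketch; the domain and path are invented examples:

```python
# Sketch: build the rel=canonical tag a franchisee's duplicated page
# would carry, pointing at the main site's version. Names are hypothetical.

def canonical_tag(main_host, path):
    """Return the canonical link tag for a duplicated franchise page."""
    return f'<link rel="canonical" href="https://{main_host}{path}">'

print(canonical_tag("example.com", "/courses/qualification-a"))
```

The trade-off the question raises is real, though: canonicalising to the main site concentrates ranking signals there, which is exactly what makes it unfair to the franchisee's own SEO.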