Did Not Adding Fresh Content Daily Get Me Penalized?
-
One of my websites used to post around 1,000 words of articles on 4-5 days each week (say, 12 x 300-word articles per week). This went on for 3 months. Then we suddenly stopped adding content for a flat 15 days because our content writer was unavailable, and a major drop followed. We have since been adding the same amount of quality content, but the rankings don't seem to be improving. Is it a penalty?
-
If you go the "quality content" route and can only produce a small number of new articles per month, the best thing to do is to produce only "evergreen" content that can be recycled out to the front page. That will give the appearance of activity and diversity - at least to visitors who have not been following your site long enough to remember the originals.
Also, consider having a "news" page where you link to articles on other websites about industry trends or interesting topics. That can develop a following of thousands of people who visit your site frequently just to check that page - or who subscribe to your feed.
-
It will certainly get picked up by site crawlers for audit purposes, but would Google object to it? It depends on what it is, where it is, and how necessary it is. If it's a line of spam (for example) added just to work in keywords, then it might cause you issues, but that depends on the quality of the rest of the page and its content.
-Andy
-
No problem sir!
-
Mine isn't really a blog post. It's a requirement of the website that I introduce 3 new pages. The content was originally written - no spinning, rewriting or the like. One more thing I'd like to add: all my pages have one line in common (15-20 words) out of the 300 words. Could that cause a duplicate content issue?
-
That's the one - thanks Patrick
-Andy
-
Hi there
Just a quick side note - the post Andy is referencing above is located here.
Hope this helps! Good luck!
-
Looks like Phantom 2 has affected my site.
-
Well, check your analytics to see when the drop happened, and then look here on Moz to see whether it coincided with anything.
-Andy
-
I truly understand, but it's a requirement of the website to have short articles, because we are writing product descriptions. So we make sure we have at least 300 words of good content written on each page.
-
Hi Jawahar,
The best way to spot a penalty is to look at your analytics and see if any drops coincide with any algorithm updates.
I would echo a point that EGOL made earlier on another post: daily content in this manner is probably not as beneficial as posting one very high-quality article once a week. Of course, it depends on what you are writing about, but shorter articles like these generally won't do as much for you.
-Andy
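-
For anyone who wants to run that check concretely, here is a minimal sketch of the idea in Python. It assumes you have exported daily organic sessions to a CSV with "date" and "sessions" columns; the file name and the update names/dates below are placeholders to swap for whatever you pull from Moz's algorithm update history.

```python
# Minimal sketch: flag sharp week-over-week drops in organic sessions and
# check whether they fall near known algorithm update dates.
# Assumes a CSV export with columns "date" (YYYY-MM-DD) and "sessions".
import csv
from datetime import date, timedelta

# Placeholder dates - replace with the updates you actually want to test against.
ALGORITHM_UPDATES = {
    "Phantom 2 / Quality Update": date(2015, 5, 3),
    "Panda 4.2": date(2015, 7, 17),
}

def load_sessions(path):
    with open(path, newline="") as f:
        return {date.fromisoformat(row["date"]): int(row["sessions"])
                for row in csv.DictReader(f)}

def weekly_totals(daily):
    """Sum daily sessions into weekly buckets starting on Monday."""
    weeks = {}
    for day, sessions in daily.items():
        week_start = day - timedelta(days=day.weekday())
        weeks[week_start] = weeks.get(week_start, 0) + sessions
    return weeks

def flag_drops(weeks, threshold=0.30):
    """Yield weeks where traffic fell by more than `threshold` vs. the prior week."""
    ordered = sorted(weeks)
    for prev, current in zip(ordered[:-1], ordered[1:]):
        if weeks[prev] and (weeks[prev] - weeks[current]) / weeks[prev] > threshold:
            yield current, weeks[prev], weeks[current]

if __name__ == "__main__":
    daily = load_sessions("organic_sessions.csv")   # placeholder file name
    for week, before, after in flag_drops(weekly_totals(daily)):
        nearby = [name for name, when in ALGORITHM_UPDATES.items()
                  if abs((when - week).days) <= 14]
        suspects = ", ".join(nearby) if nearby else "no known update nearby"
        print(f"Week of {week}: {before} -> {after} sessions ({suspects})")
```

It is only a rough screen - partial weeks at the start or end of the export can trigger false positives - but it makes the "did my drop line up with an update?" question quick to answer.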
Related Questions
-
Content Strategy/Duplicate Content Issue, rel=canonical question
Hi Mozzers: We have a client who regularly pays to have high-quality content produced for their company blog. When I say 'high quality' I mean 1000 - 2000 word posts written to a technical audience by a lawyer. We recently found out that, prior to the content going on their blog, they're shipping it off to two syndication sites, both of which slap rel=canonical on them. By the time the content makes it to the blog, it has probably appeared in two other places. What are some thoughts about how 'awful' a practice this is? Of course, I'm arguing to them that the ranking of the content on their blog is bound to be suffering and that, at least, they should post to their own site first and, if at all, only post to other sites several weeks out. Does anyone have deeper thinking about this?
Intermediate & Advanced SEO | | Daaveey0 -
How do the Quoras of this world index their content?
I am helping a client index lots and lots of pages - more than one million, comparable to the questions on Quora. In Quora's case, users are often looking for the answer to one specific question and nothing else. Quora has a structure on the homepage to let the spiders in, but I think most of it is done with a lot of sitemaps and internal linking based on relevancy, and nothing else... Correct? Or am I missing something? I am going to index about a million questions and answers, just like Quora, and I have a hard time structuring these questions without doing it purely for the search engines, because users don't care about that structure - they are interested in related and/or popular questions, so I want to structure them that way too. This way every question page will be in the sitemap, but not all questions will have links from other question pages pointing to them. These questions are super long-tail, and the idea is that when somebody searches the exact question we can supply the answer (the page will be perfectly optimised for people searching that question). Competition is super low because it is all unique user-generated content. I think it is best just to put them in sitemaps and use an internal linking algorithm to make the popular and related questions rank better; I could even make sure every question has at least one other page linking to it - thoughts? Moz, do you think that when publishing one million quality Q&A pages, this strategy is enough to get them indexed and ranking for the question searches? Or do I need to design a structure around it so everything gets crawled and each question also receives at least one link from a "category" page? (A rough sketch of the sitemap-index side of this appears after these related questions.)
Intermediate & Advanced SEO | | freek270 -
Concerns of Duplicative Content on Purchased Site
Recently I purchased a site with 50+ DA (oldsite.com) that had been offline/404 for 9-12 months under the previous owner. The purchase included the domain and the content previously hosted on it. The backlink profile is 100% contextual and pristine. Upon purchasing the domain, I did the following:
1. Rehosted the old site and content (down for 9-12 months) on oldsite.com
2. Allowed a week or two for indexation of oldsite.com
3. Hosted the old content on my newsite.com and then performed 100+ contextual 301 redirects from oldsite.com to newsite.com using direct and wildcard htaccess rules
4. Issued a press release declaring the acquisition of oldsite.com by newsite.com
5. Performed a "Change of Name" in Google from oldsite.com to newsite.com
6. Performed a "Site Move" in Bing/Yahoo from oldsite.com to newsite.com
It's been close to a month, and while organic traffic is growing gradually, it's not what I would expect from a domain with 700+ referring contextual domains. My current concern is that original attribution of the content on oldsite.com may have shifted to scraper sites during the year or so it was offline. For example:
1. Oldsite.com has full attribution prior to going offline
2. Scraper sites scan the site and repost content elsewhere (unsuccessfully at the time, because Google knows the original attribution)
3. Oldsite.com goes offline
4. Scraper sites continue hosting the content
5. Google loses the consumer-facing cache from oldsite.com (and potentially loses the original attribution of the content)
6. Google reassigns original attribution to a scraper site
7. Oldsite.com is hosted again, but Google no longer remembers its original attribution and thinks the content is stolen
8. Google then silently punishes oldsite.com and newsite.com (which it redirects to)
Questions: Does this sequence have any merit? Does Google keep track of original attribution after the content ceases to exist in Google's cache? Are there any tools or ways to tell if you're being punished for content posted elsewhere on the web, even if you originally had attribution? Unrelated: are there any other steps recommended for a site change as described above?
Intermediate & Advanced SEO | | PetSite0 -
Please help with some content ideas
I was reading this post http://www.clambr.com/link-building-tools/ about how the author basically reached out to experts in the field and each one shared the post with their followers. I am wondering how this could translate to our small business marketing and design blog. I am really struggling for content ideas that will work in terms of popularity and link building.
Intermediate & Advanced SEO | | BobAnderson0 -
What are your thoughts on Content Automation?
Hi, I want to ask forum members' opinion on content automation. Before I raise the eyebrows of many of you with this question, I'd like to state that I am creating content and doing SEO for my own website, so I'm not looking to cut corners with spammy tactics that could hurt my website from an organic search perspective. The goal is to automate pages in the areas of headings, meta titles, meta descriptions, and perhaps a paragraph of content. More importantly, I'd like these pages to add value to the user's experience, so the question is: how do I go about automating the pages, and more specifically, how are meta titles, meta descriptions etc. automated? I'd also like to hear from people who recommend steering clear of any form of content automation. I hope my question isn't too vague, and I look forward to hearing from other Mozzers. (A rough sketch of template-based meta generation appears after these related questions.) Regards, Russell in South Africa
Intermediate & Advanced SEO | | Shamima0 -
Are we being Penalized? Can someone Assess Please!
We have two eCommerce sites. Both can broadly be divided into 3 page types: 1. home page, 2. detail pages, 3. category pages (altogether each site has approx. 3 million pages). These are the site URLs:
http://bit.ly/9tRZIi - targeted at a USA audience
http://bit.ly/P8MxPR - targeted at a UK audience
The .com domain, launched earlier in 2011, is doing okay with decent organic traffic. Precautions taken to avoid duplicate content across the two sites: a. geo-targeting through Google Webmaster Tools; b. a rel=alternate tag on printsasia.co.uk.
The problem: 1. The .co.uk domain, launched in May 2012, started gaining organic traffic slowly but then suddenly dropped to almost 0 after September 18. 2. When we use the operator site:printsasia.co.uk with a filter for the past week/month we see no results, while the same operator over "any time" shows some results. 3. According to Webmaster Tools, Google has indexed 95% of the URLs in our sitemap.
Our concern: is our UK site penalized for some reason? If yes, what could be the possible reason(s) for this penalty and the possible steps to get out of it? We would appreciate it if experts here could review our site and help us.
Intermediate & Advanced SEO | | CyrilWilson0 -
Penalised for duplicate content, time to fix?
Ok, I accept this one is my fault, but I'm wondering about timescales to fix it... I have a website and I put an affiliate store on it, using merchant datafeeds in a bid to get revenue from the site. This was all good; however, I forgot to put noindex on the datafeed/duplicate content pages, and over a period of a couple of weeks the traffic to the site died. I have since nofollowed or removed the products, but some 3 months later my site still will not rank for the keywords it was ranking for previously. It will not even rank if I type in the site's name (bright tights). I have searched for the name using bright tights, "bright tights" and brighttights, but none of them return the site anywhere. I am guessing that I have been hit with a drop-x-places penalty by Google for the duplicate content. What is the easiest way around this? I have had no warning about bad links or the like. Is it worth battling on trying to get the domain back, or should I write off the domain, buy a new one and start again, minus the duplicate content? The goal of having the duplicate content store on the site was to be able to rank the store's category pages, which had unique content on them, so I couldn't foresee any problems with that. Like Amazon et al, the categories would have lists of products (amongst other content) and you would click through to the individual product description - the duplicate page. Thanks for reading.
Intermediate & Advanced SEO | | Grumpy_Carl0 -
Hidden Content with "clip"
Hi
We're relaunching a site on the Drupal 7 CMS. Our web agency has hidden content on it and they say it's for accessibility (I don't see the use myself, though). Since they ask for more cash in order to remove it, the management is unsure. So I wanted to check if anyone knows whether this could hurt us in search engines. There is an element in the HTML that lets you skip to the main content ("Skip to main content"). The corresponding CSS is:
.element-invisible{position:absolute !important;clip:rect(1px 1px 1px 1px);clip:rect(1px,1px,1px,1px);}
#skip-link a,#skip-link a:visited{position:absolute;display:block;left:0;top:-500px;width:1px;height:1px;overflow:hidden;text-align:center;background-color:#666;color:#fff;}
The crucial point is that they're hiding the "Skip to main content" text using clip:rect(1px 1px 1px 1px), which shrinks it to one pixel. So IMO this is hiding content. How bad is it? PS: Hope the source code is sufficient. Ask me if you need more. Thx!
Intermediate & Advanced SEO | | zeepartner0
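-
On the sitemap side of the "Quoras of this world" question above: sitemap files are capped at 50,000 URLs each under the sitemaps.org protocol, so a million question pages means splitting the URL list into chunks and pointing a sitemap index at them. A minimal sketch, assuming a plain text file of question URLs - the file name and domain below are placeholders.

```python
# Minimal sketch: split a large list of URLs into 50,000-URL sitemap files
# and write a sitemap index that references them.
# Assumes question_urls.txt contains one absolute URL per line (placeholder name).
from xml.sax.saxutils import escape

MAX_URLS_PER_SITEMAP = 50_000       # limit set by the sitemaps.org protocol
BASE = "https://www.example.com"    # placeholder domain

def chunked(items, size):
    for i in range(0, len(items), size):
        yield items[i:i + size]

def write_sitemap(path, urls):
    with open(path, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in urls:
            f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
        f.write("</urlset>\n")

def write_index(path, sitemap_urls):
    with open(path, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in sitemap_urls:
            f.write(f"  <sitemap><loc>{escape(url)}</loc></sitemap>\n")
        f.write("</sitemapindex>\n")

if __name__ == "__main__":
    with open("question_urls.txt", encoding="utf-8") as f:
        urls = [line.strip() for line in f if line.strip()]

    sitemap_locations = []
    for n, chunk in enumerate(chunked(urls, MAX_URLS_PER_SITEMAP), start=1):
        filename = f"sitemap-questions-{n}.xml"
        write_sitemap(filename, chunk)
        sitemap_locations.append(f"{BASE}/{filename}")

    write_index("sitemap-index.xml", sitemap_locations)
    print(f"Wrote {len(sitemap_locations)} sitemaps for {len(urls)} URLs.")
```

You submit only the index in Search Console; the child sitemaps are discovered from it, so the list can keep growing as new questions are published. The internal linking side (related/popular questions) still matters for crawling and ranking - sitemaps alone only get the URLs discovered.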
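-
And on the content automation question above: meta titles and descriptions are usually automated by filling templates from structured fields rather than by generating free-form text, which keeps the output predictable and easy to review. A rough sketch of that templating idea - every field name, template, and length limit here is illustrative, not a recommendation.

```python
# Rough sketch: template-based meta title / meta description generation
# from structured record data. All field names and templates are illustrative.
from dataclasses import dataclass

@dataclass
class Page:
    product: str
    city: str
    category: str

TITLE_TEMPLATE = "{product} in {city} | Example Brand"
DESCRIPTION_TEMPLATE = ("Compare {category} options for {product} in {city}. "
                        "Updated pricing, specs and reviews.")

def truncate(text, limit):
    """Trim on a word boundary so snippets aren't cut mid-word in the SERP."""
    if len(text) <= limit:
        return text
    return text[:limit].rsplit(" ", 1)[0].rstrip(",;:") + "…"

def build_meta(page: Page):
    # str.format ignores unused fields, so one record can feed several templates.
    title = truncate(TITLE_TEMPLATE.format(**vars(page)), 60)
    description = truncate(DESCRIPTION_TEMPLATE.format(**vars(page)), 155)
    return title, description

if __name__ == "__main__":
    title, description = build_meta(Page("Gas Geysers", "Cape Town", "home appliance"))
    print(title)
    print(description)
```

The templates only add value if the underlying fields are genuinely useful to the visitor; the automation is just the assembly step.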