Does adding lots of new content on a site at one time actually hurt you?
-
When speaking with a client today, he commented that he didn't want all of the new content we'd been working on to be added to the site at once, for fear that he would get penalized for flooding the site with new content. I don't have any strong data to confirm or refute the claim. Is there any truth to it?
-
I agree with the colleagues above; I can't see how your website would be penalised just because lots of pages were uploaded at the same time.
However, adding too many pages too quickly may flag a site for a manual review. That doesn't mean, though, that you should be adding hundreds of thousands of links in a night. Here is the related video by Matt Cutts:
Hope you find this useful!
-
It is a real estate site, and the content is a directory of the various condos available in their community. The pages are all unique and have real, valuable content, so I don't think there will be any issues with content quality.
There is new content and blogging that occurs regularly on the site. I think the client's concern comes from an old notion that adding content infrequently, but en masse, may be seen as spammy.
-
I agree with Jesse. Earlier this year we added a new data-driven section to our website that included (believe it or not) 83,000 pages, all unique in content since the information is highly technical in nature. No associated penalties have resulted from this.
-
I agree with Jesse for the most part. I think the key question is: what kind of content are we talking about? Adding tons of low-value, thin content pages to a site all at once (or even gradually) is probably going to diminish the authority of existing content. I do think that adding thousands of pages with no page authority to a site that contains pages with a decent amount of authority could, theoretically, dilute the authority of the existing pages, depending on site architecture, internal linking, and the ratio of existing pages to new pages. However, I would expect this to be only temporary, and if the new content is great quality, there should be nothing to worry about long term.
-
Thanks Jesse, that was my thought exactly. If anything, I see incrementally adding the content as a negative, since it will lead to a less-than-complete user experience.
-
No truth to that whatsoever. That's weird paranoia.
If there was some sort of problem WITH the content, maybe. But there would be no penalty for all new content added.
I've done total site overhauls plenty of times and they get indexed quickly with no penalties (although I will say the speed of this seems to be in flux, but I digress).
Don't let the client worry about this. Think about any website that initially launches: why would Google penalize that?
Hope this helps. Paranoia is often the toughest challenge when it comes to dealing with clients/site owners.