Does adding lots of new content to a site at one time actually hurt you?
-
When speaking with a client today, he commented that he didn't want all of the new content we'd been working on added to the site at once, for fear that he would get penalized for flooding the site with new content. I don't have any strong data to confirm or refute the claim. Is there any truth to it?
-
I agree with all the colleagues above; I can't see how your web site would be penalised just because lots of pages were uploaded at the same time.
However, adding too many pages too quickly may flag a site to be reviewed manually. That said, this would mean adding on the order of hundreds of thousands of links in a single night. Here is the related video by Matt Cutts:
Hope you find this useful!
-
It is a real estate site, and the content is a directory of the various condos available in their community. The pages are all unique and have real, valuable content, so I don't think there will be any issues with content quality.
There is new content and blogging added regularly on the site. I think the client's concern comes from an old notion that adding content infrequently but en masse may be seen as spammy.
-
I agree with Jesse. Earlier this year we added a new data-driven section to our website that included (believe it or not) 83,000 pages, all unique in content since the information is highly technical in nature. No associated penalties have resulted from this.
-
I agree with Jesse for the most part. I think the key question is: what kind of content are we talking about? Adding tons of low-value, thin content pages to a site all at once (or even gradually) is probably going to diminish the authority of existing content. I do think that adding thousands of pages with no page authority to a site that contains pages with a decent amount of authority could, theoretically, dilute the authority of the existing pages, depending on site architecture, internal linking, and the ratio of existing to new pages. However, I would expect this to be only temporary; if the new content is high quality, it should be nothing to worry about long term.
-
Thanks Jesse, that was my thought exactly. If anything, I see incrementally adding the content as the negative option, since it would leave users with a less-than-complete experience in the meantime.
-
No truth to that whatsoever. That's weird paranoia.
If there were some sort of problem WITH the content, maybe. But there is no penalty simply for adding a lot of new content.
I've done total site overhauls plenty of times, and they get indexed quickly with no penalties (although I will say the speed of this seems to be in flux, but I digress).
Don't let the client worry about this. Think about any website that initially launches: why would Google penalize that?
Hope this helps. Paranoia is often the toughest challenge when it comes to dealing with clients/site owners.
Related Questions
-
One keyword gone in Google SERPs - Fred?
I have an ecommerce site. As of a couple of weeks ago, I'm completely gone from the SERPs for one keyword that I used to rank #1 for on Google years ago. I'm scratching my head here; my other keywords don't seem to have changed much recently.

Around mid-March of this year, which seems to line up with the Fred update, I noticed I went from page 3 to the middle of page 1 for a few days with this keyword. It was a very happy few days. Then it slipped down and down and hovered around page 6. But as of a couple of weeks ago, it's now gone.

Before the Fred update, I changed a bunch of product pages within the keyword category that had duplicate content because they were kits of items arranged in different ways. Instead of repeating the individual item descriptions over and over in the different kits, I changed the descriptions on the kits to links to the individual items within them. After the Fred update, at the end of March, I set all of these kit pages, which I had reduced to very thin content with just links, to noindex.

My theory is that the Fred update reset algorithmic penalties for a couple of days as it was being introduced. The duplicate content penalty I may have had was lifted, since I had taken out the duplicate content, and I made it back to page one. Then, as Fred saw I now had a new problem of thin content, I got hit and slid back down the rankings.

Now that I have set those very thin pages to noindex, do you think I'll see the keyword return to a higher position? Or any other theories or suggestions? I remember seeing keywords disappear and come back stronger years ago, but I haven't seen anything like this in a long time.
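For reference, the "set to noindex" step described above is usually done with a robots meta tag (or an equivalent X-Robots-Tag HTTP header). A minimal sketch, assuming you can edit the template of the thin kit pages directly:

    <!-- In the <head> of each thin kit page: drop the page from the index,
         but still let its links to the individual item pages be followed. -->
    <meta name="robots" content="noindex, follow">

The "follow" half matters here, since the whole point of these pages is the links to the individual items.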
Algorithm Updates | head_dunce
-
Google spitting out old data as new alerts
Am I just unlucky, or are others seeing this too? I have several Google Alerts. For the past 6 months, Google keeps sending crap along with the good stuff; it's a bit like their search results. There are three types of alerts they send that I'm not impressed with:
1. Alerts from unintelligible splogs that take real news stories and rewrite them with garbage that makes no sense at all. Sometimes they serve up new alerts from the same splogs I saw several months ago, which I felt sure they would have zapped by now.
2. Old stories that have been around for months. I just received one from January, from TechDirt, a big site that must get a huge amount of attention from Google.
3. Irrelevant stories, because they love to show how smart they are by splitting my alert keyword text into multiple words, which gives useless results.
This is the kind of stuff that crappy search engines like AltaVista used to do. Is Google reverting to the childhood of search with all these changes?
Algorithm Updates | loopyal
-
New linkbuilding: If networks are useless, and I need high volume through a 1-man team, what's the best option?
I work for an online retailer. We have thousands of product pages, and our vertical is brutal for content; half of it is owned by our competitors. Are there any new link-building strategies that can be executed by a one-man team? I'm not talking about bots or traditional link networks. Our current strategy revolves around the following:
1. Link prospecting through BuzzStream tools and individual contacts.
2. Finding bloggers/vloggers, sending them product, and having them link back to our homepage with their reviews (slow turnaround, low juice).
3. Syndicating our videos through multiple avenues.
4. Being active on social.
We need to gain more authority beyond simple content building. Are there any alternatives to link networks that a one-man team can scale? Many thanks!
Algorithm Updates | eugeneku
-
Host name per content
Hello everyone. I'm in charge of the website HispaZone.com, where, among many other things, we provide free program downloads in Spanish, similar to Softpedia, Tucows, CNET, Softonic, and others. I'm not a great SEO, but I try to do my best.

Several months ago, modelled on my most important competitors (softonic.com and uptodown.com), I decided to give the landing page of each program download its own host name under the domain hispazone.com. The landing page for downloading Nero, for example, would be http://nero.hispazone.com, and likewise for all 800 programs in our database.

The thing is, 5-6 months after that change, and after many other improvements, the traffic coming from Google to these downloads dropped dramatically. We thought it could have been related to Google Panda, but we recently hired an SEO consultant, and he says it's because the downloads are not under the same host name: we lose the page authority and the link flow from the host name http://www.hispazone.com.

The SEO consultant seems to be great, very up to date with all the new changes at Google. We have made many improvements thanks to him, and I can say that I trust him with everything. But now comes the time to decide whether to move our program download landing pages back to the www.hispazone.com host name. I would like a second opinion on this, because the fact that the biggest players in Spain, like Softonic and Uptodown, use a host name for each program download, when these companies invest heavily in their SEO, makes me unsure about moving everything back under the same host name. Thanks a lot.
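If the decision is to consolidate, the usual mechanics are 301 redirects from each program subdomain to its new path under www, so the accumulated links keep passing. A rough .htaccess sketch in Apache mod_rewrite terms; the /descargas/ path is hypothetical, and this assumes all the subdomains resolve to the same virtual host:

    # Redirect program subdomains (e.g. nero.hispazone.com) permanently
    # to a matching path under the main host.
    RewriteEngine On
    RewriteCond %{HTTP_HOST} !^www\.hispazone\.com$ [NC]
    RewriteCond %{HTTP_HOST} ^([^.]+)\.hispazone\.com$ [NC]
    RewriteRule ^(.*)$ http://www.hispazone.com/descargas/%1/$1 [R=301,L]

Here %1 picks up the subdomain from the last matching RewriteCond, so nero.hispazone.com/setup would end up at www.hispazone.com/descargas/nero/setup.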
Algorithm Updates | HispaZone
-
Large site with faceted navigation using rel=canonical, but Google still has issues
First off, I just wanted to mention that I did post this on one other forum, so I hope that is not completely against the rules here. Just trying to get an idea from some of the pros at both sources. Hope this is received well. Now for the question...

"Googlebot found an extremely high number of URLs on your site." Gotta love these messages in GWT. Anyway, I wanted to get some other opinions here, so if anyone has experienced something similar or has any recommendations, I would love to hear them.

The site is very large and utilizes faceted navigation to help visitors sift through results. I have had rel=canonical implemented for many months now, so that each page URL created by the faceted nav filters points back to the main category page. However, I still get these damn messages from Google every month or so saying that they found too many pages on the site. My main concern, obviously, is wasting crawler time on all these pages when I am trying to do what they ask and tell them where the real content lives.

So at this point I am thinking about handling these with the robots.txt file, but I wanted to see what others around here thought before I dive into this arduous task. Plus, I am a little ticked off that Google is not following a standard they helped bring to the table. Thanks in advance to those who take the time to respond.
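For what it's worth, the robots.txt route usually means disallowing the filter parameters rather than enumerating URLs. A sketch under stated assumptions: the parameter names below (color, price, sort) are purely illustrative stand-ins for whatever the faceted nav actually appends:

    User-agent: *
    # Keep crawlers out of faceted filter combinations.
    Disallow: /*?*color=
    Disallow: /*?*price=
    Disallow: /*?*sort=

One trade-off to weigh: robots.txt blocks crawling, not indexing, and a blocked page can no longer be crawled to see its rel=canonical hint, so it is generally one approach or the other per URL pattern. The URL parameter settings in GWT are a third option.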
Algorithm Updates | PeteGregory
-
Are the latest Ranking Reports counting the new large format site links as positions?
I received my weekly ranking report this morning and noticed that a specific keyword I've been ranking in the 3rd or 4th spot for has dropped a significant number of positions. I tested the results myself, and it appears the sitelinks of the manufacturer are being counted as positions. My keyword has me in the 3rd position (although it is much lower on the physical page now because of the new format). I'm really wondering how this will affect organic listings going forward; this new format could be a game changer.
Algorithm Updates | longbeachjamie
-
Website "penalized" 3 times by Google
I have a website that I'm working with that has had the misfortune of gaining rankings/traffic on Google, then having the rankings/traffic removed... 3 times! (Very little was changed on the site to gain or lose "favor" with Google, either.)

Notes:
- The site is a mixture of high-quality original content and duplicate content (vacation rental listings).
- When traffic crashes, we lose nearly all rankings and traffic (90+%).
- When traffic crashes, we lose all rankings sitewide, including those gained by our high-quality, unique pages.
- None of the "crash" dates appear to coincide with any Panda update dates.
- We are working on adding unique content to the pages with duplicate content, but it's a long process, and so far it doesn't seem to have made any difference.
- I'm confounded as to why Google keeps "changing its mind" about our site.
- We have an XML sitemap, and Google keeps our site indexed pretty well, even when we lose our rankings.
- Due to the drastic and sitewide loss of rankings, I'm assuming we are dealing with some sort of algorithmic penalty.

Timeline:
- Traffic steadily grows starting in Jan 2011.
- Traffic crashes on Feb 19, 2011. We assumed it was due to a pre-Panda anti-scraper update, but don't know.
- Google sends traffic to our site on March 1, then none the next day.
- On June 16th, I block part of the site using robots.txt (most of the section wasn't indexed anyway).
- On June 17th, Google starts ranking our site again. I thought it might be due to the robots.txt change, but I had made the change only a few hours earlier, and Google wasn't even indexing the part of the site I blocked.
- Traffic/rankings crash again on July 6th. No theory why.

Site URL: http://www.floridaisbest.com
Traffic stats: attached

I know that we need more backlinks and less duplicate content, but I can't explain why our Google rankings are "on again, off again". I have never seen a site gain and lose all of its rankings/traffic so drastically multiple times, for no apparent reason. Any thoughts or ideas would be welcome. Thanks!
Algorithm Updates | AdamThompson
-
Google said that low-quality pages on your site may affect rankings on other parts
One of my sites got hit pretty hard during the latest Google update. It lost about 30-40% of its US traffic, and the future does not look bright considering that Google plans a worldwide roll-out. The problem is, my site is a six-year-old, heavily linked, popular WordPress blog, and I do not know why the algorithm would consider it low quality. The only explanation I have come up with is the statement that low-quality pages on a site may affect its other pages (I think it was in the Wired article). If that is so, would you recommend blocking and de-indexing the WordPress tag, archive, and category pages from the Google index? Or would you suggest waiting a bit longer before doing something that drastic? Or do you have another idea of what I could do? I invite you to take a look at the site: www.ghacks.net
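If you do go the de-indexing route for tag, archive, and category pages, one way to sketch it (assuming direct theme edits; most SEO plugins expose the same behaviour as a setting) is a conditional robots meta tag in the theme's header.php:

    <?php
    // In the theme's <head>: noindex the thin archive-type pages,
    // but keep following their links so single posts stay well crawled.
    if ( is_tag() || is_category() || is_date() || is_author() ) {
        echo '<meta name="robots" content="noindex, follow" />' . "\n";
    }
    ?>

Single posts and static pages stay indexed; only the duplicate-prone listing pages are dropped. A robots meta tag also tends to be safer than robots.txt blocking here, since the archives can still be crawled and pass link equity.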
Algorithm Updates | badabing