Does adding lots of new content on a site at one time actually hurt you?
-
When speaking with a client today, he mentioned that he didn't want all of the new content we'd been working on added to the site at once, for fear that he would be penalized for flooding the site with new content. I don't have any strong data to confirm or refute the claim. Is there any truth to it?
-
I agree with all the colleagues above; I can't see how your website would be penalised just because lots of pages were uploaded at the same time.
However, adding too many pages too quickly may flag a site for manual review. In practice, though, that would mean adding something like hundreds of thousands of links in a night. Here is the related video by Matt Cutts:
Hope you find this useful!
-
It is a real estate site and the content is a directory of the various condos available in their community. The pages are all unique and have real valuable content, so I don't think there will be any issues with content quality.
There is new content and blogging that occurs regularly on the site. I think the client's concern comes from an old notion that if we add content infrequently but en masse, it may be seen as spammy.
-
I agree with Jesse. Earlier this year we added a new data-driven section to our website that included (believe it or not) 83,000 pages, all unique in content since the information is highly technical in nature. No associated penalties have resulted from this.
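One practical consideration when launching a section that large: the sitemaps.org protocol caps a single XML sitemap at 50,000 URLs, so anything bigger (like 83,000 pages) needs to be split across multiple sitemap files tied together by a sitemap index. A minimal sketch in Python, with a hypothetical domain and URL paths (not taken from any site mentioned above):

```python
# Sketch: split a large list of new URLs into 50,000-URL sitemap files
# plus one sitemap index, per the sitemaps.org per-file limit.
# example.com and the /condos/ paths are placeholders, not real URLs.
from xml.sax.saxutils import escape

MAX_URLS_PER_SITEMAP = 50_000  # sitemaps.org limit per sitemap file

def build_sitemaps(urls, base="https://example.com"):
    """Return (index_xml, [sitemap_xml, ...]) for the given URLs."""
    chunks = [urls[i:i + MAX_URLS_PER_SITEMAP]
              for i in range(0, len(urls), MAX_URLS_PER_SITEMAP)]
    sitemaps = []
    for chunk in chunks:
        entries = "".join(f"<url><loc>{escape(u)}</loc></url>" for u in chunk)
        sitemaps.append(
            '<?xml version="1.0" encoding="UTF-8"?>'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
            f"{entries}</urlset>"
        )
    index_entries = "".join(
        f"<sitemap><loc>{base}/sitemap-{n}.xml</loc></sitemap>"
        for n in range(1, len(sitemaps) + 1)
    )
    index = (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
        f"{index_entries}</sitemapindex>"
    )
    return index, sitemaps

# e.g. 83,000 URLs end up in 2 sitemap files behind one index
urls = [f"https://example.com/condos/{i}" for i in range(83_000)]
index, maps = build_sitemaps(urls)
print(len(maps))  # 2
```

Submitting the index (rather than 83,000 URLs piecemeal) gives Google the whole section in one go, which is consistent with the experience above: a large, legitimate launch simply gets crawled and indexed.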
-
I agree with Jesse for the most part. I think the key question is: what kind of content are we talking about? Adding tons of low-value, thin content pages to a site all at once (or even gradually) is probably going to diminish the authority of existing content. Adding thousands of pages that have no page authority to a site whose existing pages have a decent amount of authority could, theoretically, dilute the authority of those existing pages, depending on site architecture, internal linking, and the ratio of existing pages to new pages. However, I would expect this to be only temporary, and if the new content is of great quality, there should be nothing to worry about long term.
-
Thanks Jesse, that was my thought exactly. If anything, I see incrementally adding the content as a negative thing, since it will lead to a less than complete user experience.
-
No truth to that whatsoever. That's weird paranoia.
If there were some sort of problem WITH the content, maybe. But there is no penalty for adding it all at once.
I've done total site overhauls plenty of times and they get indexed quickly with no penalties (although I will say the speed of indexing seems to be in flux, but I digress).
Don't let the client worry about this. Think about any website that initially launches: why would Google penalize that?
Hope this helps. Paranoia is often the toughest challenge when it comes to dealing with clients/site owners.