Content update on 24hr schedule
-
Hello!
I have a website with over 1300 landing pages for specific products. These individual pages update on a 24hr cycle through our API. Our API pulls reviews/ratings from other sources and then writes/updates that content onto the page.
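To make the setup concrete, here's a minimal sketch of the kind of nightly refresh cycle described above. The function names and data shapes are invented for illustration (a real version would call the review API and write to a CMS or database), so treat it as a sketch rather than the poster's actual implementation.

```python
# Minimal sketch of a nightly refresh job for product landing pages.
# `source` stands in for the external review/ratings API and `pages`
# for the site's page store; both are hypothetical.

def fetch_review_data(product_id, source):
    """Stand-in for the API call that pulls fresh ratings for one product."""
    return source[product_id]

def refresh_pages(pages, source):
    """Rewrite only the dynamic ratings block; static copy is untouched."""
    for product_id, page in pages.items():
        data = fetch_review_data(product_id, source)
        page["rating"] = data["rating"]
        page["review_count"] = data["review_count"]
    return pages

# Example run with one page:
source = {"widget-1": {"rating": 4.6, "review_count": 210}}
pages = {"widget-1": {"rating": 4.5, "review_count": 190, "body": "Static copy"}}
refresh_pages(pages, source)
# pages["widget-1"] now carries the fresh rating; its "body" text is unchanged.
```

The key point for the questions that follow is that a job like this updates existing pages rather than creating new ones.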
-
Is that "bad"? Can that be viewed as spammy or dangerous in the eyes of Google? (My first thought is no, it's fine.)
-
Is there such a thing as "too much content"? For example, if we are adding roughly 20 articles to our site a week, is that OK? (I know news websites add much more than that on a daily basis, but I figured I would ask.)
-
On that note, would it be better to stagger our posting? For example, 20 articles each week for a total of 80 articles, or 80 articles once a month? (I feel like trickle posting is probably preferable, but I figured I would ask.)
-
Are there any negatives to having an API write/update content? Should we have 800+ words of static content on each page?
Thank you all mozzers!
-
-
When you say 1300 landing pages are coming online every night, that doesn't mean 1300 new pages are being created, does it? Based on the rest of your comment, I'm taking it to mean that 1300 pages which were already live and accessible to Google are being updated, and the content is changing where appropriate.
In terms of the specific situation I describe above, that should be fine: there shouldn't be a problem with having a system for keeping your site up to date. However, each of the points below, if true, would be a problem:
-
You are adding 1300 new pages to your site every night
-
This would be a huge increase for most sites, particularly if it happened every night, but as I say above, I don't think this is the case.
-
You are actually scraping key information to include on your site
-
You mention an API, so it may be that users are submitting this content to your site for you to use. But if you are scraping the descriptions from some sites and the reviews from others, that is what would be viewed as spammy, and it seems like the biggest point of risk I've seen in this thread.
-
-
Something else occurred to me. Our API rewrites EVERYTHING every night. So technically 1300 landing pages are coming online EVERY night, and the content isn't really changing. Is that a problem?
To sort of explain: this is a review site for other websites/apps. Our API scrapes the description from the app/site, as well as ratings from app stores etc., and then publishes that onto our page. So generally the content isn't really changing; it's just updating. Thoughts on that?
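One common way to avoid rewriting pages whose data hasn't actually changed (so the CMS isn't churning nightly and last-modified signals stay honest) is to fingerprint the dynamic content and skip the write when the fingerprint matches. This is a generic sketch, not something described in the thread; the helper names are invented.

```python
import hashlib
import json

def content_hash(data):
    """Stable fingerprint of a page's dynamic content (ratings, reviews)."""
    return hashlib.sha256(json.dumps(data, sort_keys=True).encode()).hexdigest()

def needs_update(stored_hash, fresh_data):
    """Only rewrite the page when the freshly fetched data actually differs."""
    return stored_hash != content_hash(fresh_data)

stored = content_hash({"rating": 4.5, "review_count": 190})
needs_update(stored, {"rating": 4.5, "review_count": 190})  # False: skip the write
needs_update(stored, {"rating": 4.6, "review_count": 191})  # True: rewrite the page
```

With a check like this, the nightly job touches only the pages whose ratings actually moved.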
-
Thank you!!! That's great info.
-
Hi,
As Robin said below... I'm suggesting you think about the frequency that would be better for users/readers/clients. In the end, Google is just another reader.
Hope it helps.
Best of luck.
GR
Hi, I think you've already got a couple of good answers here, but just to throw in my thoughts: to me this would all come down to how much value you're getting for the volume of content you're creating.
It sounds to me like you have 1.3k product landing pages, and you're producing 80 articles a month, plus maybe you're indexing the review pages too?
I think frequency here becomes secondary to how much each of these things is adding. If you are indexing the review pages for specific products, those pages could just be diluting your site equity; unless they are performing a valuable function, I'd consider canonicalising them to the product pages. As the others have said, having product pages that regularly update with new reviews shouldn't be a problem, but with all the content you're adding to the site, you could be relying on Google indexing those changes far more quickly than it actually does.
If you're adding a large number of articles every month, are those articles cannibalising other pages, or each other? The way I'd try to gauge whether it's too much is by whether the pages are getting traffic, whether you're seeing a lot of flip-flopping in the keywords you're targeting, and whether you're starting to have issues with Google indexing all of your pages. As with the review pages, if the articles are providing value to your readers, getting you links, or getting you a decent amount of traffic, then grand; if they aren't generating much, I'd consider producing less, or removing/redirecting poorly performing articles after a little while to preserve site equity and help focus Google's crawl.
On the note of posting frequency, I would agree with Gaston that it's about what's right for your readers. If a lot of article-worthy content comes out at the same time, I'd post about it then and there. If this is just content you're coming up with and timing doesn't matter, spreading it throughout the month makes sense in terms of staying fresh, getting the articles indexed, and honestly not having to rush deadlines or delay releases.
-
Yeah, so basically we are bumping up all the static content on our review pages, and the reviews are updating daily. And to clarify, when you say "wouldn't work in your favor", you mean we aren't getting any benefit from the content, not that it's negatively impacting us, correct?
-
Thank you very much! Can you clarify number 3?
-
1. No, not really. It mostly depends on the percentage of content that isn't yours and can be viewed somewhere else. If reviews make up 90% of the page and they're original content from another site, that won't work in your favor. But in this case, I'm assuming you're working around that.
2. No.
3. I would say No.
4. It depends, as long as you're not creating duplicate content at scale you should be fine.
-
Hi there!
- No, not at all. There is no issue there, as long as the changes make sense.
- Nope, there is no such thing as "too much content".
- Think of Google as another of your readers/clients. Which frequency would be better for them?
- No, there aren't any negatives as long as you keep the content coherent and don't create duplicate content.
Hope it helps.
Best of luck.
GR