Content updates on a 24-hour schedule
-
Hello!
I have a website with over 1,300 landing pages for specific products. These individual pages update on a 24-hour cycle through our API. Our API pulls reviews/ratings from other sources and then writes/updates that content onto the page.
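To give a rough idea of what the nightly job does, here's a simplified sketch (the endpoint, field names, and `requests` usage are placeholders for illustration, not our actual code):

```python
import requests  # third-party HTTP library: pip install requests

CMS_API = "https://example.com/api/pages"  # placeholder endpoint

def refresh_landing_page(page_id, source_url):
    """Nightly refresh for one landing page: pull fresh ratings/reviews
    from the external source and write them onto our page via the CMS API."""
    # Fetch the latest ratings/review data from the external source.
    data = requests.get(source_url, timeout=10).json()

    # Write/update that content onto the landing page.
    requests.put(
        f"{CMS_API}/{page_id}",
        json={"rating": data["rating"], "reviews": data["reviews"]},
        timeout=10,
    )
```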
-
Is that "bad"? Could that be viewed as spammy or dangerous in the eyes of Google? (My first thought is no, it's fine.)
-
Is there such a thing as "too much content"? For example, if we are adding roughly 20 articles to our site a week, is that OK? (I know news websites add much more than that on a daily basis, but I figured I would ask.)
-
On that note, would it be better to stagger our posting? For example, 20 articles each week for a total of 80, or all 80 articles once a month? (I feel like trickle posting is probably preferable, but I figured I would ask.)
-
Are there any negatives to the process of an API writing/updating content? Should we have 800+ words of static content on each page?
Thank you all, Mozzers!
-
-
When you say 1,300 landing pages are coming online every night, that doesn't mean 1,300 new pages are being created, does it? Based on the rest of your comment, I'm taking it to mean that 1,300 pages, which were already live and accessible to Google, are being updated and the content is changing where appropriate.
In terms of the specific situation I describe above, that should be fine - there shouldn't be a problem with having a system for keeping your site up to date. However, each of the below things, if true, would be a problem:
-
You are adding 1,300 new pages to your site every night
-
This would be a huge increase for most sites, particularly if it were happening every night, but as I say above, I don't think this is the case.
-
You are actually scraping key information to include on your site
-
You mention an API, so it may be that users are submitting this content to your site for you to use. But if you are scraping the descriptions from some sites and the reviews from others, that is what would be viewed as spammy, and it seems like the biggest point of risk I've seen in this thread.
-
-
Something else occurred to me. Our API rewrites EVERYTHING every night, so technically 1,300 landing pages are coming online EVERY night, and the content isn't really changing. Is that a problem?
To sort of explain: this is a review site for other websites/apps. Our API scrapes the description from the app/site, as well as ratings from app stores etc., and then publishes that onto our page. So generally the content isn't really changing; it's just updating. Thoughts on that?
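For what it's worth, one way to avoid republishing pages whose content hasn't actually changed would be to hash the scraped data and only write when it differs from what's already stored. A hypothetical sketch of the idea (not something we've built):

```python
import hashlib
import json

def content_changed(new_data, stored_hash):
    """Return True only when the freshly scraped data differs from
    what is already published on the page."""
    # Serialize deterministically so identical data always hashes the same.
    digest = hashlib.sha256(
        json.dumps(new_data, sort_keys=True).encode("utf-8")
    ).hexdigest()
    return digest != stored_hash
```

The nightly job would then skip the write entirely (and leave the page untouched) whenever `content_changed` returns False.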
-
Thank you!!! That's great info.
-
Hi,
As Robin said below, I'm suggesting you think about the frequency that would be better for your users/readers/clients. In the end, Google is another reader.
Hope it helps.
Best of luck.
GR
-
Hi, I think you've already got a couple of good answers here, but just to throw in my thoughts: to me this would all come down to how much value you're getting for the volume of content you're creating.
It sounds to me like you have 1.3k product landing pages, and you're producing 80 articles a month, plus maybe you're indexing the review pages too?
I think frequency here becomes secondary to how much each of these things is adding. If you are indexing the review pages for specific products, those pages could just be diluting your site equity; unless they are performing a valuable function, I'd consider canonicalising them to the product pages. As the others have said, having product pages that regularly update with new reviews shouldn't be a problem, but with all the content you're adding to the site, you could be relying on Google indexing these changes far more quickly than it actually does.
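To illustrate the canonicalising idea, something along these lines in your page templates would point each review page at its parent product page (a sketch only; the function is hypothetical, and your CMS will have its own mechanism for setting canonicals):

```python
def canonical_tag(page_url, product_url=None):
    """Build the <link rel="canonical"> tag for a page's <head>.

    A review page passes its parent product page's URL so link equity
    consolidates there; every other page canonicalises to itself.
    """
    target = product_url or page_url
    return '<link rel="canonical" href="{}" />'.format(target)

# canonical_tag("https://example.com/app-x/reviews", "https://example.com/app-x")
# -> '<link rel="canonical" href="https://example.com/app-x" />'
```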
If you're adding a large number of articles every month, are those articles cannibalising other pages, or each other? The way I'd try to gauge whether it's too much is whether the pages are getting traffic, whether you're seeing a lot of flip-flopping in the keywords you're targeting, and whether you're starting to have issues with Google indexing all of your pages. Similar to the review pages, if the articles are providing value to your readers, getting you links, or getting you a decent amount of traffic, then grand; if they aren't generating much, I'd consider producing fewer, or removing/redirecting poorly performing articles after a little while to preserve site equity and help focus Google's crawl.
On the note of posting frequency, I would agree with Gaston that it's about what's right for your readers. If a lot of article-worthy content comes out at the same time, I'd post about it then and there. If this is just content you're coming up with and adding, and timing doesn't matter, spreading it throughout the month makes sense in terms of staying fresh, getting the articles indexed, and honestly not having to rush deadlines or delay releases.
-
Yeah, so basically we are bumping up all the static content on our review pages. The reviews are updating daily. And to clarify, when you say "wouldn't work in your favor", you mean we aren't getting any benefit from the content, but it isn't negatively impacting us, correct?
-
Thank you very much! Can you clarify number 3?
-
1. No, not really. It mostly depends on the percentage of content that isn't yours and can be viewed somewhere else. If reviews are 90% of the page and they're original content from another site, that won't work in your favor. But in this case, I'm assuming you're working around that.
2. No.
3. I would say no.
4. It depends; as long as you're not creating duplicate content at scale, you should be fine.
-
Hi there!
- No, not at all. There is no issue there, as long as the changes make sense.
- Nope, there is no such thing as "too much content".
- Think of Google as another of your readers/clients. Which frequency would be better for them?
- No, there aren't any negatives as long as you keep the content coherent and don't create duplicate content.
Hope it helps.
Best of luck.
GR