Content update on 24hr schedule
-
Hello!
I have a website with over 1300 landing pages for specific products. These individual pages update on a 24-hour cycle through our API. Our API pulls reviews/ratings from other sources and then writes/updates that content onto the page.
-
Is that "bad"? Can it be viewed as spammy or dangerous in the eyes of Google? (My first thought is no, it's fine.)
-
Is there such a thing as "too much content"? For example, if we are adding roughly 20 articles to our site a week, is that OK? (I know news websites add much more than that on a daily basis, but I figured I would ask.)
-
On that note, would it be better to stagger our posting? For example, 20 articles each week for a total of 80, or all 80 articles once a month? (I feel like trickle posting is probably preferable, but I figured I would ask.)
-
Are there any negatives to the process of an API writing/updating content? Should we have 800+ words of static content on each page?
Thank you, mozzers!
-
-
When you say 1300 landing pages are coming online every night, that doesn't mean 1300 new pages are being created, does it? Based on the rest of your comment, I'm taking it to mean that 1300 pages, which were already live and accessible to Google, are being updated, with the content changing where appropriate.
In terms of the specific situation I describe above, that should be fine - there shouldn't be a problem with having a system for keeping your site up to date. However, each of the things below, if true, would be a problem:
-
You are adding 1300 new pages to your site every night
-
This would be a huge increase for most sites, particularly if it were happening every night, but as I say above, I don't think this is the case.
-
You are actually scraping key information to include on your site
-
You mention an API, so it may be that users are submitting this content to your site for you to use. But if you are scraping the descriptions from some sites and reviews from others, that is what would be viewed as spammy, and it seems like the biggest point of risk I've seen in this thread.
-
-
Something else occurred to me. Our API rewrites EVERYTHING every night, so technically 1300 landing pages are coming online EVERY night, and the content isn't really changing. Is that a problem?
To sort of explain: this is a review site for other websites/apps. Our API scrapes the description from the app/site, as well as ratings from app stores etc., and then publishes that onto our page. So generally the content isn't really changing, it's just updating. Thoughts on that?
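One way to sidestep the "rewriting everything every night when nothing changed" concern is to fingerprint the pulled data and only touch the page when it actually differs. Here's a minimal sketch of that idea (all function and field names are hypothetical, not your actual API):

```python
import hashlib


def content_fingerprint(description: str, ratings: list[float]) -> str:
    """Hash the parts of the page the nightly job controls."""
    # Sort ratings so the same data in a different order hashes identically.
    payload = description + "|" + ",".join(f"{r:.1f}" for r in sorted(ratings))
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()


def should_republish(stored_fingerprint: str, description: str, ratings: list[float]) -> bool:
    """Only rewrite the landing page when the pulled data actually changed."""
    return content_fingerprint(description, ratings) != stored_fingerprint
```

With a check like this, a page only gets rewritten (and only gets a new modified date) when its description or ratings genuinely change, rather than all 1300 pages being rewritten every night regardless.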
-
Thank you!!! That's great info.
-
Hi,
As Robin said below, I'm suggesting you think about the frequency that would be better for your users/readers/clients. In the end, Google is just another reader.
Hope it helps.
Best luck.
GR -
Hi, I think you've already got a couple of good answers here, but just to throw in my thoughts: to me, this would all come down to how much value you're getting for the volume of content you're creating.
It sounds to me like you have 1.3k product landing pages and you're producing 80 articles a month, plus maybe you're indexing the review pages too?
I think frequency here becomes secondary to how much each of these things is adding. If you are indexing the review pages for specific products, those pages could just be diluting your site equity. Unless they are performing a valuable function, I'd consider canonicalising them to the product pages. As the others have said, having product pages that regularly update with new reviews shouldn't be a problem, but with all the content you're adding to the site, you could be relying on Google indexing these changes far more quickly than it actually does.
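For reference, canonicalising a review page to its product page is a single tag in the review page's head. The URLs below are placeholders, not your actual site structure:

```html
<!-- On the review page, e.g. /products/example-app/reviews -->
<link rel="canonical" href="https://www.example.com/products/example-app" />
```

This tells Google to consolidate signals from the review page into the product page, rather than treating the two as competing documents.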
If you're adding a large number of articles every month - are those articles cannibalising other pages, or each other? The way I'd try to gauge whether it's too much is whether the pages are getting traffic, whether you're seeing a lot of flip-flopping in the keywords you're targeting, and whether you're starting to have issues with Google indexing all of your pages. Similar to the review pages, if the articles are providing value to your readers, getting you links, or getting you a decent amount of traffic, then grand; if they aren't generating much, I'd consider producing less, or removing/redirecting poorly performing articles after a little while to preserve site equity and help focus Google's crawl.
On the note of posting frequency, I would agree with Gaston that it's about what's right for your readers. If a lot of article-worthy content comes out at the same time, I'd post about it then and there. If this is just content you're coming up with and timing doesn't matter, spreading it throughout the month makes sense in terms of staying fresh, getting the articles indexed, and honestly not having to rush deadlines or delay releases.
-
Yeah, so basically we are bumping up all the static content on our review pages. The reviews are updating daily. And to clarify, when you say "wouldn't work in your favor", you mean we aren't getting any benefit from the content, but it isn't negatively impacting us, correct?
-
Thank you very much! Can you clarify number 3?
-
1. No, not really. It mostly depends on the percentage of content that isn't yours and can be viewed somewhere else. If reviews are 90% of the page and they're original content from another site, that won't work in your favor, though. But in this case, I'm assuming you're working around that.
2. No.
3. I would say No.
4. It depends, as long as you're not creating duplicate content at scale you should be fine.
-
Hi there!
- No, not at all. There is no issue there, as long as the changes make sense.
- Nope, there is no such thing as "too much content".
- Think of Google as another of your readers/clients. Which frequency would be better for them?
- No, there aren't any negatives, as long as you keep the content coherent and don't create duplicate content.
Hope it helps.
Best luck.
GR