Content Publishing Volume/Timing
-
I am working with a company that has a bi-monthly print magazine that has several years' worth of back issues. We're working on building a digital platform, and the majority of articles from the print mag - tips, how-tos, reviews, recipes, interviews, etc - will be published online.
Much of the content is not date-sensitive except for the occasional news article. Some content is semi-date-sensitive, such as articles focusing on seasonality (e.g. winter activities vs. summer activities).
My concern is how to handle the launch: should we publish ALL historical content at once, and if so, should back-dates be applied to each content piece (even when dating isn't relevant)? Or should we create a publishing schedule and release the content over time, even though it is older content that isn't necessarily time-sensitive (e.g. a drink recipe)? Going forward, all newly created content will be published around each print issue's release.
Are there pitfalls I should avoid in terms of pushing out so much back content at once?
-
Converting all of those articles will take time.
I would design the site architecture and template and then immediately publish each article as soon as it is ready. This will get the articles flowing out into the search engines and get the money flowing in.
-
Hi Andrew,
I would definitely avoid throwing everything at Google all at once. This won't give any article time to gain traction, and it will severely limit your chances to share everything through social channels.
There isn't a magic timescale over which you should publish this, but with that much content you should be looking at months rather than days or weeks.
Leave the season-sensitive articles until those seasons come around, to maximise the impact they can have.
I would also update any articles that might have outdated information, so review these before they go live.
-Andy
-
Personally, I would release it over a set period. That way your content appears to be added continuously rather than in one massive one-off dump.
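To make that "decided period" concrete, here is a back-of-the-envelope sketch (Python, with made-up numbers) that spreads a backlog evenly across several months; the output could seed an editorial calendar:

```python
from datetime import date, timedelta

def drip_schedule(n_articles, start, months=6):
    """Spread n_articles evenly across roughly `months` months
    (approximated as 30-day blocks), returning one publish date each."""
    total_days = months * 30
    step = total_days / max(n_articles - 1, 1)
    return [start + timedelta(days=round(i * step)) for i in range(n_articles)]

# Example: 120 back-issue articles drip-fed over ~6 months.
dates = drip_schedule(120, date(2024, 1, 1))
```

The article count and window are placeholders; the point is simply that an even drip looks like continuous activity to both readers and crawlers.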
Related Questions
-
Fix Duplicate Content Before Migration?
My client has 2 WordPress sites (A and B). Each site is 20 pages, with similar site structures, and 12 of the pages on A have nearly 100% duplicate content with their counterparts on B. I am not sure to what extent A and/or B is being penalized for this. In 2 weeks (July 1) the client will execute a rebrand: renaming the business, launching C, and taking down A and B. Individual pages on A and B will be 301 redirected to their counterparts on C. C will have a similar site structure to A and B. I expect the content will be freshened a bit, but it may initially be very similar to the content on A and B. I have 3 questions:
1. Given that only 2 weeks remain before the switchover, is there any purpose in resolving the duplicate content between A and B prior to taking them down?
2. Will 301 redirects from penalized pages on A or B actually hurt the ranking of the destination page on C?
3. If a page on C has the same content as its predecessor on A or B, could it be penalized for that, even though the page on A or B has since been taken down and replaced with a 301 redirect?
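For the page-to-page 301s described above, a minimal Apache sketch (domain names and paths are hypothetical placeholders) might look like this in site A's `.htaccess`, repeated with B's pages on site B:

```apache
# Map each old page to its counterpart on C, one line per page,
# rather than a blanket domain redirect, so each page's equity
# flows to its true counterpart. site-c.example is a placeholder.
RedirectPermanent /about    https://site-c.example/about
RedirectPermanent /services https://site-c.example/services
```

`RedirectPermanent` (mod_alias) issues a 301; per-page mappings like this are what "individual pages 301 redirected to their counterpart" implies.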
Intermediate & Advanced SEO | futumara
What does Disallow: /french-wines/?* actually do - robots.txt
Hello Mozzers - Just wondering what this robots.txt instruction means: Disallow: /french-wines/?* Does it stop Googlebot crawling and indexing URLs in that "French Wines" folder - specifically the URLs that include a question mark? Would it stop the crawling of deeper folders - e.g. /french-wines/rhone-region/ that include a question mark in their URL? I think this has been done to block URLs containing query strings. Thanks, Luke
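Google treats a Disallow value as a prefix pattern in which `*` matches any run of characters and a trailing `$` anchors the end, so `/french-wines/?*` blocks `/french-wines/?anything` (the trailing `*` is actually redundant) but does not block deeper folders such as `/french-wines/rhone-region/?page=2`, because the path must begin with `/french-wines/?` immediately. A small sketch of that matching logic (my own re-implementation for illustration, not an official parser):

```python
import re

def rule_to_regex(rule):
    """Compile a Google-style Disallow value: '*' matches any sequence,
    a trailing '$' anchors the end, and matching is otherwise a prefix test."""
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.compile(pattern)

def is_disallowed(path, rule):
    return bool(rule_to_regex(rule).match(path))

is_disallowed("/french-wines/?page=2", "/french-wines/?*")               # True: blocked
is_disallowed("/french-wines/rhone-region/?page=2", "/french-wines/?*")  # False: deeper folder
is_disallowed("/french-wines/bordeaux", "/french-wines/?*")              # False: no query string
```

So yes, the rule was almost certainly written to block query-string URLs within `/french-wines/` itself; deeper folders need their own rules (e.g. `Disallow: /french-wines/*?`) if they should be blocked too.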
Intermediate & Advanced SEO | McTaggart
Best practice for expandable content
We are in the middle of having new pages added to our website. The site will have an information section containing various details about a product; this information will be several paragraphs long. We want to show the first paragraph and have a "read more" button to reveal the rest of the content, which is hidden. What's Google's view on this? Is it bad for SEO?
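One common pattern keeps the full text in the HTML so crawlers can read everything even though only the first paragraph is shown initially. A minimal markup sketch using the native `<details>` element (the class name and copy are illustrative):

```html
<section class="product-info">
  <p>First paragraph, always visible to users and crawlers.</p>
  <details>
    <summary>Read more</summary>
    <p>The remaining paragraphs live here: present in the source,
       hidden until the user expands them, no JavaScript required.</p>
  </details>
</section>
```

The key point is that the hidden text is delivered in the page source rather than fetched on click.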
Intermediate & Advanced SEO | Alexogilvie
Duplicate Page Content - Shopify
Moz reports that there are 1,600+ pages on my site (Sportiqe.com) that qualify as Duplicate Page Content. The website sells licensed apparel, causing shirts to go into multiple categories (i.e. LA Lakers shirts would be categorized in three areas: Men's Shirts, LA Lakers Shirts, and NBA Shirts). It looks like "tags" are the primary cause behind the duplicate content issues.
Collection tags, for example:
- http://www.sportiqe.com/collections/la-clippers-shirts (preferred URL)
- http://www.sportiqe.com/collections/la-clippers-shirts/la-clippers (URL with tag)
- http://sportiqe.com/collections/la-clippers-shirts/la-clippers (URL with tag, without the www)
- http://sportiqe.com/collections/all-products/clippers (different collection, with tag and same content)
Blog tags, for example:
- http://www.sportiqe.com/blogs/sportiqe/7902801-dispatch-is-back
- http://www.sportiqe.com/blogs/sportiqe/tagged/elias-fund
Would it make sense to do 301 redirects for the collection tags and use the Parameter Tool in Webmaster Tools to exclude blog post tags from the crawl? Or is there a possible solution with the rel=canonical tag? Appreciate any insight from fellow Shopify users and the Moz community.
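If the goal is to consolidate the tag and non-www variants onto the preferred collection URL, a rel=canonical in the theme layout is one option. A minimal sketch for a Shopify theme (Liquid exposes a `canonical_url` object; verify on your theme version that it resolves tagged collection pages to the URL you actually want, and fall back to building the href from `collection.url` if it doesn't):

```liquid
{% comment %} In layout/theme.liquid, inside <head> {% endcomment %}
<link rel="canonical" href="{{ canonical_url }}" />
```

Canonicals would address the tag and www/non-www duplicates without the redirect maintenance burden, though they are a hint rather than a directive.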
Intermediate & Advanced SEO | farmiloe
Sitelinks in 7-pack / blended / local results
I have a client who has been ranking well in the 7-pack for local searches for 1.5+ years. I recently noticed a competitor's Google Places link has little sitelinks attached, but my client's link doesn't have them. This makes me sad. To provide a concise question: what can I do to help my client get sitelinks along with his Google Places listing in the 7-pack / blended / local results?
Some example data: My client's business is called Ambiance Dental and his website is www.mycalgarydentist.com. An example search to see what I'm talking about is "calgary family dentist". The competitor that's showing sitelinks is www.aestheticdentalstudio.ca, which has a title of "Dentist in Calgary | Cosmetic Treatment in Calgary". The sitelinks you'll see are "Dr. Gordon Chee", "Links", and "Dr. Alexa Geminiano". Notice that my client doesn't have the same sitelinks.
Some further data: If you do a search for "calgary aesthetic dentist" you'll see the competitor's 1-box local result (is that what it's called?) with his Google Places data and sitelinks. If you search for "calgary ambiance dentist" you'll get a similar SERP layout for my client, again with no sitelinks.
My client's sitelinks: If you search for "ambiance dental calgary" you'll see that Google does offer sitelinks for his site, just not in Google Places, it seems.
My client's website: My client's website has the navigation coded as a list (UL) without any JavaScript or complicated code messing things up. The competitor's navigation is built similarly, though he has about 40 more pages in his main navigation. My client's page names are concise, which I've read helps with sitelinks; the website is coded very cleanly; and the URLs of his site are clear and concise without a complicated folder structure, so it seems like we're doing everything right.
I appreciate any input other mozzers can provide, and discussion on the topic. I'm sure there are others who would benefit from local sitelinks as well!
Intermediate & Advanced SEO | Kenoshi
How And/Or If To Prune Footer Links
Hi, I have a site with a site-wide footer that currently has 28 internal links. The footer terms are the terms the pages are focused on. This footer is on every page of the site (hundreds of pages). Some pages of my site have 10 or so additional links pointing to internal and external pages (besides the footer), and some pages (like the homepage) have about 50 links besides the footer. I'm going after a half dozen new terms with new pages that I would be adding to the site-wide footer. Do you think I should trim the existing footer before adding these new terms? I guess I would remove the terms that show no real hope of ever getting to page one - like pages stuck in the 40s - or pages I don't much care whether they rank or not. Would trimming to a smaller number do more to help the remaining linked pages/terms? What do you think? Thanks!
Intermediate & Advanced SEO | 94501
When to delete low quality content
If 75% of a site is poor quality, but still accounts for 35% of the traffic to the site, should the content be 404ed? Or, would it be better to move it to a subdomain and set up 301 re-directs? This site was greatly affected by Panda.
Intermediate & Advanced SEO | nicole.healthline
We are changing ?page= dynamic URLs to /page/ static URLs. Will this hurt the progress we have made with the pages using dynamic addresses?
Question about changing URLs from dynamic to static to improve SEO, but concerned about hurting the progress made so far.
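Done with per-URL 301s, the change should carry most of the accumulated equity over. A minimal Apache mod_rewrite sketch, assuming a single `page` query parameter and hyphenated slugs (both assumptions - adjust the pattern to your actual URL structure):

```apache
# .htaccess sketch: 301 /index.php?page=foo-bar to /foo-bar/
# The script name, parameter name, and slug pattern are assumptions.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^page=([a-z0-9-]+)$
RewriteRule ^index\.php$ /%1/? [R=301,L]
```

The trailing `?` in the substitution strips the old query string so the redirect target is the clean static URL.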
Intermediate & Advanced SEO | h3counsel