Delay release of content or fix after release
-
I am in the midst of moving my site to a new platform. As part of that I am reviewing each article for SEO - titles, URLs, content, formatting/structure, and so on. I have about 200 articles to move across and my plan is to look at each article and update it for these factors.
I have all the old content moved across to the new server as-is (the old server is still the one to which my domain's DNS records point). At a high level I have two choices:
- Point DNS to the new server, which will expose the same content (which isn't particularly SEO-friendly), and then work through each article, fixing the various elements to make them more user-friendly.
- Go through each article, fixing content, structure, etc. and THEN update DNS to point to the new server.
Obviously the second option adds time before I can switch across. I'd estimate it will take me a few weeks to get through the articles. Option 1 allows me to switch pretty soon and then start going through the articles and updating them.
An important point here is that the articles already have new (SEO-friendly) URLs and titles on the new server. I have 301 redirects in place pointing from the old URLs to the new ones. So, it's "only" the content of each article that will be changing on the new server, rather than the URLs, etc.
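If it helps, a redirect mapping like this can be spot-checked in bulk before and after the DNS switch. A minimal sketch in Python follows - the URL pairs are placeholders for your real old-to-new map, and `fetch` is whatever HTTP client you prefer (wrapping `urllib.request` with redirects disabled, or just `curl -I` by hand):

```python
# Sketch: verify each old URL answers with a single 301 pointing at its
# new counterpart. The map below is illustrative, not the real site's URLs.
from urllib.parse import urljoin

REDIRECT_MAP = {
    "/old-article-1.php": "/seo-friendly-article-1/",
    "/old-article-2.php": "/seo-friendly-article-2/",
}

def check_redirects(fetch, base="https://example.com"):
    """fetch(url) must return (status_code, location_header).

    Returns a list of (old_path, status, location) tuples for any URL
    that did not 301 to the expected destination.
    """
    failures = []
    for old, new in REDIRECT_MAP.items():
        status, location = fetch(urljoin(base, old))
        if status != 301 or location != urljoin(base, new):
            failures.append((old, status, location))
    return failures
```

Running this once per deploy catches the classic migration mistakes (302 instead of 301, redirect chains, a missed mapping) before the search engines see them.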
So, I'd be interested in any suggestions on the best approach - move across to the new server now and then fix content or wait till all the content is done and then switch to the new server.
Thanks.
Mark
-
I would definitely at least clean up the article HTML and structure before launching the pages, since you don't want people who might land on them before they're updated to have a weird experience. As far as optimizing them for SEO, I think you could go ahead and make the pages live and roll out edits as you make them. Prioritizing the pages based on highest-traffic/best-converting first is the way to go. If switching your platform is going to make your site easier to crawl, you definitely want to do that sooner rather than later - plus, having the new pages live will allow them to start accumulating some links even before you make keyword-related changes.
In general with a major change like this I recommend changing as few other things as possible simultaneously. It's OK to make more gradual changes, and it gives Google fewer things to get used to at one time.
-
If search engines did not catch up with changes we make and improve our ranking for positive changes, there'd be little point to Search Engine Optimization.
If Google is already seeing your pages anyway and the move will only make them better (even if they are still not where you'd like them to be), then you can go ahead and move them if you like, as long as the move will not create a confusing situation for the people looking at the pages.
As you fix the pages to your satisfaction, wait for them to be crawled again or resubmit them using Fetch as Google to possibly get them crawled faster. [And as far as H2 tags, if that is your main worry, I wouldn't worry too much--they probably won't make much difference.]
-
Thank you for the response, Linda. So, this is a slightly tricky one because I don't have a specific deadline per se, but also want to build a plan that gets me over to the new server as soon as possible, without falling into a trap of the switchover date just "floating". Let me put it this way.
I have the following "phases" for each of the articles (as a reminder, I have around 200 such articles):
- Create all articles: Using the planned titles, categories and URLs but with no content.
- Move content across from old site to the new articles. Done with straight cut-and-paste (don't ask about importing - long story :)). This gets the data into WordPress posts as-is, but includes HTML markup from the old CMS, doesn't correctly use styles (some articles look pretty messy) and doesn't have a consistent use of H2 tags (H1 is the title). Most articles look "OK" but a) some are messy but readable for the human eye and b) the lack of H2 tags means there's no structure from an SEO-perspective.
- Clean up article HTML/structure. Review each article, cleaning up the HTML and ensuring the content still makes sense and reads well. HTML clean-up includes removing HTML relevant to the old CMS and making sure I have article structure through use of H2 tags.
- Review each article for SEO. I will be using the Yoast SEO plugin and making the changes it recommends. The keywords are already decided (the URLs and titles in step 1 reflect those decisions), so for each article I will be reviewing the rest of the content and making sure it looks acceptable from an SEO perspective.
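Part of step 3 - finding which of the ~200 posts still lack any H2 structure - can be scripted rather than eyeballed. A minimal sketch using only the Python standard library; the function names and the idea of flagging posts with zero H2 tags are my own assumptions, and the post content is illustrative:

```python
# Sketch: flag posts whose HTML body contains no <h2> tags, so they can be
# put on the worklist for the structure clean-up pass.
from html.parser import HTMLParser

class H2Counter(HTMLParser):
    """Counts <h2> start tags as the parser feeds through a document."""

    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h2":
            self.count += 1

def posts_missing_h2(posts):
    """posts: dict mapping title -> HTML body; returns titles with no <h2>."""
    missing = []
    for title, body in posts.items():
        parser = H2Counter()
        parser.feed(body)
        if parser.count == 0:
            missing.append(title)
    return missing
```

Fed with content exported from WordPress (via WP-CLI or the REST API, for example), this turns "review each article" into a prioritized list of the posts that actually need structural work first.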
I am currently done with step 2 (all articles moved across, albeit some looking somewhat untidy and without any document structure). I am starting to work through step 3 now, but this is a time-consuming process.
I guess what this all boils down to is: if I switch across, will search engines "catch up" later, when I revise the content for structure and SEO? The existing site is not good - so, as it stands, search engines don't look kindly on it.
One option is to just bite the bullet and move across (I'd see benefits from the title and URL changes, with the associated 301 redirects in place) and subsequently do steps 3 and 4. I'd actually like to do that but ONLY if I can be confident the search engines will end up in the same place as they would if I just waited till step 4 is done.
Another option is to finish step 3, move to the new server and then start updating articles for SEO (step 4).
Thanks.
Mark
-
Why are you switching? If there is no reason to be in a rush, then I'd wait and make the change when everything is ready--a few weeks isn't that long.
If there is a particular reason for haste (like you were having technical problems with the old platform or a lot of your traffic is mobile and you want to make the April 21 Google deadline), then I think it depends on the state of the content.
If it is not perfect but still makes sense with the new titles and URLs, I'd do the update for your most important content and switch. If it is terrible, I'd wait. There is no point getting traffic for bad content.