How to Implement Massive SEO Modifications
-
Hi everyone,
I'm implementing some fairly significant changes on a client's website and wanted to know whether it's better to implement all the changes at once or gradually.
The changes are:
1. Amended information architecture
2. Completely new URLs
3. New metadata and some new on-page content
4. Meta robots 'noindex, follow' on approximately 90% of the site
Can I make all these changes in one go (that would be my preference), or should I implement them gradually? What are the risks?
Many thanks
James
-
Hi Joe,
Thanks for the response. Having had a variety of different opinions, and still not being 100% sure of the right answer, I spent a LOT of time crawling through SEOmoz Q&A.
Takeaways from my digging around are:
- Changes to title tags and URLs should be implemented separately. As you state above, the reason for this is so that you can pinpoint problems if they arise (see point 3 of the answer): http://www.seomoz.org/qa/view/49136/revising-urls
- Title tag changes should also be implemented in stages: homepage, top 50 pages, everything else (again, see point 3 of the answer): http://www.seomoz.org/qa/view/39946/title-tags-global-changes. (As an interesting aside, Dr Pete clearly states that when making sitewide changes, don't make more than one set of changes per page, as it could cause an over-optimisation penalty.)
- URL structure changes should be implemented all in one go: http://www.seomoz.org/qa/view/45183/update-url-structure (this link is an amazing guide from Everett Sizemore on exactly how to implement URL changes; recommended reading!)
I appreciate there's no right or wrong answer, but I think that with the above in mind, the approach I'm going to take to these changes is a scientific one: make a change, assess results, move forward.
1. Implement title tag changes in stages (monitoring site performance at every stage): homepage, category pages, everything else.
2. Add new on-page content.
3. Add new information architecture (a couple of new categories; nothing significant).
4. Implement URL changes through 301 redirects all in one go. Keep the old site's XML sitemap in place. Once the site has been crawled (and the new pages found), move to the new sitemap and update internal links.
5. Implement meta robots 'noindex, follow' on various sections of the site. Not all in one go, but section by section, monitoring results and then moving on if no issues arise.
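For anyone following along, the section-by-section noindex in the last step is applied with a standard robots meta tag in each page's head. A minimal sketch (the 'follow' part is what keeps spiders passing link equity through the page even though it's dropped from the index):

```html
<!-- Drops the page from the index but lets spiders follow its links -->
<meta name="robots" content="noindex, follow">
```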
I'd be interested to know what you think of that as a plan. Also, I need to send out some love to Dr Pete and Everett Sizemore for their Q&A answers!
James
-
#2 - Completely new URLs says it all for me. The others are all subsets of that change. If possible, you need to address these changes with some form of 301 redirect so that the spiders can follow your changes. Update the .htaccess file, or even create static PHP redirect headers or similar if you have to. This should prevent the search engines from reporting the dreaded 404 and the pages getting dumped from the index.
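As a sketch, the .htaccess approach might look like this (the paths are hypothetical examples, assuming an Apache server with mod_alias and mod_rewrite available):

```apache
# Permanent (301) redirect from a single old URL to its new equivalent
Redirect 301 /old-category/widget.html /widgets/blue-widget/

# Or pattern-based, mapping a whole old section onto the new structure
RewriteEngine On
RewriteRule ^old-category/(.*)$ /widgets/$1 [R=301,L]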
#4 - The noindex is not something you have to worry about, as you are removing pages from the SERPs, not trying to get them ranked. Any page that is getting noindexed is out of the SEO equation at this point.
#3 - This will improve rankings/searchability, so you are not looking at a negative effect here. Updates to these pages, if done correctly, generally have favourable results and at worst show 'no change' in the SERPs.
#1 - I would need to know more detail on this one, but the new architecture will probably be reflected in #2's URLs, so if that is solved, so is #1. Again, a clearer, more easily accessible architecture hopefully allows the spiders to categorise the sections of your site effectively. The new IA will probably be more pleasing to users, which will have its own benefits as well.
---- And the final vote... All at once: just address the 404s and you should be OK.
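One practical way to "address the 404s" is to sanity-check the redirect map before deploying it, for instance catching redirect chains (an old URL that 301s to a URL which is itself redirected, wasting crawl budget and diluting the redirect). A minimal Python sketch, with a hypothetical mapping:

```python
# Sanity-check a 301 redirect map before deploying: flag any "chains",
# i.e. an old URL whose target is itself another redirect's source.
# The example URLs below are hypothetical.

def find_redirect_chains(redirect_map):
    """Return (old, new) pairs whose target is also a redirected URL."""
    return [(old, new) for old, new in redirect_map.items() if new in redirect_map]

redirects = {
    "/old-category/widget.html": "/widgets/blue-widget/",
    "/old-category/": "/widgets/",
    "/widgets/": "/products/widgets/",  # makes /old-category/ -> /widgets/ a chain
}

chains = find_redirect_chains(redirects)
for old, new in chains:
    print(f"Chain: {old} -> {new} -> {redirects[new]}")
```

Collapsing each chained entry so every old URL 301s directly to its final destination keeps things clean before the switch is flipped.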
-
NoIndex 90% of a site? I'd be interested to hear why that makes sense in any situation. Maybe only implement half of those noindex tags at first to see if you get the desired result.
As for the title, meta and content, all at once is fine. Hopefully your new stuff is better than the old! Best of luck!
-
I came here to tell him to do the exact opposite! I was going to suggest doing one change at a time to measure and/or A/B test results, to make sure each change delivered maximum benefit. After reading your response and his issues, I've changed my opinion and agree with you that it's probably best to do all of these at once in one MAJOR revision and then tweak after that.
-
Considering how massive the changes are, I'd say it's best to do them all at once. This will let you start rebuilding as soon as possible. Making one big change and then waiting to start ranking again, followed by another big change that could drop them out of the rankings again, would likely mean a longer period of your client not getting traffic. That said, the on-page and metadata changes don't need to be made at the same time if resources are limited.
One problem with doing this all at once is that it will be more difficult to evaluate the effect of each change. This might not be a huge deal to you, but sometimes it is nice to know what return came from each change.