Guidance for setting up new 301s after having just done so
-
Hi
I've recently set up a load of 301 redirects for a client's new site design/structure relaunch.
One of the things we did was take the keywords out of the sub-category landing page URLs, since they now feature in the top-level category page URLs and we don't want to risk over-optimisation by repeating keywords across the full URLs. So the URLs have changed, and the original pages are 301'd to the new current pages.
However, if rankings start to drop and I decide to change the URLs again to include keywords in the final part of the URL for the sub-category landing pages too, what's the best way to manage the new redirects?
-
Do I redirect the current URLs (which have only been live for a week, and which have the original/old URLs 301'd to them) to the new URLs? (I'm worried this would create a chain of 301s, which I've heard is not ideal.)
-
Or do I just redirect the original URLs to the new ones, and forget about the current pages/URLs since they've only been live for a week?
(I presume that's best avoided, since the GWT sitemaps area shows most of the new URLs as indexed now, so I presume Google sees those as the replacements for the original pages.)
-
Or should they all be 301'd (both the original and the current URLs to the new ones)?
-
Or is it best to just run with the current set-up and avoid making too many changes again, and setting up even more 301s after having just done so?
Many Thanks
Dan
-
-
Hi Aleyda
Sorry, one more question:
I've noticed that the new pages have started to rank well already.
In this case, is it still advisable to 'Fetch as Google' the old redirected URLs in GWT, or, since the new pages have been found and are ranking well, is it best not to fetch the old URLs?
Best regards
Dan
-
OK, great, many thanks Aleyda!
-
You shouldn't remove them. If you implement the 301 redirects and do everything that's described above, then, although it might take some time, they should drop out of the index while passing their value on through the redirects.
-
Great thanks Aleyda
I have already done most of those things except fetching as Googlebot in GWT, which I'll do next.
Please confirm, then, that there's no need to remove the old URLs in GWT at any point and they should just 'fade away'?
many thanks
dan
-
Hi Dan,
Even though it's not a domain migration, there are a lot of aspects that apply in this situation too! For example:
- Update your internal links (menus, content within your own site) and external links (at least the most important ones) that point to the old URLs so that they target the new ones.
- Make sure the 301 redirects have been correctly implemented (on a URL-to-URL basis) to the relevant pages, and that the new pages are still relevant and optimized for the same keywords as the previous ones.
- Generate a new XML sitemap with your new URLs and resubmit it in GWT.
- Resubmit your old URLs with the Fetch as Googlebot option in GWT so Google learns about the update and identifies the 301 redirects.
- Monitor the traffic and visibility (organic search and general, with GA and GWT) that the old URLs still receive vs. the new ones. If everything has been correctly implemented, there should be a downward trend for your old URLs and an upward trend for your new ones. If not, check which old URLs are still receiving a high amount of traffic and why: they may still be linked from an external source, may not have been correctly redirected, etc.
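The URL-to-URL point in the checklist can be sketched in code. The following Python is a minimal, hypothetical sketch: the URL paths are invented for illustration, and it assumes the redirects are served via Apache's `Redirect 301` directive (mod_alias). It renders one directive per old URL and flags any target that is itself redirected, which would create a 301 chain.

```python
# Hypothetical old -> new mapping (paths invented for illustration).
redirects = {
    "/category/widgets/blue-widgets": "/widgets/blue",
    "/category/widgets/red-widgets": "/widgets/red",
}

def apache_directives(mapping):
    """Render one Apache `Redirect 301` line per old URL (URL-to-URL basis)."""
    return ["Redirect 301 {} {}".format(old, new)
            for old, new in sorted(mapping.items())]

def chained_targets(mapping):
    """Return redirect targets that are themselves redirected (301 chains)."""
    return sorted(new for new in mapping.values() if new in mapping)

for line in apache_directives(redirects):
    print(line)
print("chains:", chained_targets(redirects))  # empty list = no chains
```

For this sample mapping the chain check comes back empty; if a "new" URL were later redirected again, it would show up in `chained_targets` as a candidate for flattening.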
Follow the previous recommendations and you should be able to minimize the effects of the changes.
Thanks,
Aleyda
-
Many thanks, Aleyda, for your detailed reply.
In this case, though, it's not a site migration, just a redesign; but thanks for the great info, which I'll keep for future reference when tackling any migrations.
In the case of a redesign where the URLs have changed (hence the 301s), would you recommend that after approximately two weeks you 'Fetch as Google' the old page URLs in GWT and then, a few days after that, remove the old URLs? (Or should they just disappear eventually?)
Many Thanks
Dan
-
Hi again Dan,
I would recommend that you first follow this checklist to make sure your whole migration is SEO friendly, that you're not forgetting anything in the migration process, and that you validate everything well.
The best process is the most straightforward one: do the migration as simply as possible (from one domain to the other with 301 redirects) and avoid changing many things at the same time. Please keep in mind that the redirects will help, but they're not the only thing you should do; take a look at the checklist and you'll see.
It's actually natural for your rankings to drop in the days following a migration: search engines need time to crawl and index the URLs on the new site and, with the help of the 301 redirects, identify that you have moved your content from the old domain to the new one.
Among the additional things the checklist recommends for an SEO-friendly migration: notify Google that you're moving your domain through Google Webmaster Tools (there's an option for this); create a new profile and generate a sitemap for the new domain; verify that the 301 redirects are all well implemented and that content relevance is kept on those pages for the keywords they were ranking with; update external and internal links pointing to the old URLs so they go to the new ones, prioritising the landing pages that had the most highly relevant traffic; change as little of the structure as possible (one change at a time, so you can identify any issue); and check and follow up on crawl errors. If you do everything recommended in the checklist, little by little you should see your rankings recover, but you need to give it time (usually at least a couple of months if everything goes right).
So, there are a lot of other factors, and you should make sure to follow the recommendations to minimize the negative impact, although avoiding it entirely is impossible. What is definitely not recommended is updating your URLs many times or redirecting many times; this is why the process should be effectively planned and designed from the beginning, validating all the aspects and keeping as many relevance signals as you can (like the keywords in the appropriate elements of the pages). Making one change at a time keeps things much simpler (first migrate, then, once the effects of the migration have settled, start optimising and changing the new domain's architecture); otherwise it can get really messy, which I'm afraid is what you're already experiencing.
I hope this helps! Aleyda
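As a rough illustration of the sitemap step mentioned above, here is a minimal Python sketch that builds a bare-bones sitemap.xml for the new URLs. The domain and paths are hypothetical, and in practice the sitemap would normally come from your CMS or a plugin rather than hand-rolled code.

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a minimal sitemap.xml body listing each new URL once."""
    entries = "\n".join(
        "  <url><loc>{}</loc></url>".format(escape(u)) for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        "{}\n</urlset>".format(entries)
    )

# Hypothetical new URLs for the relaunched structure.
print(build_sitemap([
    "https://example.com/widgets/blue",
    "https://example.com/widgets/red",
]))
```

The resulting file is what you would resubmit in GWT for the new URL set; note that the sitemap should list only the new (final) URLs, never the redirected old ones.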
-
301 chains aren't ideal but, as long as they end at a real page, they're OK. The danger is that a bad 301 gets in there and creates an infinite loop (A points to B, B points to C, C points to A).
I would avoid changing URLs on a frequent basis. A URL change is chaos for pages and results. Sometimes a page will inherit all of its predecessor's rankings in a timely fashion, but I've also seen it take time. You should expect some ranking drop after a URL structure change. It should recover in time but, again, there's no telling how fast that happens. As long as you can afford a short-term drop, I say let it ride.
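The chain-and-loop point above can be made concrete with a small sketch. This Python helper (hypothetical paths, invented function name) follows an old-to-new redirect map hop by hop and raises as soon as it revisits a URL, catching exactly the A → B → C → A case:

```python
def resolve(mapping, url, max_hops=10):
    """Follow redirects from `url`; return (final_url, hops) or raise on a loop."""
    seen = {url}
    hops = 0
    while url in mapping:
        url = mapping[url]
        hops += 1
        if url in seen or hops > max_hops:
            raise ValueError("redirect loop or excessive chain at {}".format(url))
        seen.add(url)
    return url, hops

chain = {"/a": "/b", "/b": "/c"}              # a 2-hop chain: works, not ideal
loop = {"/a": "/b", "/b": "/c", "/c": "/a"}   # the infinite-loop case

print(resolve(chain, "/a"))  # ('/c', 2)
```

Running `resolve(loop, "/a")` raises a `ValueError`, which is the situation to audit for before flattening everything to point directly at the final URLs.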