Best practice to change the URL of all my site pages
-
Hi,
I need to change the URLs of all my site's pages because I am moving the site to another CMS platform that has its own URL structure:
- Currently the site ranks highly for all the relevant keywords I am targeting.
- All pages have backlinks
- Content and metadata should remain exactly the same.
- The domain should stay the same
The plan is as follows:
- Set up the new site using a temporary domain name
- Copy over all content and metadata
- Set up all redirects (301)
- Update the domain name and point the live domain to the new one
- Watch closely for 404 errors and add any missing redirects
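The redirect step can be verified automatically before and after cutover. Below is a minimal sketch in Python; the URLs in `REDIRECT_MAP` and the helper names are placeholders, assuming you can export the old URL list from your current CMS or server logs:

```python
# Verify that each old URL returns a single 301 pointing at the expected
# new URL. The mapping below is illustrative -- build the real one from
# your CMS export.
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Surface 3xx responses as HTTPError instead of following them."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

OPENER = urllib.request.build_opener(NoRedirect)

REDIRECT_MAP = {
    "https://www.example.com/products.html": "https://www.example.com/products",
    "https://www.example.com/about-us.html": "https://www.example.com/about",
}

def classify(status, location, expected):
    """Return None if the redirect is correct, else a short problem label."""
    if status == 404:
        return "404: missing redirect"
    if status != 301:
        return f"expected 301, got {status}"
    if location != expected:
        return f"redirects to {location!r}, expected {expected!r}"
    return None

def check(old_url, expected):
    # Fetch without following redirects so we see the 301 itself.
    try:
        resp = OPENER.open(urllib.request.Request(old_url, method="HEAD"))
        status, location = resp.status, ""
    except urllib.error.HTTPError as err:
        status, location = err.code, err.headers.get("Location", "")
    return classify(status, location, expected)

if __name__ == "__main__":
    for old, new in REDIRECT_MAP.items():
        problem = check(old, new)
        if problem:
            print(old, "->", problem)
```

Running this against the redirect map once on the temporary domain and again after the DNS switch catches missing redirects before Googlebot does.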
Questions:
- Any comments on the plan?
- Is there a way (the above plan or any other) to make sure rankings will not be hurt?
- What entries should I add to the sitemap.xml: new pages only or new pages and the pages from the old site?
Thanks,
Guy.
-
As Ben and RDK shared, your transition plan is solid.
"Is there a way (the above plan or any other) to make sure ranking will not be hurt?"
If you have links, there will always be some loss of Domain Authority (DA) during the transition. Approximately 90% of the link value transfers through a 301, but there is some degradation. You can minimize this by updating any links you control to point to the new URLs: social media profiles, email signatures, etc.
Another recommendation I make to anyone moving sites is to start a link-building campaign about 30 days after the move to generate links to the new URLs. These links will help offset any minor loss from the move. If the move is done properly, you can be back to your old rankings, or better, within 60 days.
"What entries should I add to the sitemap.xml: new pages only or new pages and the pages from the old site?"
A sitemap should include only the pages on the new site that you want included in the index. Old URLs that 301 elsewhere do not belong in it.
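A quick sketch of generating such a file with Python's standard library; the URL list is a placeholder for the new site's canonical URLs:

```python
# Build a sitemap.xml containing only the new site's canonical URLs.
import xml.etree.ElementTree as ET

NEW_URLS = [
    "https://www.example.com/",
    "https://www.example.com/products",
    "https://www.example.com/about",
]

def build_sitemap(urls):
    # Namespace required by the sitemaps.org protocol.
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

if __name__ == "__main__":
    print(build_sitemap(NEW_URLS))
```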
One final note. Since you are changing URLs during this move, think the new URLs through carefully. Try to build URLs that will last 20 years, and account for any foreseeable changes. One example is to remove technology extensions from URLs: instead of mysite.com/products.html, use mysite.com/products.
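One way to derive that mapping mechanically, as a sketch, assuming the only change is dropping the extension (anything not covered by the rule would still be mapped by hand):

```python
# Derive extension-free URLs from legacy ones, preserving any query string.
from urllib.parse import urlsplit, urlunsplit

TECH_EXTENSIONS = (".html", ".htm", ".php", ".asp", ".aspx")

def strip_extension(url):
    parts = urlsplit(url)
    path = parts.path
    for ext in TECH_EXTENSIONS:
        if path.endswith(ext):
            path = path[: -len(ext)]
            break
    return urlunsplit(parts._replace(path=path))
```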
-
Some quick thoughts:
1. Take advantage of the SEOmoz site crawler sometime before going live; it will help detect any duplicate-URL problems, which are common in transitions. Don't forget about redirecting any PPC campaign landing pages you may have.
2. Based on experience, rankings will dip during the transition, regardless of preparation. If you spent time tightening your SEO metadata and structured the new site exactly the same, you should see overall improvement after a few weeks. How quickly you are reindexed varies from site to site; if you're seeing 20,000+ visitors a month, indexing should be relatively quick (that's just based on my small sample of personal experience).
3. The XML sitemap should cover only the current site you want indexed. I'm not sure what you mean by including the old site's pages; that could lead to indexing problems no matter how you work it out. With luck, your 301s will feed the wounded patient until Googlebot comes to stitch everything up.
If you could document your experience, it would be really helpful. Testing and measurement = smart SEO.
-
We have just completed a similar update on our site using pretty much exactly the technique you describe above. As you say, keep a really close eye on the 404 reports, as they can catch you out. The other thing to watch for is maintaining any historical 301 redirects; they are easy to miss when implementing such a big change. We suddenly found loads of old links that we didn't even realise existed!
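Those historical redirects are also easy to leave chained (old-old URL to old URL to new URL), and each extra hop leaks a little link value and latency. A small sketch of flattening a redirect map so every legacy URL points directly at its final destination, assuming the map is expressed as an old-to-new dictionary:

```python
# Flatten redirect chains so every legacy URL 301s straight to its final
# destination instead of hopping through intermediate redirects.
def flatten(redirects):
    flat = {}
    for src, dst in redirects.items():
        seen = {src}
        # Follow the chain until we reach a URL that is not itself redirected.
        while dst in redirects and dst not in seen:
            seen.add(dst)
            dst = redirects[dst]
        flat[src] = dst
    return flat
```

Running the merged historical and new redirect maps through something like this before deploying keeps every 301 to a single hop.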