Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
How to 301 redirect from an old domain and its pages to a new domain and its pages?
-
Hi, I am a real newbie to this and I am hoping for a guide on how to do it. I have seen a few Moz posts and they are quite confusing; hopefully somebody can explain it to me in layman's terms.
I would like to 301 redirect this way (both websites are in the same niche):
oldwebsite.com > newwebsite.com
and also its pages:
oldwebsite.com/test > newwebsite.com/test
So my question here is: I would like to host my old domain and its pages on my new website's hosting in order to redirect them to my new domain and its pages.
How do I do that? Would my previous pages' links overwrite my new pages' links, or would the link juice be added on top?
Do I need to host the whole old domain's website on my new hosting in order to redirect the old pages?
Really confusing here, thanks!
-
I tried pasting that code there and changing the old domain and new domain, but it doesn't work.
-
Yes, you should move your content to your new host and redirect your old domain to the new one.
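As a side note, a minimal sketch, assuming your host lets you point the old domain at its own (even empty) document root rather than parking it on top of the new site's files, and that mod_rewrite is enabled (newdomain.com is a placeholder for your real new domain). In that setup no host-name condition is needed, because only requests for the old domain ever reach that folder:
RewriteEngine On
# Forward every path to the same path on the new domain (path-preserving 301).
RewriteRule ^(.*)$ http://newdomain.com/$1 [L,R=301]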
-
By parking the domain on the new hosting, all my previous pages will be gone, right? Unless I move everything from the old server to the new one, so I can redirect those pages to my new ones too?
-
Yes. Simply park your domain on your new hosting and redirect it using this code.
-
Do we need to keep this redirect permanently? For example, keep the hosting on so that it keeps redirecting to my new site?
-
Yes; as long as the redirect stays in place, all of your old links' value moves to the new domain without hurting your SEO. It's a permanent redirect, called a 301.
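To make the 301 vs. 302 distinction concrete, a short sketch (newdomain.com is a placeholder; only one rule should ever be active, so the second is commented out):
RewriteEngine On
# R=301 marks the redirect permanent: search engines transfer the old
# URL's ranking signals to the new URL.
RewriteRule ^(.*)$ http://newdomain.com/$1 [L,R=301]
# R=302 would mark it temporary instead; search engines traditionally keep
# the old URL indexed, so avoid it for a domain move:
# RewriteRule ^(.*)$ http://newdomain.com/$1 [L,R=302]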
-
By using this code:
Options +FollowSymlinks
RewriteEngine On
RewriteCond %{HTTP_HOST} ^olddomain\.com$ [NC]
RewriteRule ^(.*)$ http://newdomain.com/$1 [L,R=301]
how would it affect my SEO? Would the value merge from the old links into the new links?
-
Hi,
It's better to ask than not to ask!

First, you should host your old domain on your new site and use this .htaccess code to 301 redirect all your URLs to the new domain (don't forget to replace olddomain.com and newdomain.com with your old and new domains):
Options +FollowSymlinks
RewriteEngine On
RewriteCond %{HTTP_HOST} ^olddomain\.com$ [NC]
RewriteRule ^(.*)$ http://newdomain.com/$1 [L,R=301]
RewriteCond %{HTTP_HOST} ^www\.olddomain\.com$ [NC]
RewriteRule ^(.*)$ http://newdomain.com/$1 [L,R=301]
Then you should submit a request in Webmaster Tools under the Change of Address section to move your site's authority completely.
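One caveat on the code above, plus an alternative sketch: the RewriteCond on HTTP_HOST is what prevents an infinite loop when the old and new domains share the same document root. If the old domain points at its own folder instead, mod_alias offers a one-line equivalent (a sketch, assuming mod_alias is enabled and newdomain.com stands in for your real domain; never use this form in a shared document root, where it would also redirect the new site onto itself):
# Maps every path to the same path on the new domain,
# e.g. oldwebsite.com/test lands on newwebsite.com/test.
Redirect 301 / http://newdomain.com/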
Related Questions
-
Spam Score & Redirecting Inbound Links
Hi, I recently downloaded a spreadsheet of inbound links to my client's sites and am trying to 301 redirect the ones that are formatted incorrectly or are just bad links in general (they all link to the site domain, but they used to have differently formatted URLs on their old site, or the link URL has strange stuff on it). My question is: should I even bother redirecting these links if their Spam Score is a little high (i.e. 20-40%)? Each already links to the existing domain, just with a differently formatted URL. I just want to make sure it goes to a valid URL on the site, but I don't want to redirect to a valid URL if it's going to harm the client's SEO. I'm also not sure what to do about the links with the --% Spam Score. I really appreciate any input, as I don't have a lot of experience with how to deal with spammy links.
White Hat / Black Hat SEO | AliMac26
-
Robots.txt file in Shopify - Collection and Product Page Crawling Issue
Hi, I am working on one big eCommerce store which has more than 1,000 products. We just moved platforms from WP to Shopify and are getting a noindex issue. When I checked robots.txt I found the below code, which is very confusing for me. I am not getting the meaning of these tags:
Disallow: /collections/+
Disallow: /collections/%2B
Disallow: /collections/%2b
Disallow: /blogs/+
Disallow: /blogs/%2B
Disallow: /blogs/%2b
I can understand that my robots.txt disallows SEs from crawling and indexing all my product pages (collection/*+*). Is this the query which is affecting the indexing of product pages? Please explain to me how this robots.txt works in Shopify, and once my page is crawled and indexed by google.com, then what is the use of Disallow:? Thanks.
White Hat / Black Hat SEO | HuptechWebseo
-
I show different versions of the same page to crawlers and users, but do not want to do it anymore
Hello, back when Google could not read JavaScript, I created two versions of the same page: one for humans and another for Google. Now I do not want to serve different content to the search engine, but I am worried that I will lose my traffic value. What is the best way to make the switch without loss? Can you help me?
White Hat / Black Hat SEO | kipra
-
Should I submit a sitemap for a site with dynamic pages?
I have a coupon website (http://couponeasy.com). Being a coupon website, my content is always changing (as new coupons are added and expired deals are removed) automatically. I wish to create a sitemap, but I realised that there is not much point in creating a sitemap for all pages, as they will be removed sooner or later and/or are canonical. I have about 8-9 pages which are static and hence can be included in a sitemap. Now the question is: if I create the sitemap for these 9 pages and submit it to Google Webmaster Tools, will the Google crawlers stop indexing the other pages? NOTE: I need to create the sitemap to get expanded sitelinks. http://couponeasy.com/
White Hat / Black Hat SEO | shopperlocal_DM
-
Remedies, Cures, and Precautions for 302 Redirect Hijacking
Hi Moz guys, I hope all of you are doing well out there. I am here to discuss remedies, cures, and precautions for 302 redirect hijacking. Although it is quite old, and whenever I searched in Google it looked like a long-gone glitch in Google's SERPs, it just happened to one of my customers' sites. The site in question is www(dot)solidswiss(dot)cd. If you check the cache (cache:site) then you can see a hijacked site in the URLs of the cached page. As a result, all my customer's listings in the SERPs are replaced with this site. This hacked site then redirects to a competitor's site. I did many things to cope with the problem; the site came back in the SERPs, but the hackers are doing this on lots of domains, so when it recovers from one site, another site catches it. I am doing lots of reporting via "submit spam site" and lots of feedback on the SERPs page. I have switched to HTTPS, but it seems like nothing is working. This community is full of experts and technical people. I am wondering what your views and suggestions are to handle the problem permanently?
White Hat / Black Hat SEO | adqas
-
One page with multiple sections - unique URL for each section
Hi All, This is my first time posting to the Moz community, so forgive me if I make any silly mistakes. A little background: I run a website for a company that makes custom parts out of specialty materials. One of my strategies is to make high-quality content about all areas of these specialty materials to attract potential customers - pretty straightforward stuff. I have always struggled with how to structure my content; from a usability point of view, I like just having one page for each material, with different subsections covering different topical areas. Example: for a special metal material I would have one page with subsections about the mechanical properties, thermal properties, available types, common applications, etc. Basically how Wikipedia organizes its content. I do not have a large amount of content for each section, but as a whole it makes one nice cohesive page for each material. I do use H tags to mark the specific sections on the page, but I am wondering if it may be better to have one page dedicated to specific material properties, one page dedicated to specific applications, and one page dedicated to available types. What are the community's thoughts on this? As a user of the website, I would rather have all of the information on a single, well-organized page for each material. But what do SEO best practices have to say about this? My last thought would be to create a hybrid website (I don't know the proper term). Have a look at these examples from Time and Quartz. When you are viewing an article, the URL is unique to that page. However, when you scroll to the bottom of the article, you can keep on scrolling into the next article, with a new unique URL - all without clicking through to another page. I could see this technique being ideal for a good web experience while still allowing me to optimize my content for more specific topics/keywords. If I used this technique with the canonical tag, would I then get the best of both worlds? Let me know your thoughts! Thank you for the help!
White Hat / Black Hat SEO | jaspercurry
-
Best Location to Find High Page Authority / Domain Authority Expired Domains?
Hi, I've been looking online for the best locations to purchase expired domains with existing Page Authority / Domain Authority attached to them. So far I've found:
http://www.expireddomains.net
http://www.domainauthoritylinks.com
http://moonsy.com/expired_domains/
These sites are great, but I'm wondering if I'm potentially missing other locations. Any other recommendations? Thanks.
White Hat / Black Hat SEO | VelasquezEF
-
Why do expired domains still work for SEO?
Hi everyone, I've been running an experiment for more than a year to see whether buying expired domains still works for SEO. I know it's considered black hat, but like I said, I wanted to experiment; that is what SEO is about. What I did was buy domains that had just expired, immediately add content on a WP setup, fill it with content relevant to the expired domain, and then start building links to other relevant sites from these domains. (Here is a pretty good post on how to do it, and I did it in a similar way: http://searchenginewatch.com/article/2297718/How-to-Build-Links-Using-Expired-Domains) This is nothing new, and SEOs have been doing it for a long time. There are a lot of rumors around the SEO world that domains become worthless after they expire. But after trying it out for more than a year, and with about 50 different expired domains, I can conclude that it DOES work, 100% of the time. Some of the domains are of course better than others, but I cannot see any signs that the expired domains, or the sites I link to, have been punished by Google. The sites I'm linking to rank great ONLY with those links 🙂 So to the question: WHY does Google allow this? They should be able to see that a domain has expired, right? And if it's expired, why don't they just "delete" all the links to that domain after the expiry date? Google is well aware of this problem, so what is stopping them? Is there anyone here who knows how this works technically?
White Hat / Black Hat SEO | Sir