Should I use change of address when moving to a subdomain?
-
Hi guys
So we had a domain that was only for one country: www.example.com.
A year later we decided to expand to another country, so the whole current website will sit under a country subdomain such as ae.example.com.
We did a 301 redirect.
Should I perform a change of address from www.example.com to ae.example.com? Please help.
Thanks -
Thank you for your answer. Yes, we moved from www.example.com to uk.example.com, and the new countries will have subdomains such as us.example.com. Since all the old links pointed to www.example.com (the UK site), we are thinking of using the change of address tool in Search Console.
-
Thank you for your answer. We did a 301, but I was also considering the "change of address" tool to make sure we don't lose any backlinks.
-
I assume you mean an "official" change of address in Search Console? Yes, if this is an official, permanent move then you should do the official change of address in Search Console (formerly Webmaster Tools).
Use the change of address tool
If you've moved your site to a new domain or subdomain, use the Change of address tool in Search Console. A change of address notification helps you manage the transition needed by Google to index your new URLs at the new address, while minimizing impact to your current ranking in Google Search results.
-
I am a little unclear on the question; however, a 301 redirect is the best way to tell Google to forget about the old URL and focus on the destination URL of the redirect. It is as clean and efficient as you can get.
But to be clear, direct from Google's webmaster documentation:
"301 redirects are particularly useful in the following circumstances:
- You've moved your site to a new domain, and you want to make the transition as seamless as possible."
https://support.google.com/webmasters/answer/93633?hl=en
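In practice, a subdomain move like this is usually implemented as a blanket server-side rule. Below is a minimal sketch of what that rule can look like in an Apache .htaccess file - this assumes the site runs on Apache with mod_rewrite enabled and that every old path maps one-to-one onto the new subdomain; the equivalent rule in nginx or in your CMS's redirect settings would look different:

```
# Hypothetical .htaccess sketch (assumes Apache with mod_rewrite enabled):
# permanently redirect every request on the old www host to the same path
# on the new country subdomain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://ae.example.com/$1 [R=301,L]
```

With a rule like this in place, a request for www.example.com/any-page returns a 301 status with a Location header pointing at ae.example.com/any-page, which is exactly the "moved permanently" signal Google's documentation above refers to.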
Hope that answers your query.
Related Questions
-
Changing URL Removes Backlink
Hello Moz Community, I have a question regarding bad backlink removal. One of my site's post images got 4 to 5k backlinks from unknown sites, and there are no contact details on those sites so I can't contact them to remove the links. So I have an idea I'd like a suggestion on: if I change the URL that receives the backlinks, will that remove them? For example: https://example.com/test/ got 5k backlinks; if I change this URL to https://examplee.com/test-failed/ will that remove those 5k backlinks? If not, how can I remove those backlinks? I know about disavow, but that takes time.
Intermediate & Advanced SEO | | Jackson210 -
What IP Address does Googlebot use to read your site when coming from an external backlink?
Hi All, I'm trying to find more information on what IP address Googlebot would use when arriving to crawl your site from an external backlink. I'm under the impression Googlebot uses international signals to determine the best IP address to use when crawling (US / non-US) and then carries on with that IP when it arrives at your website. E.g. Googlebot finds www.example.co.uk. Due to the ccTLD, it decides to crawl the site with a UK IP address rather than a US one. As it crawls this UK site, it finds a subdirectory backlink to your website and continues to crawl your website with the aforementioned UK IP address. Is this a correct assumption, or does Googlebot look at altering the IP address as it enters a backlink / new domain? Also, are ccTLDs the main signal that determines whether Google switches to an international IP address to crawl, rather than the standard US one? Am I right in saying that hreflang tags don't apply here at all, as their purpose is to help Google determine which page to serve to users in the SERPs based on their location? If anyone has any insight, that would be great.
Intermediate & Advanced SEO | | MattBassos0 -
Changing title tags - any potential issues?
Hello all, I am planning to change the title tags throughout a site and am vaguely aware (perhaps wrongly!) that changing title tags across a site is a risk factor - it can be a spam flag if changes to a specific title tag are implemented too regularly, for example. Would you change title tags across a site in one go, or implement changes gradually, to avoid any risk of upsetting Google? Do you have any insights/tips on the implementation of title tag changes?
Intermediate & Advanced SEO | | McTaggart1 -
Using disavow tool for 404s
Hey Community, I've got a question about the disavow tool for you. My site is getting thousands of 404 errors from old blog/coupon/you-name-it sites linking to our old URL structure (which used underscores and ended in .jsp). It seems like the webmasters of these sites aren't answering back or haven't updated their sites in ages, so the links return 404 errors. If I disavow these domains and/or links, will it clear out these 404 errors in Google? I read the GWT help page on it, but it didn't seem to answer this question. Feel free to ask any questions that may help you understand the issue more. Thanks for your help, -Reed
Intermediate & Advanced SEO | | IceIcebaby0 -
Recent Algo Change
I was wondering if anybody can shed some light on any recent changes to the Google algorithm in Australia. A competitor, www.manwithavan.com.au, has always been number 1 for the most competitive search term in our industry, "removalists melbourne". However, in the last week they have fallen out of the SERPs and are now (according to Moz) ranking outside the top 50. As far as I can tell, they have a really well optimized site with good structure, great text and updated content. They are very active within social media circles and have some really good external links. Can anybody tell me why they would have been hit so badly? The reason I ask is that I want to make sure we don't make the same mistake. Any feedback would be greatly appreciated.
Intermediate & Advanced SEO | | RobSchofield1 -
How long is it safe to use a 302 redirect?
Hi All, Let's assume there is site A and site B; both sites are live on the internet today as standalone businesses, but they sell very similar products. Site B has built up some link equity and will eventually become the domain for site A due to an organisational re-brand. For the time being, however, site A will remain, but site B needs to disappear temporarily without losing the link equity which has been built up against it. My current thinking is to 302 redirect site B to site A, such that users and search bots accessing site B will be redirected to site A whilst leaving the link equity that exists against site B fully intact and allowing us to continue to grow it should we wish to. The question is, does anybody have a view on how long it is safe to use a 302 temporary redirect for? I.e., is 8-10 months too long? Thanks, Ben
Intermediate & Advanced SEO | | BenRush0 -
Paging: is it better to use noindex, follow?
Is it better to use the robots meta noindex, follow tag for paging (page 2, page 3) of category pages which list items within each category, or just let Google index these pages? Before Panda I was not using noindex because I figured if page 2 is in Google's index then the items on page 2 are more likely to be in Google's index, and then each item has an internal link. So after I got hit by Panda, I'm thinking: well, page 2 has no unique content, only a list of links with a short excerpt from each item which can be found on each item's page, so it's not unique content; maybe that contributed to the Panda penalty. So I placed the meta tag noindex, follow on every page 2, 3 for each category page. Page 1 of each category page has a short introduction, so I hope that it is enough to make it "thick" content (is that a word :-)). My visitors don't want long introductions; it hurts bounce rate and time on site. Now I'm wondering if that is common practice and if items on page 2 are less likely to be indexed since they have no internal links from an indexed page. Thanks!
Intermediate & Advanced SEO | | donthe0 -
All page files in root? Or use directories?
We have thousands of pages on our website - news articles, forum topics, download pages, etc. - and at present they all reside in the root of the domain /. For example:
/aosta-valley-i6816.html
/flight-sim-concorde-d1101.html
/what-is-best-addon-t3360.html
We are considering moving over to a new URL system where we use directories. For example, the above URLs would become:
/images/aosta-valley-i6816.html
/downloads/flight-sim-concorde-d1101.html
/forums/what-is-best-addon-t3360.html
Would we have any benefit in using directories for SEO purposes? Would our current system perhaps mean too many files in the root, flagging as spammy? Would it be even better to use the following system, which removes file endings completely and suggests each page is a directory:
/images/aosta-valley/6816/
/downloads/flight-sim-concorde/1101/
/forums/what-is-best-addon/3360/
If so, what would be better: /images/aosta-valley/6816/ or /images/6816/aosta-valley/? Just looking for some clarity on our problem! Thank you for your help guys!
Intermediate & Advanced SEO | | Peter2640