Domain Name Change - Best Practices?
-
Good day guys,
We have a restaurant that is changing its name and domain.
However, they are keeping the same server location, the same content and the same pages (we are just changing the logo on the website).
The site just has to move to a new domain. We don't want to lose the value of the current site, and we want to avoid any duplicate content penalties.
Could you please advise on the best practices for a domain name change?
Thank you.
-
This YouMoz post may also help: http://www.seomoz.org/blog/web-site-migration-guide-tips-for-seos
-
I would recommend following the step-by-step guidance from Google in Webmaster Tools -> Configuration -> Change of Address:

- Set up your content on your new domain.
- Redirect content from your old site using 301 redirects.

They offer further guidelines here: https://support.google.com/webmasters/bin/answer.py?hl=en&answer=83105
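For the 301 step, a minimal .htaccess sketch might look like the following, assuming the site runs on Apache with mod_rewrite enabled (the hostnames are placeholders for the old and new restaurant domains):

    # .htaccess on the old domain (sketch; hostnames are placeholders)
    RewriteEngine On

    # Match requests arriving on the old hostname, with or without www
    RewriteCond %{HTTP_HOST} ^(www\.)?old-restaurant\.com$ [NC]

    # Send a permanent (301) redirect to the same path on the new domain
    RewriteRule ^(.*)$ http://www.new-restaurant.com/$1 [R=301,L]

Each old URL then 301s to the matching path on the new domain, which is what the Change of Address tool expects to find and what preserves the existing value as far as possible.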
-
Related Questions
-
URL Structure & Best Practice when Facing 4+ Sub-levels
Hi. I've spent the last day fiddling with the setup of a new URL structure for a site, and I can't "pull the trigger" on it.

Example: domain.com/games/type-of-game/provider-name/name-of-game/
Specific example: arcade.com/games/pinball/deckerballs/starshooter2k/

The example is a good description of the content I have to organize. The aim is to a) define the URL structure, b) facilitate good UX, and c) create a good starting point for content marketing and SEO, avoiding multiple/stuffed keywords in URLs.

The problem? Not all providers have the same type of game. Meaning that once I get past /type-of-game/, I must write a new category/page/content for /provider-name/. No matter how I switch the different "sub-levels" around in the URL, at one point the provider-name doesn't fit, as it's in need of new content, multiple times.

The solution? I can skip "provider-name". The caveat, though, is that I lose out on ranking for provider keywords, as I don't have a cornerstone content page for them.

Question: Using the URL structure outlined above in WordPress, would you A) go with "Pages", or B) use "Posts"?
Intermediate & Advanced SEO | Dan-Louis0
-
Best practice for deindexing large quantities of pages
We are trying to deindex a large quantity of pages on our site and want to know what the best practice for doing that is. For reference, the reason we are looking for methods to speed this up is that we have about 500,000 URLs we want deindexed because of mis-formatted HTML code, and Google indexed them much faster than it is taking to deindex them, unfortunately.

We don't want to risk clogging up our limited crawl budget by submitting a sitemap of URLs that have "noindex" on them as a hack for deindexing. Although theoretically that should work, we are looking for white-hat methods that are faster than "being patient and waiting it out", since that would likely take months if not years at Google's current crawl rate of our site.
Intermediate & Advanced SEO | teddef0
-
Faceted Navigation URLs Best Practices
Hi,

We are developing new product pages with faceted filters. You can see them here: https://www.viatrading.com/wholesale-products/

We have a feature allowing users to Order By and Group By, which alters the order of all products. There will also be an option to view products as a table, which will contain the same products but with a different design and maybe slightly different content for each product. All of this will happen without changing the URL, https://www.viatrading.com/all/

Is this the best practice? Thanks.
Intermediate & Advanced SEO | viatrading10
-
Linking from & to domains and sub-domains
What's the best way to optimise linking between sub-domains and domains? We always give the website link at the top with the logo... do we also need to link to the sub-domain from all of its pages?

If example.com is the domain and example.com/blog is the sub-domain or sub-folder: Do we need to link to example.com from /blog? Do we need to include the /blog link on all pages of /blog? Is there any difference between connecting domains with sub-domains versus sub-folders?
Intermediate & Advanced SEO | vtmoz0
-
Consolidating Multiple Domains into A Single Domain
I have a client whose website is an amalgamation of multiple domains. jacksonhole.net is the main domain, but the site passes traffic back and forth between the following domains/sites. My question is, would it be better for SEO to consolidate all of these domains under the single high-authority domain and 301 redirect the rest, or is that a really bad idea? Thanks for your help.

- jacksonhole.net (Domain Authority 31)
- jackson-hole-rental-condos.com (Domain Authority 22)
- jackson-hole-rental-homes.com (Domain Authority 21)
- jacksonholehotelguide.com (Domain Authority 19)
Intermediate & Advanced SEO | dbaxa-2613381
-
URL Rewriting Best Practices
Hey Moz! I'm getting ready to implement URL rewrites on my website to improve site structure/URL readability. More specifically I want to:

- Improve our website structure by removing redundant directories.
- Replace underscores with dashes and remove file extensions from our URLs.

Please see my example below:

Old structure: http://www.widgets.com/widgets/commercial-widgets/small_blue_widget.htm
New structure: https://www.widgets.com/commercial-widgets/small-blue-widget

I've read several URL rewriting guides online, all of which seem to provide similar but overall different methods to do this. I'm looking for what's considered best practice for implementing these rewrites. From what I understand, the most common method is to implement rewrites in our .htaccess file using mod_rewrite (which will find the old URLs and rewrite them according to the rules I implement).

One question I can't seem to find a definitive answer to: when I implement the rewrite to remove file extensions and replace underscores with dashes in our URLs, do the web page file names need to be edited to the new format? From what I understand, the web page file names must remain the same for the rewrites in the .htaccess to work. However, our internal links (including canonical links) must be changed to the new URL format. Can anyone shed light on this?

Also, I'm aware that implementing URL rewriting improperly could negatively affect our SERP rankings. If I redirect our old website directory structure to our new structure using this rewrite, are my bases covered in regards to having the proper 301 redirects in place so as not to affect our rankings negatively? Please offer any advice/reliable guides to handle this properly. Thanks in advance!
Intermediate & Advanced SEO | TheDude0
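As a rough illustration of the scenario described above, one way a single old/new URL pair can be handled in .htaccess is sketched below (Apache with mod_rewrite assumed, using the example URLs from the question; on a real site these pairs would normally be generated, or replaced by pattern rules, rather than written by hand). The second rule shows why the files on disk would not necessarily have to be renamed, although internal and canonical links should still be updated to the new format:

    # Sketch for one URL pair from the example above (Apache, mod_rewrite assumed)
    RewriteEngine On

    # Externally 301 the old URL only when the client actually requested it
    # (the THE_REQUEST condition prevents a loop with the internal rule below)
    RewriteCond %{THE_REQUEST} \s/widgets/commercial-widgets/small_blue_widget\.htm[\s?] [NC]
    RewriteRule ^ /commercial-widgets/small-blue-widget [R=301,L]

    # Internally serve the new, clean URL from the file that already exists on disk
    RewriteRule ^commercial-widgets/small-blue-widget$ /widgets/commercial-widgets/small_blue_widget.htm [L]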
Redirect ruined domain to new domain without passing link juice
A new client has a domain which has been hammered by bad links, updates, etc., and it's basically on its arse because of previous SEO guys. They have various domains for their business (brand.com, brand.co.uk) and want to use a fresh domain and take it from there. Their current domain is brand.com (the ruined one).

They're not bothered about the rankings for brand.com, but they want to redirect brand.com to brand.co.uk so that previous clients can find them easily. Would a 302 redirect work for this? I don't want to set up a 301 redirect, as I don't want any of the crappy links pointing across. Thanks!
Intermediate & Advanced SEO | jasonwdexter0
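On the purely mechanical side of the question above, a temporary redirect from the old domain can be set up with a short .htaccess sketch like the one below (Apache with mod_rewrite assumed; brand.com and brand.co.uk are the placeholder names from the question). Whether a 302 actually keeps the old domain's signals from being associated with the new one is debatable, so treat this as illustration only:

    # Sketch: .htaccess on brand.com, temporarily sending visitors to brand.co.uk
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?brand\.com$ [NC]
    # Send everything to the new homepage (swap "/" for "/$1" and "^" for "^(.*)$"
    # if the new site keeps the same page paths)
    RewriteRule ^ http://www.brand.co.uk/ [R=302,L]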
Best practice for removing indexed internal search pages from Google?
Hi Mozzers,

I know that it's best practice to block Google from indexing internal search pages, but what's best practice when "the damage is done"? I have a project where a substantial part of our visitors and income lands on an internal search page, because Google has indexed them (about 3%).

I would like to block Google from indexing the search pages via the meta noindex,follow tag because of:

- Google Guidelines: "Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines." http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769
- Bad user experience
- The search pages are (probably) stealing rankings from our real landing pages
- The Webmaster Notification "Googlebot found an extremely high number of URLs on your site", with links to our internal search results

I want to use the meta tag to keep the link juice flowing. Do you recommend using robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how should we proceed with blocking them? I'm looking forward to your answer!

Edit: Google has currently indexed several million of our internal search pages.
Intermediate & Advanced SEO | HrThomsen0
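As a side note to the question above, the noindex,follow directive does not have to be added to every page template; an equivalent HTTP header can be applied to the internal search URL pattern at the server level. A minimal sketch, assuming Apache 2.4 with mod_headers and a placeholder /search/ path, is below; note the pages must remain crawlable (i.e. not blocked in robots.txt) or the directive will never be seen:

    # Sketch: send "noindex, follow" as an HTTP header for internal search URLs
    # (Apache 2.4 with mod_headers assumed; the /search/ path is a placeholder)
    <If "%{REQUEST_URI} =~ m#^/search/#">
        Header set X-Robots-Tag "noindex, follow"
    </If>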