How to use Google Search Console's 'Name change' tool?
-
Hi There,
I'm having trouble performing a 'Name change' for a new website (rebrand and domain change) in Google Search Console.
Because the 301 redirects are in place (a requirement of the name change tool), Google can no longer verify the old site, which means I can't complete the name change.
To me, step two (301 redirect) conflicts with step three (site verification). Is there a way to perform a 301 redirect and still have the tool verify the old site?
Any pointers in the right direction would be much appreciated.
Cheers
Ben
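For anyone unclear on what step two actually requires: the 301s have to be domain-wide and path-preserving, so every old URL forwards to the equivalent URL on the new domain. A minimal sketch of that mapping in Python (both domain names are placeholders, not from the thread):

```python
from urllib.parse import urlsplit, urlunsplit

OLD_HOST = "www.old-brand.example"  # hypothetical old domain
NEW_HOST = "www.new-brand.example"  # hypothetical new domain

def redirect_target(old_url: str) -> str:
    """Return the URL a path-preserving 301 should point to."""
    parts = urlsplit(old_url)
    # Keep the path, query string, and fragment intact;
    # only the host (and scheme) change.
    return urlunsplit(("https", NEW_HOST, parts.path, parts.query, parts.fragment))
```

For example, `redirect_target("http://www.old-brand.example/about?x=1")` should come back as `https://www.new-brand.example/about?x=1`.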
-
Sorry, but your answer is confusing.
If you're not supposed to add the old site's verification to your new site, how does anyone complete the Name Change Tool?
- Step 2 of the tool requires 301 redirects be in place before you can move on
- Step 3 of the tool requires you verify the old site
Obviously, if step 2 is working, then step 3 will always fail.
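The apparent deadlock can be sketched as a toy simulation (all hostnames and content here are made up for illustration): once a blanket 301 is in place, a fetch of the old site's homepage never returns content from the old host, so any verification check that only looks at what the old domain serves will fail.

```python
# Blanket 301: every request to the old host forwards to the new one.
REDIRECTS = {"old.example": "new.example"}
# Content only exists on the new host now.
PAGES = {("new.example", "/"): "new site homepage"}

def fetch(host: str, path: str) -> tuple[str, str]:
    """Follow the (single) redirect hop and return (final_host, body)."""
    host = REDIRECTS.get(host, host)
    return host, PAGES.get((host, path), "")

def old_site_verifies(token: str) -> bool:
    """Does fetching the old homepage end on a page that serves the token?"""
    final_host, body = fetch("old.example", "/")
    return token in body
```

In this model `old_site_verifies("any-token")` is always `False` unless the page the redirect lands on happens to serve that token, which hints at the way out.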
How do you complete the name change tool?
-
Don't try to edit the old property. Leave the old version as it is for Google to read. You just need to add the new property; you don't need to worry about the old one. Because of the 301s, Google will automatically ignore your old version.
-
Got this working.
What's not immediately obvious is that you're allowed to use one site's verification on another.
So you have to copy the old site's verification over to the new site, so that your new site is running both verifications.
Then when you run the tool, verification of the old site works.
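Concretely, that means the new site's `<head>` serves both google-site-verification meta tags at once: its own and the one copied from the old property. A small sketch that builds the pair (both token values are invented placeholders, not real tokens):

```python
OLD_SITE_TOKEN = "token-from-old-property"  # hypothetical: value from the old site's meta tag
NEW_SITE_TOKEN = "token-for-new-property"   # hypothetical: value issued for the new site

def verification_tags() -> str:
    """Build both google-site-verification meta tags for the new site's <head>."""
    return "\n".join(
        f'<meta name="google-site-verification" content="{token}" />'
        for token in (OLD_SITE_TOKEN, NEW_SITE_TOKEN)
    )
```

When Google follows the 301 from the old homepage, it lands on the new page, finds the old property's token there, and the old site stays verified, so step three can pass.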