Search console site verification
-
I've been going on the assumption that when verifying a website in search console, it's always good to register and verify all variants of the site URL:
- http
- https
- www
- non-www
However, if you create redirects to the preferred URL, is it really necessary to register/verify the other three? If so, why?
-
Your dev is right on the specific question of whether the additional Search Console profiles are necessary for Google to understand which is the primary version. This is not essential as long as the proper redirects are in place. It's a "belt and suspenders" approach. If there's no way for the search engine crawlers to ever reach anything but the primary version, then there's no way for them to get it wrong. You're using the Search Console info to reinforce what the redirects are already doing. (That said, it's trivial to reinforce the redirect process with the declaration in Search Console, so best practice is to do that as well. You need the Search Console profile to properly manage the other aspects of marketing the site anyway, so...)
Put another way, declaring the primary version of the site using the alternate Search Console profiles is Google's way of allowing those who might not have proper dev access to a site to at least partly accomplish the same thing from within Search Console, which they can manage.
The real value in verifying the other versions in Search Console is that you can monitor to make certain those non-canonical versions of the site are in fact not getting indexed or ranked. This is essential after an HTTPS migration, for example, as you should see the HTTP profile showing a steady drop in indexing while the HTTPS profile shows a steady increase.
A periodic check of the non-canonical Search Console profiles will alert you immediately to any newly discovered issues Google's crawlers may be encountering (for example, a sitewide redirect that was accidentally removed or changed).
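That periodic check can also be automated. Below is a minimal sketch (hypothetical domain, function names, and fetcher signature are all assumptions, not anything from the thread) of a monitor that confirms every non-canonical variant still issues a permanent redirect to the canonical URL. The fetcher is passed in as a callable so a real run could back it with `urllib.request` while a test uses a stub:

```python
# Sketch: verify each non-canonical variant still redirects to the canonical
# URL. The `fetch` callable is injected (hypothetical signature:
# url -> (status_code, location_header)) so a real run could use
# urllib.request while tests supply a stub.

CANONICAL = "https://www.example.com/"  # hypothetical preferred version

VARIANTS = [
    "http://www.example.com/",
    "http://example.com/",
    "https://example.com/",
]

def check_redirects(fetch, variants=VARIANTS, canonical=CANONICAL):
    """Return a list of (url, problem) pairs for variants that misbehave."""
    problems = []
    for url in variants:
        status, location = fetch(url)
        if status not in (301, 308):
            problems.append((url, f"expected permanent redirect, got {status}"))
        elif location != canonical:
            problems.append((url, f"redirects to {location}, not {canonical}"))
    return problems
```

With a fetcher that returns `(301, canonical)` for every variant, the function returns an empty list; any non-empty result is exactly the kind of issue (a dropped sitewide redirect) that Paul suggests watching for.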
Make sense?
Paul
-
If you don't have access to the old Google account, verify the property from the current new account. Verify all versions and then select the preferred one. This is what Google asks people to do. It is very fast and easy. This is always the preferred method.
Best Regards
-
The preferred version has been verified by adding an HTML tag in the Shopify theme (this is how I usually verify too). But I don't have access to the same search console account...
- can I generate a new HTML tag to verify the other three variants (i.e. is it OK to use two different HTML tags)?
- or should I create a new HTML tag to verify ALL 4 variants (i.e. is there any negative side to replacing the original HTML tag)?
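For what it's worth, nothing about the tag mechanism prevents two verification tags from coexisting in the same `<head>`. A small sketch (token values are hypothetical placeholders) using Python's stdlib `html.parser` to show that both tags remain independently discoverable in the markup:

```python
from html.parser import HTMLParser

# Sketch: a page <head> can carry multiple google-site-verification meta
# tags side by side. The token values below are hypothetical placeholders,
# not real verification codes.

PAGE_HEAD = """
<head>
  <meta name="google-site-verification" content="token-from-old-account" />
  <meta name="google-site-verification" content="token-from-new-account" />
</head>
"""

class VerificationTagCollector(HTMLParser):
    """Collect the content of every google-site-verification meta tag."""

    def __init__(self):
        super().__init__()
        self.tokens = []

    def handle_starttag(self, tag, attrs):
        # html.parser also routes self-closing <meta ... /> tags here.
        if tag == "meta":
            attr_dict = dict(attrs)
            if attr_dict.get("name") == "google-site-verification":
                self.tokens.append(attr_dict.get("content"))

collector = VerificationTagCollector()
collector.feed(PAGE_HEAD)
```

Running this collects both tokens in document order, which illustrates why adding a second tag for a new account does not require removing the first.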
-
Nope, you should not believe what your dev is saying (in this particular case).
As William said, Google suggests verifying all versions of your site and setting the preferred one to be considered.
-
Thanks William,
That's always been my approach too, but the dev is adamant that if they redirect all variants to the preferred version there is no need to verify them all.
My question is: should I believe what this dev is saying?
-
Hello,
Yes, Google suggests that you register all variants of your site. Then make sure to select the preferred one in the search console. That way Google will understand your intent and desire.
http://www.site.com
http://site.com
https://www.site.com
https://site.com
Make sure you select the preferred one to show in the search results.
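Those four properties are mechanical to derive from a bare domain. A tiny sketch (the function name is a hypothetical illustration) that builds the list to register, in the same order as above:

```python
def property_variants(domain):
    """Build the four URL-prefix properties to register in Search Console
    for a bare domain such as 'site.com': http/https crossed with
    www/non-www, in the order http-www, http, https-www, https."""
    hosts = (f"www.{domain}", domain)
    return [
        f"{scheme}://{host}"
        for scheme in ("http", "https")
        for host in hosts
    ]
```

For `"site.com"` this yields the same four URLs listed above, which can then be verified one by one in Search Console.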
Best Regards