Search Console site verification
-
I've been going on the assumption that when verifying a website in search console, it's always good to register and verify all variants of the site URL:
- http
- https
- www
- non-www
However, if you create redirects to the preferred URL, is it really necessary to register/verify the other three? If so, why?
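For context, "redirects to the preferred URL" typically means server-level 301s covering all four variants. A minimal sketch in nginx (hostnames are placeholders and certificate directives are omitted), assuming `https://www.example.com` is the preferred version:

```nginx
# All HTTP traffic (www and bare domain) -> canonical HTTPS www
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://www.example.com$request_uri;
}

# HTTPS bare domain -> canonical HTTPS www
server {
    listen 443 ssl;
    server_name example.com;
    # ssl_certificate / ssl_certificate_key directives omitted for brevity
    return 301 https://www.example.com$request_uri;
}
```

With this in place, crawlers can only ever land on the canonical `https://www.` version.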
-
Your dev is right on the specific question of whether the additional Search Console properties are necessary for Google to understand which is the primary version. They are not essential as long as the proper redirects are in place; it's a "belt and suspenders" approach. If there's no way for search engine crawlers to ever reach anything but the primary version, then there's no way for them to get it wrong. You're using the Search Console settings to reinforce what the redirects are already doing. (That said, it's trivial to reinforce the redirects with the declaration in Search Console, so best practice is to do that as well. You need the Search Console property to properly manage the other aspects of marketing the site anyway, so...)
Put another way, declaring the primary version of the site via the alternate Search Console properties is Google's way of letting those who might not have proper dev access to a site at least partly accomplish the same thing from within Search Console, which they can manage.
The real value of verifying the other versions in Search Console is that you can monitor them to make certain those non-canonical versions of the site are definitely not getting indexed or ranked. This is essential after an HTTPS migration, for example: you should see the HTTP property showing a steady drop in indexed pages while the HTTPS property shows a steady increase.
A periodic check of the non-canonical Search Console properties will alert you immediately to any newly discovered issues Google's crawlers may be encountering (for example, a sitewide redirect that was accidentally removed or changed).
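That periodic check can also be automated outside Search Console by confirming each non-canonical variant still answers with a 301 pointing at the canonical URL. A rough sketch (the domain names are placeholders; the fetch function is injectable so the check can be exercised without a network connection):

```python
import urllib.request
import urllib.error

class _NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so we can inspect them directly."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # causes urllib to raise HTTPError with the redirect status

def http_fetch(url):
    """Fetch a URL without following redirects; return (status, Location header)."""
    opener = urllib.request.build_opener(_NoRedirect)
    try:
        resp = opener.open(url, timeout=10)
        return resp.status, resp.headers.get("Location")
    except urllib.error.HTTPError as err:
        return err.code, err.headers.get("Location")

def check_redirects(canonical, variants, fetch=http_fetch):
    """Return (variant, problem) pairs for variants that no longer 301 to canonical."""
    problems = []
    for url in variants:
        status, location = fetch(url)
        if status != 301:
            problems.append((url, f"expected 301, got {status}"))
        elif (location or "").rstrip("/") != canonical.rstrip("/"):
            problems.append((url, f"redirects to {location!r}, not {canonical!r}"))
    return problems
```

Run on a schedule, an empty result means the sitewide redirects are still intact; anything else is exactly the kind of accidental change described above.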
Make sense?
Paul
-
If you don't have access to the old Google account, verify the property from your current account. Verify all versions and then select the preferred one. This is what Google asks people to do, and it is fast and easy. This is always the preferred method.
Best Regards
-
The preferred version has been verified by adding an HTML tag in the Shopify theme (this is how I usually verify too). But I don't have access to the same Search Console account...
- can I generate a new HTML tag to verify the other three variants (i.e. is it OK to use two different HTML tags)?
- or should I create a new HTML tag to verify ALL 4 variants (i.e. is there any negative side to replacing the original HTML tag)?
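For reference, the HTML-tag method places a meta tag like the one below in the theme's `<head>` (the `content` value shown is a placeholder; the real token is unique per Search Console account). Google allows multiple verification tags on the same page, so a tag from a new account can sit alongside the original one rather than replacing it:

```html
<!-- Placed inside <head>; the content token below is a placeholder -->
<meta name="google-site-verification" content="your-unique-token-from-search-console" />
```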
-
Nope, you should not believe what your dev is saying (in this particular case).
As William said, Google suggests that you verify all your versions and set the preferred one to be considered.
-
Thanks William,
That's always been my approach too, but the dev is adamant that if they redirect all variants to the preferred version there is no need to verify them all.
My question is: should I believe what this dev is saying?
-
Hello,
Yes, Google suggests that you register all variants of your site, then select the preferred one in Search Console. That way Google will understand your intent.
http://www.site.com
http://site.com
https://www.site.com
https://site.com
Make sure you select the preferred one to show in the search results.
Best Regards