Site property is verified for new version of search console, but same property is unverified in the old version
-
Hi all!
This is a weird one that I've never encountered before.
So basically, an admin granted me "owner" access to the site in Search Console, and everything worked fine. I inspected some URLs with the new tool and had access to everything. Then, when I realized I had to remove certain pages from the index, it directed me to the old Search Console, as there's no tool for that yet in the new version. However, the old version doesn't even list the site under the "property" dropdown as either verified or unverified, and if I try to add it, it makes me undergo the verification process, which fails (I also have Analytics and GTM access, so verification shouldn't fail).
Has anyone experienced something similar or have any ideas for a fix? Thanks so much for any help!
-
That certainly used to be a problem, and these days I've found it hit and miss. Sometimes Google is able to reach the verification file directly without being redirected, but sometimes it still can't. In that case, you modify your .htaccess file to allow that one file (or URL) to be accessed via either protocol. I don't remember the exact rule off-hand, but from memory, doing this isn't that hard.
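For reference, a minimal sketch of that kind of rule, assuming a standard Apache mod_rewrite HTTP-to-HTTPS redirect; the verification filename below is a placeholder (Google generates the real one for you):

```apache
# Force HTTPS, but exempt the Google verification file so it
# stays reachable over plain HTTP (placeholder filename)
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteCond %{REQUEST_URI} !^/googleXXXXXXXXXXXXXXXX\.html$
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

The negated second RewriteCond skips the redirect only for that one file; everything else still gets sent to HTTPS.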
Failing that you should have access to this method:
https://support.google.com/webmasters/answer/9008080?hl=en
Ctrl+F (find) for "DNS record" and expand that bit of info from Google. That method works really well and, I think, it also gives you access to the new domain-level property
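For illustration, the DNS method boils down to adding a TXT record like this at your DNS host (the token below is a placeholder; Google gives you the real one in Search Console):

```
example.com.   3600   IN   TXT   "google-site-verification=abc123placeholdertoken"
```

Once the record propagates, Google can confirm ownership of the whole domain, which covers every protocol and sub-domain variant at once.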
The .htaccess mod method may be more applicable for you. Definitely make the change via FTP and not via a CMS back-end. If you break the .htaccess and kill the site, and you only have the CMS back-end to fix it - which also becomes broken - you're stuck. Modding your .htaccess file should not break FTP unless you do something out-of-this-world, crazy-insanely wrong (in fact, I'm not sure you can break FTP with your .htaccess file)
Another option: temporarily disable the HTTP-to-HTTPS redirects in the .htaccess, verify, make your changes, then put the rule back. This is a bad method because in a few weeks Google will fail to reach the file and you will be unverified again. Also, your site may have legal reasons it absolutely must be on HTTPS, and allowing HTTP again may shake up and mess up your SERPs unless you act lightning fast (before Google's next crawl of your site)
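If you do go that route, a hedged sketch, assuming the redirect is a standard mod_rewrite rule in your .htaccess (your exact lines may differ):

```apache
# Temporarily commented out while verifying over HTTP -
# restore these lines immediately after verification succeeds!
# RewriteCond %{HTTPS} off
# RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```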
Something like this might help: https://serverfault.com/questions/740640/disable-https-for-a-single-file-in-apache or these search results: https://www.google.co.uk/search?q=disable+https+redirect+for+certain+file
Hope that helps
-
Thanks very much for your response. You are exactly right about the travails of multiple properties, and I hadn't even thought about how the new domain-level access should handle the multiple versions of each site (I'm still used to having to verify four separate properties).
In the end, you were spot on; I just had to FTP the verification file up once more and it worked immediately.
A question, though: if you were trying to verify the non-secure (http://) version of a site that redirects to https://, and you were for some reason unable to verify through GA or GTM, wouldn't the uploaded verification file get redirected to the secure protocol and therefore fail verification? This is (thank goodness) purely theoretical, but it seems like it would be a rough task, and I'm sure it happens periodically.
Thanks again for the insight. You were a great help!
-
I have no experience with this particular error, but from the sounds of it, you will just have to re-verify, and that's all you can do. One thing to keep in mind is that different versions of the same site (HTTPS/WWW, HTTPS, HTTP/WWW, HTTP, and any sub-domains) all count as separate websites in Search Console
The days of that being a problem are numbered, as Google have come out with new domain-level properties for Search Console, but to verify those you need hosting-level access, so most people still aren't using them until Google can make the older verification methods applicable
What this means is that, if the URLs you want to remove belong to a different version of the site (which still counts as a separate property), then you still have to verify that other version of the site (maybe the pre-HTTPS version, or a version without WWW). If the version of the property (site) registered in your GSC doesn't contain the URLs you want to remove, then you still need to register and verify the old version
A common issue arises when people move from HTTP to HTTPS and want to 'clean up' some of the old HTTP URLs and stop them from ranking (or at least redirect Google from the old property to the new one properly). They delete the HTTP version of the site from their GSC, but then they can't get back in to do a proper clean-up. In most instances Google still considers different site versions to be different sites in GSC. As mentioned, this isn't a problem for some people now, and soon it won't be a problem for anyone. But if you're looking at any kind of legacy account for sites that were built and verified up to a few months ago, the likelihood is you still have to re-verify the other site versions
The new domain-level properties may also have bugs where they defer back to the non-domain-level properties for some things. You may have just found an example of that, to be honest (but I can't confirm this)
I'd advise just doing what the UI tells you; it's really all you can feasibly do at this juncture