Google cache shows a third party's site for the HTTP version but is correct for HTTPS
-
If I search Google for my cache I get the following:
- cache:http://www.saucydates.com -> Returns the cache of netball.org (HTTPS page with Plesk default page)
- cache:https://www.saucydates.com -> Displays the correct page
Prior to this, my HTTP cache was the Central Bank of Afghanistan's site. At present my index page is not returned for most searches, and when it is, it's the netball site's Plesk page. This is, of course, hurting my search traffic considerably.
**I have tried many things; here is the current list:**
- If I Fetch as Google in Webmaster Tools, the HTTPS fetch and render is correct.
- If I fetch the HTTP version I get a redirect (which is correct, as I have a 301 HTTP-to-HTTPS redirect).
- If I turn off HTTPS on my server and remove the redirect, the fetch and render for the HTTP version is correct.
- The 301 redirect is controlled with the "301 Safe redirect" option in Plesk 12.x.
- The SSL certificate is valid and issued by COMODO.
- I have ensured the IP address (which is shared with a few other domains that form my site's network/functions) has a default site.
- I have placed a site on my PTR record and ensured its HTTPS version redirects back to HTTP, as it doesn't need SSL.
- I have checked my site in the Wayback Machine for the past year and there are no hacked redirects.
- I have checked the netball site in the Wayback Machine for the past year; in the middle of last year there is an odd firewall alert page.
- If you check the cache for the HTTPS version of the netball site, you get another site's default Plesk page.
- This all happened at the same time I implemented SSL.
Points 6 and 7 have been done to stop the server showing a Plesk default page, as I think this could be the issue (duplicate content). The redirect itself behaves as expected when I test it by hand, as in the sketch below.
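To be clear about what I'm testing, here is a minimal sketch of that redirect check (assuming Python 3 with the `requests` library installed; the URL is my domain from above):

```python
import requests

# Fetch the HTTP URL and let requests follow any redirects.
resp = requests.get("http://www.saucydates.com/", allow_redirects=True, timeout=10)

# Each hop of the redirect chain is kept in resp.history.
for hop in resp.history:
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))

# The chain should be a single 301 ending on the HTTPS URL.
print(resp.status_code, resp.url)
```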
**Ideas:**
- Is this a 302 redirect hijack?
- Is this a Google bug?
- Is this a duplicate-content issue, since both servers can show a default Plesk page (like millions of others!)?
- Could it be a clue that the three mixed-up sites all run Plesk?
Over to the experts at Moz: can you help?
Thanks,
David
-
After one month of mixed caches, it fixed itself.
-
PS: who was your hosting company? If they're a managed host, I would suggest you have them look into the issue and request that they force HTTPS.
-
Please verify everything is set up in Google Webmaster Tools for HTTPS.
There are four possible versions of your site in Webmaster Tools. Add all four, pick the version you wish to index (in your case it will be the www or non-www HTTPS version), then Fetch as Googlebot and look for errors (see the sketch after this list):
- http://
- http://www
- https://
- https://www
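If you'd rather script that check than do it by hand, here is a rough sketch (assuming Python 3 with the `requests` library; example.com is just a placeholder for your own domain) that prints the redirect chain and HSTS status for all four versions:

```python
import requests

# The four possible versions of the site.
VARIANTS = [
    "http://example.com/",
    "http://www.example.com/",
    "https://example.com/",
    "https://www.example.com/",
]

for url in VARIANTS:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    # resp.history holds every redirect hop; resp is the final response.
    chain = " -> ".join(f"{r.status_code} {r.url}" for r in resp.history + [resp])
    hsts = "Strict-Transport-Security" in resp.headers
    print(f"{url}\n  chain: {chain}\n  HSTS header: {hsts}\n")
```

Three of the four should 301 exactly once to the canonical version you picked.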
You can check your redirects with https://varvy.com/tools/redirects/
Screaming Frog SEO Spider (free for the first 500 URLs) will show if there is a problem and will most likely be very helpful:
https://www.screamingfrog.co.uk/seo-spider/
Or use https://deepcrawl.com to figure out whether or not it's duplicate content and to find broken redirects or missing HSTS.
Either crawl will confirm whether you have any bad redirects.
If you do find bad redirects after that, you will have to speak to your hosting company, or at least share information with us about your server configuration, to make sure it is set to force HTTPS.
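On the duplicate-content theory: since the IP is shared, you can also ask the server what it serves for an unknown hostname. A rough sketch (the IP and hostnames are placeholders, not your real values) that compares the response bodies over plain HTTP:

```python
import hashlib
import requests

SERVER_IP = "203.0.113.10"  # placeholder: your server's shared IP

def body_hash(host: str) -> str:
    # Request the bare IP but send a specific Host header, so we see
    # exactly which virtual host (or default site) answers for it.
    resp = requests.get(f"http://{SERVER_IP}/", headers={"Host": host}, timeout=10)
    return hashlib.sha256(resp.content).hexdigest()

print(body_hash("www.example.com"))       # your real site
print(body_hash("unknown-host.invalid"))  # should hit the default site

# If the two hashes match, the Plesk default page is being served for
# your hostname too, which is exactly the duplicate-content scenario.
```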
See the guides below to make sure you don't have an error somewhere.
- https://www.keycdn.com/blog/http-to-https/
- https://moz.com/blog/seo-tips-https-ssl
- http://searchengineland.com/http-https-seos-guide-securing-website-246940
- https://support.google.com/webmasters/answer/6073543?hl=en
If you would like to force the redirect to HTTPS, there are third-party tools that will allow you to do this for any site.
You will still want to use DeepCrawl or Screaming Frog to check your setup.
I hope this is of help; the last URL is a free tool.
Tom