Google dropping pages from SERPs even though indexed and cached. (Shift over to https suspected.)
-
Anybody know why pages that have previously been indexed - and that are still present in Google's cache - are now not appearing in Google SERPs?
All the usual suspects - noindex, robots, duplication filter, 301s - have been ruled out. We shifted our site over from http to https last week and it appears to have started then, although we have also been playing around with our navigation structure a bit too.
Here are a few examples...
Example 1:
- Live URL: https://www.normanrecords.com/records/149002-memory-drawings-there-is-no-perfect-place
- Cached copy: http://webcache.googleusercontent.com/search?q=cache:https://www.normanrecords.com/records/149002-memory-drawings-there-is-no-perfect-place
- SERP (1): https://www.google.co.uk/search?q=memory+drawings+there+is+no+perfect+place
- SERP (2): https://www.google.co.uk/search?q=memory+drawings+there+is+no+perfect+place+site%3Awww.normanrecords.com
Example 2:
- SERP: https://www.google.co.uk/search?q=deaf+center+recount+site%3Awww.normanrecords.com
- Live URL: https://www.normanrecords.com/records/149001-deaf-center-recount-
- Cached copy: http://webcache.googleusercontent.com/search?q=cache:https://www.normanrecords.com/records/149001-deaf-center-recount-
These are pages that have been linked to from our homepage (Moz PA of 68) prominently for days, are present and correct in our sitemap (https://www.normanrecords.com/catalogue_sitemap.xml), have unique content, have decent on-page optimisation, etc. etc.
We moved over to https on 11 Aug. There were some initial wobbles (e.g. 301s from normanrecords.com to www.normanrecords.com got caught up in a nasty loop due to the conflicting 301 from http to https) but these were quickly sorted (i.e. spotted and resolved within minutes). There have been some other changes made to the structure of the site (e.g. a reduction in the navigation options) but nothing I know of that would cause pages to drop like this.
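(In case it helps anyone diagnosing the same thing: here's a rough way to confirm that each URL variant now 301s to the canonical https://www address in a single hop, with no chains or loops. A minimal sketch using the third-party requests library; the URL list is illustrative.)

import requests

# Minimal sketch: print the redirect chain for each URL variant so that
# loops or multi-hop chains (http -> https -> www, etc.) are easy to spot.
variants = [
    "http://normanrecords.com/",
    "http://www.normanrecords.com/",
    "https://normanrecords.com/",
]

for url in variants:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    chain = [r.url for r in resp.history] + [resp.url]
    print(f"{len(resp.history)} hop(s): {' -> '.join(chain)}")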
For the first example (Memory Drawings) we were ranking on the first page right up until this morning and have been receiving Google traffic for it ever since it was added to the site on 4 Aug.
Any help very much appreciated! At the very end of my tether / understanding here...
Cheers,
Nathon
-
Sorry denverish, I have been really busy lol
Point number 3 refers to "pinging" your website, to let the search engines know your site has been updated. There are a few spammy ones out there, but I would try using:
http://pingomatic.com/
or
https://pingler.com/

Pinging is a process by which you inform major search engines and RSS directories (Google, Bing, Yahoo, Technorati, etc.) that you have updated content/URLs on your blog/website. Pinging can also help get website changes indexed quickly.
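Under the hood these services use the weblogUpdates XML-RPC call, which you can also send yourself. A rough Python sketch — the rpc.pingomatic.com endpoint is my assumption from Ping-O-Matic's docs, and the site name/URL are just examples:

import xmlrpc.client

# Minimal sketch of a weblogUpdates.ping call; the endpoint URL is an
# assumption, and the site name/URL below are placeholders.
server = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")
result = server.weblogUpdates.ping(
    "My Site Name",               # human-readable site name
    "https://www.example.com/",   # site URL that changed
)
print(result)  # typically a struct with 'flerror' and 'message' fields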
-
Hi David, did you see Nathon's most recent response? Just checking.
-
Hi David,
Thanks for your response!
Yup, our sitemaps are recreated and resubmitted every day, and those pages have both been fetched (and rendered) in GWT. Not sure exactly what you mean by point 3 though??
-
Even though the 301s are all in place, have you notified Google of the changes? That might be why you are seeing pages disappear:
1. Sitemap recreation and resubmission (see the sitemap-ping sketch below)
2. Resubmit via Webmaster Tools using "Fetch as Google"
3. Ping website for new pages
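For point 1, besides resubmitting through the Webmaster Tools UI, you can ping Google with the sitemap URL directly. A minimal sketch — the /ping endpoint is taken from Google's sitemap documentation of the time, so treat it as an assumption; the sitemap URL is the one from the question above:

import urllib.parse
import urllib.request

# Minimal sketch: notify Google that a sitemap has been updated.
sitemap = "https://www.normanrecords.com/catalogue_sitemap.xml"
ping_url = "http://www.google.com/ping?sitemap=" + urllib.parse.quote_plus(sitemap)

with urllib.request.urlopen(ping_url, timeout=10) as resp:
    print(resp.status)  # 200 means the ping was accepted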
Related Questions
-
Switching from HTTP to HTTPS and Google Webmaster
Hi, I've recently moved one of my sites, www.thegoldregister.co.uk, to HTTPS. I'm using WordPress and put a permanent 301 redirect in the .htaccess file to force HTTPS for all pages. I've updated the settings in Google Analytics to HTTPS for the original site. All seems to be working well.
Regarding Google Webmaster Tools, what needs to be done? I'm very confused by the Google documentation on this subject around HTTPS. Does all my crawl and indexing data from the HTTP site still stand and get inherited by the HTTPS version because of the redirects in place? I'm really worried I will lose all of this indexing data. I looked at the "change of address" option in the Webmaster settings, but this seems to refer to changing the actual domain name rather than the protocol, which I haven't changed at all.
I've also tried adding the HTTPS version to the console as well, but the HTTPS version is showing a severe warning: "is robots.txt blocking some important pages". I don't understand this error, as it's the same file as the HTTP site's, generated by All in One SEO Pack for WordPress (see below). The warning is against line 5, saying it will ignore it. What I don't understand is that I don't get this error in the Webmaster console with the HTTP version, which is the same file?
Any help and advice would be much appreciated. Kind regards, Steve

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /xmlrpc.php
Crawl-delay: 10
-
Removed Subdomain Sites Still in Google Index
Hey guys, I've got kind of a strange situation going on and I can't seem to find it addressed anywhere. I have a site that at one point had several development sites set up on subdomains. Those sites have since launched on their own domains, but the subdomain sites are still showing up in the Google index.
However, if you look at the cached version of pages on these non-existent subdomains, it lists the NEW URL, not the dev one, in the little blurb that says "This is Google's cached version of www.correcturl.com." Clearly Google recognizes that the content resides at the new location, so how come the old pages are still in the index? Attempting to visit one of them gives a "Server Not Found" error, so they are definitely gone.
This is happening to a couple of sites, one of which was launched over a year ago, so it doesn't appear to be a "wait and see" situation. Any suggestions would be a huge help. Thanks!!
-
3,511 Pages Indexed and 3,331 Pages Blocked by Robots
Morning,
So I checked our site's index status in WMT, and I'm being told that Google is indexing 3,511 pages and the robots are blocking 3,331. This seems slightly odd, as we're only disallowing 24 pages in the robots.txt file. In light of this, I have the following queries:
1. Do these figures mean that Google is indexing 3,511 pages and blocking 3,331 other pages? Or does it mean that it's blocking 3,331 of the 3,511 indexed?
2. As there are only 24 URLs being disallowed in robots.txt, why are 3,331 pages being blocked? Will these be variations of the URLs we've submitted?
3. Currently, we don't have a sitemap. I know, I know, it's pretty unforgivable, but the old one didn't really work and the developers are working on the new one. Once submitted, will this help?
4. I think I know the answer to this, but is there any way to ascertain which pages are being blocked? (See the sketch below.)
Thanks in advance! Lewis
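On that last query, a rough way to test which URLs a robots.txt actually blocks is Python's built-in urllib.robotparser module. A minimal sketch — the domain and paths are placeholders, not from the question:

import urllib.robotparser

# Minimal sketch: test sample URLs against a live robots.txt.
rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

for path in ("/", "/private/page", "/products/widget?sort=price"):
    url = "https://www.example.com" + path
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "blocked"
    print(url, verdict)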
-
Ranking Drop and Google Disavow Requests
My website, www.nile-cruises-4u.co.uk, has fallen dramatically for the top industry search terms (nile cruise, nile cruises) over the last 12 months, from previous page-one rankings to page three, which has affected us very badly financially. I found, using Linkdetox, that we had thousands of backlinks with non-related anchor text, mainly porn terms, Viagra, etc. I submitted a disavow file and request about a week ago, and wondered firstly whether the enormous number of these links would have helped cause the drop to page three, and secondly whether the disavow request will eventually help the website return to better rankings? Thanks, Colin
-
Why is my site not indexing in Google?
In Google Webmaster Tools I updated my sitemap on Mar 6th. There are around 22,000 links, but Google fetched only 5,300 of them for a long time. I waited for a month with no improvement in the Google index, so on Apr 6th we uploaded a new sitemap (1,200 links in total), but only 4 links have been indexed by Google.
Why is Google not indexing my URLs? Does this affect our ranking in the SERPs? How many links is it advisable to submit in a sitemap for a website?
-
Does Google like category pages or pages with lots of products on them?
We are having an issue with getting Google to rank the page we want. We want this page, http://www.jakewilson.com/c/52/-/346/Cruiser-Motorcycle-Tires, to rank for the keyword "cruiser motorcycle tires"; however, this page, http://www.jakewilson.com/t/52/-/343/752/Cruiser-Motorcycle-Tires, is ranking instead, even though it has fewer links and lower page authority according to Site Explorer, and it sits farther down in the hierarchy. I am wondering if Google just likes pages that have actual products on them instead of a page leading to the page with all the products. Thoughts?
-
Properly Moving Blog from Index to its Own Page
Right now I have a website that is exclusively a blog. I want to create pages outside of the blog and move the blog to a page other than the index file, e.g. from domain.com to domain.com/blog. The blog post pages will stay in the root directory, e.g. domain.com/blog-post. Any suggestions on how to properly tell search engines and other websites that the blog has moved?
-
Google indexing directory folder listing page
Google somehow managed to find several of our image index folders and decided to include them in its index. Example: website.com/category/images/ is what you'll see when doing a site:website.com search. So, I have a two-part question:
1. Does this hurt our site's ability to rank in any way? Because all Google sees is just a directory listing page with a bunch of links to images in the folder.
2. If there could be any negative effect, what is the best way to get these folders out of Google's index? I could block via robots.txt, but I'm afraid it will also block all the images in that folder from being indexed in Google image search. I could also turn off directory listing in cPanel/.htaccess, but then that gives a 403 Forbidden. Will that hurt the site in any way, and would it prevent Google from indexing the images in the directory?
Thanks,
Tony