[wtf] Mysterious Homepage De-Indexing
-
Our homepage, as well as several similar landing pages, has vanished from the index. Could you guys review the pages below to make sure I'm not missing something really obvious?!
URLs: http://www.grammarly.com http://www.grammarly.com/plagiarism-checker
- It's been four days, so it's not just a temporary fluctuation
- The pages don't have a "noindex" tag on them and aren't being excluded in our robots.txt
- There's no notification about a penalty in WMT
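The noindex check above can be sketched programmatically (a hypothetical helper, not a real tool; a full audit would also need to check the X-Robots-Tag HTTP header and robots.txt, not just the markup):

```javascript
// Hypothetical sketch: scan a page's HTML for a robots "noindex"
// directive, the first thing ruled out above. The function name and
// sample markup are illustrative only.
function hasNoindexMeta(html) {
  const match = html.match(/<meta[^>]+name=["']robots["'][^>]*>/i);
  return match ? /noindex/i.test(match[0]) : false;
}

hasNoindexMeta('<meta name="robots" content="index,follow">'); // false
hasNoindexMeta('<meta name="robots" content="noindex">');      // true
```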
Clues:
- WMT returns an "HTTP 200 OK" for Fetch, but Fetch+Render shows a redirect to grammarly.com/1 (an alternate version of the homepage that contains a rel=canonical back to the homepage). Could this be causing a circular redirect?
- Some pages on our domain are ranking fine, e.g. https://www.google.com/search?q=grammarly+answers
- A month ago, we redesigned the pages in question. The new versions are pretty script-heavy, as you can see.
- We don't have a sitemap set up yet.
Any ideas? Thanks in advance, friends!
-
Did this get resolved? I'm seeing your homepage indexed and ranking now.
I'm not seeing any kind of redirect to an alternate URL at this point (either as a browser or as Googlebot). If you 301'ed to an alternate URL and then rel=canonical'ed back to the source of the 301, that could definitely cause problems; it sends a pretty strong mixed signal. In that case you'd probably want to 302 or use some alternate method. Redirects for the homepage are best avoided in most cases.
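To make that mixed signal concrete, here's a minimal, hypothetical sketch (the data structures are illustrative, not from any real crawler API) of flagging a 301 whose target declares a rel=canonical back to the 301's source:

```javascript
// Hypothetical sketch: detect a 301 -> rel=canonical loop.
// `redirects` maps a URL to its 301 target; `canonicals` maps a URL
// to the canonical it declares. Both inputs are made up for illustration.
function findRedirectCanonicalLoop(url, redirects, canonicals) {
  const target = redirects[url];
  if (!target) return null;
  // The conflict described above: A 301s to B, B canonicals back to A.
  if (canonicals[target] === url) {
    return { source: url, target };
  }
  return null;
}

const redirects = { 'http://www.grammarly.com/': 'http://www.grammarly.com/1' };
const canonicals = { 'http://www.grammarly.com/1': 'http://www.grammarly.com/' };

const loop = findRedirectCanonicalLoop('http://www.grammarly.com/', redirects, canonicals);
// `loop` is non-null here, flagging the conflicting signal.
```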
-
Are you sure it was missing for a time? Ultimately, I wouldn't use a third party (Google) as a tool to diagnose problems (faulty on-site code) that I already know are problems and need to be fixed. I'd fix the known issues and then go from there. Or hire someone capable of fixing them.
-
Thanks, Ryan. I'll get to work on the issues you mentioned.
I do have one question for you: grammarly.com/proofreading (significantly fewer links, identical codebase) is now back in the index. If the issue were too many scripts or HTML errors, wouldn't both pages still be de-indexed?
-
Here are some issues just going down the first few lines of code...
- There's a height attribute in your tag.
- Your cookie on the home page is set to expire in the past, not the future.
- Your DOCTYPE tag conflicts with your script, among other code issues (http://stackoverflow.com/questions/21363090/doctype-html-ruins-my-script).
- Your Google Site Verification meta tag is different from the one on your other pages.
- Your link to the Optimizely CDN is incorrect: it's missing the 'http:' scheme, so the browser looks for the script on your own site.
- You have many other markup issues.
And that's before even getting into the hundreds of lines of code preceding the start of your page at the tag: 300 lines or so on your other indexed pages, 1,100+ on your home page. So not only are you not following best practices as outlined by Google, but you have broken stuff too.
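The Optimizely point is easy to reproduce with standard WHATWG URL resolution (the script path below is made up for illustration): drop both the scheme and the leading slashes and the src is treated as a path relative to the page, so the browser fetches it from your own origin and 404s, while a protocol-relative `//` URL still reaches the CDN.

```javascript
// How a browser resolves a script src against the page URL (WHATWG URL).
const page = 'http://www.grammarly.com/';

// Broken: no scheme and no leading slashes, so it resolves as a
// relative path on grammarly.com itself.
const broken = new URL('cdn.optimizely.com/js/12345.js', page).href;
// -> 'http://www.grammarly.com/cdn.optimizely.com/js/12345.js'

// Protocol-relative: inherits the page's scheme and reaches the CDN.
const protocolRelative = new URL('//cdn.optimizely.com/js/12345.js', page).href;
// -> 'http://cdn.optimizely.com/js/12345.js'
```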
-
The saga continues...
According to WMT, there are no issues with grammarly.com. The page is fetched and rendered correctly.
Google! Y u no index? Any ideas?
-
Like Lynn mentioned below, if redirection is taking place across several portions of the site, that could cause the spikes, and a big increase in total download time is worrying if it crosses the patience threshold at which most visitors bounce.
Here's the Google Page speed take on it: https://developers.google.com/speed/pagespeed/insights/?url=http%3A%2F%2Fgrammarly.com&tab=desktop. They go over both desktop and mobile.
-
Hmm, was something done to fix the Googlebot redirect issue, or did it just fix itself? Here it states that Googlebot will often identify itself as Mozilla, and your Fetch/Render originally seemed to indicate that, at least some of the time, that was the page Google was getting. It's a bit murky technically what exactly is going on there, but if Google is getting redirected some of the time, then, as you said, you're in a circular situation between the redirect and the canonical, where it's difficult to predict what will happen. If that is 100% fixed now and Google sees the main page all the time, I would wait a day or two to see if the page comes back into the index (but be 100% sure that you know it is fixed!). I still think that is the most likely source of your troubles...
-
Excellent question, Lynn. Thank you for chiming in here. There's a user-agent-based JavaScript redirect that keeps Chrome visitors on grammarly.com (Chrome browser extension) and sends other browsers to grammarly.com/1 (a web app that works on all browsers).
UPDATE: According to WMT Fetch+Render, the Googlebot redirection issue has been fixed. It is no longer being redirected anywhere, and grammarly.com is returning a 200 OK.
Kelly, if that was causing the problem, how long should I hold my breath for re-indexing after re-submitting the homepage?
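For anyone following along, the browser-sniffing logic described above might look something like this (a hypothetical sketch; the function name and the exact UA test are assumptions, not Grammarly's actual code). Note that Googlebot's user agent starts with "Mozilla" but, at the time, contained no "Chrome" token, so a naive check would have redirected the bot to /1 as well:

```javascript
// Hypothetical sketch of a user-agent-based redirect like the one
// described above. The naive `includes('Chrome')` test is an assumption
// for illustration, not the site's real implementation.
function destinationFor(userAgent) {
  // Chrome users stay on the homepage (browser-extension pitch);
  // everyone else is sent to the /1 web-app version.
  return userAgent.includes('Chrome') ? '/' : '/1';
}

const chromeUA =
  'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0 Safari/537.36';
const googlebotUA =
  'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)';

destinationFor(chromeUA);    // '/'
destinationFor(googlebotUA); // '/1' -- the crawler gets redirected too
```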
-
Yup, definitely. Whether you're completely removed or simply dropped doesn't matter: if you're not there anymore, Google determined for some reason that you're no longer an authority for that keyword, so you need to find out why. Since you just redesigned, the best way is to backtrack: double-check all the old tags and compare them to the new site, check the text and keyword usage on the website, and look for anything that's changed that could contribute to the drop. If you don't find anything, tools like MajesticSEO are handy for checking whether your backlinks are still healthy.
-
Hi Alex, thank you for your response. The pages didn't suffer in ranking; they were completely removed from the index. Based on that, do you still think it could be a keyword issue?
-
That's actually a great point. I suppose Google could have been holding on to a pre-redesign cached version of the pages.
There has been a 50-100% increase in page download times as well as some weird 5x spikes for crawled pages. I know there could probably be a million different reasons, but do any of them stick out at you as being potential sources of the problem?
-
How does that second version of the homepage work, and how long has it been around? I get one version of the homepage in one browser and the second in another; what decides which version is served, and what kind of redirect is it? I think that is the most likely source of your troubles.
-
Yes, but the pages were indexed prior to the redesign, no? Can you look up your crawl stats in GWT to see if there's been a dramatic uptick in page download times and a downtrend in pages crawled? That will at least give you a starting point as to differences between now and then: https://www.google.com/webmasters/tools/crawl-stats
-
The logo definitely needs to be made clickable, linking to Home.
Did you compare the old design's and the new design's text to make sure you're still covering the same keywords? In many cases a redesign is more "streamlined," which means less text, or a rewrite, which is going to impact the keywords your site is relevant for.
-
Thanks, Ryan. Improving our code-to-text ratio is on our roadmap, but could that really be the issue here? The pages were all fully indexed without problems for a full month after our redesign, and we haven't added any scripts. Was there an algorithm update on Monday that could explain the sudden de-indexing?
-
VERY script heavy. Google recently (Q4 2014) released updates on a lot of this: http://googlewebmastercentral.blogspot.mx/2014/10/updating-our-technical-webmaster.html, with further guidance here: https://developers.google.com/web/fundamentals/performance/optimizing-content-efficiency/optimize-encoding-and-transfer. Without doing a deep dive, that's the most glaring issue and the obvious difference between pages that are still being indexed and those that are not.