[wtf] Mysterious Homepage De-Indexing
-
Our homepage, as well as several similar landing pages, has vanished from the index. Could you guys review the pages below to make sure I'm not missing something really obvious?!
URLs: http://www.grammarly.com http://www.grammarly.com/plagiarism-checker
- It's been four days, so it's not just a temporary fluctuation
- The pages don't have a "noindex" tag on them and aren't being excluded in our robots.txt
- There's no notification about a penalty in WMT
Clues:
-
WMT returns an "HTTP 200 OK" for Fetch, but Fetch+Render shows a redirect to grammarly.com/1 (an alternate version of the homepage that contains a rel=canonical back to the homepage). Could this be causing a circular redirect?
-
Some pages on our domain are ranking fine, e.g. https://www.google.com/search?q=grammarly+answers
-
A month ago, we redesigned the pages in question. The new versions are pretty script-heavy, as you can see.
-
We don't have a sitemap set up yet.
Any ideas? Thanks in advance, friends!
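(For anyone repeating the checklist above: the noindex check, at least, is easy to script. A minimal sketch using only Python's stdlib; it assumes you already have the page's HTML as a string and only looks at `<meta name="robots">` tags.)

```python
from html.parser import HTMLParser

class NoindexChecker(HTMLParser):
    """Scan meta robots tags for a noindex directive."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "robots" and "noindex" in (d.get("content") or "").lower():
                self.noindex = True

def has_noindex(html):
    """True if the HTML carries a meta robots noindex directive."""
    checker = NoindexChecker()
    checker.feed(html)
    return checker.noindex
```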
-
Did this get resolved? I'm seeing your home-page indexed and ranking now.
I'm not seeing any kind of redirect to an alternate URL at this point (either as a browser or as Googlebot). If you 301'ed to an alternate URL and then rel=canonical'ed back to the source of the 301, that could definitely cause problems; it's sending a pretty strong mixed signal. In that case you'd probably want to 302 or use some alternate method. Redirects for the home page are best avoided in most cases.
-
Are you sure it was missing for a time? Ultimately, I wouldn't use a third party (Google) as a tool to diagnose problems (faulty on-site code) that I already know are problems and need to be fixed. I'd fix the known issues and then go from there. Or hire someone capable of fixing them.
-
Thanks, Ryan. I'll get to work on the issues you mentioned.
I do have one question for you: grammarly.com/proofreading (significantly fewer links, identical codebase) is now back in the index. If the issue was too many scripts or HTML errors, wouldn't both pages still be de-indexed?
-
Here are some issues just going down the first few lines of code...
- There's a height attribute in your tag.
- Your cookie on the home page is set to expire in the past, not the future
- Your tag conflicts with your script and other code issues (http://stackoverflow.com/questions/21363090/doctype-html-ruins-my-script)
- Your Google Site Verification meta tag is different than other pages.
- Your link to the Optimizely CDN is incorrect... (missing 'http:' so it's looking for the script on your site)
- You have many other Markup Issues.
And that's prior to getting into the hundreds of lines of code preceding the start of your page at the tag: 300 lines or so on your other indexed pages, 1,100+ on your home page. So not only are you not following the best practices outlined by Google, you have broken stuff too.
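Two of the issues above are easy to verify mechanically: a cookie whose Expires date is already in the past, and a script src with no scheme (so the browser resolves it against your own site instead of the CDN). A rough sketch using only the stdlib; the example date and URLs are illustrative, not the site's actual values.

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime
from urllib.parse import urlparse

def cookie_expired(expires_header, now=None):
    """True if a Set-Cookie Expires date is already in the past."""
    now = now or datetime.now(timezone.utc)
    return parsedate_to_datetime(expires_header) < now

def script_src_broken(src):
    """True if a script URL has no scheme and isn't protocol-relative,
    so the browser treats it as a path on the current site."""
    return not urlparse(src).scheme and not src.startswith("//")

print(cookie_expired("Wed, 21 Oct 2015 07:28:00 GMT"))        # a past date: expired
print(script_src_broken("cdn.optimizely.com/js/12345.js"))    # missing scheme: broken
print(script_src_broken("//cdn.optimizely.com/js/12345.js"))  # protocol-relative: fine
```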
-
The saga continues...
According to WMT, there are no issues with grammarly.com. The page is fetched and rendered correctly.
Google! Y u no index? Any ideas?
-
As Lynn mentioned below, if redirection is taking place across several portions of the site, that could cause the spikes. And a big increase in total download time is worrying on its own if it crosses the patience threshold at which most visitors bounce.
Here's the Google PageSpeed Insights take on it: https://developers.google.com/speed/pagespeed/insights/?url=http%3A%2F%2Fgrammarly.com&tab=desktop. It covers both desktop and mobile.
-
Hmm, was something done to fix the Googlebot redirect issue, or did it just fix itself? Here it states that Googlebot will often identify itself as Mozilla, and your Fetch/Render originally seemed to indicate that, at least some of the time, the redirected page was what Google was getting. It's a bit murky technically what exactly is going on there, but if Google is getting redirected some of the time, then, as you said, you're in a circular situation between the redirect and the canonical, where it's difficult to predict what will happen. If that is 100% fixed now and Google sees the main page all the time, I would wait a day or two to see if the page comes back into the index (but be 100% sure you know it's fixed!). I still think that is the most likely source of your troubles...
-
Excellent question, Lynn. Thank you for chiming in here. There's a user-agent-based JavaScript redirect that keeps Chrome visitors on grammarly.com (the Chrome browser extension) and sends other browsers to grammarly.com/1 (a web app that works in all browsers).
UPDATE: According to WMT Fetch+Render, the Googlebot redirection issue has been fixed. Googlebot is no longer being redirected anywhere and is getting a 200 OK for grammarly.com.
Kelly, if that was causing the problem, how long should I hold my breath for re-indexing after re-submitting the homepage?
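For context, the routing decision works roughly like this. (The real implementation is a client-side JavaScript redirect; this Python function is only an illustrative sketch, and the substring-based Chrome detection shown is an assumption, not our actual code.)

```python
def landing_url(user_agent):
    """Sketch of the UA-based routing described above: Chrome visitors
    stay on the homepage (browser extension pitch); everyone else,
    including Googlebot, gets sent to the /1 web app."""
    ua = user_agent.lower()
    is_chrome = "chrome" in ua and "edge" not in ua  # crude check, illustrative only
    return "http://www.grammarly.com/" if is_chrome else "http://www.grammarly.com/1"

# Googlebot's UA contains no "chrome", so it was being redirected to /1:
print(landing_url("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))
```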
-
Yup, definitely. Whether you're completely removed or simply dropped doesn't matter: if you're not there anymore, for some reason Google determined you're no longer an authority for that keyword, so you need to find out why. Since you just redesigned, the best way is to backtrack: double-check all the old tags and compare them to the new site, check the text and keyword usage on the website, and look for anything that changed that could contribute to the drop. If you don't find anything, tools like Majestic SEO are handy for checking whether your backlinks are still healthy.
-
Hi Alex, thank you for your response. The pages didn't suffer in ranking; they were completely removed from the index. Based on that, do you still think it could be a keyword issue?
-
That's actually a great point. I suppose Google could have been holding on to a pre-redesign cached version of the pages.
There has been a 50-100% increase in page download times, as well as some weird 5x spikes in pages crawled. I know there could probably be a million different reasons, but do any of them stick out to you as potential sources of the problem?
-
How does that second version of the homepage work, and how long has it been around? I get one version of the homepage in one browser and the second in another; what decides which version is served, and what kind of redirect is it? I think that is the most likely source of your troubles.
-
Yes, but the pages were indexed prior to the redesign, no? Can you look up your crawl stats in GWT to see if there's been a dramatic uptick in page download times and a downtrend in pages crawled? That will at least give you a starting point as to the differences between now and then: https://www.google.com/webmasters/tools/crawl-stats
-
Logo definitely needs to be made clickable to Home.
Did you compare the old design's text to the new design's to make sure you're still covering the same keywords? In many cases a redesign is more "streamlined," which also means less text, or a rewrite, either of which is going to impact the keywords your site is relevant for.
-
Thanks, Ryan. Improving our code-to-text ratio is on our roadmap, but could that really be the issue here? The pages were all fully indexed without problems for a full month after our redesign, and we haven't added any scripts. Was there an algorithm update on Monday that could explain the sudden de-indexing?
-
VERY script heavy. Google recently released updates on a lot of this (Q4 2014) here: http://googlewebmastercentral.blogspot.mx/2014/10/updating-our-technical-webmaster.html, with further guidance here: https://developers.google.com/web/fundamentals/performance/optimizing-content-efficiency/optimize-encoding-and-transfer. Without doing a deep dive, that's the most glaring issue and the most obvious difference between the pages that are still being indexed and those that are not.