Home page suddenly dropped from index!!
-
A client's home page, which has always done very well, has just dropped out of Google's index overnight!
Webmaster Tools does not show any problem. The page doesn't even show up if we Google the company name. The robots.txt contains:
Default Flywheel robots file
User-agent: *
Disallow: /calendar/action:posterboard/
Disallow: /events/action~posterboard/
The only unusual thing I'm aware of is some A/B testing of the page done with Optimizely - it redirects visitors to a test page, but it's not a 'real' redirect, in that redirect-checker tools still see the page as a 200. Also, other pages that are being tested this way are not having the same problem.
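For what it's worth, here's a quick way to double-check that those default robots.txt rules don't block the home page - a minimal sketch using Python's standard library, with a placeholder domain in place of the client's:

```python
# Minimal sketch: confirm the default Flywheel rules above don't block the home page.
# "https://www.example.com" is a placeholder for the client's actual domain.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()

for path in ["/", "/calendar/action:posterboard/", "/events/action~posterboard/"]:
    allowed = parser.can_fetch("Googlebot", "https://www.example.com" + path)
    print(f"{path} -> {'allowed' if allowed else 'blocked'}")
```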
Other recent activity over the last few weeks/months includes linking to the page from some of our blog posts using the page topic as anchor text.
Any thoughts would be appreciated.
Caro
-
Woot! So glad to see it wasn't a penalty!
-
Michael,
Duplicate content wasn't the issue in the end, but your response prompted me to analyse their home page text more closely and I discovered that there was room for improvement - too much of the home page content was also present on other pages of the site. Thanks for that!
-
Everyone, this has been resolved! The problem turned out to be a code error in the canonical tag for the page - there was an extra space and slash. Ironically, the canonical tag was one of the first things we looked at, yet we all overlooked that error.
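In case it helps anyone who hits the same thing, a quick check like the sketch below (Python standard library only; the URL is a placeholder) will flag stray whitespace or doubled slashes in a canonical href:

```python
# Minimal sketch: flag canonical tags with stray whitespace or doubled slashes in the href.
# "https://www.example.com/" is a placeholder URL.
import re
import urllib.request

def check_canonical(page_url):
    html = urllib.request.urlopen(page_url).read().decode("utf-8", errors="replace")
    # Find every <link ...> tag that declares rel="canonical", whatever the attribute order.
    tags = re.findall(r'<link[^>]+rel=["\']canonical["\'][^>]*>', html, re.IGNORECASE)
    if not tags:
        print("No canonical tag found.")
        return
    for tag in tags:
        match = re.search(r'href=["\']([^"\']*)["\']', tag, re.IGNORECASE)
        href = match.group(1) if match else ""
        problems = []
        if href != href.strip():
            problems.append("leading/trailing whitespace in href")
        if "//" in href.split("://", 1)[-1]:
            problems.append("doubled slash in the path")
        if not href.startswith(("http://", "https://")):
            problems.append("href is not an absolute URL")
        print(tag)
        print("  ->", "; ".join(problems) if problems else "looks OK")

check_canonical("https://www.example.com/")
```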
Thank you all so much for your input and assistance.
-
Thank you Michael...I'll do that.
-
I've seen a client have an internal page just suddenly be de-indexed. What appears to have happened is that Google saw it as a near duplicate of another page on their site, and dropped it from the index for that reason. Then, magically, it reappeared a week later.
You may be seeing something like this here. See what Moz Pro reports in terms of duplicate content on your site, and whether the home page gets called out alongside another page.
-
Thanks so much for that info. I had not heard of Kerboo...I'll definitely check that out right away. Your input has been extremely helpful Kristina.
Caro
-
I would be incredibly surprised if internal links to the homepage caused the issue. Google expects you to have a bunch of internal links to the homepage.
What you're going to need to do now is do a thorough review of all of the external links pointing to your homepage. I would do this with a tool - I recommend Kerboo, although I'm sure there are others that could do the same thing. Otherwise, you can look through all of the links yourself and look for spam indications (steps outlined in this handy Moz article).
Either way, make sure that you pull your list of links from Ahrefs or Majestic. Ideally both, and merge the lists. Moz doesn't crawl nearly as many links.
Since you haven't gotten a manual penalty warning, you're going to have to take down as many of the spammy links you find as you can and disavow the rest. For speed, I'd recommend that you immediately upload a list of spammy links with Google's disavow tool, then start requesting actual removals.
Keep in mind that you're probably going to disavow links that were helping rankings, so expect that your homepage won't come back ranking as well for nonbranded search terms as it used to. You'll probably want to start out uploading a very conservative set of URLs to the disavow tool, wait a couple of days to see if that fixes the problem, upload a bigger set, check, etc.
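If it helps, here's a rough sketch of the merge-and-review step in Python - the export file names, CSV column headings, and example spam domains are assumptions you'd swap for your own data:

```python
# Rough sketch: merge Ahrefs and Majestic backlink exports, then write a first-pass
# disavow file. File names, column headings, and the spam domain list are assumptions.
import csv

def load_urls(path, url_column):
    with open(path, newline="", encoding="utf-8") as f:
        return {row[url_column].strip() for row in csv.DictReader(f) if row.get(url_column)}

ahrefs = load_urls("ahrefs_backlinks.csv", "Referring page URL")
majestic = load_urls("majestic_backlinks.csv", "SourceURL")
all_links = sorted(ahrefs | majestic)
print(f"{len(all_links)} unique linking URLs to review")

# Domains you've manually reviewed and are confident are spam (examples only).
spam_domains = ["spammy-directory.example", "link-farm.example"]

# Google's disavow format: one "domain:" or URL entry per line; "#" starts a comment.
with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("# Conservative first pass - reviewed manually before upload\n")
    for domain in spam_domains:
        f.write(f"domain:{domain}\n")
```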
Good luck!
-
No luck, Kristina.
I'm wondering if it's an algorithmic penalty in response to backlinks. We've never done any shady linking, but over the years the site has gathered some strange links. Or is there some chance that about two dozen anchor-text links from their blog to the home page could have done it? I deleted them, but I can't request reconsideration if the penalty isn't manual.
-
Any luck so far? Usually it only takes a few hours for Google to crawl new pages after you submit them in GSC, in my experience.
-
I see no serious crawl issues. Mostly things we're already addressing, like duplicate content caused by blog tags and categories, missing meta descriptions (mostly in our knowledge base, so not an issue) and stuff like that.
When I checked the home page alone it said zero high, medium or low priority issues.
The page was only de-indexed very recently, so maybe the next crawl will catch something. Same with GSC... it looks like the last two days of data aren't available yet.
I should mention that the home page Optimizely test had been running for at least a week before the page got dropped (I'll get the actual date from the client), plus they have had a product page running a test for weeks with no problem. But I still think your suggestion to pause the test is a good one, as I don't want anything to hinder the process of fixing this.
Update: Optimizely has been paused, code removed, home page submitted in GSC.
-
Okay, I ran some tests, and can't see anything that could've gone wrong. That does make it seem like a penalty, but given that this coincided with setting up Optimizely, let's go down that path first.
While your team is taking down the test - have you checked Moz to see if its crawler sees anything that could be causing an issue? I set up my Moz crawler to look into it, but it'll take a few days.
-
Thanks Kristina,
We have not tried pausing the test, but I can request they do that. It may be a good idea to do it regardless of whether it's causing the problem or not, while we get this issue sorted out.
Fetch as Google gave this result: HTTP/1.1 200 OK - so it looks fine. I understand this also submits the page to Google as an actual indexing request?
site:https://website.com shows all our pages except the home page.
So it looks like Google has decided not to index it for some reason.
I deleted some links from the blog to the home page - they had a keyword phrase as the anchor text. There were about 20 links that had accumulated over a few months. Not sure if that's the issue.
Still no manual penalty notice from Google.
-
Hm, I've done a lot with Optimizely in the past, and it's never caused an SEO problem, but it's completely possible something went wrong. Since that's your first inkling, have you tried pausing that test and removing the Optimizely code from the homepage? Then you can determine whether or not it's an Optimizely problem.
Another thing you can do is use the Fetch as Googlebot feature in GSC. Does GSC say it can fetch the page properly?
If it says it can, try searching for "site:www.yourcompanysite.com". This will show whether Google has your URL in its index. If nothing comes up, it's not there; if it does come up, the page is indexed and Google has simply decided not to rank it for some reason.
After those steps, get back to us so we can figure out where to go from there!
Good luck,
Kristina
-
Jordan, not on the original version of the home page, but there is on the B test version.
The way I understand it, the B version is a JavaScript page that is noindexed. Their redirect system seems to leave the original page looking like there is no redirect. Are you suggesting we use a 302 instead?
-
Also, Google recommends that you 302 those URLs instead of returning a 200 HTTP code. You can read more in their best-practice guidance on A/B testing.
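As a rough illustration of that guidance, here's a minimal sketch (Flask, with hypothetical routes and template names) of sending test-bucketed visitors to the variant with a temporary 302 rather than serving variant content on a 200:

```python
# Minimal sketch of Google's A/B testing guidance: bucket visitors server-side and send
# them to the variant with a temporary 302. Routes and template names are hypothetical.
import random
from flask import Flask, redirect, render_template

app = Flask(__name__)

@app.route("/")
def home():
    if random.random() < 0.5:  # assumed 50/50 test split
        return redirect("/home-variant-b", code=302)  # temporary redirect, never a 301
    return render_template("home.html")

@app.route("/home-variant-b")
def home_variant_b():
    # The variant template should carry rel="canonical" pointing back at "/"
    # so the original page keeps the ranking signals.
    return render_template("home_variant_b.html")
```

In practice you'd also set a cookie so a returning visitor stays in the same bucket, but the key points are the 302 and the canonical pointing back to the original URL.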
-
Is there a meta 'noindex, nofollow' tag implemented by chance?