Page drops from index completely
-
We have a page that ranks organically at #1, but over the past couple of months it has twice dropped out of the results for a search term entirely. There don't appear to be any issues with the page in Search Console, and resubmitting the URL at https://www.google.com/webmasters/tools/submit-url seems to fix the issue.
The search term we're tracking is in the page's URL and is also the page's h1.
Here is a screenshot of the ranking over the past few months: https://jmp.sh/akvaKGF
What could cause this to happen? Nothing in Search Console shows any problems with the page. The last time this happened, the page dropped completely for all search terms and showed up again after submitting the URL to Google manually. This time it dropped for just one search term and reappeared the next day after manually resubmitting the page.
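One thing worth ruling out before assuming it's on Google's side: an intermittent noindex signal, which would make a page drop and then reappear after the next crawl. Here's a minimal sketch (plain Python, standard library only; the function names are my own) that checks a page's response headers and HTML for the two common noindex sources:

```python
import re

def noindex_signals(headers, html):
    """Return a list of noindex signals found in response headers or HTML."""
    signals = []
    # The X-Robots-Tag response header can carry a noindex that is
    # invisible to anyone just viewing the page source.
    xrt = headers.get("X-Robots-Tag", "")
    if "noindex" in xrt.lower():
        signals.append("noindex in X-Robots-Tag header")
    # <meta name="robots" content="noindex"> in the HTML itself.
    for m in re.finditer(r"<meta[^>]+>", html, re.IGNORECASE):
        tag = m.group(0).lower()
        if 'name="robots"' in tag and "noindex" in tag:
            signals.append("noindex in robots meta tag")
    return signals
```

Running something like this against the live URL on a schedule would catch the case where a deploy or a staging config occasionally leaks a noindex into production, which fits the drop-then-reappear pattern you describe.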
-
I had a similar issue on a couple of free WordPress subdomains I made while doing reputation management for clients. What ended up happening was that the site would index immediately and then, 24 hours later, be ghosted completely.
It turned out I was submitting the news sitemap that WordPress automatically generated. Since I wasn't on Google's list of approved news publishers, I assume it just ripped everything out; the news sitemap and the regular one had the same pages listed, just with more detail in the news one.
I doubt it's the exact same occurrence, but if you recently submitted a sitemap, I'd check it closely, as it has been known to trigger a similar problem, at least for me!
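For anyone checking their own sitemaps, the difference is easy to see side by side. Below is a hypothetical URL as a standard sitemap entry versus the same URL as a Google News sitemap entry (the URL, names, and dates are illustrative); the news variant carries extra `news:` detail and is meant only for approved news publishers:

```xml
<!-- Standard sitemap entry -->
<url>
  <loc>https://example.com/some-post/</loc>
  <lastmod>2019-05-01</lastmod>
</url>

<!-- The same URL in a news sitemap: extra news:* elements, and Google
     expects the submitter to be an approved news source. -->
<url>
  <loc>https://example.com/some-post/</loc>
  <news:news>
    <news:publication>
      <news:name>Example Blog</news:name>
      <news:language>en</news:language>
    </news:publication>
    <news:publication_date>2019-05-01</news:publication_date>
    <news:title>Some Post</news:title>
  </news:news>
</url>
```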
-
Thanks, Nigel. Your responses are quite helpful pointers. There's a possibility that Google is flagging it as duplicate content, as the content on this page is perhaps a bit sparse. We have two posts: the first is a "What is this type of document you need" explainer, and the second is a link to a template for that document. The template post is the one that has dropped twice. Here is the search we're dropping from occasionally. Interestingly, Google is indexing the public Google Doc our page points to and including it in the search results.
Excuse the bitly links; I'm just trying to avoid the search terms showing up for others to find.
To answer your questions directly:
- Google seems to be respecting canonicals
- Page is in sitemap
- Perhaps too much repetition? Maybe we should expand the content a bit
- This may well have happened as we have seen a few sites "republish" some of our content.
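On the canonical point, it's worth verifying programmatically rather than by eye. A minimal sketch (plain Python, standard library only; function name is my own) that pulls the rel=canonical out of a page's HTML, so you can confirm it matches the indexed URL exactly, trailing slash and protocol included:

```python
import re

def extract_canonical(html):
    """Return the href of the first rel=canonical link tag, or None."""
    for m in re.finditer(r"<link[^>]+>", html, re.IGNORECASE):
        tag = m.group(0)
        if re.search(r'rel=["\']canonical["\']', tag, re.IGNORECASE):
            href = re.search(r'href=["\']([^"\']+)["\']', tag, re.IGNORECASE)
            if href:
                return href.group(1)
    return None
```

A canonical that points at a slightly different URL than the one ranking (http vs https, with vs without trailing slash) is a classic cause of a page flickering in and out of the index.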
-
Hi Russell
I would have to see the URL, but it looks like a duplicate content problem. Have you recently published a blog post with a very similar title?
Is Google respecting your canonicals?
Is the page in your sitemap?
Is it over-optimised? Too much repetition of the main keyword?
Has someone stolen all of the content, creating cross-site duplication? There isn't a lot to go on, but I agree it's very unusual!
Regards, Nigel