Google de-indexed a page on my site
-
I have a site which is around 9 months old. For most search terms we rank fine (including top 3 rankings for competitive terms).
Recently one of our pages has been fluctuating wildly in the rankings and has now disappeared from them altogether for over a week.
As a test I added a similar page to one of my other sites and it ranks fine.
I've checked webmaster tools and there is nothing of note there.
I'm not really sure what to do at this stage. Any advice would be much appreciated!
-
Another thing which is weird:
If I google "site:medexpress.co.uk" the page /clinics/erectile-dysfunction/viagra appears.
If I google "site:medexpress.co.uk viagra" that page no longer appears!
I could see why my page might have dropped in the rankings, but to go from ranking around 30-70 for most of this page's keywords to dropping out of the index entirely points to something more extraordinary. It's been like this for almost two weeks and I'm considering changing the URL (as I mentioned, other long-tail optimized blog pages are now starting to rank, so it appears to be just this page).
Is it possible this page has been moved to the supplemental index for some reason?
-
Hi,
The domain actually has plenty of links and that page has two very high-quality backlinks. The Moz tool for some reason is missing almost all of the backlinks; if you use Ahrefs it works much better (remembering to use www as the prefix for the domain).
The domain authority is 49, which is much higher than the bulk of sites in the results and higher than the site in position #1, which has somehow managed to rank with almost no backlinks (it has fewer than 20) and no quality content.
What's even weirder: I added a page to my other site www.centraltravelclinic.co.uk and it now outranks my main site even though it has fewer backlinks and less relevant content.
There is even a page in my blog section which is now ranking as well.
I'm not sure, but it seems as if there is some penalty against this one page... or something has gone wrong in the search index and the page has been temporarily removed for whatever reason.
Regards,
Dwayne
-
Hi Everett,
I agree 100% that the domain itself has a great name, but the lack of domain authority and links has undoubtedly taken its toll; to be honest, I'm surprised it ranks for anything without any authority.
A content marketing campaign with high-quality content covering the topics you discussed, like law and medical care, and including tools for the end user would make it extremely brand-worthy.
I think the best way to go about it would be to start a six-month campaign and do some very aggressive on-site work.
You have some great advice.
Sincerely,
Thomas
-
Thank you Thomas,
I did have a look at the site and came to the conclusion that it had virtually no domain authority or page/URL-level authority from quality backlinks. The page had no links, and the domain only had a couple, neither of them great. As you mention, this industry is very competitive; it is one of the most competitive niches online.
My recommendation was to embark on a content marketing campaign, possibly one that highlights medical care / law / coverage disparities between countries. The domain itself is very brandable, and if they were to latch onto a hot-topic issue in the right way they could get quoted as experts in that industry by major news outlets, or at least some decent blogs. Meanwhile, steady posting about the industry on the blog, and occasional white papers or studies created by the company about the industry, will help build their body of content and present their expertise on the topic. Any tool (calculators, price comparison charts, etc.) would also help. I also mentioned a recommended minimum budget for content creation and paid promotion of it over the next six months.
-
Hi Everett,
There is a good reason for him not actually posting the URL or the keyword list he wants to rank for.
I think Deelo555 should private message you the information he sent me. All I will say is that they are keywords that are commonly viewed negatively.
Deelo555 please send Everett what you sent me.
All the best,
Thomas
-
Deelo555, we'll need more information to be of much help here. Can you post the site and/or the keyword? If not, this is probably going to have to be handled by private message with someone.
I'd look for over-optimized anchor text or spammy links going into that page just to make sure. From there I would look into whether the content on the page "should" rank for the keyword. In other words, does it answer the query or give the searcher what they are looking for? After that I would move on to general user experience. Does the site look good, trustworthy, authoritative...
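If you export the page's backlink anchors from a tool like Ahrefs, a few lines of Python can summarize the anchor-text profile. This is only a sketch with made-up anchors, and the 50% threshold is a rule of thumb, not a number from Google:

```python
from collections import Counter

def anchor_share(anchors):
    """Return each anchor's share of the total backlink profile, largest first."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {a: round(c / total, 2) for a, c in counts.most_common()}

# Hypothetical anchors exported from a backlink tool:
profile = anchor_share(["buy viagra online", "buy viagra online", "buy viagra online",
                        "MedExpress", "medexpress.co.uk"])

# A single exact-match money anchor dominating the profile is a red flag.
risky = [anchor for anchor, share in profile.items() if share > 0.5]
```

Here `profile` comes out as 60% "buy viagra online", which is exactly the kind of over-optimized pattern worth cleaning up.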
Follow that path and you'll probably uncover some things to improve. If not, let us know.
-
I replied to the private message. I hope it is helpful.
-
Fluctuation can sometimes mean Google is testing your site against the other competitors for that keyword.
If Google is unable to determine which result end users prefer, it will allow you to fluctuate in the SERPs.
If people query your keyword, find your site in the SERPs, then immediately leave once they have landed on your website, that is called "pogo-sticking". Google is simply trying to find out whether or not you are the best fit for that keyword.
First off, test the site with some of the tools below:
https://moz.com/researchtools/crawl-test
http://deepcrawl.co.uk/ (Amazing tool)
http://www.screamingfrog.co.uk/seo-spider/ (everyone should have a copy; I recommend the paid version, but you can crawl 500 pages for free using the free edition)
You want to see what Googlebot sees on your site; you can do that using
http://www.feedthebot.com/tools/
Check your Robots.txt using
http://www.internetmarketingninjas.com/seo-tools/robots-txt-generator/ (this will pull the file; sometimes there is more than one, it does happen)
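You can also test robots.txt rules locally with Python's standard library instead of a web tool. A minimal sketch (the rules and URLs here are invented for illustration, not taken from the site in question):

```python
import urllib.robotparser

# A made-up robots.txt to demonstrate the check:
rules = """\
User-agent: *
Disallow: /admin
Disallow: /checkout
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot falls under the * group here, so these mirror what it may crawl:
ok = rp.can_fetch("Googlebot", "https://example.com/clinics/some-page")
blocked = rp.can_fetch("Googlebot", "https://example.com/admin/settings")
```

Running this against your real robots.txt (fetched from yourdomain.com/robots.txt) quickly tells you whether the missing page is being blocked.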
Make sure that if you're using WordPress or a similar CMS, Googlebot is not being blocked from seeing jQuery.
The reason for this is that if Google cannot see the site as it would be shown to the end user, it is less likely to come back and index it.
https://www.authoritydev.com/dont-block-jquery-from-googlebot/
Once we have completely eliminated any chance of a robots.txt or nofollow/noindex problem, you will want to look at your entire website very thoroughly.
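To rule out a stray noindex on the page itself, you can scan its HTML for robots meta tags with Python's standard library. A rough sketch (in practice, check the HTML Googlebot actually receives, not just your template):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects directives from <meta name="robots"> and <meta name="googlebot"> tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() in ("robots", "googlebot"):
                self.directives += [d.strip().lower()
                                    for d in (a.get("content") or "").split(",")]

def is_noindexed(html):
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in parser.directives
```

Keep in mind a noindex can also arrive via the X-Robots-Tag HTTP response header, so check the response headers as well as the HTML.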
I do not know how old your domain is; could you share its age?
What is your domain authority? What is the page authority of the pages not being indexed?
We need to look at all of this before we can render a verdict.
If you're uncomfortable sharing your URL publicly you're welcome to send it to me privately.
What URLs (if any), internal or external, are pointing at the pages that are not indexing?
Sincerely,
Thomas
Related Questions
-
Why Is this page de-indexed?
I have dropped out for all my first-page KWDs for this page https://www.key.co.uk/en/key/dollies-load-movers-door-skates. Can anyone see an issue? I am trying to find one... We did just migrate to HTTPS, but other areas have no problem.
-
Password Protected Page(s) Indexed
Hi, I am wondering if my website can get a penalty if some password-protected pages are showing up when I search on Google: site:www.example.com/sub-group/pass-word-protected-page That shows that my password-protected page was indexed either before or after adding the password protection. I've seen people suggest noindexing the page. Is that the best method to take care of this? What if we are planning on pushing the page live later on? All of these pages have no title tag, meta description, image alt text, etc. Should I add them for each page? I am wondering what the best step is, especially if we are planning on pushing the page(s) live. Thanks for any help!
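For a password-protected page, the X-Robots-Tag response header is often the easiest way to stay out of the index, since crawlers may never see the page's HTML. The helper below is a hypothetical, simplified model of the eligibility check, not Google's actual logic:

```python
def eligible_for_index(status, headers):
    """A page generally stays indexable only if it returns 200
    and is not excluded via the X-Robots-Tag response header.
    (Simplified: real HTTP header names are case-insensitive.)"""
    robots = headers.get("X-Robots-Tag", "").lower()
    return status == 200 and "noindex" not in robots

eligible_for_index(200, {})                               # True
eligible_for_index(200, {"X-Robots-Tag": "noindex"})      # False
eligible_for_index(401, {})                               # False (password wall)
```

The point of the sketch: a page behind a 401 challenge, or one served with a noindex header, should eventually drop out, and you can flip the header off later when you push the page live.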
-
E-Commerce Site Collection Pages Not Being Indexed
Hello Everyone, So this is not really my strong suit but I'm going to do my best to explain the full scope of the issue and really hope someone has any insight. We have an e-commerce client (can't really share the domain) that uses Shopify; they have a large number of products categorized by Collections. The issue is when we do a site: search of our Collection Pages (site:Domain.com/Collections/) they don't seem to be indexed. Also, not sure if it's relevant, but we also recently did an overhaul of our design. Because we haven't been able to identify the issue, here's everything we know/have done so far: Moz Crawl Check and the Collection Pages came up. Checked Organic Landing Page Analytics (source/medium: Google) and the pages are getting traffic. Submitted the pages to Google Search Console. The URLs are listed on the sitemap.xml, but when we tried to submit the Collections sitemap.xml to Google Search Console, 99 were submitted but nothing came back as being indexed (like our other pages and products). We tested the URL in GSC's robots.txt tester and it came up as being "allowed", but just in case, below is the language used in our robots.txt:
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkout
Disallow: /9545580/checkouts
Disallow: /carts
Disallow: /account
Disallow: /collections/+
Disallow: /collections/%2B
Disallow: /collections/%2b
Disallow: /blogs/+
Disallow: /blogs/%2B
Disallow: /blogs/%2b
Disallow: /design_theme_id
Disallow: /preview_theme_id
Disallow: /preview_script_id
Disallow: /apple-app-site-association
Sitemap: https://domain.com/sitemap.xml
A Google cache: search currently shows a collections/all page we have up that lists all of our products. Please let us know if there are any other details we could provide that might help. Any insight or suggestions would be very much appreciated. Looking forward to hearing all of your thoughts! Thank you in advance. Best,
-
Client has moved to secured https webpages but non-secured http pages are still being indexed in Google. Is this an issue?
We are currently working with a client that relaunched their website two months ago to have hypertext transfer protocol secure pages (https) across their entire site architecture. The problem is that their non-secure (http) pages are still accessible and being indexed in Google. Here are our concerns: 1. Are co-existing non-secure and secure webpages (http and https) considered duplicate content?
2. If these pages are duplicate content should we use 301 redirects or rel canonicals?
3. If we go with rel canonicals, is it okay for a non-secure page to have a rel canonical to the secure version? Thanks for the advice.
-
Apps content Google indexation ?
I read some months back that Google was indexing app content to display in its SERPs. Does anyone have any update on this recently? I'd be very interested to know more about it 🙂
-
Urgent Site Migration Help: 301 redirect from legacy to new if legacy pages are NOT indexed but have links and domain/page authority of 50+?
Sorry for the long title, but that's the whole question. Notes: New site is on the same domain but URLs will change because the URL structure was horrible. Old site has awful SEO. Like, real bad. Canonical tags point to the dev subdomain (which is still accessible and has a robots.txt, so the end result is the old site IS NOT INDEXED by Google). Old site has links and domain/page authority north of 50. I suspect some shady links, but there have to be good links as well. My guess is that since there are likely legitimate incoming links, I should still attempt to use 301s to the versions of the pages on the new site (note: the content on the new site will be different, but in general it'll be about the same thing as the old page, just much improved and more relevant). So yeah, I guess that's it. Even though the old site's pages are not indexed, if the new site is set up properly, the 301s won't pass along the 'non-indexed' status, correct? Thanks in advance for any quick answers!
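For migrations like this, the usual approach is a one-to-one map from each legacy URL to its closest new equivalent, served as a 301. A minimal sketch of that lookup (paths invented for illustration; in production this would live in your web server or CMS redirect config):

```python
# Hypothetical legacy-to-new mapping built during the migration audit
REDIRECT_MAP = {
    "/old/products.php?id=12": "/products/blue-widget/",
    "/old/about.html": "/about/",
}

def resolve(path):
    """Return a (status, target) pair: 301 to the mapped URL if the
    legacy path is known, otherwise no redirect (fall through to 404)."""
    target = REDIRECT_MAP.get(path)
    return (301, target) if target else (None, None)
```

The design point is that every legacy URL with inbound links gets an explicit entry, so link equity is passed along regardless of whether the old page was indexed.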
-
Is it better to not allow Google to index my Tumblr Blog?
Currently using a subdomain for my blog via Tumblr. In my SEO reports I see a lot of errors, mostly from the Tumblr blog. Made changes so there are unique titles and tags. With so many errors, I am wondering if it is best to just not allow it to be indexed via the Tumblr control panel. It certainly is doing a great job with engagement and social network follows, but I'm starting to wonder if and how much it is penalizing my domain. Appreciate your input. By the way, this theme is not Flash; it's a very basic single-column theme for the content...
-
Amount of pages indexed for classified (number of pages for the same query)
I've noticed that classified sites usually have lots of pages indexed, and that's because for each query/kw they index the first 100 result pages; normally they have 10 results per page. As an example, imagine the site www.classified.com: for the query/kw "house for rent new york" there is the page www.classified.com/houses/house-for-rent-new-york, and the "index" is set for the first 100 SERP pages, so: www.classified.com/houses/house-for-rent-new-york www.classified.com/houses/house-for-rent-new-york-1 www.classified.com/houses/house-for-rent-new-york-2 ...and so on. Wouldn't it be better to index only the 1st result page? I mean, in the first 100 pages lots of ads are very similar, so why should Google be happy indexing lots of similar pages? Could Google penalize this behaviour? What are your suggestions? Many thanks in advance for your help.