Why is this page de-indexed?
-
I have dropped out for all of my first-page keywords for this page:
https://www.key.co.uk/en/key/dollies-load-movers-door-skates
Can anyone see an issue?
I'm trying to find one...
We did just migrate to HTTPS, but other areas have no problems.
-
Hi
Yes, there was an issue with my rank tracking software - phew
Thank you
-
OK great thanks for your help
I'll keep an eye on everything
-
You are 5th for 'Heavy Duty Dolly'.
I don't see what the problem is - the page is doing really well.
Regards
Nigel
-
This is not an issue.
It's totally normal to have some HTTP pages left in the index, and even more common if the migration is recent. Don't be afraid of this, Becky.
-
Yep, give Google a little time to re-crawl the whole new site. I'd give it nearly a month before considering that Google has completely seen the new version of the site, always checking the number of indexed pages in GSC and the results appearing for a site: search.
Being out of the top 100 is a clue that you are in the middle of the transition. And for the keyword 'Heavy Duty Dolly' I do see your page; check the attached image.
Best of luck.
GR.
-
Hi Becky
I just searched in a normal browser, so it could be Google skewing the results for you.
To check the indexed pages:
site:key.co.uk inurl:http:
Regards
Nigel
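One way to act on a URL export (for example from GSC's coverage report, or a list scraped from the site: search above) is to split it into HTTP vs HTTPS counts. A minimal Python sketch; the sample URLs are made up for illustration:

```python
# Count how many URLs in an exported list are still on HTTP vs HTTPS.
# The sample list below is hypothetical, not a real crawl of key.co.uk.

def count_by_scheme(urls):
    """Return a dict mapping URL scheme ('http'/'https') to a count."""
    counts = {"http": 0, "https": 0}
    for url in urls:
        scheme = url.split("://", 1)[0].lower()
        if scheme in counts:
            counts[scheme] += 1
    return counts

sample = [
    "http://www.key.co.uk/en/key/dollies-load-movers-door-skates",
    "https://www.key.co.uk/en/key/sack-trucks",
    "https://www.key.co.uk/en/key/pallet-trucks",
]
print(count_by_scheme(sample))  # {'http': 1, 'https': 2}
```

Running this over the full export each week is a cheap way to watch the HTTP share shrink as Google re-crawls.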
-
How did you find these http pages?
I did a search in Incognito, but I couldn't see anything myself.
I'll try again, thanks!
-
Hi
Thanks for this. Yes, I've checked in Google Search Console; I can find the page in the indexed pages, but the number of indexed pages is a lot lower since the migration:
HTTP: 13,013 indexed, 12,891 blocked
HTTPS: 2,814 indexed, 5,713 blocked by robots.txt
Do I just wait?
Keyword examples for that page would be 'Heavy Duty Dolly' and 'load moving dolly'.
We were position 1, now out of the top 100.
We're working on page speed/load time for the whole site, but why would it affect that one page so badly?
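As a side note, robots.txt-blocked counts like those above can be sanity-checked locally with Python's urllib.robotparser against a saved copy of the robots.txt. The rules and URLs below are hypothetical, not key.co.uk's actual file:

```python
# Check locally whether specific URLs would be blocked by a robots.txt.
# Both the rules and the URL list are invented for illustration.
from urllib import robotparser

rules = """
User-agent: *
Disallow: /en/key/checkout
Disallow: /search
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)  # parse a local copy instead of fetching over the network

for url in [
    "https://www.key.co.uk/en/key/dollies-load-movers-door-skates",
    "https://www.key.co.uk/en/key/checkout",
]:
    print(url, "allowed:", rp.can_fetch("Googlebot", url))
```

If a page you expect to rank comes back as disallowed here, the robots.txt is the first thing to fix.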
-
Hi Becky,
Without knowing the relevant search terms, there's almost no analysis to be done.
I've noticed that the page took a very long time to load; here is a GTmetrix report. Remember that migrating to HTTPS makes Google re-crawl all of your website's pages and re-evaluate all ranking factors.
My advice is to wait a little longer; it might take a few weeks. Also, always monitor the Google Search Console profile, as there could be a message there. Take a look at the indexed pages; it could also be that fewer pages are indexed now than before the migration.
Hope I've helped.
Best of luck.
GR.
-
Hi Becky
Load Movers - Pos 3
Wooden dollies - Pos 1
Maybe open an incognito browser with history cleared.
I don't see a problem
Regards
Nigel
PS: You still have 748 HTTP pages indexed, but that's only 10% of the total.
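For those remaining HTTP pages, a small sketch for auditing the migration: build the expected HTTPS target for each old URL, then flag any crawled redirect whose observed Location header doesn't match. The (url, location) pairs below are made-up examples, not real crawl data:

```python
# Audit HTTP -> HTTPS redirects: each old URL should 301 to its exact
# HTTPS twin. Pairs that redirect elsewhere (e.g. to the homepage) lose
# the page-level ranking signals and are worth fixing.

def https_target(url):
    """Rewrite an http:// URL to its https:// counterpart; leave others unchanged."""
    if url.startswith("http://"):
        return "https://" + url[len("http://"):]
    return url

def audit_redirects(pairs):
    """Given (old_http_url, observed_location) pairs, return the mismatching pairs."""
    return [(old, seen) for old, seen in pairs if seen != https_target(old)]

pairs = [
    ("http://www.key.co.uk/en/key/dollies-load-movers-door-skates",
     "https://www.key.co.uk/en/key/dollies-load-movers-door-skates"),
    ("http://www.key.co.uk/en/key/sack-trucks",
     "https://www.key.co.uk/"),  # redirects to the homepage, not its HTTPS twin
]
print(audit_redirects(pairs))
```

The observed Location values would come from whatever crawler you already use (Screaming Frog exports, for instance); this only checks the mapping logic.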