Google Indexing
-
Hi
We have roughly 8,500 pages on our website. Google had indexed almost 6,000 of them, but now the indexed count has suddenly dropped to 45.
Are there any possible explanations for why this might be happening, and what can be done about it?
Thanks,
Priyam
-
Hi,
I am also facing a similar issue.
My website is https://infinitelabz.com. When I try to have the site crawled and indexed, it says it is not able to crawl.
-
Have you checked your robots.txt for Disallow rules (and your pages for noindex meta tags)?
Also, is your hosting reliable? I have had websites go down, which caused crawl errors and resulted in a poor index.
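One way to rule robots.txt in or out is to test it directly rather than by eye. Below is a minimal sketch using Python's standard library; the robots.txt body and URLs are made-up examples, so substitute your own:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt body -- replace with your site's actual file.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/

User-agent: Googlebot
Disallow: /tmp/
"""

def googlebot_allowed(robots_txt: str, url: str) -> bool:
    """Parse a robots.txt body and report whether Googlebot may crawl url."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("Googlebot", url)

print(googlebot_allowed(ROBOTS_TXT, "https://example.com/tmp/page"))  # blocked
print(googlebot_allowed(ROBOTS_TXT, "https://example.com/about"))     # allowed
```

Note that nofollow and noindex are meta-tag or HTTP-header directives, not robots.txt ones, so it's also worth grepping your templates for `<meta name="robots" ...>`.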
-
I have already done all of that, actually, and there is nothing unusual, so it's confusing.
-
I have tried that and it fetches them correctly.
-
The only time it's ever really hit me hard and fast like that was on Tumblr, with adult content. Once they find out about it, they flip the robots.txt hide switch and you're burnt, lol.
But yeah, like Taryn suggested, go into Google Webmaster Tools and have a look around the property and all the options, starting from the messages/mailbox within Webmasters.
-
Have you checked your traffic and rankings? This could just be an issue with index reporting. If traffic and rankings are stable, there is no need to worry.
If, however, traffic has declined, there is certainly an issue. Check:
- Have you received a penalty?
- Use the site: command to see which URLs are actually showing in Google's index.
- Use Fetch and Render to see how Google sees your pages.
- Run a crawl of your site using Screaming Frog or a similar tool.
- Are there issues with 404, 500, or no-response pages?
- Has dev deployed anything, like moving to HTTPS without applying 301 redirects?
I think the first step is to determine whether this is an actual issue or not. If it is, run some serious analysis to determine the cause and apply a fix.
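The 404/500/no-response check above can be scripted with the standard library; a minimal sketch (the User-Agent string is an illustrative placeholder):

```python
from urllib.error import HTTPError, URLError
from urllib.request import Request, urlopen

def status_of(url: str, timeout: float = 10.0) -> int:
    """Return the HTTP status code for url, or 0 if there was no response."""
    try:
        req = Request(url, headers={"User-Agent": "index-audit/0.1"})
        with urlopen(req, timeout=timeout) as resp:
            return resp.status
    except HTTPError as err:
        return err.code        # 404, 500, ... still carry a status code
    except URLError:
        return 0               # DNS failure or timeout: no response at all

def problem_urls(results: dict) -> list:
    """Given a {url: status} mapping, return the URLs that look broken."""
    return sorted(u for u, code in results.items() if code == 0 or code >= 400)
```

Run `status_of` over your sitemap URLs and anything that comes back from `problem_urls` deserves a look before blaming the index count.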
Many thanks,
-
Hi,
Have you tried to fetch your pages to see if Google can read them?
See: https://support.google.com/webmasters/answer/6066468?hl=en
It is a good place to start.
BTW: I ran into something similar with a home-built CMS that Google, for some reason, didn't like.
Related Questions
-
Why do SEO agencies ask for access to our Google Search Console and Google Tag Manager?
What do they need GTM for? And what is the use case for setting up Google Search Console?
-
How can I make a list of all URLs indexed by Google?
I started working for this eCommerce site 2 months ago, and my SEO site audit revealed a massive spider trap. The site should have around 3,500 pages, but Google has over 30K pages in its index. I'm trying to find an effective way of making a list of all URLs indexed by Google. Anyone? (I basically want to build a sitemap with all the indexed spider-trap URLs, then set up 301s on those, then ping Google with the "defective" sitemap so they can see what the site really looks like and remove those URLs, shrinking the site back to around 3,500 pages.)
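The "defective" sitemap described above can be generated from any list of URLs (for example, an export from a Screaming Frog crawl); a minimal sketch:

```python
from xml.sax.saxutils import escape

def build_sitemap(urls) -> str:
    """Serialize a list of URLs as a minimal sitemap.xml document."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

# e.g. write build_sitemap(trap_urls) to /sitemap-removals.xml
```

The `escape` call matters: query-string URLs contain `&`, which must become `&amp;` for the sitemap to be valid XML.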
-
Blocking Certain Site Parameters from Google's Index - Please Help
Hello. We recently used Google Webmaster Tools in an attempt to block certain parameters on our site from showing up in Google's index. One of our site parameters is essentially for user location and accounts for over 500,000 URLs. This parameter does not change page content in any way, and there is no need for Google to index it. We edited the parameter in GWT to tell Google that it does not change site content and not to index it. However, after two weeks, all of these URLs are still definitely getting indexed. Why? Maybe there's something we're missing here, or perhaps there is another way to do this more effectively. Has anyone else run into this problem? The path we used to implement this action: Google Webmaster Tools > Crawl > URL Parameters. Thank you in advance for your help!
-
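Worth noting: GWT's parameter handling is only a hint to Google. A stronger option is a canonical tag on every parameterized page pointing at the parameter-free URL; a sketch, assuming a `location` parameter and placeholder URLs:

```html
<!-- Served on https://example.com/listings?location=ny and every other
     location variant; points Google at the one canonical page. -->
<link rel="canonical" href="https://example.com/listings" />
```

A robots.txt rule such as `Disallow: /*?location=` would stop the crawling instead, but already-indexed URLs can linger in the index once Google can no longer crawl them.
-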
Why are bit.ly links being indexed and ranked by Google?
I did a quick search for "site:bit.ly" and it returns more than 10 million results. Given that bit.ly links are 301 redirects, why are they being indexed in Google and ranked according to their destination? I'm working on a similar project to bit.ly and I want to make sure I don't run into the same problem.
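If you want to verify the redirect behavior yourself rather than take it on faith, you can inspect the raw response without following it; a minimal stdlib sketch (the short URL in the comment is a placeholder):

```python
from urllib.error import HTTPError
from urllib.request import HTTPRedirectHandler, Request, build_opener

class NoRedirect(HTTPRedirectHandler):
    """Refuse to follow redirects so the 3xx response itself is visible."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # urllib then raises HTTPError carrying the 3xx code

def redirect_info(url: str, timeout: float = 10.0):
    """Return (status, Location header) for url, without following redirects."""
    opener = build_opener(NoRedirect())
    try:
        resp = opener.open(Request(url), timeout=timeout)
        return resp.status, None              # not a redirect at all
    except HTTPError as err:
        return err.code, err.headers.get("Location")

# e.g. redirect_info("https://bit.ly/xxxx") -> (status_code, destination_url)
```

If the shortener answers 301 with a Location header, any indexing of the short URL is Google's choice, not a misconfiguration on the shortener's side.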
-
Incorrect cached page indexing in Google while correct page indexes intermittently
Hi, we are a South African insurance company. We have a page http://www.miway.co.za/midrivestyle which 301-redirects to http://www.miway.co.za/car-insurance. The problem is that the former page is ranking in the index rather than the latter. The latter page does occasionally index in the same position, but rarely. This is primarily for search phrases like "car insurance" and "car insurance quotes". The ranking was knocked down the index by Penguin 2.0; it was not ranking at all, but we have managed to recover to position 12/13. This anomaly has only been occurring since the recovery. The correct page does index for other search terms like "insurance for car". Your help would be appreciated, thanks!
-
Google Phone Numbers
What process gets a company's phone number to show as listing "A" on Google Maps? Google displays that company's phone number on the map first, so it would be beneficial to get that position. Is there a sub-category of SEO that does this? Thanks in advance!
-
Need to duplicate the index for Google in a way that's correct
Usually duplicated content is a quick fix, but I find myself in a little predicament: I have a network of career-oriented websites in several countries. The problem is that for each country we use a "master" site that aggregates all ads, working as a portal. The smaller niche sites carry some of the same info as the "master" sites, since it is relevant to those sites, and the "master" sites have naturally gained the index for the majority of these ads. So the main issue is how to maintain the ads on the master sites and still get the niche sites' content indexed in a way that doesn't break Google's guidelines. I can of course fix this in various ways, ranging from iframes (no index, though) and bullet listing to small adjustments to the headers and titles of the content on the niche sites, but it feels like I'm cheating if I go down that path. So the question is: has someone else stumbled upon a similar problem? If so, how did you fix it?
-
Is 404'ing a page enough to remove it from Google's index?
We set some pages to 404 status about 7 months ago, but they are still showing in Google's index (as 404s). Is there anything else I need to do to remove them?
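A 404 can linger in the index for a long time. If the pages should disappear faster, one commonly recommended option is to serve a 410 Gone instead, which Google treats as a stronger removal signal; a minimal nginx sketch (the path pattern is a placeholder):

```nginx
# Retired pages: tell crawlers they are gone for good, not just missing.
location ~ ^/old-section/ {
    return 410;
}
```

Removing the URLs from your sitemap and using the URL Removal tool in Search Console can also speed things up.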