Google Indexing
-
Hi
We have roughly 8,500 pages on our website. Google had indexed almost 6,000 of them, but the indexed count has now suddenly dropped to 45.
Are there any possible explanations for why this might be happening, and what can be done about it?
Thanks,
Priyam
-
Hi,
I am also facing a similar issue.
My website is https://infinitelabz.com. When I ask Google to crawl and index the site, it reports that it is not able to crawl it.
-
Have you checked the robots.txt for Disallow rules (and your pages for noindex/nofollow tags)?
Also, is your hosting reliable? I have had websites go down, which causes crawl errors and can leave you with a poor index.
-
I have already checked all of that, actually, and there is nothing unusual. So it's confusing.
-
I have tried that and it fetches them correctly.
-
The only time it's ever really hit me hard and fast like that is on Tumblr, with adult content. Once they find out about it, they flip the robots.txt hide switch and you're burnt, lol.
But yeah, as taryn suggested, go into Google Webmaster Tools and have a look around the property and all the options, starting with the messages/inbox within Webmaster Tools.
-
Have you checked your traffic and rankings? This could just be an issue with index reporting; if traffic and rankings are stable, there is no need to worry.
If, however, traffic has declined, there is certainly an issue. Check:
- Have you received a penalty?
- Use the site: command to see which URLs are actually showing in Google's index.
- Use Fetch and Render to see how Google sees your pages.
- Run a crawl of your site using Screaming Frog or another such tool.
- Are there issues with 404, 500, or no-response pages?
- Has dev deployed anything like moving to HTTPS without applying 301 redirects?
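The 404/500/no-response check above is easy to script. A minimal sketch using only Python's standard library (the example.com URLs are placeholders for your own page list; note that urlopen follows redirects, so a live fetch reports the final status):

```python
# Minimal sketch: bucket HTTP status codes the way you'd triage a crawl
# report. bucket() is a plain classifier; check() does live fetches and
# is illustrative only -- swap in your own URL list.
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def bucket(code):
    """Classify a status code for crawl triage."""
    if 200 <= code < 300:
        return "ok"
    if code in (301, 302, 307, 308):
        return "redirect"
    if 400 <= code < 500:
        return "client-error"   # e.g. 404s that drop pages from the index
    return "server-error"       # e.g. 500s during hosting outages

def check(urls):
    results = {}
    for url in urls:
        try:
            # urlopen follows redirects, so this is the final status code
            results[url] = bucket(urlopen(url).status)
        except HTTPError as e:
            results[url] = bucket(e.code)
        except URLError:
            results[url] = "no-response"
    return results

# check(["https://example.com/", "https://example.com/old-page"])
```

Anything landing in "client-error", "server-error", or "no-response" at scale is a likely cause of pages dropping out of the index.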
I think the first step is to determine whether this is an actual issue at all. If it is, run some serious analysis to determine the cause and apply a fix.
Many thanks,
-
Hi,
Have you tried to fetch your pages to see if Google can read them?
See: https://support.google.com/webmasters/answer/6066468?hl=en
It is a good place to start.
BTW: I have run into something similar with a home-built CMS that Google, for some reason, didn't like.
Related Questions
-
Google Indexing Stopped
Hello Team, A month ago, Google was indexing more than 235,000 pages; now that has dropped to 11K. I have cross-checked almost everything, including content, backlinks, and schemas. Everything looks fine except the server response time: being a heavy website (or perhaps due to server issues), the site has an average loading time of 4 seconds. I would also like to mention that I have been using the same server since I started working on the website, and, as said above, a month ago the indexed count was more than 235,000, now reduced to 11K; nothing changed. I have tried my level best researching this, so if you have had any such experiences, please do share your valuable solutions to this problem.
Intermediate & Advanced SEO | jeffreyjohnson
-
Should I use noindex or robots to remove pages from the Google index?
I have a Magento site and just realized we have about 800 review pages indexed. The /review directory is disallowed in robots.txt, but the pages are still indexed. From my understanding, robots.txt means Google will not crawl the pages, BUT the pages can still be indexed if they are linked from somewhere else. I can add the noindex tag to the review pages, but while they are disallowed they won't be crawled, so Google won't see the tag. https://www.seroundtable.com/google-do-not-use-noindex-in-robots-txt-20873.html Should I remove the robots.txt disallow and add the noindex? Or just add the noindex on top of what I already have?
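The underlying distinction is visible with Python's standard-library robots.txt parser: a Disallow rule only controls crawling, not indexing, which is why externally linked /review/ URLs can stay in the index. A minimal sketch (the Disallow rule mirrors the one described above; example.com is a placeholder):

```python
# Sketch: robots.txt Disallow blocks crawling, not indexing. If Google
# can't crawl a page, it also can't see a noindex tag on that page --
# which is why the Disallow usually has to be lifted for noindex to work.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /review/",
])

# Crawling /review/ URLs is blocked...
blocked = not rp.can_fetch("Googlebot", "https://example.com/review/123")
# ...but a blocked URL discovered via an external link can still appear
# in the index, because robots.txt was never an indexing directive.
```

So the usual sequence is: add noindex, remove the Disallow, let the pages be recrawled and dropped, and only then (optionally) re-block them.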
Intermediate & Advanced SEO | Tylerj
-
Why is a site no longer being indexed by Google after HTTPS switch?
A client of ours recently had a new site built and made the switch to HTTPS. We made sure to redirect all of the HTTP pages to HTTPS and submitted a new sitemap to Google. GWT says the sitemap was submitted successfully, but only 4 pages have been indexed where there should be over 2,000. This has led to a plummet in organic traffic, and we can't find the issue. Has anyone else had issues (or success) with an HTTPS switch and knows how to fix this problem?
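One quick check worth scripting after a migration like this is scanning the submitted sitemap for leftover http:// URLs, since a sitemap that still lists HTTP addresses can depress indexing of the HTTPS property. A minimal sketch (the inline XML is a stand-in for the real sitemap file):

```python
# Sketch: scan a sitemap for http:// URLs left over after an HTTPS
# migration. The inline XML below is a placeholder for the real file.
import xml.etree.ElementTree as ET

SITEMAP_XML = """<?xml version="1.0"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>http://example.com/legacy-page</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def insecure_urls(xml_text):
    """Return every <loc> entry still using plain http."""
    root = ET.fromstring(xml_text)
    locs = [loc.text for loc in root.findall("sm:url/sm:loc", NS)]
    return [u for u in locs if u.startswith("http://")]

# insecure_urls(SITEMAP_XML) -> ["http://example.com/legacy-page"]
```

Any URL this flags should either be updated to its HTTPS form or removed from the sitemap entirely.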
Intermediate & Advanced SEO | ATMOSMarketing56
-
Google Plus Authorship
Situation description: I have a website called Website A. I wish to migrate a lot of the content from Website A to Website B. Website B will be on a completely different domain and environment. Authors of Website A will act as contributing authors for Website B. It is also possible that contributing authors of other websites C and D will commit to writing content on Website B. Questions:
(1) Does it make sense to create a Google+ profile under UserA@websiteA.com and link from content on Website B to that Google+ profile?
(2) Does AuthorRank affect PageRank? If yes, and I take the above approach, would Website A or Website B be affected, given that the content writers of Website A are contributing to Website B?
(3) Is it OK for UserA to have a corporate Google+ profile if he might also have another Google+ profile under a different address? I always think it makes sense for a Google+ profile to exist at the employee level and another at the personal level.
(4) If an employee leaves the company, do I leave his/her Google+ profile alive? Given that no more content would be published under that profile, would that negatively affect AuthorRank over time?
(5) Another interesting observation is that USA Today, CNN, etc. do not use authorship; no authors link to their Twitter or Google+ profiles. Shouldn't they be doing this for AuthorRank, or is AuthorRank not that important? Thank you
Intermediate & Advanced SEO | seo1212
-
Google is indexing wordpress attachment pages
Hey, I have a bit of a problem/issue which is freaking me out a bit; I hope you can help me. If I do a site:www.somesitename.com search in Google, I see that Google is indexing my attachment pages. I want to redirect attachment URLs to the parent post and stop Google from indexing them. I have used different redirect plugins in the hope that I could fix it myself, but the plugins don't work. I get an error: "too many redirects occurred trying to open www.somesitename.com/?attachment_id=1982". Do I need to change something in my attachment.php file? Any idea what is causing this problem?
<?php get_header(); ?>
<?php
/* Run the loop to output the attachment.
 * If you want to overload this in a child theme then include a file
 * called loop-attachment.php and that will be used instead.
 */
get_template_part( 'loop', 'attachment' );
?>
Intermediate & Advanced SEO | TauriU
-
Google Not Indexing Description or correct title (very technical)
Hey guys, I am managing the site http://www.theattractionforums.com/. If you search the keyword "PUA Forums", it will be in the top 10 results; however, the title shown for the forum is "PUA Forums" rather than the one in the title tag, and no description displays at all (despite there being one in the code). Any page other than the home page that ranks shows the correct title and description. We're completely baffled! Here are some interesting bits and pieces: It shows up fine on Bing. If I go into GWT and Fetch as Googlebot, the home page comes back as "Unreachable". We previously found that it was pulling 'index.htm' before 'index.php', and this was pulling a blank page; I've fixed this in the .htaccess so it redirects, but that hasn't solved the problem. I've disallowed it from pulling the description etc. from the Open Directory with the use of meta tags; that didn't change anything. It's vBulletin and is running vBSEO. Any suggestions at all, guys? I'll be forever in the debt of anyone who can solve this; it's proving near impossible to fix. Here is the .htaccess file; it may be a part of the issue:
RewriteEngine On
DirectoryIndex index.php index.html
Redirect /index.html http://www.theattractionforums.com/index.php
RewriteCond %{HTTP_HOST} !^www.theattractionforums.com
RewriteRule (.*) http://www.theattractionforums.com/$1 [L,R=301]
RewriteRule ^((urllist|sitemap_).*.(xml|txt)(.gz)?)$ vbseo_sitemap/vbseo_getsitemap.php?sitemap=$1 [L]
RewriteCond %{REQUEST_URI} !(admincp/|modcp/|cron|vbseo_sitemap/)
RewriteRule ^((archive/)?(..php(/.)?)?)$ vbseo.php [L,QSA]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !^(admincp|modcp|clientscript|cpstyles|images)/
RewriteRule ^(.+)$ vbseo.php [L,QSA]
RewriteRule ^forum/(.*)$ http://www.theattractionforums.com/$1 [R=301,L]
Intermediate & Advanced SEO | trx
-
Should I index tag pages?
Should I exclude the tag pages? Or should I go ahead and keep them indexed? Is there a general opinion on this topic?
Intermediate & Advanced SEO | NikkiGaul
-
Can a XML sitemap index point to other sitemaps indexes?
We have a massive site that is having some issues being fully crawled due to some of our site architecture and linking. Is it possible to have an XML sitemap index point to other sitemap indexes rather than standalone XML sitemaps? Has anyone done this successfully? Based upon the description here: http://sitemaps.org/protocol.php#index it seems like it should be possible. Thanks in advance for your help!
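For reference, generating a flat sitemap index is straightforward; the open question above is only whether indexes may nest, and as far as I know Google's guidelines say a sitemap index may only reference regular sitemaps, not other index files, so a nested index may validate as XML yet still be ignored. A minimal sketch of emitting a flat index (filenames are placeholders):

```python
# Sketch: emit a flat sitemap index with the standard library.
# Hedged note: Google's guidelines appear to forbid nesting index
# files, so only regular sitemaps are listed here. Filenames are
# placeholders.
import xml.etree.ElementTree as ET

def build_sitemap_index(sitemap_urls):
    root = ET.Element(
        "sitemapindex",
        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9",
    )
    for url in sitemap_urls:
        sm = ET.SubElement(root, "sitemap")
        ET.SubElement(sm, "loc").text = url
    return ET.tostring(root, encoding="unicode")

xml_out = build_sitemap_index([
    "https://example.com/sitemap-products-1.xml",
    "https://example.com/sitemap-products-2.xml",
])
```

If the 50,000-URL / size limits are the constraint, splitting into more child sitemaps under a single index usually removes the need for nesting at all.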
Intermediate & Advanced SEO | CareerBliss