Site De-Indexed except for Homepage
-
Hi Mozzers,
Our site has suddenly been de-indexed from Google and we don't know why. All pages are de-indexed in Google Webmaster Tools (except for the homepage and sitemap), starting after 7 September. Please see the attached screenshot showing this:
- 7 Sept 2014 - 76 pages indexed in Google Webmaster Tools
- 28 Sept until current - 3-4 pages indexed in Google Webmaster Tools including homepage and sitemaps.
Site is: (removed)
As a result all rankings for child pages have also disappeared in Moz Pro Rankings Tracker. Only homepage is still indexed and ranking.
It seems like a technical issue is blocking the site. I checked robots.txt, noindex/nofollow tags, and canonicals, and ran a site crawl for 404 errors, but can't find anything. The site is online and accessible. No warnings or errors appear in Google Webmaster Tools.
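For anyone running the same checks, here is a minimal sketch of how the noindex checks can be scripted. The function and sample HTML below are illustrative, not from the thread; in practice you would feed it the `X-Robots-Tag` response header and page body fetched from your own URLs.

```python
import re

def noindex_signals(x_robots_header: str, html: str) -> list[str]:
    """Return the reasons a page would be excluded from the index, if any."""
    reasons = []
    # An X-Robots-Tag: noindex response header blocks indexing.
    if "noindex" in x_robots_header.lower():
        reasons.append("X-Robots-Tag header contains noindex")
    # So does <meta name="robots" content="...noindex...."> in the page head.
    for m in re.finditer(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.I):
        if "noindex" in m.group(0).lower():
            reasons.append("meta robots tag contains noindex")
    return reasons

# Example: a page carrying a noindex meta tag.
html = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
print(noindex_signals("", html))              # ['meta robots tag contains noindex']
print(noindex_signals("noindex", "<html>"))   # ['X-Robots-Tag header contains noindex']
```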
Some recent changes: we moved from a shared to a dedicated server around 7 Sept (same host and location). Prior to the move our preferred domain was www.domain.com, WITH www. During the move, however, the host set our domain as domain.com WITHOUT the www. Running site:domain.com vs site:www.domain.com now finds pages indexed under the non-www version, but no longer under the www version. Could this be a cause of the de-indexing? Yesterday we had our host reset the domain to use www again and we resubmitted our sitemap, but there is no change yet to the indexing.
What else could be wrong?
Any suggestions appreciated. Thanks.
-
Resolved!
Thanks for your replies everyone.
The strange thing was that even though the www versions of pages did 301 to the non-www versions (I checked the headers and they were indeed 301s), all our pages had disappeared from the Google index and rankings too (except the homepage).
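For reference, a minimal sketch of how that header check can be scripted: fetch a URL without following redirects and report the raw status code and Location header. The URLs are placeholders; substitute your own pages.

```python
import urllib.request
import urllib.error

class _NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so the raw 3xx is visible."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def check_redirect(url):
    """Return (status_code, Location header) for a single request to url."""
    opener = urllib.request.build_opener(_NoRedirect)
    try:
        resp = opener.open(url)
        return resp.status, resp.headers.get("Location")
    except urllib.error.HTTPError as e:
        # urllib raises HTTPError for an unfollowed 3xx response.
        return e.code, e.headers.get("Location")

# e.g. check_redirect("http://www.example.com/page.htm")
# A permanent redirect to the non-www version would come back as
# (301, "http://example.com/page.htm"); a 302 here would be a warning sign.
```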
The resolution came after we had our host reset the domain on the server to its original www version. Within days of that change, all our de-indexed pages (the whole site) jumped back to their original ranking positions in Google under the www version and were re-indexed as if nothing had happened.
Hope this helps someone else.
-
Hi Emerald
Add both the www and non-www versions to Webmaster Tools.
Pages like http://www.toursistanbul.com/bosphoruscruises.htm are set to noindex.
There are 30+ pages indexed in Google right now.
It's a mix of www and non-www.
Webmaster Tools treats these as different sites, so you will see a drop.
Go to Webmaster Tools and set a preferred version (which is www).
If you plan to move to https, you also need to add both https versions (www and non-www).
BTW, be sure your analytics is using the new code as well.
Good luck!
-
There is nothing you can do now. You have made a mistake and fixed it. Since then you have submitted a sitemap, "fetched" the site and redirected non-www traffic to www in your htaccess... There are no other ways to speed the process up. Just sit and wait for Google crawler to fully re-crawl the site and the number of indexed pages will come back to what it was.
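For completeness, a hypothetical .htaccess sketch of the non-www to www 301 redirect mentioned above, using Apache's mod_rewrite (the domain is a placeholder; substitute your own before use):

```apache
# Permanently redirect all non-www requests to the www version.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```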
You said all rankings disappeared in Moz Tracker, but what about the actual rankings in Google search results? Have you checked that? What are WMT and GA saying about your rankings/traffic?
My gut feeling tells me your pages are still ranking as they were, but since your WMT was still set to show data for www domain you weren't seeing any... am I correct?
-
Hi,
Without digging in detail, it is always awkward to suggest possible issues - simply because there are so many possibilities.
That said, what you mentioned about the move could cause a temporary drop while Google corrects the indexation of the site. The site has gone from www.site.com to http://site.com and now back to www.site.com again. That is a lot of movement for Google to try and make sense of.
I am also guessing that because this was not a planned move from www to non-www, no 301s were implemented, which means Google would effectively see the site as having disappeared and then returned.
I would assume that the site will recover, but this can take time.
Use Webmaster Tools to 'Fetch as Google' at the root of the site and confirm the site is accessible again.
-Andy