Home Page Deindexed Only at Google after Recovering from Hack Attack
-
Hello, I'm facing a strange issue. My WordPress blog hghscience[dot]com was hacked: the index.php file had been changed and was showing a page with a hacked message, and an index.html file had been added to the cPanel account. All pages were showing the same message. When I found it, I replaced index.php with the default WordPress index.php file and deleted index.html. I could not find any other file that looked suspicious.
The site started working fine and was indexed again, but the cached version was still the hacked page. I used Webmaster Tools to fetch & render it as Googlebot and submitted it for indexing. After that I noticed the home page got deindexed by Google; all other pages are indexing like before. The site was hacked around 30th July and I fixed it on 1st Aug. Since then the home page is not getting indexed. I have tried fetch & submit multiple times via Google Webmaster Tools, but no luck as of now.
One more thing I noticed: when I search info:mysite.com on Google from India, it shows some other hacked site (www.whatsmyreferer.com/), but the same info:mysite.com searched from the US shows a different hacked site (sigaretamogilev.by). However, when I search "mysite.com", my home page appears in Google search, yet when I check the cached URL it shows the hacked sites mentioned above.
As far as I can tell, I have checked all SEO plugins and the homepage code, and I can't find anything that would keep the homepage from being indexed.
PS: Webmaster Tools has received no warnings for a penalty or malware.
I also noticed that I had disallowed the index.php file via robots.txt earlier, but I have now removed even that. (Screenshots attached.)
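For anyone in the same situation, one concrete thing worth ruling out after a hack is a leftover noindex signal, either a robots meta tag in the markup or an X-Robots-Tag response header. A minimal sketch of such a check (hypothetical helper, not tied to any particular plugin; in practice you would pass in the homepage HTML and response headers fetched with urllib or similar):

```python
import re

def find_noindex(html, headers):
    """Return the noindex signals found in a page's markup and headers."""
    reasons = []
    # <meta name="robots" content="...noindex..."> anywhere in the markup
    if re.search(r'<meta[^>]+name=["\']robots["\'][^>]*noindex', html, re.I):
        reasons.append("meta robots noindex tag")
    # X-Robots-Tag: noindex sent as an HTTP response header
    for name, value in headers.items():
        if name.lower() == "x-robots-tag" and "noindex" in value.lower():
            reasons.append("X-Robots-Tag header")
    return reasons

# Inline example strings for illustration only.
print(find_noindex('<meta name="robots" content="noindex,follow">', {}))
# ['meta robots noindex tag']
```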
-
The .htaccess file has nothing but:
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress
Installed Plugins
Yoast SEO, Google XML Sitemaps, Akismet, Udinra All Image Sitemap, Social Share Bar (Digg Digg Alternative), Jetpack by WordPress.com, AuthorH Review.
Apart from Yoast, it seems nothing could block the site, and the Yoast settings are fine: I've only disabled indexing of tags, subpages, and author archives.
I guess the problem is something else.
-
Hi Ankit,
Though I have checked the pages you're serving to bots, could you please have a look at your .htaccess file once? Does it contain something like:
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (google|yahoo) [OR]
RewriteCond %{HTTP_REFERER} (google|aol|yahoo)
Do you have a copy of your code in GitHub, Bitbucket, or any other source-control tool? If yes, please scan the last few commits thoroughly.
You can make a list of recently installed plugins, remove them one by one, and submit your home page URL to GWT to fetch a fresh copy each time. I'm not sure what the issue is here; let's do some trial and error to dig a bit deeper.
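A quick way to scan an .htaccess dump for the kind of conditional rewrite described above is a small script. This is a hedged sketch; the sample rules below are illustrative, not taken from the real site:

```python
import re

# Flag RewriteCond lines that key off a search engine's user agent or
# referer, the classic pattern used by hacked-site cloaking.
SUSPICIOUS = re.compile(
    r"RewriteCond\s+%\{(HTTP_USER_AGENT|HTTP_REFERER)\}.*"
    r"(google|bing|yahoo|aol)",
    re.I,
)

def suspicious_lines(htaccess_text):
    """Return stripped lines that rewrite based on search-engine UA/referer."""
    return [line.strip() for line in htaccess_text.splitlines()
            if SUSPICIOUS.search(line)]

sample = """RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (google|yahoo) [OR]
RewriteCond %{REQUEST_FILENAME} !-f
"""
print(suspicious_lines(sample))
# ['RewriteCond %{HTTP_USER_AGENT} (google|yahoo) [OR]']
```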
-
Hey Alan,
Do let me know if you find some solution or identify the problem.
-
That's the thing: I'm not able to find any good information on what the next step should be. But I'm still checking random things with some hope.
-
The DomainTools domain report shows no further info that could be helpful, leaving me at a complete loss as to what else to check.
-
More info.
Because Nitin was able to run a ping and traceroute without problem, I went to DomainTools.com, the world's leading resource for forensic digital investigative research. I use it whenever I'm doing investigations for the expert witness work I do.
When I ran the domain there, it had a screen capture of the home page from June. So I submitted a refresh, and it came back as unable to provide a screenshot of the home page.
While not a smoking gun, this further clouds my trust in whether the domain is actually functioning properly in the hosting environment, as I originally suspected it might not be.
I will run a deeper test to see if I can get more information, but I wanted to post this update because I believe it is relevant.
-
Well, this is probably one of the most interesting issues an SEO can come across. Google is showing different cached versions in different countries. That's strange to me too. Is that a usual thing?
-
Nitin
Thanks for doing that. Now I'm stumped: I've never had Pingdom fail before on both ping and traceroute. And I now wonder if it's a non-issue, or somehow part of the confused mess that Ankit referenced.
-
That's right, it's showing different cached versions in different countries. I just checked for the US here. Screenshot attached.
-
I don't think the disallowed index.php was the issue. I took the suggestion and removed it, but many sites disallow index.php via robots.txt to avoid a duplicate-content issue between site.com and site.com/index.php.
Here is an example: http://www.shoutmeloud.com/robots.txt
Still, I did that about 10-12 days ago, fetched & submitted to index, and also put in a rendering request.
Attaching current Screenshot of last rendering request.
I think it's some other issue. What's your view on info:site.com showing other hacked sites? How is this happening, and why are the sites changing? It's different in India and different in the US.
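On the robots.txt point: whether a given Disallow rule actually blocks Googlebot from a URL can be checked offline with Python's urllib.robotparser. This sketch parses the rules from a string for illustration; against the live site you would use set_url() and read() instead:

```python
from urllib import robotparser

# The rule that was previously in place on the site.
rules = """User-agent: *
Disallow: /index.php
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The disallow rule blocks /index.php but not the bare home page URL.
print(rp.can_fetch("Googlebot", "https://example.com/index.php"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/"))           # True
```

So a Disallow on /index.php alone would not have kept the home page URL itself out of the index, which is consistent with the problem lying elsewhere.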
-
Ping and traceroute worked for me when I tried using my terminal (screenshot is attached).
Well, I agree that the problem is actually bigger. If you look at its cached version on Google, it was last cached on 16th Aug, i.e. after the index.php/index.html issue was fixed by the admin (another screenshot attached).
I tried to view this page as Googlebot as well and couldn't find the issue (I wanted to check it for cloaking, too).
-
UPDATE TO MY ORIGINAL COMMENT
I initially found a problem running ping and traceroute tests through Pingdom.com: both returned an "invalid host name" error, something I had not previously seen for ping and traceroute simultaneously.
Nitin (see his comment below) ran a similar test locally and found both to be okay, though he has other thoughts.
I just want to clarify that my original finding may not be a key to this issue, though I would still like to understand why my test came back that way...
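For what it's worth, Pingdom's "invalid host name" result can be cross-checked locally with a plain DNS lookup; if the name fails to resolve from your own network too, the problem is DNS rather than the testing tool. A minimal sketch (the hostnames here are stand-ins for the real domain):

```python
import socket

def resolves(hostname):
    """True if the name resolves to an address from this machine."""
    try:
        socket.gethostbyname(hostname)
        return True
    except socket.gaierror:
        return False

# "localhost" should always resolve; a reserved .invalid name never should.
print(resolves("localhost"))             # True
print(resolves("no-such-host.invalid"))  # False
```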
-
You said you removed index.php from the robots.txt. I just wanted to know when that happened, because after the removal it usually takes some time to get back into the index (the crawler needs to recrawl the website accordingly).
My advice is to resubmit your robots.txt and updated sitemap.xml to the Webmaster console and wait for the next crawl; this should fix it.
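Before resubmitting, it may also be worth sanity-checking that the home page URL is actually listed in the sitemap being submitted. A hedged sketch using a toy sitemap (swap in the markup fetched from the real /sitemap.xml):

```python
import xml.etree.ElementTree as ET

# Toy sitemap for illustration; a real one would be fetched from the site.
sitemap = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/some-post/</loc></url>
</urlset>"""

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
locs = [loc.text for loc in ET.fromstring(sitemap).findall("sm:url/sm:loc", ns)]

print("https://example.com/" in locs)  # True
```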
Hope this helps!
-
Just sent the screenshot. Nothing has helped so far. It's quite strange that info:domain.com is now showing some other hacked URL. Screenshot attached.
-
It was quite strange for me as well. I've just attached a screenshot after fetching one more time.
One more thing I noticed: info:mysite.com is now showing some other hacked domain. I'm not sure how or why this is happening.
Sorry for the delay in replying; I was not getting email updates, so I thought no one had answered my question.
-
Hi Ankit! Did Nitin's suggestions help at all? And are you able to share the screenshot he asked for?
-
Check the following; maybe it will help you resolve the issue:
https://moz.com/community/q/de-indexed-homepage-in-google-very-confusing
https://moz.com/community/q/site-de-indexed-except-for-homepage
-
That's really strange. Could you please share a screenshot of what happens when you try to fetch it as Google in GWT?
Related Questions
-
Will redirecting a logged in user from a public page to an equivalent private page (not visible to google) impact SEO?
Hi, We have public pages that can obviously be visited by our registered members. When they visit these public pages + they are logged in to our site, we want to redirect them to the equivalent (richer) page on the private site e.g. a logged in user visiting /public/contentA will be redirected to /private/contentA Note: Our /public pages are indexed by Google whereas /private pages are excluded. a) will this affect our SEO? b) if not, is 302 the best http status code to use? Cheers
Technical SEO | | bernienabo0 -
Will Google still ignore the second instance of anchor text on a page if it has an H2 tag on it?
We have a page set up that has anchor text with header tags. There is an instance where the same anchor text appears on the page twice, linking to the same page, and I know that Google will ignore the second instance. But the second instance also had an H2 tag (which I removed and put on the first instance of the anchor text, even though it's smaller). Is this good practice?
Technical SEO | | AliMac260 -
Google Published Date - Does Google Lie?
Here's the scenario. I create a page called "ABC" and it gets published and found by Google, let's say on the 13th of April. On the 15th (or 14th) I decide to update the URL, page title, and content (redirecting the old URL to the new one as well). Will Google still show this page as being published on the 13th? Or would it update the publish date according to the new URL? Greg
Technical SEO | | AndreVanKets0 -
Does Google Still Pass Anchor Text for Multiple Links to the Same Page When Using a Hashtag? What About Indexation?
Both of these seem a little counter-intuitive to me, so I want to make sure I'm on the same page. I'm wondering if I need to add "#"s to my internal links when the page I'm linking to is already: a.) in the site's navigation b.) in the sidebar. More specifically, in your experience, do the search engines only give credit to (or mostly give credit to) the anchor text used in the navigation and ignore the anchor text used in the body of the article? I've found (in here) a couple of folks mentioning that content after a hashtagged link isn't indexed. Just so I understand this... a.) if I were to use a hashtag at the end of a link as the first link in the body of a page, does this mean that the rest of the article won't be indexed? b.) if I use a table of contents at the top of a page and link to places within the document, then only the areas of the page up to the table of contents will be indexed/crawled? Thanks ahead of time! I really appreciate the help.
Technical SEO | | Spencer_LuminInteractive0 -
Why do they rank the home page?
We are trying to rank for the keyword "motorcycle parts". We have moved up to page 2 over the past couple of months; however, Google is ranking our home page, not our http://www.rockymountainatvmc.com/s/49/61/Motorcycle-Parts page, which is for motorcycle parts. We are working on internal linking to help send the right signals, too. Any other thoughts? (We have new content written to put in as well; we just have to wait for an issue to be fixed before we can add it.)
Technical SEO | | DoRM0 -
Why is Google not deindexing pages with the meta noindex tag?
On our website www.keystonepetplace.com we added the meta noindex tag to category pages that were created by the sorting function. Google no longer seems to be adding more of these pages to the index, but the pages that were already added are still in the index when I check via site:keystonepetplace.com. Here is an example page: http://www.keystonepetplace.com/dog/dog-food?limit=50 How long should it take for these pages to disappear from the index?
Technical SEO | | JGar-2203710 -
Top pages give "page not found"
A lot of my top pages point to images in a gallery on my site. When I click on the URL under the name of the jpg file, I get a "page not found" error. For instance this link: http://www.fastingfotografie.nl/architectuur-landschap/single-gallery/10162327 Is this a problem? Thanks. Thomas.
Technical SEO | | thomasfasting0 -
How can I tell Google, that a page has not changed?
Hello, we have a website with many thousands of pages. Some of them change frequently, some never. Our problem is that Googlebot is generating way too much traffic: half of our page views are generated by Googlebot. We would like to tell Googlebot to stop crawling pages that never change. This one, for instance: http://www.prinz.de/party/partybilder/bilder-party-pics,412598,9545978-1,VnPartypics.html As you can see, there is almost no content on the page, and the picture will never change. So I am wondering if it makes sense to tell Google that there is no need to come back. The following header fields might be relevant. Currently our webserver answers with these headers:
Cache-Control: no-cache, must-revalidate, post-check=0, pre-check=0, public
Pragma: no-cache
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Does Google honor these fields? Should we remove no-cache, must-revalidate, and Pragma: no-cache, and set Expires to, e.g., 30 days in the future? I also read that a webpage that has not changed should answer with 304 instead of 200. Does it make sense to implement that? Unfortunately, that would be quite hard for us. Maybe Google would then also spend more time on pages that actually changed, instead of wasting it on unchanged pages. Do you have any other suggestions for how we can reduce Googlebot's traffic on irrelevant pages? Thanks for your help, Cord
Technical SEO | | bimp
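On the 200-vs-304 point in the question above: the server-side decision is essentially a date comparison between the client's If-Modified-Since request header and the page's Last-Modified time. A minimal, framework-agnostic sketch of that logic (header values assumed to be standard HTTP dates):

```python
from email.utils import parsedate_to_datetime

def status_for(if_modified_since, last_modified):
    """Return 304 if the client's cached copy is still current, else 200."""
    if not if_modified_since:
        return 200  # client sent no validator, serve the full page
    try:
        cached = parsedate_to_datetime(if_modified_since)
        current = parsedate_to_datetime(last_modified)
    except (TypeError, ValueError):
        return 200  # unparsable date, fall back to a full response
    return 304 if cached >= current else 200

print(status_for("Mon, 01 Jan 2024 00:00:00 GMT",
                 "Mon, 01 Jan 2024 00:00:00 GMT"))  # 304
```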