Homepage refusing to show up in Google (rest of pages fine)
-
Ah, I was wondering since they may have entirely different pricing based upon who you talk to.
-
SiteLock
-
So, on an invoice, do you or the client pay Incapsula or SiteLock?
-
Exactly, I've been told that these problems surfaced around the time the firewall was put up. I've just removed the timthumb file and I'm working on disavowing the spammy links pointing to us. I'm considering ditching SiteLock in the next few days to see if that helps at all. We've also been looking at Sucuri as a firewall option.
-
All of the header checks I've done come back with Incapsula. I don't really want to get much further into that, for a number of reasons. But if you're actually paying SiteLock, that's pretty interesting.
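Those header checks can be scripted. Below is a minimal sketch of how you might fingerprint the CDN/WAF from a response's headers; the fingerprint table is illustrative and far from exhaustive (Incapsula is commonly recognized by its `X-Iinfo` header, Cloudflare by `CF-RAY`, Sucuri by `X-Sucuri-ID`), so treat it as a starting point rather than a definitive detector:

```python
# Guess which CDN/WAF served a response by inspecting its headers.
# The fingerprints below are illustrative, not exhaustive.
WAF_FINGERPRINTS = {
    "Incapsula": [("x-iinfo", ""), ("x-cdn", "incapsula")],
    "Cloudflare": [("cf-ray", "")],
    "Sucuri": [("x-sucuri-id", "")],
}

def detect_waf(headers):
    """Return the first WAF whose fingerprint appears in the headers, else None."""
    lowered = {k.lower(): str(v).lower() for k, v in headers.items()}
    for waf, rules in WAF_FINGERPRINTS.items():
        for name, value_part in rules:
            # match on header name, and optionally on a substring of its value
            if name in lowered and value_part in lowered[name]:
                return waf
    return None
```

You'd feed it the headers dict from whatever HTTP client you use; an empty `value_part` means the header's presence alone is enough.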
But you're saying the site ranked for its brand term, at least, before implementing either SiteLock or Incapsula?
-
This is a huge help. I spent some time yesterday going through the site and updating my links to https where possible. Those don't all appear to have been indexed yet. The bit about the timthumb exploit is particularly helpful. My theme lets me disable that, and I can get rid of the timthumb php file. I'm still concerned that SiteLock could be exaggerating the problem, though; we started having these issues with Google around the time it was implemented.
-
The site is using Incapsula as a CDN and web application firewall. The site still has a timthumb file. So I wouldn't recommend stepping out from behind that right now.
A wildcard search on the domain yields a lot of spam backlinks. Check ahrefs.
-
The entire site appears to index fine. As Patrick pointed out, it appears some of the pages in the index aren't https. But I don't know when you made the move, so things may be chugging right along.
The issue is ranking. But I know what you mean.
So what we have is (not all bad, per se - just what I see):
- Previously hacked site
- Timthumb file
- Some very spammy links
- HTTPS implemented on unknown date
- Moved to CDN / WAF
- Redirects
No doubt, you're going to have to disavow the bad links. Takedown requests are nice and all, and you should note them in your disavow submission, but you don't have to manually contact each individual link/domain. Disavowing isn't a fire-and-forget process, either: you can submit an updated file more than once.
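For reference, the disavow file is just a plain-text list you upload in Webmaster Tools, per Google's documented syntax: `#` starts a comment, `domain:` lines disavow an entire domain, and bare URLs disavow a single page. The domains below are placeholders, not real links from this site:

```text
# Disavow file for example.com -- uploaded via Webmaster Tools.
# Spam blog network found in the Ahrefs export; takedown requested but ignored.
domain:spammy-blog-network.example
domain:another-link-farm.example
# A single hacked page we could not get removed.
http://some-bad-site.example/injected-links.html
```

Note the takedown attempts in the comments, as suggested above; Google's reviewers can see them.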
I would bet a shiny nickel the attack/hack exploited the timthumb file. The site still uses it. Stop using it. Find an alternative. All it does is resize images.
The https migration (redirects... etc.) is just a confounding factor.
After you've removed the timthumb file, request a security review. Also consider that the site may still have issues from the hack, so use Fetch as Google in Webmaster Tools. If you see anything different from the real page, you still have a problem.
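A quick way to do that comparison outside of Webmaster Tools is to fetch the page twice, once with a normal browser user agent and once as Googlebot, and diff the two responses. Hacked sites often serve injected spam links only to search engine user agents. The sketch below diffs two already-fetched HTML strings (the fetching itself is left to whatever HTTP client you prefer); the markup in the usage example is made up:

```python
import difflib

def cloaking_diff(browser_html, googlebot_html):
    """Return lines that appear only in the Googlebot version of a page.

    Any extra lines here deserve a close look -- cloaked spam is usually
    injected links or keyword blocks served only to crawlers.
    """
    diff = difflib.unified_diff(
        browser_html.splitlines(),
        googlebot_html.splitlines(),
        lineterm="",
    )
    # keep only added lines, skipping the "+++" file header
    return [line[1:] for line in diff
            if line.startswith("+") and not line.startswith("+++")]
```

If the list comes back empty, the two user agents are seeing the same markup; anything else is worth investigating.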
Read a little more about recovering from a hacked site here. I think that's more than likely the core of the problem right now.
-
Let me guess - you're using SiteLock after you were hacked to keep them out?
SiteLock creates this issue frequently (we solved it for another Q&A user about a month ago.)
Disable SiteLock, check that your settings are all right in Webmaster Tools, and fetch the page in WMT. Add a link to it on Google+ so it gets recrawled quickly.
I only see 1 backlink to the site from Ahrefs (https://ahrefs.com/site-explorer/overview/subdomains?target=www.newstaradhesives.com) and only 2 in Majestic (https://majestic.com/reports/site-explorer?folder=&q=www.newstaradhesives.com)
Very, very low authority & SiteLock - those would be the two I'd start with.
-
It absolutely was very hacked. I'm currently in the process of manually submitting takedowns for those spam posts in Google's index. The site has since been cleaned up and relaunched. Could these be harming the indexing of the homepage as well?
-
I think Incapsula is throwing the false noindex tag. But yeah, that's just how Incapsula does things. The home page shows up just fine with a site: operator.
Judging by the anchor text I see pointed at the site... and the timthumb.php file... the site was very, very hacked at some point.
Edit: Yep. It was hacked until late last year.
-
Hi Patrick
Thanks for taking a look. If I could ask, where are you seeing this noindex tag and what are you using to see it? I've set my homepage to index and follow in the Yoast SEO plugin, and I had also previously added a <meta name="robots" content="index,follow"> tag into my header just to make sure. My suspicion is that the SiteLock firewall installed on our site right now is blocking robots. Does this make any sense?
Thanks again
-
I wanted to attach this image - in my crawl, I am getting a "noindex,nofollow" but your code isn't showing it. I would check with your web development team to see what exactly is happening and how this can be fixed.
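That mismatch (a crawler seeing "noindex,nofollow" while the source code looks clean) is exactly what happens when a WAF or CDN in front of the origin injects an `X-Robots-Tag` HTTP header or rewrites the page in transit. A small sketch of a checker that collects directives from both places, so the two can be compared side by side (the regex only handles the common `name=... content=...` attribute order, so treat it as illustrative):

```python
import re

def robots_directives(html, headers=None):
    """Collect indexing directives from the HTML meta tag and the HTTP response.

    A WAF/CDN can inject an X-Robots-Tag header even when the markup on
    the origin server says index,follow -- which is why a crawl and a
    view-source can disagree.
    """
    directives = set()
    # meta robots tags in the markup
    for match in re.finditer(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
        html, re.IGNORECASE,
    ):
        directives.update(d.strip().lower() for d in match.group(1).split(","))
    # X-Robots-Tag header, if the response headers are available
    for name, value in (headers or {}).items():
        if name.lower() == "x-robots-tag":
            directives.update(d.strip().lower() for d in value.split(","))
    return directives
```

If "noindex" shows up only when you pass in the live response headers, the origin code is fine and the intermediary is the culprit.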
-
Hi there
It appears your homepage has a "noindex,nofollow" tag - change this to "index,follow". Make sure this is fixed across the site.
If for some reason that doesn't fix it (it should):
Have you checked to see if you have a manual action?
If you have multiple URLs with the same content, check your canonical tags and do a content audit to see whether that content can be removed, consolidated, or updated. Your SSL also seems to be misconfigured.
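For the canonical check, each page's head should point at exactly one preferred (https) URL. A hedged sketch of what a clean homepage head might contain, with a placeholder domain:

```html
<head>
  <!-- one canonical URL per page, pointing at the https version -->
  <link rel="canonical" href="https://www.example.com/" />
  <!-- explicit, though index,follow is also the crawler default -->
  <meta name="robots" content="index,follow" />
</head>
```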
I would also do a backlink audit to see whether any links can be removed or updated. Check that your local SEO presence is on point and consistent, and do the same for your on-site SEO.