Homepage refusing to show up in Google (rest of pages fine)
-
Ah, I was wondering since they may have entirely different pricing based upon who you talk to.
-
SiteLock
-
So, on an invoice, do you or the client pay Incapsula or SiteLock?
-
Exactly, I've been told that these problems surfaced around the time the firewall was put up. I've just removed the timthumb file and I'm working on disavowing the spammy links pointing to us. I'm considering ditching SiteLock in the next few days to see if that helps at all. We've also been looking at Sucuri as a firewall option.
-
All of the header checks I've done come back with Incapsula. I don't really want to get much further into that for a number of reasons. But if you're actually paying SiteLock that's pretty interesting.
But you're saying the site ranked for its brand term, at least, before implementing either SiteLock or Incapsula?
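For anyone who wants to reproduce those header checks, here's a rough sketch in Python. The fingerprints are assumptions based on what these services commonly send (Incapsula tends to set an `X-Iinfo` response header and `incap_ses`/`visid_incap` cookies), and the sample headers below are made up for illustration:

```python
# Sketch: guess which WAF/CDN is fronting a site from its response
# headers. The fingerprints here are common but not guaranteed.
from urllib.request import Request, urlopen

def classify_waf(headers):
    """headers: dict with lower-cased header names."""
    cookies = headers.get("set-cookie", "")
    if "x-iinfo" in headers or "incap_ses" in cookies:
        return "Incapsula"
    if "x-sucuri-id" in headers or "x-sucuri-cache" in headers:
        return "Sucuri"
    return headers.get("server", "unknown")

def fetch_headers(url):
    req = Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urlopen(req) as resp:
        return {k.lower(): v for k, v in resp.getheaders()}

# Made-up example of an Incapsula-style response:
sample = {"x-iinfo": "9-12345678-0 0NNN", "set-cookie": "incap_ses_1=abc"}
print(classify_waf(sample))  # Incapsula
# Live check (needs network):
# print(classify_waf(fetch_headers("https://www.example.com/")))
```

Whatever this returns is who is actually terminating requests for the site, regardless of whose name is on the invoice.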
-
This is a huge help. I spent some time yesterday going through the site and updating my links to HTTPS where possible; those don't all appear to have been indexed yet. The bit about the timthumb exploit is particularly helpful. My theme lets me disable that, and I can get rid of the timthumb PHP file. I'm still concerned that SiteLock could be exaggerating the problem, though; we started having these issues with Google around the time it was implemented.
-
The site is using Incapsula as a CDN and web application firewall. The site still has a timthumb file. So I wouldn't recommend stepping out from behind that right now.
A wildcard search on the domain yields a lot of spam backlinks. Check Ahrefs.
-
The entire site appears to index fine. As Patrick pointed out, it appears some of the pages in the index aren't https. But I don't know when you made the move, so things may be chugging right along.
The issue is ranking. But I know what you mean.
So what we have is (not all bad, per se - just what I see):
- Previously hacked site
- Timthumb file
- Some very spammy links
- HTTPS implemented on unknown date
- Moved to CDN / WAF
- Redirects
No doubt, you're going to have to disavow the bad links. Takedown requests are nice and all, and you should note them in your disavow submission, but you don't have to manually contact each individual link/domain. It's not a fire-and-forget process, either; you can submit the file more than once.
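For what it's worth, the disavow file itself is just plain text: one `domain:` entry or full URL per line, with `#` lines as comments. The domains below are placeholders, not real examples from this site:

```text
# Disavow file -- spammy links found in the backlink audit
# Takedown requests sent where possible; noted here for the record
domain:spam-network-example.com
domain:cheap-links.example
http://hacked-blog.example/old-spam-post.html
```

Prefer `domain:` entries when a whole site is spam; listing individual URLs only disavows those exact pages.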
I would bet a shiny nickel the attack/hack exploited the timthumb file. The site still uses it. Stop using it and find an alternative; all it does is resize images.
The https migration (redirects... etc.) is just a confounding factor.
After you've removed the timthumb file, request a security review. Also consider that the site may still have issues from the hack, so run Fetch as Google in Webmaster Tools. If you see anything different from the real page, you still have a problem.
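If you want a quick comparison without waiting on Webmaster Tools, a crude script like this can flag user-agent cloaking. The threshold is a guess, the URL is a placeholder, and this won't catch cloaking that depends on JavaScript rendering or on verifying Googlebot's IP range:

```python
# Crude cloaking check: fetch the same URL with a browser user agent
# and a Googlebot user agent, then compare response sizes. Hacked
# sites often serve spam only to crawler user agents.
from urllib.request import Request, urlopen

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def fetch_as(url, user_agent):
    req = Request(url, headers={"User-Agent": user_agent})
    with urlopen(req) as resp:
        return resp.read()

def size_drift(body_a, body_b):
    """Relative difference in length between two response bodies."""
    return abs(len(body_a) - len(body_b)) / max(len(body_a), len(body_b), 1)

# Live usage (needs network); a drift well above ~0.2 deserves a manual look:
# drift = size_drift(fetch_as(url, BROWSER_UA), fetch_as(url, GOOGLEBOT_UA))
print(size_drift(b"x" * 100, b"x" * 80))  # 0.2
```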
Read a little more about recovering from a hacked site here. I think that's more than likely the core of the problem right now.
-
Let me guess - you're using SiteLock after you were hacked to keep them out?
SiteLock creates this issue frequently (we solved it for another Q&A user about a month ago.)
Disable SiteLock, check that your settings are all right in Webmaster Tools, and fetch the page in WMT. Add a link to it on Google+ so it gets recrawled quickly.
I only see 1 backlink to the site from Ahrefs (https://ahrefs.com/site-explorer/overview/subdomains?target=www.newstaradhesives.com) and only 2 in Majestic (https://majestic.com/reports/site-explorer?folder=&q=www.newstaradhesives.com)
Very, very low authority & SiteLock - those would be the two I'd start with.
-
It absolutely was very hacked. I'm currently in the process of manually submitting takedowns for those spam posts in Google's index. The site has since been cleaned up and relaunched. Could these be harming the indexing of the homepage as well?
-
I think Incapsula is throwing the false noindex tag. But yeah, that's just how Incapsula do. The home page shows just fine with a site: operator.
Judging by the anchor text I see pointed at the site... and the timthumb.php file... the site was very, very hacked at some point.
Edit: Yep. It was hacked until late last year.
-
Hi Patrick
Thanks for taking a look. If I could ask, where are you seeing this noindex tag, and what are you using to see it? I've got my homepage set up in the Yoast SEO plugin to index and follow, and I had also previously added a meta robots "index,follow" tag into my header just to make sure. My suspicion is that the SiteLock firewall installed on our site right now is blocking robots. Does this make any sense?
Thanks again
-
I wanted to attach this image - in my crawl, I am getting a "noindex,nofollow" but your code isn't showing it. I would check with your web development team to see what exactly is happening and how this can be fixed.
-
Hi there
It appears your homepage has a "noindex,nofollow" tag - change this to "index,follow". Make sure this is fixed across the site.
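If you want to confirm the tag from the raw HTML yourself, a small stdlib parser works. Keep in mind a firewall can inject the tag per user agent, or send the directive as an `X-Robots-Tag` response header instead, so what your browser sees may not match what a crawler gets; the sample HTML here is illustrative:

```python
# Pull robots meta directives out of raw HTML with the stdlib parser.
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.append(a.get("content", ""))

# Illustrative snippet of what a noindexed homepage would contain:
html = ('<html><head><meta name="robots" content="noindex,nofollow">'
        '</head><body></body></html>')
finder = RobotsMetaFinder()
finder.feed(html)
print(finder.directives)  # ['noindex,nofollow']
```

Run it against the HTML your server actually returns (not browser "view source" after scripts run) to see what a crawler is being served.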
If for some reason that doesn't fix it (though it should):
Have you checked to see if you have a manual action?
If you have multiple URLs with the same content, check your canonical tags and do a content audit to see whether that information can be removed, consolidated, or updated. Your SSL also appears to be misconfigured.
I would also make sure that you do a backlink audit to see if any links can be removed or updated. Also, check your local SEO presence and that everything is on point and consistent. Same with on-site SEO.