Homepage refusing to show up in Google (rest of pages fine)
-
edit
-
Ah, I was wondering since they may have entirely different pricing based upon who you talk to.
-
SiteLock
-
So, on an invoice, do you or the client pay Incapsula or SiteLock?
-
Exactly, I've been told that these problems surfaced around the time the firewall was put up. I've just removed the timthumb file and I'm working on disavowing the spammy links pointing to us. I'm considering ditching SiteLock in the next few days to see if that helps at all. We're also looking at Sucuri as a firewall option.
-
All of the header checks I've done come back with Incapsula. I don't really want to get much further into that for a number of reasons. But if you're actually paying SiteLock that's pretty interesting.
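For anyone who wants to reproduce that header check: the CDN/WAF usually identifies itself in the HTTP response headers (Incapsula typically sets `X-Iinfo` or `X-CDN`). A minimal sketch, with the caveat that these fingerprint header names are common conventions, not guarantees, and vendors do change them:

```python
# Identify a CDN/WAF from HTTP response headers.
# Fingerprints below are commonly observed but not guaranteed.
FINGERPRINTS = {
    "Incapsula": ["x-cdn", "x-iinfo"],
    "Cloudflare": ["cf-ray"],
    "Sucuri": ["x-sucuri-id"],
}

def detect_waf(headers):
    """Return the first vendor whose fingerprint header appears, else None."""
    lowered = {k.lower() for k in headers}
    for vendor, keys in FINGERPRINTS.items():
        if any(k in lowered for k in keys):
            return vendor
    return None

# To run against a live site (network required; URL is a placeholder):
# import urllib.request
# resp = urllib.request.urlopen("https://www.example.com/")
# print(detect_waf(resp.headers))
```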
But you're saying the site ranked for its brand term, at least, before implementing either SiteLock or Incapsula?
-
This is a huge help. I spent some time yesterday going through the site and updating my links to HTTPS where possible. Those don't all appear to have been indexed yet. The bit about the timthumb exploit is particularly helpful. My theme lets me disable that, and I can get rid of the timthumb PHP file. I'm still concerned that SiteLock could be exaggerating the problem, though; we started having these issues with Google around the time it was implemented.
-
The site is using Incapsula as a CDN and web application firewall. The site still has a timthumb file. So I wouldn't recommend stepping out from behind that right now.
A wildcard search on the domain yields a lot of spam backlinks. Check ahrefs.
-
The entire site appears to index fine. As Patrick pointed out, it appears some of the pages in the index aren't https. But I don't know when you made the move, so things may be chugging right along.
The issue is ranking. But I know what you mean.
So what we have is (not all bad, per se - just what I see):
- Previously hacked site
- Timthumb file
- Some very spammy links
- HTTPS implemented on unknown date
- Moved to CDN / WAF
- Redirects
No doubt, you're going to have to disavow the bad links. Takedown requests are nice and all, and you should note them in your disavow submission, but you don't have to manually contact each individual link/domain. It's not really a fire-and-forget process: you can submit the file more than once as you find new links.
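For reference, the disavow file Google expects is just a plain-text list: `domain:` lines cover every link from that domain, bare URLs cover individual pages, and `#` lines are comments (good for noting your takedown attempts). The domains below are made up for illustration:

```text
# Spam domains found in backlink export; takedown requested, no response
domain:spammy-links-example.com
domain:another-spam-example.net
# Individual URLs can be listed too
http://example-blog-network.org/spun-post-123
```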
I would bet a shiny nickel the attack/hack exploited the timthumb file. The site still uses it. Stop using it and find an alternative; all it does is resize images.
The https migration (redirects... etc.) is just a confounding factor.
After you've removed the timthumb file, request a security review. Also consider that the site may still have issues from the hack, so use Fetch as Google in Webmaster Tools. If you see anything different from the real page, you still have a problem.
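If you'd rather script that comparison than eyeball it, one rough check is to fetch the same URL with a Googlebot User-Agent and a normal browser User-Agent and compare a crude fingerprint of each response; hacked sites often cloak spam content to crawlers only. A minimal sketch (the fetch part needs network access and is commented out; the User-Agent strings are just examples):

```python
import re

def page_signature(html):
    """Crude fingerprint of a page: its <title> plus a count of outbound links."""
    title = re.search(r"<title[^>]*>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    links = re.findall(r'href=["\'](https?://[^"\']+)', html, re.IGNORECASE)
    return (title.group(1).strip() if title else "", len(links))

# Fetch the same URL as Googlebot and as a normal browser (network required):
# import urllib.request
# def fetch(url, ua):
#     req = urllib.request.Request(url, headers={"User-Agent": ua})
#     return urllib.request.urlopen(req).read().decode("utf-8", "replace")
# url = "https://www.example.com/"
# if page_signature(fetch(url, "Googlebot/2.1")) != page_signature(fetch(url, "Mozilla/5.0")):
#     print("Googlebot is served something different -- classic sign of a lingering hack")
```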
Read a little more about recovering from a hacked site here. I think that's more than likely the core of the problem right now.
-
Let me guess - you're using SiteLock after you were hacked to keep them out?
SiteLock creates this issue frequently (we solved it for another Q&A user about a month ago.)
Disable SiteLock, check that your settings are all right in Webmaster Tools, and Fetch the page in WMT. Add a link to it on Google+ so it gets recrawled quickly.
I only see 1 backlink to the site from Ahrefs (https://ahrefs.com/site-explorer/overview/subdomains?target=www.newstaradhesives.com) and only 2 in Majestic (https://majestic.com/reports/site-explorer?folder=&q=www.newstaradhesives.com)
Very, very low authority & SiteLock - those would be the two I'd start with.
-
It absolutely was very hacked. I'm currently in the process of manually submitting takedown requests for those spam posts in Google's index. The site has since been cleaned up and relaunched. Could these be harming the indexing of the homepage as well?
-
I think Incapsula is throwing the false noindex tag. But yeah, that's just how Incapsula do. The home page shows just fine with a site: operator.
Judging by the anchor text I see pointed at the site... and the timthumb.php file... the site was very, very hacked at some point.
Edit: Yep. It was hacked until late last year.
-
Hi Patrick
Thanks for taking a look. If I could ask, where are you seeing this noindex tag, and what are you using to see it? I've set my homepage in the Yoast SEO plugin to index and follow, and I had also previously added a robots meta tag to my header just to make sure. My suspicion is that the SiteLock firewall installed on our site right now is blocking robots. Does this make any sense?
Thanks again
-
I wanted to attach this image: in my crawl, I am getting a "noindex,nofollow", but your code isn't showing it. I would check with your web development team to see exactly what is happening and how it can be fixed.
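One thing worth noting when a crawl shows noindex but your own source doesn't: the directive can also arrive via the `X-Robots-Tag` HTTP header, which a CDN or firewall can inject without ever touching your HTML. A quick sketch that checks both places (the header name is standard; the HTML parsing is deliberately simple):

```python
import re

def find_noindex(html, headers):
    """Report every place a noindex directive could be hiding."""
    findings = []
    # 1. Robots meta tags in the page source
    for tag in re.findall(
            r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
            html, re.IGNORECASE):
        if "noindex" in tag.lower():
            findings.append("meta robots tag: " + tag)
    # 2. The X-Robots-Tag HTTP response header
    for key, value in headers.items():
        if key.lower() == "x-robots-tag" and "noindex" in value.lower():
            findings.append("X-Robots-Tag header: " + value)
    return findings
```

Run it on the response a crawler actually receives (i.e. fetched with a Googlebot User-Agent), since a firewall may only add the directive for bot traffic.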
-
Hi there
It appears your homepage has a "noindex,nofollow" tag - change this to "index,follow". Make sure this is fixed across the site.
If for some reason that doesn't work (which it should):
Have you checked to see if you have a manual action?
If you have multiple URLs with the same content, check your canonical tags and do a content audit to see whether that information can be removed, consolidated, or updated. Your SSL also seems to be misconfigured.
I would also do a backlink audit to see if any links can be removed or updated. Also check that your local SEO presence is on point and consistent, and do the same for on-site SEO.