Homepage refusing to show up in Google (rest of pages fine)
-
Ah, I was wondering, since they may have entirely different pricing depending on who you talk to.
-
SiteLock
-
So, on an invoice, do you or the client pay Incapsula or SiteLock?
-
Exactly. I've been told that these problems surfaced around the time the firewall was put up. I've just removed the timthumb file, and I'm working on disavowing the spammy links pointing to us. I'm considering ditching SiteLock in the next few days to see if that helps at all. We were also looking at Sucuri as a firewall option.
-
All of the header checks I've done come back with Incapsula. I don't really want to get much further into that for a number of reasons. But if you're actually paying SiteLock, that's pretty interesting.
But you're saying the site ranked for its brand term, at least, before implementing either SiteLock or Incapsula?
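For anyone who wants to reproduce that kind of header check, here's a minimal sketch. The fingerprints it looks for (the `X-Iinfo` header, an `X-CDN: Incapsula` header, and `visid_incap_`/`incap_ses_` cookies) are the ones commonly reported for Incapsula; treat them as an assumption rather than a guarantee, since a WAF vendor can change them at any time.

```python
# Hypothetical helper: given a dict of HTTP response headers, guess
# whether the site is behind Incapsula. The header/cookie names below
# are commonly reported Incapsula fingerprints, not an official API.
def detect_incapsula(headers):
    """Return True if the response headers look like Incapsula's."""
    normalized = {k.lower(): v.lower() for k, v in headers.items()}
    if "x-iinfo" in normalized:  # Incapsula's per-request info header
        return True
    if "incapsula" in normalized.get("x-cdn", ""):
        return True
    # Incapsula session cookies typically start with these prefixes
    cookies = normalized.get("set-cookie", "")
    return "visid_incap_" in cookies or "incap_ses_" in cookies

print(detect_incapsula({"X-CDN": "Incapsula", "Server": "nginx"}))  # True
```

You can feed it the headers from any HTTP client (e.g. `curl -sI yourdomain.com` pasted into a dict) to check which service is actually fronting the site, regardless of who is on the invoice.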
-
This is a huge help. I spent some time yesterday going through the site and updating my links to https where possible. Those don't all appear to have been indexed yet. The bit about the timthumb exploit is particularly helpful: my theme lets me disable that, and I can get rid of the timthumb PHP file. I'm still concerned that SiteLock could be exaggerating the problem, though; we started having these issues with Google around the time it was implemented.
-
The site is using Incapsula as a CDN and web application firewall. The site still has a timthumb file, so I wouldn't recommend stepping out from behind that firewall right now.
A wildcard search on the domain yields a lot of spam backlinks. Check Ahrefs.
-
The entire site appears to index fine. As Patrick pointed out, it appears some of the pages in the index aren't https. But I don't know when you made the move, so things may be chugging right along.
The issue is ranking. But I know what you mean.
So what we have is (not all bad, per se - just what I see):
- Previously hacked site
- Timthumb file
- Some very spammy links
- HTTPS implemented on unknown date
- Moved to CDN / WAF
- Redirects
No doubt, you're going to have to disavow the bad links. Takedown requests are nice and all, and you should note them in your disavow submission, but you don't have to manually contact each individual link/domain. Disavowing isn't a fire-and-forget process, either: you can submit an updated file more than once.
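For reference, the disavow file Google Search Console accepts is just a UTF-8 plain-text file, one entry per line, with `#` comments (the domain names below are placeholders, not real examples from this site):

```text
# Spam links from the hack -- takedown requests sent, no response
domain:spammy-example-domain.com
domain:another-spam-network.net
# Disavow a single bad URL rather than the whole domain
http://mostly-legit-example.com/hacked-comments-page.html
```

Using `domain:` entries is usually the safer bet for hack-related spam, since those networks tend to link from many pages at once.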
I would bet a shiny nickel the attack/hack exploited the timthumb file. The site still uses it. Stop using it. Find an alternative; all it does is resize images.
The https migration (redirects... etc.) is just a confounding factor.
After you've removed the timthumb file, request a security review. Also consider that the site may still have issues from the hack, so Fetch as Google in Webmaster Tools. If you see anything different from the real page, you still have a problem.
Read a little more about recovering from a hacked site here. I think that's more than likely the core of the problem right now.
-
Let me guess - you're using SiteLock after you were hacked to keep them out?
SiteLock creates this issue frequently (we solved it for another Q&A user about a month ago).
Disable SiteLock, check that your settings are all right in Webmaster Tools, and fetch the page in WMT. Add a link to it on Google+ so it gets recrawled quickly.
I only see 1 backlink to the site from Ahrefs (https://ahrefs.com/site-explorer/overview/subdomains?target=www.newstaradhesives.com) and only 2 in Majestic (https://majestic.com/reports/site-explorer?folder=&q=www.newstaradhesives.com)
Very, very low authority & SiteLock - those would be the two I'd start with.
-
It absolutely was very hacked. I'm currently in the process of manually submitting takedowns for those spam posts in Google's index. The site has since been cleaned up and relaunched. Could these be harming the indexing of the homepage as well?
-
I think Incapsula is throwing the false noindex tag. But yeah, that's just how Incapsula behaves. The home page shows up just fine with a site: operator.
Judging by the anchor text I see pointed at the site... and the Timthumbs.php file... the site was very very hacked at some point.
Edit: Yep. It was hacked until late last year.
-
Hi Patrick
Thanks for taking a look. If I could ask, where are you seeing this noindex tag, and what are you using to see it? I've set my homepage to index and follow in the Yoast SEO plugin, and I had also previously added a robots meta tag into my header just to make sure. My suspicion is that the SiteLock firewall installed on our site right now is blocking robots. Does this make any sense?
Thanks again
-
I wanted to attach this image: in my crawl, I am getting a "noindex,nofollow", but your code isn't showing it. I would check with your web development team to see exactly what is happening and how it can be fixed.
-
Hi there
It appears your homepage has a "noindex,nofollow" tag. Change this to "index,follow", and make sure this is fixed across the site.
If for some reason that doesn't fix it (it should):
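For concreteness, the tag in question is the robots meta tag in the page's `<head>`. These are the two states:

```html
<!-- What the crawl is reportedly seeing (this blocks indexing outright): -->
<meta name="robots" content="noindex,nofollow">

<!-- What it should be -- or remove the tag entirely, since index,follow is the default: -->
<meta name="robots" content="index,follow">
```

Note that a firewall or CDN can also inject the equivalent `X-Robots-Tag` HTTP response header without touching the HTML, so check the response headers as well as the page source.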
Have you checked to see if you have a manual action?
If you have multiple URLs with the same content, check your canonical tags and do a content audit to see whether that content can be removed, consolidated, or updated. Your SSL also seems to be misconfigured.
I would also do a backlink audit to see if any links can be removed or updated. Also check that your local SEO presence is on point and consistent, and the same for on-site SEO.
but they are even close so well ranked as example.com pages before. Question: What we are doing wrong?</link<>0