Why isn't Google indexing our site?
-
Hi,
We have majorly redesigned our site. It is not a big site; it is a SaaS site, so it has the typical structure: Landing, Features, Pricing, Sign Up, Contact Us, etc.
The main part of the site is behind a login, so out of Google's reach.
Since the new release a month ago, Google has indexed some pages, mainly the blog, which is brand new. It has also reindexed a few of the original pages; I am guessing this because if I click Cached on a site: search, it shows the new site.
All new pages (of which there are 2) are totally missed. One is HTTP and one is HTTPS; does HTTPS make a difference?
I have submitted the site via Webmaster Tools and it says "URL and linked pages submitted to index", but a site: search doesn't bring up all the pages.
What is going on here, please? What are we missing? We just want Google to recognise that the old site has gone and ALL of the new site is here, ready and waiting for it.
Thanks
Andrew
-
Well, links/shares are good. But of course that just raises the question of how you can get those.
Rand gave a great talk called "Inbound Marketing for Startups" at a Hackers & Founders meetup that was focused more on Inbound as a whole than SEO in particular, but it's full of valuable insights: http://vimeo.com/39473593 [video]
Ultimately it'll come down to some kind of a publishing/promotional strategy for your startup. Sometimes your startup is so unique/interesting that it has its own marketing baked right in - in which case you can get a lot of traction by simply doing old-school PR to get your startup in front of the right people.
Other times, you've got to build up links/authority on the back of remarkable marketing.
BufferApp is a great example of a startup that built traction off their blog. Of course, they weren't necessarily blogging as an SEO play - it was more with the aim of getting directly in front of the right audience for direct signups for their product. But they definitely built up some domain authority as a result.
I'd also take a look at the guides Mailchimp has created - they deliver the dual benefit of getting in front of the right audience in a positive/helpful way (which benefits the brand and drives sign-ups directly) as well as building a considerable number of inbound links, boosting their domain authority overall.
Unfortunately there are no quick/easy ways to build your domain authority, but the things you do to build authority can also get you immediately in front of the audience you're looking for - and SEO becomes a lateral benefit of that.
-
Thank you all for your responses. It is strange. We are going to add a link to our G+ page and then add a post.
As a new site, what is the best way to get our domain authority up so we get crawled quicker?
Thanks again
Andrew
-
I disagree. Unless the old pages have inbound links from external sites, there's not much reason to 301 them (and not much benefit). If they're serving up 404 errors, they will fall out of the index.
Google absolutely does have a way to know these new pages exist - by crawling the home page and following the links discovered there. Both of the pages in question are linked to prominently, particularly the Features page which is part of the main navigation. A sitemap is just an aid for this process - it can help move things along and help Google find otherwise obscure/deep pages, but it by no means is a necessity for getting prominent pages indexed, particularly pages that are 1-2 levels down from the home page.
-
If you didn't redirect the old URLs to the new ones when the new site went live, this will absolutely be the cause of your problem, Studio33. That, combined with having no (or misdirected) sitemap means there was essentially no way for Google to even know your site's pages existed.
Good catch Billy.
-
Hi Andrew,
-
Google has been indexing HTTPS URLs for years now without a problem, so HTTPS is unlikely to be part of the issue.
-
Your domain authority on the whole may be slowing Google down in indexing new pages. Bottom line is crawl rate and depth are both functions of how authoritative/important you appear based on links/shares/etc.
-
That said, I don't see any indication as to why these two particular pages are not being indexed by Google. I'm a bit stumped here.
I see some duplication between your Features page and your Facebook timeline, but not with the invoice page.
As above, your domain authority (17) is a bit on the low side, so this could simply be a matter of Google not dedicating enough resources to crawl/index all of your pages yet. But why these two pages would be the only ones missed is perplexing, particularly after a full month. There are no problems with your robots.txt, no canonical tag issues, and the pages are linked to properly.
Wish I had an easy answer here. One idea, a bit of a long shot: we've seen Google index pages faster when they're linked to from Google+ posts. I see you have a Google+ business page for this website - you might try simply writing a (public) post there that includes a link over to the Features page.
As weak as that is, that's all I've got.
Best of Luck,
Mike
-
OK - I would get a list of all of your old pages and start 301 redirecting them to your new pages ASAP. This could be part of your issue.
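For what it's worth, if the site runs on ASP.NET/IIS (an assumption based on the URL structure), 301s can be set up with IIS's URL Rewrite module. A minimal sketch - the old path here is invented purely for illustration; you'd add one rule (or one rewrite map entry) per real old URL:

```xml
<!-- Hypothetical sketch for IIS's URL Rewrite module. "old-features.aspx"
     is a made-up example of an old URL; map each real old URL the same way. -->
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="301 old features page" stopProcessing="true">
          <match url="^old-features\.aspx$" />
          <action type="Redirect" url="/Features" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

(On Apache the equivalent would be a `Redirect 301 /old-features.aspx /Features` line in .htaccess.)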
-
Hi, I checked the XML - it's there if you view source; it just doesn't have a stylesheet.
-
Hi, thanks - about 1 month. The blog pages you are getting may be the old ones, as they are working at this end: http://www.invoicestudio.com/Blog . What you have mentioned re the blog is part of the problem: Google has the old site and not the new.
-
Getting this on your Blog pages:
The page cannot be displayed because an internal server error has occurred.
Were you aware?
Anyway - may I ask how old these pages are?
-
Thanks. I will look into the sitemap. That only went live about an hour ago whilst this thread has been going on.
-
Yeah - with no path specified, the Disallow directive is ignored (you don't have a '/', so nothing is blocked).
however, you do direct to your xml sitemap which appears to be empty. You might want to fix that....
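For reference, even a minimal sitemap would do; a sketch, using the HTTP pages mentioned in this thread (a real file would list every page, and strictly speaking sitemap URLs should share the sitemap's own protocol/host, so the HTTPS page would belong in a separate sitemap):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.invoicestudio.com/</loc>
  </url>
  <url>
    <loc>http://www.invoicestudio.com/Features</loc>
  </url>
</urlset>
```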
-
Hi, no, I think it's fine, as we do not have the forward slash after the Disallow. See
http://www.robotstxt.org/robotstxt.html
I wish it was as simple as that. Thanks for your help though - it's appreciated.
-
Hmmm. That link shows that the way you have it will block all robots.
-
Thanks, but I think the robots.txt is correct. Excerpt from http://www.robotstxt.org/robotstxt.html:
To exclude all robots from the entire server:
User-agent: *
Disallow: /
To allow all robots complete access:
User-agent: *
Disallow:
(or just create an empty "/robots.txt" file, or don't use one at all)
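The two forms quoted above can be sanity-checked with Python's standard-library robots.txt parser; a quick sketch (the example.com URL is just a placeholder):

```python
# Quick sketch: check how the two robots.txt forms quoted above are
# interpreted, using Python's standard-library parser. The example.com
# URL is just a placeholder.
from urllib.robotparser import RobotFileParser

def is_allowed(robots_body: str, url: str, agent: str = "*") -> bool:
    rp = RobotFileParser()
    rp.parse(robots_body.splitlines())
    return rp.can_fetch(agent, url)

# "Disallow:" with no path blocks nothing:
print(is_allowed("User-agent: *\nDisallow:", "http://example.com/Features"))    # True
# "Disallow: /" blocks everything:
print(is_allowed("User-agent: *\nDisallow: /", "http://example.com/Features"))  # False
```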
-
It looks like your robots.txt file is the problem. http://www.invoicestudio.com/robots.txt has:
User-agent: *
Disallow:
When it should be:
User-agent: *
Allow: /
-
Hi,
The specific pages are
https://www.invoicestudio.com/Secure/InvoiceTemplate
http://www.invoicestudio.com/Features
I'm not sure what other pages are not indexed.
New site has been live 1 month.
Thanks for your help
Andrew
-
Without seeing the specific pages I can't check for things such as noindex tags or robots.txt blocking access, so I would suggest you double-check these aspects. The pages will need to be accessible to search engines when they crawl your site, so if there are no links to those pages, Google will be unable to find them.
How long have they been live since the site re-launch? It may just be that they have not been crawled yet, particularly if they are deeper pages within your site hierarchy.
Here's a link to Google's resources on crawling and indexing sites, in case you have not been able to check through them yet.