Pages not Indexed after a successful Google Fetch
-
I am trying to understand why Google isn't indexing key content on my site.
www.BeyondTransition.com is indexed and new pages show up in a couple of hours.
My key content is 6 pages of information for each of 3,000 events (driven by MySQL on a WordPress platform).
These pages are reached via a search page, but no direct navigation from the home page.
When I link to an event page from an indexed page it doesn't show up in search results.
When I use Fetch in Webmaster Tools the fetch is successful, but the page is then not indexed - or, if it does appear in results, it's directed to the internal search page.
e.g. http://www.beyondtransition.com/site/races/course/race110003/ has been fetched and submitted with links but when I search for BeyondTransition Ironman Cozumel I get these results....
So what have I done wrong and how do I go about fixing it? All thoughts and advice appreciated
Thanks
Denis
-
Thanks Nick. I'll work through all of those points.
-
Not sure if it was a connection issue on my end or what, but that page takes a very long time to load, which could explain the lack of indexing of the pages linked from it.
Also, Google states that pages submitted with the Fetch as Googlebot tool are not guaranteed to be indexed, so there may be quite a delay on that. Are all pages included in your XML sitemap? An XML sitemap is the preferred way to notify Google of pages it may not normally find. Here is a link to more about XML sitemaps: https://www.google.com/support/webmasters/bin/answer.py?answer=156184&hl=en
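For reference, a minimal sitemap.xml is just a list of `<url>` entries. A sketch (the race URL below is the one from this thread; the lastmod date is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.beyondtransition.com/site/races/course/race110003/</loc>
    <lastmod>2011-12-01</lastmod>
  </url>
  <!-- ...one <url> entry per event page... -->
</urlset>
```

Only `<loc>` is required per entry; `<lastmod>` and the other optional tags just give Google hints about recrawling.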
Even with an XML sitemap, Google may not immediately crawl many pages. Actually, indexing is rarely immediate. The frequency of crawling and speed of indexing has to do with many of the same factors as your ranking - quality, number of inbound links and pagerank, site performance, etc. If all your pages load quickly and you are in pretty good shape as far as links, etc, you could also try something to draw Google's attention to the new pages - like Tweeting a link or posting to Google+. That seems to "force" faster indexing in some cases.
I just checked your site with webpagetest.org and it is showing a load time of about 14 seconds. Tools.pingdom.com seemed to get hung up on some of the JavaScript files and couldn't complete its test. Doing what you can to speed up the site and address any other "quality" issues will help with indexing, and with your performance in search engine results in general.
-
I'm not sure - I created this page yesterday as a map of all the races and added it to the bottom of the home page as 'site map'. I then added 'site map' to the index using Fetch in Webmaster Tools and used the submit-links option. This morning it's been indexed, but after a quick sample none of the links from it have been indexed (or appear in Google search results).
This suggests it's something wrong with my page/page design - but what?
So a widget will help, but only once I've figured out the underlying problem.
-
Since it may not be practical to have every event linked through navigation, a widget that shows, say, the last ten events might be good enough.
-
Hi Nick,
Thanks for the answer. I've got a WordPress plugin but I don't think it captures everything, so I'm in the process of manually generating an XML sitemap - but I think you've put your finger on why the pages aren't being crawled.
Navigation is on the list of things to do - it's a matter of working out the relative urgency.
I like the RSS idea - time for some research on how to do it.
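Since the events are already in MySQL, one way to avoid hand-writing the sitemap is to generate it from the event list. A minimal sketch - the URL pattern is guessed from the example race URL in this thread, and in practice the IDs would come from a database query rather than a hard-coded list:

```python
# Sketch: generate a sitemap.xml for the event pages.
# The BASE path and race-ID URL pattern are assumptions taken from the
# example URL in the thread (race110003); adjust to the real schema.
from xml.sax.saxutils import escape

BASE = "http://www.beyondtransition.com/site/races/course"

def build_sitemap(event_ids):
    """Return a sitemap.xml string with one <url> entry per event ID."""
    entries = []
    for eid in event_ids:
        loc = escape(f"{BASE}/race{eid}/")
        entries.append(
            "  <url>\n"
            f"    <loc>{loc}</loc>\n"
            "    <changefreq>weekly</changefreq>\n"
            "  </url>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>\n"
    )

# In practice the IDs would come from: SELECT id FROM events;
print(build_sitemap([110003, 110004]))
```

Writing the output to /sitemap.xml and pointing to it from robots.txt (a `Sitemap:` line) lets Google discover it without a manual submission.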
-
You should use an XML sitemap to keep Google up to date with new pages. I could not find one for your site. If the event pages can only be found by using the search feature on your site, those pages will probably not be crawled and indexed. Fetch as Googlebot may work, but it probably will not be as fast as using a sitemap.xml file.
Would it be possible to have the event pages available through some kind of navigation in addition to being found by your site's search?
You might also consider setting up an RSS feed of the events and submitting it to FeedBurner and other RSS sites. That may be a little complicated, but it would also help speed up indexing.
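An events feed can be a very small file. A sketch of a minimal RSS 2.0 feed of new event pages, assuming the same URL pattern as the example race page in the thread (the titles and paths shown are placeholders):

```python
# Sketch: build a minimal RSS 2.0 feed of recently added event pages,
# suitable for submitting to FeedBurner and similar services.
# Event titles/paths below are hypothetical placeholders.
from email.utils import formatdate
from xml.sax.saxutils import escape

SITE = "http://www.beyondtransition.com"

def build_feed(events):
    """events: list of (title, path) tuples for new pages."""
    items = []
    for title, path in events:
        items.append(
            "    <item>\n"
            f"      <title>{escape(title)}</title>\n"
            f"      <link>{escape(SITE + path)}</link>\n"
            f"      <pubDate>{formatdate()}</pubDate>\n"
            "    </item>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<rss version="2.0">\n'
        "  <channel>\n"
        "    <title>BeyondTransition Events</title>\n"
        f"    <link>{SITE}</link>\n"
        "    <description>New race pages</description>\n"
        + "\n".join(items)
        + "\n  </channel>\n</rss>\n"
    )

print(build_feed([("Ironman Cozumel", "/site/races/course/race110003/")]))
```

Regenerating this whenever events are added, alongside the sitemap, gives crawlers a second discovery path to pages that have no navigation links.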
Related Questions
-
Fetch as Google temporarily lifting a penalty?
Hi, I was wondering if anyone has seen this behaviour before? I haven't! We have around 20 sites and each one has lost all of its rankings (not in the index at all) since the medic update, apart from when specifying a location on the end of a keyword.
I set to work trying to identify a common issue on each site, and began by improving speed issues in Insights. On one site I realised that after I had improved the speed score and then clicked "Fetch as Google", the rankings for that site all returned within seconds. I did the same for a different site and got exactly the same result. Cue me jumping around the office in delight! The pressure is off, people's jobs are safe, have a cup of tea and relax.
Unfortunately this relief only lasted between 6-12 hours and then the rankings go again. To me it seems like the sites are all suffering from some kind of on-page penalty which is lifted until the page can be assessed again, and when it is, the penalty is reapplied.
Not one to give up, I set about methodically making changes until I found the issue. So far I have completely rewritten a site, reduced overuse of keywords, and added over 2,000 words to the homepage. Clicked Fetch as Google and the site came back - for 6 hours.
So then I gave the site a completely fresh redesign, again clicked Fetch as Google, and got the same result. Since doing all that, I have swapped over to HTTPS, 301 redirected, etc., and now the site is completely gone and won't come back after fetching as Google.
So before I dig myself even deeper, has anyone any ideas? Thanks.
Technical SEO | | semcheck11 -
Fake Links indexing in google
Hello everyone, I have an interesting situation occurring here, and I'm hoping maybe someone here has seen something of this nature or can offer some sort of advice.
We recently installed WordPress on a subdomain for our business and have been blogging through it. We added the Google Webmaster Tools meta tag and I've noticed an increase in 404 links. I brought this up to our server admin, and he verified that there were a lot of IPs pinging our server looking for these links that don't exist. We've combed through our server files and nothing seems to be compromised.
Today, we noticed that when you do site:ourdomain.com in Google, the subdomain with WordPress shows hundreds of these fake links that, when you visit them, return a 404 page. Just curious if anyone has seen anything like this: what it may be, how we can stop it, and could it negatively impact us in any way? Should we even worry about it? Here's the link to the Google results: https://www.google.com/search?q=site%3Amshowells.com&oq=site%3A&aqs=chrome.0.69i59j69i57j69i58.1905j0j1&sourceid=chrome&es_sm=91&ie=UTF-8 (odd links show up on pages 2-3+)
Technical SEO | | mshowells0 -
How to block text on a page to be indexed?
I would like to block the spider from indexing a block of text inside a page; however, I do not want to block the whole page with, for example, a noindex tag. I have already tried with a tag like this: chocolate pudding chocolate pudding However, this is not working for my case, a travel-related website. Thanks in advance for your support. Best regards, Gianluca
Technical SEO | | CharmingGuy0 -
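One approach sometimes suggested for the question above - a sketch, not something Google documents as guaranteed: move the block of text to its own URL, mark that URL noindex, and pull it into the page in an iframe. The file name here is a hypothetical placeholder.

```html
<!-- Main page: the text block lives in an iframe instead of inline HTML -->
<iframe src="/snippets/description.html"></iframe>

<!-- /snippets/description.html: the robots meta tag keeps this fragment
     out of the index while the parent page stays indexable -->
<!DOCTYPE html>
<html>
  <head>
    <meta name="robots" content="noindex">
  </head>
  <body>
    chocolate pudding
  </body>
</html>
```

The trade-off is that the text no longer counts as on-page content for the parent URL at all.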
Does google know every time you change content on your page
What I mean by the question is: on our home page www.in2town.co.uk we change the article under "lifestyle story of the day". If this changes every hour, will this encourage Google to visit that page more often, or will it just ignore that and visit each day? Would love to hear your thoughts on this.
Technical SEO | | ClaireH-1848860 -
No confirmation page on Google's Disavow links tool?
I've been going through and doing some spring cleaning on some spammy links to my site. I used Google's Disavow links tool, but after I submit my text file, nothing happens. Should I be getting some sort of confirmation page? After I upload my file, I don't get any notifications telling me Google has received my file or anything like that. It just takes me back to this page: http://cl.ly/image/0S320q46321R/Image 2013-04-26 at 11.15.25 AM.png Am I doing something wrong or is this what everyone else is seeing too?
Technical SEO | | shawn810 -
Google sees 2 home pages while I only have 1
How do I solve the problem of Google seeing both domain.com and domain.com/index.htm when I only have one file? Will a canonical work? If so, which? Or are there any other solutions for a novice? I learned from previous blogs that it needs to be done by the hosting service, but Yahoo has no solution.
Technical SEO | | Kurtyj0 -
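For the duplicate home-page question above, a fix that needs no hosting-level access is a canonical link in the page head - a sketch, with domain.com standing in for the asker's real domain:

```html
<!-- In the <head> of index.htm: tell Google the root URL is the
     canonical version, so /index.htm is folded into it -->
<link rel="canonical" href="http://domain.com/">
```

Where server configuration is available, a 301 redirect from /index.htm to / is the stronger solution, but the canonical hint works from the HTML alone.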
Getting More Pages Indexed
We have a large e-commerce site (Magento based) and have submitted sitemap files for several million pages within Webmaster Tools. The number of indexed pages seems to fluctuate, but currently there are fewer than 300,000 pages indexed out of 4 million submitted. How can we get the number of indexed pages to be higher? Changing the settings on the crawl rate and resubmitting sitemaps doesn't seem to have an effect on the number of pages indexed. Am I correct in assuming that most individual product pages just don't carry enough link juice to be considered important enough yet by Google to be indexed? Let me know if there are any suggestions or tips for getting more pages indexed.
Technical SEO | | Mattchstick0 -
Are Google now indexing iFrames?
A client is pulling content through an iFrame, and when searching for a snippet of that exact content the page that is pulling the data is being indexed and not the iFrame page. Seen this before?
Technical SEO | | White.net0