Why did Google not index exactly these 2 pages? Any ideas?
-
Dear Community,
On 27 July I relaunched my own website, submitted the sitemap, and also sent the index page to be crawled, including all linked pages. By the next day the new pages had been indexed. Today I manually checked whether each of them is indexed.
The result is that 2 of 13 pages have not been indexed, here marked in bold:
http://inlinear.com/
http://inlinear.com/suchmaschinenoptimierung-online-marketing.php
**http://inlinear.com/design/**
http://inlinear.com/design/printmedien-gestaltung.php
**http://inlinear.com/design/corporate-design-und-corporate-identity.php**
http://inlinear.com/design/corporate-raum-design.php
http://inlinear.com/webentwicklung/
http://inlinear.com/virtueller-rundgang-360grad-fotografie.php
http://inlinear.com/business-atlas-online-verzeichnis.php
http://inlinear.com/baudokumentation-bauueberwachung.php
http://inlinear.com/ueber-uns.php
http://inlinear.com/blog/
http://inlinear.com/kontakt/
-
The page "/design/" (which is the index.php of this folder) should be the main page because it's about WEB DESIGN.
Should I create a copy and call it /design/web-design.php? Maybe Google prefers a meaningful URL over index.php? Should I then put a rel=canonical pointing to web-design.php in my index.php?
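If you went that route, a minimal sketch of what the canonical tag would look like (note that /design/web-design.php is the hypothetical filename from the question above, not an existing page):

```html
<!-- In the <head> of /design/index.php -->
<!-- Points search engines at the preferred URL; /design/web-design.php is hypothetical -->
<link rel="canonical" href="http://inlinear.com/design/web-design.php" />
```

Keep in mind that rel=canonical is a hint, not a directive, so Google may still index whichever of the two URLs it prefers.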
design/corporate-design-und-corporate-identity.php
The URL is a little long, but that shouldn't be the reason? Or could the reason be that another page, which is still in the index but no longer online (it even redirects to /design/), is still more dominant? Strange... Or should I simply wait a little, or try submitting these two pages to Google manually?
When I check Google Webmaster Tools, it tells me that just 3 pages have been indexed.
To check whether each page is indexed, I ran a site: search for every URL:
site:inlinear.com/pageX.php ... When Google shows the page, that is a sign it is indexed. So why does Webmaster Tools show only 3 pages? (see screenshot)
Do you have any ideas?
Thank You
-
Thank you all
Karl, yes, that's right. I just picked these 2 pages and submitted them via "Fetch as Google", and both pages were indexed shortly after.
Holger
-
I think it would be worth adding the "priority" attribute to your XML sitemap. That way you can tell Google which pages you want crawled as a priority. Remember, Webmaster Tools is not a live view of what Google sees; in some instances it can be a couple of weeks behind.
Just looked to see if **http://inlinear.com/design/** was cached, and it was cached today about 5 hours ago.
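For reference, a minimal sketch of a sitemap entry using the priority tag (the URL is taken from the question; the value 0.8 is just an illustrative choice):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://inlinear.com/design/</loc>
    <!-- priority ranges from 0.0 to 1.0 (default 0.5); it is a hint to crawlers, not a guarantee -->
    <priority>0.8</priority>
  </url>
</urlset>
```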
-
I think you might just need to wait it out. But you can always help things along by using "Fetch as Google" in Google Webmaster Tools.
-
Just went through your blog and everything seems fine on the code end... The ideas you mentioned in your pointers should not be implemented, because the URL generation is correct and the URL is not at all too long!
I believe you should give it a bit more time, and it may get indexed. Also try to generate links and social shares to these pages; that will signal these pages to Google, and it will index them accordingly. A little more time, plus social shares and some good quick links, should accelerate the process!
Related Questions
-
Google Search Console issue: "This is how Googlebot saw the page" showing part of page being covered up
Hi everyone! Kind of a weird question here but I'll ask and see if anyone else has seen this: In Google Search Console when I do a fetch and render request for a specific site, the fetch and blocked resources all look A-OK. However, in the render, there's a large grey box (background of navigation) that covers up a significant amount of what is on the page. Attaching a screenshot. You can see the text start peeking out below (had to trim for confidentiality reasons). But behind that block of grey IS text. And text that apparently in the fetch part Googlebot does see and can crawl. My question: is this an issue? Should I be concerned about this visual look? Or no? Never have experienced an issue like that. I will say - trying to make a play at a featured snippet and can't seem to have Google display this page's information, despite it being the first result and the query showing a featured snippet of a result #4. I know that it isn't guaranteed for the #1 result but wonder if this has anything to do with why it isn't showing one.
On-Page Optimization | ChristianMKG
-
Which is better? One dynamically optimised page, or lots of optimised pages?
For the purpose of simplicity, we have 5 main categories in the site - let's call them A, B, C, D, E. Each of these categories has sub-category pages, e.g. A1, A2, A3. The main area of the site consists of these category and sub-category pages. But as each product comes in different woods, it's useful for customers to see all the products that come in a particular wood, e.g. walnut. So many years ago we created 'woods' pages. These pages replicate the categories & sub-categories but only show what is available in that particular wood. And of course - they're optimised much better for that wood. All well and good, until recently: these specialist pages seem to have dropped through the floor in Google. Could be temporary, I don't know, and it's only been a fortnight - but I'm worried. Now, because the site is dynamic, we could do things differently. We could still have landing pages for each wood, but instead of spinning off to their own optimised wood-specific sub-category pages, they could link to the primary sub-category page with a ?search filter in the URL. This way, the customer still gets to see what they want. Which is better? One page per sub-category, dynamically filtered by search? Or lots of specific sub-category pages? I guess at the heart of this question is: does having lots of specific sub-category pages lead to a large overlap of duplicate content, and is it better keeping that authority juice on a single page? Even if the URL changes (with a query in the URL) to enable whatever filtering we need to do.
On-Page Optimization | pulcinella2uk
-
Why is SEOmoz showing it crawled 3 pages when I only have 2 pages?
I had SEOmoz crawl my site. I only have 2 pages. The site URL is www.autoinsurancefremontca.com.
On-Page Optimization | Greenpeak
-
Issue: Duplicate Page Content (index.htm)
I get an error of "**Issue:** Duplicate Page Content" for the following pages in the SEOmoz Crawl Diagnostics. But these pages are the same one! Duhhhh.... Is there a way to hide this false error? http://www.stdtime.com/ http://www.stdtime.com/index.htm BTW, I also get "**Issue:** Duplicate Page Title" for this page. Another false error...
On-Page Optimization | raywhite
-
Missing meta descriptions on indexed pages, portfolio, tags, author and archive pages. I am using All in One SEO, any advice?
I am having a few problems that I can't seem to work out. I am fairly new to this, so any help would be greatly appreciated 🙂 1. I am missing a lot of meta description tags. I have installed "All in One SEO" but there seems to be no option to add meta descriptions to portfolio posts. I have also written meta descriptions for 'tags', and whilst I can see them in WP they don't seem to be activated. 2. The blog has pages indexed by WP called Part 2 (/page/2), Part 3 (/page/3) etc. How do I solve this issue of meta descriptions and indexed pages? 3. There is also a page for myself, the author, that has multiple indexes for all the blog posts I have written, and I can't edit these archives to add meta descriptions. This also applies to the month archives for the blog. 4. Also, SEOmoz tells me that I have too many links on my blog page (also indexed) and its tags. This also applies to the author pages (myself). How do I fix this? Thanks for your help 🙂 Regards, Nadia
On-Page Optimization | PHDAustralia68
-
If a site has https versions of every page, will the search engines view them as duplicate pages?
A client's site has HTTPS versions of every page for their site and it is possible to view both http and https versions of the page. Do the search engines view this as duplicate content?
On-Page Optimization | harryholmes007
-
Why is this page ranking highest?
I've just used Open Site Explorer to compare some sites whose (unpaid) Google ranking I aspire to. They all have higher authority than my site, but the top-ranking site of the 3 I've looked at has the lowest Page Authority, hardly any links (when the others have hundreds), the lowest PageRank, and the lowest page trust. In fact, when you look at the top-ranking page (ranks #1), it does not even contain the search term as a complete phrase. One thing I do notice is that it has 100,000s of links from one linking root domain. So how can it rank number one on Google?
On-Page Optimization | Beemer
-
Google Reconsideration
Our site fell from grace last July and landed on page five of the Google search results for our primary keyword. For 6 months I tried a number of strategies with no results, including reconfiguring our site based on the SEOmoz on-page grading tool. More recently, after receiving your advice in a Q&A, I took down all of my paid links and submitted a reconsideration request to Google. Interestingly, 3 days later we popped up 20 spots. This left us at the top of page three. Better than page 5, but still not prime time! A few days ago (two weeks after our reconsideration request was submitted) I got a message back in my Webmaster Tools that they had completed a review of our site, but oddly enough they provided no info on the outcome, positive or negative. And there has been no additional movement in the rankings since I received the message. Was the original 20-spot jump the result of the reconsideration request, or just a coincidence? Or is it possible that they did a review and the results will only appear later during some organic re-indexing process? What do you think?
On-Page Optimization | JimSkychief