Why is Google only indexing 3 of 8 pages?
-
Hi everyone, I have a small 8-page website I launched about 6 months ago. For the life of me I cannot figure out why Google is only indexing 3 of the 8 pages. The pages are not duplicate content in any way, and I have a good internal linking structure. At this time I don't have many inbound links from others; that will come in time. Am I missing something here? Can someone give me a clue?
Thanks Tim
-
I know the feeling, it's the same story with me and my own site.
-
Thank you, everyone. I suspected the lack of backlinks might become an issue, but I did think Google would index all the pages, backlinks or not; I just figured the pages would not rank high without them. Thanks for your suggestions, and I will work on it some more as soon as I have time.
-
Thanks, Keri. I know about these issues. I work on so many sites, and since this one is my own, it gets lower priority when it comes to time spent.
-
Unrelated to indexing, look at the file sizes of images on your page. Some of them are unnecessarily huge, which hurts user experience and can affect rankings (the auto-play of the video is also a negative for user experience).
Also, your home page renders at both the bare .com URL and at .com/index.html, and that duplication should be fixed.
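If it helps, one common way to handle the index.html duplicate is an .htaccess 301 redirect. This is only a sketch and assumes the site runs on Apache with mod_rewrite available:
RewriteEngine On
# 301 any direct request for index.html to the bare directory URL
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /(.*)index\.html(\?[^\ ]*)?\ HTTP/ [NC]
RewriteRule ^(.*)index\.html$ /$1 [R=301,L]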
If you need instructions or references about how to do any of the suggestions in this thread, please let us know and we can post more information.
-
Your sitemap.xml looks fine, but like Tim said, submit it using Google Webmaster Tools.
Also, add it to your robots.txt file. Make sure the file includes:
User-agent: *
Allow: /
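Those two lines just allow crawling; the directive that actually references the sitemap is a Sitemap line of its own (the URL below is a placeholder, swap in the real sitemap address):
Sitemap: https://www.yoursite.com/sitemap.xml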
-
Hi Tim,
Submit your sitemap.xml in Google Webmaster Tools to get faster and better indexing.
Create some good, relevant, high-quality backlinks for your other 5 pages.
You will get into Google's index faster.
-
"At this time I don't have many inbound links from others."
I don't think that you are doing anything wrong. However, the speed and depth of indexing depend on the strength and number of inbound links.
Hang in there... you will be indexed in time... get a few more links if you can.
It probably has nothing to do with your indexing problem, but there is an awful lot of empty table code on your homepage.
-
Hello,
I'm pretty certain it will be down to backlinks. The more relevant backlinks you can get, the better your indexing will be.
I run a similar small website which has 12 pages, but only 4 are indexed, and the one thing it has in common with yours is the lack of links.
Have you thought of adding some of your wedding videos to YouTube and going down the social route to attract links?
-
Hi Tim,
The only thing I can figure is that using image-map coords links instead of traditional text links is keeping Google from crawling through the entire site.
Instead of:
<area shape="rect" coords="194,132,289,160" href="wedding-video-pricing.html" alt="Wedding Video Prices Arizona">
Try a traditional text link instead, something like:
<a href="wedding-video-pricing.html">Wedding Video Prices Arizona</a>
I'm not positive if it will help, but it can't hurt with indexation...
Related Questions
-
Weird Google indexing issues with www being forced
I'm working on a site which is really not indexing as it should. I created a sitemap.xml, which I thought would fix the issue, but it hasn't. What seems to be happening is that Google is making the www pages canonical for some of the site and the non-www pages canonical for the rest; the site should be without www. See the images attached for a visual explanation.
Technical SEO | | Donsimong
When adding pages in Google Search Console without www, some pages cannot be indexed because Google thinks the www version is canonical, and I have no idea why; there is no canonical set up at all. What I would do, if I could, is add canonical tags to each page pointing to the non-www version, but the CMS does not allow for canonicals. I'm not quite sure how to proceed or how to tell Google that the non-www version is in fact correct, and I don't have any idea why it's assuming www is canonical either.
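For what it's worth, when canonical tags aren't an option, the usual way to tell Google the non-www version is correct is a site-wide 301 from www to non-www. This is only a sketch and assumes the site runs on Apache and that .htaccess is editable, which may not be the case with this CMS:
RewriteEngine On
# 301 every www request to the same URL on the bare (non-www) domain
# adjust https to http if the site is not on HTTPS
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^(.*)$ https://%1/$1 [R=301,L]
-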
Google is indexing bad URLs
Hi All, The site I am working on is built on WordPress. The plugin Revolution Slider was downloaded; while no longer utilized, it remained on the site for some time. This plugin began creating hundreds of URLs containing nothing but code on the page, and I noticed these URLs were being indexed by Google. The URLs follow the structure: www.mysite.com/wp-content/uploads/revslider/templates/this-part-changes/ I have done the following to prevent these URLs from being created and indexed:
1. Added a directive in my .htaccess to 404 all of these URLs
2. Blocked /wp-content/uploads/revslider/ in my robots.txt
3. Manually de-indexed each URL using the GSC tool
4. Deleted the plugin
However, new URLs still appear in Google's index, despite being blocked by robots.txt and resolving to a 404. Can anyone suggest any next steps? Thanks!
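One thing worth noting: while those paths are blocked in robots.txt, Google can't recrawl them to see the 404, so already-indexed URLs can linger. A sketch of an alternative (assuming Apache, and that the robots.txt block is lifted so the status code can actually be seen) is to answer the whole plugin directory with a 410:
# Return 410 Gone for the orphaned Revolution Slider template URLs (mod_alias)
RedirectMatch 410 ^/wp-content/uploads/revslider/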
Technical SEO | | Tom3_150 -
Best practices for types of pages not to index
Trying to better understand best practices for when and when not to use content="noindex". Are there certain types of pages that we shouldn't want Google to index? Contact form pages, privacy policy pages, internal search pages, archive pages (using WordPress)? Any thoughts would be appreciated.
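For reference, the tag in question is the meta robots tag placed in the page's <head>, along the lines of:
<meta name="robots" content="noindex, follow">
The follow value keeps crawlers passing through the page's links while the page itself stays out of the index.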
Technical SEO | | RichHamilton_qcs0 -
Anything new in determining how many of a site's pages are in Google's supplemental index vs the main index?
Since the site:mysite.com *** -sljktf trick stopped working to find pages in the supplemental index several years ago, has anyone found another way to identify content that has been relegated to the supplemental index?
Technical SEO | | SEMPassion0 -
From page 1 to page 18 @ Google
Hello Mozzers! I have a question you may be able to help with. How is it possible that a page ranking well (1st result) goes from the 1st result to the 18th page in just one day? It doesn't seem to be any kind of penalization. I have now set all suspicious outgoing links to nofollow (they were not before); could this be a cause? Do you have any other suggestions? Thanks
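For anyone unfamiliar with the change described, setting an outgoing link to nofollow just means adding the rel attribute, for example (placeholder URL):
<a href="https://example.com/" rel="nofollow">anchor text</a>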
Technical SEO | | socialengaged0 -
Sitemap all of a sudden only indexing 2 out of 5000+ pages
Any ideas why this happened? Our sitemap looks the same. Also, our total number of pages indexed has not decreased; it is just the count reported for the sitemap that dropped. Could this eventually affect my pages staying in the index?
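As a sanity check only (not a diagnosis), a valid sitemap entry should look roughly like the sketch below, with your real URLs in place of the placeholder:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/some-page.html</loc>
  </url>
</urlset>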
Technical SEO | | rock220 -
Duplicate pages in Google index despite canonical tag and URL Parameter in GWMT
Good morning Moz... This is a weird one. It seems to be a "bug" with Google, honest...
We migrated our site www.three-clearance.co.uk to a Drupal platform over the new year. The old site used URL-based tracking for heat map purposes, so for instance www.three-clearance.co.uk/apple-phones.html could be reached via www.three-clearance.co.uk/apple-phones.html?ref=menu or www.three-clearance.co.uk/apple-phones.html?ref=sidebar and so on. GWMT was told of the ref parameter and the canonical meta tag was used to indicate our preference. As expected we encountered no duplicate content issues and everything was good.
This is the chain of events:
Site migrated to the new platform following best practice, as far as I can attest to. The only known issue was that the verification for both Google Analytics (meta tag) and GWMT (HTML file) didn't transfer as expected, so between the relaunch on the 22nd Dec and the fix on 2nd Jan we have no GA data, and presumably there was a period where GWMT became unverified.
URL structure and URIs were maintained 100% (which may be a problem, now).
Yesterday I discovered 200-ish 'duplicate meta titles' and 'duplicate meta descriptions' in GWMT. Uh oh, thought I. Expand the report out and the duplicates are in fact ?ref= versions of the same root URL. Double uh oh, thought I.
Run, not walk, to Google and do some Fu: http://is.gd/yJ3U24 (9 versions of the same page, in the index, the only variation being the ?ref= URI).
Checked Bing and it has indexed each root URL once, as it should.
Situation now:
The site no longer uses the ?ref= parameter, although of course there still exist some external backlinks that use it. This was intentional and happened when we migrated.
I 'reset' the URL parameter in GWMT yesterday, given that there's no "delete" option. The "URLs monitored" count went from 900 to 0, but today is at over 1,000 (another wtf moment).
I also resubmitted the XML sitemap and fetched 5 'hub' pages as Google, including the homepage and the HTML site-map page.
The ?ref= URLs in the index have the disadvantage of actually working, given that we transferred the URL structure and of course the webserver just ignores the nonsense arguments and serves the page. So I assume Google assumes the pages still exist, and won't drop them from the index but will instead apply a dupe content penalty. Or maybe call us a spam farm. Who knows.
Options that occurred to me (other than maybe making our canonical tags bold or locating a Google bug submission form 😄) include:
A) robots.txt-ing the ?ref= URLs, but to me this says "you can't see these pages", not "these pages don't exist", so isn't correct
B) Hand-removing the URLs from the index through a page removal request per indexed URL
C) Applying a 301 to each indexed URL (hello Bing dirty sitemap penalty)
D) Posting on SEOMoz because I genuinely can't understand this.
Even if the gap in verification caused GWMT to forget that we had set ?ref= as a URL parameter, the parameter was no longer in use, because the verification only went missing when we relaunched the site without this tracking. Google is seemingly 100% ignoring our canonical tags as well as the GWMT URL setting - I have no idea why and can't think of the best way to correct the situation. Do you? 🙂
Edited To Add: As of this morning the "edit/reset" buttons have disappeared from the GWMT URL Parameters page, along with the option to add a new one. There are no messages explaining why, and of course the Google help page doesn't mention disappearing buttons (it doesn't even explain what 'reset' does, or why there's no 'remove' option).
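On option C, a sketch of how the 301s could be handled at the pattern level rather than per indexed URL, assuming Apache/mod_rewrite sits in front of the Drupal site (which may not be the case):
RewriteEngine On
# If the query string is exactly ref=something, 301 to the same path with no query string
RewriteCond %{QUERY_STRING} ^ref=[^&]+$ [NC]
RewriteRule ^ %{REQUEST_URI}? [R=301,L]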
Technical SEO | | Tinhat0 -
Why is our page not visible in Google rankings? www.loseweight.com
We are using WordPress as the platform. Typing the URL gets into the site, but it seems to be non-existent for the public... No comments at all; it seems to be "invisible"?
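One common thing to rule out with a WordPress site that seems invisible in search (just a guess, not a diagnosis): the "Discourage search engines from indexing this site" option under Settings > Reading adds a robots noindex meta tag to every page. Viewing the page source and looking for something like the line below (the exact value varies by WordPress version) will tell you quickly:
<meta name="robots" content="noindex, nofollow">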
Technical SEO | | gewi0