Low Index: 72 pages submitted and only 1 Indexed?
-
Hi Mozers,
I'm pretty stuck on this and wondering if anybody can give me a heads up on what might be causing the issue.
I have 3 top-level domains: NZ, AU, and USA. For some odd reason I'm having a real issue getting these pages indexed, and with the sitemaps as well. I'm considering hiring someone to get the issue sorted, as neither I nor my developer can seem to find the problem.
I have attached an example of the sitemap_au.xml file. As you can see, only 1 page has been indexed out of the 72 that were submitted. Because we host all of our domains on the same server, I was told last time that our sitemaps were possibly being overwritten, which is why we now have sitemap_au.xml, and the same for sitemap_nz.xml and sitemap_us.xml. I originally had a sitemap.xml for each as well.
Another issue I am having: the meta descriptions for the USA and AU home pages are showing the meta description for New Zealand, but when you look at the .com and .com.au source code, the meta descriptions are all different, as you can see here http://bit.ly/1KTbWg0 and here http://bit.ly/1AU0f5k
Any advice around this would be so much appreciated!
Thanks
Justin
-
Hi,
Yes, you've got it spot on: 301s are there to keep old URLs pointing to the new ones, but only the new URLs should be in the sitemap.
When you've crawled the live site ready to make your sitemap, you can manually right-click and remove any URL you don't want in there before generating it.
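For reference, the generated sitemap should list only the final, 200-status URLs. A minimal sketch, using one of the URLs mentioned later in this thread:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Final URL only - not the old /psychic-readings/psychic-readings
       address, which now 301-redirects here -->
  <url>
    <loc>https://www.zenory.com.au/psychic-readings</loc>
  </url>
</urlset>
```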
Kind Regards
Jimmy
-
Hey Jimmy,
Wow thanks so much for your great feedback, much appreciated!
Just want to clarify your answer about the 301s. So it's okay to create the 301s to direct our users to the new URLs, but not good to include them in the sitemap? Am I correct in saying this, or am I totally off track?
I think what's also happened is that the sitemap from Screaming Frog has included both old URLs and new URLs. I'm now seeing two of our contact pages indexed for the com.au site: one is the older URL and the other is the new URL.
Let me know your feedback
Cheers again Jimmy
-
Hi Justin,
Yes, as long as WMT is specifically watching the HTTPS website, then unfortunately the problem is not in WMT.
As hectormainar says, check your sitemap in Screaming Frog:
1. Go to your sitemap.xml and save it to your computer.
2. Change the Frog to List mode.
3. Open your sitemap and run the crawl.
All the links in the sitemap should report 200; any 301s should be swapped with the direct versions. The 301 is good for maintaining backwards compatibility, letting backlinks and old users navigate to your new content, but it shouldn't be used as major navigation.
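The check above can also be sketched as a short Python script (standard library only; the sitemap URL in the usage comment is an assumption based on the filenames mentioned earlier in the thread):

```python
# Sketch: parse a sitemap and report each URL's HTTP status without
# following redirects, so any 301s can be spotted and replaced with
# their final targets before resubmitting the sitemap.
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def extract_locs(sitemap_xml: str) -> list[str]:
    """Return every <loc> URL listed in a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def check_status(url: str) -> int:
    """Fetch a URL without following redirects; return its status code."""
    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None  # stop here: we want to see the 301/302 itself
    opener = urllib.request.build_opener(NoRedirect)
    try:
        return opener.open(url, timeout=10).status
    except urllib.error.HTTPError as e:
        return e.code

# Usage (requires network), e.g.:
#   with urllib.request.urlopen("https://www.zenory.com.au/sitemap_au.xml") as f:
#       for url in extract_locs(f.read().decode()):
#           print(check_status(url), url)  # anything but 200 needs fixing
```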
Kind Regards
Jimmy
-
Hey Jimmy,
Thanks for the heads up! Yes, I have been watching this via WMT. I used Screaming Frog to generate the sitemaps and gave them to my developer; he then gave me the URL to submit to Google.
I also used https. I hope that helps?
Let me know if you have any further questions
Cheers Jimmy thanks again
-
Hi Hectormainar,
I understand what you're saying. Yes, we had https://www.zenory.com.au/psychic-readings/psychic-readings before we updated the URLs to https://www.zenory.com.au/psychic-readings
After doing this we were told to add 301 redirects, so I'm a little confused now as to why that should not be done, since our visitors would still go to the old URLs?
I used Screaming Frog to generate the sitemaps, and I think it may have included the old URLs? I'm not sure exactly which ones it included. Is there a way to check this?
Thanks for your help
Justin
-
Which URLs do you include in your sitemap? Could you check whether you are trying to index
https://www.zenory.com.au/psychic-readings/psychic-readings or https://www.zenory.com.au/psychic-readings ? The first one is the URL you link to in your menus, but it has a 301 redirect to the second format (and the same goes for the rest of the main options). That is quite a bad idea. Please make sure you include the correct address in the sitemap, not the 301-redirecting one. That could be causing Google Webmaster Tools not to show those pages as indexed: although the final page is properly indexed in Google (as you can check by searching site:www.zenory.com.au), GWT is not able to match the two addresses.
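To be clear, the 301s themselves should stay, so visitors and backlinks to the old URLs still land on the new pages; only the sitemap should be limited to the final URLs. Assuming an Apache server (the thread doesn't say what the site runs on), such a redirect might look like:

```apache
# Hypothetical .htaccess sketch: permanently redirect the old doubled
# path to the new one. Keep this rule for visitors and backlinks, but
# list only the target URL (/psychic-readings) in the sitemap.
RewriteEngine On
RewriteRule ^psychic-readings/psychic-readings/?$ /psychic-readings [R=301,L]
```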
-
Hi Justin,
It is hard to tell from your screenshot, but which website are you watching in Webmaster Tools? As you are using HTTPS, the website to track would have to be the HTTPS one, as a recent WMT update now classifies these separately.
Having crawled your sites with Screaming Frog, I don't see any smoking guns as to why the pages would not be indexed.
Let me know about the WMT account
Kind Regards
Jimmy
-
Hi Michael,
Thanks for your response! I have also done a site:yourdomain search, and this is also showing quite low compared to the number of pages submitted. USA is showing 10 pages indexed, AU slightly more, and NZ a lot more.
-
Webmaster Tools is not an accurate, up-to-date reflection of what is actually indexed.
A search in Google for site:yourdomain.com will show the accurate information.