Low Index: 72 pages submitted and only 1 Indexed?
-
Hi Mozers,
I'm pretty stuck on this and wondering if anybody can point me toward what might be causing the issue.
I have three top-level domains: NZ, AU, and US. For some odd reason I'm having real trouble getting these pages indexed, and with the sitemaps as well. I'm considering hiring someone to get the issue sorted, as neither my developer nor I can find the cause.
I have attached an example of the sitemap_au.xml file. As you can see, only 1 page has been indexed out of the 72 submitted. Because we host all of our domains on the same server, I was told last time that our sitemaps were possibly being overwritten, which is why we now have sitemap_au.xml, and likewise sitemap_nz.xml and sitemap_us.xml. I also originally had a sitemap.xml for each.
Another issue I am having: the meta descriptions for the US and AU home pages are showing the New Zealand description, yet when you look at the .com and .com.au source code, the meta descriptions are all different, as you can see here http://bit.ly/1KTbWg0 and here http://bit.ly/1AU0f5k
Any advice around this would be so much appreciated!
Thanks
Justin
-
Hi,
Yes, you've got it spot on: 301s are there to keep old URLs pointing to the new ones, but only the new URLs should be in the sitemap.
When you've crawled the live site ready to make your sitemap, you can manually right-click and remove any URL you don't want in there before generating it.
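If you want a quick way to review what ended up in a generated sitemap before submitting it, a few lines of Python will list the URLs it contains (this is just a hypothetical helper I'm sketching, not a Screaming Frog feature; the sample URLs are from this thread):

```python
# Extract the <loc> URLs from a sitemap document so you can review
# exactly which addresses were included before submitting it.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return the list of <loc> URLs found in a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.zenory.com.au/psychic-readings</loc></url>
  <url><loc>https://www.zenory.com.au/psychic-readings/psychic-readings</loc></url>
</urlset>"""

for url in sitemap_urls(sample):
    print(url)
```

Point it at a saved copy of sitemap_au.xml and any old URLs that slipped in will jump out straight away.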
Kind Regards
Jimmy
-
Hey Jimmy,
Wow thanks so much for your great feedback, much appreciated!
Just want to clarify your answer about the 301s. So it's okay to create the 301s to direct users to the new URLs, but not good to include them in the sitemap? Am I correct in saying this, or am I totally off track?
I think what's also happened is that the sitemap from Screaming Frog has included both old and new URLs. I'm now seeing two of our contact pages indexed for the com.au site: one is the old URL and the other is the new URL.
Let me know your feedback
Cheers again Jimmy
-
Hi Justin,
Yes, as long as WMT is specifically watching the HTTPS website, then unfortunately the problem is not in WMT.
As hectormainar says, check your sitemap in Screaming Frog:
- go to your sitemap.xml and save it to your computer
- change the Frog to list mode
- open your sitemap and run the crawl

All the links in the sitemap should report 200; any 301s should be swapped for the direct versions. The 301 is good to maintain backwards compatibility and allow backlinks and old users to navigate to your new content, but it shouldn't be used as major navigation.
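The same 200-check can be scripted if you'd rather not crawl by hand. Below is a rough sketch under my own assumptions (the function names `fetch_status` and `classify` are mine, not from any standard tool): it requests a URL without following redirects, so a 301 is reported as a 301 rather than as the 200 of its destination, and then says what to do with each sitemap entry.

```python
# Check sitemap URLs the way described above: every entry should be a
# 200, and any 3xx should be replaced with its redirect target.
import urllib.error
import urllib.request

class _NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # refuse to follow, so the 3xx surfaces as an HTTPError

_opener = urllib.request.build_opener(_NoRedirect)

def fetch_status(url, timeout=10):
    """Return the raw HTTP status code for url, without following redirects."""
    try:
        return _opener.open(url, timeout=timeout).getcode()
    except urllib.error.HTTPError as err:
        return err.code

def classify(status):
    """Say what to do with a sitemap entry that returned this status."""
    if status == 200:
        return "keep in sitemap"
    if status in (301, 302, 307, 308):
        return "swap for the redirect target"
    return "investigate (broken or blocked)"

for status in (200, 301, 404):
    print(status, "->", classify(status))
```

Run `fetch_status` over each URL your sitemap lists and anything that doesn't classify as "keep in sitemap" needs fixing before you resubmit.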
Kind Regards
Jimmy
-
Hey Jimmy,
Thanks for the heads up! Yes, I have been watching this via WMT. I also used Screaming Frog to generate the sitemaps and gave them to my developer, who then gave me the URL to submit to Google.
I also used https. I hope that helps?
Let me know if you have any further questions
Cheers Jimmy thanks again
-
Hi Hectormainar,
I understand what you're saying. Yes, we had https://www.zenory.com.au/psychic-readings/psychic-readings before we updated the URLs to https://www.zenory.com.au/psychic-readings
After doing this we were told to add 301 redirects, so I'm a little confused now as to why this shouldn't be done, since our visitors would still go to the old URLs?
I used Screaming Frog to generate the sitemaps, and I think that's how the old URLs got included? I'm not sure exactly which URLs it included. Is there a way to check this?
Thanks for your help
Justin
-
Which URLs do you include in your sitemap? Could you check whether you are trying to index https://www.zenory.com.au/psychic-readings/psychic-readings or https://www.zenory.com.au/psychic-readings ?

The first one is the URL you link to in your menus, but it has a 301 redirect to the second format (and the same goes for the rest of the main options). That is quite a bad idea. Please make sure you include the correct address in the sitemap, not the 301-redirected one. That could be what is causing Google Webmaster Tools not to show that page in your sitemap as indexed: although the final page is properly indexed in Google (as you can check by searching for site:www.zenory.com.au), GWT is not able to match the two addresses.
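To make the fix concrete, here's a small illustration of cleaning a sitemap's URL list before regenerating it: any entry that is a known 301 source gets replaced by its final destination. The `redirects` mapping below is just an example built from the URLs in this thread, not a complete list for the site.

```python
# Replace known 301 sources with their destinations so the sitemap
# only contains final, 200-serving URLs.
redirects = {  # old URL -> new URL (example mapping from this thread)
    "https://www.zenory.com.au/psychic-readings/psychic-readings":
        "https://www.zenory.com.au/psychic-readings",
}

def final_url(url, redirects):
    """Follow a chain of known redirects to the final URL (loop-safe)."""
    seen = set()
    while url in redirects and url not in seen:
        seen.add(url)
        url = redirects[url]
    return url

submitted = [
    "https://www.zenory.com.au/psychic-readings/psychic-readings",
    "https://www.zenory.com.au/contact",
]
cleaned = sorted({final_url(u, redirects) for u in submitted})
print(cleaned)
```

The set comprehension also deduplicates, so an old URL and its new form can't both end up in the regenerated file.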
-
Hi Justin,
It is hard to tell from your screenshot, but which website are you watching in Webmaster Tools? As you are using HTTPS, the site to track would have to be the HTTPS one, since a recent WMT update now classifies these differently.
Having crawled your sites with Screaming Frog, I don't see any smoking guns as to why the pages would not be indexed.
Let me know about the WMT account
Kind Regards
Jimmy
-
Hi Michael,
Thanks for your response! I have also done a site:yourdomain search, and it is showing quite low numbers compared to the number of pages submitted. The US is showing 10 pages indexed, AU slightly more, and NZ a lot more.
-
Webmaster Tools is not an up-to-date, accurate reflection of what is actually indexed.
A search in Google for site:yourdomain.com will show the accurate information.