Low Index: 72 pages submitted and only 1 Indexed?
-
Hi Mozers,
I'm pretty stuck on this and wondering if anybody can give me a heads-up on what might be causing the issue.
I have 3 top-level domains: NZ, AU, and USA. For some odd reason I seem to be having a real issue with these pages indexing, and also with the sitemaps, and I'm considering hiring someone to get it sorted, as neither my developer nor I can seem to find the problem.
I have attached an example of the sitemap_au.xml file. As you can see, only 1 page has been indexed out of the 72 that were submitted. Because we host all of our domains on the same server, I was told last time that our sitemaps were possibly being overwritten, hence why we have sitemap_au.xml, and it's the same for the other files, sitemap_nz.xml and sitemap_us.xml. I also originally had a sitemap.xml for each.
Another issue I am having is that the meta description for each home page in USA and AU is showing the meta description for New Zealand, but when you look at the .com and .com.au source code, the meta descriptions are all different, as you can see here http://bit.ly/1KTbWg0 and here http://bit.ly/1AU0f5k
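For anyone wanting to verify this, a quick way to compare what each domain actually serves over HTTP against what's in the source files is something like this rough, stdlib-only Python sketch (not production code; the regex assumes the name attribute appears before content in the tag):

```python
# Rough sketch: print the meta description each domain actually serves,
# so it can be compared against what the templates contain. Assumes the
# <meta> tag puts name= before content=.
import re
import urllib.request

DESC_RE = re.compile(
    r'<meta[^>]+name=["\']description["\'][^>]+content=["\']([^"\']*)["\']',
    re.IGNORECASE,
)

def meta_description(html):
    """Return the meta description from raw HTML, or None if missing."""
    match = DESC_RE.search(html)
    return match.group(1) if match else None

# Usage against the live homepages (only .com and .com.au are named in
# this thread; add the NZ domain as appropriate):
# for home in ("https://www.zenory.com/", "https://www.zenory.com.au/"):
#     html = urllib.request.urlopen(home, timeout=10).read().decode("utf-8", "replace")
#     print(home, "->", meta_description(html))
```

If the served description differs from the one in the template, the problem is likely server-side (shared template or cache) rather than in the HTML itself.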
Any advice around this would be so much appreciated!
Thanks
Justin
-
Hi,
Yes, you've got it spot on: 301s are there to keep old URLs pointing to the new ones, but only the new ones should be in the sitemap.
When you've crawled the live site ready to make your sitemap, you can manually right-click and remove any URL you don't want in there before generating it.
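For reference, the redirect side of this might look like the following in an Apache .htaccess (a sketch using the URLs from this thread; assumes mod_alias is available):

```apache
# Permanently (301) redirect the old nested URL to the new flat one.
# Old backlinks and visitors still land on the right page, but only
# the new URL should appear in sitemap_au.xml.
Redirect 301 /psychic-readings/psychic-readings https://www.zenory.com.au/psychic-readings
```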
Kind Regards
Jimmy
-
Hey Jimmy,
Wow thanks so much for your great feedback, much appreciated!
Just want to clarify your answer on the 301s. So it's okay to create the 301s to direct our users to the new URLs, but not good to include them in the sitemap? Am I correct in saying this, or am I totally off track?
I think what's also happened is that the sitemap from Screaming Frog has included both old and new URLs. I'm now seeing two of our contact pages indexed for the com.au site: one is the older URL and the other is the new one.
Let me know your feedback
Cheers again Jimmy
-
Hi Justin,
Yes, as long as WMT is specifically watching the HTTPS website, then unfortunately the problem is not in WMT.
As hectormainar says, check your sitemap in Screaming Frog:
- go to your sitemap.xml and save it to your computer
- change the frog to list mode
- open your sitemap and run the crawl

All the links in the sitemap should report 200, and any 301s should be swapped with the direct versions. The 301 is good to maintain backwards compatibility and allow backlinks and old users to navigate to your new content, but it shouldn't be used as major navigation.
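To make that checklist concrete, the same check can be scripted (a stdlib-only Python sketch of my own, not a Screaming Frog feature; the filename is an example):

```python
# Sketch: list every <loc> in a saved sitemap file and fetch each one
# WITHOUT following redirects, so a 301 shows up as 301 rather than as
# the 200 of its destination page.
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return every <loc> entry from standard sitemap XML."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

class _NoRedirect(urllib.request.HTTPRedirectHandler):
    # Returning None makes 3xx responses raise HTTPError instead of
    # being silently followed.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def status_of(url):
    """HTTP status of url; 301/302 are reported, not followed."""
    opener = urllib.request.build_opener(_NoRedirect)
    try:
        return opener.open(url, timeout=10).status
    except urllib.error.HTTPError as err:
        return err.code

# Usage once sitemap_au.xml is saved locally:
# for url in sitemap_urls(open("sitemap_au.xml").read()):
#     code = status_of(url)
#     print(code, url, "" if code == 200 else "<-- swap for the final URL")
```

Anything that doesn't print 200 is a candidate for removal or replacement before regenerating the sitemap.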
Kind Regards
Jimmy
-
Hey Jimmy,
Thanks for the heads up! Yes, I have been watching this via WMT, and I used Screaming Frog to generate the sitemaps and gave them to my developer, who then gave me the URL to submit to Google.
I also used https. I hope that helps?
Let me know if you have any further questions
Cheers Jimmy thanks again
-
Hi Hectormainar,
I understand what you're saying. Yes, we had https://www.zenory.com.au/psychic-readings/psychic-readings before we updated the URLs to https://www.zenory.com.au/psychic-readings
After doing this we were told to add 301 redirects, so I'm a little confused now as to why it shouldn't be done, as our visitors would still be going to the old URLs?
I used Screaming Frog to generate the sitemaps, and I think it may have included the old URLs? I'm not sure exactly which ones it included. Is there a way to check this?
Thanks for your help
Justin
-
Which URLs do you include in your sitemap? Could you check whether you are trying to index
https://www.zenory.com.au/psychic-readings/psychic-readings or https://www.zenory.com.au/psychic-readings ? The first one is the URL you link to in your menus, but it has a 301 redirect to the second URL format (and the same goes for the rest of the main options). That is quite a bad idea. Please make sure you include the correct address in the sitemap and not the 301-redirected one. That could be causing the problem of Google Webmaster Tools not showing those pages as indexed: although the final page is properly indexed in Google (as you can check by searching site:www.zenory.com.au), GWT is not able to match the two addresses.
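One way to automate that clean-up before resubmitting (a hedged sketch, not a GWT or Screaming Frog feature): resolve every sitemap entry to its final destination and deduplicate:

```python
# Sketch: replace each sitemap entry with the URL it finally resolves
# to, dropping duplicates created when an old and a new URL land on the
# same page. final_url() does a live fetch; swap_redirects() is pure so
# it can take any resolver function.
import urllib.request

def final_url(url):
    """Follow the redirect chain and return the landing URL."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.geturl()  # urlopen follows 3xx automatically

def swap_redirects(urls, resolver=final_url):
    """Map every URL to its resolved target, keeping first occurrences."""
    seen, cleaned = set(), []
    for url in urls:
        target = resolver(url)
        if target not in seen:
            seen.add(target)
            cleaned.append(target)
    return cleaned

# e.g. swap_redirects(["https://www.zenory.com.au/psychic-readings/psychic-readings"])
# should come back as ["https://www.zenory.com.au/psychic-readings"]
# once the 301 is in place.
```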
-
Hi Justin,
It is hard to tell from your screenshot, but which website are you watching in Webmaster Tools? As you are using HTTPS, the website to track would have to be the HTTPS one, as a recent WMT update now classifies these differently.
Having crawled your sites with Screaming Frog, I don't see any smoking guns as to why the pages would not be indexed.
Let me know about the WMT account
Kind Regards
Jimmy
-
Hi Michael,
Thanks for your response! I have also done site:yourdomain, and this is also showing quite low compared to the number of pages submitted. USA is showing 10 pages indexed, AU slightly more, and NZ a lot more.
-
Webmaster Tools is not always a current, accurate reflection of what is actually indexed.
A search in Google for site:yourdomain.com will show the accurate information.