Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will still be viewable), we have locked both new posts and new replies.
Desktop & Mobile XML Sitemap Submitted But Only Desktop Sitemap Indexed On Google Search Console
-
Hi!
The Problem
We have submitted a sitemap index to GSC. Within that index there are four XML sitemaps, including one for the desktop site and one for the mobile site. The desktop sitemap has 3,300 URLs, of which Google has indexed (according to GSC) approximately 3,000. The mobile sitemap has 1,000 URLs, of which Google has indexed 74.
The pages are crawlable and the site structure is logical. Performing a landing page URL search in Google Analytics (filtered to the google/organic source/medium), I can see that hundreds of those mobile URLs are receiving organic landings. A mobile search for a long-tail keyword from a randomly selected page shows a SERP result for the mobile page that, judging by GSC, has not been indexed.
Could this be because we have recently added rel=alternate tags on our desktop pages (and, of course, corresponding canonical tags on mobile)? Would Google then decline to index the rel=alternate page versions?
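For reference, the switchboard annotations described above typically look like the following pair of tags (the example.com URLs and the 640px breakpoint are placeholders, not our actual markup):

```html
<!-- On the desktop page, e.g. https://www.example.com/page-1 -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="https://www.example.com/m/page-1">

<!-- On the corresponding mobile page, e.g. https://www.example.com/m/page-1 -->
<link rel="canonical" href="https://www.example.com/page-1">
```

The rel=alternate tag on desktop points Google to the mobile equivalent, and the rel=canonical on mobile consolidates signals back to the desktop URL.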
Thanks for any input on this one.
-
Hi Allison, any updates on this?
From my understanding, it is possible that Google is not indexing the mobile versions of pages if they simply correspond to the desktop pages (and are indicated as such with the rel=alternate mobile switchboard tags). With that information, Google may simply index the desktop pages and then display the mobile URLs in mobile search results.
It is also possible that the GSC data is not accurate. If you do a 'site:' search for your mobile pages (I would try something like 'site:domain/m/' and see what shows up), does it show a higher number of mobile pages than what you're seeing in GSC?
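Alongside the 'site:' check, it's worth confirming how many URLs the mobile sitemap actually lists before comparing against GSC. A minimal sketch using only the Python standard library (the two-URL sample sitemap below is made up purely for illustration):

```python
import xml.etree.ElementTree as ET

# Sitemap files use this namespace for all elements.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def count_urls(sitemap_xml: str) -> int:
    """Count the <loc> entries inside <url> elements of a urlset sitemap."""
    root = ET.fromstring(sitemap_xml)
    return len(root.findall(f"{SITEMAP_NS}url/{SITEMAP_NS}loc"))

# Hypothetical two-URL mobile sitemap, just to demonstrate the count.
sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/m/page-1</loc></url>
  <url><loc>https://www.example.com/m/page-2</loc></url>
</urlset>"""

print(count_urls(sample))  # prints 2
```

In practice you would fetch the live sitemap file and pass its contents to `count_urls`, then compare the total against the "submitted" figure GSC reports for that sitemap.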
Can you check data for your mobile rankings and see what URLs are being shown for mobile searchers? If your data is showing that mobile users are landing on these pages from search, this would indicate that they are being shown in search results, even if they're not showing up as "indexed" in GSC.
-
Apologies for the delayed reply, and thank you for providing this information!
Has there been any change in this trend over the last week? I do know that subfolder mobile sites are generally not recommended by search engines. That being said, I do not feel the mobile best practice would change as a result. Does the site automatically redirect the user based on their device? If so, be sure Google is redirecting appropriately as well.
"When a website is configured to serve desktop and mobile browsers using different URLs, webmasters may want to automatically redirect users to the URL that best serves them. If your website uses automatic redirection, be sure to treat all Googlebots just like any other user-agent and redirect them appropriately."
Here is Google's documentation on best practices for mobile sites with separate URLs. I do believe the canonical and alternate tags should be left in place. It may be worth experimenting with the removal of these mobile URLs from the sitemap, though I feel this is more of a redundancy issue than anything.
I would also review Google's documentation on 'Common Mobile Mistakes', perhaps there is an issue that is restricting search engines from crawling the mobile site efficiently.
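If the device-based redirect is done server-side, a minimal sketch of the pattern follows (Apache mod_rewrite with mod_headers; the user-agent list and the /m/ path are assumptions based on your setup, not a definitive rule set). The key point from the quoted guidance is that Googlebot receives the same rules as any other client, and the Vary header signals that responses depend on the user agent:

```apache
# Redirect likely-mobile user agents to the /m/ subdirectory.
# Googlebot is deliberately NOT special-cased: it matches the same
# conditions as any other client with a mobile user-agent string.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (iphone|ipod|android.*mobile|blackberry) [NC]
RewriteCond %{REQUEST_URI} !^/m/
RewriteRule ^(.*)$ /m/$1 [R=302,L]

# Tell caches and crawlers that the response varies by user agent.
Header append Vary User-Agent
```

A temporary (302) redirect is shown because the desktop URL remains the canonical version; the mobile URL is an alternate, not a replacement.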
Hope that helps!
-
Hi Paul and Joe
Thanks for the reply!
Responsive is definitely in the works...
In the meantime to answer:
-
GSC is set up for the mobile site. However, it's not on a subdomain; it's a subdirectory mobile site. So rather than m.site.com, we have www.site.com/m for the mobile pages. A sitemap has been submitted, and that's where I can see the data shown in the image.
-
Because the mobile site is a subdirectory site, the data becomes a little blended with the main domain data in Google Search Console. If I want to see Crawl Stats, for example, Google advises: "To see stats and diagnostic information, view the data for (https://www.site.com/)."
-
Re: "My recommendation is to remove the XML sitemap and rely on the rel=alternate/canonical tags to get the mobile pages indexed. Google's John Mueller has stated that you do not need a mobile XML sitemap file." I had read this previously, but due to the sub-directory setup of the site, the mobile sitemap became part of the sitemap index, rather than our having just one large sitemap.
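For what it's worth, keeping the mobile file as a child of the index is structurally fine. A sitemap index with separate desktop and mobile children is just this (filenames and URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.example.com/sitemap-desktop.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-mobile.xml</loc></sitemap>
</sitemapindex>
```

Whether the mobile child stays or goes, the index itself remains valid; the open question is only whether listing alternate-URL pages adds anything beyond redundancy.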
Thoughts?
-
As Joe says, set up a separate GSC profile for the mdot subdomain. Then use that to submit the mdot sitemap directly if you wish. You'll get vastly better data about the performance of the mdot site by having it split out, instead of mixed into and obfuscated by the desktop data.
Paul
-
Hi Alison,
While this is a bit late, I would recommend moving to a responsive site when/if possible. Much easier to manage, fewer issues with search engines.
My recommendation is to remove the XML sitemap and rely on the rel=alternate/canonical tags to get the mobile pages indexed. Google's John Mueller has stated that you do not need a mobile XML sitemap file.
Also, do you have Google Search Console set up for both the m. mobile site and the desktop version? It does not appear so, since all sitemaps are listed in one property in your screenshot. If not, I recommend setting this up, as you may gain some valuable insights into how Google is crawling the mobile site.
I'd also review Google's Common Mobile Mistakes guide to see if any of these issues could be impacting your situation. Hope this helps!