Desktop & Mobile Sitemaps Covering The Same Ground - Any Benefit To Having Both?
-
If my URL structure is the same for the desktop and mobile experience, is there any benefit to creating a mobile sitemap, considering that the sitemap for our desktop site covers the same URLs?
-
Yes, it's responsive design with the exact same URLs for both mobile and desktop.
Thanks for your helpful response!
-
Hi John,
When you say that the URLs have the same structure, do you mean that they are different URLs but organised the same way (e.g. www.domain.com is the same as m.domain.com, www.domain.com/page-1 is the same as m.domain.com/page-1, etc.)? Or is it a responsive site with the same URLs regardless of device?
The primary benefit of a sitemap is discovery: it helps search engine crawlers find your URLs. If you have a responsive site, you don't need a separate mobile sitemap. If you serve a different set of URLs to mobile devices, even if it follows the same structure as the desktop site, I'd recommend creating a mobile sitemap.
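If you do end up with separate mobile URLs, Google's sitemap annotation for that setup can live in the desktop sitemap itself rather than in a second file. A minimal sketch, using the hypothetical www/m hosts from the question:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.domain.com/page-1</loc>
    <!-- Points crawlers at the mobile equivalent of this desktop URL -->
    <xhtml:link rel="alternate"
                media="only screen and (max-width: 640px)"
                href="https://m.domain.com/page-1"/>
  </url>
</urlset>
```

Each mobile page would then carry a rel="canonical" link back to its desktop counterpart, so the two URL sets are tied together in both directions.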
Hope that helps!
-
John,
Yes! In search marketing today, everything is about an edge. Anything you can do to make it easier for search engines to determine that your pages are the best result increases the possibility of a higher ranking. Well over 50% of search is on mobile devices today, so serving this traffic well is a priority.

You should also take some time to look at your menus on mobile devices to see whether they are organized in a way that is convenient for a mobile user. I would highly recommend testing on a smartphone with a smaller screen to make sure the buttons are easy to use. Although these UI adjustments don't directly affect your rankings, they do affect user engagement, which in turn lowers your bounce rate and improves your conversion rate; that, in turn, is factored into your rankings.
Going back to my first point: if you have video or audio content, I would recommend submitting sitemaps for those as well.
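For video, the dedicated sitemap extension looks roughly like this; a minimal sketch with placeholder URLs and metadata, not actual site content:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>https://www.example.com/videos/product-demo</loc>
    <video:video>
      <video:thumbnail_loc>https://www.example.com/thumbs/demo.jpg</video:thumbnail_loc>
      <video:title>Product demo</video:title>
      <video:description>A short walkthrough of the product.</video:description>
      <video:content_loc>https://www.example.com/media/demo.mp4</video:content_loc>
    </video:video>
  </url>
</urlset>
```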
Hope this helps,
Ron
Related Questions
-
Should I submit an additional sitemap to speed up indexing?
Hi all, Wondered if there was any wisdom on this that anyone could send my way? I'm moving a set of pages from one area of the site to another, to bring them up the folder structure so they generally make more sense. Our URLs are very long in some cases, so this ought to help with some rationalisation there too. We will have redirects in place, but the pages I'm moving are important and I'd like the new paths to be indexed as soon as possible. In such an instance, can I submit an additional sitemap with just these URLs to get them indexed quicker (or to reaffirm that indexing from the initial crawl)? The site is thousands of pages. Any benefits / disadvantages anyone could think of? Any thoughts very gratefully received.
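A supplemental sitemap like the one described could be generated with a few lines of script and then submitted in Search Console alongside the main one. A minimal sketch in Python, using placeholder URLs rather than the actual moved paths:

```python
# Minimal sketch: build a supplemental sitemap for a handful of moved URLs.
# The URLs below are placeholders; swap in the real new paths.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemap XML string containing one <url> entry per URL."""
    ET.register_namespace("", SITEMAP_NS)  # default namespace, no prefix
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for url in urls:
        url_el = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(url_el, f"{{{SITEMAP_NS}}}loc").text = url
    return ET.tostring(urlset, encoding="unicode")

moved = [
    "https://www.example.com/widgets/blue-widget",
    "https://www.example.com/widgets/red-widget",
]
print(build_sitemap(moved))
```

The resulting file would sit alongside the main sitemap (or be listed in a sitemap index) rather than replacing it; duplicate entries across sitemaps are not a problem.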
Intermediate & Advanced SEO | ceecee0
-
New Subdomain & Best Way To Index
We have an ecommerce site, we'll say at https://example.com. We have created a series of brand-new landing pages, mainly for PPC and social, at https://sub.example.com, but we would also like these to get indexed. They are built on Unbounce, so there is an easy option to simply uncheck the box that says "block page from search engines"; however, I am trying to speed up this process and also do it the best/correct way. I've read a lot about how we should build landing pages in a subdirectory, but one of the main issues we are dealing with is long page load times on https://example.com, so I wanted a kind of fresh start. I was thinking a potential solution to index these quickly/correctly was to make a redirect such as https://example.com/forward-1 -> https://sub.example.com/forward-1 and then submit https://example.com/forward-1 to Search Console, but I am not sure if that will even work. Another possible solution was to link to some of the subdomain pages from the root domain, say right on the pages or in the navigation. Also, will we definitely be hurt by 'starting over' with a new website, even though MozBar shows the subdomain https://sub.example.com has the same Domain Authority (DA) as the root domain https://example.com? Recommendations and steps to be taken are welcome!
Intermediate & Advanced SEO | Markbwc0
-
AMP Benefits
Hello, Does AMP have ranking benefits? Should I AMP just my posts, or all the pages of my website (product pages, homepage, etc.)? Thank you,
Intermediate & Advanced SEO | seoanalytics0
-
Pagination & Canonicals
Hi, I've been looking at how we paginate our product pages and have a quick question on canonicals. Is this the right way to display them, or should the canonical point to the main page http://www.key.co.uk/en/key/euro-containers-stacking-containers, so Google doesn't pick up duplicate meta information? Thanks!
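The two patterns being weighed look roughly like this. A `?page=2` parameter is assumed here purely for illustration, since the site's actual pagination scheme isn't shown in the question:

```html
<!-- Option A: each paginated page canonicals to itself,
     so deeper pages stay individually indexable -->
<link rel="canonical"
      href="http://www.key.co.uk/en/key/euro-containers-stacking-containers?page=2">

<!-- Option B: every page in the series points at the main category page,
     consolidating signals but hiding products on deeper pages -->
<link rel="canonical"
      href="http://www.key.co.uk/en/key/euro-containers-stacking-containers">
```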
Intermediate & Advanced SEO | BeckyKey0
-
Application & understanding of robots.txt
Hello Moz World! I have been reading up on robots.txt files, and I understand the basics. I am looking for a deeper understanding of when to deploy particular tags, and when a page should be disallowed because it will affect SEO. I have been working with a software company that has a News & Events page which I don't think should be indexed. It changes every week, and is only relevant to potential customers who want to book a demo or attend an event, not so much to search engines. My initial thinking was that I should use a noindex/follow tag on that page, so the page would not be indexed but all the links would be crawled. I decided to look at some of our competitors' robots.txt files: Smartbear (https://smartbear.com/robots.txt), b2wsoftware (http://www.b2wsoftware.com/robots.txt) and labtech (http://www.labtechsoftware.com/robots.txt). I am still confused about what type of tags I should use, and how to gauge which set of tags is best for certain pages. I figured a static page is pretty much always good to index and follow, as long as it's public, and I should always include a sitemap file. But what about a dynamic page? What about pages that are out of date? Will this help with soft 404s? This is a long one, but I appreciate all of the expert insight. Thanks ahead of time for all of the awesome responses. Best Regards, Will H.
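The noindex/follow idea described above is a page-level tag, not a robots.txt rule, and the distinction matters. A minimal sketch, assuming a hypothetical /news-events/ path:

```html
<!-- In the <head> of the News & Events page: keep it out of the index,
     but let crawlers follow its links -->
<meta name="robots" content="noindex, follow">

<!-- By contrast, a robots.txt rule such as "Disallow: /news-events/"
     blocks crawling entirely, which means the noindex tag on the page
     would never be seen by the crawler -->
```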
Intermediate & Advanced SEO | MarketingChimp100
-
Menu Structure & SEO
Hi, I have been trying to decide whether we need to change our menu structure at http://www.key.co.uk/en/key/ We have a lot of subcategories which are not in the menu structure, and for SEO I wonder whether it's best to have menu drop-downs, so that if a customer hovers over one category, it will display all the subcategories within it. I am concerned that subcategories we are trying to rank are many levels away from the homepage. E.g. if you want to find leather office chairs from the homepage, you have to go to the 'More categories' link, then choose Seating > Office Seating > Leather Office Seating. Users need to do a lot of navigating before seeing what we offer. I would prefer it if a user could see these options in the menu when they hover over it. Does anyone think this would help SEO, or just the customer journey? Thank you
Intermediate & Advanced SEO | BeckyKey0
-
How to best serve images optimised for mobile devices in WordPress
Issue: images are too large for mobile devices in some articles, cannot be shrunk responsively, and should be reduced to help cut page size / improve site speed on small-screen devices. I am thinking of switching depending on the user agent, such as iPhone / Android devices, and serving up an optimised, reduced-size image. I envisage this working in the background (i.e. hidden from authors) so it is easy. Platform: WordPress. Would like a solution or some feedback on people's experiences with this problem. No good plugins found that can handle this, so it would probably need to be custom coded, but with no processing overhead unless it is generated upon publication of the article. Thanks peeps, Keith H
Intermediate & Advanced SEO | Greywood0
-
Does a sitemap override Google parameter handling?
This question might seem silly, but I'll ask anyway. We have an eCommerce site with a ton of duplicate content, mostly caused by faceted navigation. In researching ways to reduce the clutter, I've decided to use Google parameter handling to stop Googlebot from crawling pages with certain parameters, like: sort order, page #, etc... Now my question: If I set all of these parameters so that Googlebot doesn't crawl the grids, how will they ever find the individual product pages? We do upload a sitemap with all of the product pages. Does this solve my issue? Or, should I handle the duplicate content with noindex, follow tag? Or, is there an even better way? Thanks
Intermediate & Advanced SEO | rhoadesjohn0