Submitting an 'HTTPS' sitemap.xml to Bing
-
I have been trying to submit my sitemap to Bing [via their Webmaster Tools] for well over a week and it continues to report 'pending'. My site is served over HTTPS and the sitemap is accepted by Google. I questioned Bing about this and got this response:
To set your expectations, our Sitemap fetchers use a different pipeline and because of this, we cannot crawl Sitemaps in HTTPS format. We require that you submit an HTTP version of sitemap in order for Bing to properly crawl the file. Please go ahead and delete the current Sitemap and resubmit a new one in HTTP.
Currently I don't have, and can't have, an HTTP version of my site and sitemap, and my developers tell me that a work-around would take around three hours of dev time, which I'm not sure I want to invest in [I have more important things to concentrate my spend on!].
Has anyone else faced this problem? Is there any quick/cheap alternative, or do I just accept that Bing won't crawl my sitemap until they update their end?!
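For anyone hitting the same 'pending' status, one cheap sanity check before spending dev time is to make sure the sitemap file itself parses cleanly and that every URL in it uses the protocol you expect. A minimal sketch using only Python's standard library (the sitemap content below is invented for illustration, not taken from the thread):

```python
import xml.etree.ElementTree as ET

# Sitemap elements live in this XML namespace per the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def check_sitemap(xml_text):
    """Parse a sitemap and return (all <loc> URLs, URLs not served over HTTPS)."""
    root = ET.fromstring(xml_text)  # raises ParseError if the file is malformed
    urls = [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]
    non_https = [u for u in urls if not u.startswith("https://")]
    return urls, non_https

# Hypothetical sitemap content for illustration.
sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>http://www.example.com/old-page</loc></url>
</urlset>"""

urls, non_https = check_sitemap(sample)
print(len(urls), non_https)
```

A parse error or a stray mixed-protocol URL won't explain Bing's HTTPS limitation, but it rules out the cheap failure modes first.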
-
Hi Matthew, your response makes perfect sense. Thankfully Bing [seems to be!!] indexing my site - well, certainly the pages that count, as we are showing up in search results. We've been trying to come up with a work-around, but all solutions involve an element of dev time, which I don't really think is money well spent - at the moment, anyway!
Cheers
Iain
-
Hey Iain. If it were me, I'd probably just accept that Bing can't crawl the sitemap and let it go. XML sitemaps are important, but not something that will generally make a huge, life-altering difference to your website's performance.
Now, I say "probably" because I'm wondering whether you are having indexing problems with Bing. Are there pages you want Bing to index that it can't reach easily (or at all) without an XML sitemap? If so, then it may be worth the three hours of dev time to get the XML sitemap in place. Alternatively, you could find other ways to link to the pages Bing isn't currently indexing (on your site or others) to get them noticed.
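If the worry is specific pages going unindexed, one way to act on that suggestion is to diff the sitemap's URL list against the pages you've confirmed Bing has indexed (e.g. via manual site: checks or a Bing Webmaster export), then point internal links at the gap. A rough sketch, with hypothetical URLs:

```python
def pages_needing_links(sitemap_urls, indexed_urls):
    """Return sitemap URLs that aren't in the indexed set,
    preserving sitemap order so priority pages stay first."""
    indexed = set(indexed_urls)
    return [u for u in sitemap_urls if u not in indexed]

# Hypothetical data: the full sitemap vs. what site: checks confirmed.
sitemap = [
    "https://example.com/",
    "https://example.com/pricing",
    "https://example.com/blog/post-1",
]
indexed = ["https://example.com/", "https://example.com/blog/post-1"]

print(pages_needing_links(sitemap, indexed))
```

The output is the list of pages worth linking to more prominently, which is the cheap alternative to the sitemap work-around.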
Related Questions
-
AMP Session Stitching - How to deal with Google's Client ID AMP Policy
Hello, I recently attended SMX East, where the concept of 'session stitching' for AMP was brought up (https://www.stonetemple.com/amp-tech-guide/). I reached out to my development team and they told me they could do it, but that we would need to agree to the new TOS changes and make users aware of them: https://support.google.com/analytics/answer/7486055 Has anyone here done something like this? And if so, how did you deal with Google's Client ID AMP policy? Thank you all! -Margarita
Reporting & Analytics | MargaritaS1
-
How To Stop Google's "Fetch & Render" From Showing Up In Google Analytics
Hi all, Google's "Fetch & Render" tool (found in Google Search Console) includes the ability to request indexing of certain pages from my website on demand. Unfortunately, every time I ask Google to index a page, it registers as a bounce in Google Analytics. Also, if it means anything, my website (www.knowtro.com) is a single-page application, functioning similarly to Google. If you know of any solution to this problem, please help! I originally thought Google would block its own Fetch & Render crawler from Google Analytics, but that doesn't seem to be the case. Thanks, Austin
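The thread doesn't include a confirmed fix, but the usual approach is to exclude Google's crawler traffic from analytics, either with a view filter or, if you process raw hits yourself, by filtering on the user agent. A hedged sketch of the latter ("Googlebot" is the standard crawler token; the hit records here are invented):

```python
def drop_bot_hits(hits, markers=("Googlebot",)):
    """Filter out analytics hits whose user agent contains a known
    crawler marker, so on-demand fetches don't register as bounces."""
    return [h for h in hits if not any(m in h["ua"] for m in markers)]

# Hypothetical raw hits: one from Google's fetcher, one from a real visitor.
hits = [
    {"ua": "Mozilla/5.0 (compatible; Googlebot/2.1)", "page": "/"},
    {"ua": "Mozilla/5.0 (Windows NT 10.0)", "page": "/"},
]

print(len(drop_bot_hits(hits)))
```

Inside Google Analytics itself the equivalent is a view-level filter on the same signal rather than code, but the logic is the same.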
Reporting & Analytics | A_Krauss0
-
Why does Google stubbornly keep indexing my http URLs instead of the https ones?
I moved everything to https in November, but plenty of pages are still indexed by Google as http instead of https, and I am wondering why. Example: http://www.gomme-auto.it/pneumatici/barum correctly redirects permanently to https://www.gomme-auto.it/pneumatici/barum Nevertheless, if you search for pneumatici barum: https://www.google.it/search?q=pneumatici+barum&oq=pneumatici+barum The third organic result listed is still http. Since we moved to https, Google's crawler has visited that page tens of times, most recently two days ago, but it doesn't seem to update the protocol in Google's index. Does anyone know why? My concern is that when I use APIs like SEMrush and Ahrefs, I have to query both the http and https versions; for a total of around 65k URLs, that wastes a lot of my quota.
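On the quota point: since the http version permanently redirects to https, one way to avoid querying both variants is to collapse every URL to its canonical https form before calling the API, halving the request count. A small sketch using only the standard library:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    """Force the https scheme so http/https duplicates collapse to one
    canonical URL before sending them to a rate-limited API."""
    parts = urlsplit(url)
    return urlunsplit(("https",) + tuple(parts[1:]))

# The http and https variants of the same page from the question.
mixed = [
    "http://www.gomme-auto.it/pneumatici/barum",
    "https://www.gomme-auto.it/pneumatici/barum",
]

print(sorted(set(canonicalize(u) for u in mixed)))
```

This only makes sense when every http URL really does 301 to its https twin, as described above; pages without a redirect would need to be queried as-is.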
Reporting & Analytics | max.favilli0
-
What's the best way to figure out which keywords are the highest-converting?
We have a client using Google Analytics. They currently have three goals set up to track when website visitors fill out three forms: Form A, Form B, and Form C. I can easily figure out which traffic sources have driven the highest number of conversions on each form (Search for Form A, for instance, or Referrals for Form B), but of course, when I try to drill down into the search terms that have driven conversions on each form, I get stuck in "not provided" territory. I'd like to know what people are searching for when they ultimately fill out each form. This would answer questions like: are people already familiar with us when they convert, or did they find our website when searching for something we sell? It seems like there must be a way, using Google Webmaster Tools, Analytics, or a third-party app, to answer the question: which keyword searches are responsible for the highest number of conversions? Especially on a website with 10,000+ visits/month and a healthy dose of search traffic. Right? Where am I missing this information?
Reporting & Analytics | timfrick1
-
Webmaster Tools Indexed pages vs. Sitemap?
Looking at Google Webmaster Tools, I'm noticing a few things. On most sites I look at, the number of indexed pages in the Sitemaps report is less than 100% of what was submitted (e.g. 122 indexed out of 134 submitted), while the number of indexed pages in the Index Status report is usually higher. For example, one site shows over 1,000 pages indexed in the Index Status report, but the Sitemaps report says something like 122 indexed. My questions: Is the Sitemaps report always a subset of the URLs submitted in the sitemap? Will the number of pages indexed there always be lower than or equal to the number of URLs referenced in the sitemap? Also, if there is a big disparity between the sitemap-submitted URLs and the indexed URLs (like 10x), is that concerning to anyone else?
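For what it's worth, the disparity can be summarized mechanically: the Sitemaps report only counts submitted URLs, while Index Status counts everything Google holds for the site. A quick sketch using the figures from the question (assuming those two reports really do measure those two different populations):

```python
def coverage_report(submitted, indexed_from_sitemap, indexed_total):
    """Summarize the gap between sitemap coverage and total indexation.

    Returns (percent of submitted URLs indexed,
             count of indexed pages found outside the sitemap)."""
    sitemap_pct = indexed_from_sitemap / submitted * 100
    outside = indexed_total - indexed_from_sitemap
    return round(sitemap_pct, 1), outside

# 134 submitted, 122 of those indexed, 1000+ indexed overall (per the question).
print(coverage_report(134, 122, 1000))
```

A large "outside" count isn't necessarily bad (it may just be URLs you never submitted), but it is the number worth auditing when the 10x disparity appears.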
Reporting & Analytics | IrvCo_Interactive1
-
WMT and 'Links To Your Site'
Does anyone else find that links from years ago are almost continually being added to the 'Links To Your Site' list when they weren't previously reflected? I'm seeing links that were added to directories in 2008 (by whoever was doing the SEO then) only showing up in the last week or so, when they weren't in the list a few months ago. I don't suppose there's much I can do - it's just annoying in that it adds more people to contact to have nonsense removed.
Reporting & Analytics | Martin_S0
-
Google Analytics Organic search queries aren't being updated, even though I'm still seeing results in all our typical results pages.
We pushed some new changes to the site and Google Analytics is no longer updating the organic search queries listing, even though traffic is consistent and we're still landing results in all our typical keyword searches. Any ideas?
Reporting & Analytics | unclekaos0
-
Will Google start trimming 'stale' sites' rank?
With Google's recent focus on reducing the rank of content farms and low-value sites, I am interested in the SEO community's view on whether Google will start devaluing stale sites. I find it a bit frustrating that in the top 5 for my main key phrase, there is one site that has NO content, just an error, and another blog that has not updated its content in two years. How can blogs that do not blog be considered valuable enough by Google to rank in the top 5? How can sites that return 404 or 500 for ALL their pages even be considered a site, let alone rank 2nd? I am interested to see others' experiences and thoughts on 'user experience' clean-ups by Google and why these types of sites get missed.
Reporting & Analytics | oznappies0