Recovering from Sitemap Issues with Bing
-
Hi all,
I recently took over SEO efforts for a large e-commerce site (which I would prefer not to disclose). About a month ago, I began to notice a significant drop in traffic from Bing and discovered in Bing Webmaster Tools that three different versions of the sitemap had been submitted and Bing was crawling all three. I removed the two out-of-date sitemaps and re-submitted the up-to-date version. Since then, I have yet to see Bing traffic rebound, and the number of pages indexed by Bing is still dropping daily. During this time there has been no issue with traffic from Google.
Currently I have 1.3 million pages indexed by Google, while Bing has dropped to 715K (it was at 755K last week and was on par with Google several months ago). I know that no major changes have been made to the site in the past year, so I can't point to anything other than the sitemap issue to explain this. If this is indeed the only issue, how long should I expect to wait for Bing to re-index the pages? In the interim I have been manually submitting important pages that aren't currently in the index.
Any insights or suggestions would be very much appreciated!
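Since the interim fix is submitting pages by hand, the batch can also be scripted. A minimal sketch, assuming an IndexNow key file is already hosted on the site (Bing supports the IndexNow protocol; the host, key, and URLs below are hypothetical placeholders, not from this site):

```python
import json
import urllib.request

def build_payload(host: str, key: str, urls: list[str]) -> bytes:
    """Build the JSON body for an IndexNow batch submission."""
    return json.dumps({"host": host, "key": key, "urlList": urls}).encode("utf-8")

def submit_urls(host: str, key: str, urls: list[str]) -> int:
    """POST a batch of URLs to Bing's IndexNow endpoint; returns the HTTP status.

    The key file must be served at https://<host>/<key>.txt so Bing can
    verify ownership; a 200/202 response means the batch was accepted.
    """
    req = urllib.request.Request(
        "https://www.bing.com/indexnow",
        data=build_payload(host, key, urls),
        headers={"Content-Type": "application/json; charset=utf-8"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    # Hypothetical values: replace with your own host, IndexNow key, and URLs.
    status = submit_urls(
        "www.example-store.com",
        "0123456789abcdef0123456789abcdef",
        ["https://www.example-store.com/category/widgets/"],
    )
    print(status)
```

This only queues the URLs for crawling; it doesn't guarantee or speed up re-indexing beyond what Bing decides to do.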
-
Hi there.
Well, here is my question: how can you be sure the drop in Bing traffic is related to the sitemap resubmission at all? What about, simply, rankings? Or any of the dozens of other possible reasons?
Related Questions
-
How does changing sitemaps affect SEO
Hi all, I have a question regarding changing the size of my sitemaps. Currently I generate sitemaps in batches of 50k URLs. A situation has come up where I need to reduce that batch size to 15k in order to be crawled by one of our licensed services. I haven't been able to find any documentation on whether changing the size of my sitemaps (but not the pages included in them) will negatively affect my rankings or my SEO efforts in general. If anyone has any insight or has experienced this with their own site, please let me know!
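The regeneration itself is mechanical: split the same URL set into smaller files and reference them all from one sitemap index. A rough sketch (the filenames, domain, and batch size are illustrative, not tied to any particular generator; the sitemaps protocol's own ceiling is 50k URLs per file):

```python
import math
from xml.sax.saxutils import escape

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemaps(urls, batch_size=15_000, base="https://www.example.com"):
    """Split `urls` into sitemap documents of at most `batch_size` URLs each,
    plus a sitemap index referencing them. Returns {filename: xml_text}."""
    files = {}
    n_files = max(1, math.ceil(len(urls) / batch_size))
    for i in range(n_files):
        batch = urls[i * batch_size:(i + 1) * batch_size]
        entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in batch)
        files[f"sitemap-{i + 1}.xml"] = (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            f'<urlset xmlns="{SITEMAP_NS}">\n{entries}\n</urlset>\n'
        )
    # Build the index last, over the sitemap files generated above.
    refs = "\n".join(
        f"  <sitemap><loc>{base}/{name}</loc></sitemap>" for name in files
    )
    files["sitemap_index.xml"] = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        f'<sitemapindex xmlns="{SITEMAP_NS}">\n{refs}\n</sitemapindex>\n'
    )
    return files
```

Because the page set is unchanged, only the file boundaries move; search engines fetch via the index either way.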
Technical SEO | | Jason-Reid
-
Google Search Console - Sitemap
Hi all, quick question. I'm trying to update my sitemap via Google Search Console using a sitemap.xml file that I've created with ScreamingFrog. However, when trying to submit it, it seems that Google only allows sitemaps located at a path within your domain (i.e. www.example.com/sitemap.xml), as opposed to letting you directly upload a sitemap.xml file. Is there any way that I can easily upload my sitemap.xml file? Or is there an easy way to upload the file to a path on my domain so I can submit it via the URL? Any insight would be much appreciated! Best, Sung
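Once the file has been copied to the web root (most hosts expose FTP or a file manager for this), it can help to verify the hosted URL responds correctly before submitting it in Search Console. A small sketch (the URL is a placeholder; the content check is a loose heuristic, not a full validation):

```python
import urllib.request

def looks_like_sitemap(head: bytes) -> bool:
    """Heuristic: the opening bytes of a sitemap contain <urlset> or <sitemapindex>."""
    return b"<urlset" in head or b"<sitemapindex" in head

def sitemap_is_live(url: str) -> bool:
    """Fetch the hosted sitemap URL and confirm it serves sitemap-like XML."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status == 200 and looks_like_sitemap(resp.read(512))
    except OSError:
        return False

if __name__ == "__main__":
    # Hypothetical path: wherever you uploaded the ScreamingFrog export.
    print(sitemap_is_live("https://www.example.com/sitemap.xml"))
```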
Technical SEO | | hdeg
-
Sitemap.xml for a multilang site
Hi all, I have some questions about a multilingual sitemap.xml. We use subdirectories of the same gTLD domain for each language:
example.com/pt-br/
example.com/us/
example.com/es/
How should I do the sitemap.xml in this case? I thought of three alternatives:
1. Should I do a sitemap_index.xml for each language and make category sitemaps under each? Examples:
http://www.example.com/pt-br/sitemap_index.xml
http://www.example.com/en/sitemap_index.xml
http://www.example.com/es/sitemap_index.xml
2. Should I do only one sitemap_index.xml covering all categories of all languages? Examples:
http://www.example.com/sitemap_index.xml
http://www.example.com/pt-br/sitemap_categorias_1.xml
http://www.example.com/es/sitemap_categorias_1.xml
http://www.example.com/us/sitemap_categorias_1.xml
3. Should I do a single sitemap declaring all language alternates per URL?
<url>
  <loc>http://www.example.com/us/</loc>
  <xhtml:link rel="alternate" hreflang="es" href="http://www.example.com/es/" />
  <xhtml:link rel="alternate" hreflang="us" href="http://www.example.com/us/" />
  <xhtml:link rel="alternate" hreflang="pt-br" href="http://www.example.com/pt-br/" />
</url>
Thanks for any advice.
Technical SEO | | mobic
-
Which sitemap to keep - HTTP or HTTPS (or both)
Hi, I just finished upgrading my site to the SSL version (like so many other webmasters, now that it may be a ranking factor). Fixed all links, CDN links are now secure, etc., and 301-redirected all pages from HTTP to HTTPS. Changed the property in Google Analytics from HTTP to HTTPS and added the HTTPS version in Webmaster Tools. So far, so good. Now the question is: should I add the HTTPS version of the sitemap in the new HTTPS site in Webmasters, or retain the existing HTTP one? Ideally, switching over completely to the HTTPS version by adding a new sitemap would make more sense, as the HTTP version of the sitemap would now be redirected to HTTPS anyway. But the last thing I want is to get penalized for duplicate content. Could you please advise, as I am still a rookie in this department? If I should add the HTTPS sitemap version to the new site, should I delete the old HTTP one, or is there no harm in retaining it?
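If the existing sitemap file still lists http:// URLs, rewriting it to https:// avoids submitting a file full of URLs that now 301-redirect. A tiny sketch of that rewrite (it assumes a standard sitemap where every `<loc>` value begins with the scheme):

```python
import re

def https_sitemap(xml_text: str) -> str:
    """Rewrite every <loc> entry in a sitemap from http:// to https://."""
    return re.sub(r"(<loc>)http://", r"\1https://", xml_text)
```

After that, the HTTPS property in Webmaster Tools gets the HTTPS sitemap; keeping the HTTP one around serves no purpose once everything redirects.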
Technical SEO | | ashishb01
-
Exact match and Bing
Hi folks, fairly new to SEO, and I have two questions: 1. I have an SEO group doing work for us on a particular keyword for the site http://bluetea.com.au/. However, I've noticed that we only rank for the exact match of the keyword we ask them to work on - never a variation or even the plural. Any thoughts? 2. http://bluetea.com.au/ has been around for almost 2 years, yet we still have absolutely no presence in Bing / Yahoo. What am I missing? Any tips? Articles I must read? Thank you in advance 🙂
Technical SEO | | Intrested
-
Penalty issues
Hi there, I'm working on a site that has been badly hit by Penguin. The reasons are clear: exact-match blog network links and tons of spammy exact-match links such as comment spam, low-quality directories, the usual junk. The spammy links point mainly to two pages, targeting keyword 1 and keyword 2. I'd like to remove these two pages from Google, as they don't even rank in Google now, and create one high-quality page that targets both keywords, as they are similar. The dilemma I have is that these spammy pages still get traffic from Bing and Yahoo, and it's profitable traffic. Is there a safe way to remove the pages from Google and leave them for Bing and Yahoo? Peter
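One approach often suggested for exactly this split (hedged - test on a low-stakes page first) is a robots meta tag addressed only to Google's crawler. Google documents that the meta name can target a specific user agent, and Bingbot/Slurp do not act on the "googlebot" name, so the pages stay indexed there:

```html
<!-- Placed on the two penalized pages: only Google's crawler obeys this,
     so the pages can remain indexed (and keep earning traffic) in Bing/Yahoo. -->
<meta name="googlebot" content="noindex">
```

Note that Google has to recrawl the pages before they drop out, and a later 301 of those pages to the new consolidated page would remove them from Bing and Yahoo as well.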
Technical SEO | | PeterM22
-
Issue with .uk.com domain
Hi, I have rockshore.uk.com, which is not indexing properly. The internal pages do not show up for the text they have on them, or for their title tags. The site is on the aekmps Shops platform. I understand that .uk.com is not a proper TLD, but I think I have a subdomain of .uk.com. Can anyone help? Thanks
Technical SEO | | Turkey