Bing Webmaster Tools Incompatibility Issues with the New Microsoft Edge Browser
-
Our client received an email from Bing Webmaster Tools saying:
"We have identified 4 known issues with your website in Microsoft Edge – the new default browser for Windows 10 and Bing." Of the four problems mentioned, only two seem relevant (maybe):
- We’ve found that this webpage may include HTML markup that treats Microsoft Edge differently from other modern browsers. The new EdgeHTML rendering engine for Microsoft Edge is document-mode agnostic and designed for fast, modern rendering. We recommend that you implement one code base for all modern browsers and include Microsoft Edge as part of your modern browser test matrix.
- **We've found that this webpage may have missing vendor-specific prefixes** or may have implemented vendor-specific prefixes when they are not required in common CSS properties. This may cause compatibility problems with how this webpage renders across different browsers.
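For context, the prefix warning refers to CSS along these lines (an illustrative snippet, not code pulled from our actual site):

```html
<style>
  /* Prefixed-only declarations: a browser that expects the standard
     unprefixed property (including Edge) gets no styling at all. */
  .box {
    -webkit-border-radius: 4px;
    -moz-border-radius: 4px;
  }

  /* Safer: list any prefixes that are still genuinely needed, then
     finish with the standard property so modern browsers use it. */
  .box {
    -webkit-transition: opacity 0.3s;
    transition: opacity 0.3s;
    border-radius: 4px;
  }
</style>
```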
Last month the client received 20K visitors from IE browsers (all versions combined), which is significant enough to be concerned about.
**Are other folks making changes to their code to adapt to MS Edge?**
-
First, I've been on a Mac since 2008/09, so I missed Windows 7, 8, and 10, but I've used Windows since version 3.0, back in '91/'92. Long story short: you should do routine website testing in Edge just to confirm that everything renders correctly and works as expected. You shouldn't see any real impact if you don't follow their suggestions.
Actually, all web developers and designers have had a long and difficult relationship with IE. IE6 broke W3C standards and implemented its own version of them, as described here: https://en.wikipedia.org/wiki/Internet_Explorer_box_model_bug IE7 fixed some of that but added new "bugs", and IE8 added yet another batch. These are covered here: http://www.smashingmagazine.com/2009/10/css-differences-in-internet-explorer-6-7-and-8/ http://code.tutsplus.com/tutorials/9-most-common-ie-bugs-and-how-to-fix-them--net-7764 https://css-tricks.com/ie-css-bugs-thatll-get-you-every-time/ and in many other articles. Eventually developers got fed up and stopped supporting IE altogether, keeping only a few version-specific IE hacks just so users would see something close to correct.
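For anyone who never had to fight it, the box model bug looked roughly like this (a minimal sketch):

```html
<style>
  /* W3C box model: rendered width = 200px content + 2*10px padding
     + 2*5px border = 230px total. */
  .panel {
    width: 200px;
    padding: 10px;
    border: 5px solid #333;
  }

  /* IE5/IE6 in quirks mode squeezed the padding and border INSIDE
     the declared width, so the same rule rendered only 200px wide.
     Today every browser lets you opt into that behavior explicitly: */
  .panel-fixed {
    box-sizing: border-box; /* total rendered width stays 200px */
    width: 200px;
    padding: 10px;
    border: 5px solid #333;
  }
</style>
```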
Later, WebKit and Firefox joined the party and added their own experimental versions of CSS properties, prefixed with -webkit- or -moz-; Opera joined in with an -o- prefix. Over the years, as CSS3 matured, the vendors standardized almost all of those prefixed properties and finally agreed to abandon the prefixes. All of those vendors also ship auto-updating browsers, so new versions are pushed straight to users; today you can hardly imagine working with anything other than the current version of Firefox, Chrome, or Opera. Safari is different because the current version is only available on the current version of OS X. In this world only IE is outdated, because Microsoft refuses to push new versions to older OSes.
So today, as Bob Dylan sang, "the times they are a-changin'", and everything is different. Microsoft is trying to bring back support for all the standards it neglected in the past. But today's web is different from the web back then, and catching up is hard. That's why they sent emails about "possible problems". Honestly, my site works in IE 6, 7, 8, 9, 10, and 11 (tested, no joke!) and should work in Edge too (not tested).
So that's why they sent you a mail about "possible issues". Every web company has its jokes about IE6 or IE7 support: http://www.smashingmagazine.com/2011/11/but-the-client-wants-ie-6-support/ http://www.sitepoint.com/how-to-stop-wasting-time-developing-for-internet-explorer/ And if Microsoft now warns us about "standards", we should ask them where they were all those years they were NOT supporting standards.
-
We have not made any changes to our client sites for Edge.
Since Edge tends to handle rendering better than previous versions of IE, we don't do any conditional formatting for IE.
Now if you are doing something like the following (hypothetical examples of forcing IE document modes or using conditional comments):
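```html
<!-- Forcing a legacy document mode (Edge ignores this tag entirely): -->
<meta http-equiv="X-UA-Compatible" content="IE=EmulateIE8">

<!-- An IE-only stylesheet via conditional comments; "ie-legacy.css"
     is a made-up filename. Ignored by IE10+ and Edge: -->
<!--[if lt IE 9]>
  <link rel="stylesheet" href="ie-legacy.css">
<![endif]-->
```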
Then I can see where that may try to force Edge to render in ways that the newer version has taken care of.
-
Hi Peter: Yes, we have already used that link because it was provided in Bing's message to us and that's where I pulled the two bulleted "issues" I referenced in the original email.
The question is, do we need to go to this trouble for Edge and what might it do to legacy versions of IE? We checked other clients and all of them have alerts - or what Bing is calling "Suggestions".
I don't see everyone following MS's suggestions, and if we don't, what serious impact could there be?
-
If you need to support Edge, you need to follow this one link: https://dev.windows.com/en-us/microsoft-edge/tools/staticscan/ A static scanner there will check your HTML and CSS for Edge-specific issues.
You'll see "same markup", "browser detection", and "CSS prefixes" sections with more information.
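The "browser detection" check, for example, flags user-agent sniffing, which Edge deliberately defeats. A hypothetical before/after:

```html
<script>
  // Fragile: UA sniffing. Edge's user-agent string intentionally
  // contains "Chrome" and "Safari" tokens and no "MSIE", so checks
  // like this misclassify it and may serve the wrong code path.
  var isIE = navigator.userAgent.indexOf('MSIE') !== -1;

  // Robust: feature detection. Test for the capability itself
  // instead of guessing from the browser's name.
  if ('transition' in document.documentElement.style) {
    // CSS transitions are supported; no prefix or fallback needed.
  }
</script>
```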
The real problem is outdated IE versions. You can easily support Edge, but what do you do with the old versions? I know sites that still get visitors on IE7. Now Microsoft says "support us", but what do you do with all of their legacy versions?