What do Bing and Yahoo look for in a site?
-
Do Bing and Yahoo look for authoritative sites like Google does? Do they penalize sites for black-hat tactics or spam?
The reason I ask is that one of my competitors was ranking first in Google for many great keywords; they have the highest authority of all their competitors. They must have been penalized by Google, because now they are not ranking for any great keywords there. However, they are ranking first in Bing and Yahoo for most of the top keywords, getting the most visibility of all the sites.
I attached a small graph showing the latest visibility for the sites on the top keywords from Google, and I also included the company that was penalized by Google; they are the green circles on the graph.
-
I think TEST is the key word when Duane talks about the index. Further down the page he makes it quite clear they will kick it back out again if it's no good:
“If the users love it, it stays. If the users don’t like it, it gets dropped. This is a way to determine if the users feel this was a quality result.”
Duane has said many times that they will not be indexing everything; they only want your best pages.
"Rand: Right, yeah. I was going to say, and Bing has been pretty good about penalizing a lot of the links that look manipulative on the Web too.

Duane: Yeah. It's a natural part of keeping things clean, right? At Bing, we are very keen on having a quality-driven index. So, the main focus we have is making sure that everything that gets in is a good resource; when someone makes a query they get a realistic answer that is actually an answer to their query. Not, here's some shallow-depth data. I'm going to click on it, and then oh, it's not really what I want. I go back and I try it again. We're trying to shorten that number of searches to get to the final answer.

Duane: Right, exactly. I love this idea, Rand, this whole pick your top 200, whatever the number happens to be for you, pick it and run with it. You don't need everything indexed. Pick your best stuff and make sure that's in there. Make sure your quality content is in there, right? Be sure that you look at the site and say, 'What's the goal of this page? Is it to monetize ads? Is it to convert somehow? What is the goal of it? Is it optimized properly to do that?' If it is, I want that indexed in the search engine, ranking well."
http://www.seomoz.org/blog/bings-duane-forrester-on-webmaster-tools-metrics-and-sitemap-quality-thresholds
That's good news about the social media, because everything I build seems to rank high in Bing with no social media. I guess that's something I can fall back on if rankings start to slip.
-
Here are some interesting insights from Duane Forrester, who is a senior product manager at Bing.
http://www.stonetemple.com/search-algorithms-and-bing-webmaster-tools-with-duane-forrester/
Two of the biggest things of interest are:
- The huge weight placed by Bing on user interaction with the search results as a ranking factor. This was amazing stuff. Basically, Bing is willing to test any page by indexing it. In fact you can pretty much directly inject any URL you want into their search results using the Submit URL feature of Bing Webmaster Tools. Then they will test it, and if the click interaction data is bad, out (or down) you go.
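Worth noting for anyone who wants to automate that Submit URL step: Bing also exposes URL submission through a Webmaster Tools API. The sketch below (Python standard library only) just builds such a request; the endpoint and JSON payload shape follow Bing's URL Submission API as I understand it, so treat them as assumptions and verify against the current Bing docs before relying on it. The API key and URLs are placeholders.

```python
import json
import urllib.request

# JSON endpoint of Bing's URL Submission API (assumption: verify against the
# current Bing Webmaster Tools documentation). The apikey comes from the
# API Access section of Bing Webmaster Tools.
SUBMIT_ENDPOINT = "https://ssl.bing.com/webmaster/api.svc/json/SubmitUrl"

def build_submit_request(api_key, site_url, page_url):
    """Build (but do not send) a submit-URL request for Bing."""
    body = json.dumps({"siteUrl": site_url, "url": page_url}).encode("utf-8")
    return urllib.request.Request(
        SUBMIT_ENDPOINT + "?apikey=" + api_key,
        data=body,
        headers={"Content-Type": "application/json; charset=utf-8"},
        method="POST",
    )

# Placeholder values; urllib.request.urlopen(req) would actually submit it.
req = build_submit_request("YOUR_API_KEY", "https://example.com",
                           "https://example.com/new-page")
print(req.full_url)
```

Either way, the point from the interview stands: getting indexed this way is only the start, since the click interaction data then decides whether the page stays.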
- The ranking of the priorities for publishers in Duane’s eyes. #1 Content #2 Social Media #3 Links. Links were rated as the third most important area. Third.
The article is very easy to read, with the highlights put in front. This is recent information from a couple of months ago.
-
Very interesting. I never knew that.
And wow, that's the old-school Yahoo design. Haven't seen that look since viewing Yahoo.com in the Wayback Machine.
-
Yahoo uses Google in Japan (not that you, or anyone really cares).
-
A large difference I've noticed between Bing and Google over the years is that Google is more inclined to index a site and place it in the SERPs much quicker, basically giving a new site the benefit of the doubt. However, that site must maintain good standing throughout the 'sandbox' period to ensure it doesn't drop off the map after a year or two.
Bing seems to show a preference for aged domains. Its search index, at least at one point (I'm sure they're working to update it, or may have done so already), doesn't seem to be as fresh as Google's, which has its advantages as well.
With Google, you'll often find many new sites at the top of the SERPs for any given non-highly-competitive search term. That's just Google's way of getting more information to the masses, whether it's a scraped site or not (unfortunately, I'm still finding scraped sites in the index). Bing, by contrast, seems to surface sites that are tried and true.
Just my observations over the years. However, it's been a while since I've really paid a whole lot of attention to this.
-
From what I have read, and from my own experience, Bing is a lot more fussy about what it indexes; it's a lot harder to get into the index.
I have found that Bing also likes clean code, free from violations; your site needs to be easy to crawl.
Bing is also quick to lose trust if you misuse things such as redirects, canonicals, and sitemaps. Duane Forrester told me that they will lose trust in your sitemap if your lastmod dates are not accurate or if it contains any 404s; they only want 200-status pages in it. You should not only have a sitemap, you should keep it up to date, because they have no intention of indexing everything that Google does.
I have also gotten sites to do well in Bing with few or no links, for pretty good keywords, so I don't think they rely on links as much.
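To act on that sitemap advice, you can audit your own sitemap before Bing does. This is a rough sketch using only the Python standard library: it flags entries missing a lastmod and can optionally check that each URL actually returns a 200. (The "distrust" criteria here are just the two Duane mentioned, not an official checklist.)

```python
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

# Standard sitemap protocol namespace (see sitemaps.org).
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def parse_sitemap(xml_text):
    """Return one dict per <url> entry: its loc and lastmod (None if absent)."""
    root = ET.fromstring(xml_text)
    return [
        {"loc": url.findtext(NS + "loc"), "lastmod": url.findtext(NS + "lastmod")}
        for url in root.iter(NS + "url")
    ]

def audit(entries, check_status=False):
    """Flag entries Bing would reportedly distrust: no lastmod, or a non-200 URL."""
    problems = []
    for entry in entries:
        if entry["lastmod"] is None:
            problems.append((entry["loc"], "missing lastmod"))
        elif check_status:  # optional, since it makes one request per URL
            try:
                status = urllib.request.urlopen(entry["loc"], timeout=10).status
                if status != 200:
                    problems.append((entry["loc"], "status %d" % status))
            except urllib.error.HTTPError as err:  # e.g. a 404 left in the sitemap
                problems.append((entry["loc"], "status %d" % err.code))
    return problems
```

Run it against your sitemap XML periodically and fix anything it reports before resubmitting; keeping lastmod honest matters more than it does with Google, apparently.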
-
Well, to begin, Yahoo search now runs on the Bing algorithm (algo). So while there may still be a "Yahoo Slurp" crawler out there, it's backed by a different algo than before: Bing now completely powers Yahoo search.
Search engines have their own algorithms. There is no specific algo that they all must adhere to. So while rankings for your site might go up in one engine, they might very well go down in another (or not move at all).
I can only assume Bing watches for black-hat SEO tactics; I don't have any hard data to back that up, but it's safe to say they do.
A huge mistake website owners make is optimizing their sites for Google only. Google makes up only about 65% of the search market, so by optimizing for Google, and Google alone, you're cutting off a potential 35% of your traffic.
There are tons of forums, documentation, and webmaster tools for Bing, just as there are for Google, so you need to put in that extra effort to learn what makes a site rank well in Bing.
As long as you stick to the fundamentals (proper internal link structure; solid, safe, relevant backlinks; using your webmaster tools, and SEOmoz ;), to make sure site errors and such are taken care of; and error-free HTML with proper H1-H6 tags where applicable, title tags, meta tags, etc.), then, and only then, should you start tweaking your site for direct optimization for each engine.
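For the HTML fundamentals, a quick self-check can catch the basics before you start engine-specific tweaking. A rough sketch using only Python's standard library; the checks (exactly one H1, a title, a meta description) are my own minimal set, not an official requirement from any engine:

```python
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    """Collect the title, meta description, and heading tags from a page."""
    def __init__(self):
        super().__init__()
        self.headings = []          # e.g. ["h1", "h2", "h2"]
        self.has_title = False
        self.has_meta_description = False

    def handle_starttag(self, tag, attrs):
        if tag in {"h1", "h2", "h3", "h4", "h5", "h6"}:
            self.headings.append(tag)
        elif tag == "title":
            self.has_title = True
        elif tag == "meta" and dict(attrs).get("name") == "description":
            self.has_meta_description = True

def audit_page(html):
    """Return a list of basic on-page problems found in the HTML."""
    parser = HeadingAudit()
    parser.feed(html)
    issues = []
    if not parser.has_title:
        issues.append("missing <title>")
    if not parser.has_meta_description:
        issues.append("missing meta description")
    if parser.headings.count("h1") != 1:
        issues.append("expected one <h1>, found %d" % parser.headings.count("h1"))
    return issues
```

It's no substitute for a real validator, but running something like this over your templates is a cheap way to keep the on-page basics in order across every engine at once.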