What do Bing and Yahoo look for in a site?
-
Do Bing and Yahoo look for authoritative sites like Google does? Do they punish sites for black-hat tactics or spamming?
The reason I ask is that one of my competitors was ranking in first place in Google for many great keywords; they have the highest authority of all their competitors. They must have been punished by Google, because now they are not ranking for any great keywords there. However, they are ranking 1st in Bing and Yahoo for most of the top keywords, getting the most visibility of all the sites.
I attached a small graph with the latest visibility for the sites on the top keywords from Google, and I also included the company that was punished by Google; they are the green circles on the graph.
-
I think TEST is the key word when Duane talks about the index. Further down the page he makes it quite clear they will kick a page back out again if it's no good.
“If the users love it, it stays. If the users don’t like it, it gets dropped. This is a way to determine if the users feel this was a quality result.”
Duane has said many times that they will not be indexing everything; they only want your best pages.
"Rand: Right, yeah. I was going to say, and Bing has been pretty good about penalizing a lot of the links that look manipulative on the Web too.

Duane: Yeah. It's a natural part of keeping things clean, right? At Bing, we are very keen on having a quality-driven index. So, the main focus we have is making sure that everything that gets in is a good resource; when someone makes a query they get a realistic answer that is actually an answer to their query. Not, here's some shallow-depth data, I'm going to click on it, and then oh, it's not really what I want, I go back and I try it again. We're trying to shorten that number of searches to get to the final answer."

And from later in the interview:

"Duane: Right, exactly. I love this idea, Rand, this whole pick your top 200, whatever the number happens to be for you, pick it and run with it. You don't need everything indexed. Pick your best stuff and make sure that's in there. Make sure your quality content is in there, right? Be sure that you look at the site and say, "What's the goal of this page? Is it to monetize ads? Is it to convert somehow? What is the goal of it? Is it optimized properly to do that? If it is, I want that indexed in the search engine, ranking well.""
http://www.seomoz.org/blog/bings-duane-forrester-on-webmaster-tools-metrics-and-sitemap-quality-thresholds

That's good news about the social media, because everything I build seems to rank high in Bing with no social media. I guess that's something I can fall back on if rankings start to slip.
-
Here are some interesting insights from Duane Forrester, who is a senior product manager at Bing.
http://www.stonetemple.com/search-algorithms-and-bing-webmaster-tools-with-duane-forrester/
Two of the biggest things of interest are:
- The huge weight Bing places on user interaction with the search results as a ranking factor. This was amazing stuff. Basically, Bing is willing to test any page by indexing it. In fact, you can pretty much directly inject any URL you want into their search results using the Submit URL feature of Bing Webmaster Tools. Then they will test it, and if the click interaction data is bad, out (or down) you go.
- The ranking of the priorities for publishers in Duane’s eyes. #1 Content #2 Social Media #3 Links. Links were rated as the third most important area. Third.
The article is very easy to read, with the highlights put in front. This is recent information from a couple of months ago.
-
Very interesting. I never knew that.
And wow, that's the old-school Yahoo design. Haven't seen that look since viewing Yahoo.com in the Wayback Machine.
-
Yahoo uses Google in Japan (not that you, or anyone really cares).
-
A big difference I've noticed between Bing and Google over the years is that Google is more inclined to index a site and place it within the SERPs much quicker, basically giving a new site the benefit of the doubt; however, that site must maintain good standing throughout the 'sandbox' period to ensure it doesn't drop off the map after a year or two.
Bing seems to show a preference for aged domains. Their search index, at least at one point (I'm sure they're working to update it, or might have even done so already), doesn't seem to be as fresh as Google's, which has its advantages as well.
With Google, you'll often find many new sites at the top of the SERPs for any given non-highly-competitive search term. That's just Google's way of getting more information to the masses, whether it's a scraped site or not (unfortunately, I'm still finding scraped sites in the index), whereas Bing seems to favor sites that are tried and true.
Just my observations over the years. However, it's been a while since I've really paid a whole lot of attention to this.
-
From what I have read, and in my own experience, Bing is a lot more fussy about what it indexes; it's a lot harder to get into the index.
I have found that Bing also likes clean code, free from violations; your site needs to be easily crawlable.
Bing is also quick to lose trust if you misuse things such as redirects, canonicals, and sitemaps. Duane Forrester told me, with regard to sitemaps, that they will lose trust in your sitemap if your lastmod dates are not accurate or if you have any 404s in it; they only want 200-status pages. You should not only have a sitemap, you should keep it up to date. They have no intention of indexing everything that Google does. I have also gotten sites to do well in Bing for pretty good keywords with no or few links, so I don't think they rely on links as much.
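As a rough illustration of the sitemap hygiene Duane describes, here's a quick Python sketch of my own (a hypothetical checker, not anything Bing publishes) that flags entries with a missing or malformed lastmod. Note it only accepts date-only lastmod values; the sitemap protocol also allows full W3C datetimes, so treat the format check as an assumption:

```python
# Hypothetical sitemap sanity-checker: flags entries whose <lastmod> is
# missing or not a plain YYYY-MM-DD date. This is illustrative only and
# does not reflect Bing's actual validation logic.
import xml.etree.ElementTree as ET
from datetime import datetime

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def check_sitemap(xml_text):
    """Return a list of (loc, problem) tuples for suspect entries."""
    problems = []
    root = ET.fromstring(xml_text)
    for url in root.findall(f"{NS}url"):
        loc = url.findtext(f"{NS}loc", "").strip()
        lastmod = url.findtext(f"{NS}lastmod")
        if not loc:
            problems.append(("", "missing <loc>"))
            continue
        if lastmod is None:
            problems.append((loc, "missing <lastmod>"))
        else:
            try:
                # Date-only check; real sitemaps may also use W3C datetimes.
                datetime.strptime(lastmod.strip(), "%Y-%m-%d")
            except ValueError:
                problems.append((loc, "malformed <lastmod>"))
    return problems

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc><lastmod>2012-05-01</lastmod></url>
  <url><loc>https://example.com/old</loc><lastmod>not-a-date</lastmod></url>
</urlset>"""

print(check_sitemap(sample))  # [('https://example.com/old', 'malformed <lastmod>')]
```

A fuller version would also fetch each loc and confirm it returns a 200 status, per Duane's point about 404s.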
-
Well, to begin, Yahoo search is now run on the Bing algorithm (algo). So while there may still be a "Yahoo Slurp" crawler out there, it's based on a different algo than before. Bing now completely powers Yahoo search.
Search engines have their own algorithms. There is no specific algo that they all must adhere to. So while rankings for your site might go up in one engine, they might very well go down in another (or not move at all).
And I can only assume Bing watches for black-hat SEO tactics; I don't have any hard data to back that up, but it's safe to say they do.
A huge mistake website owners make is optimizing their sites for Google only. Google only makes up around 65% of the search market, so by optimizing for Google, and Google alone, you're cutting off a potential 35% of traffic.
There are tons of forums, documentation, and webmaster tools for Bing, just as there are for Google, so you need to put in that extra effort to see what makes a site rank well in Bing.
As long as you stick to the fundamentals, i.e. proper internal link structure, attaining solid, safe, relevant backlinks to your site, using your webmaster tools (and SEOmoz ;)) to make sure site errors and such are taken care of, and getting your HTML error-free with proper H1-H6 tags (where applicable), title tags, meta tags, etc., then, and only then, should you start tweaking your site for direct optimization for each engine.
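To show what auditing those fundamentals before engine-specific tweaking might look like, here's a small stdlib-only Python sketch (a hypothetical checker of my own, not any engine's rules) that flags a missing title tag or multiple H1s:

```python
# Hypothetical on-page fundamentals check: one <h1> and a <title> present.
# Illustrative only; real audits cover much more (meta tags, link structure, etc.).
from html.parser import HTMLParser

class BasicsChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.h1_count = 0
        self.has_title = False

    def handle_starttag(self, tag, attrs):
        # HTMLParser lowercases tag names for us.
        if tag == "h1":
            self.h1_count += 1
        elif tag == "title":
            self.has_title = True

def check_page(html):
    """Return a list of fundamental on-page issues found in the HTML."""
    checker = BasicsChecker()
    checker.feed(html)
    issues = []
    if not checker.has_title:
        issues.append("missing <title>")
    if checker.h1_count != 1:
        issues.append(f"expected one <h1>, found {checker.h1_count}")
    return issues

page = "<html><head><title>Hi</title></head><body><h1>A</h1><h1>B</h1></body></html>"
print(check_page(page))  # ['expected one <h1>, found 2']
```

The same pattern extends naturally to meta descriptions, heading hierarchy, and the other basics mentioned above.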