Google number one search result looks drastically different in Firefox compared to Chrome
-
I just noticed today that some websites and brands look like this in Firefox only, while others, despite also being the number one result for their brand name, don't appear like this at all.
This also doesn't happen in Chrome at all.
Both images provided for comparison use the same Google Apps account, logged in.
It would be nice if someone could shed some light on why this happens sporadically, and what it takes to get distinguished like this for your own brand if you own the matching domain.com or whatever.
-
Unless I'm missing something in the question, I'm pretty confident it's just the different browsers' interpretation of the CSS used to format the page. Some CSS features are supported in most browsers, but not necessarily all of them...
-
To the best of my knowledge, this is a test. Google has run multiple experiments with formatting #1-plus-sitelinks results, and all of them involve some kind of bounding box. It's interesting that Firefox triggered it, but I don't think it's specifically a browser thing. I expect Google will keep messing with this for a few weeks or months and then roll something out to everyone. At that point, any #1-plus-sitelinks site should get the new look.
-
It could be personalization. I've also seen that Google is experimenting with new card-style layouts just like this. When Google runs layout/formatting experiments, people will often see varied results; it's the nature of an A/B-type test. Because you have two sessions (you're using two browsers), you may see two different layouts. I think you may have just stumbled across Google's experiment.
Kurt Steinbrueck
OurChurch.Com -
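For what it's worth, the "two sessions, two layouts" behavior is exactly what deterministic A/B bucketing produces: a stable identifier (such as a browser cookie) is hashed into a bucket, so each browser profile consistently sees one variant. A minimal sketch of that idea — the hashing scheme, names, and 50/50 split here are illustrative assumptions, not Google's actual method:

```python
import hashlib

def assign_variant(session_id: str, experiment: str, exposure: float = 0.5) -> str:
    """Deterministically bucket a session into 'control' or 'test'.

    The same session id always lands in the same bucket, but two
    different browser sessions can easily land in different ones.
    """
    digest = hashlib.sha256(f"{experiment}:{session_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # roughly uniform in [0, 1]
    return "test" if bucket < exposure else "control"

# Two browsers carry two different cookies, so they may see two layouts:
print(assign_variant("firefox-cookie-abc", "serp-card-layout"))
print(assign_variant("chrome-cookie-xyz", "serp-card-layout"))
```

Re-running this with the same ids always prints the same variants, which is why the difference sticks to a browser rather than flickering between searches.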
Probably because of personalization? I'd guess Google saves some navigation data in your browser, so when you perform the search you may be sending some usage details, and Google then serves you more personalized results.
Google uses so many variables that results aren't consistent at all across browsers, accounts, computers, locations, etc. That shows you how invaded our privacy is when using Google (at least I see much more consistent results on Bing and Yahoo).
Related Questions
-
How do you measure the impact of Google updates like Penguin 4?
I was having a conversation with a fellow SEO via Twitter and we were discussing measuring algorithm updates. In the aftermath of Google Penguin 4, how do you determine the effects it has on your site(s) and your respective verticals?
Industry News | | Thos0030 -
I need some help with Google, please
Hello, I am trying to get a partial Google penalty removed from my website, in place since July 2012. I have removed links, used the disavow tool, and written letters. The last communication was a letter sent in 12/2012, which they responded to in 3/2013. The site in question is www.propdental.com. I have not answered yet because I am looking for good professional help. They responded saying that I still have unnatural links, which is quite probable, but I don't know which ones to remove. Site authority has dropped from 43 to 32 with all those links removed, and traffic has dropped from 48,000 to 15,000. Professional contacts needed, please.
Industry News | | maestrosonrisas0 -
Looking for Freelance SEOs
Hi, I wouldn't normally post a request for freelancers here, but I noticed someone else has done it and it's not against the rules, so here goes... We are looking for freelancers to help with our workload. In particular, we are searching for link builders who can establish links on various websites through article writing, blog content posting, and other methods. We also require general SEOs who can perform the usual technical tasks, such as creating 301s and canonicals and producing SEO audit reports on client websites. Ideally we only want UK-based white hatters with at least two years of SEO experience. Please PM me if this interests you. Thanks, Aran
Industry News | | Chiefblob0 -
Google Search Quality Team - Commission Based Reviews
I have been busy this past week writing articles for various sources about the recent Google update. A number of people contacted me about the analysis I was doing and the report; some were members of the Google Search Quality Team. I knew manual reviews were done before, but after seeing the documents they showed me regarding the reports they do and the compensation for doing them, I am left pretty shocked. Maybe I have been naive all these years, but I didn't realize that Google outsourced review and reconsideration requests to individual reviewers for compensation, and that Google's checking of these "reviewers'" qualifications and experience was insufficient at best. The three contacts I spoke to who had done reports had very little training or experience. I went through the GSQT REVIEWERS PDF (a very long and thorough document) that I was sent with them. We went through some sites I wanted them to review together, and the comments that came back were quite astounding, to say the least, and would have made many of you Mozzers laugh. Obviously I don't want to post said document online here... BUT, I wanted to know: a) have any Mozzers ever been part of such a group, the GSQT, or b) had any dealings with them, in terms of having your website reviewed and knowing about it? I knew about this group way back, like in 2005 or 2006 or sometime around then; I was told at the time that it had stopped and Google had stopped paying these subcontractor reviewers. Please don't get me wrong here... I'm totally on board with manual reviews. I would just prefer them done by a trained team, whether a professional company that maintains high-quality review testing and standards or, for that matter, trained Google employees. I am just a little unsure of them being done by individual subbies who get paid by the number of reviews they do. What if a subbie has some skin in the game for a particular keyword?
What if their knowledge about certain aspects isn't up to par or isn't tested on a regular basis? This space is always changing and, as you guys/girls on this forum know, it can change pretty quickly. I just want all websites to be judged fairly and equally by a group trained equally and to the same standards. I don't care if this is a G team or not; I just want it to be a team that is trained equally and continuously, as opposed to outside people paid by the number of reviews done. When the livelihood of a small business is in the balance, I don't want a commission-hungry toe rag with one year's experience being the gatekeeper for me or any of our clients. Carlos
Industry News | | CarlosFernandes0 -
New Search Engine
Looks like I signed up months ago to be a beta tester for a new search engine called Volunia. Has anyone else heard about this engine? Just curious! Thx
Industry News | | Ben-HPB0 -
Searching for a keyword in the HTML source code of a website via Google
Is such a thing possible? Can we Google for a specific keyword that can only be found in the source code of a website? Is there any search operator for this? Thanks in advance!
Industry News | | merkal20050 -
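As far as I know, Google offers no operator that searches raw page source (operators like intext: only match the indexed, rendered text), so for a specific page you would need to fetch the HTML yourself and check it. A minimal sketch of that check — the sample markup below is made up for illustration:

```python
def keyword_in_source(html: str, keyword: str) -> bool:
    """Case-insensitive check for a keyword anywhere in raw HTML source,
    including markup (meta tags, comments, scripts) that never renders
    as visible text."""
    return keyword.lower() in html.lower()

# In practice the HTML would come from urllib.request.urlopen(url).read();
# a hard-coded sample keeps the sketch self-contained.
sample = '<html><head><meta name="generator" content="WordPress 3.2"></head><body>Hello</body></html>'
print(keyword_in_source(sample, "wordpress"))  # True
```

Scaling this beyond a handful of known URLs is effectively crawling, which is why no public search operator exposes it.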
Does anyone have a copy of the 2011 Google Quality Raters Handbook that was recently leaked?
http://searchengineland.com/download-the-latest-google-search-quality-rating-guidelines-97391 Google has been on a mission to take them down online, but I would really like to take a look at it if you have a copy! [moderator note - please use the PM system and exchange email addresses there. We've removed emails from this thread before it gets indexed and exposed to the world]
Industry News | | altecdesign4 -
What is the best method for getting pure JavaScript/AJAX pages indexed by Google for SEO?
I am in the process of researching this further and wanted to share some of what I have found below. Anyone who can confirm or deny these assumptions or add some insight would be appreciated.
Option 1: If you're starting from scratch, a good approach is to build your site's structure and navigation using only HTML. Then, once you have the site's pages, links, and content in place, you can spice up the appearance and interface with AJAX. Googlebot will be happy looking at the HTML, while users with modern browsers can enjoy your AJAX bonuses. You can use Hijax to help AJAX and HTML links coexist, and you can use meta nofollow tags etc. to prevent crawlers from accessing the JavaScript versions of the pages. Currently, webmasters create a "parallel universe" of content: users of JavaScript-enabled browsers see content that is created dynamically, whereas users of non-JavaScript-enabled browsers, as well as crawlers, see content that is static and created offline. In current practice, "progressive enhancement" in the form of Hijax links is often used.
Option 2: In order to make your AJAX application crawlable, your site needs to abide by a new agreement, which rests on the following. The site adopts the AJAX crawling scheme. For each URL that has dynamically produced content, your server provides an HTML snapshot, which is the content a user (with a browser) sees. Often, such URLs will be AJAX URLs, that is, URLs containing a hash fragment, for example www.example.com/index.html#key=value, where #key=value is the hash fragment. An HTML snapshot is all the content that appears on the page after the JavaScript has been executed. The search engine indexes the HTML snapshot and serves your original AJAX URLs in search results. In order to make this work, the application must use a specific syntax in the AJAX URLs (let's call them "pretty URLs"; you'll see why shortly). The search engine crawler will temporarily modify these "pretty URLs" into "ugly URLs" and request those from your server. This request of an "ugly URL" indicates to the server that it should not return the regular web page it would give to a browser, but instead an HTML snapshot. When the crawler has obtained the content for the modified ugly URL, it indexes that content, then displays the original pretty URL in the search results. In other words, end users will always see the pretty URL containing a hash fragment. Google's documentation includes a diagram summarizing the agreement; see more in the Getting Started Guide.
Make sure you avoid this: http://www.google.com/support/webmasters/bin/answer.py?answer=66355
Here are a few example pages that are mostly JavaScript/AJAX: http://catchfree.com/listen-to-music#&tab=top-free-apps-tab and https://www.pivotaltracker.com/public_projects. This is what the spiders see: view-source:http://catchfree.com/listen-to-music#&tab=top-free-apps-tab
These are the best resources I have found regarding Google and JavaScript: http://code.google.com/web/ajaxcrawling/ (step-by-step instructions), http://www.google.com/support/webmasters/bin/answer.py?answer=81766, and http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content
Some additional resources: http://googlewebmastercentral.blogspot.com/2009/10/proposal-for-making-ajax-crawlable.html and http://www.google.com/support/webmasters/bin/answer.py?answer=357690
Industry News | | webbroi -
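To make the Option 2 agreement above concrete, here is a minimal sketch of its two halves: the pretty-to-ugly URL rewrite the crawler performs, and the server-side check for a snapshot request. One caveat: Google's published scheme actually marked opt-in "pretty URLs" with #! (hash bang) rather than a bare #, and the crawler rewrote them using the _escaped_fragment_ query parameter. The function names are mine, for illustration only:

```python
from urllib.parse import quote

def pretty_to_ugly(pretty_url: str) -> str:
    """Rewrite a crawlable AJAX 'pretty URL' (marked with #!) into the
    'ugly URL' a crawler would request under the AJAX crawling scheme."""
    if "#!" not in pretty_url:
        return pretty_url  # page has not opted in to the scheme
    base, fragment = pretty_url.split("#!", 1)
    sep = "&" if "?" in base else "?"
    # Special characters in the fragment are percent-escaped.
    return base + sep + "_escaped_fragment_=" + quote(fragment, safe="=&")

def wants_html_snapshot(query_params: dict) -> bool:
    """Server side: the presence of _escaped_fragment_ signals that the
    crawler wants the pre-rendered HTML snapshot, not the AJAX shell."""
    return "_escaped_fragment_" in query_params

print(pretty_to_ugly("http://www.example.com/index.html#!key=value"))
# http://www.example.com/index.html?_escaped_fragment_=key=value
```

The indexed result is then displayed under the original pretty URL, so end users never see the _escaped_fragment_ form. (Note that Google has since deprecated this scheme in favor of rendering JavaScript directly, so treat it as historical context.)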