Searching for a keyword in the HTML source code of a website via Google
-
Is such a thing possible? Can we Google for a specific keyword that appears in the source code of a website? Is there a search operator for this? Thanks in advance!
-
Why do you want a keyword from the HTML source? Instead, why not do a keyword analysis online, which can help you understand your competitors' online profiles? I can help you with keyword research. Please reply to me if you are looking for any assistance.
I also recommend the SEOmoz tool for competitor analysis. It's a great tool for learning and understanding SEO results.
- Naveen Srikantaiah
-
Thank you so much, Martijn! Definitely helps! Have a great day...
-
Hi,
Ultimately, the answer to your question is no. Google doesn't have an operator to search within the source of a document or website; as far as I know, none of the operators Google does offer can do this. I also checked Bing and Blekko, but neither seems to provide an operator like the one you're looking for.
Hope this helps!
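Since no engine exposes such an operator, one workaround is to fetch a page's raw source yourself and search it. A minimal sketch in Python using only the standard library (the example URL and keyword are placeholders):

```python
import urllib.request

def fetch_source(url):
    """Download the raw HTML of a page, as a browser's View Source would show it."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")

def source_contains(html, keyword):
    """Case-insensitive check for `keyword` anywhere in the raw HTML,
    including markup that never renders (tags, attributes, scripts)."""
    return keyword.lower() in html.lower()

# e.g. source_contains(fetch_source("http://www.example.com/"), "generator")
```

This finds keywords hidden in meta tags, comments, or scripts that a normal Google search would never surface.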
Related Questions
-
Effect of content page changes on search ranking
Hi, I have a question related to my ranking on Google Search. On the content pages of my website, there is a section where our content keeps changing: whenever a visitor enters new information, old information is removed from the page, so our content pages are dynamic. Does this make it difficult for us to rank for a particular keyword, or overall?
Industry News | | adiez12340 -
I need some help with Google, please
Hello, I am trying to remove a partial Google penalty from my website, in place since July 2012. I have removed links, used the disavow tool, and written letters. The last communication was a letter sent in 12/2012, which they responded to in 3/2013. The site in question is www.propdental.com. I have not answered yet because I am looking for good professional help. They responded saying that I still have unnatural links, which is quite probable, but I don't know which ones to remove. Site authority has dropped from 43 to 32 with all links removed, and traffic has dropped from 48,000 to 15,000. Professional contacts needed, please.
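For reference, Google's disavow tool takes a plain-text file uploaded through Webmaster Tools. A sketch of the file format, using hypothetical URLs rather than anything actually linking to propdental.com:

```text
# Lines starting with # are comments and are ignored.
# Disavow one specific linking page:
http://spam-directory.example.com/links/page1.html
# Disavow every link from an entire domain:
domain:low-quality-seo.example.net
```

Identifying which links are "unnatural" still has to be done by hand (or with a link-analysis tool) before building this file.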
Industry News | | maestrosonrisas0 -
Are WordPress sites being dinged by Google? I read a few articles about this.
I read a couple of "SEO"-related articles claiming that sites built in WordPress are going to be dinged by Google, because Google sees WordPress sites as simple to make and therefore more likely to be "spammy". Is there any truth to this? Your thoughts? I do give "thumbs up" and "best answer" marks, and I appreciate receiving thumbs up myself... Thanks
Industry News | | JChronicle1 -
Not schema, but a new kind of search result?
I came across this search result in Google, and I've been racking my brain trying to figure out how they did it. Do a search for Novus CD4 and you'll see a search result that lists additional products from the landing page. I used Google's Rich Snippets tool to analyze the page and found no microdata at play. Any ideas how this was achieved? Have you come across anything like this? I was thinking of integrating this with schema to display rating stars and prices on an ecommerce site. (Attached: S4aL5.png)
Industry News | | Bio-RadAbs0 -
Is Google Making Life Harder For Aggregators?
There have been a bunch of recent updates which have hurt aggregators:
- Reducing the number of search results to 7 for branded search queries.
- The DMCA update, which penalises sites with trademark-related takedown requests against them.
- At least 2 'domain diversity' updates, the most recent last week, which seek to reduce the ability of sites to dominate SERPs; e.g. a site which may have had 2 search results on page 1 may now have 1.
Plus, it's commonly believed that Google favours big brands over smaller ones, e.g. Marriott over examplehotelaggregator.com. Is this a deliberate ploy against aggregators in favour of brands, i.e. does Google believe a brand site is a better search result than an aggregator? A brand site returned above an aggregator for a branded term may be seen by Google as a better fit, a better search result that should rank higher. But is that true? Consumers like to see unbiased reviews and the lowest prices, and those aren't always available on the brand site. Thoughts, please.
Industry News | | AndyMacLean0 -
Google Search Quality Team - Commission Based Reviews
I have been busy this past week writing articles for various sources about the recent Google update. A number of people contacted me about the analysis I was doing and the report; some were members of the Google Search Quality Team. I knew manual reviews were done before, but after seeing the documents they showed me regarding the reports they do and the compensation for doing them, I am left pretty shocked. Maybe I have been naive all these years, but I didn't realize that: Google outsourced review and reconsideration requests to individual reviewers for compensation, and Google's checking of these "reviewers'" qualifications and experience was insufficient at best. The three contacts I spoke to who had done reports had very little training or experience. I went through the GSQT REVIEWERS PDF (a very long and thorough document) that I was sent with them. Together we went through some sites I wanted them to review, and the comments that came back were quite astounding, to say the least, and would have made many of you Mozzers laugh. Obviously I don't want to post said document online here... BUT, I wanted to know: a) have any Mozzers ever been part of such a group, the GSQT? b) have you had any dealings with them, in terms of having your website reviewed and knowing about it? I knew about this group way back, around 2005 or 2006; I was told at the time that it had been stopped and Google had stopped paying these subcontracted reviewers. Please don't get me wrong here... I'm totally on board with manual reviews. I would just prefer them done by a trained team, whether a professional company that maintains high-quality review testing and standards or trained Google employees. I am just a little unsure of reviews being done by individual subcontractors who get paid by the number they complete. What if that subcontractor has some skin in the game for a particular keyword?
What if their knowledge about certain aspects isn't up to par, or isn't tested on a regular basis? This space is always changing, and as you guys/girls on this forum know, it can change pretty quickly. I just want all websites to be judged fairly and equally by a group trained equally and to the same standards. I don't care if this is a Google team or not; I just want it to be a team that is trained equally and continuously, as opposed to paying outside people based on the number of reviews done. When the livelihood of a small business is in the balance, I don't want a commission-hungry toe rag with one year's experience being the gatekeeper for me or any of our clients. Carlos
Industry News | | CarlosFernandes0 -
Google to Target Overly SEO'd Sites
I just watched the video from Barry Schwartz talking about the upcoming update to the Google algorithm. Video of his Friday post: http://www.youtube.com/watch?v=fJqSPT2NXdA
I have also started reading on WebmasterWorld on this topic: http://www.webmasterworld.com/google/4429947.htm
What do you think Google has on the list of changes?
Industry News | | Ben-HPB0 -
What is the best method for getting pure JavaScript/AJAX pages indexed by Google for SEO?
I am in the process of researching this further and wanted to share some of what I have found below. Anyone who can confirm or deny these assumptions, or add some insight, would be appreciated.
Option 1: If you're starting from scratch, a good approach is to build your site's structure and navigation using only HTML. Then, once you have the site's pages, links, and content in place, you can spice up the appearance and interface with AJAX. Googlebot will be happy looking at the HTML, while users with modern browsers can enjoy your AJAX bonuses. You can use Hijax to help AJAX and HTML links coexist, and meta nofollow tags etc. to prevent crawlers from accessing the JavaScript versions of pages. Currently, webmasters create a "parallel universe" of content: users of JavaScript-enabled browsers see content that is created dynamically, whereas users of non-JavaScript-enabled browsers, as well as crawlers, see content that is static and created offline. In current practice, "progressive enhancement" in the form of Hijax links is often used.
Industry News | | webbroi
Option 2: In order to make your AJAX application crawlable, your site needs to abide by a new agreement, which rests on the following. The site adopts the AJAX crawling scheme. For each URL that has dynamically produced content, your server provides an HTML snapshot, which is the content a user (with a browser) sees. Often, such URLs will be AJAX URLs, that is, URLs containing a hash fragment, for example www.example.com/index.html#key=value, where #key=value is the hash fragment. An HTML snapshot is all the content that appears on the page after the JavaScript has been executed. The search engine indexes the HTML snapshot and serves your original AJAX URLs in search results. In order to make this work, the application must use a specific syntax in the AJAX URLs (let's call them "pretty URLs"; you'll see why in the following sections). The search engine crawler will temporarily modify these "pretty URLs" into "ugly URLs" and request those from your server. This request of an "ugly URL" indicates to the server that it should not return the regular web page it would give to a browser, but instead an HTML snapshot. When the crawler has obtained the content for the modified ugly URL, it indexes that content, then displays the original pretty URL in the search results. In other words, end users will always see the pretty URL containing a hash fragment.
See more in the Getting Started Guide. Make sure you avoid this:
http://www.google.com/support/webmasters/bin/answer.py?answer=66355
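For illustration, the pretty-to-ugly URL translation described above can be sketched in Python. This is a rough approximation under the assumption that pretty URLs use the "#!" marker the scheme prescribes, not an exact reimplementation of Google's escaping rules:

```python
from urllib.parse import quote, unquote

def to_ugly(pretty_url):
    """Map a 'pretty' AJAX URL (with a #! hash fragment) to the 'ugly'
    _escaped_fragment_ URL a crawler would request from the server."""
    if "#!" not in pretty_url:
        return pretty_url  # not an AJAX URL under the scheme
    base, fragment = pretty_url.split("#!", 1)
    sep = "&" if "?" in base else "?"
    return base + sep + "_escaped_fragment_=" + quote(fragment, safe="=")

def to_pretty(ugly_url):
    """Reverse the mapping, recovering the original pretty URL."""
    base, marker, fragment = ugly_url.partition("_escaped_fragment_=")
    if not marker:
        return ugly_url
    return base.rstrip("?&") + "#!" + unquote(fragment)
```

The server detects the `_escaped_fragment_` parameter and returns the pre-rendered HTML snapshot instead of the normal AJAX shell page.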
Here are a few example pages that are mostly JavaScript/AJAX: http://catchfree.com/listen-to-music#&tab=top-free-apps-tab and https://www.pivotaltracker.com/public_projects
This is what the spiders see: view-source:http://catchfree.com/listen-to-music#&tab=top-free-apps-tab
This is the best resource I have found regarding Google and JavaScript, with step-by-step instructions: http://code.google.com/web/ajaxcrawling/
http://www.google.com/support/webmasters/bin/answer.py?answer=81766
http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content
Some additional Resources: http://googlewebmastercentral.blogspot.com/2009/10/proposal-for-making-ajax-crawlable.html
http://www.google.com/support/webmasters/bin/answer.py?answer=357690