Does Google bot read embedded content?
-
Is embedded content "really" on my page?
There are many add-ons nowadays that are embedded via a snippet of code and load their text after the page has loaded.
For example - embedded surveys.
Are these read by Googlebot, or do they in fact act like iframes, where the content is not physically on my page?
Thanks
-
If you look at most of the Facebook comment implementations, they're usually embedded with an iframe.
Technically speaking, that is making the content load from another source (not on your site).
As we're constantly seeing Google evolve with regard to "social signals", however, I suspect embedded Facebook comments may begin to have an impact if they pertain to content that is actually located on your website.
-
Thanks!
I'm guessing the answer will remain "no" in my case, since these are third-party scripts - a black box, for that matter.
What do you think about Facebook comments, then?
Are those not readable as well?
-
I haven't seen any recent tests for 2013, but this has been analyzed quite a bit, and the two links below expand on what I mentioned.
The conclusion of the first one is that Google won't index content loaded dynamically by a JavaScript file hosted on another server/domain.
http://www.seomoz.org/ugc/can-google-really-access-content-in-javascript-really
Here's the link that talks about extra programming necessary to make AJAX content crawlable and indexable.
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=174992
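To illustrate the URL mapping that Google document describes: a page exposing AJAX content via a `#!` ("hashbang") fragment is re-requested by the crawler with the fragment moved into an `_escaped_fragment_` query parameter, and your server is expected to return an HTML snapshot for that URL. A rough sketch of the mapping (my own hypothetical helper, not Google's code):

```javascript
// Sketch of the hashbang -> _escaped_fragment_ mapping from Google's
// AJAX crawling scheme. Hypothetical helper; the scheme defines the
// mapping, your server still has to serve the HTML snapshot.
function toEscapedFragmentUrl(url) {
  const i = url.indexOf('#!');
  if (i === -1) return url; // nothing exposed to the crawler
  const base = url.slice(0, i);
  const fragment = url.slice(i + 2);
  const sep = base.includes('?') ? '&' : '?';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}

console.log(toEscapedFragmentUrl('http://example.com/page#!state=surveys'));
// -> http://example.com/page?_escaped_fragment_=state%3Dsurveys
```

The catch, as noted above, is that the third-party embed provider would have to support this scheme - it's not something you can bolt on from your side of a black-box script.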
-
Thank you all.
Here is an example from SurveyMonkey:
There are many other tools that look much the same.
The content it loads is not visible in the view source.
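For anyone who hasn't looked at one of these: the page source typically contains only a loader stub, something like the generic shape below (hypothetical host and IDs, not SurveyMonkey's actual snippet). The survey questions themselves are fetched and injected by the script after load, which is exactly why they never appear in view-source.

```javascript
// Generic shape of an async third-party embed (hypothetical host/IDs).
// The page's own HTML contains only this stub; the survey text is
// injected by embed.js after the page loads.
function loaderStub(host, widgetId) {
  return [
    '<div id="' + widgetId + '"></div>',
    '<script async src="//' + host + '/embed.js" data-widget="' + widgetId + '"></script>'
  ].join('\n');
}

console.log(loaderStub('surveys.example.com', 'sm-widget-1'));
```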
-
Googlebot has become extremely intelligent since its inception, and I'd guess most members here would agree it has gotten to the point where it can detect virtually any type of content on a page.
For the purposes of analyzing the actual content that it indexes and uses for ranking / SEO, however, I'd venture to guess that the best test would be viewing the page source after the page has loaded.
If you can see the content you're questioning in the actual HTML, then Google will probably index it, and use it considerably for ranking purposes.
On the other hand, if you just see some type of javascript snippet / function where the content would otherwise be located in the page source, Google can probably read it, but won't likely use it heavily when indexing and ranking.
There are special ways to get Google to crawl such content that is loaded through javascript or other types of embeds, but it's been my experience that most embeds are not programmed this way by default.
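The view-source test above can be reduced to a simple check: does the phrase you care about appear in the raw HTML the server sends, or only in the rendered page after scripts run? A rough sketch (my own hypothetical helper, and a deliberately crude approximation of what a crawler sees):

```javascript
// Crude version of the "view source" test: does the phrase appear in
// the raw HTML itself, or only arrive later via a script?
function visibleInSource(rawHtml, phrase) {
  // Drop script bodies so text buried inside JS doesn't count as content.
  const withoutScripts = rawHtml.replace(/<script[\s\S]*?<\/script>/gi, '');
  return withoutScripts.includes(phrase);
}

const page = '<p>Our services</p>' +
  '<script src="//survey.example.com/embed.js"></script>';

console.log(visibleInSource(page, 'Our services'));    // true  - in the raw HTML
console.log(visibleInSource(page, 'survey question')); // false - injected after load
```

If the second case describes your embed, assume the text is invisible to indexing unless the provider has done the extra crawlability work mentioned above.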
-
It's easier to analyze if you have an example URL. These can be coded in many different ways, and a slight change can make a difference.
-
What language is the code of the embedded survey?