How to make AJAX content crawlable from a specific section of a webpage?
-
The content sits in a specific section of the webpage and is loaded via AJAX.
-
Thanks Paddy! We'll definitely try these solutions.
-
Hi there,
There are plenty of really good resources online that cover this area, so I'd like to point you towards them rather than copy and paste their guidelines here!
Google has a good guide here with lots of visuals on how they crawl AJAX -
https://developers.google.com/webmasters/ajax-crawling/docs/getting-started
They also have a short video here covering some of the basics of Google crawling AJAX and JavaScript:
https://www.youtube.com/watch?v=_6mtiwQ3nvw
You should also become familiar with pushState, which is covered in lots of detail, with an example implementation, in this blog post:
http://moz.com/blog/create-crawlable-link-friendly-ajax-websites-using-pushstate
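In case a concrete sketch helps, here is a minimal illustration of the pushState pattern that post describes, in TypeScript. The function names (`buildSectionUrl`, `showSection`, `loadSectionViaAjax`) and the `/products` base path are illustrative, not part of any of the guides above:

```typescript
// Sketch of the pushState approach: when a panel of content is fetched via
// AJAX, update the address bar to a real, crawlable URL instead of a "#"
// fragment, so each state of the page has a URL a crawler can request.

// Hypothetical helper: build a clean path like /products/widgets.
function buildSectionUrl(basePath: string, sectionSlug: string): string {
  return basePath.replace(/\/$/, "") + "/" + encodeURIComponent(sectionSlug);
}

function showSection(sectionSlug: string): void {
  // loadSectionViaAjax(sectionSlug); // hypothetical: fetch and render the content
  const hist = (globalThis as any).history; // the History API, in a browser
  if (hist && typeof hist.pushState === "function") {
    // The URL in the address bar (and in any links users share) now matches
    // the AJAX-loaded state, e.g. /products/widgets.
    hist.pushState({ section: sectionSlug }, "", buildSectionUrl("/products", sectionSlug));
  }
}
```

The important part is that your server also returns the full content when a URL like /products/widgets is requested directly, so the pushState URL isn't a dead end for Googlebot.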
The guys at Builtvisible have also put together a few good blog posts on this topic which are worth a read:
http://builtvisible.com/javascript-framework-seo/
http://builtvisible.com/on-infinite-scroll-pushstate/
Essentially, you need to make sure that Googlebot can render your content as you intended and that the page looks the same to them as it does to users. You can often test this by checking Google's cache of your page or by using the Fetch as Google feature in Google Webmaster Tools.
I hope that helps!
Paddy
-
Hi,
Making content loaded via AJAX crawlable by Google involves serving Google a static HTML snapshot of that content. We should make sure the HTML snapshot is an exact copy of what visitors see once the AJAX content has loaded.
Here you go for more information:
https://support.google.com/webmasters/answer/174992?hl=en
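As a rough illustration of the scheme that page documents: when a crawler sees a "#!" URL, it instead requests a variant with an `_escaped_fragment_` query parameter, and your server answers that request with the static HTML snapshot. A sketch of the URL mapping in TypeScript (the function name is mine, for illustration only):

```typescript
// The AJAX crawling scheme: "http://example.com/#!key=value" is requested
// by the crawler as "http://example.com/?_escaped_fragment_=key%3Dvalue",
// and the server returns a static HTML snapshot for that URL. This helper
// shows how the crawler derives the snapshot URL from a hash-bang URL.
function toEscapedFragmentUrl(hashBangUrl: string): string {
  const [base, fragment = ""] = hashBangUrl.split("#!");
  // Append to an existing query string if there is one.
  const sep = base.includes("?") ? "&" : "?";
  return `${base}${sep}_escaped_fragment_=${encodeURIComponent(fragment)}`;
}
```

Your server then has to recognise the `_escaped_fragment_` parameter and return the snapshot, which must match what a visitor sees after the AJAX content loads, exactly the requirement described above.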
Best regards,
Devanur Rafi
Related Questions
-
IFrames and Thin Content Worries
Hi everyone, I've read a lot about the impact of iFrames on SEO lately -- articles like http://www.visibilitymagazine.com/how-do-iframes-affect-your-seo/ for example. I understand that iFrames don't cause duplicate content or cloaked content issues, but what about thin content concerns? Here's my scenario: our partner marketing team would like to use an iFrame to pull content detailing how Partner A and my company collaborate from a portal the partners have access to. This would let the partners manage their presence on our site directly. The end result would be that Partner A's portal content would be added to Partner A's page on our website via an iFrame, and this would happen across at least 100 URLs. Currently we have traditional partner pages with unique HTML content. There's a little standalone value for queries involving the bigger partners' names plus use-case terms, but only in less than 10% of cases. So I'm concerned about those pages, but I'm more worried about the domain overall. My main concern is that, in the eyes of Google, I'd be stripping a lot of content off the domain all at once and replacing it with shell pages containing nothing (in terms of SEO) but meta data, a headline, navigation links, and an iFrame. If that's the case, would Google view those URLs as having thin content? And could that potentially impact the whole domain negatively? Or would Google understand that the pages lack content because of the iFrames and give us a pass? Thoughts? Thanks, Andrew
Intermediate & Advanced SEO | | SafeNet_Interactive_Marketing0 -
How does Google recognize original content?
Well, we wrote our own product descriptions for 99% of the products we have. They are all descriptive and have at least four bullet points to show the product's best features without reading the whole description. So instead of using manufacturer descriptions, we spent $$$$ working with a copywriter, and we still do the same thing whenever we add a new product to the website. However, since we use a product data feed and send it to Amazon and Google, they use our product descriptions too. I always wait a couple of days for Google to crawl our product pages before I send recently added products to Amazon or Google. I believe that if Google crawls our product page first, we will be treated as the owner of the content. Am I right? If not, I believe Amazon is taking advantage of my original content. I'm asking because we are a relatively new e-commerce store (online since Feb 1st). While we didn't have a lot of organic traffic in the past, our organic traffic dropped by about 50% in April, so it seems we were affected by the latest Google update. We never bought a link or did black-hat link building; in fact, we didn't do any link building at all until last month. So did Google decide we have shallow or duplicated content and drop our rankings? Our organic traffic has been improving very, very slowly since then, but basically it amounts to just 5%-10% of our current daily traffic. What do you guys think? Is all our original content effort going to waste?
Intermediate & Advanced SEO | | serkie1 -
Does this make sense for recovering from Panda?
Hello guys, our website was hit by Panda on 9/27/2012 and we haven't been able to recover since. I've fixed as much of the poor content as possible, and we have been getting high-quality links consistently for the past 3-4 months. Our blog had some duplicate content issues due to categories, tags, feeds, etc.; I solved those problems before the past two refreshes, without success. I'm now considering moving the blog to a subdomain. More than PageRank, I'm interested in recovering from Panda and letting the blog grow on its own. What do you think about that?
Intermediate & Advanced SEO | | DaveMri0 -
How should I exclude content?
I have category pages on an e-commerce site that are showing up as duplicate pages. At the top of each page are Register and Login links, and when selected they resolve to category/login and category/register. I see three options to fix this and was wondering which you think is best: 1. Use robots.txt to exclude them (there are hundreds of categories, so the file could become large). 2. Use canonical tags. 3. Force Login and Register to go to their own standalone pages.
Intermediate & Advanced SEO | | EcommerceSite0 -
Okay, so I had the top organic spot (#1 position) above the Places section. Now it has been eliminated and I hold the top spot in just the Places section. What gives?
So I don't just want to be number one in the Places section.
Intermediate & Advanced SEO | | NWExterminating0 -
How to promote good content?
Our team just finished a massive piece of content, very similar to the SEOmoz Beginner's Guide to SEO but for the salon/aesthetics industry. We have a beautifully designed 10-chapter, 50-page PDF that will require an email form submission to download. Each chapter is optimized for specific phrases and will be a separate HTML page that is publicly available, very much like how this is set up: http://www.seomoz.org/beginners-guide-to-seo My question is: what's the best way to promote this thing? Any specific examples would be ideal. I think blogger outreach would likely be the best approach, but is there any specific way I should be doing this? Again, a specific start-to-finish example is what I'm looking for here. (I've read almost every outreach post on Moz, so no need to reference those.) Anyone care to rattle off a list of ideas with accompanying examples? Even if they seem like no-brainers, I'm all ears.
Intermediate & Advanced SEO | | ATMOSMarketing560 -
Duplicate Content / Indexing Question
I have a real estate WordPress site that uses an IDX provider to add real estate listings. A new page is created as each property comes to market, and the page is deleted when the property is sold. I like the functionality of the service, but it creates a significant number of 404s, and I'm also concerned about duplicate content, because anyone else using the same service here in Las Vegas will have thousands of the exact same property pages that I do. Any thoughts on this? Is there a way to have the search engines index only the core 20 pages of my site and ignore future property pages? Your advice is greatly appreciated. See this link for an example: http://www.mylvcondosales.com/mandarin-las-vegas/
Intermediate & Advanced SEO | | AnthonyLasVegas0 -
Homepage Content
I have a website that performs very well for some keywords and much worse for others, and I would like to optimize the underperforming ones. Let's say our website offers two main services: KEYWORD A and KEYWORD Z. KEYWORD Z is a very important keyword for us in terms of revenue. KEYWORD A gives us position No. 1 on our local Google and properly directs visitors to xxxxxx.com/keyword-a/keyword-a.php. KEYWORD Z performs badly and gives us position No. 7 in local Google search; 90% of its Google traffic is sent to xxxxxx.com/keyword-z/keyword-z.php and the other 10% to the homepage. The homepage is a "soup" of all the services our company offers, some important (KEYWORD Z) and others much less so. To optimize KEYWORD Z, we are thinking of permanently redirecting xxxxxx.com/keyword-z/keyword-z.php to xxxxxx.com and rewriting the homepage content to describe ONLY KEYWORD Z. I am not sure whether Google gives more weight to homepage content or not. Of course, links from the homepage to other pages like xxxxxx.com/keyword-a/keyword-a.php will still exist. The point for us is to optimize the homepage better and give more importance to KEYWORD Z. Does that make sense?
Intermediate & Advanced SEO | | netbuilder0