Advice needed: Google crawling for single-page applications with JavaScript
-
Hi Moz community,

We have a single-page application (enjoywishlist.com) with a lot of content in JavaScript lightboxes. There is a lot of valuable content embedded, but Google cannot crawl it, and we may be missing out on some opportunities as a result. I was wondering if someone has been able to solve a similar issue (besides moving the content from the JavaScript to the HTML body). There appear to be a few services sprouting up to handle single-page applications and Google crawling:

http://getseojs.com/
https://prerender.io/

Has anyone used these services? Some feedback would be much appreciated!

Thanks
Andreas
-
Not an expert on JS by any stretch, but Richard Baxter at SEOgadget suggested this post:
http://seogadget.com/javascript-framework-seo/
It's focused on AngularJS, but a lot of the core principles should apply more broadly.
-
Your content needs to be in the HTML.
You can hide it with `display:none` and then show it via JavaScript, but it needs to be on the page.
I would not use such plug-ins; I can't see them fixing the problem, and they will further complicate your site and its crawling.
-
Hi Andreas,
I cannot comment on getseojs or prerender; I have not used them. However, Google can execute some JavaScript to find content, though it has limitations.
This article should help: http://moz.com/ugc/can-google-really-access-content-in-javascript-really
Hope this helps!
Related Questions
-
Fixing my site's problem with duplicate page content
My site has a problem with duplicate page content; SEOmoz is telling me 725 pages' worth. I have looked a lot into the 301 redirect and the rel=canonical tag, and I have a few questions. First of all, I'm not sure which one I should use in this case. I have read that the 301 redirect is the most popular path to take. If I take this path, do I need to go in and change the URL of each of these pages, or does it change automatically within the redirect when I plug in the old URL and the new one? Also, do I just need to go to each page that SEOmoz flags as a duplicate and make a redirect of that page? One thing that I am very confused about is that some of these listed duplicates are actually different pages on my site. Does this just mean the URLs are too similar to each other, and therefore need the redirect to fix them? Then, on the other hand, I have a login page that says it has 50 duplicates. Would this be a case in which I would use the canonical tag, putting it into each duplicate so that the search engine knew to go to the original file? Sorry for all of the questions. Thank you for any responses.
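To make the two options concrete, here is a small sketch (my own illustration, not from the thread; the paths are made up): a 301 is a server-side mapping from old URL to new, maintained in one place rather than edited into each page, while a canonical tag goes into pages that must stay live under several URLs.

```javascript
// Hypothetical redirect map: each old path 301s to its replacement.
// You maintain this mapping server-side; the pages themselves are untouched.
const redirects = {
  '/old-product-page': '/products/widget',
  '/old-category': '/categories/widgets',
};

function resolveRedirect(path) {
  // Returns the 301 target for a path, or null if no redirect applies.
  return Object.prototype.hasOwnProperty.call(redirects, path)
    ? redirects[path]
    : null;
}

// For pages that legitimately exist under many URLs (e.g. a login page
// reached with different parameters), a canonical tag in the <head>
// points engines at the one preferred version instead of redirecting:
function canonicalTag(preferredUrl) {
  return '<link rel="canonical" href="' + preferredUrl + '">';
}
```

Roughly: redirect when the duplicate URL should stop existing; canonicalize when it must keep working for users but should consolidate ranking signals.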
Web Design | JoshMaxAmps
-
Attachment Pages
I have hundreds or thousands of images on my site, but for some reason the images on this page - http://indigocarhire.co.uk/top-of-the-range-car-hire/ - are being flagged as attachment pages, meaning I'm getting errors for duplicate titles, missing metas, etc. Why are these images, and only these ones, being flagged up? They have been added in exactly the same way as every other image on the site. Appreciate any advice. Thanks
Web Design | RGOnline
-
Need to hire a tech to find out why Google's spider can't access my site
Google's spider can't access my site, and all my pages have dropped out of the SERPs. I've called Network Solutions tech support and they say all is fine with the site, which is wrong. Does anyone know of a web tech I can hire to fix this issue? Thanks, Ron
Web Design | Ron10
-
Sites went from page 1 to page 40+ in results
Hello all,

We are looking for any insight we can get as to why all (except one) of our sites have been affected very badly in the Google rankings since the Panda updates. Several of our sites (londonescape.com, dublinescape.com, and the Prague, Paris, Florence, Delhi, Dubai and a few other escape.com URLs) have had a major drop in their rankings. LondonEscape.net (now .com, changed after the rank drop) was ranked between 4th and 6th but is now down around 400th, and DelhiEscape.net and MunichEscape.com were both number 1 for several years for our main keywords. We also had two Stay sites at number 1: AmsterdamStay and NewYorkStay, both .com, ranked number 1 for years. The NewYork site has dropped to 10th place; so far the Amsterdam site has not been affected.

We are not really sure what we did wrong. MunichEscape and DelhiEscape should never have been page 1 sites (just 5 pages and a click-through to the main site, WorldEscape), but we never did anything to make them number 1. The London, NewYork and Amsterdam sites have had regular new content added, all checked to make sure it's original.

**Since the rankings drop**, on the LondonEscape.com site we have:

- Redirected the .net to the .com URL
- Added a mountain of new articles and content
- Redesigned the site/script
- Got a fair few links removed from sites, any with multiple links to us (a few I have not yet managed to get taken down)

So far, no result in increased rankings. We contacted Google, but they informed us we have NOT had a manual ban imposed on us, and we received NO mails from Google informing us we had done anything wrong. We were hoping it would be a 6-month ban, but we are way past that now. Anyone any ideas?
Web Design | WorldEscape
-
Decreasing Page Load Time with Placeholder Images - Good Idea or Bad Idea?
In an effort to decrease our page load time, we are looking at making a change so that all product images on any page past page 1 load with a placeholder image. When the user clicks to the next page, it then loads all of the images for that page. Right now, all of the product divs are loaded into a JavaScript array and written in chunks to the page display div. Product-heavy pages significantly increase load time, as the browser loads all of the images from the product HTML before the JavaScript can rewrite the display div with page-specific product HTML. To get around this, we are looking at loading the product HTML with a small placeholder image and then substituting the appropriate product image URLs when each page is output to the display div. From a user-experience standpoint, this change will be seamless: users won't be able to tell the difference, and they will benefit from a potentially shorter wait while the images for the page in question load. However, the source of the page will show every product image in a given category page as the same image. How much of a negative impact will this have on SEO?
Web Design | airnwater
-
Adding breadcrumbs in the body of a page
We want to implement breadcrumbs to improve the usability of our website - if we manually input breadcrumbs into the body of every page via our CMS are there any negative effects?
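Adding breadcrumbs to the body via a CMS shouldn't hurt; alongside the visible trail, breadcrumb structured data can help engines understand it. A sketch of building a schema.org BreadcrumbList as JSON-LD (my own illustration, not from the thread; the names and URLs are made up):

```javascript
// Build a schema.org BreadcrumbList from an ordered array of crumbs.
function breadcrumbJsonLd(crumbs) {
  return JSON.stringify({
    '@context': 'https://schema.org',
    '@type': 'BreadcrumbList',
    itemListElement: crumbs.map((crumb, i) => ({
      '@type': 'ListItem',
      position: i + 1, // positions are 1-based
      name: crumb.name,
      item: crumb.url,
    })),
  });
}

const jsonLd = breadcrumbJsonLd([
  { name: 'Home', url: 'https://example.com/' },
  { name: 'Widgets', url: 'https://example.com/widgets/' },
]);
// Embed in the page as: <script type="application/ld+json">…</script>
```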
Web Design | braunna
-
Home Page Optimization
I only discovered SEOmoz about a week ago and my knowledge in this area has grown 500% in that time, but I'm still a newbie. I'm looking to find out whether I have the right general idea with my home page in regards to SEO. The page is located at Line.com. The top section with the images is 100% for humans. The next section is where the SEO comes into play. I have 5 different services (sports monitor, free sports betting, sports betting forum, sports handicapper websites, gambling affiliate program) that I offer on 5 different inner pages. What I'm trying to do is have my home page rank decently for my desired terms and then pass link juice to the respective pages. My goal is to eventually have my inner pages rank higher than my home page for my desired search terms. Do I have the right general idea, or am I way off? Is this too much for the search engines, with all of the links and bold text? Design criticisms are also welcome, and anybody who wants to critique the inner pages will be forever thanked. Feel free to be as harsh as you want, as long as it's constructive. Thanks!
Web Design | PatrickGriffith
-
Why is my site not being indexed by Google, and not showing on a crawl test?
On a site we developed, of which the .com is forwarded to the .net domain, we stopped getting crawled by Google on about the 20th of February. Now when we try to run a crawl test on either URL, we get: "There was an error fetching this page. Error description: For some reason the page returned did not describe itself as an HTML page. It could be possible that the URL is serving an image, RSS feed, PDF, or XML file of some sort. The crawl tool does not currently report metrics on this type of data." Our other sites are fine, and this one was too up to that date. We took out noodp and noydir today, as that was the only thing we could think of. The site is on the WP CMS.
Web Design | RobertFisher