Ajax Content Indexed
-
I used the following guide to implement endless scroll: https://developers.google.com/webmasters/ajax-crawling/docs/getting-started. Crawlers read all of the URLs correctly, and a "site:" search shows me every indexed URL with #!key=value.
I would like only the first URL to be indexed; the other URLs should still be crawled, but not indexed, as if they carried the robots meta tag "noindex, follow".
How can I do this?
-
Hmmm... feels like I'm misunderstanding part of the question here. To get your AJAX content indexed, your server needs to return an HTML snapshot to the crawler, so the simple (or not-so-simple) answer is to inject a meta robots noindex, follow tag into the HTML of the snapshot, just as you would on a regular HTML page. How you do this depends on your technology choices and server configuration (which, unfortunately, I probably can't help you much with).
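To make that concrete, here is a very rough sketch of what that injection could look like if your snapshots are rendered by a server-side script. It assumes a PHP back end and a paging parameter called "page", both of which are guesses about your setup, so treat it as an illustration rather than a drop-in fix:

<?php
// Hypothetical snapshot handler: for each #!key=value URL, Google requests the
// page as ?_escaped_fragment_=key=value. Emit noindex, follow on every snapshot
// except the first one, which stays indexable.
$fragment    = isset($_GET['_escaped_fragment_']) ? $_GET['_escaped_fragment_'] : '';
$isFirstPage = ($fragment === '' || $fragment === 'page=1'); // assumption: "page" is your paging key

echo "<html><head><title>Endless scroll snapshot</title>\n";
if (!$isFirstPage) {
    echo "<meta name=\"robots\" content=\"noindex, follow\">\n";
}
echo "</head><body>\n";
// ... render the snapshot content for this fragment here ...
echo "</body></html>\n";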
If you'd like, feel free to leave a few more details about your particular situation and the solutions you're using, and perhaps another community member can chime in.
Related Questions
-
Defining duplicate content
If you have the same sentences or paragraphs on multiple pages of your website, is this considered duplicate content and will it hurt SEO?
-
If a website trades internationally and simply translates its online content from English to French, German, etc., how can we ensure no duplicate content penalties and still maintain SEO performance in each territory?
Most of the international sites are as below: example.com, example.de, example.fr. But some countries are on unique domains, such as example123.rsa.
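hreflang annotations are one common way to signal that the French, German, and other versions are translations rather than duplicates. The snippet below is only a rough sketch using the example domains from the question, and it assumes every domain shares the same path structure:

<?php
// Rough sketch: emit hreflang alternates so each territory's translation is
// treated as a localized page rather than duplicate content.
$currentPath = $_SERVER['REQUEST_URI'];   // assumes identical paths on every domain
$alternates  = [
    'en' => 'https://example.com',
    'de' => 'https://example.de',
    'fr' => 'https://example.fr',
];
foreach ($alternates as $lang => $domain) {
    echo '<link rel="alternate" hreflang="' . $lang . '" href="' . $domain . $currentPath . '">' . "\n";
}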
-
Best practice for expandable content
We are in the middle of having new pages added to our website. On our website we will have an information section containing various details about a product; this information will be several paragraphs long. We want to show the first paragraph and have a "read more" button that reveals the rest of the content, which is hidden. What's Google's view on this? Is it bad for SEO?
-
Content Above The Fold (strategies)
Does anyone know whether using a wide responsive layout that brings content well above the fold on big screens (but still pushes it down on small screens and mobile devices) is a good option? We have an AdSense site that just got destroyed, and I'm assuming it's the new Google algorithm that targets sites with overly large ads above the fold.
-
WordPress and duplicate content
Hi, I have recently installed WordPress and started a blog, but now loads of duplicate pages are cropping up for tags, authors, dates, etc. How do I set up canonicals in WordPress? Thanks, Ian
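One common way to handle those archive pages, as an alternative or complement to canonicals, is to mark them noindex, follow so they are crawled but not indexed. A minimal sketch for a theme's functions.php, with a hypothetical function name:

<?php
// Rough sketch: add noindex, follow to tag, date, and author archives so
// Google can crawl them but won't index the duplicate listings.
function my_noindex_archives() {   // hypothetical name
    if ( is_tag() || is_date() || is_author() ) {
        echo '<meta name="robots" content="noindex, follow">' . "\n";
    }
}
add_action( 'wp_head', 'my_noindex_archives' );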
-
Implementation of AJAX Crawling Specifications
My URL is: http://www.redfin.com/TX/Austin/8413-Navidad-Dr-78735/home/31224372
We're using Google's AJAX crawling system, per the documentation here: https://developers.google.com/webmasters/ajax-crawling/
The example page above requires JavaScript to display content; it includes <meta name="fragment" content="!"> in the source. We have a lot of pages like this on our site. We expect Google to query us at this URL: http://www.redfin.com/TX/Austin/8413-Navidad-Dr-78735/home/31224372?_escaped_fragment_=
This page renders correctly with JavaScript disabled. Are we doing this correctly? There are some small differences between the _escaped_fragment_ HTML snapshot and the JavaScript-generated content. Will this cause any problems for us?
We ask because there was a period of about two months (from October 4th to December 29th) during which Google's crawler radically decreased the hits to our _escaped_fragment_ URLs; it may be recovering now, but maybe it isn't, and I wanted to be absolutely sure we're doing this correctly.
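For reference, the routing described above usually boils down to something like the sketch below; the file paths and the prerendering step are hypothetical, since we don't know how these snapshots are actually generated:

<?php
// Rough sketch: when Google requests ?_escaped_fragment_=, serve the
// prerendered HTML snapshot; otherwise serve the normal JavaScript-driven page.
if (isset($_GET['_escaped_fragment_'])) {
    readfile(__DIR__ . '/snapshots/31224372.html');   // hypothetical snapshot path
} else {
    readfile(__DIR__ . '/templates/listing.html');    // hypothetical template path
}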
-
Ajax website and SEO
Hi all, A client of mine has a website similar to Pinterest, all in Ajax. So imagine an Ajax grid-based animal lover site called domain.com. The domain has three different categories: Cats, Dogs, Mice. When you click on a category, the site doesn't change the URL; instead of going from domain.com to domain.com/cats, the Ajax script just shows all the cat pins. When you click on each pin/post, it opens a page such as domain.com/Pin/123/PostTitle, which doesn't reference the category. However, a page domain.com/cats does exist and you can go there directly. Is it an SEO issue that pins aren't grouped under a category? How does Google handle Ajax these days? It used to be really bad, but if Pinterest is doing so well, I'm assuming times have changed. Any other things to be wary of for a grid-based/Ajax site? I am happy to pay for an hour or two for a more in-depth audit/tips if you can feed back on the above. Fairly urgent. Thanks
-
301 redirect for duplicate content
Hey, I have just started working on a site which is a video-based city guide, with promotional videos for restaurants, bars, activities, etc. The first thing I have noticed is that every video on the site has two possible URLs:
http://www.domain.com/venue.php?url=rosemarino
http://www.domain.com/venue/rosemarino
I know that I can write a .htaccess line to redirect one to the other:
redirect 301 /venue.php?url=rosemarino http://www.domain.com/venue/rosemarino
but this would involve creating a .htaccess line for every video on the site, and new videos that get added may get missed. Does anyone know a way of creating a rule to rewrite these URLs? Any help would be most gratefully received. Thanks. Ade.
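A single mod_rewrite rule can handle every venue in one go. This is only a sketch: it assumes the pretty /venue/ URLs are themselves rewritten internally to venue.php, which is why it matches on THE_REQUEST (the original request line) to avoid a redirect loop:

# .htaccess sketch: 301 any request for /venue.php?url=<slug> to /venue/<slug>
RewriteEngine On
RewriteCond %{THE_REQUEST} \s/venue\.php\?url=([^&\s]+) [NC]
RewriteRule ^ /venue/%1? [R=301,L]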