Ajax #! URL support?
-
Hi Moz,
My site is currently following the convention outlined here:
https://support.google.com/webmasters/answer/174992?hl=en
Basically, since our pages are generated via Ajax, we are set up to direct bots that replace the #! in a URL with ?_escaped_fragment_= to cached versions of the Ajax-generated content.
For example, if the bot sees this URL:
http://www.discoverymap.com/#!/California/Map-of-Carmel/73
it will instead access this page:
http://www.discoverymap.com/?_escaped_fragment_=/California/Map-of-Carmel/73
My server then serves the cached HTML instead of the live page. This is all per Google's direction and is indexing fine.
However, the Moz bot does not do this. It seems like a fairly straightforward feature to support: rather than ignoring the hash, check whether it is a #!, then spider the URL with the fragment replaced by ?_escaped_fragment_=. Our server does the rest.
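The rewrite being requested is the mapping defined in Google's AJAX crawling specification. A minimal sketch of how a crawler could apply it (the function name is hypothetical, not Roger's actual code, and the spec additionally calls for URL-escaping special characters in the fragment, which is omitted here for clarity):

```typescript
// Map a #! ("hashbang") URL to its ?_escaped_fragment_= equivalent,
// per Google's AJAX crawling scheme. Returns null for non-#! URLs.
function escapedFragmentUrl(url: string): string | null {
  const bang = url.indexOf("#!");
  if (bang === -1) return null; // not an AJAX-crawlable URL
  const base = url.slice(0, bang);
  const fragment = url.slice(bang + 2);
  // If the base URL already has a query string, append with "&".
  const sep = base.includes("?") ? "&" : "?";
  return base + sep + "_escaped_fragment_=" + fragment;
}
```

As the original post says, it is essentially a simple string replacement; the server mapped to the `_escaped_fragment_` parameter does the rest.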
If this is something Moz plans to support in the future, I would love to know. Any other information would be great as well.
Also, pushState is not practical for everyone due to limited browser support.
Thanks,
Dustin
Updates:
I am editing my question because the site won't let me respond to my own question. It says I need to sign up for Moz Analytics. I was signed up for Moz Analytics?! Now I am not? I responded to my invitation weeks ago.
Anyway, you are misunderstanding how this process works. There is no sitemap involved. The bot reads this URL on the page:
http://www.discoverymap.com/#!/California/Map-of-Carmel/73
And when it is ready to spider the page for content, it spiders this URL instead:
http://www.discoverymap.com/?_escaped_fragment_=/California/Map-of-Carmel/73
The server does the rest; it is simply a matter of telling Roger to recognize the #! format and replace it with
?_escaped_fragment_=
I obviously do not know how Roger is coded, but it is a simple string replacement.
Thanks.
-
Hello Dustin, this is Abe on the Moz Help team.
This question is a bit intricate; I apologize if I am not reading it correctly.
With AJAX content like this, I know Google's full specification
https://developers.google.com/webmasters/ajax-crawling/docs/specification
indicates that the #! and ?_escaped_fragment_= technique works for their crawlers. However, Roger is a bit picky and isn't robust enough yet to use only the sitemap as the reference in this case. Luckily, one of our wonderful users came up with a solution using the pushState() method. See
http://www.moz.com/blog/create-crawlable-link-friendly-ajax-websites-using-pushstate
to find out how to create crawlable content using pushState. This should help our crawler read AJAX content. Let me know if this information works for you!
I hope this helps!
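For reference, the pushState() alternative discussed in this thread boils down to replacing #! routes with real paths in the address bar, so crawlers see plain URLs. A minimal sketch under that assumption (function names are illustrative, not from the linked post, and the history call is guarded so the sketch is inert outside a browser):

```typescript
// Extract the route from a #! fragment, e.g. "#!/a/b" -> "/a/b".
function hashBangToPath(hashBang: string): string | null {
  return hashBang.startsWith("#!") ? hashBang.slice(2) : null;
}

// In the browser, a navigation handler can then rewrite the address
// bar without a page reload, yielding a crawlable, bookmarkable URL.
function navigate(hashBang: string): void {
  const path = hashBangToPath(hashBang);
  const h = (globalThis as any).history; // undefined outside a browser
  if (path !== null && h && typeof h.pushState === "function") {
    h.pushState(null, "", path);
    // ...then fetch and render the AJAX content for `path` here...
  }
}
```

The trade-off the original poster raises still applies: history.pushState requires relatively modern browsers, so sites supporting older ones may still need the #! scheme as a fallback.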