Implications of extending browser caching for Google?
-
I have been asked to leverage browser caching on a few scripts in our code.
- http://www.googletagmanager.com/gtm.js?id=GTM-KBQ7B5 (16 minutes 22 seconds)
- http://www.google.com/jsapi (1 hour)
- https://www.google-analytics.com/plugins/ua/linkid.js (1 hour)
- https://www.google-analytics.com/analytics.js (2 hours)
- https://www.youtube.com/iframe_api (expiration not specified)
- https://ssl.google-analytics.com/ga.js (2 hours)
The time beside each link is the cache expiration set by each script's owner. I'm being asked to extend that time to 24 hours, and part of this task is making sure doing so is a good idea. It would not be in our best interest to do something that disrupts the collection of data.
Some of what I'm reading recommends keeping a local copy, which would mean either missing updates from GA/GTM or creating a cron job to download updates daily.
Another concern is whether caching these would delay or disrupt data collection. That's an unknown to me; it may not be to you.
There is also the concern that Google recommends not caching beyond the settings they've applied themselves.
Any help on this is much appreciated.
Do you see any issues/risks/benefits/etc. to doing this from your perspective?
-
Thanks, this is super helpful
-
You wouldn't disrupt the collection of data, but you would need to run a cron job to keep updating the local copy. It is not recommended that you store Google Analytics locally, and honestly it would make little difference to your speed; it's more trouble than it's worth. Caching is not recommended by Google for a reason.
Although if your page speed is healthy, you really have nothing to worry about. If your concern is just getting 100/100 on the page speed tests, I have heard that this does the trick:
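If you did decide to self-host anyway, the cron side of it is small. A rough sketch, assuming a Linux server with a web root of /var/www/html (both assumptions, adjust the paths and schedule for your own setup):

```shell
# Hypothetical crontab entry (edit with `crontab -e`): re-fetch analytics.js daily at 03:15.
# Downloading to a temp file and then moving it keeps a failed download
# from clobbering the working copy.
15 3 * * * curl -fsSL https://www.google-analytics.com/analytics.js -o /var/www/html/js/analytics.js.tmp && mv /var/www/html/js/analytics.js.tmp /var/www/html/js/analytics.js
```

You would then also have to point your tracking snippet at the local copy and set your own 24-hour Expires header on it, which is exactly the ongoing maintenance I mean when I say it's more trouble than it's worth.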
https://developers.google.com/speed/pagespeed/module/filter-make-google-analytics-async#description
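For reference, that filter is just a one-line enable in mod_pagespeed's server config, roughly like this (assuming you run Apache with mod_pagespeed installed, which is itself an assumption about your stack):

```apache
# pagespeed.conf: rewrite synchronous ga.js snippets to the async loading pattern
ModPagespeedEnableFilters make_google_analytics_async
```

Note it targets the older ga.js snippet specifically, so check it against whichever tracking code you actually have on the page.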
Danny