Leverage browser caching in Nginx
-
Hi there,
Does anyone have any experience with leveraging browser caching on Nginx?
Every time I run Google's PageSpeed test, my site comes up with this as a high-priority issue that needs to be addressed.
I can see that it is mostly images that don't have an expiry date, so I tried putting the following into my conf file:
location ~* \.(jpg|jpeg|gif|css|png|js|ico)$ {
    expires max;
}

But this broke my page: all elements were out of place and images were missing.
I also tried excluding css and js, but then all images were still missing.
My site runs on Drupal and I use APC for PHP to improve load times.
Hope somebody might be able to help me out.
-
Man, you have a difficult case... I'm not sure what is happening, but on my page I used this code:
<FilesMatch "\.(ico|jpg|jpeg|png|gif|swf|css|js)$">
    Header set Expires "Sun, 30 Apr 2090 20:00:00 GMT"
    Header set Last-Modified "Wed, 20 Feb 2012 09:00:00 GMT"
</FilesMatch>
You should try this. And try clearing your browser cache.
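Note that the snippet above is Apache (.htaccess) syntax, so it won't have any effect in an Nginx conf file. For Nginx, something along these lines may behave better with Drupal. This is only a minimal sketch: the try_files / @rewrite fallback assumes a typical Drupal clean-URL setup where image-style derivatives are generated through index.php on first request, which is a common reason a plain expires location block makes images disappear.

location ~* \.(jpg|jpeg|gif|png|ico|css|js)$ {
    # Far-future expiry headers for static assets.
    expires max;
    add_header Cache-Control public;
    # If the file doesn't exist yet (e.g. a Drupal image-style
    # derivative), hand the request back to Drupal instead of 404ing.
    try_files $uri @rewrite;
}

location @rewrite {
    # Drupal front controller (clean URLs assumed).
    rewrite ^ /index.php;
}

If elements are still out of place after a change like this, checking the failing requests in the browser's network tab should show whether they are being caught by the new location block.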
Related Questions
-
Googlebots and cache
Our site checks whether visitors are resident in the same country or live abroad. If it recognises that the visitor comes from abroad, the content is made more appropriate for them. Basically, instead of encouraging the visitor to come and visit a showroom, it tells them that we export worldwide. It does this by IP checking. So far so good! But I noticed that if I look at cached pages in Google's results, the cached pages are all export pages. I've also used Google Webmaster Tools (Search Console) and rendered pages as Google - and they also render as export pages. Does anybody have a solution to this?
Technical SEO | pulcinella2uk
Is it a problem?
Can Google see the proper (local, as in UK) version of the site?
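For what it's worth, Googlebot crawls mostly from US IP addresses, so any IP-based country check like the one described will classify it as an overseas visitor, and the export version is what gets rendered, indexed, and cached. A hypothetical sketch of that kind of check using Nginx's GeoIP module (the database path and the $audience variable are illustrative assumptions, not taken from the actual site):

geoip_country /usr/share/GeoIP/GeoIP.dat;  # hypothetical database path

server {
    location / {
        # Googlebot's US IPs fail this check, so it always
        # takes the "export" branch.
        if ($geoip_country_code != "GB") {
            set $audience "export";
        }
    }
}

Since Google keeps only one cached copy per URL, whatever version Googlebot's IP resolves to is the version that ends up cached.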
Why Does My Google Web Cache Redirect to My Homepage?
Why does my Google web cache appear for a short period of time and then automatically redirect to my homepage? Is there something wrong with my robots.txt? The only files I have blocked are below:

User-agent: *
Disallow: /bin/
Disallow: /common/
Disallow: /css/
Disallow: /download/
Disallow: /images/
Disallow: /medias/
Disallow: /ClientInfo.aspx
Disallow: /*affiliateId*
Disallow: /*referral*
Technical SEO | Francis.Magos
Google dropping pages from SERPs even though indexed and cached. (Shift over to https suspected.)
Anybody know why pages that have previously been indexed - and that are still present in Google's cache - are now not appearing in Google SERPs? All the usual suspects - noindex, robots, duplication filter, 301s - have been ruled out. We shifted our site over from http to https last week and it appears to have started then, although we have also been playing around with our navigation structure a bit too. Here are a few examples...

Example 1:
Live URL: https://www.normanrecords.com/records/149002-memory-drawings-there-is-no-perfect-place
Cached copy: http://webcache.googleusercontent.com/search?q=cache:https://www.normanrecords.com/records/149002-memory-drawings-there-is-no-perfect-place
SERP (1): https://www.google.co.uk/search?q=memory+drawings+there+is+no+perfect+place
SERP (2): https://www.google.co.uk/search?q=memory+drawings+there+is+no+perfect+place+site%3Awww.normanrecords.com

Example 2:
SERP: https://www.google.co.uk/search?q=deaf+center+recount+site%3Awww.normanrecords.com
Live URL: https://www.normanrecords.com/records/149001-deaf-center-recount-
Cached copy: http://webcache.googleusercontent.com/search?q=cache:https://www.normanrecords.com/records/149001-deaf-center-recount-

These are pages that have been linked to from our homepage (Moz PA of 68) prominently for days, are present and correct in our sitemap (https://www.normanrecords.com/catalogue_sitemap.xml), have unique content, have decent on-page optimisation, etc. etc. We moved over to https on 11 Aug. There were some initial wobbles (e.g. 301s from normanrecords.com to www.normanrecords.com got caught up in a nasty loop due to the conflicting 301 from http to https) but these were quickly sorted (i.e. spotted and resolved within minutes). There have been some other changes made to the structure of the site (e.g. a reduction in the navigation options) but nothing I know of that would cause pages to drop like this. For the first example (Memory Drawings) we were ranking on the first page right up until this morning and have been receiving Google traffic for it ever since it was added to the site on 4 Aug. Any help very much appreciated! At the very end of my tether / understanding here... Cheers, Nathon
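On the redirect loop mentioned above: one loop-proof pattern is to send every non-canonical variant straight to the canonical https www host in a single hop, rather than chaining separate non-www-to-www and http-to-https rules. A generic sketch, not taken from the site's actual config:

# All http traffic, www or not, goes straight to the canonical host.
server {
    listen 80;
    server_name normanrecords.com www.normanrecords.com;
    return 301 https://www.normanrecords.com$request_uri;
}

# https on the bare domain also redirects in one hop.
# (ssl_certificate directives omitted from this sketch.)
server {
    listen 443 ssl;
    server_name normanrecords.com;
    return 301 https://www.normanrecords.com$request_uri;
}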
Technical SEO | nathonraine
Pages appear fine in browser but 404 error when crawled?
I am working on an eCommerce website built in WordPress, with the shop pages running E commerce Plus PHP v6.2.7. All the shop product pages appear to work fine in a browser, but 404 errors are returned when the pages are crawled. WMT also returns a 404 error when 'Fetch as Google' is used. Here is a typical page: http://www.flyingjacket.com/proddetail.php?prod=Hepburn-Jacket Why is this page returning a 404 error when crawled? Please help!
Technical SEO | Web-Incite
Robots.txt Download vs Cache
We made an update to the robots.txt file this morning, after the initial download of the robots.txt file. I then submitted the page through Fetch as Googlebot to get the changes in asap. The cache timestamp on the page now shows Sep 27, 2013 15:35:28 GMT. I believe that would put the cache timestamp at about 6 hours ago. However, the Blocked URLs tab in Google WMT shows the robots.txt last downloaded 14 hours ago - and therefore it's showing the old file. This leads me to believe that, for robots.txt, the cache date and the download time are independent. Is there any way to get Google to recognize the new file other than waiting it out?
Technical SEO | Rich_A
Google showing a Cached option but then giving a 404
2 weeks ago my home page plus some others had a 301 redirect to another domain for about 1 week (due to a hack). The original pages were then de-indexed and the new bad domain was indexed and in effect stole my rankings. Then the 301 was removed/cleaned from my domain and the bad domain was fully de-indexed via a request I made (this was 1 week ago). Then my pages came back into the index but without any ranking power. Now when I perform a search for my domain, my home page is listed with an option to view the cache, but clicking on the cache brings up a 404 error. So why is Google showing the cached option when it doesn't have the cached file? How do I get Google to properly update its cache or show a cached copy?
Technical SEO | Dantek
Caching Problem!
Hi Webmasters, I have been having a caching problem. I have an SEO blog, glanceseo.com, and it takes something like 2 months for new pages to get cached. I want to solve this, so please suggest something... Thanks in advance
Technical SEO | shubhamtiwari
500 error codes caused by W3 Total Cache plugin?
Hello Everyone, I operate a site (http://www.nationalbankruptcyforum.com) that has been receiving 500 error codes in Webmaster Tools as of late. This morning, Webmaster Tools showed 129 crawl errors with 500 status codes. I've included one of the URLs that contained an error message here: http://www.nationalbankruptcyforum.com/marriage-and-bankruptcy/do-my-wife-and-i-both-have-to-file-for-bankruptcy/ I've been getting these errors for about 3 weeks now, and they've mostly been on obscure, strange URLs (lots of numbers etc.); however, this morning they started showing up on pages that will actually be trafficked by users. I'm really not sure where they're coming from, although I do believe it's a software issue, as I've had my hosting company take a look to no avail. I have had some development work done recently and am running the W3 Total Cache plugin (my site is built on WP). I also run the Yoast SEO plugin and rely on it to publish an XML sitemap, among other things. Anyone have any idea where these 500 errors originate from? Thanks, John
Technical SEO | oconn1460