Google Fetch and Render - Partial result (resources temporarily unavailable)
-
Over the past few weeks, my website pages have been showing as 'Partial' in the Google Search Console. Many resources/files (JS, CSS, images) are 'temporarily unreachable'. The website files haven't had any structural changes for about two years (it has historically always shown as 'Completed' and rendered absolutely fine in Search Console).
I have checked, and the robots.txt is fine, as is the sitemap. My host hasn't been very helpful, but has confirmed there are no server issues. My website rankings have now dropped, which I think is due to these resource issues, and I need to clear this issue up ASAP. Can anyone here offer any assistance? It would be hugely appreciated.
Thanks,
Dan
-
Can anyone suggest any answers, or has anyone had similar issues? I continue to monitor the site via Fetch and Render and the issues remain the same: lots of images, CSS and JS files 'Temporarily Unreachable' (yet they do exist and can be opened when the link is clicked). The website functions fine otherwise.
As I say, I have changed website hosts and it is still the same. This is really affecting my rankings, and if anyone has any clues I would be most grateful.
Many thanks,
Dan
-
Hi Martijn,
Thank you for your response!
The Fetch and Render results for a page of the website look different every time. Sometimes it is images that are 'Temporarily Unreachable', sometimes it is CSS/JS files, and sometimes both. However, there is never a 'Completed' result, always some form of 'Partial' result. All the files/images are reachable when you click on them, however. Nothing is blocked in terms of robots.
From time to time the entire page itself says 'Temporarily Unreachable', although it comes back to 'Partial' after waiting a few hours.
I have contacted my web hosts, who haven't offered much help. I actually changed web hosts and paid for a more expensive, faster server (as I assumed the server was taking too long for Google)!
However, the results are the same, so I'm really struggling to understand why this is happening. As before, the robots.txt file is fine without any blocking. Could you explain what you mean in terms of crawling with the GoogleBot User-Agent?
Having had a quick scan around different forums, it seems there are quite a few websites having a similar problem, but there doesn't seem to be a solution so far.
Thanks again for your time.
-
Hi Dan,
Are there any more insights into what the screenshot actually looks like when the resources aren't being loaded? In addition, I would try to crawl the site/page with the GoogleBot User-Agent and see for yourself what happens. In some cases your CDN or server could be blocking requests that are made frequently; obviously this shouldn't happen with Google, but it wouldn't be the first time I've seen GoogleBot being blocked by a server.
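For example, a quick way to approximate this is to request the page while sending Googlebot's published User-Agent string (note this only mimics the header, so blocking based on Google's IP ranges won't show up this way). A minimal Python sketch, with a hypothetical example URL:

```python
import urllib.request

# Googlebot's published desktop User-Agent string.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def fetch_as_googlebot(url, timeout=10):
    """Fetch `url` while presenting Googlebot's User-Agent, returning
    the HTTP status code and the first 200 bytes of the body."""
    req = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return resp.status, resp.read(200)

# Usage (hypothetical URL):
# status, preview = fetch_as_googlebot("https://www.example.com/style.css")
```

If a request like this gets a 403, a 429, or a timeout while a normal browser User-Agent succeeds, the server or CDN is treating the bot differently.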
Martijn.
Related Questions
-
URL Parameter for Limiting Results
We have a category page that lists products. We have parameters, and the default value limits the page to displaying 9 products. If the user wishes, they can view 15 or 30 products on the same page. The parameter is ?limit=9, ?limit=15, and so on. Google is recognizing this as duplicate meta tags and meta descriptions via HTML Suggestions. I have a couple of questions.
1. What should my goal be? Is it to have Google crawl the page with 9 items, or to crawl the page with all items in the category? In Search Console, the first part of setting up a URL parameter asks "Does this parameter change page content seen by the user?". In my opinion, the answer is Yes. Then, when I select how the parameter affects page content, I assume I'd choose Narrows, because it's either narrowing or expanding the number of items displayed on the page.
2. When setting up my URL parameters in Search Console, do I want to select Every URL or just let Googlebot decide? I'm torn, because the description of Every URL says the setting could result in Googlebot unnecessarily crawling duplicate content on your site (it's already doing that). Reading further, I begin to second-guess the Narrows option. Now I'm at a loss on what to do. Any advice or suggestions will be helpful! Thanks.
SERP Trends | dkeipper
-
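One commonly suggested mitigation for parameterized listing pages like this (an addition here, not part of the original question) is a rel=canonical tag on each variant pointing at the default view, i.e. the URL with the limit parameter stripped. A hypothetical Python sketch of that normalization:

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

def canonical_url(url, drop_params=("limit",)):
    """Return `url` with the given query parameters removed, e.g. for
    use in a rel=canonical tag pointing at the default listing view."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k not in drop_params]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(query), parts.fragment))

# e.g. canonical_url("https://example.com/category?limit=15&page=2")
#      -> "https://example.com/category?page=2"
```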
Ways to fetch search analytics - historical search query data from Google Search Console
Is there any way to fetch all historical search query data from Google Search Console? Google allows us to view only 90 days of reports at the maximum. Does integrating Google Search Console with the Google Analytics tool solve this problem?
SERP Trends | NortonSupportSEO
-
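Since the interface only exposes about 90 days, the usual workaround (an assumption on my part, not something stated in the question) is to archive the data yourself by querying the Search Analytics API on a schedule, splitting long periods into API-sized startDate/endDate windows. A hypothetical helper for the splitting step:

```python
from datetime import date, timedelta

def date_windows(start, end, days=90):
    """Split the inclusive range [start, end] into consecutive windows of
    at most `days` days, suitable as startDate/endDate pairs for
    repeated Search Analytics API queries."""
    windows = []
    cursor = start
    while cursor <= end:
        window_end = min(cursor + timedelta(days=days - 1), end)
        windows.append((cursor.isoformat(), window_end.isoformat()))
        cursor = window_end + timedelta(days=1)
    return windows
```

Each (startDate, endDate) pair would then be passed to one API query, and the rows appended to your own archive.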
Can't find link Google says is inserted
Guys, one of my sites is coming up as "hacked" in Google. I have looked for the link they are suggesting but cannot find it in the database. I have tried to resubmit, but they are saying it is against the terms of service.
SERP Trends | Johnny_AppleSeed
-
Google vs. Bing
We rolled out a new site for a customer on an old (branded) domain (14 years old) about three weeks ago, and we are doing very, very well in Google for 40+ keywords, but we can't make a dent in Bing. Absolutely nothing. We are using the webmaster tools for both. Are we missing something? Are we not using Bing Webmaster Tools correctly?
SERP Trends | CsmBill
-
Appearing in Universal Results drops us from Organic Results
Hi all, has anyone noticed that achieving an appearance in the Universal Results (7-box) forced their previous organic ranking to drop out completely for that keyword? I thought Google would still show us in Universal AND Organic. Is this typical? Here's what happened:
Last week: Ranked no. 6 in standard organic results for a specific keyword (but the 7-box Universal Results appear ahead of us between positions 3 and 4, and we're not listed).
This week: We added ourselves to Google Places a few weeks ago, and this week we suddenly appear in the desirable 7-box Universal result, which is a much higher and better ranking (great!). But interestingly, we notice that at the same time our normal organic ranking at no. 6 has dropped out completely (-50 in the Moz tool). Is it an either/or for Organic vs. Universal, or can you ever keep ranking in both?
SERP Trends | emerald
-
Redirection of a domain possibly affected by Google
I have the domain clinicadentalbarcelona.net, which in the past suffered a big traffic drop due to algorithm updates, although it did not receive any Google letter. The domain still gets good rankings but has some toxic links, so I want to move the content to a new domain, clinicadentalbarcelona.es, which has no history or any link building. My question is whether I can 301-redirect clinicadentalbarcelona.net (which has some PageRank and links) to clinicadentalbarcelona.es/blog, to prevent it from affecting the main root domain and to bring the page ranking and some links to a site that has none. Thank you. What would you do in my case?
SERP Trends | maestrosonrisas
-
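For what it's worth, a domain-wide 301 like the one described is usually configured at the server level; a minimal Apache .htaccess sketch for the .net domain (illustrative only, and assuming Apache with mod_rewrite, which the question doesn't specify):

```apache
# Hypothetical .htaccess on clinicadentalbarcelona.net:
# 301-redirect every path to the same path under the new domain's /blog.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?clinicadentalbarcelona\.net$ [NC]
RewriteRule ^(.*)$ https://clinicadentalbarcelona.es/blog/$1 [R=301,L]
```

Whether the redirect is a good idea SEO-wise (given the toxic links) is a separate question from the mechanics shown here.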
Google Merchant Center Feed Disapproved - Data Quality Good - No Warnings
I have noticed Google Merchant Center has been making many changes over the last month. Feeds can now be optimized for certain product attributes. The dilemma currently is that I have a Google Merchant Center data feed that shows zero warnings and whose data quality is good. Unfortunately, the entire feed has been disapproved. Across many other websites where I noticed the same issues, I have been able to fix all warnings and the feeds are accepted perfectly. This one site's issues are eluding me. Does anybody have any suggestions or experience dealing with this problem? Possible issues I have looked into that could be affecting the feed (the Merchant Center guidelines have been reviewed multiple, multiple times):
1. The website has limited duplicate content taken from distributors' product listings (I have fought an unending battle with the site owner to make all product content original).
2. Refurbished products: the site's feed has listed all products as "new", but I found "refurbished" in some of the product content. The guidelines state that products must be listed and marked as refurbished in the feed. To overcome this issue I disabled all refurbished products and resubmitted the feed. This did produce a good, approved data feed.
SERP Trends | SEMCLIX
-
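For reference, the condition attribute in an XML Shopping feed looks like the sketch below (a hypothetical item; the accepted values for condition are new, refurbished, and used):

```xml
<item>
  <g:id>SKU-123</g:id>
  <g:title>Refurbished Widget</g:title>
  <g:condition>refurbished</g:condition>
</item>
```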
Best keyword research tool for Google image search?
What is the best research tool for finding search data specifically for Google Image search?
SERP Trends | nicole.healthline