Should I worry about rendering problems with my pages in Google Search Console's Fetch as Google?
-
Some elements are not shown properly when I preview our pages in Search Console (Fetch as Google), e.g.
Google Maps, CSS tables, etc., and some parts are not showing up at all because we load them asynchronously for better page speed. Is this something I should pay attention to and try to fix?
-
This is a great question, and you will likely get a few different opinions. I, for one, am a fan of using Google's text-only cache as my primary source for determining whether something is crawlable, though I do tend to support that hypothesis with the Fetch and Render tool as well.
Screaming Frog recently added rendering to its toolset. Consider downloading it and seeing how it displays your page's content; it can be another good source for validating that everything is crawlable.
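For a quick supplementary check, a short script can confirm whether a given piece of content appears in the raw HTML that crawlers receive before any JavaScript runs. A minimal, standard-library-only sketch; the URL and marker text are hypothetical placeholders for your own page and content:

```python
# Check whether a marker string is present in the raw HTML response,
# i.e. what a non-rendering crawler sees before JavaScript executes.
# URL and MARKER are hypothetical placeholders.
import urllib.request

URL = "https://www.example.com/locations/"
MARKER = "Opening hours"  # text you expect search engines to index

req = urllib.request.Request(URL, headers={"User-Agent": "Mozilla/5.0"})
html = urllib.request.urlopen(req, timeout=10).read().decode("utf-8", "replace")

if MARKER in html:
    print("Marker found in raw HTML: visible without rendering.")
else:
    print("Marker NOT in raw HTML: likely injected asynchronously; "
          "verify it with Fetch and Render or a rendering crawler.")
```

If the marker only shows up after rendering, that is exactly the kind of content worth double-checking in the Fetch and Render view.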
Nick
Related Questions
-
Kinds of organic search results (Google)
Not sure if this is a new "unit" for Google organic results. Please see the attached image. When searching for "invoice software", the top quarter of the page is a ribbon of products/brands with badly formatted logos. The fact that it's so ugly, and that there's nothing marking it as a paid result, leads me to think it's organic. Does anyone know what this SERP unit is called, and better still, how do you get included? We rank super high in the normal organic results but don't appear at all in this product ribbon.
Intermediate & Advanced SEO | RobM416
Google WMT/Search Console showing thousands of links in "Internal Links"
Hi, one of our blog posts shows thousands of internal links in Search Console's "Internal Links" report, yet the report lists only 2 pages it is linked from. How did it end up connected to so many internal links? I don't see them anywhere. Thanks, Satish
Intermediate & Advanced SEO | vtmoz
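As a rough cross-check of a suspicious "Internal Links" count, you can crawl a sample of your own pages and count the links pointing at the post yourself. A standard-library sketch; the target and page list are hypothetical placeholders, and it only matches absolute hrefs:

```python
# Count links to a target URL across a sample of pages.
# TARGET and PAGES are hypothetical placeholders; relative hrefs
# are not resolved, so only absolute links are matched.
import urllib.request
from html.parser import HTMLParser

TARGET = "https://www.example.com/blog/popular-post/"
PAGES = ["https://www.example.com/", "https://www.example.com/blog/"]

class LinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        href = dict(attrs).get("href") or ""
        if tag == "a" and href.rstrip("/") == TARGET.rstrip("/"):
            self.count += 1

total = 0
for page in PAGES:
    html = urllib.request.urlopen(page, timeout=10).read().decode("utf-8", "replace")
    counter = LinkCounter()
    counter.feed(html)
    print(f"{page}: {counter.count} link(s) to target")
    total += counter.count
print(f"Total across sampled pages: {total}")
```

A large gap between such a hand count and the report usually points to sitewide template elements (navigation, footers, related-post widgets) being counted once per page.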
Our client's web property recently switched over to secure pages (https), but their non-secure (http) pages are still being indexed in Google. Should we request in GWMT to have the non-secure pages deindexed?
Our client recently switched over to https via a new SSL certificate. They have also implemented rel canonicals for most of their internal pages (pointing to the https versions). However, many of their non-secure pages are still being indexed by Google. We have access to their GWMT accounts for both the secure and non-secure versions.
Should we just let Google figure out what to do with the non-secure pages? We would like to set up 301 redirects from the old non-secure pages to the new secure pages, but we're not sure whether that will happen. We thought about requesting in GWMT that Google remove the non-secure pages, but that felt pretty drastic. Any recommendations would be much appreciated.
Intermediate & Advanced SEO | RosemaryB
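For the 301 question, it is easy to spot-check whether the http URLs already redirect before reaching for more drastic measures. A standard-library sketch with a hypothetical host and paths:

```python
# Verify that http URLs return a 301 pointing at their https versions.
# Host and paths are hypothetical placeholders.
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # don't follow redirects, so we can inspect them

opener = urllib.request.build_opener(NoRedirect())
for path in ["/", "/about/", "/contact/"]:
    url = f"http://www.example.com{path}"
    try:
        opener.open(url, timeout=10)
        print(f"{url}: no redirect (still serving over http)")
    except urllib.error.HTTPError as e:
        location = e.headers.get("Location", "?")
        ok = e.code == 301 and location.startswith("https://")
        print(f"{url}: {e.code} -> {location} [{'OK' if ok else 'CHECK'}]")
```

With sitewide 301s in place and the canonicals already pointing at https, the http pages normally drop out of the index on their own as Google recrawls them.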
Google Page Speed Score 91, But 5-8 Seconds to Download URL
Greetings Moz Community: In Google Analytics, under "Behavior" and "Site Speed", our home page has a PageSpeed score of 91, which I assume is pretty fast. However, the "Average Page Load Time" varies between 5 and 8 seconds, which seems very slow. My developers have made major efforts to optimize the home page URL (www.nyc-officespace-leader.com) for speed. The page has a carousel, which I assume may be slowing it down. Is the download speed of this page detrimental to SEO, or is the favorable PageSpeed score good enough? I am particularly concerned because the most competitive phrases rank on the home page. As it stands, I am having a lot of difficulty ranking in the top ten for these terms, and my concern is that the slow download speed of the home page could be holding them back. If necessary, I can always redesign the home page and remove the carousel, or reduce the number of listings in it, to speed things up. Is this worth investing effort in, or is the speed good enough? Thanks, Alan
Intermediate & Advanced SEO | Kingalan1
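One way to see where those 5-8 seconds go is to separate server response time from the rest of the load: the PageSpeed score grades how the page is built, while Analytics' "Average Page Load Time" measures real browser loads, including images, scripts, and the carousel. A rough standard-library sketch that times only the HTML document (the scheme is assumed):

```python
# Rough timing of the HTML document alone: time to first byte vs.
# full HTML download. Browser load time adds images, CSS, JS, etc.
import time
import urllib.request

URL = "https://www.nyc-officespace-leader.com/"  # page named in the question; scheme assumed

start = time.perf_counter()
resp = urllib.request.urlopen(URL, timeout=30)   # returns once headers arrive
ttfb = time.perf_counter() - start               # rough time to first byte
body = resp.read()                               # download the HTML body
total = time.perf_counter() - start

print(f"Rough TTFB: {ttfb:.2f}s; full HTML in {total:.2f}s ({len(body)} bytes)")
```

If the HTML itself comes back in well under a second, the remaining seconds are front-end weight (carousel images and scripts), which is where trimming would pay off.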
Google Fetch Issue
I'm having some problems with what Google is fetching and what it isn't, and I'd like to know why. For example, Google IS fetching a non-existent page and listing it as an error: http://www.gaport.com/carports, when the actual URL is http://www.gaport.com/carports.htm. Google is NOT able to fetch http://www.gaport.com/aluminum/storage-buildings-10x12.htm: it says the page doesn't exist (even though it does), and when I click the not-found link in the fetch report, %E2%80%8E gets appended to the URL, causing the problem. One theory we have is that this may be some sort of server/hosting problem, but that's only because we can't figure out what we could have done to cause it. Any insights would be greatly appreciated. Thanks and Happy Holidays! Ruben
Intermediate & Advanced SEO | KempRugeLawGroup
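The stray %E2%80%8E is the URL-encoding of U+200E, an invisible left-to-right mark, which usually means the character was pasted into a link somewhere. A standard-library sketch that scans a page's links for invisible "format" characters, using the page from the question:

```python
# Scan a page's <a href> values for invisible Unicode "format"
# characters (category Cf), e.g. U+200E, which encodes as %E2%80%8E.
import unicodedata
import urllib.request
from html.parser import HTMLParser

URL = "http://www.gaport.com/aluminum/storage-buildings-10x12.htm"

class HrefCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        href = dict(attrs).get("href")
        if tag == "a" and href:
            self.hrefs.append(href)

html = urllib.request.urlopen(URL, timeout=10).read().decode("utf-8", "replace")
collector = HrefCollector()
collector.feed(html)
for href in collector.hrefs:
    hidden = [f"U+{ord(c):04X}" for c in href if unicodedata.category(c) == "Cf"]
    if hidden:
        print(f"Invisible character(s) {hidden} in: {href!r}")
```

Running something like this against the pages that link to the failing URL can show whether a hidden character in a link, rather than the server, is to blame.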
Best practice for removing indexed internal search pages from Google?
Hi Mozzers, I know it's best practice to block Google from indexing internal search pages, but what's best practice when the damage is done? I have a project where a substantial part of our visitors and income lands on internal search pages because Google has indexed them (about 3%). I would like to block Google from indexing the search pages via the meta noindex,follow tag, because:
- Google's guidelines say: "Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines." http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769
- They are a bad user experience.
- The search pages are (probably) stealing rankings from our real landing pages.
- We received the Webmaster notification "Googlebot found an extremely high number of URLs on your site", with links to our internal search results.
I want to use the meta tag to keep the link juice flowing. Do you recommend using robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how should we proceed with blocking them? I'm looking forward to your answer! Edit: Google has currently indexed several million of our internal search pages.
Intermediate & Advanced SEO | HrThomsen
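Before and after switching to the meta tag, it helps to verify two things per URL: that robots.txt does not block it (Google must be able to crawl a page to see its noindex), and that the noindex,follow tag is actually present. A sketch with hypothetical URLs; the regex assumes name appears before content in the meta tag:

```python
# For sample internal search URLs, check (a) crawlability per robots.txt
# (a robots.txt-blocked page's noindex can never be seen) and (b) the
# presence of a meta robots tag. URLs are hypothetical placeholders.
import re
import urllib.request
import urllib.robotparser

SITE = "https://www.example.com"
SEARCH_URLS = [f"{SITE}/search?q=widgets", f"{SITE}/search?q=gadgets"]

rp = urllib.robotparser.RobotFileParser(f"{SITE}/robots.txt")
rp.read()

META_RE = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)', re.I)

for url in SEARCH_URLS:
    crawlable = rp.can_fetch("Googlebot", url)
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
    match = META_RE.search(html)
    meta = match.group(1) if match else "none"
    print(f"{url}: crawlable={crawlable}, meta robots={meta}")
```

This also illustrates why robots.txt would be the wrong tool here: once a URL is disallowed, Google can no longer fetch it to discover the noindex, so already-indexed URLs can linger in the index.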
Does using robots.txt to block pages decrease search traffic?
I know you can use robots.txt to tell search engines not to spend their resources crawling certain pages. So, if you have a section of your website with good content that is never updated, and you want search engines to index new content faster, would it work to block the good, unchanged content with robots.txt? Would this content lose any search traffic if it were blocked by robots.txt? Does anyone have any available case studies?
Intermediate & Advanced SEO | nicole.healthline
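Whatever you decide, it is worth dry-running a proposed robots.txt rule set against sample URLs before deploying it, so good content is not blocked by accident. A standard-library sketch with hypothetical rules and URLs:

```python
# Dry-run a proposed robots.txt against sample URLs before deploying.
# Rules and URLs are hypothetical placeholders.
import urllib.robotparser

PROPOSED = """\
User-agent: *
Disallow: /archive/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(PROPOSED.splitlines())

for url in ["https://www.example.com/archive/2010/old-post/",
            "https://www.example.com/blog/new-post/"]:
    verdict = "blocked" if not rp.can_fetch("Googlebot", url) else "allowed"
    print(f"{url}: {verdict}")
```

Note that blocking crawling is not the same as deindexing: pages disallowed in robots.txt can remain indexed and keep ranking on their existing signals for some time, so measure traffic before and after.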
Do sites with a small number of content pages get penalized by Google?
If my site has just five content pages instead of 25 or 50, will it get penalized by Google for a given moderately competitive keyword?
Intermediate & Advanced SEO | RightDirection