Google Search Console issue: "This is how Googlebot saw the page" showing part of page being covered up
-
Hi everyone!
Kind of a weird question here but I'll ask and see if anyone else has seen this:
In Google Search Console when I do a fetch and render request for a specific site, the fetch and blocked resources all look A-OK.
However, in the render, there's a large grey box (background of navigation) that covers up a significant amount of what is on the page.
Attaching a screenshot.
You can see the text start peeking out below (I had to trim the screenshot for confidentiality reasons). But behind that block of grey there IS text, and it's text that, in the fetch portion, Googlebot apparently does see and can crawl.
My question: is this an issue? Should I be concerned about this visual look? Or no?
Never have experienced an issue like that.
I will say - I'm trying to make a play at a featured snippet and can't seem to get Google to display this page's information, despite it being the first result and the query showing a featured snippet pulled from the #4 result. I know a snippet isn't guaranteed for the #1 result, but I wonder if this has anything to do with why it isn't showing one.
-
Good to hear there's no performance issue. Obviously that is priority number one. Definitely don't sweat the render. You might want to refetch and see how it looks. Also give it a shot with a mobile fetch to see if you get anything different.
A lot of us are chasing the position-zero snippet. I didn't look at your site closely, but I would start by making sure that every single item (as appropriate) is marked up with schema.org. That will put you closer to your goal.
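As a rough illustration of what that markup looks like (the type and property values here are placeholders, not taken from your actual site - swap in whatever schema.org type matches your content, e.g. Product, Recipe, FAQPage):

```html
<!-- Hypothetical example: JSON-LD structured data in the page <head>.
     All values below are placeholders for illustration only. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Your page's main heading",
  "description": "A short summary of the answer you want surfaced",
  "author": { "@type": "Organization", "name": "Your Company" }
}
</script>
```

You can paste the finished markup into Google's structured data testing tool to confirm it parses before deploying.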
-
No performance issues, other than not capturing the featured snippet despite my best efforts
Really, I'm mostly concerned about the render, as I hadn't seen that in the 10ish years I've been doing SEO.
Seems like, with your great help (thanks so much again!), that it probably isn't actually an issue of any kind that is hindering performance or the ability to capture the featured snippet.
-
Hi Christian, my apologies, I should have noted that. The CSS does not render in the text-only cache version. The value, though, is that you can see whether something is crawlable/displaying properly. So for instance, if you looked at that cached version and didn't see any of the content on your page, you'd know something is stopping the search engines from properly crawling and indexing the page.
Edit: looking at the link, I notice that the full version doesn't show the CSS either. That's a bit weird. I wouldn't worry about it too much, as it seems other pages on your site are rendering properly in the full version.
Are you seeing any performance issues with the page, or was the concern originally just that the grey box was displaying in the fetch/render feature of Search Console?
-
Totally hear you.
Here's a link to the page: https://goo.gl/kZVqE9
Will also say: the cached version of it in Google is also very strange. Almost like the CSS isn't really working.
-
Without knowing the URL it's really difficult to audit this situation. My first thought is to ask whether you have a popup that loads when a user comes to your page. Google could be rendering the popup without its content. To your point, the content on the page would still be there, just hidden behind the popup.
When you look at the actual text-only cache of the page, are you seeing the actual text of the page? If so, I would rely on that more than on the rendered version. Honestly, it could be multiple things, but without the URL it really is nearly impossible to tell you why.
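If you'd rather not rely on the cache at all, a quick sanity check is to pull the raw HTML yourself and confirm the answer text is present outside of any script or style blocks - roughly what the text-only cache would show you. A minimal sketch in Python (stdlib only; the sample HTML below is made up for illustration, not taken from the page in question):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> contents -
    a rough stand-in for what the text-only cache displays."""
    def __init__(self):
        super().__init__()
        self._skip = 0       # depth inside script/style tags
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.chunks.append(data.strip())

def visible_text(html: str) -> str:
    """Return the space-joined visible text of an HTML document."""
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)

# Made-up sample page: a grey nav overlay plus the content we care about.
html = (
    '<html><head><style>.nav{background:#ccc}</style></head>'
    '<body><div class="nav">Menu</div>'
    '<p>Key answer text here.</p></body></html>'
)
print("Key answer text here." in visible_text(html))  # True: text is crawlable
```

If the phrase you're targeting shows up in the extracted text, the grey box in the render is almost certainly cosmetic rather than a crawlability problem.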