Google Cache Version and Text Only Version are different
-
Across various websites, we have found that the Google cache version in the browser loads the full site and all content is visible. However, when we try to view the Text Only version of the same page, we can't see any content.
Example: we have a client with a JS scroller menu on the home page. Each scroller serves a separate content section on the same URL.
When we copy and paste some of the page content into Google, we can see that copy indexed in Google's search results as well as showing in the cache version. But as soon as we go into the Text Only version, we can't see the same copy.
We would like to know which version we should trust: the Google cache version or the Text Only version.
-
Thanks for your reply.
I thought the same. But when I check one portion of my site's content, it appears in Google's SERPs, while a different set of text doesn't come up.
I don't know whether this has to do with the different JS files we are using; possibly Google can pass through some of them and crawl the content within, but not others.
Any thoughts?
-
Google is able to crawl a lot of JavaScript these days. If you are seeing the text in their index when you search for it, then it's indexed!
As far as I know, the text-only cache leaves out JavaScript. This was especially useful before Googlebot was able to crawl that stuff, because you could see whether parts of your content were hidden from view.
I say, trust the SERPs! Text-only is probably leaving out all of your (still crawlable) content that's wrapped in JS.
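One quick sanity check you can run yourself: look at whether the copy is present in the raw HTML source, which is roughly what the text-only view is built from, or only injected later by a script. A minimal sketch in Python, with invented markup standing in for the scroller page:

```python
# Rough sketch of why copy can rank yet be missing from the text-only
# view: text-only shows roughly what is in the raw HTML, while Googlebot
# can additionally execute JS. The markup below is invented for
# illustration, not the client's actual page.
RAW_HTML = """
<html><body>
  <p>Welcome to our scroller menu.</p>
  <div id="scroller"></div>  <!-- filled in by /scroller.js at runtime -->
  <script src="/scroller.js"></script>
</body></html>
"""

# Static copy is in the source, so every crawler and cache view sees it.
print("Welcome to our scroller menu" in RAW_HTML)  # True

# The scroller sections live only in the JS payload, so a plain HTML
# fetch (and the text-only cache) never contains them.
print("Section two copy" in RAW_HTML)  # False
```

In practice you'd fetch your own page (view-source in a browser does the same job) and search for the exact copy; if it's only in the JS-injected content, the text-only view will never show it even though Google may still index it.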
Related Questions
-
Pages are Indexed but not Cached by Google. Why?
Hello, we have a Magento 2 extensions website, mageants.com. For about a year, Google cached all of my pages roughly every 15 days, but for the last 15 days my website's pages have not been cached by Google; the cache shows a 404 error. I went to Search Console to check for errors but didn't find any, so I manually requested recaching with Fetch and Render, but most pages still return the same 404 error. Example page: https://www.mageants.com/free-gift-for-magento-2.html Error: http://webcache.googleusercontent.com/search?q=cache%3Ahttps%3A%2F%2Fwww.mageants.com%2Ffree-gift-for-magento-2.html&rlz=1C1CHBD_enIN803IN804&oq=cache%3Ahttps%3A%2F%2Fwww.mageants.com%2Ffree-gift-for-magento-2.html&aqs=chrome..69i57j69i58.1569j0j4&sourceid=chrome&ie=UTF-8 Does anyone have a solution for this issue?
Technical SEO | vikrantrathore
-
Is the Google results SERP broken?
Hi everyone! We've been trying to work out how many of our pages are indexed by Google at the moment. If we do the usual "site:https://www.hobbydb.com" search, the SERP says that we have more than 740,000 pages indexed. However, when I do a deep dive and click through to the last page of results, I can only get to page 54, and then there are no more results. That would mean I only have about 540 pages indexed (54 pages of 10 results), not 740,000. We have also run queries for other sub-sections of our website, and those results also truncate at around 50 pages. Has anyone run into this problem? Any suggestions are appreciated! Best, Alex
Technical SEO | mpchobbydb
-
Google Indexing Desktop & Mobile Versions
We have a relatively new site, and I have noticed recently that Google seems to be indexing both the mobile and the desktop version of it. For some queries the mobile version shows up, and sometimes both the mobile and desktop versions do. This can't be good. I would imagine that what is supposed to happen is that the desktop version is the one that gets indexed (always), and browser detection loads the mobile version where appropriate once the user is on the site. Do you have any advice on what we should do to solve this problem? We are a bit stuck.
Technical SEO | simonukss
-
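As I understand Google's guidance for mobile sites on separate URLs, the usual fix is to link the two versions to each other so Google treats them as one page, with the desktop URL as canonical. A sketch with invented URLs:

```html
<!-- On the desktop page (https://www.example.com/page - hypothetical URL): -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- On the mobile page (https://m.example.com/page): -->
<link rel="canonical" href="https://www.example.com/page">
```

With these annotations in place, Google consolidates ranking signals on the desktop URL while still being able to serve the mobile URL to mobile searchers, so the two should stop competing in the results.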
Dev Site Was Indexed By Google
Two of our dev sites (subdomains) were indexed by Google. They have since been made private, once we found the problem. Should we take the further step of removing the subdomains via robots.txt, or just let it ride out? From what I understand, to remove a subdomain from Google we would verify the subdomain in GWT, then give the subdomain its own robots.txt and disallow everything. Any advice is welcome; I just wanted to discuss this before making a decision.
Technical SEO | ntsupply
-
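For reference, the disallow-everything robots.txt described here is just two lines, served at the subdomain's own root (e.g. https://dev.example.com/robots.txt, a hypothetical URL):

```
User-agent: *
Disallow: /
```

One caveat: robots.txt only stops crawling; URLs that are already in the index can linger there for a while. Verifying the subdomain in GWT and requesting removal, as the question suggests, is what actually gets them out faster.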
Google Sitemap - How Long Does It Take Google to Index?
We changed our sitemap about a month ago and Google has yet to index it. We have run a site: search and we still have many pages indexed, but we are wondering how long it takes Google to index our sitemap. The last sitemap we put up had thousands of pages indexed within a fortnight, but for some reason this version is taking far longer. We are also confident that there are no errors in this version. Help!
Technical SEO | JamesDFA
-
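To rule out a formatting problem despite the confidence that the file is error-free, it can help to compare it against the minimal valid sitemap skeleton (the URL below is invented):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/page.html</loc>
    <lastmod>2013-05-01</lastmod>
  </url>
</urlset>
```

Anything beyond `urlset`, `url`, and `loc` is optional, so a sitemap that validates against this shape and is reachable at the location submitted in GWT should not itself be the blocker.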
Google having trouble accessing my site
Hi, Google is having problems accessing my site. Each day it brings up Access Denied errors, and when I checked what this means I found the following:

Access denied errors
In general, Google discovers content by following links from one page to another. To crawl a page, Googlebot must be able to access it. If you're seeing unexpected Access Denied errors, it may be for the following reasons:
Googlebot couldn't access a URL on your site because your site requires users to log in to view all or some of your content. (Tip: You can get around this by removing this requirement for user-agent Googlebot.)
Your robots.txt file is blocking Google from accessing your whole site or individual URLs or directories. Test that your robots.txt is working as expected. The Test robots.txt tool lets you see exactly how Googlebot will interpret the contents of your robots.txt file. The Google user-agent is Googlebot. (How to verify that a user-agent really is Googlebot.) The Fetch as Google tool helps you understand exactly how your site appears to Googlebot. This can be very useful when troubleshooting problems with your site's content or discoverability in search results.
Your server requires users to authenticate using a proxy, or your hosting provider may be blocking Google from accessing your site.

I have contacted my hosting company, who said there is no problem but told me to read the following page: http://www.tmdhosting.com/kb/technical-questions/other/robots-txt-file-to-improve-the-way-search-bots-crawl/ I have read it, and as far as I can see my file is set up right (it is listed below). They said if I still have problems then I need to contact Google. Can anyone please give me advice on what to do? The errors are response code 403.

User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /components/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /libraries/
Disallow: /media/
Disallow: /modules/
Disallow: /plugins/
Disallow: /templates/
Disallow: /tmp/
Disallow: /xmlrpc/
Technical SEO | ClaireH-184886
-
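One way to confirm locally that a robots.txt like the one quoted above is not what's blocking Googlebot is to run it through Python's standard-library robots parser. The rule list below is abbreviated to a few of the directories from the question:

```python
# Quick local check that a robots.txt does not block Googlebot from
# regular pages. Standard library only; the rule list is a shortened
# copy of the file quoted in the question.
import urllib.robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /templates/
Disallow: /tmp/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Googlebot matches "User-agent: *", so only the listed directories are
# blocked; ordinary pages remain crawlable. A 403 response therefore has
# to come from the server or hosting provider, not from this file.
print(rp.can_fetch("Googlebot", "/my-page.html"))   # True
print(rp.can_fetch("Googlebot", "/administrator/")) # False
```

Since the file allows Googlebot onto normal pages, the 403s point at a server-side or host-level block (firewall, proxy auth, or bot filtering), which matches the last bullet in the Google help text quoted above.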
Titles in Google SERPs incorrect
If you do the following query in Google: site:orlandovisiting.com legend sparrow you'll see that in the results the title appears as: legend-captain-jack-sparrow-1149.html whereas the title tag in the source is: <title>The Legend of Captain Jack Sparrow Opens at Disney’s Hollywood Studios</title> Eventually it normally rights itself, but does anyone else get this with their sites? Thanks.
Technical SEO | walshy99
-
Is this a Google dance?
My website keeps moving up and down in the rankings but stays within pages 2 to 3. Every day it's at a new position.
Technical SEO | ragivan