How to find what Googlebot actually sees on a page?
-
1. When I disable JavaScript in Firefox and load our home page, the entire middle section is missing.
2. The global nav dropdown menu also does not display at all with JavaScript disabled. I believe this is not good.
3. But when I type our website name into Google search, click on the cached version of the home page, and then click on the text-only version, the global nav links display fine.
4. When I switch the user agent to Googlebot (using the Firefox plugin "User Agent Switcher"), the home page and global nav display fine.
Should I be worried about #1 and #2 then?
How can I find out what Googlebot actually sees on a page?
(I have tried "Fetch as Googlebot" in GWT; it only displays the source code. See the sketch below for another way to compare responses.)
Thanks for the help!
Supriya.
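Not something from the original thread, but a quick way to check what the server actually returns to a Googlebot user agent (outside of GWT and browser plugins) is to request the page with that user-agent string and look for the navigation markup in the raw HTML. A minimal Python sketch; the URL and the nav marker string are placeholders you would swap for your own:

```python
import urllib.request

URL = "https://www.example.com/"  # placeholder: your home page URL
NAV_MARKER = "global-nav"         # placeholder: a string that only appears in the nav markup

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:115.0) Gecko/20100101 Firefox/115.0",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for name, ua in USER_AGENTS.items():
    req = urllib.request.Request(URL, headers={"User-Agent": ua})
    with urllib.request.urlopen(req, timeout=15) as resp:
        status = resp.status
        html = resp.read().decode("utf-8", errors="replace")
    # A plain fetch like this never executes JavaScript, so anything the page builds
    # with scripts (e.g. a JS dropdown nav) will not appear in this HTML.
    print(f"{name}: status={status}, bytes={len(html)}, nav markup found={NAV_MARKER in html}")
```

If the markup only shows up for the Googlebot user agent, the server is serving different HTML by user agent; if it shows up for neither, the missing content is most likely being built by JavaScript.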
-
Yes, definitely! HTML with CSS is the way to go!
-
No problem! Was it helpful?
-
Thank you so much, Todd!
-
Someone wrote a pretty good article discussing the use of JavaScript in navigation; here is the link:
Related Questions
-
I am really surprised to see this page ranking like crazy even though the content is very thin
https://www.hackerearth.com/blog/artificial-intelligence/artificial-intelligence-101-how-to-get-started/ We are ranking for 121 keywords with this page, and 22 of them are in positions 1-3. I cannot understand why it ranks so well, considering it has just 4 inbound links. Will someone help me understand this mystery? When we try to write good, in-depth content we do not rank, but for thin content like this we do fairly well.
Intermediate & Advanced SEO | | Rajnish_HE1 -
2 pages competing
Hi all, My website currently has 2 pages that address the theme 'property investment in manchester' / 'buy to let manchester': https://www.knightknox.com/developments/manchester/ https://www.knightknox.com/investments/manchester I am a bit concerned that we are competing against ourselves for these keywords. In my opinion the /investments page provides better content, but the /developments page ranks higher in Google. What do you think would be the best course of action: leave as is, merge the contents of both pages, redirect /developments to /investments, or something else? Any ideas welcome. Thanks
Intermediate & Advanced SEO | | brian-madden0 -
500 and 508 pages?
Hi, we just did a massive crawl (using the tool deepcrawl.co.uk) on the site: http://tinyurl.com/nu6ww4z http://i.imgur.com/vGmCdHK.jpg It reported a lot of URLs as either 508 or 500 errors. After the crawl finished, we put the URLs reported as 508 or 500 directly into Screaming Frog and they all came back with status code 200. Could it be that DeepCrawl hammered the site and the server couldn't handle the load? Cheers, Chris
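Not part of the original question, but one way to test the "crawler hammered the server" theory is to re-request the flagged URLs slowly and see whether the 5xx responses reproduce. A rough Python sketch; the URL list is a placeholder for DeepCrawl's export:

```python
import time
import urllib.error
import urllib.request

# Placeholder: paste in (or read from a file) the URLs the crawler reported as 500/508.
FLAGGED_URLS = [
    "https://www.example.com/some-flagged-url",
]

for url in FLAGGED_URLS:
    try:
        with urllib.request.urlopen(url, timeout=30) as resp:
            print(url, resp.status)
    except urllib.error.HTTPError as err:
        print(url, err.code)                 # 5xx responses raise HTTPError
    except urllib.error.URLError as err:
        print(url, "unreachable:", err.reason)
    time.sleep(2)                            # deliberately slow, so the re-check itself doesn't stress the server
```

If everything comes back 200 at a gentle request rate (as Screaming Frog suggested), the errors were most likely load-related rather than a problem with the pages themselves.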
Intermediate & Advanced SEO | | jayoliverwright0 -
Help with 404 pages
Hello everyone, A few days back we permanently removed 3 main categories from our e-commerce website, and because of that more than 50k of our URLs are showing a 404 error (according to Google Search Console). What are the best practices for handling 404 pages at this scale? Please help!
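Not from the original post, but a common practice is to 301 removed URLs that have a close replacement and let the rest return 404/410. A rough Python sketch for bucketing an exported URL list that way; the category paths, targets, and file name are made-up placeholders:

```python
# Placeholder mapping from removed category prefixes to their closest live equivalents.
# URLs with no sensible target are usually better left as 404 (or served as 410) than blanket-redirected.
REDIRECT_MAP = {
    "/removed-category-a/": "/closest-live-category-a/",
    "/removed-category-b/": None,
}

def classify(path: str) -> str:
    """Decide what to do with one removed URL path."""
    for old_prefix, target in REDIRECT_MAP.items():
        if path.startswith(old_prefix):
            return f"301 -> {target}" if target else "serve 404/410"
    return "review manually"

# Placeholder file: one URL path per line, e.g. exported from Search Console's 404 report.
with open("404_urls.txt", encoding="utf-8") as f:
    for line in f:
        path = line.strip()
        if path:
            print(path, "=>", classify(path))
```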
Intermediate & Advanced SEO | | Obbserv0 -
How long does it take for a page to show up in Google results after removing noindex from it?
Hi folks, A client of mine created a new page and used meta robots noindex to keep it out of the index while it wasn't ready to launch. The problem is that Google crawled the page, and now, after removing the meta robots noindex, the page does not show up in the results. We've tried to crawl it using Fetch as Googlebot and then submit it using the button that appears. We've included the page in sitemap.xml and also used the old Google submit-URL tool at https://www.google.com/webmasters/tools/submit-url Does anyone know how long it will take for Google to show the page after removing meta robots noindex? Are there any reliable references for this? I did not find any Google video or post about it. I know it will appear within a few days, but I'd like to have a good reference for the future. Thanks.
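Not from the original thread, but before waiting on Google it is worth double-checking that the noindex is really gone from both the HTML and the HTTP headers, since an X-Robots-Tag header can keep a page out of the index even after the meta tag is removed. A small Python sketch; the URL is a placeholder:

```python
import urllib.request

URL = "https://www.example.com/new-page"  # placeholder for the affected page

req = urllib.request.Request(URL, headers={"User-Agent": "Mozilla/5.0"})
with urllib.request.urlopen(req, timeout=15) as resp:
    header_value = resp.headers.get("X-Robots-Tag", "")
    html = resp.read().decode("utf-8", errors="replace").lower()

# Crude string check: flags any remaining "noindex" in the markup for manual inspection.
print("'noindex' still found in HTML:", "noindex" in html)
print("X-Robots-Tag header:", header_value or "(none)")
```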
Intermediate & Advanced SEO | | fabioricotta-840380 -
Cleaning bad pages
We have 10,000 bad pages that Panda could track and penalize us for. If we delete them we will get 404 errors, and after that we could again be penalized by Google's algorithm. How can I delete them in a way that follows Google's rules and avoids penalties? If we 301 redirect the 10k pages to the index page, could the 10k old pages be treated as duplicates?
Intermediate & Advanced SEO | | bele0 -
Rel Canonical on Home Page
I have a client who says they can't implement a 301 on their home page. They have two different URLs for their home page that are live and do not redirect. I know that the best solution would be to redirect one to the main URL, but they say this isn't possible, so they implemented rel canonical instead. Is this the second-best solution for them if they can't redirect? Will the link juice be passed through the rel canonical? Thanks!
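Not part of the original question, but it is easy to verify that both live home page URLs actually serve the canonical tag and that it points at the preferred URL. A quick Python sketch with placeholder URLs:

```python
import re
import urllib.request

# Placeholders: the two live home page URLs and the one you want treated as canonical.
HOME_URLS = ["https://www.example.com/", "https://www.example.com/home"]
PREFERRED = "https://www.example.com/"

for url in HOME_URLS:
    with urllib.request.urlopen(url, timeout=15) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    # Crude regex check; assumes rel="canonical" appears before href in the link tag.
    match = re.search(r'<link[^>]*rel=["\']canonical["\'][^>]*href=["\']([^"\']+)', html, re.I)
    canonical = match.group(1) if match else None
    print(url, "-> canonical:", canonical, "| points at preferred:", canonical == PREFERRED)
```

Keep in mind a canonical is a hint rather than a directive, so Google may ignore it, but when it is honored it does consolidate most of the link equity.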
Intermediate & Advanced SEO | | AlightAnalytics0 -
Fetch as GoogleBot "Unreachable Page"
Hi, We are suddenly getting an "Unreachable Page" error when any page of our site is fetched as Googlebot from Webmaster Tools. There are no DNS errors shown in "Crawl Errors". We have two web servers, named web1 and web2, which are controlled by a software load balancer (HAProxy). The same network configuration has been working for over a year and we never had any Googlebot errors before the 21st of this month. To check whether there could be an error in the sitemap, .htaccess, or robots.txt, we excluded the load balancer and pointed DNS to web1 and web2 directly; Googlebot was able to access the pages properly and there was no error. But when the load balancer was made active again by pointing DNS back to it, the "Unreachable Page" error started appearing again. The website is properly accessible from a browser, and there are no DNS errors either, as shown by "Crawl Errors". Can you guide me on how to diagnose the issue? I've tried all sorts of combinations, even removed the firewall, but no success. Is there any way to get more details about the error instead of just "Unreachable Page"? Regards, shaz
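Not from the original post, but the direct-to-backend test described above can be scripted so it is easy to repeat whenever the error shows up. A rough Python sketch; the hostname and the IP addresses for HAProxy, web1, and web2 are placeholders, and it uses plain HTTP to keep the example simple:

```python
import urllib.request

HOST = "www.example.com"  # placeholder for the real site hostname
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

# Placeholder addresses: the load balancer and the two backends behind it.
TARGETS = {
    "haproxy": "203.0.113.10",
    "web1": "203.0.113.11",
    "web2": "203.0.113.12",
}

for name, ip in TARGETS.items():
    # Send the request to one specific machine while keeping the site's normal Host header,
    # so every hop is tested with the same kind of request Googlebot would send.
    req = urllib.request.Request(
        f"http://{ip}/",
        headers={"Host": HOST, "User-Agent": GOOGLEBOT_UA},
    )
    try:
        with urllib.request.urlopen(req, timeout=15) as resp:
            print(name, resp.status, len(resp.read()), "bytes")
    except Exception as err:
        print(name, "failed:", err)
```

If web1 and web2 answer but the load balancer address times out or resets for the Googlebot user agent, the load balancer (or a rule in front of it) is the place to look.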
Intermediate & Advanced SEO | | shaz_lhr0