If Fetch As Google can render the website, should it appear on the SERP?
-
Hello everyone and thank you in advance for helping me.
I have a React.js application built with Create React App (zero configuration). I connect it to an API built with CodeIgniter (PHP) using Axios.
Before using React.js, this website was at the top of Google's SERPs for specific keywords. After switching to React.js and changing some URLs, with no redirects set up in .htaccess or anywhere else, I lost my search engine visibility! I guess it must have been caused by a Google penalty!
I tried using "react-snap", "react-snapshot" and so forth for prerendering but there are so many problem with them. Also I tried using Prerender.io and unfortunately my host provider didn't help me to config the shared host!
Finally I found a great article, and my website now displays in the Rendering box of Fetch As Google. The dynamic content still doesn't display in the Fetching box, but I can see my entire website in both "This is how Googlebot saw the page" and "This is how a visitor to your website would have seen the page" for all pages without any problem.
If Fetch As Google can render the entire website, is it possible that my pages will be indexed after a while and appear on Google's SERPs?
-
Absolutely not a problem. I do think that SSR would be a really positive way forwards for your website! Hopefully that will begin to get the trend-line going up again instead of down
-
Thank you, Effectdigital, for this response and for spending your time on me. I read it twice to make sure I understood, and it explained everything in detail. I'm going to research some of the terms you mentioned above. I plan to implement SSR in a few months and put this problem to rest.
-
From the sounds of it, it's not a penalty - it's just a botched migration (with no redirects) to a new platform which is less search-accessible than the previous platform.
Fetch and render has many pitfalls. It (WRONGLY) makes webmasters think that every crawl Google does will be to that level of depth. What you get with fetch and render is a best-case scenario, where Google are deploying all their crawling and rendering technologies for you, including rendered browsing (to capture generated content).
You have your base (un-modified) source code, and then you have your modified source code. To get at the latter (which is far richer, especially for sites which are mostly generated) you have to run a crawler which uses a headless browser (something like Selenium or Windmill, driven through something like Python) in order to fire the scripts and harvest the modified source data. These days that doesn't take an extreme amount of time in absolute terms, but it does compared to base-source scraping (on average 10x longer). It may still seem like seconds to you, but believe me, it takes much more time than near-instant source-code scraping.
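Just to make that concrete, here's a rough sketch of the difference (a hypothetical example using Python with requests and Selenium, against a made-up URL - not your actual site or any specific crawler's internals):

```python
# Sketch: base-source scraping vs. rendered-source harvesting.
# Assumes requests, selenium and a Chrome/chromedriver install are available;
# the URL is a placeholder.
import time
import requests
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

URL = "https://www.example.com/"  # hypothetical script-heavy page

# 1) Base (un-modified) source: what a plain HTTP scrape sees - near-instant
base_html = requests.get(URL, timeout=10).text

# 2) Modified source: load the page in a headless browser so the JS bundle fires,
#    then read the DOM after rendering - far slower per page
opts = Options()
opts.add_argument("--headless")
driver = webdriver.Chrome(options=opts)
try:
    driver.get(URL)
    time.sleep(3)  # crude wait for client-side rendering to finish
    rendered_html = driver.page_source
finally:
    driver.quit()

print(f"Base source:     {len(base_html):>8} characters")
print(f"Rendered source: {len(rendered_html):>8} characters")
```

The second path is the expensive one, and that's the cost Google has to weigh up on every single crawl.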
Google's mission is to index the web. Do you really think they're going to take a random 10x efficiency hit because modern devs have decided that more modified content is faster and better?
Well... they will and they won't. Google have confirmed that they can and do crawl in this way. But results from moves just like yours are constantly showing us that they don't deploy this tech for everyone - and even when they do, they don't use it all the time for every crawl (scrape).
If you're in control of a huge site that Google can't afford to lose from their index (like Compare the Market, Barclays, Coca-Cola, etc.) then you have a lot more room to play in this area and reap the benefits of a lightning-fast CMS (and front-end deployment, and obviously better UX).
If you're not in that position, don't be surprised when these things happen. You have to have some perspective on yourself and what your site is worth to the web. To you it's everything; to Google it's one grain of sand on a vast ocean floor. And it's one grain of sand which is making Google's life harder by hitting the efficiency of their core MO (mission objective).
There may be some stuff you can do to fix this, or it may be time to swallow a bitter pill and do a roll-back.
Looking at your source code:
^ the above link will only work in Google Chrome!
It is obvious that it's extremely bare
Let's download the 'base' source code to a PHP file:
It's actually just 3 lines of code, but it probably takes up the space of... well, a lot more than that (a hundred lines, maybe).
But here's your modified source code:
It's WAY BIGGER: it's 49 lines of code, and even then it's highly condensed.
My assertion to you is that not enough of your code and content resides within the 'base' source code; most of it is in the modified source code.
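If you want to sanity-check that assertion yourself, here's a quick diagnostic sketch (the URL and phrases are hypothetical - swap in your own pages and the content you actually want to rank for, assuming Python with requests):

```python
# Sketch: does my important content exist in the 'base' source at all?
# i.e. the HTML returned before any JavaScript has run.
import requests

URL = "https://www.example.com/"  # hypothetical - use one of your own pages
KEY_PHRASES = ["main heading", "product description", "target keyword"]  # hypothetical

base_html = requests.get(URL, timeout=10).text.lower()

for phrase in KEY_PHRASES:
    status = "FOUND" if phrase.lower() in base_html else "MISSING"
    print(f"{status:7} {phrase}")
```

If most of those come back MISSING, that content only exists once the scripts have fired - which is exactly the situation described above.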
It's a tough lesson to learn. Yeah, Google 'can' do many things. Yeah, their analysis tools put their best foot forward and show you what they 'can' do. But 'can' and 'will'... they're different cookies, man.
If you have a powerful enough server (even if you don't, maybe it's time to get one!), maybe you could have all the scripts fire server-side and then just serve users (and search engines) the pre-rendered base source. Or do something clever like that. This is not game over, but you'll need to get really smart now. I wouldn't recommend bothering with any of that without retrospectively going back (FAST) and doing a full, URL-to-URL 301 redirect migration project (using .htaccess or web.config).
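On the redirect side of that, a minimal .htaccess sketch might look something like this (Apache with mod_rewrite; the paths are entirely hypothetical - the point is a full, one-to-one mapping from every old URL to its new equivalent, not a blanket redirect to the homepage):

```apache
# Hypothetical URL-to-URL 301 migration rules - map your own old paths
RewriteEngine On

# Exact old-to-new mappings
RewriteRule ^old-page\.php$ /new-page [R=301,L]
RewriteRule ^products/old-item\.html$ /product/new-item [R=301,L]

# Pattern-based rule for a whole section that simply moved
RewriteRule ^blog/(.*)$ /articles/$1 [R=301,L]
```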
The faster you act, the more likely your recovery.
Related Questions
-
URL Injection Hack - What to do with spammy URLs that keep appearing in Google's index?
A website was hacked (URL injection), but the malicious code has been cleaned up and removed from all pages. However, whenever we run a site:domain.com search in Google, we keep finding more spammy URLs from the hack. They all lead to a 404 error page, since the hack was cleaned up in the code. We have been using the Google WMT Remove URLs tool to have these spammy URLs removed from Google's index, but new URLs keep appearing every day. We looked at the cache dates on these URLs; they vary, but none are recent and most are from a month ago when the initial hack occurred. My question is: should we continue to check the index every day and keep submitting these URLs to be removed manually? Or, since they all lead to a 404 page, will Google eventually remove these spammy URLs from the index automatically? Thanks in advance, Moz community, for your feedback.
Intermediate & Advanced SEO | peteboyd
-
Best way to remove full demo (staging server) website from Google index
I've recently taken over an in-house role at a property auction company, they have a main site on the top-level domain (TLD) and 400+ agency sub domains! company.com agency1.company.com agency2.company.com... I recently found that the web development team have a demo domain per site, which is found on a subdomain of the original domain - mirroring the site. The problem is that they have all been found and indexed by Google: demo.company.com demo.agency1.company.com demo.agency2.company.com... Obviously this is a problem as it is duplicate content and so on, so my question is... what is the best way to remove the demo domain / sub domains from Google's index? We are taking action to add a noindex tag into the header (of all pages) on the individual domains but this isn't going to get it removed any time soon! Or is it? I was also going to add a robots.txt file into the root of each domain, just as a precaution! Within this file I had intended to disallow all. The final course of action (which I'm holding off in the hope someone comes up with a better solution) is to add each demo domain / sub domain into Google Webmaster and remove the URLs individually. Or would it be better to go down the canonical route?
Intermediate & Advanced SEO | iam-sold
-
Can too many NoFollow links damage your Google rankings?
I've been trying to recover from a Google algorithm change since September 2012, so far without success. I'm now wondering if the nofollow on external links in my blog posts is actually doing me damage. http://www.smartdatinguk.com/blog/ Does anyone have any experience of this?
Intermediate & Advanced SEO | benners
-
How can a press release help with Google SERPs?
Hi, after publishing a press release, if that press release is in the top position on Google News for the keyword, how will it affect the SERP for that website?
Intermediate & Advanced SEO | purplar
-
My website hasn't been cached for over a month. Can anyone tell me why?
I have been working on an eCommerce site, www.fuchia.co.uk. I asked an earlier question about how to get it working and ranking, took on board what people said (such as optimising product pages etc...), and I think I'm getting there. The problem I have now is that Google hasn't indexed my site in over a month, and the homepage cache is 404'ing when I check it on Google. At the moment there is a problem with the site being live for both the WWW and non-WWW versions; I have told Google in Webmaster Tools which preferred domain to use and will also be getting the developers to 301 to the preferred domain. Would this be the problem stopping Google from properly indexing me? Also, only around 30 of 137 pages were indexed from the last crawl. Can anyone tell me or suggest why my site hasn't been indexed in such a long time? Thanks
Intermediate & Advanced SEO | SEOAndy
-
How to remove "Results 1 - 20 of 47" from Google SERP Snippet
We are trying to optimise our SERP snippet in Google to increase CTR, but we have this horrid "Results 1 - 20 of 47" in the description. We feel this gets in the way of the message and so wish to remove it, but how?? Any ideas apart from removing the paging from the page?
Intermediate & Advanced SEO | speedyseo
-
Can you explain why the site is dropping off Google every other week?
Can anyone offer any insight into why, since the Google Panda update, www.bedandbreakfastsguide.com has been fluctuating on Google so much? One week it's ranked as it used to be, the next it's nowhere to be seen. If you take a look at the screenshot of our traffic (this is the traffic after a 75% loss, which happened in two stages), you'll see we get traffic for a week and then nothing. This has been happening for months. Some points that might be involved: around the same time, the SEO guys suggested setting the canonical URL to www.bedandbreakfastsguide.com (before, there wasn't one, so traffic was coming from both www and non-www). A lot of the original URLs have been consolidated and rel="canonical" added throughout. The "pages" of results all have rel="canonical" set to page 1. Could it be that the www is competing with the non-www despite the 301 redirects? We're doing everything we can to help this client (and have reduced their site errors from the millions to the low tens-of-thousands), so it's not filling them with confidence when their site just keeps plummeting! What's also irritating/odd is that some of their competitors - who used to rank lower and have sites which contradict every rulebook - still rank high. Hopefully you can spot something we've missed. Tim
Intermediate & Advanced SEO | TimGaunt
Are there new updates in Google Panda? Please help review my website...
My site has gone down significantly in Google rankings today. Is there a recent update with regard to Google Panda? Also, please help me review my website for possible errors so I may apply the necessary changes for my site to recover. Here is my URL: http://www.homeescapade.com Thanks and God Bless
Intermediate & Advanced SEO | Trigun