Is using REACT SEO friendly?
-
Hi Guys
Is React SEO friendly? Has anyone used React, and what were the results? Or do you recommend something else that is better suited for SEO?
Many thanks for your help in advance.
Cheers
Martin
-
@martin1970 said in Is using REACT SEO friendly?:
Is React SEO friendly? Has anyone used React, and what were the results? Or do you recommend something else that is better suited for SEO?
React itself isn't inherently bad for SEO, but extra care must be taken to optimize it for search. Many successful websites use React, yet SEO work remains essential.
Consider a framework such as Next.js, which handles server-side rendering out of the box and makes SEO-friendly development much easier. If raw performance is the top priority, a static site generator might be an even better fit.
-
@martin1970 said in Is using REACT SEO friendly?:
Is React SEO friendly? Has anyone used React, and what were the results? Or do you recommend something else that is better suited for SEO?
React can be SEO-friendly, but there are considerations to keep in mind due to its default client-side rendering. When search engines crawl websites, they traditionally expect server-rendered HTML for indexing. React applications often render content on the client side, which can pose challenges for search engine optimization (SEO).
To address this issue, there are a few strategies:
-
Server-Side Rendering (SSR):
- SSR involves rendering React components on the server before sending HTML to the client. This ensures that search engines receive fully rendered HTML, making content easily indexable.
- Tools like Next.js, a React framework, support SSR, providing a smoother SEO experience.
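To make the SSR idea concrete, here is a dependency-free sketch in plain JavaScript (no actual React involved): the server builds the complete HTML for a request before responding, so a crawler's very first request already contains the page content. In a real React app this is roughly what ReactDOMServer's render-to-string step, or a framework like Next.js, does for you; the component, names, and data below are invented for illustration.

```javascript
// A "component" as a plain function of props returning markup.
function ProductPage({ name, price }) {
  return `<main><h1>${name}</h1><p>Price: ${price}</p></main>`;
}

// The server-side render step: wrap the component output in a full document.
function renderToHtml(component, props) {
  return [
    "<!doctype html>",
    `<html><head><title>${props.name}</title></head>`,
    `<body>${component(props)}</body></html>`,
  ].join("");
}

// What a crawler receives on its first request: fully rendered, indexable HTML,
// rather than an empty <div id="root"></div> waiting for a JS bundle.
const html = renderToHtml(ProductPage, { name: "Blue Widget", price: "$9.99" });
console.log(html.includes("<h1>Blue Widget</h1>")); // true
```

The key point is that the content exists in the HTML response itself, with no JavaScript execution required on the crawler's side.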
-
Static Site Generation (SSG):
- SSG generates static HTML files during the build process. This approach ensures that content is pre-rendered, enhancing SEO performance.
- Next.js also supports SSG, making it a versatile choice for projects requiring strong SEO.
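The SSG flow can be sketched the same way. This is modeled loosely on the Next.js `getStaticProps` convention (in real Next.js the function is usually async and exported from a page file); everything else here, including the posts data, is invented for illustration.

```javascript
// Runs once at build time (e.g. pulling from a CMS or the filesystem).
function getStaticProps() {
  const posts = [
    { slug: "react-seo", title: "Is React SEO friendly?" },
    { slug: "ssr-vs-ssg", title: "SSR vs SSG" },
  ];
  return { props: { posts } };
}

// The page "component" renders those props into markup.
function BlogIndex({ posts }) {
  const items = posts
    .map((p) => `<li><a href="/blog/${p.slug}">${p.title}</a></li>`)
    .join("");
  return `<ul>${items}</ul>`;
}

// The build step: fetch data once, render once, then write out static HTML.
const { props } = getStaticProps();
const staticHtml = BlogIndex(props);
console.log(staticHtml); // pre-rendered markup a crawler can index directly
```

Unlike SSR, none of this happens per request: the HTML is produced once at build time and served as static files.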
-
Prerendering:
- Prerendering involves generating static HTML for specific pages at build time. This approach combines the benefits of SSR and SSG, allowing developers to target critical pages for SEO optimization.
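A rough sketch of that targeted approach, again in plain JavaScript: at build time you render only a chosen set of routes to static HTML. The route names and renderers below are hypothetical.

```javascript
// Renderers for the pages that matter most for SEO.
const routes = {
  "/": () => "<h1>Home</h1><p>Welcome to the store.</p>",
  "/pricing": () => "<h1>Pricing</h1><p>Plans start at $9/mo.</p>",
};

// "Build step": render each critical route once and keep the HTML,
// e.g. to write out as /index.html and /pricing/index.html.
function prerender(routeTable) {
  const pages = {};
  for (const [path, render] of Object.entries(routeTable)) {
    pages[path] = `<!doctype html><html><body>${render()}</body></html>`;
  }
  return pages;
}

const pages = prerender(routes);
console.log(Object.keys(pages)); // the prerendered routes
```

Routes not in the table can still be served as a normal client-rendered app; only the SEO-critical pages get the static treatment.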
Several companies and developers have successfully implemented React with SEO in mind. By using SSR or SSG, they've achieved positive results in search engine rankings and overall visibility.
It's essential to note that while React can be SEO-friendly, other frameworks like Angular or Vue.js may also offer SEO solutions. The choice depends on the project's specific requirements and the developer's familiarity with the framework.
In summary, React can be made SEO-friendly through practices like SSR, SSG, or prerendering. Many developers have experienced success in maintaining good SEO performance with React, especially when using tools like Next.js. However, the decision should be based on the project's needs, available resources, and the development team's expertise. Always ensure that your chosen approach aligns with current SEO best practices to achieve optimal results.
-
I have been doing some research on this issue, since there are lots of mixed opinions on it. Per my friends who work closely on this matter, Google, Bing, Yahoo, and DuckDuckGo should all be able to fetch React-based single-page applications.
Custom Mat Board (which cuts customized mat boards for any Amazon or IKEA picture frame) is a React-based application, and it works well. Try Fetch as Google and note whether there are any major differences between what Googlebot sees and what humans see. If there are significant differences, you should do something about it. But in my experience, Googlebot and humans see the same thing.
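One quick way to sanity-check this yourself is to compare the raw HTML response (what a crawler's first pass sees) against what renders in your browser. Here is a deliberately naive helper that strips markup to approximate visible text; the sample HTML strings are invented, and in practice you would use Google's URL Inspection tool rather than anything this crude.

```javascript
// Very naive: drop scripts, strip tags, collapse whitespace.
function visibleText(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, "")
    .replace(/<[^>]+>/g, " ")
    .replace(/\s+/g, " ")
    .trim();
}

// An empty client-rendered shell exposes almost no text in the raw response...
const clientShell =
  '<html><body><div id="root"></div><script src="/bundle.js"></script></body></html>';
// ...while a server-rendered page exposes its content immediately.
const serverRendered =
  "<html><body><h1>Custom mat boards</h1><p>Cut to size.</p></body></html>";

console.log(visibleText(clientShell)); // ""
console.log(visibleText(serverRendered)); // "Custom mat boards Cut to size."
```

If the raw response looks like the first case while your browser shows a full page, your content depends on client-side rendering, and it is worth verifying how crawlers handle it.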
PM me if you have any questions. Cheers!
WJ
-
Thanks for discussing this, Martijn.
Aside from Google, is there any concern that other search engines would have issues rendering a JS website, whether the site uses React, Angular or another framework?
Thanks
-SB
-
Hi Martin,
It can be; that's the honest answer. React uses JavaScript to load its pages and, in most cases, its content. Google and other search engines can read that content, but you always need to check what the actual result is. I've worked with many sites using React, and it depends on whether they use server-side or client-side rendering. Start there to figure out what you can use for your client or company. Some teams are really drawn to client-side rendering, which is a bit more dangerous, since Google can't always see the actual content. With server-side rendering, I've seen it go well in most cases.
Let me know if you have any specific questions, happy to answer them!
Martijn.