Web apps using React and gRPC

jcbellido July 28, 2019 [Code and Chlorine] #Programming #Javascript #React #Python

During the last few months I've been involved in an infrastructure project. The idea is to offer on-demand resources: think Jenkins, GitLab, or any render queue. In my case, users are working from different countries and time zones, which is one of the cases where building a web-based front end makes sense. The challenge: I've never built anything mid-sized for the web, only micro solutions that needed close to zero maintenance and were extremely short-lived. To make things more interesting, the backend was offering its services through gRPC.

A note for other tool programmers

This is a piece about my second project using React. The first one, even if functional, was a total mess. I'm not suggesting the approach described here makes sense for everyone, but it has worked for me and I think keeping it documented has value.

The main issue with web stuff, for me, is the number of thingies you need to juggle to build a solution. To name just a few, this project contains: JavaScript, React, Babel, JSX, gRPC, Docker, Python, CSS, Redux and nginx. It's surprisingly easy to drown in that stack.

Starting: react-admin + tooling

I needed an IDE for JavaScript and I didn't want to consume any license from the web team, so I started with Visual Studio Code. Coming from a bloated VS Pro, the difference in speed and responsiveness is remarkable. Adding JavaScript support was also quite simple through a Code plugin. Underneath it, I had a common npm + Node installation. For heavier environments, JetBrains' WebStorm is the IDE the professionals around me use most frequently.

From that point, a simple:

```shell
npm install -g create-react-app
create-react-app my-lovely-stuff
cd my-lovely-stuff
npm install react-admin
```

will get you started. You can see a demo of react-admin from the marmelab team here:

{% vimeo 268958716 %}

With all that in place, how to start? After checking with more experienced full-time web devs, the recommendation was to use react-admin (RA from now on) as a starting point. Only later did I realize how much RA's architecture would impact the rest of the solution. But as a starting point it is great, and the documentation is really good: I learnt a lot from it. From the get-go you'll have a framework where it's easy to implement (see the sketch right after this list):

  1. List, show-detail, edit and delete flows
  2. Pagination
  3. Filtering results
  4. Actions on multiple selected resources
  5. Related resources and references, aka "this object references that other thing", which makes navigation between resources simple.
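
A minimal sketch of what that starting point looks like, assuming the stock REST data provider; the "jobs" resource and its fields are made up for illustration:

```jsx
import React from 'react';
import { Admin, Resource, List, Datagrid, TextField } from 'react-admin';
import simpleRestProvider from 'ra-data-simple-rest';

// Hypothetical endpoint; in this project the provider ends up wrapping
// gRPC calls instead (more on that below).
const dataProvider = simpleRestProvider('http://localhost:3000/api');

// A minimal list view for the made-up "jobs" resource
const JobList = props => (
  <List {...props}>
    <Datagrid rowClick="show">
      <TextField source="id" />
      <TextField source="name" />
      <TextField source="state" />
    </Datagrid>
  </List>
);

const App = () => (
  <Admin dataProvider={dataProvider}>
    <Resource name="jobs" list={JobList} />
  </Admin>
);

export default App;
```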

Halfway through the development I found out about React hooks. I strongly suggest watching the video introducing them; it was well worth the time I put into it.

I used only a fraction of the potential hooks offer, and that was more than enough: the resulting code is leaner and more expressive. If I need to write another web app using React, I'll try to squeeze more out of them.
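
As a taste, here's the kind of pattern hooks replaced class components for in my case. A minimal sketch, where `fetchJobStatus` is a hypothetical helper wrapping a backend call:

```jsx
import React, { useState, useEffect } from 'react';
import { fetchJobStatus } from './api'; // hypothetical helper

const JobStatus = ({ jobId }) => {
  // Local state without a class component
  const [status, setStatus] = useState('loading...');

  // Side effect re-runs only when jobId changes
  useEffect(() => {
    let cancelled = false;
    fetchJobStatus(jobId).then(s => {
      if (!cancelled) setStatus(s);
    });
    // Cleanup avoids setting state on an unmounted component
    return () => { cancelled = true; };
  }, [jobId]);

  return <span>{status}</span>;
};

export default JobStatus;
```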

RA is built on top of a large number of 3rd-party libraries. For me the two most important are:

  1. React-Redux: I use it mainly in forms and to control side effects. Some of the forms I have in place are quite dense and interdependent.
  2. Material-UI: controls, layout and styles. From what I'm seeing around lately, it has become an industry standard. Out of the box it's going to give you a Google-y look and feel.

Unless you're planning to become a full-time web developer, I don't believe it's particularly useful to dig too deep into those two monsters of libraries. But having a shallow knowledge of their intent can be quite useful, as the sketch below tries to show.
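
For instance, this is roughly how React-Redux shows up in my dense forms: one field's state driving another's visibility through the store. A minimal sketch with made-up state and action names, not RA-specific code:

```jsx
import React from 'react';
import { useSelector, useDispatch } from 'react-redux';

const RenderOptionsForm = () => {
  // Both the flag and the action type are hypothetical
  const advanced = useSelector(state => state.form.advancedMode);
  const dispatch = useDispatch();

  return (
    <form>
      <label>
        <input
          type="checkbox"
          checked={advanced}
          onChange={() => dispatch({ type: 'TOGGLE_ADVANCED_MODE' })}
        />
        Show advanced options
      </label>
      {/* Dependent field, only rendered when the flag is set */}
      {advanced && <input name="samplesPerPixel" placeholder="Samples per pixel" />}
    </form>
  );
};

export default RenderOptionsForm;
```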

gRPC in the browser: Envoy + Docker

The backend was serving its data through a gRPC endpoint and was being built at the same time I was working on the frontend. One of the main concepts of gRPC is the .proto file contract: it defines the API surface and the messages that will travel through it. Google et al. have released libraries to consume gRPC services (based around that .proto specification) in many different programming languages, including JavaScript, .NET Core and Python.

But the trick here is that you can't directly connect to a gRPC backend from the browser. In the documentation, Envoy is used to bridge the two; in other scenarios it's possible to use Ambassador, if your infrastructure supports it.
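
Once Envoy is in place, the browser talks grpc-web through it. A minimal sketch of what a call looks like, assuming stubs generated by protoc with the grpc-web plugin (service and message names are made up):

```js
// Hypothetical generated modules; protoc produces these from the .proto file
import { JobServiceClient } from './generated/job_service_grpc_web_pb';
import { ListJobsRequest } from './generated/job_service_pb';

// The URL points at Envoy, not at the gRPC server itself
const client = new JobServiceClient('http://localhost:8080');

const request = new ListJobsRequest();
client.listJobs(request, {}, (err, response) => {
  if (err) {
    console.error(`gRPC error ${err.code}: ${err.message}`);
    return;
  }
  console.log(response.getJobsList());
});
```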

Since the backend was under construction, I decided to write a little mock of it in Python, based on the .proto file. Starting from that file, I return the messages populated with fake, but not random, data. The messages are built dynamically through reflection over the grpc-python toolset's output; the only manual work needed is to write the RPC entry points, which are then automatically forwarded to and answered by the mock.

Once the fake server is written, you still need to make it reachable from the web browser, and it's here where docker-compose made my life way simpler: I wrote a compose file connecting Envoy and my mock server, and I had a reliable source of sample data to develop the UI against. In this case I was lucky, since my office computer runs a Pro version of Win10, making Hyper-V available, and the Docker toolset for Windows machines has improved a lot lately. It's perfectly possible to achieve similar results using non-Pro versions of Windows, or even more simply by using a Linux or Mac desktop.
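
The compose file itself is tiny. A sketch, with illustrative image versions, ports and paths rather than the project's actual values:

```yaml
version: "3"
services:
  mock-backend:
    build: ./mock            # the Python gRPC mock described above
    expose:
      - "50051"              # plain gRPC, only visible inside the compose network
  envoy:
    image: envoyproxy/envoy:v1.11.0
    volumes:
      - ./envoy.yaml:/etc/envoy/envoy.yaml   # routes grpc-web -> mock-backend:50051
    ports:
      - "8080:8080"          # the browser's grpc-web traffic lands here
```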

This small solution turned out to be quite important down the line, given the amount of iteration the backend went through. In the web world there are many great API / backend mocking solutions based on REST calls, but when you're working with gRPC the ecosystem is not as rich (or at least I didn't find anything mature at that moment).

Other lessons

One of the interesting side effects of using RA is the impact of the dataProvider abstraction: the whole architecture orbits around the classic HTTP verbs. In the end, most of my code beyond some specific layouting and extra forms was pure glue. I have full translation layers in place, from gRPC into JavaScript objects and vice versa.
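
One direction of that glue looks roughly like this: turning a protobuf message into the plain `{ id, ... }` record RA expects, and back. `Job` and its accessors are hypothetical, standing in for whatever protoc generates from the real .proto file:

```js
import { Job } from './generated/job_service_pb'; // hypothetical generated module

// gRPC message -> plain record for react-admin
const jobFromMessage = msg => ({
  id: msg.getId(),
  name: msg.getName(),
  state: msg.getState(),
});

// Plain record -> gRPC message, for the create / update verbs
const messageFromJob = job => {
  const msg = new Job();
  msg.setId(job.id);
  msg.setName(job.name);
  msg.setState(job.state);
  return msg;
};
```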

In my domain, and due to API restrictions, I was getting different categories of resources through the same gRPC entry points. After thinking a bit about it, the simplest solution I found was to implement pre-filtered data providers and give them resource-relevant names. In other words, I ended up with a collection of data providers that internally pointed at the same gRPC calls but carried meaningful names (sketched below). This allowed me to offer meaningful routes while keeping the UI code isolated from the backend design.
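
A sketch of that idea in react-admin's `(type, resource, params)` provider style; `grpcProvider` is the hypothetical provider doing the actual gRPC work:

```js
import { grpcProvider } from './grpcProvider'; // hypothetical gRPC-backed provider

// Bake a fixed category into every list request; pass everything else through
const makeFilteredProvider = (base, category) => (type, resource, params) =>
  type === 'GET_LIST'
    ? base(type, resource, { ...params, filter: { ...params.filter, category } })
    : base(type, resource, params);

// Same backend calls underneath, resource-relevant names on the surface
export const renderJobsProvider = makeFilteredProvider(grpcProvider, 'render');
export const bakeJobsProvider = makeFilteredProvider(grpcProvider, 'bake');
```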

Containers, Docker in my case, are becoming more and more important as I go forward. If you know nothing about them, I strongly suggest putting some time into them: it can be a game changer, even if your intent is to keep your dev environment as clean as humanly possible.