I have been working on Golang microservices for some time now, mostly based on gRPC with Protobuf serialization. The main reason I use gRPC is consistency and having a “single source of truth”: you define the data structures and services/remote procedure calls in a simple text file and then generate Golang code from that. Serializing and deserializing data in your microservices becomes really easy, so you can focus on the actual business logic. It is also easy to combine Golang microservices with microservices written in other languages such as C++, C#, or Python. There are, however, some drawbacks to using gRPC/Protobuf:
- You need to install the Protobuf compiler together with a couple of plugins to generate the Golang code, which can be a bit of a hassle.
- Generated Protobuf Go structs don’t work out of the box with many Golang ORM libraries
- You need to define the message types and microservice interfaces before you can start coding (this is also an advantage, depending on your viewpoint)
I created a website, www.dactory.com, where you can design your Golang microservices using a web UI, and the web app generates the Golang code using the official Protobuf compiler. Dactory also generates Docker build files to compile your Golang code, so the only things you need to install on your PC are Docker and a good IDE/text editor (e.g. VS Code). Try it yourself and let me know what you think.
There are several options to store Protobuf messages in a database. The main problem is that many Golang ORMs expect specific struct tags, but the Go structs generated from the Protobuf file don’t have them. The most common solution is to write a “mirror struct” of the generated Protobuf Go struct, where you set the struct tags manually. This goes against the “single source of truth” paradigm, though, and leads to extra maintenance effort. Another solution is to post-process the generated Go code: you can use, for example, protoc-go-inject-tag to insert custom tags. This actually works quite well; I used it in combination with go-pg to store generated Protobuf Go structs in a Postgres database.
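To make the mirror-struct approach concrete, here is a minimal sketch. The `PersonProto` and `Person` types, their fields, and the `pg:` tags are all hypothetical; the tags assume a go-pg style ORM:

```go
package main

import "fmt"

// PersonProto stands in for a struct generated by protoc-gen-go;
// note that it carries no ORM struct tags (hypothetical example type).
type PersonProto struct {
	Id    int64
	Name  string
	Email string
}

// Person is the hand-written “mirror struct” with the tags a go-pg
// style ORM expects. It must be kept in sync with PersonProto by
// hand, which is exactly the maintenance burden described above.
type Person struct {
	Id    int64  `pg:"id,pk"`
	Name  string `pg:"name"`
	Email string `pg:"email"`
}

// FromProto copies a generated struct into its mirror for storage.
func FromProto(p *PersonProto) *Person {
	return &Person{Id: p.Id, Name: p.Name, Email: p.Email}
}

func main() {
	rec := FromProto(&PersonProto{Id: 1, Name: "Alice", Email: "alice@example.com"})
	fmt.Printf("%+v\n", rec)
}
```

With protoc-go-inject-tag you skip the mirror struct entirely: you annotate the field in the .proto file with a comment such as `// @gotags: pg:"name"` (older versions used `// @inject_tag:`) and run the tool over the generated .pb.go file, which injects the tag into the generated struct itself.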
In March 2020 there was an announcement that a new version of the Golang Protobuf API was going to be released (https://blog.golang.org/protobuf-apiv2). One of the main reasons was to improve reflection capabilities, so it would, for example, be possible to write Golang code that can handle “dynamic messages”: messages that were not yet defined at compile time.
When v1 of the Golang Protobuf API was still the “standard”, there was an alternative implementation for people who needed more performance: Gogo Protobuf. It was much faster at marshalling and unmarshalling than the official Google implementation and became quite popular. Gogo Protobuf is not compatible with API v2, though, and the maintainers are now actually looking for new ownership of the project. So they probably won’t undertake the big effort of porting Gogo Protobuf to API v2, although there have been some experiments. But we live in an open-source world, and another software development team started where Gogo Protobuf left off: Vitess.io
Vitess is using Golang Protobuf, and after they upgraded to the new API v2 they saw a performance decrease. So they decided to create a new plugin for the Protobuf compiler, compatible with the new API, that generates additional helper methods for faster serialization/deserialization, among other features. You can find their code on GitHub: https://github.com/planetscale/vtprotobuf
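Per the project’s README, vtprotobuf runs as an extra protoc plugin alongside the regular Go one. A rough sketch of the invocation, assuming protoc, protoc-gen-go, and protoc-gen-go-vtproto are on your PATH and the .proto path is illustrative:

```shell
# Generate the standard API-v2 code plus the vtprotobuf helpers;
# the features option picks which helper methods are emitted.
protoc \
  --go_out=. \
  --go-vtproto_out=. \
  --go-vtproto_opt=features=marshal+unmarshal+size \
  proto/person.proto
```

The generated code then carries methods such as MarshalVT and UnmarshalVT on each message, which avoid reflection on the hot path while the messages remain ordinary API-v2 types.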