
websockets-trpc-fastify-react

Built with Devbox, Fastify, tRPC, React, TanStack, nginx

Context 📚

This project demonstrates how to send background updates using WebSockets so that users always receive live changes, even when the service runs on multiple instances.

Problem statement 🖊️

WebSockets create a two-way connection between a client and a server. In a simple setup with a single server, messages flow directly between the two. With multiple server instances behind a load balancer, however, each client is connected to only one instance, so an update delivered to one instance does not automatically reach the clients connected to the others.

For example:

  • Client A connects to Server Instance 1.
  • Client B connects to Server Instance 2.
  • Client C connects to Server Instance 1.

If an update reaches only one instance, the clients connected to the other instances miss it. This project provides a solution to ensure that every user gets the update.

Note

The architecture and technology choices attempt to emulate a specific environment where this setup was necessary.

Goals 🎯

  1. Run multiple service instances that support WebSocket connections.
  2. Ensure all users receive background updates quickly and reliably.

Architecture 🧱

flowchart LR
subgraph bff[Server]
    load_balancer["Load Balancer<br/><b>(B)</b>"] -->|request| bff_instance_1["Instance 1<br/><b>(C1)</b>"]
    load_balancer -->|request| bff_instance_2["Instance 2<br/><b>(C2)</b>"]
    load_balancer -->|request| bff_instance_3["Instance 3<br/><b>(C3)</b>"]
end

user[User] -->|uses| browser["Browser<br/><b>(A)</b>"]
browser <-->|http/ws| load_balancer

bff_instance_1 -->|subscribe to| valkey["Valkey<br/><b>(D)</b>"]
bff_instance_2 -->|subscribe to| valkey
bff_instance_3 -->|subscribe to| valkey

bg["Background Updates <br/><b>(E)</b>"] -->|pushed to| valkey

Deployment 📦

See the definition in compose.yaml.

In the "real world" the distribution looks like this:

  • The client (A) is a React app bundled and delivered via a CDN.
  • The server (C) is a Node.js app running in a Kubernetes cluster.
  • The domain service (E) is a separate system that pushes updates to the server (C).

This project uses docker compose to emulate the environment mentioned above:

  • A bundled client (A) served as static files by Nginx.
  • A load balancer (B) handling traffic for 3 server instances (C).
  • A script that pushes random updates (E); a sketch of such a script follows this list.
  • A Valkey instance that fans those updates out to every server instance.
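
The update script can be as small as a loop that publishes JSON messages to a Valkey channel. The sketch below assumes ioredis as the Valkey client, a channel named "updates", and a VALKEY_URL variable; the actual script in this repository may differ.

```ts
// publish-updates.ts (sketch): periodically publish a random update to Valkey.
// The VALKEY_URL default and the "updates" channel name are assumptions.
import Redis from "ioredis";

const valkey = new Redis(process.env.VALKEY_URL ?? "redis://localhost:6379");

setInterval(() => {
  const update = {
    id: crypto.randomUUID(),
    value: Math.random(),
    sentAt: new Date().toISOString(),
  };
  // Every server instance subscribed to the channel receives this message
  // and relays it to its own WebSocket clients.
  valkey.publish("updates", JSON.stringify(update));
}, 1000);
```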

Enabling the subscription ↔️

The tRPC subscription uses Valkey to distribute updates to all connected clients.

The client-server connection is still managed by tRPC; for more details, go here. Unlike the examples on their website, this project uses Valkey as the backing mechanism instead of an in-memory EventEmitter.
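
As a rough sketch of how the pieces fit together, assuming tRPC's Fastify adapter and @fastify/websocket (appRouter and the port are placeholders), the server can expose the router over both HTTP and WebSockets; the repository's actual wiring may differ.

```ts
// server.ts (sketch): serve the tRPC router over HTTP and WebSockets with Fastify.
import Fastify from "fastify";
import websocket from "@fastify/websocket";
import { fastifyTRPCPlugin } from "@trpc/server/adapters/fastify";
import { appRouter } from "./router"; // placeholder import

const server = Fastify();

await server.register(websocket);
await server.register(fastifyTRPCPlugin, {
  prefix: "/trpc",
  useWSS: true, // upgrade subscription requests to WebSockets
  trpcOptions: { router: appRouter },
});

await server.listen({ port: Number(process.env.PORT ?? 3000), host: "0.0.0.0" });
```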

The server setup

Check the onUpdates procedure in router.ts. This function subscribes to a channel in Valkey and sends update messages to WebSocket clients.
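
A minimal sketch of such a procedure, assuming ioredis as the Valkey client and tRPC's observable helper (the "updates" channel, the Update type, and VALKEY_URL are assumptions), might look like this; the real onUpdates may be structured differently.

```ts
// router.ts (sketch): a Valkey-backed tRPC subscription procedure.
import { initTRPC } from "@trpc/server";
import { observable } from "@trpc/server/observable";
import Redis from "ioredis";

const t = initTRPC.create();

type Update = { id: string; value: number; sentAt: string };

export const appRouter = t.router({
  onUpdates: t.procedure.subscription(() =>
    observable<Update>((emit) => {
      // A dedicated connection is required: a Valkey connection in
      // subscriber mode cannot issue regular commands.
      const subscriber = new Redis(process.env.VALKEY_URL ?? "redis://localhost:6379");

      subscriber.subscribe("updates");
      subscriber.on("message", (_channel, message) => {
        emit.next(JSON.parse(message) as Update);
      });

      // Teardown runs when the client unsubscribes or disconnects.
      return () => {
        subscriber.quit();
      };
    }),
  ),
});

export type AppRouter = typeof appRouter;
```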

The client setup

Review the component in user-lists.tsx. Here, useSubscription from tRPC manages the WebSocket connection and updates the UI as new data arrives.
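
Roughly, and assuming the classic @trpc/react-query integration (the component shape and the Update fields below are placeholders), the client side looks like this:

```tsx
// user-lists.tsx (sketch): render updates as they arrive over the WebSocket.
import { useState } from "react";
import { trpc } from "./trpc"; // placeholder: the app's createTRPCReact<AppRouter>() instance

type Update = { id: string; value: number; sentAt: string };

export function UserList() {
  const [updates, setUpdates] = useState<Update[]>([]);

  // useSubscription keeps the subscription alive for the lifetime of the
  // component and calls onData for each incoming message.
  trpc.onUpdates.useSubscription(undefined, {
    onData(update) {
      setUpdates((prev) => [...prev, update]);
    },
  });

  return (
    <ul>
      {updates.map((u) => (
        <li key={u.id}>
          {u.value} ({u.sentAt})
        </li>
      ))}
    </ul>
  );
}
```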

The load balancer setup

Check the configuration in nginx.conf. The upstream defined there refers to the service in compose.yaml that is deployed with 3 replicas.

For a more realistic setup, the appropriate sticky configuration would be added to the upstream section. Here are the available options.
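
For illustration, adding stickiness could look like the sketch below, using ip_hash to pin clients by source IP; the upstream name, service name, and port are assumptions rather than the repository's actual nginx.conf.

```nginx
# nginx.conf (sketch): pin each client to one instance and allow WebSocket upgrades.
upstream bff {
    ip_hash;              # stick clients to an instance by source IP
    server app:3000;      # assumed compose service name; DNS resolves to the replicas
}

server {
    listen 80;

    location / {
        proxy_pass http://bff;
        proxy_http_version 1.1;
        # Headers required for the WebSocket upgrade handshake.
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
    }
}
```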

See it in action 👀

docker compose up

Then open http://localhost:3001 in one or more browser windows. The app will show which server instance sent the update.

Development mode

bun dev

This is a simple playground. Both env.client.ts and env.server.ts provide default environment values (don't do this in production).
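
As a hypothetical illustration of that pattern (the variable names are assumptions), the defaults amount to something like:

```ts
// env.server.ts (sketch): fall back to defaults so the playground runs with zero config.
// In production, prefer failing fast when a required variable is missing.
export const env = {
  PORT: Number(process.env.PORT ?? 3000),
  VALKEY_URL: process.env.VALKEY_URL ?? "redis://localhost:6379",
};
```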

Final note

This is a simple demonstration focusing on WebSocket communication between distributed servers. Feel free to adapt the concepts to your specific needs, as each project may require a different approach based on its unique requirements! 😊
