The Story: From SPA to BFF to Socket Hell
For years, I stuck to a strict mental model: I hate pages without URLs.
Back in the day, I was building strict SPA structures with Vite, managing paths rigidly with react-router. It worked. It was clean. But the web moves fast, and frankly, maintaining client-side logic for things that should happen on the server started to feel like building with sticks and stones again.
So, I spiraled. (Again).
Moving from a purely client-side existence to a Next.js based BFF (Backend for Frontend) architecture wasn’t just a tech stack switch; it was a philosophy change. I wanted the speed of an SPA but the security and management capabilities of a server-rendered app.
Here is the breakdown of the sleepless nights, the tools that saved me, and the socket architecture that almost broke me.
Phase 1: The Authorization Shuffle
The backend didn't just shift itself; I dragged it kicking and screaming to OAuth 2.1. (I actually documented that whole migration roadmap over here).
I knew this move would trigger a domino effect. As I noted in that post, the BFF pattern became the obvious boundary:
> Why: Avoid carrying access tokens in the browser; rely on HttpOnly, Secure, SameSite cookies and server-side session state to shrink XSS/CSRF exposure. Impact: My Vite + React SPA templates no longer fit. I’m moving administrative surfaces to Next.js...
So, the first hurdle was handling tokens. In the old days, we'd just slap a JWT in local storage and pray. Not anymore.
I set up a stateless BFF layer using Next.js.
- The Handshake: The BFF talks to the backend API.
- The Wrap: It takes the user auth and refresh tokens from the API.
- The Handoff: It re-tokens them and plants them firmly in the client via Secure, HttpOnly cookies.
The client code knows nothing. It holds no tokens. It’s blind, which makes it safe. It’s a simple, fast, and lean way to keep secrets actually secret.
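In Route Handler form, the three steps look roughly like this. It's a sketch, not my actual code: the paths, the /auth/login endpoint, and the buildAuthCookie helper are all illustrative, and it sticks to the Web-standard Request/Response so there's nothing framework-specific to import.

```typescript
// Sketch of app/api/auth/login/route.ts. In a real route file this function
// would be `export async function POST`; names and paths are illustrative.
const API_URL = process.env.API_URL ?? "http://localhost:3000"; // the Express API

function buildAuthCookie(name: string, value: string): string {
  // HttpOnly + Secure + SameSite: the browser stores it, client JS never reads it
  return `${name}=${encodeURIComponent(value)}; Path=/; HttpOnly; Secure; SameSite=Strict`;
}

async function POST(req: Request): Promise<Response> {
  // The Handshake: forward the login attempt to the backend API
  const apiRes = await fetch(`${API_URL}/auth/login`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(await req.json()),
  });
  if (!apiRes.ok) {
    return new Response(JSON.stringify({ error: "invalid credentials" }), { status: 401 });
  }

  // The Wrap: take the access and refresh tokens from the API response
  const { accessToken, refreshToken } = await apiRes.json();

  // The Handoff: plant them in the browser as cookies the client can't read
  const headers = new Headers({ "Content-Type": "application/json" });
  headers.append("Set-Cookie", buildAuthCookie("accessToken", accessToken));
  headers.append("Set-Cookie", buildAuthCookie("refreshToken", refreshToken));
  return new Response(JSON.stringify({ ok: true }), { headers });
}
```

The response body stays empty of secrets; everything sensitive rides in the Set-Cookie headers.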
Phase 2: The Codegen Addiction
I admit it: I am a codegen junkie.
In the previous iteration, RTK Codegen was my bread and butter. I created a neat api/[...proxy] layer to talk to the backend. But with the upgrade to Next.js 16 and the switch to Biome (because who has time for ESLint config hell anymore?), I needed to push it a step further.
Enter Orval + AsyncAPI.
This wasn't just about fetching data anymore. It was about contracts. By switching to Orval, I gained:
- Zod Schemas: Automatically generated validation.
- Isomorphic Safety: I can use the same types and validation logic on the Server Side (BFF) and the Client Side.
- Query Hooks: Ready-to-use React Query hooks.
Now, my backend and frontend are locked in a dance. If the backend changes, the build fails. It’s annoying, but it’s the good kind of annoying.
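For reference, a minimal orval.config.ts along these lines can emit both the React Query hooks and the Zod schemas from one contract. The file paths, project keys, and the custom mutator are placeholders, not my real setup:

```typescript
// orval.config.ts -- a sketch; paths and names are assumptions
import { defineConfig } from "orval";

export default defineConfig({
  api: {
    input: "./openapi.json", // the backend's published contract
    output: {
      target: "./src/api/generated.ts",
      client: "react-query", // ready-to-use React Query hooks
      mode: "tags-split",
      override: {
        // Route every operation through a custom fetch that talks to the BFF,
        // so the browser never handles a token directly.
        mutator: { path: "./src/api/mutator.ts", name: "customFetch" },
      },
    },
  },
  // A second target emits Zod schemas from the same contract,
  // usable on both the BFF and the client.
  apiZod: {
    input: "./openapi.json",
    output: {
      target: "./src/api/schemas.ts",
      client: "zod",
    },
  },
});
```

Regenerate on every contract change and the "build fails when the backend moves" guarantee comes for free.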
Phase 3: The Socket Conundrum (Express + Socket.io)
Here is where things got messy.
I’m building a chat module. The backend is Express + Socket.io. The frontend is Next.js. The instinct is to proxy the socket connection through Next.js to keep the auth context. Don't. It’s a bottleneck.
I needed the client to connect directly to the socket server (Express), but I also needed it to be authenticated using that same JWT hidden in the HttpOnly cookie.
The solution? Trust the browser.
The "Magic" Configuration
Even though the Client code can't read the cookie, the browser sends it automatically during the handshake if you ask politely.
On the Client (Next.js):
```typescript
import { io } from "socket.io-client";

// The client knows nothing about the token.
// We just tell it to send credentials (cookies) with the handshake.
const socket = io("http://localhost:3000", {
  withCredentials: true, // <--- The MVP
  transports: ["websocket"],
  autoConnect: true,
});
```
On the Backend (Express/Socket.io):
We rely on the cookie being present. Since we are likely dealing with subdomains in production (e.g., app.domain.com and api.domain.com), the cookie domain setting is critical.
```typescript
// Setting the cookie (The Login Flow)
res.cookie('accessToken', token, {
  httpOnly: true,          // JS can't touch this
  secure: true,            // HTTPS only (Locally strict, usually)
  sameSite: 'strict',      // Or 'lax' if cross-subdomain
  domain: '.mydomain.com'  // The wildcard is key for subdomain sharing
});

// The Socket Server
const io = new Server(httpServer, {
  cors: {
    origin: "http://localhost:4000", // The Next.js App
    credentials: true // Allow cookies to pass through CORS
  }
});

// Middleware to validate the handshake
io.use((socket, next) => {
  const cookieHeader = socket.request.headers.cookie;
  // Parse cookie, validate JWT, attach user to socket
  if (isValid(cookieHeader)) {
    next();
  } else {
    next(new Error("Authentication error"));
  }
});
```
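If you want that isValid step concrete rather than pseudocode, a dependency-free cookie parser is enough to get from the raw handshake header to a verifiable token. The middleware in the comment assumes the jsonwebtoken package; the helper itself runs anywhere:

```typescript
// Minimal cookie parsing for the Socket.io handshake (sketch; names illustrative).
function parseCookies(header: string | undefined): Record<string, string> {
  const out: Record<string, string> = {};
  if (!header) return out;
  for (const pair of header.split(";")) {
    const idx = pair.indexOf("=");
    if (idx === -1) continue; // skip malformed fragments
    const key = pair.slice(0, idx).trim();
    out[key] = decodeURIComponent(pair.slice(idx + 1).trim());
  }
  return out;
}

// In the middleware, the raw Cookie header becomes an authenticated socket:
//
// io.use((socket, next) => {
//   const { accessToken } = parseCookies(socket.request.headers.cookie);
//   try {
//     // jwt.verify from the jsonwebtoken package (an assumption about the stack)
//     socket.data.user = jwt.verify(accessToken ?? "", process.env.JWT_SECRET!);
//     next();
//   } catch {
//     next(new Error("Authentication error"));
//   }
// });
```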
Is it Production Ready?
Currently, it runs seamlessly between localhost:3000 (API) and localhost:4000 (Next.js/BFF). The withCredentials: true flag does the heavy lifting. Even SameSite: Strict doesn’t get in the way here, because SameSite compares sites, not ports: localhost:3000 and localhost:4000 count as the same site. The same logic covers app.mydomain.com and api.mydomain.com in production, since they share a registrable domain.
Is it a "hack"? Maybe. Does it work securely without exposing tokens to the client? Yes.
Final Thoughts & The Next Step
Just to be clear: the NextAuth implementation I mentioned earlier is a separate beast. It doesn't have a socket layer yet. When it does, I won't be using the direct cookie approach there.
Why? Because relying on shared cookies across different architectures can be a headache. For the NextAuth project, the plan is cleaner: Short-lived, socket-only tokens.
Instead of sharing the holy grail (the session cookie) with the socket server, the API will issue a disposable token with a tiny TTL (e.g., 10 seconds). The client grabs it, hands it to the socket, the connection is sealed, and the token dies. Clean, isolated, and paranoid. Just how I like it.
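A rough sketch of that planned flow, with an in-memory Map standing in for whatever shared store (probably Redis) the real thing would need, since the API and the socket server are separate processes. All names here are hypothetical:

```typescript
// Disposable, short-lived socket tokens (sketch of the planned flow).
import { randomBytes } from "node:crypto";

const TTL_MS = 10_000; // ~10 seconds, then the token dies
const pending = new Map<string, { userId: string; expiresAt: number }>();

// API side: issue a one-off token for an already-authenticated session
function issueSocketToken(userId: string): string {
  const token = randomBytes(32).toString("hex");
  pending.set(token, { userId, expiresAt: Date.now() + TTL_MS });
  return token;
}

// Socket server side: a token is valid exactly once, and only while fresh
function redeemSocketToken(token: string): string | null {
  const entry = pending.get(token);
  pending.delete(token); // single use: redeem it or lose it
  if (!entry || entry.expiresAt < Date.now()) return null;
  return entry.userId;
}
```

The client would fetch the token from something like /auth/socket-token, pass it via `io(url, { auth: { token } })`, and the socket server would call redeemSocketToken inside io.use. No shared cookies, no long-lived secrets on the wire.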
Coming Up Next:
This stack—Next.js 16, Biome, Orval, Zod—is solid. But maintaining it while letting AI write code can be risky.
In the next post, we are going to talk about speed. Not just typing fast, but keeping the architecture from collapsing under the weight of AI generation. We’ll look at how Biome, Dependency Cruiser, and strict Cursor Rules keep the AI in check—so Cursor generates code for the architecture, without tearing through the codebase like a bull in a china shop.