
There’s a specific kind of dread that sets in when you check your production logs and see the infamous error: Fatal error: JavaScript heap out of memory.
For developers using modern Node.js stacks like Next.js (frontend/SSR) and NestJS (backend), this error usually means the Garbage Collector (GC) has failed to reclaim enough memory to keep the application running.
In most cases, it isn’t a problem with the frameworks themselves, but rather how we manage data and state within them.
In this post, we’ll explore the most common programming mistakes that cause Out of Memory (OOM) errors within a Next.js/NestJS architecture, complete with examples and solutions.
What is a Memory Leak?
Before we dive in, a quick primer.
JavaScript is a garbage-collected language.
The runtime automatically allocates memory when objects are created and frees it when they are no longer “reachable” (referenced) from the code’s root.
A memory leak occurs when your application inadvertently retains a reference to an object that is no longer needed.
Because the reference still exists, the Garbage Collector cannot free that memory, causing the heap size to grow steadily until the application crashes.
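To make "reachable" concrete, here's a minimal sketch (the `sessionRegistry` name is purely illustrative) of how one lingering reference keeps an object alive, and how removing it is what actually lets the GC do its job:

```typescript
// A module-level Map lives for the entire process lifetime, so anything
// stored in it stays "reachable" and can never be garbage collected.
const sessionRegistry = new Map<string, { data: number[] }>();

function openSession(id: string) {
  // Each session retains a sizeable payload.
  sessionRegistry.set(id, { data: new Array(1_000).fill(0) });
}

function closeSession(id: string) {
  // Deleting the reference is what makes the payload unreachable,
  // and therefore eligible for garbage collection.
  sessionRegistry.delete(id);
}

openSession('a');
openSession('b');
closeSession('a');
console.log(sessionRegistry.size); // 1 — 'b' is still reachable (leaked, if it's never closed)
```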
Mistakes in NestJS (Backend API)
NestJS applications are long-running processes. A tiny leak in a single API endpoint can accumulate over thousands of requests, eventually bringing down the entire server.
1. The Accidental Global Cache
A common requirement is to cache data from a slow API or database call.
However, storing this cache in a simple global object or a Singleton class variable without a removal strategy is a recipe for disaster.
The Mistake:
You decide to cache user permissions in a plain JavaScript object attached to a service.
As more unique users log in, the object grows indefinitely.
```typescript
// user.service.ts (Bad Implementation)
import { Injectable } from '@nestjs/common';

@Injectable()
export class UserService {
  // ❌ LEAK: This object grows with every unique user and is never cleared.
  private permissionsCache: Record<string, any[]> = {};

  async getUserPermissions(userId: string) {
    if (this.permissionsCache[userId]) {
      return this.permissionsCache[userId];
    }

    const permissions = await this.db.fetchPermissions(userId); // Slow call
    this.permissionsCache[userId] = permissions;
    return permissions;
  }
}
```
The Solution:
Use a dedicated caching library with built-in Time-To-Live (TTL) or Least Recently Used (LRU) eviction policies.
NestJS has a built-in CacheModule that makes this easy.
```typescript
// user.service.ts (Good Implementation)
import { Injectable, Inject } from '@nestjs/common';
import { CACHE_MANAGER } from '@nestjs/cache-manager';
import { Cache } from 'cache-manager';

@Injectable()
export class UserService {
  constructor(@Inject(CACHE_MANAGER) private cacheManager: Cache) {}

  async getUserPermissions(userId: string) {
    const cacheKey = `perms:${userId}`;
    const cached = await this.cacheManager.get<any[]>(cacheKey);
    if (cached) return cached;

    const permissions = await this.db.fetchPermissions(userId);
    // ✅ FIX: Use a TTL (e.g., 300 seconds) to ensure data is removed.
    await this.cacheManager.set(cacheKey, permissions, 300);
    return permissions;
  }
}
```
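If you'd rather see the eviction idea without a library, here's a simplified, hypothetical `TTLCache` (not the NestJS API) that stamps each entry with an expiry time and evicts lazily on read:

```typescript
// Minimal TTL cache sketch: each entry records when it expires, and any
// read past that moment behaves as a miss, so stale entries can't pile up.
class TTLCache<V> {
  private store = new Map<string, { value: V; expiresAt: number }>();

  set(key: string, value: V, ttlMs: number) {
    this.store.set(key, { value, expiresAt: Date.now() + ttlMs });
  }

  get(key: string): V | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      // Lazily evict the expired entry so the Map cannot grow unbounded.
      this.store.delete(key);
      return undefined;
    }
    return entry.value;
  }
}

const cache = new TTLCache<string[]>();
cache.set('perms:42', ['read'], 60_000); // fresh for one minute
cache.set('perms:99', ['write'], -1);    // already expired
console.log(cache.get('perms:42')); // [ 'read' ]
console.log(cache.get('perms:99')); // undefined
```

A production cache would also evict proactively (or cap entry count, LRU-style), which is exactly what `cache-manager` gives you for free.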
2. Slicing Large Buffers to “Free” Memory
In Node.js, Buffer.slice() (and its modern replacement, Buffer.subarray()) creates a view onto the original memory rather than a new copy. (Note that Array.prototype.slice does allocate a new array, though its elements still reference the same underlying objects.)
If you slice a small piece from a massive 100MB buffer and retain only the small slice, you are still holding the entire 100MB allocation in memory.
The Mistake:
You process a huge CSV import, but only need the first few headers.
```typescript
// import.service.ts (Bad Implementation)
async processHugeCsv(csvBuffer: Buffer) {
  // csvBuffer might be 50MB
  // ❌ LEAK: This slice retains a reference to the massive underlying buffer.
  const headerSlice = csvBuffer.slice(0, 100);

  // You save the slice to a service variable to use later
  this.currentHeaders = headerSlice;
  // ... process the rest ...
}
```
The Solution:
If you need a smaller part of a large buffer or array, create a fresh copy of that smaller part.
The Garbage Collector can then free the original large object.
```typescript
// import.service.ts (Good Implementation)
async processHugeCsv(csvBuffer: Buffer) {
  const start = 0;
  const end = 100;

  // ✅ FIX: Create a brand new, detached Buffer instance containing only the data needed.
  const headerCopy = Buffer.allocUnsafe(end - start);
  csvBuffer.copy(headerCopy, 0, start, end);

  this.currentHeaders = headerCopy; // Original csvBuffer can now be GCed.
  // ... process the rest ...
}
```
Mistakes in Next.js (Frontend & SSR)
Next.js operates in two distinct worlds: the client-side (browser) and the server-side (Server-Side Rendering, API Routes).
You have to watch out for leaks in both.
3. Forgotten Listeners and Intervals in useEffect
This is the classic React memory leak, but it is magnified in Next.js when users spend a long time on your site navigating between dynamic routes.
If you set a listener but don’t remove it when the component unmounts, that listener stays alive.
The Mistake:
A dashboard component tracks window resize events to adjust its layout, but forgets to clean up.
```tsx
// components/Dashboard.tsx (Bad Implementation)
'use client';
import { useEffect, useState } from 'react';

export default function Dashboard() {
  const [isMobile, setIsMobile] = useState(false);

  useEffect(() => {
    const handleResize = () => {
      console.log('Resizing...'); // This keeps running even after navigation
      setIsMobile(window.innerWidth < 768);
    };

    // ❌ LEAK: Every time this component mounts, a new listener is added.
    window.addEventListener('resize', handleResize);
  }, []); // Empty dependency array means it runs on mount only.

  return <div>{isMobile ? 'Mobile View' : 'Desktop View'}</div>;
}
```
The Solution:
Always return a cleanup function from your useEffect hook to remove listeners, clear intervals, or cancel API subscriptions.
```tsx
// components/Dashboard.tsx (Good Implementation)
'use client';
import { useEffect, useState } from 'react';

export default function Dashboard() {
  const [isMobile, setIsMobile] = useState(false);

  useEffect(() => {
    const handleResize = () => {
      setIsMobile(window.innerWidth < 768);
    };

    window.addEventListener('resize', handleResize);

    // ✅ FIX: The returned function is called when the component unmounts.
    return () => {
      window.removeEventListener('resize', handleResize);
    };
  }, []);

  return <div>{isMobile ? 'Mobile View' : 'Desktop View'}</div>;
}
```
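The same subscribe/unsubscribe discipline applies outside React too. Using Node's built-in EventEmitter as a stand-in for `window`, you can watch listeners accumulate when "mounts" aren't paired with cleanup:

```typescript
import { EventEmitter } from 'node:events';

const fakeWindow = new EventEmitter(); // stand-in for the browser's `window`

// One simulated mount: attach a listener and return the cleanup function,
// which is exactly the contract useEffect expects from you.
function mount(): () => void {
  const handleResize = () => {};
  fakeWindow.on('resize', handleResize);
  return () => fakeWindow.off('resize', handleResize);
}

// Three "navigations" that never clean up: listeners accumulate.
mount();
mount();
mount();
console.log(fakeWindow.listenerCount('resize')); // 3 leaked listeners

// A mount paired with its cleanup leaves no trace behind.
const cleanup = mount();
cleanup();
console.log(fakeWindow.listenerCount('resize')); // still 3 — the paired one was removed
```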
4. Holding Massive State in Route Handlers / API Routes
In Next.js App Router, route.ts handlers are similar to NestJS controllers.
They run on the server. If you receive a massive payload (like a file upload) and try to hold the entire thing in memory as a variable, you will instantly spike the server’s RAM.
The Mistake:
An API route receives a file upload and stores it in an in-memory buffer before sending it to S3. If the file is 1GB, your server needs 1GB of free RAM to handle just that one request.
```typescript
// app/api/upload/route.ts (Bad Implementation)
import { NextResponse } from 'next/server';

export async function POST(request: Request) {
  // ❌ LEAK/SPIKE: This loads the entire body into memory at once.
  const entirePayload = await request.arrayBuffer();
  const buffer = Buffer.from(entirePayload);

  await uploadToS3(buffer); // A slow process

  return NextResponse.json({ success: true });
}
```
The Solution:
Use Node.js Streams. Streaming allows you to process data chunk by chunk without ever loading the entire file into the server’s memory.
```typescript
// app/api/upload/route.ts (Good Implementation)
import { NextResponse } from 'next/server';
import { S3Client } from '@aws-sdk/client-s3';
import { Upload } from '@aws-sdk/lib-storage'; // Example S3 streaming library

const s3Client = new S3Client({});

export async function POST(request: Request) {
  if (!request.body) return NextResponse.error();

  // ✅ FIX: request.body is already a ReadableStream!
  const stream = request.body;

  const upload = new Upload({
    client: s3Client,
    params: { Bucket: 'my-bucket', Key: 'large-file', Body: stream },
  });
  await upload.done();

  return NextResponse.json({ success: true });
}
```
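The chunk-by-chunk idea is easy to see with Node's stream primitives. This hypothetical `byteCounter` consumes a stream without ever materialising the whole payload:

```typescript
import { Readable } from 'node:stream';

// Consume a stream chunk by chunk: peak memory is roughly one chunk,
// never the whole payload, regardless of the total size of the source.
async function byteCounter(stream: Readable): Promise<number> {
  let total = 0;
  for await (const chunk of stream) {
    total += (chunk as Buffer).length; // each chunk is counted, then dropped
  }
  return total;
}

// Simulate a large upload arriving as three separate chunks.
const fakeUpload = Readable.from([
  Buffer.alloc(1024),
  Buffer.alloc(1024),
  Buffer.alloc(512),
]);

byteCounter(fakeUpload).then((total) => console.log(total)); // 2560
```

Swap the counting logic for a write to S3, disk, or a database, and you have the same constant-memory shape as the route handler above.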
Conclusion: How to Protect Your Heap
Out Of Memory (OOM) errors can be some of the hardest to debug because the crash rarely happens at the exact point of the leak.
However, by adopting these three principles, you can significantly reduce your risk:
- Stop using Globals: In NestJS services or global utilities, avoid plain objects or arrays for persistent data storage. Use CacheModule or a database.
- Clean up your Hooks: In Next.js client components, every useEffect that creates an interval, timeout, or listener must have a cleanup function.
- Stream Large Data: If you are handling files, massive CSV exports, or large database queries, never load the entire dataset into a variable. Stream it from source to destination.
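Finally, you don't have to wait for a crash to spot a leak. `process.memoryUsage()` is built into Node, and logging it periodically (the interval below is just an illustration) gives you an early-warning trend line:

```typescript
// Snapshot the V8 heap so a steadily climbing number shows up in your logs
// long before the process hits its memory limit.
function heapSnapshot(): string {
  const { heapUsed, heapTotal } = process.memoryUsage();
  const toMb = (bytes: number) => Math.round(bytes / 1024 / 1024);
  return `heap: ${toMb(heapUsed)}MB used / ${toMb(heapTotal)}MB total`;
}

console.log(heapSnapshot());
// In a NestJS bootstrap or a Next.js instrumentation hook, you might run:
// setInterval(() => console.log(heapSnapshot()), 60_000);
```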
Happy coding, and keep your memory tidy!




