Backend Engineering · 2026-01-28 · 7 min read

Go vs Node.js for APIs: Performance Benchmarks from Real Projects


Every backend framework benchmark article uses trivial "hello world" endpoints that tell you nothing about real-world performance. We took three actual API workloads from production projects, reimplemented them in both Go and Node.js, and ran them on identical hardware under realistic load. The results confirmed some assumptions and challenged others.

The test setup: a dedicated bare-metal server with 8 cores (AMD Ryzen 7 5800X), 32GB RAM, and NVMe storage. Both implementations connected to the same PostgreSQL 16 database running on a separate machine on the same local network. The Go implementation used the standard library's net/http with chi router and sqlx for database access. The Node.js implementation used Fastify (not Express, because Express is not representative of Node.js performance in 2026) with the postgres (porsager/postgres) driver.

Load testing was performed using k6 with a consistent approach: ramp from 0 to target concurrency over 30 seconds, hold for 5 minutes, then ramp down. Each test was run 3 times and we report the median results. All database queries were identical between implementations.
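The load profile maps directly onto k6's `stages` option; a sketch of the shape, with a target of 500 virtual users shown as one illustrative level (each workload was actually run at several concurrency levels):

```typescript
// Sketch of the k6 load profile used for each run: ramp from 0 to the
// target concurrency over 30 seconds, hold for 5 minutes, ramp down.
// The 500-VU target here is illustrative, not the only level tested.
export const options = {
  stages: [
    { duration: '30s', target: 500 }, // ramp up from 0
    { duration: '5m', target: 500 },  // hold at target concurrency
    { duration: '30s', target: 0 },   // ramp down
  ],
};
```

In a real k6 script this object sits alongside the default exported function that issues the HTTP requests.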

Workload one: simple CRUD API. A typical REST API with endpoints for creating, reading, updating, and deleting records. Each request involves 1 to 2 database queries. The payload sizes are small, under 1KB. This represents the majority of API endpoints in a typical web application.

At 100 concurrent users, Go handled 12,400 requests per second with a p50 latency of 7ms and p99 of 24ms. Node.js handled 8,900 requests per second with a p50 of 10ms and p99 of 35ms. Memory usage: Go held steady at 45MB. Node.js sat at 120MB.

At 500 concurrent users, Go handled 11,800 requests per second with p50 of 38ms and p99 of 95ms. Node.js handled 7,200 requests per second with p50 of 62ms and p99 of 180ms. Memory: Go at 52MB, Node.js at 185MB.

At 1,000 concurrent users, Go handled 11,200 requests per second with p50 of 82ms and p99 of 210ms. Node.js handled 5,800 requests per second with p50 of 155ms and p99 of 450ms. Memory: Go at 58MB, Node.js at 310MB.

For simple CRUD operations, Go is approximately 40% faster in throughput and 30 to 50% better in latency, with dramatically lower memory usage. This gap widens under higher concurrency, which makes sense given Go's goroutine-based concurrency model versus Node.js's single-threaded event loop. Node.js with clustering (using all 8 cores) closes the throughput gap to about 15%, but p99 latency remains significantly higher because inter-process communication adds overhead.

Workload two: data aggregation API. Endpoints that query multiple tables, perform joins, aggregate results, and return larger payloads (5 to 50KB). This involves 3 to 7 database queries per request, some sequential and some parallelizable. This represents dashboard APIs, analytics endpoints, and reporting features.

At 100 concurrent users, Go handled 3,200 requests per second with p50 of 28ms and p99 of 85ms. Node.js handled 2,800 requests per second with p50 of 32ms and p99 of 95ms. Memory: Go at 85MB, Node.js at 200MB.

At 500 concurrent users, Go handled 2,900 requests per second with p50 of 160ms and p99 of 380ms. Node.js handled 2,400 requests per second with p50 of 195ms and p99 of 450ms.

The gap narrowed significantly for this workload. When the bottleneck is database query time rather than request handling overhead, the language runtime matters less. Both Go and Node.js spend most of their time waiting for database responses. Go's advantage here comes primarily from more efficient concurrent query execution via goroutines and lower JSON serialization overhead for large payloads.

The Node.js implementation with properly parallelized database queries using Promise.all performed comparably to Go at lower concurrency levels. The divergence only became meaningful above 300 concurrent users, where Node.js's event loop contention started to bite.
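The parallelization pattern itself is simple. A minimal sketch with stand-in async queries (the real implementation used the porsager/postgres driver; `fetchUser`, `fetchOrders`, and `fetchStats` here are hypothetical placeholders that simulate query latency with timers):

```typescript
// Stand-ins for real database calls; each resolves after a delay.
const delay = (ms: number) => new Promise<void>((r) => setTimeout(r, ms));

async function fetchUser(id: number) { await delay(30); return { id, name: 'u' + id }; }
async function fetchOrders(id: number) { await delay(40); return [{ userId: id, total: 99 }]; }
async function fetchStats(id: number) { await delay(35); return { userId: id, logins: 12 }; }

// Awaiting these sequentially costs ~105ms of waiting. Promise.all runs
// them concurrently, so the handler waits only as long as the slowest
// query (~40ms) instead of the sum.
async function dashboard(id: number) {
  const [user, orders, stats] = await Promise.all([
    fetchUser(id),
    fetchOrders(id),
    fetchStats(id),
  ]);
  return { user, orders, stats };
}
```

This is the moral equivalent of launching one goroutine per query in Go; the difference under load is that Go schedules that work across cores while Node multiplexes it on one event loop.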

Workload three: CPU-intensive JSON transformation. Endpoints that receive a large JSON payload (100 to 500KB), perform complex transformations including nested object restructuring, array filtering and mapping, date parsing and formatting, and mathematical computations, then return the transformed payload. This represents data import/export endpoints, ETL-style APIs, and complex business logic.
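To make the workload concrete, a toy version of this kind of transformation as a pure function (the input shape and field names are hypothetical; the benchmarked payloads were 100 to 500KB, far larger than this sketch):

```typescript
// Hypothetical input shape: a batch of raw events with string
// timestamps and per-item amounts.
interface RawEvent { id: string; ts: string; amount: number; tags: string[]; }

// Filter, restructure, parse/format dates, and aggregate: the style of
// CPU-bound work the third workload benchmarks.
function transform(events: RawEvent[]) {
  const kept = events.filter((e) => e.amount > 0 && e.tags.includes('billable'));
  const items = kept.map((e) => ({
    id: e.id,
    day: new Date(e.ts).toISOString().slice(0, 10), // date parsing + formatting
    amountCents: Math.round(e.amount * 100),        // numeric computation
  }));
  const totalCents = items.reduce((sum, i) => sum + i.amountCents, 0);
  return { count: items.length, totalCents, items };
}
```

Every step here runs on the event loop in Node.js, which is exactly why large payloads at high concurrency hurt so much in this workload.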

At 50 concurrent users, Go handled 1,800 requests per second with p50 of 25ms and p99 of 65ms. Node.js handled 420 requests per second with p50 of 110ms and p99 of 280ms.

At 200 concurrent users, Go handled 1,750 requests per second with p50 of 105ms and p99 of 240ms. Node.js handled 380 requests per second with p50 of 490ms and p99 of 1,200ms.

This is where Go's advantage is undeniable. For CPU-bound work, Go is 4 to 5x faster than Node.js. The single-threaded nature of Node.js means that CPU-intensive work blocks the event loop, degrading performance for all concurrent requests. Node.js worker threads can help, but they add complexity and communication overhead that narrows the gap to roughly 2.5x rather than closing it.
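The worker-thread pattern looks like this; a minimal sketch using Node's built-in `worker_threads` module with an inline worker body (a real service would load a separate worker file and likely pool workers rather than spawning one per request, and the sum-of-integers task is just a stand-in for CPU-heavy work):

```typescript
import { Worker } from 'node:worker_threads';

// CPU-heavy work (here: summing 1..n) moved off the main thread so the
// event loop stays free to serve other concurrent requests.
const workerSrc = `
  const { parentPort, workerData } = require('node:worker_threads');
  let sum = 0;
  for (let i = 1; i <= workerData; i++) sum += i;
  parentPort.postMessage(sum);
`;

function sumInWorker(n: number): Promise<number> {
  return new Promise((resolve, reject) => {
    const w = new Worker(workerSrc, { eval: true, workerData: n });
    w.once('message', resolve);
    w.once('error', reject);
  });
}
```

The spawn and message-passing costs are the "communication overhead" mentioned above: results cross a thread boundary via structured cloning, which is why worker threads narrow the gap rather than close it.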

Now, what do these numbers actually mean for your technology choice?

If your API is mostly CRUD operations with small payloads, which describes 70% of web application backends, the performance difference between Go and Node.js is unlikely to matter at typical startup to mid-size scale. A single Node.js server on modern hardware handles 5,000 to 8,000 CRUD requests per second. You need significant scale before that becomes a bottleneck.

If your API has CPU-intensive endpoints, Go has a decisive advantage. No amount of Node.js optimization will close a 4x performance gap for compute-bound work. If these endpoints are performance-critical, Go is the right choice, or you should offload the computation to a separate service written in Go, Rust, or another compiled language.

If your API is database-bound, which is the most common scenario, invest in database optimization rather than changing your language runtime. Proper indexing, query optimization, and connection pooling will give you 5 to 10x improvements, far more than switching from Node.js to Go.

Beyond raw performance, there are practical factors that influence the decision. Development speed: in our experience, a Node.js/TypeScript team ships features about 20% faster than a Go team for typical web applications. TypeScript's ecosystem has more ready-made packages for common web tasks. The type system is more flexible (for better or worse), and there are more developers available for hire.

Operational simplicity: Go produces a single static binary with no runtime dependencies. Node.js requires a Node runtime, node_modules, and typically a process manager. Go's deployment story is simpler, particularly for containerized environments where image size matters. Our Go Docker images are typically 15 to 30MB. Our Node.js images are 150 to 300MB.

Long-term maintainability: Go's simplicity and lack of "magic" make codebases easier to understand two years after they were written. Node.js's flexibility and the churn in its ecosystem mean that dependencies age faster and refactoring is more frequent.

Our recommendation at Fordel Studios: default to Node.js with TypeScript and Fastify for most web application backends. Switch to Go for services that are compute-intensive, require very low latency at high concurrency, or are infrastructure-level components like API gateways, proxies, or data pipeline services. For services that are purely database CRUD, the language choice matters far less than the quality of your database schema and queries.

The "best" backend language is the one your team is most productive in, right up until you hit a performance wall. And with modern hardware and proper optimization, that wall is further away than most teams think.

About the Author

Fordel Studios

AI-native app development for startups and growing teams. 14+ years of experience shipping production software.
