18 min Lesson 34 of 40

Node.js Performance Optimization

Performance optimization ensures your Node.js application runs efficiently, handles high traffic, and provides fast response times. This involves profiling, caching, database optimization, and proper resource management.

Profiling Node.js Applications

Profiling helps you identify performance bottlenecks. Node.js provides built-in profiling tools:

# 1. Using the Node.js built-in V8 profiler
node --prof server.js

# This generates a file: isolate-0x....-v8.log
# Process it to get readable output
node --prof-process isolate-0x....-v8.log > processed.txt

# 2. Using Chrome DevTools for profiling
node --inspect server.js
# Open chrome://inspect in Chrome
# Click "inspect" to open DevTools
# Go to the "Profiler" tab to record CPU profiles

# 3. Using clinic.js for comprehensive diagnostics
npm install -g clinic

# Doctor - detect performance issues
clinic doctor -- node server.js

# Bubbleprof - async operations analysis
clinic bubbleprof -- node server.js

# Flame - CPU profiling
clinic flame -- node server.js

Memory Leak Detection

Memory leaks occur when your application continuously allocates memory without releasing it, eventually crashing the process.

Common Causes of Memory Leaks:
  • Global variables that keep growing
  • Event listeners that are never removed
  • Closures holding references to large objects
  • Caching without size limits
  • Timers (setTimeout/setInterval) not cleared

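Before reaching for heap snapshots, a cheap first signal is to sample process.memoryUsage() periodically: a heapUsed value that climbs steadily across samples under constant load suggests a leak. A minimal sketch using only the standard library:

```javascript
// Sampling heap usage over time with the built-in process.memoryUsage()
function sampleHeapMB() {
  const { heapUsed, heapTotal, rss } = process.memoryUsage();
  const toMB = (bytes) => Math.round((bytes / 1024 / 1024) * 100) / 100;
  return { heapUsed: toMB(heapUsed), heapTotal: toMB(heapTotal), rss: toMB(rss) };
}

// Log a sample every 30 seconds; unref() so the timer doesn't keep the process alive
setInterval(() => console.log('memory (MB):', sampleHeapMB()), 30000).unref();
```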
// Detecting memory leaks with heap snapshots
const heapdump = require('heapdump');

// Take heap snapshot on demand (protect this route in production!)
app.get('/heapdump', (req, res) => {
  heapdump.writeSnapshot(`./heapdump-${Date.now()}.heapsnapshot`, (err, filename) => {
    if (err) return res.status(500).send(err.message);
    res.send(`Heap dump written to ${filename}`);
  });
});

// BAD: Memory leak example
const cache = {};
app.post('/cache/:key', (req, res) => {
  cache[req.params.key] = req.body; // Cache grows without bound!
  res.send('Cached');
});

// GOOD: Bounded cache with LRU (Least Recently Used) eviction
const { LRUCache } = require('lru-cache'); // lru-cache v7+ API
const cache = new LRUCache({
  max: 500,             // Maximum 500 items
  ttl: 1000 * 60 * 60   // Items expire after 1 hour (called maxAge in older versions)
});

app.post('/cache/:key', (req, res) => {
  cache.set(req.params.key, req.body);
  res.send('Cached');
});

// BAD: Event listener leak
const { EventEmitter } = require('events');

class BadEmitter extends EventEmitter {
  subscribe() {
    this.on('data', () => {
      // Process data
    });
    // A new listener is added on every call and never removed!
  }
}

// GOOD: Keep a reference to the handler and remove it when done
class GoodEmitter extends EventEmitter {
  subscribe() {
    const handler = () => { /* Process data */ };
    this.on('data', handler);

    // Clean up when done
    this.once('close', () => {
      this.removeListener('data', handler);
    });
  }
}

CPU Profiling and Optimization

Identify CPU-intensive operations and optimize them:

// BAD: Blocking the event loop
app.get('/calculate', (req, res) => {
  let result = 0;
  for (let i = 0; i < 10000000000; i++) {
    result += Math.sqrt(i);
  }
  res.json({ result });
});

// GOOD: Offload CPU-intensive work to worker threads
const { Worker } = require('worker_threads');

app.get('/calculate', (req, res) => {
  const worker = new Worker('./calc-worker.js');

  worker.once('message', (result) => {
    res.json({ result });
    worker.terminate(); // free the thread once the result is back
  });

  worker.once('error', (err) => {
    res.status(500).json({ error: err.message });
    worker.terminate();
  });

  worker.postMessage({ iterations: 10000000000 });
});

// calc-worker.js
const { parentPort } = require('worker_threads');

parentPort.on('message', ({ iterations }) => {
  let result = 0;
  for (let i = 0; i < iterations; i++) {
    result += Math.sqrt(i);
  }
  parentPort.postMessage(result);
});

Caching Strategies

Caching reduces database queries and improves response times significantly:

const Redis = require('redis');
const redisClient = Redis.createClient({
  socket: { host: 'localhost', port: 6379 } // node-redis v4 connection options
});
redisClient.connect().catch(console.error);  // v4 clients must connect explicitly

// Cache middleware
const cacheMiddleware = (duration) => {
  return async (req, res, next) => {
    const key = `cache:${req.originalUrl}`;

    try {
      const cachedData = await redisClient.get(key);

      if (cachedData) {
        console.log('Cache hit!');
        return res.json(JSON.parse(cachedData));
      }

      // Modify res.json to cache the response
      const originalJson = res.json.bind(res);
      res.json = (data) => {
        redisClient.setEx(key, duration, JSON.stringify(data)).catch(console.error);
        return originalJson(data);
      };

      next();
    } catch (err) {
      console.error('Cache error:', err);
      next();
    }
  };
};

// Use caching
app.get('/api/products', cacheMiddleware(3600), async (req, res) => {
  const products = await db.query('SELECT * FROM products');
  res.json(products);
});

// Cache invalidation
app.post('/api/products', async (req, res) => {
  const product = await db.insert(req.body);

  // Invalidate relevant caches
  await redisClient.del('cache:/api/products');

  res.json(product);
});
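If Redis is not available (for example, in development), a small in-process TTL cache covers simple cases. Note it is per-process only, so entries are not shared across instances. A sketch assuming nothing beyond the standard library:

```javascript
// A small in-process TTL cache as a fallback when Redis is unavailable
class TTLCache {
  constructor(ttlMs) {
    this.ttlMs = ttlMs;
    this.store = new Map();
  }

  set(key, value) {
    this.store.set(key, { value, expires: Date.now() + this.ttlMs });
  }

  get(key) {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expires) {
      this.store.delete(key); // evict the stale entry on access
      return undefined;
    }
    return entry.value;
  }
}

const localCache = new TTLCache(60 * 1000); // entries live for 1 minute
localCache.set('greeting', 'hello');
```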

Compression Middleware

Compress responses to reduce bandwidth and improve load times:

const compression = require('compression');

// Enable compression
app.use(compression({
  level: 6,           // Compression level (0-9)
  threshold: 1024,    // Only compress responses > 1KB
  filter: (req, res) => {
    // Skip compression when the client explicitly opts out
    if (req.headers['x-no-compression']) {
      return false;
    }
    // Otherwise fall back to compression's default filter (checks Content-Type)
    return compression.filter(req, res);
  }
}));

// Compression typically reduces response size by 60-80% for text content!

Database Query Optimization

Database Optimization Techniques:
  • Indexing: Add indexes on frequently queried columns
  • Query Analysis: Use EXPLAIN to analyze query performance
  • Avoid N+1 Queries: Use JOINs or batch loading
  • Limit Result Sets: Always paginate large datasets
  • Prepared Statements: faster execution and protection against SQL injection

// BAD: N+1 query problem
const users = await db.query('SELECT * FROM users');
for (const user of users) {
  user.posts = await db.query('SELECT * FROM posts WHERE user_id = ?', [user.id]);
}

// GOOD: Single query with JOIN
const usersWithPosts = await db.query(`
  SELECT
    users.*,
    JSON_ARRAYAGG(
      JSON_OBJECT('id', posts.id, 'title', posts.title)
    ) as posts
  FROM users
  LEFT JOIN posts ON posts.user_id = users.id
  GROUP BY users.id
`);
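Batch loading, the other fix mentioned above, works where a JOIN is awkward: fetch the parent rows, then fetch all children in one IN query and group them in memory. A sketch using the same placeholder `db.query` helper as above:

```javascript
// Batch loading as an alternative N+1 fix: two queries total, grouped in memory
async function getUsersWithPosts(db) {
  const users = await db.query('SELECT * FROM users');
  const ids = users.map((u) => u.id);

  // One query for all posts instead of one per user
  const posts = await db.query(
    'SELECT * FROM posts WHERE user_id IN (?)',
    [ids]
  );

  // Group posts by user_id, then attach each group to its user
  const byUser = groupBy(posts, (p) => p.user_id);
  return users.map((u) => ({ ...u, posts: byUser.get(u.id) || [] }));
}

function groupBy(rows, keyFn) {
  const map = new Map();
  for (const row of rows) {
    const key = keyFn(row);
    if (!map.has(key)) map.set(key, []);
    map.get(key).push(row);
  }
  return map;
}
```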

// Use pagination for large datasets
app.get('/api/posts', async (req, res) => {
  const page = parseInt(req.query.page, 10) || 1;
  const limit = 20;
  const offset = (page - 1) * limit;

  const posts = await db.query(
    'SELECT * FROM posts ORDER BY created_at DESC LIMIT ? OFFSET ?',
    [limit, offset]
  );

  const total = await db.query('SELECT COUNT(*) as count FROM posts');

  res.json({
    posts,
    pagination: {
      page,
      limit,
      total: total[0].count,
      pages: Math.ceil(total[0].count / limit)
    }
  });
});

Connection Pooling

Reuse database connections instead of creating new ones for each request:

const mysql = require('mysql2/promise');

// Create connection pool
const pool = mysql.createPool({
  host: 'localhost',
  user: 'root',
  password: 'password',
  database: 'myapp',
  waitForConnections: true,
  connectionLimit: 10,      // Maximum 10 concurrent connections
  queueLimit: 0,            // Unlimited queue
  enableKeepAlive: true,
  keepAliveInitialDelay: 0
});

// Use pool for queries
app.get('/api/users', async (req, res) => {
  try {
    const [rows] = await pool.execute('SELECT * FROM users');
    res.json(rows);
  } catch (err) {
    console.error(err);
    res.status(500).json({ error: 'Database error' });
  }
});

// Monitor pool status (these are internal mysql2 fields and may change between versions)
setInterval(() => {
  console.log('Total connections:', pool.pool._allConnections.length);
  console.log('Free connections:', pool.pool._freeConnections.length);
}, 30000);
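Pools should also be drained on shutdown so in-flight queries can finish before the process exits. A sketch, where `server` and `pool` refer to the HTTP server and mysql2 pool created earlier:

```javascript
// Draining resources on shutdown so in-flight work can finish
async function closeAll(resources) {
  for (const resource of resources) {
    if (typeof resource.end === 'function') {
      // mysql2 pool: waits for outstanding queries, then closes connections
      await resource.end();
    } else if (typeof resource.close === 'function') {
      // http.Server: stops accepting connections, waits for open ones to end
      await new Promise((resolve) => resource.close(resolve));
    }
  }
}

process.on('SIGTERM', async () => {
  await closeAll([server, pool]); // stop accepting requests first, then drain the pool
  process.exit(0);
});
```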

Lazy Loading and Code Splitting

// BAD: Load everything upfront
const heavyModule = require('./heavy-module');

app.get('/heavy-operation', (req, res) => {
  const result = heavyModule.process(req.body);
  res.json(result);
});

// GOOD: Lazy load modules when needed
app.get('/heavy-operation', async (req, res) => {
  // Only require the module when this route is hit
  const heavyModule = require('./heavy-module');
  const result = await heavyModule.process(req.body);
  res.json(result);
});

// BETTER: Use dynamic imports for async loading
app.get('/heavy-operation', async (req, res) => {
  const { default: heavyModule } = await import('./heavy-module.mjs');
  const result = await heavyModule.process(req.body);
  res.json(result);
});
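Note that import() caches the module itself, but any one-time async setup the module performs still benefits from memoizing the loader, so concurrent requests share a single in-flight load instead of each triggering their own. A sketch:

```javascript
// Memoizing an async loader so concurrent callers share one in-flight load
function memoizeAsync(loader) {
  let promise = null;
  return () => {
    if (!promise) {
      promise = loader().catch((err) => {
        promise = null; // clear the cache so a failed load can be retried
        throw err;
      });
    }
    return promise;
  };
}

// The loader only runs once, no matter how many requests call it concurrently
const getHeavyModule = memoizeAsync(() => import('./heavy-module.mjs'));
```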

Monitoring Performance Metrics

const promClient = require('prom-client');

// Create a Registry
const register = new promClient.Registry();

// Add default metrics
promClient.collectDefaultMetrics({ register });

// Custom metrics
const httpRequestDuration = new promClient.Histogram({
  name: 'http_request_duration_seconds',
  help: 'Duration of HTTP requests in seconds',
  labelNames: ['method', 'route', 'status_code'],
  buckets: [0.1, 0.5, 1, 2, 5]
});
register.registerMetric(httpRequestDuration);

// Measure request duration
app.use((req, res, next) => {
  const start = Date.now();

  res.on('finish', () => {
    const duration = (Date.now() - start) / 1000;
    httpRequestDuration
      .labels(req.method, req.route?.path || req.path, res.statusCode)
      .observe(duration);
  });

  next();
});

// Metrics endpoint
app.get('/metrics', async (req, res) => {
  res.set('Content-Type', register.contentType);
  res.end(await register.metrics());
});

Practice Exercise

Task: Optimize a slow Node.js application:

  1. Profile an existing application to identify bottlenecks
  2. Implement Redis caching for frequently accessed data
  3. Optimize database queries (remove N+1 queries)
  4. Add connection pooling for database
  5. Implement compression middleware
  6. Move CPU-intensive operations to worker threads
  7. Set up performance monitoring with Prometheus metrics
  8. Create memory leak detection endpoint
  9. Compare before/after performance using load testing (Artillery, k6)

Performance Optimization Checklist:
  • ✓ Use connection pooling for databases
  • ✓ Implement caching (Redis, in-memory)
  • ✓ Enable gzip compression
  • ✓ Optimize database queries and add indexes
  • ✓ Use CDN for static assets
  • ✓ Implement lazy loading for modules
  • ✓ Monitor memory usage and detect leaks
  • ✓ Use worker threads for CPU-intensive tasks
  • ✓ Set up proper logging and monitoring