Security & Performance

Database Performance Optimization

18 min Lesson 21 of 35

Database performance is critical for application scalability. Slow queries can bring down entire systems under load.

Query Optimization Basics

Always write efficient queries that minimize data transfer and processing:

-- Bad: SELECT * retrieves unnecessary data
SELECT * FROM users WHERE status = 'active';

-- Good: only select the columns you need
SELECT id, name, email FROM users WHERE status = 'active';

// Bad: N+1 query problem
$users = User::all();
foreach ($users as $user) {
    echo $user->profile->bio; // Triggers one query per user
}

// Good: eager loading
$users = User::with('profile')->get();
foreach ($users as $user) {
    echo $user->profile->bio; // No additional queries
}

Indexing Strategies

Indexes dramatically speed up read operations but slow down writes. Choose indexes carefully:

-- Create a single-column index
CREATE INDEX idx_users_email ON users(email);

-- Create a composite index spanning multiple columns
CREATE INDEX idx_orders_user_status ON orders(user_id, status);

-- Unique index to enforce a constraint
CREATE UNIQUE INDEX idx_users_username ON users(username);

-- Full-text index for search (MySQL)
CREATE FULLTEXT INDEX idx_posts_content ON posts(title, content);

Indexing Best Practices: Index foreign keys, frequently searched columns, and columns used in WHERE and ORDER BY clauses. Don't over-index: each index adds overhead to every INSERT and UPDATE.
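As a concrete sketch of the WHERE/ORDER BY guideline (the `orders` columns here are hypothetical), a single composite index can serve both the filter and the sort in one pass:

```sql
-- Hypothetical orders(status, created_at) columns: equality-filtered
-- column first, then the sort column
CREATE INDEX idx_orders_status_created ON orders(status, created_at);

-- This query can filter on status and read rows already in
-- created_at order, avoiding a separate sort step (filesort)
SELECT id, total
FROM orders
WHERE status = 'shipped'
ORDER BY created_at DESC
LIMIT 20;
```

Column order matters: put equality-filtered columns before the sort column, or the index cannot satisfy the ORDER BY.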

EXPLAIN Analysis

Use EXPLAIN to understand query execution plans and identify bottlenecks:

-- Analyze the query execution plan
EXPLAIN SELECT u.name, COUNT(o.id) AS order_count
FROM users u
LEFT JOIN orders o ON u.id = o.user_id
WHERE u.created_at > '2024-01-01'
GROUP BY u.id
ORDER BY order_count DESC
LIMIT 10;

-- Key columns in EXPLAIN output:
-- type: access type, from worst to best: ALL, index, range, ref, eq_ref, const
-- possible_keys: indexes that could be used
-- key: index actually chosen
-- rows: estimated rows scanned
-- Extra: "Using filesort" or "Using temporary" indicate performance issues

Warning: If EXPLAIN shows type: ALL (a full table scan) or Extra: Using filesort, the query likely needs optimization or additional indexes.

Slow Query Log

Enable slow query logging to identify problematic queries in production:

# MySQL configuration (my.cnf): log queries slower than 2 seconds
slow_query_log = 1
slow_query_log_file = /var/log/mysql/slow-query.log
long_query_time = 2

# Summarize the 10 slowest queries, sorted by total time
mysqldumpslow -s t -t 10 /var/log/mysql/slow-query.log

// Laravel query logging
DB::enableQueryLog();
// ... run queries ...
$queries = DB::getQueryLog();
foreach ($queries as $query) {
    if ($query['time'] > 100) { // milliseconds
        Log::warning('Slow query', $query);
    }
}
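For continuous monitoring rather than a one-off check, Laravel's query listener can flag slow statements as they execute. A minimal sketch, reusing the 100 ms threshold from above:

```php
// Register once, e.g. in AppServiceProvider::boot()
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Log;

DB::listen(function ($query) {
    if ($query->time > 100) { // milliseconds
        Log::warning('Slow query', [
            'sql'      => $query->sql,
            'bindings' => $query->bindings,
            'time'     => $query->time,
        ]);
    }
});
```

Unlike enableQueryLog(), the listener does not accumulate queries in memory, so it is safe to leave enabled in production.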

Denormalization

Sometimes breaking normalization rules improves performance:

-- Normalized: requires a JOIN to get the post count
SELECT u.*, COUNT(p.id) AS post_count
FROM users u
LEFT JOIN posts p ON u.id = p.user_id
GROUP BY u.id;

-- Denormalized: add a post_count column to the users table
ALTER TABLE users ADD COLUMN post_count INT DEFAULT 0;

-- Keep it current with a trigger (or application logic)
CREATE TRIGGER increment_post_count AFTER INSERT ON posts
FOR EACH ROW
UPDATE users SET post_count = post_count + 1 WHERE id = NEW.user_id;

-- Now the query is simple and fast
SELECT * FROM users ORDER BY post_count DESC;
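The trigger above only covers inserts; to keep the counter consistent, deletions need a matching trigger (same hypothetical schema):

```sql
-- Decrement the counter when a post is removed
CREATE TRIGGER decrement_post_count AFTER DELETE ON posts
FOR EACH ROW
UPDATE users SET post_count = post_count - 1 WHERE id = OLD.user_id;
```

Every write path that touches the denormalized value needs this kind of coverage, which is exactly the maintenance cost the trade-off note below describes.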

Trade-off: Denormalization trades data-integrity guarantees for read performance. Reserve it for computed values that are expensive to calculate but change rarely.

Read Replicas

Scale read operations horizontally with database replication:

// Laravel config/database.php
'mysql' => [
    'read' => [
        'host' => [
            '192.168.1.2', // Replica 1
            '192.168.1.3', // Replica 2
        ],
    ],
    'write' => [
        'host' => [
            '192.168.1.1', // Primary
        ],
    ],
    'driver' => 'mysql',
    // ... other config
],

// Reads are automatically routed to a replica
$users = User::all();

// Writes go to the primary
User::create([...]);
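One caveat the configuration above does not address: replicas can lag behind the primary, so a read immediately after a write may see stale data. A sketch of two mitigations Laravel provides:

```php
// In the connection config: 'sticky' => true makes reads that follow
// a write within the same request go to the primary connection
'sticky' => true,

// Or force a single read onto the primary explicitly
$user = User::onWriteConnection()->find($id);
```

Use these sparingly; routing every read to the primary defeats the purpose of the replicas.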

Query Caching

Cache expensive query results to reduce database load:

// Laravel query caching
use Illuminate\Support\Facades\Cache;

// Cache the result for one hour (3600 seconds)
$users = Cache::remember('active_users', 3600, function () {
    return User::where('status', 'active')
        ->with('profile', 'roles')
        ->get();
});

// Invalidate the cache when the underlying data changes
public function update(Request $request, User $user)
{
    $user->update($request->validated());
    Cache::forget('active_users');

    return redirect()->back();
}
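Scattering Cache::forget() calls across controllers is easy to get wrong; a model event can centralize the invalidation instead. A sketch, assuming the same 'active_users' cache key:

```php
use Illuminate\Database\Eloquent\Model;
use Illuminate\Support\Facades\Cache;

class User extends Model
{
    protected static function booted()
    {
        // Clear the cached list whenever any user is saved or deleted
        static::saved(fn () => Cache::forget('active_users'));
        static::deleted(fn () => Cache::forget('active_users'));
    }
}
```

This keeps invalidation next to the model, so new write paths cannot forget to clear the cache.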

Exercise: Find a slow query in your application. Use EXPLAIN to analyze it, add appropriate indexes, and measure the improvement. Aim for at least a 50% reduction in execution time.