**7 Essential Data Access Patterns Every Developer Must Master for Scalable Applications**
Source: Dev.to
Repository Pattern
Imagine you have a box for every major type of data in your app: a User box, an Order box, a Product box. You don’t care what’s inside the box or how it organizes things—you just tell the box, “give me the user with this email,” or “save this new order.” The box handles the rest.
In code, a Repository is a class that provides a collection‑like interface for your data.
```typescript
// UserRepository.ts
interface UserRepository {
  findById(id: string): Promise<User | null>;
  findByEmail(email: string): Promise<User | null>;
  save(user: User): Promise<void>;
  delete(id: string): Promise<boolean>;
}
```
The interface is a contract: any implementation must provide these methods.
```typescript
// PostgresUserRepository.ts
import { Pool } from 'pg';

class PostgresUserRepository implements UserRepository {
  constructor(private pool: Pool) {}

  async findById(id: string): Promise<User | null> {
    const result = await this.pool.query(
      'SELECT * FROM users WHERE id = $1',
      [id]
    );
    return result.rows[0] || null;
  }

  async findByEmail(email: string): Promise<User | null> {
    const result = await this.pool.query(
      'SELECT * FROM users WHERE email = $1',
      [email]
    );
    return result.rows[0] || null;
  }

  async save(user: User): Promise<void> {
    await this.pool.query(
      `INSERT INTO users (id, email, name)
       VALUES ($1, $2, $3)
       ON CONFLICT (id) DO UPDATE SET
         email = $2, name = $3`,
      [user.id, user.email, user.name]
    );
  }

  async delete(id: string): Promise<boolean> {
    const result = await this.pool.query(
      'DELETE FROM users WHERE id = $1',
      [id]
    );
    return (result.rowCount ?? 0) > 0;
  }
}
```
Your application code depends only on UserRepository. Switching from PostgreSQL to MySQL merely requires a new implementation that satisfies the same interface.
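That portability also pays off in tests. As an illustration, here is a minimal in-memory implementation of the same contract (the `User` shape below is a hypothetical one, since the article never defines it), usable as a drop-in test double:

```typescript
// InMemoryUserRepository.ts — a test double satisfying the same contract.
// The User shape here is an assumption for illustration.
interface User {
  id: string;
  email: string;
  name: string;
}

interface UserRepository {
  findById(id: string): Promise<User | null>;
  findByEmail(email: string): Promise<User | null>;
  save(user: User): Promise<void>;
  delete(id: string): Promise<boolean>;
}

class InMemoryUserRepository implements UserRepository {
  private users = new Map<string, User>();

  async findById(id: string): Promise<User | null> {
    return this.users.get(id) ?? null;
  }

  async findByEmail(email: string): Promise<User | null> {
    for (const user of this.users.values()) {
      if (user.email === email) return user;
    }
    return null;
  }

  async save(user: User): Promise<void> {
    // Map.set overwrites an existing key, mirroring ON CONFLICT ... DO UPDATE
    this.users.set(user.id, user);
  }

  async delete(id: string): Promise<boolean> {
    return this.users.delete(id);
  }
}
```

Because both classes satisfy `UserRepository`, a service under test never knows which one it is talking to.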
Query Builder
Writing raw SQL strings for complex queries can become hard to read and maintain. A query builder lets you construct queries with a chain of method calls, improving readability and safety.
Example with Knex.js
```typescript
// Get a paginated list of active users
const activeUsers = await knex('users')
  .select('id', 'email', 'created_at')
  .where('status', 'active')
  .whereBetween('created_at', [startDate, endDate])
  .orderBy('created_at', 'desc')
  .limit(20)
  .offset(40);
```
Aggregation with Joins
```typescript
// Find users who have spent more than $1000 total
const bigSpenders = await knex('orders')
  .join('users', 'orders.user_id', 'users.id')
  .select([
    'users.email',
    knex.raw('SUM(orders.total_amount) AS lifetime_value')
  ])
  .groupBy('users.email')
  // PostgreSQL (unlike MySQL) rejects SELECT aliases in HAVING,
  // so repeat the aggregate expression
  .havingRaw('SUM(orders.total_amount) > ?', [1000]);
```
The builder abstracts dialect differences (PostgreSQL, MySQL, SQLite) and automatically parameterizes values, protecting against SQL injection.
Data Mapper
Often the shape of data in the database differs from the shape of objects in the application. The Data Mapper pattern centralizes the conversion logic.
```typescript
// UserMapper.ts
class UserMapper {
  // Transform snake_case columns into a camelCase entity and parse dates
  toEntity(row: UserRecord): User {
    return new User(
      row.id,
      row.email,
      row.full_name,
      new Date(row.created_at)
    );
  }

  // Flatten the entity back into a row shape for persistence
  toPersist(user: User): UserRecord {
    return {
      id: user.id,
      email: user.email,
      full_name: user.name,
      created_at: user.joinedDate.toISOString(),
    };
  }
}
```
Usage in a service layer
```typescript
// UserService.ts
class UserService {
  constructor(
    private repository: UserRepository,
    private mapper: UserMapper
  ) {}

  async getUserProfile(id: string): Promise<UserProfile> {
    const record = await this.repository.findById(id);
    if (!record) throw new Error('User not found');
    const userEntity = this.mapper.toEntity(record);
    return this.buildProfile(userEntity);
  }

  private buildProfile(user: User): UserProfile {
    // Build and return a profile DTO
    // ...
  }
}
```
The domain entity (User) stays pure, free of persistence concerns.
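For illustration, a pure entity compatible with the mapper above might look like this. The `isNewerThan` method is a hypothetical example of domain behavior; the point is that nothing here knows about rows, columns, or SQL:

```typescript
// User.ts — a sketch of a persistence-free domain entity.
// Constructor parameters match what UserMapper.toEntity passes in.
class User {
  constructor(
    public readonly id: string,
    public readonly email: string,
    public readonly name: string,
    public readonly joinedDate: Date
  ) {}

  // Domain behavior lives on the entity, not in the database layer
  // (this particular method is an illustrative assumption)
  isNewerThan(other: User): boolean {
    return this.joinedDate > other.joinedDate;
  }
}
```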
Connection Pooling
Opening and closing a database connection for each query is inefficient. A connection pool maintains a set of reusable connections.
```javascript
// pool.js
const { Pool } = require('pg');

const pool = new Pool({
  host: process.env.DB_HOST,
  database: process.env.DB_NAME,
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  max: 20,                  // Maximum number of connections
  idleTimeoutMillis: 30000, // Close idle connections after 30 s
});

module.exports = pool;
```
Using the pool
```javascript
// orders.js
const pool = require('./pool');

async function getUserOrders(userId) {
  const client = await pool.connect(); // Check out a connection
  try {
    const result = await client.query(
      'SELECT * FROM orders WHERE user_id = $1',
      [userId]
    );
    return result.rows;
  } finally {
    client.release(); // Return the connection to the pool
  }
}
```
Pooling dramatically reduces latency and resource consumption, especially under load.
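To make the check-out/release mechanics concrete, here is a deliberately simplified toy pool, an illustrative sketch only, far simpler than pg's real pool (no timeouts, validation, or error recovery):

```typescript
// SimplePool.ts — a toy generic pool illustrating the core mechanics:
// reuse idle connections, cap total creations, queue callers at capacity.
class SimplePool<T> {
  private idle: T[] = [];
  private created = 0;
  private waiters: ((conn: T) => void)[] = [];

  constructor(private factory: () => T, private max: number) {}

  async acquire(): Promise<T> {
    const conn = this.idle.pop();
    if (conn !== undefined) return conn; // reuse an idle connection

    if (this.created < this.max) {
      this.created++;
      return this.factory(); // under the cap: create a new one
    }

    // At capacity: wait until someone releases a connection
    return new Promise((resolve) => this.waiters.push(resolve));
  }

  release(conn: T): void {
    const waiter = this.waiters.shift();
    if (waiter) waiter(conn);  // hand off directly to a queued caller
    else this.idle.push(conn); // otherwise return it to the idle set
  }
}
```

The `max` cap is what protects the database: no matter how many requests arrive, at most `max` connections ever exist, and excess callers simply wait their turn.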
Database Migrations
As an application evolves, its schema must change in a repeatable, version‑controlled way (adding columns, indexes, splitting tables, etc.). Database migration tools provide a systematic approach to apply incremental schema changes across environments.
Typical workflow:
- Create a migration file describing the change (e.g., add_users_last_login.sql).
- Run the migration using a CLI or programmatic API; the tool records the migration as applied.
- Deploy the same migration to staging, testing, and production, ensuring consistency.
Popular migration libraries include Knex migrations, Flyway, Liquibase, and Prisma Migrate. Choose one that fits your stack and integrate it into your CI/CD pipeline.
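With Knex migrations, for example, the `add_users_last_login` change above might be sketched as a file like this (file name and column are illustrative; every migration pairs an `up` with a `down` so it can be rolled back):

```typescript
// migrations/20240101000000_add_users_last_login.ts
// A sketch of a Knex migration for the hypothetical change above.
import type { Knex } from 'knex';

export async function up(knex: Knex): Promise<void> {
  await knex.schema.alterTable('users', (table) => {
    table.timestamp('last_login').nullable();
  });
}

export async function down(knex: Knex): Promise<void> {
  await knex.schema.alterTable('users', (table) => {
    table.dropColumn('last_login');
  });
}
```

Running `knex migrate:latest` applies any pending files in order and records them in a bookkeeping table, so each environment converges on the same schema.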