Performance Optimization
Bob Query Builder is designed with performance in mind. This guide covers the built-in performance features and best practices for optimizing your database operations.
Prepared Statement Caching
Bob automatically caches prepared statements to avoid the overhead of repeatedly preparing the same SQL queries.
How It Works
When you execute a query, Bob:
- Generates an MD5 hash of the SQL statement
- Checks if a prepared statement exists for that hash
- Reuses the existing statement or creates a new one
- Caches the statement for future use
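Conceptually, the cache is a hash map keyed by the statement's MD5 digest, with least-recently-used eviction once the size limit is reached. The following is a simplified sketch of that mechanism only — the class and property names are illustrative, not Bob's actual internals:
// Illustrative LRU statement cache keyed by an MD5 hash of the SQL.
// Not Bob's real implementation - a sketch of the mechanism only.
class StatementCache
{
    /** @var array<string, \PDOStatement> */
    private array $statements = [];

    public function __construct(private \PDO $pdo, private int $maxSize = 100) {}

    public function prepare(string $sql): \PDOStatement
    {
        $key = md5($sql);

        if (isset($this->statements[$key])) {
            // Cache hit: re-append the entry to mark it most recently used
            $statement = $this->statements[$key];
            unset($this->statements[$key]);
            return $this->statements[$key] = $statement;
        }

        if (count($this->statements) >= $this->maxSize) {
            // Cache full: evict the least recently used entry (first key)
            unset($this->statements[array_key_first($this->statements)]);
        }

        return $this->statements[$key] = $this->pdo->prepare($sql);
    }
}
In practice you only interact with the public API: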
use Bob\Database\Connection;
$connection = new Connection($config);
// Enable statement caching (enabled by default)
$connection->enableStatementCaching();
// Set maximum cached statements (default: 100)
$connection->setMaxCachedStatements(200);
// Every execution after the first reuses the prepared statement
for ($i = 0; $i < 1000; $i++) {
$user = $connection->table('users')
->where('id', $i)
->first(); // Same prepared statement reused
}
// Clear the statement cache if needed
$connection->clearStatementCache();

Benefits
- Reduced Parse Time: SQL doesn't need to be re-parsed
- Better Performance: Especially noticeable with repeated queries
- Memory Efficient: LRU eviction prevents unlimited growth
Query Result Caching
Cache query results to avoid hitting the database for frequently accessed data.
Existence Check Caching (New in v2.1.0)
Optimize repeated existence checks with the new exists() caching feature:
use Bob\Database\Connection;
$connection = new Connection($config);
// Enable query cache for the connection
$connection->enableQueryCache(1000, 3600); // Max 1000 items, 1 hour TTL
// Enable exists caching for specific queries
$builder = $connection->table('users')
->enableExistsCache(120) // Cache exists() results for 2 minutes
->where('email', 'user@example.com');
// First call - hits database
if ($builder->exists()) {
echo "User exists!";
}
// Subsequent calls within 2 minutes - uses cache
if ($builder->exists()) { // No database query!
echo "Still exists (cached)!";
}
// Different query conditions = different cache key
$builder->where('status', 'active'); // Cache key changes
if ($builder->exists()) { // New database query
echo "Active user exists!";
}

Use Cases:
- Validation checks in loops
- Conditional logic based on existence
- Permission checks
- Duplicate detection
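For instance, a duplicate-detection loop that repeatedly checks the same conditions can be served from the cache instead of the database. A sketch built on the API above (the sample data is illustrative):
// Duplicate detection: repeated emails hit the exists() cache, not the database
$emails = ['a@example.com', 'b@example.com', 'a@example.com'];
foreach ($emails as $email) {
    $taken = $connection->table('users')
        ->enableExistsCache(120) // cache each check for 2 minutes
        ->where('email', $email)
        ->exists();
    if ($taken) {
        echo "Skipping duplicate: {$email}\n";
    }
}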
Basic Usage
use Bob\Cache\QueryCache;
use Bob\Database\Connection;
// Create cache instance
$cache = new QueryCache();
// Configure connection with cache
$connection = new Connection($config);
$connection->setQueryCache($cache);
// Enable caching for specific queries
$users = $connection->table('users')
->where('status', 'active')
->cache(300) // Cache for 5 minutes
->get();
// Subsequent calls within 5 minutes return cached results

Cache Key Management
// Manually set cache key
$users = $connection->table('users')
->where('status', 'active')
->cacheKey('active_users')
->cache(600)
->get();
// Clear specific cache
$cache->forget('active_users');
// Clear all query cache
$cache->flush();

Cache Tags
Group related cache entries:
$posts = $connection->table('posts')
->where('status', 'published')
->cacheTags(['posts', 'content'])
->cache(3600)
->get();
// Clear all posts cache
$cache->flushTag('posts');

Connection Pooling
Manage multiple database connections efficiently with connection pooling.
Setting Up Connection Pool
use Bob\Database\ConnectionPool;
$pool = new ConnectionPool($config, [
'min_connections' => 2,
'max_connections' => 10,
'max_idle_time' => 300, // 5 minutes
'acquisition_timeout' => 5 // 5 seconds
]);
// Acquire connection from pool
$connection = $pool->acquire();
// Use the connection
$users = $connection->table('users')->get();
// Release back to pool
$pool->release($connection);

If the work in between can throw, wrap it in try/finally so the connection is always returned, or use the using() helper below.

Automatic Connection Management
// Use connection with automatic release
$result = $pool->using(function($connection) {
return $connection->table('users')
->where('status', 'active')
->get();
});

Pool Statistics
$stats = $pool->getStatistics();
echo "Active connections: " . $stats['active_connections'];
echo "Idle connections: " . $stats['idle_connections'];
echo "Total connections: " . $stats['total_connections'];
echo "Waiting requests: " . $stats['waiting_count'];Query Profiling
Profile your queries to identify performance bottlenecks.
Basic Profiling
use Bob\Database\QueryProfiler;
$profiler = new QueryProfiler();
$connection->setProfiler($profiler);
// Enable profiling
$profiler->enable();
// Run your queries
$users = $connection->table('users')->get();
$posts = $connection->table('posts')->where('status', 'published')->get();
// Get profiling data
$profile = $profiler->getProfile();
foreach ($profile as $query) {
echo "Query: " . $query['sql'] . "\n";
echo "Time: " . $query['time'] . "ms\n";
echo "Memory: " . $query['memory'] . " bytes\n";
}

Identifying Slow Queries
// Set slow query threshold (in milliseconds)
$profiler->setSlowQueryThreshold(100);
// Get only slow queries
$slowQueries = $profiler->getSlowQueries();
foreach ($slowQueries as $query) {
echo "Slow query detected:\n";
echo $query['sql'] . "\n";
echo "Execution time: " . $query['time'] . "ms\n";
}

Query Statistics
$stats = $profiler->getStatistics();
echo "Total queries: " . $stats['total_queries'] . "\n";
echo "Total time: " . $stats['total_time'] . "ms\n";
echo "Average time: " . $stats['average_time'] . "ms\n";
echo "Slowest query: " . $stats['slowest_time'] . "ms\n";
echo "Total memory: " . $stats['total_memory'] . " bytes\n";Chunking Large Result Sets
Process large datasets efficiently without loading everything into memory.
Basic Chunking
$connection->table('users')->chunk(100, function($users) {
foreach ($users as $user) {
// Process 100 users at a time
$this->processUser($user);
}
});

Chunking with Conditions
$connection->table('orders')
->where('status', 'pending')
->chunk(200, function($orders) {
foreach ($orders as $order) {
// Process order
$this->processOrder($order);
// Update as processed
$this->connection->table('orders')
->where('id', $order->id)
->update(['processed' => true]);
}
});

Early Termination
$connection->table('users')->chunk(100, function($users) {
foreach ($users as $user) {
if ($this->shouldStop()) {
return false; // Stop chunking
}
$this->processUser($user);
}
});

Cursor-Based Iteration
For even better memory efficiency with large datasets:
// Uses PHP generators for minimal memory usage
foreach ($connection->table('logs')->cursor() as $log) {
// Process one row at a time
$this->processLog($log);
// Memory usage remains constant regardless of table size
}

Note: on the MySQL PDO driver this holds only when buffered queries are disabled (PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false); with buffering enabled, PDO loads the full result set into memory regardless of how you iterate.

Cursor with Conditions
$cursor = $connection->table('users')
->where('created_at', '>', '2024-01-01')
->orderBy('id')
->cursor();
foreach ($cursor as $user) {
// Process users one by one
if ($this->shouldExportUser($user)) {
$this->exportUser($user);
}
}

Index Optimization
Using Index Hints
// Force index usage (MySQL)
$users = $connection->table('users')
->useIndex('idx_email')
->where('email', 'user@example.com')
->get();
// Ignore specific index
$users = $connection->table('users')
->ignoreIndex('idx_status')
->where('status', 'active')
->get();

Covering Indexes
Use covering indexes to avoid table lookups. A covering index contains every column a query reads, so the database can answer the query from the index alone:
// Create covering index
$connection->statement('
CREATE INDEX idx_users_covering
ON users(status, created_at, id, name, email)
');
// Query uses only the index
$users = $connection->table('users')
->select('id', 'name', 'email')
->where('status', 'active')
->where('created_at', '>', '2024-01-01')
->get();

Query Optimization Tips
1. Select Only Required Columns
// Good - specific columns
$users = $connection->table('users')
->select('id', 'name', 'email')
->get();
// Bad - all columns
$users = $connection->table('users')->get();

2. Use Exists Instead of Count
An EXISTS check can stop at the first matching row, while COUNT has to visit every matching row before it returns.
// Good - for existence check
if ($connection->table('users')->where('email', $email)->exists()) {
// User exists
}
// Less efficient
if ($connection->table('users')->where('email', $email)->count() > 0) {
// User exists
}

3. Optimize Pagination
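Offset-based pagination degrades as the offset grows, because the database must scan and discard every skipped row before returning the page. For contrast (assuming Bob exposes the usual offset() method; verify against your version):
// Slow for deep pages - roughly 100,100 rows are scanned, 100,000 thrown away
$users = $connection->table('users')
    ->orderBy('id')
    ->offset(100000)
    ->limit(100)
    ->get();
The keyset approach below avoids this by seeking past the last seen id instead: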
// Use cursor-based pagination for large datasets
$lastId = 0;
do {
$users = $connection->table('users')
->where('id', '>', $lastId)
->orderBy('id')
->limit(100)
->get();
foreach ($users as $user) {
$this->processUser($user);
$lastId = $user->id;
}
} while (count($users) === 100);

4. Batch Operations
// Good - single INSERT for multiple rows
$connection->table('logs')->insert([
['message' => 'Log 1', 'level' => 'info'],
['message' => 'Log 2', 'level' => 'error'],
['message' => 'Log 3', 'level' => 'debug'],
]);
// Bad - multiple INSERT queries
foreach ($logs as $log) {
$connection->table('logs')->insert($log);
}

5. Use Transactions for Bulk Operations
Every standalone statement commits in its own implicit transaction, forcing a durability flush each time; one explicit transaction amortizes that cost across all the inserts.
$connection->beginTransaction();
try {
for ($i = 0; $i < 1000; $i++) {
$connection->table('records')->insert([
'data' => $this->generateData($i)
]);
}
$connection->commit();
} catch (\Exception $e) {
$connection->rollBack();
throw $e;
}

Monitoring and Debugging
Query Logging
// Enable query logging
$connection->enableQueryLog();
// Run queries
$users = $connection->table('users')->get();
// Get query log
$queries = $connection->getQueryLog();
foreach ($queries as $query) {
echo "SQL: " . $query['query'] . "\n";
echo "Bindings: " . json_encode($query['bindings']) . "\n";
echo "Time: " . $query['time'] . "ms\n\n";
}

EXPLAIN Analysis
// Get query explanation
$explanation = $connection->table('users')
->where('status', 'active')
->explain();
foreach ($explanation as $row) {
echo "Type: " . $row->type . "\n";
echo "Possible Keys: " . $row->possible_keys . "\n";
echo "Key Used: " . $row->key . "\n";
echo "Rows Examined: " . $row->rows . "\n";
}

Best Practices Summary
- Use Prepared Statements: Let Bob cache them automatically
- Cache Frequently Accessed Data: Use query result caching
- Pool Connections: Use connection pooling for high-traffic applications
- Profile in Development: Identify slow queries early
- Chunk Large Operations: Process large datasets in chunks
- Optimize Indexes: Ensure proper indexes for your queries
- Select Specific Columns: Don't fetch data you don't need
- Use Transactions: Group related operations
- Monitor Production: Enable query logging and profiling in production (carefully)
- Regular Maintenance: Analyze and optimize tables regularly
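The last point can be scripted with plain SQL through the statement() method shown earlier. On MySQL, for example (run these during low-traffic windows, as OPTIMIZE TABLE can lock the table):
// Refresh index statistics and reclaim unused space (MySQL)
$connection->statement('ANALYZE TABLE users');
$connection->statement('OPTIMIZE TABLE users');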
Performance Benchmarks
Bob Query Builder has been optimized for minimal overhead:
- Query Building: < 0.1ms overhead
- Prepared Statement Caching: 30-50% improvement for repeated queries
- Connection Pooling: 60-80% reduction in connection overhead
- Chunking: Constant memory usage regardless of dataset size
- Result Caching: 100-1000x improvement for cached queries
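These figures depend heavily on the driver, schema, and workload, so treat them as indicative. You can take rough measurements against your own stack with the profiler from this guide:
use Bob\Database\QueryProfiler;
// Rough micro-benchmark using the QueryProfiler shown earlier
$profiler = new QueryProfiler();
$connection->setProfiler($profiler);
$profiler->enable();
// Run a representative workload
for ($i = 1; $i <= 500; $i++) {
    $connection->table('users')->where('id', $i)->first();
}
$stats = $profiler->getStatistics();
echo "Average query time: " . $stats['average_time'] . "ms\n";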
Remember: The best optimization is often at the database design level. Ensure your tables are properly indexed, normalized (or denormalized where appropriate), and regularly maintained.