---
title: Pipelining
---

import { Alert } from '/components/alert.tsx'

## What is pipelining?

By default node-postgres waits for each query to complete before sending the next one. This means every query pays a full network round-trip of latency. **Query pipelining** sends multiple queries to the server without waiting for responses, and the server processes them in order. Each query still gets its own result (or error), but you avoid the idle time between them.

```
sequential (default)              pipelined
─────────────────────             ─────────────────────
client ──Parse──▶ server          client ──Parse──▶ server
client ◀──Ready── server                 ──Parse──▶
client ──Parse──▶ server                 ──Parse──▶
client ◀──Ready── server          client ◀──Ready── server
client ──Parse──▶ server          client ◀──Ready── server
client ◀──Ready── server          client ◀──Ready── server
```

In benchmarks, pipelining typically delivers **2-3x higher throughput** for batches of simple queries on a local connection, with larger gains over higher-latency links.

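A back-of-envelope model shows where the gain comes from. The round-trip time and per-query server cost below are illustrative assumptions, not measurements:

```js
// Toy latency model for n independent queries (numbers are assumptions, not benchmarks).
const rtt = 1.0         // network round-trip time, ms
const serverTime = 0.05 // per-query server processing, ms
const n = 100

// Sequential: every query waits for the previous response, so each one pays the RTT.
const sequential = n * (rtt + serverTime)

// Pipelined: queries are sent back to back, so the batch pays roughly one RTT total.
const pipelined = rtt + n * serverTime

console.log(`sequential ≈ ${sequential.toFixed(1)} ms, pipelined ≈ ${pipelined.toFixed(1)} ms`)
// sequential ≈ 105.0 ms, pipelined ≈ 6.0 ms
```

The higher the network latency relative to server processing time, the larger the win.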
## Enabling pipelining

Pipelining is opt-in. Set `client.pipelining = true` after connecting:

```js
import { Client } from 'pg'

const client = new Client()
await client.connect()
client.pipelining = true

const [r1, r2, r3] = await Promise.all([
  client.query('SELECT 1 AS num'),
  client.query('SELECT 2 AS num'),
  client.query('SELECT 3 AS num'),
])

console.log(r1.rows[0].num, r2.rows[0].num, r3.rows[0].num) // 1 2 3

await client.end()
```

All query types work with pipelining: plain text, parameterized, and named prepared statements.

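As a sketch, all three forms can share one pipelined batch (this assumes a connected `client` with pipelining enabled and a hypothetical `users` table):

```js
// One pipelined batch mixing the three query forms.
const [plain, parameterized, named] = await Promise.all([
  client.query('SELECT now()'),                           // plain text
  client.query('SELECT * FROM users WHERE id = $1', [1]), // parameterized
  client.query({
    name: 'user-by-id',                                   // named prepared statement
    text: 'SELECT * FROM users WHERE id = $1',
    values: [2],
  }),
])
```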
## Pipelining with a pool

Pass `pipelining: true` in the pool config to enable it on every client the pool creates:

```js
import { Pool } from 'pg'

const pool = new Pool({ pipelining: true })

const client = await pool.connect()
// client.pipelining is already true

const [users, orders] = await Promise.all([
  client.query('SELECT * FROM users WHERE id = $1', [1]),
  client.query('SELECT * FROM orders WHERE user_id = $1', [1]),
])

client.release()
```

<Alert>
  <div>
    <code>pool.query()</code> checks out a client for a single query and releases it immediately, so pipelining has no effect there. Use <code>pool.connect()</code> to check out a client and send multiple queries on it.
  </div>
</Alert>

## Error isolation

Each pipelined query gets its own error boundary. A failing query in the middle of a batch does not break the other queries:

```js
const results = await Promise.allSettled([
  client.query('SELECT 1 AS num'),
  client.query('SELECT INVALID SYNTAX'),
  client.query('SELECT 3 AS num'),
])

console.log(results[0].status) // 'fulfilled'
console.log(results[1].status) // 'rejected'
console.log(results[2].status) // 'fulfilled'
```

This works because node-postgres sends a `Sync` message after each query, which is how PostgreSQL delimits error boundaries in the extended query protocol.

## Named prepared statements

Named prepared statements work with pipelining. When two pipelined queries share the same statement name, node-postgres sends `Parse` only once and reuses the prepared statement for subsequent queries:

```js
const queries = Array.from({ length: 100 }, (_, i) => ({
  name: 'get-user',
  text: 'SELECT * FROM users WHERE id = $1',
  values: [i],
}))

const results = await Promise.all(queries.map(q => client.query(q)))
```

## Graceful shutdown

Calling `client.end()` while pipelined queries are in flight will wait for all of them to complete before closing the connection:

```js
client.pipelining = true

const p1 = client.query('SELECT 1')
const p2 = client.query('SELECT 2')
const endPromise = client.end()

// Both queries will resolve normally
const [r1, r2] = await Promise.all([p1, p2])
await endPromise
```

## When to use pipelining

Pipelining is most useful when you have multiple **independent** queries that don't depend on each other's results. Common use cases:

- Fetching data from multiple tables in parallel for a page load
- Inserting or updating multiple rows simultaneously
- Running a batch of analytics queries

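For instance, a page-load handler might fan out all of its reads in one batch. This is a sketch: the table names are illustrative, and `client` is assumed to be a connected client with pipelining enabled:

```js
// Fetch everything a profile page needs in one pipelined batch.
async function loadProfilePage(client, userId) {
  const [user, orders, notifications] = await Promise.all([
    client.query('SELECT * FROM users WHERE id = $1', [userId]),
    client.query('SELECT * FROM orders WHERE user_id = $1', [userId]),
    client.query('SELECT * FROM notifications WHERE user_id = $1 AND read = false', [userId]),
  ])
  return { user: user.rows[0], orders: orders.rows, notifications: notifications.rows }
}
```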
<div className="alert alert-warning">
  Do not use pipelining inside a transaction if you need to read the result of one query before issuing the next. Pipelined queries are all sent before any responses arrive, so you cannot branch on intermediate results. For dependent queries within a transaction, use sequential <code>await</code> calls instead.
</div>
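
For the dependent case, a sketch of the sequential alternative (the table names here are illustrative):

```js
// Dependent queries: the second statement needs the first result, so await them in order.
await client.query('BEGIN')
try {
  const { rows } = await client.query(
    'INSERT INTO orders (user_id) VALUES ($1) RETURNING id',
    [1]
  )
  // This statement uses the id returned above, so it cannot be pipelined with the insert.
  await client.query('INSERT INTO order_items (order_id, sku) VALUES ($1, $2)', [rows[0].id, 'abc'])
  await client.query('COMMIT')
} catch (err) {
  await client.query('ROLLBACK')
  throw err
}
```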