
News

pgflow 0.7.2: Fix Missing Realtime Broadcasts for Step Events

pgflow 0.7.2 fixes missing realtime broadcasts that prevented clients from receiving step:started and step:completed events. PostgreSQL’s query optimizer was eliminating CTEs containing realtime.send() calls because they weren’t referenced by subsequent operations.
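The planner behavior can be sketched as follows. This is an illustrative example, not pgflow's actual migration code: the table, payload, and topic names are placeholders, and realtime.send() is called with Supabase's (payload, event, topic) argument order.

```sql
-- Illustrative sketch: a CTE that exists only for its side effect can be
-- skipped by the planner when nothing downstream references it.
WITH broadcast AS (
  SELECT realtime.send('{"event":"step:started"}'::jsonb, 'step:started', 'some_topic')
)
UPDATE some_table SET status = 'started';  -- "broadcast" may never execute

-- Referencing the CTE forces it to be evaluated:
WITH broadcast AS (
  SELECT realtime.send('{"event":"step:started"}'::jsonb, 'step:started', 'some_topic')
)
UPDATE some_table SET status = 'started'
FROM broadcast;
```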

  • step:started events now broadcast when steps begin executing
  • step:completed events now broadcast for empty map steps
  • step:completed events now broadcast for cascade completions
  • Client applySnapshot() methods added for proper initial state hydration without event emission
  • Enhanced test coverage for realtime event lifecycles
  • Documentation updates for abort signal support and empty map step behavior

Questions or issues? Join our Discord community or open an issue on GitHub.

pgflow 0.7.1: Fix pgmq 1.5.1 Version Conflict on New Supabase Projects

pgflow 0.7.1 fixes installation failures on fresh Supabase projects caused by pgmq version mismatch.

Installing pgflow on new Supabase projects failed with:

ERROR: extension "pgmq" has no installation script nor update path for version "1.4.4" (SQLSTATE 22023)
At statement: 2
CREATE EXTENSION IF NOT EXISTS "pgmq" WITH SCHEMA "pgmq" VERSION "1.4.4"

Supabase upgraded to pgmq 1.5.1 in Postgres 17.6.1.016+ (supabase/postgres#1668). pgflow’s initial migration was pinned to version 1.4.4, which no longer exists in new Supabase instances.

Removed the version constraint from pgmq extension creation. pgflow now uses whatever pgmq version Supabase provides (1.4.4, 1.5.1, or newer).

-- Before (pinned version, fails on new projects):
CREATE EXTENSION IF NOT EXISTS "pgmq" WITH SCHEMA "pgmq" VERSION "1.4.4";
-- After (uses whatever version Supabase provides):
CREATE EXTENSION IF NOT EXISTS "pgmq" WITH SCHEMA "pgmq";

Thanks to @kallebysantos for reporting this issue!

pgflow 0.7.0: Public Beta with Map Steps and Documentation Redesign


pgflow 0.7.0 is here - a major milestone that brings parallel array processing, production-ready stability, and a complete documentation redesign.

pgflow has transitioned from alpha to public beta. Core functionality is stable and reliable, with early adopters already running pgflow in production environments.

This milestone reflects months of testing, bug fixes, and real-world usage feedback. The SQL Core, DSL, and Edge Worker components have proven robust across different workloads and deployment scenarios.

See the project status page for production recommendations and known limitations.

Map steps enable parallel array processing by automatically creating multiple tasks - one for each array element.

import { Flow } from '@pgflow/dsl/supabase';

const BatchProcessor = new Flow<string[]>({
  slug: 'batch_processor',
  maxAttempts: 3,
})
  .map(
    { slug: 'processUrls' },
    async (url) => {
      // Each URL gets its own task with independent retry
      return await scrapeWebpage(url);
    }
  );

Why this matters:

When processing 100 URLs, if URL #47 fails, only that specific task retries - the other 99 continue processing. With a regular step, one failure would retry all 100 URLs.

This independent retry isolation makes flows more efficient and resilient. Each task has its own retry counter, timeout, and execution context.

Map steps handle edge cases automatically:

  • Empty arrays complete immediately without creating tasks
  • Type violations fail gracefully with stored output for debugging
  • Results maintain array order regardless of completion sequence
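The order-preservation guarantee is analogous to how Promise.all returns results in input order even when later items finish first. This sketch uses plain promises, not pgflow internals:

```typescript
// Analogy only: three tasks finish out of order (30ms, 10ms, 20ms),
// yet the results array keeps the input order.
async function demo(): Promise<number[]> {
  const delays = [30, 10, 20];
  return Promise.all(
    delays.map(async (ms, i) => {
      await new Promise((resolve) => setTimeout(resolve, ms));
      return i; // index marks the input position
    })
  );
}

demo().then((results) => console.log(results)); // logs [ 0, 1, 2 ]
```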

Learn more: Map Steps and Process Arrays in Parallel

The new .array() method provides compile-time type safety for array-returning handlers:

// Enforces array return type at compile time
flow.array({ slug: 'items' }, () => [1, 2, 3]); // Valid
flow.array({ slug: 'invalid' }, () => 42); // Compile error

Array steps are a semantic wrapper that makes intent clear and moves type errors from .map() to .array(). When a map step depends on a regular step that doesn’t return an array, the compiler catches it too - .array() just makes the error location more precise and the code intention explicit.
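The compile-time constraint works because the handler's return type is checked against an array type in the method signature. A minimal self-contained sketch (hypothetical types, not pgflow's actual internals):

```typescript
// Hypothetical sketch of how an array-only constraint can be expressed
// with generics; pgflow's real DSL types are more involved.
type StepConfig = { slug: string };

class MiniFlow {
  steps: Record<string, () => unknown> = {};

  // handler must return T[] - a non-array return type fails to compile
  array<T>(config: StepConfig, handler: () => T[]): this {
    this.steps[config.slug] = handler;
    return this;
  }
}

const f = new MiniFlow().array({ slug: 'items' }, () => [1, 2, 3]);
// f.array({ slug: 'invalid' }, () => 42); // would not compile

console.log(f.steps['items']()); // logs [ 1, 2, 3 ]
```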

The @pgflow/client package, initially released in v0.4.0 but never widely announced, now has complete documentation. This type-safe client powers the pgflow demo and provides both promise-based and event-based APIs for starting workflows and monitoring real-time progress from TypeScript environments (browsers, Node.js, Deno, React Native).

Features include type-safe flow management with automatic inference from flow definitions, real-time progress monitoring via Supabase broadcasts, and extensive test coverage.

Learn more: TypeScript Client Guide | @pgflow/client API Reference

Documentation Restructure and New Landing Page


The entire documentation has been reorganized from a feature-based structure to a user-journey-based structure, making it easier to find what you need at each stage of using pgflow.

[Screenshots: documentation structure before and after the reorganization]

New documentation accompanies the release, starting with a completely rebuilt homepage featuring animated DAG visualization, interactive before/after code comparisons, and streamlined messaging. Visit pgflow.dev to explore the new experience.

Copy to Markdown on All Docs Pages - Every documentation page now includes contextual menu buttons to copy the page as markdown or open it directly in Claude Code or ChatGPT for context-aware assistance.


Additional improvements:

  • Full deno.json Support - pgflow compile now uses --config flag for complete deno.json support
  • Fixed config.toml Corruption - CLI no longer corrupts minimal config.toml files (thanks to @DecimalTurn)
  • Better Type Inference - Improved DSL type inference for .array() and .map() methods
  • Compile-time Duplicate Slug Detection - Prevents duplicate step slugs before deployment
  • Enhanced Failure Handling - Automatic archival of queued messages when runs fail, with stored output for debugging
  • Fixed Data Pruning Bug - Resolved foreign key constraint issue preventing cleanup operations
  • Comprehensive integration tests for map steps and enhanced type testing infrastructure

See the update guide for complete instructions.


Questions or issues with the upgrade? Join our Discord community or open an issue on GitHub.

pgflow 0.6.1: Worker Configuration in Handler Context


pgflow 0.6.1 adds workerConfig to the handler execution context, enabling intelligent decision-making based on worker configuration.

Handlers now have access to the complete worker configuration through context.workerConfig (#200). This enables smarter handlers that can adapt their behavior based on retry limits, concurrency settings, timeouts, and other worker parameters.

async function sendEmail(input, context) {
  const isLastAttempt =
    context.rawMessage.read_ct >= context.workerConfig.retry.limit;

  if (isLastAttempt) {
    // Use fallback email service on final attempt
    return await sendWithFallbackProvider(input.to, input.subject, input.body);
  }

  // Use primary email service for regular attempts
  return await sendWithPrimaryProvider(input.to, input.subject, input.body);
}

See the context documentation for complete details on available configuration properties and additional examples.

This release also fixes retry strategy validation to only enforce the 50-limit cap for exponential retry strategy, allowing higher limits for fixed strategy when needed (#199).
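In practice this means a fixed-strategy retry config can now carry a higher limit. A hedged sketch of such a config object (property names assumed from pgflow's retry options; verify against the Edge Worker reference):

```typescript
// Illustrative config fragment, not a verified EdgeWorker call signature.
const workerConfig = {
  retry: {
    strategy: 'fixed' as const,
    limit: 120,   // > 50 is now permitted for the fixed strategy
    baseDelay: 5, // delay between attempts (unit assumed to be seconds)
  },
};

console.log(workerConfig.retry.limit > 50); // logs true
```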

Follow our update guide for step-by-step upgrade instructions.

pgflow 0.6.0: Worker Deprecation and Context Changes


pgflow 0.6.0 introduces worker deprecation for deployments without version overlap and simplifies the context object with breaking changes.

Deploy new workers without version overlap. Deprecated workers stop accepting new tasks while finishing current work.

-- Deprecate old workers before starting new ones
UPDATE pgflow.workers
SET deprecated_at = NOW()
WHERE function_name = 'your-worker-name'
AND deprecated_at IS NULL;

Workers detect deprecation within 5 seconds via heartbeat and gracefully stop polling. The deployment guide has been simplified with a single safe deployment sequence.

The context object is now cleaner and more consistent:

Before (0.5.x):

async function handler(input, context) {
  const { data } = await context.serviceSupabase
    .from('users')
    .select('*');
}

After (0.6.0):

async function handler(input, context) {
  const { data } = await context.supabase
    .from('users')
    .select('*');
}

  1. Update context usage: context.serviceSupabase → context.supabase
  2. Remove any context.anonSupabase usage
  3. Use worker deprecation for smooth deployments

The service role client is now the primary Supabase interface, simplifying database operations in handler functions.

pgflow 0.5.4: Context - Simplify Your Handler Functions


Workers now pass a context object as a second parameter to all handlers, providing ready-to-use database connections, environment variables, and Supabase clients.

Previously, handlers relied on global singletons or manual resource initialization:

// Before: global resources that complicated testing and lifecycle management
import { sql } from '../db.js';
import { supabase } from '../supabase-client.js';

// After: clean dependency injection via context
async function processPayment(input, ctx) {
  const [payment] = await ctx.sql`
    SELECT * FROM payments WHERE id = ${input.paymentId}
  `;

  await ctx.serviceSupabase.from('audit_logs').insert({
    action: 'payment_processed',
    payment_id: input.paymentId,
  });
}

Core resources (always available):

  • env - Environment variables
  • shutdownSignal - AbortSignal for graceful shutdown
  • rawMessage - pgmq message metadata (msg_id, read_ct, etc.)
  • stepTask - Step execution details (flows only)

Supabase resources:

  • sql - PostgreSQL client (postgres.js)
  • anonSupabase - Client with anon key (respects RLS)
  • serviceSupabase - Client with service role (bypasses RLS)
Why this matters:

  1. Zero Configuration - No connection boilerplate
  2. Managed Resources - pgflow handles pooling and lifecycle
  3. Type Safety - Full TypeScript support
  4. Testable - Mock only what you use:

// Handler uses only env? Test with only env:
await handler(input, { env: { API_KEY: 'test' } });
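The mock-only-what-you-use idea can be expanded into a runnable sketch. The context shape here is assumed for illustration; type the handler against only the members it actually uses, then pass a plain object in tests:

```typescript
// Hypothetical handler that only needs env from the context.
type EnvOnly = { env: Record<string, string> };

async function greet(input: { name: string }, ctx: EnvOnly) {
  return { greeting: `hello ${input.name}`, key: ctx.env.API_KEY };
}

// No database, no Supabase client - just the one field the handler reads.
greet({ name: 'pgflow' }, { env: { API_KEY: 'test' } }).then((result) =>
  console.log(result.key) // logs "test"
);
```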

Existing handlers continue to work. Add the context parameter when you need platform resources:

// Old handlers work fine
async function handler(input) {
  return { ok: true };
}

// New handlers get context
async function handler(input, ctx) {
  const data = await ctx.sql`SELECT * FROM table`;
  return { ok: true, count: data.length };
}

Updated packages: @pgflow/edge-worker and @pgflow/dsl