
dsl

2 posts with the tag “dsl”

pgflow 0.7.0: Public Beta with Map Steps and Documentation Redesign

Cyberpunk workflow engine with glowing teal circuits processing parallel data streams

pgflow 0.7.0 is here - a major milestone that brings parallel array processing, production-ready stability, and a complete documentation redesign.

pgflow has transitioned from alpha to public beta. Core functionality is stable and reliable, with early adopters already running pgflow in production environments.

This milestone reflects months of testing, bug fixes, and real-world usage feedback. The SQL Core, DSL, and Edge Worker components have proven robust across different workloads and deployment scenarios.

See the project status page for production recommendations and known limitations.

Map steps enable parallel array processing by automatically creating multiple tasks - one for each array element.

```typescript
import { Flow } from '@pgflow/dsl/supabase';

const BatchProcessor = new Flow<string[]>({
  slug: 'batch_processor',
  maxAttempts: 3,
}).map(
  { slug: 'processUrls' },
  async (url) => {
    // Each URL gets its own task with independent retry
    return await scrapeWebpage(url);
  }
);
```

Why this matters:

When processing 100 URLs, if URL #47 fails, only that specific task retries - the other 99 continue processing. With a regular step, one failure would retry all 100 URLs.

This independent retry isolation makes flows more efficient and resilient. Each task has its own retry counter, timeout, and execution context.
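The saving is easy to quantify with a toy model in plain TypeScript (illustrative only, not pgflow internals; `makeFlaky` is a hypothetical failure injector): retrying per element re-runs only the failures.

```typescript
// Toy model of per-element retry (illustrative only, not pgflow's
// implementation). makeFlaky builds a handler that fails exactly once
// for element 47, like a transient scrape error.
function makeFlaky(): (n: number) => number {
  const failedOnce = new Set<number>();
  return (n: number) => {
    if (n === 47 && !failedOnce.has(n)) {
      failedOnce.add(n);
      throw new Error(`element ${n} failed`);
    }
    return n * 2;
  };
}

// Retry each element independently until it succeeds.
function mapWithRetries(
  items: number[],
  fn: (n: number) => number
): { results: number[]; executions: number } {
  let executions = 0;
  const results = items.map((item) => {
    for (;;) {
      executions++;
      try {
        return fn(item);
      } catch {
        // retry just this element
      }
    }
  });
  return { results, executions };
}

const { results, executions } = mapWithRetries(
  Array.from({ length: 100 }, (_, i) => i),
  makeFlaky()
);
// executions is 101: 100 elements plus one extra attempt for element 47.
// Retrying the whole batch after one failure would cost 200 executions.
```
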

Map steps handle edge cases automatically:

  • Empty arrays complete immediately without creating tasks
  • Type violations fail gracefully with stored output for debugging
  • Results maintain array order regardless of completion sequence
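The ordering and empty-array guarantees above mirror familiar Promise semantics. A plain TypeScript analogy (not pgflow code):

```typescript
// Analogy for the ordering guarantee (plain TypeScript, not pgflow code):
// Promise.all returns results in input order even when work finishes
// out of order, and resolves immediately for an empty array.
const delay = (ms: number) => new Promise((res) => setTimeout(res, ms));

async function processInParallel(urls: string[]): Promise<string[]> {
  return Promise.all(
    urls.map(async (url, i) => {
      // Later elements finish first (reverse completion order)
      await delay((urls.length - i) * 10);
      return `scraped:${url}`;
    })
  );
}

// processInParallel(['a', 'b', 'c']) resolves to
// ['scraped:a', 'scraped:b', 'scraped:c'] even though 'c' completes first.
```
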

Learn more: Map Steps and Process Arrays in Parallel

The new .array() method provides compile-time type safety for array-returning handlers:

```typescript
// Enforces array return type at compile time
flow.array({ slug: 'items' }, () => [1, 2, 3]); // Valid
flow.array({ slug: 'invalid' }, () => 42); // Compile error
```

Array steps are a semantic wrapper that makes intent clear and moves type errors from .map() to .array(). When a map step depends on a regular step that doesn’t return an array, the compiler catches it too - .array() just makes the error location more precise and the code intention explicit.
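The constraint itself is ordinary TypeScript generics. A minimal sketch of the idea (hypothetical helper, not pgflow's actual `.array()` signature):

```typescript
// Hypothetical sketch of an array-only handler constraint;
// pgflow's real .array() signature is richer than this.
type ArrayHandler<O> = () => O[];

function arrayStep<O>(slug: string, handler: ArrayHandler<O>): O[] {
  return handler();
}

const items = arrayStep('items', () => [1, 2, 3]); // OK: inferred as number[]
// arrayStep('invalid', () => 42); // compile error: number is not assignable to O[]
```

Because the handler's declared return type is `O[]`, a scalar return fails type checking at the call site rather than somewhere downstream.
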

The @pgflow/client package, initially released in v0.4.0 but never widely announced, now has complete documentation. This type-safe client powers the pgflow demo and provides both promise-based and event-based APIs for starting workflows and monitoring real-time progress from TypeScript environments (browsers, Node.js, Deno, React Native).

Features include type-safe flow management with automatic inference from flow definitions, real-time progress monitoring via Supabase broadcasts, and extensive test coverage.

Learn more: TypeScript Client Guide | @pgflow/client API Reference

Documentation Restructure and New Landing Page


The entire documentation has been reorganized from a feature-based structure to a user-journey-based structure, making it easier to find what you need at each stage of using pgflow.

Before/after comparison: documentation structure before and after the reorganization


The homepage has been completely rebuilt with animated DAG visualization, interactive before/after code comparisons, and streamlined messaging. Visit pgflow.dev to explore the new experience.

Copy to Markdown on All Docs Pages - Every documentation page now includes contextual menu buttons to copy the page as markdown or open it directly in Claude Code or ChatGPT for context-aware assistance.

Contextual menu showing copy to markdown and open in AI options

Additional improvements:

  • Full deno.json Support - pgflow compile now uses --config flag for complete deno.json support
  • Fixed config.toml Corruption - CLI no longer corrupts minimal config.toml files (thanks to @DecimalTurn)
  • Better Type Inference - Improved DSL type inference for .array() and .map() methods
  • Compile-time Duplicate Slug Detection - Prevents duplicate step slugs before deployment
  • Enhanced Failure Handling - Automatic archival of queued messages when runs fail, with stored output for debugging
  • Fixed Data Pruning Bug - Resolved foreign key constraint issue preventing cleanup operations
  • Comprehensive integration tests for map steps and enhanced type testing infrastructure

See the update guide for complete instructions.


Questions or issues with the upgrade? Join our Discord community or open an issue on GitHub.

pgflow 0.5.4: Context - Simplify Your Handler Functions

Context object simplifying handler functions with platform resources

Workers now pass a context object as a second parameter to all handlers, providing ready-to-use database connections, environment variables, and Supabase clients.

Previously, handlers relied on global singletons or manual resource initialization:

```typescript
// Before: Global resources that complicated testing and lifecycle management
import { sql } from '../db.js';
import { supabase } from '../supabase-client.js';

// After: Clean dependency injection via context
async function processPayment(input, ctx) {
  const [payment] = await ctx.sql`
    SELECT * FROM payments WHERE id = ${input.paymentId}
  `;
  await ctx.serviceSupabase.from('audit_logs').insert({
    action: 'payment_processed',
    payment_id: input.paymentId,
  });
}
```

Core resources (always available):

  • env - Environment variables
  • shutdownSignal - AbortSignal for graceful shutdown
  • rawMessage - pgmq message metadata (msg_id, read_ct, etc.)
  • stepTask - Step execution details (flows only)

Supabase resources:

  • sql - PostgreSQL client (postgres.js)
  • anonSupabase - Client with anon key (respects RLS)
  • serviceSupabase - Client with service role (bypasses RLS)

Why context helps:

  1. Zero Configuration - No connection boilerplate
  2. Managed Resources - pgflow handles pooling and lifecycle
  3. Type Safety - Full TypeScript support
  4. Testable - Mock only what you use:

```typescript
// Handler uses only env? Test with only env:
await handler(input, { env: { API_KEY: 'test' } });
```
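Because shutdownSignal is a standard AbortSignal, long-running handlers can exit cleanly mid-work. A sketch (the ctx shape here is assumed for illustration, not copied from pgflow's types):

```typescript
// Sketch: a handler that stops processing when the worker shuts down.
// The ctx shape here is assumed for illustration.
async function processBatch(
  input: { ids: number[] },
  ctx: { shutdownSignal: AbortSignal }
): Promise<{ processed: number[] }> {
  const processed: number[] = [];
  for (const id of input.ids) {
    if (ctx.shutdownSignal.aborted) break; // graceful exit mid-batch
    processed.push(id);
  }
  return { processed };
}

// In tests, any AbortController works as a stand-in:
const controller = new AbortController();
controller.abort();
// processBatch({ ids: [1, 2, 3] }, { shutdownSignal: controller.signal })
//   resolves to { processed: [] } because the signal is already aborted.
```
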

Existing handlers continue to work. Add the context parameter when you need platform resources:

```typescript
// Old handlers work fine
async function handler(input) { return { ok: true }; }

// New handlers get context
async function handler(input, ctx) {
  const data = await ctx.sql`SELECT * FROM table`;
  return { ok: true, count: data.length };
}
```

Updated packages: @pgflow/edge-worker and @pgflow/dsl