diff --git a/packages/utils/docs/profiler.md b/packages/utils/docs/profiler.md index e5249632cb..daa8452c69 100644 --- a/packages/utils/docs/profiler.md +++ b/packages/utils/docs/profiler.md @@ -47,7 +47,6 @@ The `Profiler` class provides a clean, type-safe API for performance monitoring utils: { track: 'Utils', color: 'primary' }, core: { track: 'Core', color: 'primary-light' }, }, - enabled: true, }); ``` @@ -207,7 +206,6 @@ const profiler = new Profiler({ utils: { track: 'Utils', color: 'primary' }, core: { track: 'Core', color: 'primary-light' }, }, - enabled: true, }); // Simple measurement @@ -283,6 +281,31 @@ The profiler automatically subscribes to process events (`exit`, `SIGINT`, `SIGT The `close()` method is idempotent and safe to call from exit handlers. It unsubscribes from exit handlers, closes the WAL sink, and unsubscribes from the performance observer, ensuring all buffered performance data is written before process termination. +### Profiler Lifecycle States + +The NodeJSProfiler follows a state machine with three distinct states: + +**State Machine Flow** + +``` +idle ⇄ running + ↓ ↓ + └──→ closed +``` + +- **idle**: Profiler is initialized but not actively collecting measurements. WAL sink is closed and performance observer is unsubscribed. +- **running**: Profiler is actively collecting performance measurements. WAL sink is open and performance observer is subscribed. +- **closed**: Profiler has been closed and all buffered data has been flushed to disk. Resources have been fully released. This state is irreversible. + +**State Transitions:** + +- `idle` → `running`: Occurs when `setEnabled(true)` is called. Enables profiling, opens WAL sink, and subscribes to performance observer. +- `running` → `idle`: Occurs when `setEnabled(false)` is called. Disables profiling, unsubscribes from performance observer, and closes WAL sink (sink will be reopened on re-enable). +- `running` → `closed`: Occurs when `close()` is called. 
Disables profiling, unsubscribes, closes sink, finalizes shards, and unsubscribes exit handlers (irreversible). +- `idle` → `closed`: Occurs when `close()` is called. Closes sink if it was opened, finalizes shards, and unsubscribes exit handlers (irreversible). + +Once a state transition to `closed` occurs, there are no transitions back to previous states. This ensures data integrity and prevents resource leaks. + ## Configuration ```ts @@ -295,22 +318,86 @@ new NodejsProfiler(options: NodejsProfilerOptions` | _required_ | Function that encodes raw PerformanceEntry objects into domain-specific types | -| `captureBufferedEntries` | `boolean` | `true` | Whether to capture performance entries that occurred before observation started | -| `flushThreshold` | `number` | `20` | Threshold for triggering queue flushes based on queue length | -| `maxQueueSize` | `number` | `10_000` | Maximum number of items allowed in the queue before new entries are dropped | +| Property | Type | Default | Description | +| ------------------------ | --------------------------------------- | ---------------- | ------------------------------------------------------------------------------------- | +| `format` | `ProfilerFormat` | _required_ | WAL format configuration for sharded write-ahead logging, including `encodePerfEntry` | +| `measureName` | `string` | _auto-generated_ | Optional folder name for sharding. 
If not provided, a new group ID will be generated | +| `outDir` | `string` | `'tmp/profiles'` | Output directory for WAL shards and final files | +| `format.encodePerfEntry` | `PerformanceEntryEncoder` | _required_ | Function that encodes raw PerformanceEntry objects into domain-specific types | +| `captureBufferedEntries` | `boolean` | `true` | Whether to capture performance entries that occurred before observation started | +| `flushThreshold` | `number` | `20` | Threshold for triggering queue flushes based on queue length | +| `maxQueueSize` | `number` | `10_000` | Maximum number of items allowed in the queue before new entries are dropped | + +### Environment Variables + +The NodeJSProfiler can be configured using environment variables, which override the corresponding options when not explicitly provided: + +| Environment Variable | Type | Default | Description | +| -------------------------- | -------- | ---------------- | ---------------------------------------------------------------------------------------------------------------------------------- | +| `CP_PROFILING` | `string` | _unset_ | Enables or disables profiling globally. Set to `'true'` to enable, `'false'` or unset to disable. | +| `DEBUG` | `string` | _unset_ | Enables debug mode for profiler state transitions. When set to `'true'`, state transitions create performance marks for debugging. | +| `CP_PROFILER_OUT_DIR` | `string` | `'tmp/profiles'` | Output directory for WAL shards and final files. Overrides the `outDir` option. | +| `CP_PROFILER_MEASURE_NAME` | `string` | _auto-generated_ | Measure name used for sharding. Overrides the `measureName` option. If not provided, a new group ID will be generated. 
| + +```bash +# Enable profiling with custom output directory +CP_PROFILING=true CP_PROFILER_OUT_DIR=/path/to/profiles npm run dev + +# Enable profiling with debug mode and custom measure name +CP_PROFILING=true DEBUG=true CP_PROFILER_MEASURE_NAME=my-measure npm run dev +``` ## API Methods The NodeJSProfiler inherits all API methods from the base Profiler class and adds additional methods for queue management and WAL lifecycle control. -| Method | Description | -| ------------------------------------ | ------------------------------------------------------------------------------- | -| `getStats()` | Returns comprehensive queue statistics for monitoring and debugging. | -| `flush()` | Forces immediate writing of all queued performance entries to the WAL. | -| `setEnabled(enabled: boolean): void` | Controls profiling at runtime with automatic WAL/observer lifecycle management. | +| Method | Description | +| ------------------------------------ | ---------------------------------------------------------------------------------------- | +| `stats` | Returns comprehensive queue statistics and profiling state for monitoring and debugging. | +| `state` | Returns current profiler state (`'idle' \| 'running' \| 'closed'`). | +| `close()` | Closes profiler and releases resources. Idempotent, safe for exit handlers. | +| `flush()` | Forces immediate writing of all queued performance entries to the WAL. | +| `setEnabled(enabled: boolean): void` | Controls profiling at runtime with automatic WAL/observer lifecycle management. | + +### Profiler state + +```ts +profiler.state: 'idle' | 'running' | 'closed' +``` + +Returns the current profiler state. Use this to check the profiler's lifecycle state without accessing the full stats object. 
+ +```ts +// Check current state +if (profiler.state === 'running') { + console.log('Profiler is actively collecting measurements'); +} else if (profiler.state === 'idle') { + console.log('Profiler is initialized but not collecting'); +} else { + console.log('Profiler has been closed'); +} +``` + +### Closing the profiler + +```ts +profiler.close(): void +``` + +Closes profiler and releases resources. This method is idempotent and safe to call from exit handlers. When called, it transitions the profiler to the `closed` state, which is irreversible. All buffered data is flushed, shards are finalized, and exit handlers are unsubscribed. + +```ts +// Close profiler when done +profiler.close(); + +// Safe to call multiple times (idempotent) +profiler.close(); // No-op if already closed + +// Check if closed +if (profiler.state === 'closed') { + console.log('Profiler resources have been released'); +} +``` ### Runtime control with Write Ahead Log lifecycle management @@ -327,13 +414,23 @@ await performHeavyOperation(); profiler.setEnabled(true); // WAL reopens and observer resubscribes ``` -### Queue statistics +### Profiler statistics ```ts -profiler.getStats(): { - enabled: boolean; - observing: boolean; - walOpen: boolean; +profiler.stats: { + profilerState: 'idle' | 'running' | 'closed'; + debug: boolean; + sharderState: 'active' | 'finalized' | 'cleaned'; + shardCount: number; + groupId: string; + isCoordinator: boolean; + isFinalized: boolean; + isCleaned: boolean; + finalFilePath: string; + shardFileCount: number; + shardFiles: string[]; + shardOpen: boolean; + shardPath: string; isSubscribed: boolean; queued: number; dropped: number; @@ -345,16 +442,6 @@ profiler.getStats(): { } ``` -Returns comprehensive queue statistics for monitoring and debugging. Provides insight into the current state of the performance entry queue, useful for monitoring memory usage and processing throughput. 
- -```ts -const stats = profiler.getStats(); -console.log(`Enabled: ${stats.enabled}, WAL Open: ${stats.walOpen}, Observing: ${stats.observing}, Subscribed: ${stats.isSubscribed}, Queued: ${stats.queued}`); -if (stats.enabled && stats.walOpen && stats.observing && stats.isSubscribed && stats.queued > stats.flushThreshold) { - console.log('Queue nearing capacity, consider manual flush'); -} -``` - ### Manual flushing ```ts diff --git a/packages/utils/eslint.config.js b/packages/utils/eslint.config.js index fb34f86af5..cfb409c320 100644 --- a/packages/utils/eslint.config.js +++ b/packages/utils/eslint.config.js @@ -13,7 +13,11 @@ export default tseslint.config( }, }, { - files: ['packages/utils/src/lib/**/wal*.ts'], + files: [ + 'packages/utils/src/lib/**/wal*.ts', + 'packages/utils/src/lib/**/wal*.test.ts', + 'packages/utils/src/lib/profiler/*.test.ts', + ], rules: { 'n/no-sync': 'off', }, diff --git a/packages/utils/mocks/README.md b/packages/utils/mocks/README.md new file mode 100644 index 0000000000..da91fde913 --- /dev/null +++ b/packages/utils/mocks/README.md @@ -0,0 +1,29 @@ +# Mocks + +## multiprocess-profiling + +The `profiler-worker.mjs` script demonstrates multiprocess profiling by spawning N child processes that perform work and generate performance traces. + +### Expected Output + +**Console:** + +- JSON object containing profiler statistics (profiler state, shard info, queue stats, etc.) 
+ +**Files:** + +- A timestamped directory in `CP_PROFILER_OUT_DIR` (e.g., `20260131-210017-052/`) + - `trace....jsonl` - WAL format trace files (one per process) + - `trace..json` - Consolidated trace file in Chrome DevTools format + +### Usage + +```bash +CP_PROFILING=true DEBUG=true CP_PROFILER_OUT_DIR=/path/to/output npx tsx packages/utils/mocks/multiprocess-profiling/profiler-worker.mjs +``` + +**Example:** + +```bash + CP_PROFILING=true DEBUG=true CP_PROFILER_OUT_DIR=./tmp/int/utils npx tsx --tsconfig tsconfig.base.json packages/utils/mocks/multiprocess-profiling/profiler-worker.mjs 3 +``` diff --git a/packages/utils/mocks/fixtures/minimal-trace-async-events-user-timing-devtools-colors.json b/packages/utils/mocks/fixtures/minimal-trace-async-events-user-timing-devtools-colors.json new file mode 100644 index 0000000000..6efadb1a85 --- /dev/null +++ b/packages/utils/mocks/fixtures/minimal-trace-async-events-user-timing-devtools-colors.json @@ -0,0 +1,104 @@ +{ + "traceEvents": [ + { + "cat": "disabled-by-default-devtools.timeline", + "name": "TracingStartedInBrowser", + "ph": "I", + "pid": 1, + "tid": 0, + "ts": 1, + "s": "t", + "args": { + "data": { + "frames": [ + { + "processId": 1, + "url": "file://has-to-be-a-valid-URL-pattern" + } + ] + } + } + }, + { + "args": { + "description": "Artificial RunTask event to mark end of the trace" + }, + "cat": "devtools.timeline", + "dur": 10, + "name": "RunTask", + "ph": "X", + "pid": 1, + "tid": 0, + "ts": 1 + }, + { + "args": { + "data": { + "detail": "{\"devtools\":{\"dataType\":\"track-entry\",\"track\":\"CustomT\",\"trackGroup\":\"CustomG\"}}" + } + }, + "cat": "blink.user_timing", + "name": "measure-1:start", + "id2": { "local": "0x2" }, + "s": "t", + "ph": "I", + "pid": 1, + "tid": 1, + "ts": 44 + }, + + { + "cat": "blink.user_timing", + "s": "t", + "ph": "b", + "name": "measure-1", + "pid": 1, + "tid": 1, + "ts": 45, + "id2": { "local": "0x3" }, + "args": { + "detail": 
"{\"devtools\":{\"dataType\":\"track-entry\",\"track\":\"CustomT\",\"trackGroup\":\"CustomG\"}}" + } + }, + { + "cat": "blink.user_timing", + "s": "t", + "ph": "e", + "name": "measure-1", + "pid": 1, + "tid": 1, + "ts": 65, + "id2": { "local": "0x3" }, + "args": { + "detail": "{\"devtools\":{\"dataType\":\"track-entry\",\"track\":\"CustomT\",\"trackGroup\":\"CustomG\"}}" + } + }, + { + "args": { + "data": { + "detail": "{\"devtools\":{\"dataType\":\"marker\",\"track\":\"CustomT\",\"trackGroup\":\"CustomG\"}}" + } + }, + "cat": "blink.user_timing", + "name": "measure-1:end", + "id2": { "local": "0x5" }, + "s": "t", + "ph": "I", + "pid": 1, + "tid": 1, + "ts": 66 + }, + { + "args": { + "description": "Artificial RunTask event to mark end of the trace" + }, + "cat": "devtools.timeline", + "dur": 10, + "name": "RunTask", + "ph": "X", + "pid": 1, + "tid": 0, + "ts": 165 + } + ] +} diff --git a/packages/utils/mocks/multiprocess-profiling/profiler-worker-child.mjs b/packages/utils/mocks/multiprocess-profiling/profiler-worker-child.mjs new file mode 100644 index 0000000000..0125c132e4 --- /dev/null +++ b/packages/utils/mocks/multiprocess-profiling/profiler-worker-child.mjs @@ -0,0 +1,14 @@ +import { NodejsProfiler } from '../../src/lib/profiler/profiler-node.js'; +import { + createBufferedEvents, + getProfilerConfig, + performDummyWork, +} from './utils.js'; + +await createBufferedEvents(); + +const profiler = new NodejsProfiler(getProfilerConfig()); + +await performDummyWork(profiler); + +profiler.close(); diff --git a/packages/utils/mocks/multiprocess-profiling/profiler-worker.mjs b/packages/utils/mocks/multiprocess-profiling/profiler-worker.mjs new file mode 100644 index 0000000000..2ab9f96eff --- /dev/null +++ b/packages/utils/mocks/multiprocess-profiling/profiler-worker.mjs @@ -0,0 +1,77 @@ +import { spawn } from 'node:child_process'; +import path from 'node:path'; +import { fileURLToPath } from 'node:url'; +import { NodejsProfiler } from 
'../../src/lib/profiler/profiler-node.js'; +import { createBufferedEvents, getProfilerConfig } from './utils.js'; + +const [numProcesses] = process.argv.slice(2); + +if (!numProcesses) { + console.error('Usage: node profiler-worker.mjs '); + // eslint-disable-next-line unicorn/no-process-exit,n/no-process-exit + process.exit(1); +} + +const numProcs = Number.parseInt(numProcesses, 10); +if (Number.isNaN(numProcs) || numProcs < 1) { + console.error('numProcesses must be a positive integer'); + // eslint-disable-next-line unicorn/no-process-exit,n/no-process-exit + process.exit(1); +} + +const workerScriptPath = path.join( + fileURLToPath(path.dirname(import.meta.url)), + './profiler-worker-child.mjs', +); + +let profiler; +try { + await createBufferedEvents(); + + profiler = new NodejsProfiler(getProfilerConfig()); + + await profiler.measureAsync('profiler-worker', async () => { + const processes = Array.from( + { length: numProcs }, + (_, i) => + new Promise((resolve, reject) => { + const child = spawn('npx', ['tsx', workerScriptPath], { + stdio: 'pipe', + shell: process.platform === 'win32', + }); + + child.on('close', code => { + if (code === 0) { + resolve(code); + } else { + reject(new Error(`Process ${i + 1} exited with code ${code}`)); + } + }); + + child.on('error', reject); + }), + ); + await Promise.all(processes); + }); + + profiler.close(); + // eslint-disable-next-line no-console + console.log(JSON.stringify(profiler.stats, null, 2)); +} catch (error) { + // Ensure profiler is closed and stats are output even on error + if (profiler && profiler.stats.profilerState !== 'closed') { + profiler.close(); + } + // Output stats if profiler was initialized, otherwise exit with error + if (profiler) { + // eslint-disable-next-line no-console + console.log(JSON.stringify(profiler.stats, null, 2)); + // Exit successfully since we've output the stats that the test needs + // eslint-disable-next-line unicorn/no-process-exit,n/no-process-exit + process.exit(0); + } 
else { + console.error('Failed to initialize profiler:', error); + // eslint-disable-next-line unicorn/no-process-exit,n/no-process-exit + process.exit(1); + } +} diff --git a/packages/utils/mocks/multiprocess-profiling/utils.ts b/packages/utils/mocks/multiprocess-profiling/utils.ts new file mode 100644 index 0000000000..91ca6352e6 --- /dev/null +++ b/packages/utils/mocks/multiprocess-profiling/utils.ts @@ -0,0 +1,115 @@ +import { + NodejsProfiler, + type NodejsProfilerOptionsWithFormat, +} from '../../src/lib/profiler/profiler-node.js'; +import { entryToTraceEvents } from '../../src/lib/profiler/trace-file-utils.js'; +import type { TraceEvent } from '../../src/lib/profiler/trace-file.type.js'; +import { traceEventWalFormat } from '../../src/lib/profiler/wal-json-trace.js'; +import { + asOptions, + trackEntryPayload, +} from '../../src/lib/user-timing-extensibility-api-utils.js'; +import type { + ActionTrackEntryPayload, + TrackMeta, +} from '../../src/lib/user-timing-extensibility-api.type.js'; + +export function getTrackConfig(): TrackMeta { + return { + track: `Track: ${process.pid}`, + trackGroup: 'Multiprocess', + }; +} + +/** + * Default profiler configuration for multiprocess profiling mocks + */ +export function getProfilerConfig( + options?: Partial< + NodejsProfilerOptionsWithFormat< + TraceEvent, + Record + > + >, +): NodejsProfilerOptionsWithFormat< + TraceEvent, + Record +> { + return { + format: { + ...traceEventWalFormat, + encodePerfEntry: entryToTraceEvents, + }, + ...getTrackConfig(), + ...options, + }; +} + +/** + * Creates buffered performance marks and measures before profiler initialization + */ +export async function createBufferedEvents(): Promise { + const bM1 = `buffered-mark-${process.pid}`; + performance.mark( + bM1, + asOptions( + trackEntryPayload({ + ...getTrackConfig(), + color: 'tertiary', + }), + ), + ); + const intervalDelay = Math.floor(Math.random() * 50) + 25; + await new Promise(resolve => setTimeout(resolve, intervalDelay)); 
+ performance.measure(`buffered-${process.pid}`, { + start: bM1, + ...asOptions( + trackEntryPayload({ + ...getTrackConfig(), + color: 'tertiary', + }), + ), + }); +} + +/** + * Performs dummy work with random intervals and work packages + */ +export async function performDummyWork( + profiler: NodejsProfiler, +): Promise { + profiler.marker(`process-${process.pid}:process-start`, { + tooltipText: `Process ${process.pid} started`, + }); + + // Random number of intervals (1-3) - reduced from 2-5 + const numIntervals = Math.floor(Math.random() * 3) + 1; + + // eslint-disable-next-line functional/no-loop-statements + for (let interval = 0; interval < numIntervals; interval++) { + // Random interval delay (25-100ms) + const intervalDelay = Math.floor(Math.random() * 75) + 25; + await new Promise(resolve => setTimeout(resolve, intervalDelay)); + + // Random number of work packages per interval (1-3) + const numWorkPackages = Math.floor(Math.random() * 3) + 1; + + // eslint-disable-next-line functional/no-loop-statements + for (let pkg = 0; pkg < numWorkPackages; pkg++) { + // Random work size (100-2,500,000 elements) + const workSize = Math.floor(Math.random() * 2_500_000); + + profiler.measure( + `process-${process.pid}:interval-${interval}:work-${pkg}`, + () => { + const arr = Array.from({ length: workSize }, (_, i) => i); + return arr.reduce((sum, x) => sum + x * Math.random(), 0); + }, + ); + } + } + + profiler.marker(`process-${process.pid}:process-end`, { + tooltipText: `Process ${process.pid} completed ${numIntervals} intervals`, + }); +} diff --git a/packages/utils/mocks/omit-trace-json.ts b/packages/utils/mocks/omit-trace-json.ts new file mode 100644 index 0000000000..075e4426d3 --- /dev/null +++ b/packages/utils/mocks/omit-trace-json.ts @@ -0,0 +1,189 @@ +import * as fs from 'node:fs/promises'; +import path from 'node:path'; +import { + decodeEvent, + encodeEvent, + frameName, + frameTreeNodeId, +} from '../src/lib/profiler/trace-file-utils.js'; +import type { 
+ TraceEvent, + TraceEventContainer, + TraceMetadata, +} from '../src/lib/profiler/trace-file.type'; + +const BASE_TS = 1_700_000_005_000_000; +const FIXED_TIME = '2026-01-28T14:29:27.995Z'; + +const read = (p: string) => fs.readFile(p, 'utf8').then(s => s.trim()); +const parseJsonl = (s: string) => + s + .split('\n') + .filter(Boolean) + .map(l => JSON.parse(l)); +const parseDecodeJsonl = (s: string) => parseJsonl(s).map(decodeEvent); + +const normMeta = ( + m?: TraceMetadata | Record, + keepGen = true, +): TraceMetadata | undefined => + m + ? ({ + ...(keepGen + ? m + : Object.fromEntries( + Object.entries(m).filter(([k]) => k !== 'generatedAt'), + )), + startTime: FIXED_TIME, + ...(keepGen && { generatedAt: FIXED_TIME }), + } as TraceMetadata) + : undefined; + +const normalizeDetail = (d: unknown): unknown => { + const o = + typeof d === 'string' + ? JSON.parse(d) + : typeof d === 'object' && d + ? d + : null; + const props = o?.devtools?.properties; + if (!Array.isArray(props)) return d; + + const isTransition = props.some( + e => Array.isArray(e) && e[0] === 'Transition', + ); + + return { + ...o, + devtools: { + ...o.devtools, + properties: props.map(e => { + if (!Array.isArray(e) || typeof e[0] !== 'string') return e; + const [k, v] = e; + if (isTransition) { + if (k.toLowerCase() === 'groupid') return [k, 'group-id']; + if (k.toLowerCase().includes('path')) + return [k, `path/to/${path.basename(String(v))}`]; + } + if (k.includes('Path') || k.includes('Files')) + return [ + k, + Array.isArray(v) + ? 
v.map(x => path.basename(String(x))) + : path.basename(String(v)), + ]; + return e; + }), + }, + }; +}; + +/* ───────────── Context ───────────── */ +const uniq = (v: (T | undefined)[]) => [ + ...new Set(v.filter(Boolean) as T[]), +]; +const ctx = (e: TraceEvent[], base = BASE_TS) => ({ + pid: new Map( + [...uniq(e.map(x => x.pid))].sort().map((v, i) => [v, 10_001 + i]), + ), + tid: new Map([...uniq(e.map(x => x.tid))].sort().map((v, i) => [v, i + 1])), + ts: new Map( + [...uniq(e.map(x => x.ts))].sort().map((v, i) => [v, base + i * 100]), + ), + id: new Map( + [...uniq(e.map(x => x.id2?.local))] + .sort() + .map((v, i) => [v, `0x${(i + 1).toString(16)}`]), + ), +}); + +const mapIf = (v: T | undefined, m: Map, k: string) => + v != null && m.has(v) ? { [k]: m.get(v)! } : {}; + +const normalizeEvent = ( + e: TraceEvent, + c: ReturnType, +): TraceEvent => { + const pid = c.pid.get(e.pid) ?? e.pid; + const tid = c.tid.get(e.tid) ?? e.tid; + + const args = e.args && { + ...e.args, + ...(e.args.detail !== undefined && { + detail: normalizeDetail(e.args.detail), + }), + ...(e.args.data && + typeof e.args.data === 'object' && { + data: { + ...(e.args.data as any), + ...(pid && + tid && + 'frameTreeNodeId' in e.args.data && { + frameTreeNodeId: frameTreeNodeId(pid, tid), + }), + ...(Array.isArray((e.args.data as any).frames) && + pid && + tid && { + frames: (e.args.data as any).frames.map((f: any) => ({ + ...f, + processId: pid, + frame: frameName(pid, tid), + })), + }), + }, + }), + }; + + return { + ...e, + ...mapIf(e.pid, c.pid, 'pid'), + ...mapIf(e.tid, c.tid, 'tid'), + ...mapIf(e.ts, c.ts, 'ts'), + ...(e.id2?.local && + c.id.has(e.id2.local) && { + id2: { ...e.id2, local: c.id.get(e.id2.local)! 
}, + }), + ...(args && { args }), + }; +}; + +export const normalizeTraceEvents = ( + events: TraceEvent[], + { baseTimestampUs = BASE_TS } = {}, +) => { + if (events.length === 0) return []; + const decoded = events.map(decodeEvent); + const c = ctx(decoded, baseTimestampUs); + return decoded.map(e => normalizeEvent(e, c)); +}; + +export const normalizeAndFormatEvents = ( + input: TraceEvent[] | string, + opts?: { baseTimestampUs: number }, +) => + typeof input === 'string' + ? input.trim() + ? normalizeTraceEvents(parseJsonl(input).map(decodeEvent), opts) + .map(encodeEvent) + .map(o => JSON.stringify(o)) + .join('\n') + (input.endsWith('\n') ? '\n' : '') + : input + : normalizeTraceEvents(input, opts); + +export const loadAndOmitTraceJsonl = (p: `${string}.jsonl`, o?: any) => + read(p).then(s => normalizeAndFormatEvents(parseDecodeJsonl(s), o)); + +export const loadAndOmitTraceJson = async ( + p: string, + o?: { baseTimestampUs: number }, +): Promise => { + const j = JSON.parse(await read(p)); + if (!j?.traceEvents) return { traceEvents: [] }; + const r = { + traceEvents: normalizeAndFormatEvents(j.traceEvents.map(decodeEvent), o), + ...(j.displayTimeUnit && { displayTimeUnit: j.displayTimeUnit }), + ...(j.metadata && { metadata: normMeta(j.metadata) }), + }; + JSON.stringify(r); + return r; +}; diff --git a/packages/utils/mocks/omit-trace-json.unit.test.ts b/packages/utils/mocks/omit-trace-json.unit.test.ts new file mode 100644 index 0000000000..6b40b3c328 --- /dev/null +++ b/packages/utils/mocks/omit-trace-json.unit.test.ts @@ -0,0 +1,405 @@ +import { vol } from 'memfs'; +import { expect } from 'vitest'; +import { MEMFS_VOLUME } from '@code-pushup/test-utils'; +import type { TraceEvent } from '../src/lib/profiler/trace-file.type'; +import { + loadAndOmitTraceJson, + loadAndOmitTraceJsonl, + normalizeAndFormatEvents, +} from './omit-trace-json.js'; + +describe('normalizeAndFormatEvents', () => { + it('should return empty string unchanged', () => { + 
expect(normalizeAndFormatEvents('')).toBe(''); + }); + + it('should return whitespace-only string unchanged', () => { + expect(normalizeAndFormatEvents(' \n\t ')).toBe(' \n\t '); + }); + + it('should return empty JSONL unchanged', () => { + expect(normalizeAndFormatEvents('\n\n')).toBe('\n\n'); + }); + + it('should normalize single event with all fields', () => { + expect( + normalizeAndFormatEvents( + '{"pid":12345,"tid":999,"ts":1234567890,"id2":{"local":"0xabc123"},"name":"test"}\n', + ), + ).toBe( + '{"pid":10001,"tid":1,"ts":1700000005000000,"id2":{"local":"0x1"},"name":"test"}\n', + ); + }); + + it('should normalize ts field with custom baseTimestampUs', () => { + const customBase = 2_000_000_000_000_000; + expect( + normalizeAndFormatEvents('{"ts":1234567890}\n', { + baseTimestampUs: customBase, + }), + ).toBe('{"ts":2000000000000000}\n'); + }); + + it('should preserve event order when timestamps are out of order', () => { + const input = + '{"ts":300,"name":"third"}\n{"ts":100,"name":"first"}\n{"ts":200,"name":"second"}\n'; + expect(normalizeAndFormatEvents(input)).toBe( + '{"ts":1700000005000200,"name":"third"}\n{"ts":1700000005000000,"name":"first"}\n{"ts":1700000005000100,"name":"second"}\n', + ); + }); + + it('should preserve event order when PIDs are out of order', () => { + const input = + '{"pid":300,"name":"third"}\n{"pid":100,"name":"first"}\n{"pid":200,"name":"second"}\n'; + expect(normalizeAndFormatEvents(input)).toBe( + '{"pid":10003,"name":"third"}\n{"pid":10001,"name":"first"}\n{"pid":10002,"name":"second"}\n', + ); + }); + + it('should handle decoding of instantEvents with args.data.detail', () => { + const rawInstantEvent: TraceEvent = { + cat: 'blink.user_timing', + ph: 'i', + name: 'plugin-eslint:run-eslint:start', + pid: 8057, + tid: 0, + ts: 1_769_814_970_883_535, + args: { + data: { + detail: + '{"devtools":{"dataType":"track-entry","track":"External","trackGroup":"<✓> Code PushUp","color":"secondary"}}', + }, + }, + }; + + 
expect(normalizeAndFormatEvents([rawInstantEvent])).toStrictEqual([ + { + cat: 'blink.user_timing', + ph: 'i', + name: 'plugin-eslint:run-eslint:start', + pid: 10_001, + tid: 0, + ts: 1_700_000_005_000_000, + args: { + data: { + detail: { + devtools: { + dataType: 'track-entry', + track: 'External', + trackGroup: '<✓> Code PushUp', + color: 'secondary', + }, + }, + }, + }, + }, + ]); + }); + + it('should handle decoding of spanEvents with args.detail', () => { + const rawSpanEvent = { + cat: 'blink.user_timing', + s: 't', + ph: 'b' as const, + name: 'plugin-eslint:run-eslint', + pid: 8057, + tid: 0, + ts: 1_769_814_970_883_536, + id2: { local: '0x3' }, + args: { + detail: + '{"devtools":{"dataType":"track-entry","track":"External","trackGroup":"<✓> Code PushUp","color":"secondary"}}', + }, + } as TraceEvent; + + expect(normalizeAndFormatEvents([rawSpanEvent])).toStrictEqual([ + { + cat: 'blink.user_timing', + s: 't', + ph: 'b', + name: 'plugin-eslint:run-eslint', + pid: 10_001, + tid: 0, + ts: 1_700_000_005_000_000, + id2: { local: '0x1' }, + args: { + detail: { + devtools: { + dataType: 'track-entry', + track: 'External', + trackGroup: '<✓> Code PushUp', + color: 'secondary', + }, + }, + }, + }, + ]); + }); + + it('should handle events with frame normalization', () => { + const rawEvent = { + cat: 'devtools.timeline', + s: 't', + ph: 'i' as const, + name: 'TracingStartedInBrowser', + pid: 8057, + tid: 0, + ts: 1_769_814_970_882_268, + args: { + data: { + frameTreeNodeId: 805_700, + frames: [ + { + frame: 'FRAME0P8057T0', + isInPrimaryMainFrame: true, + processId: 8057, + url: 'trace.json', + }, + ], + }, + }, + } as TraceEvent; + + expect(normalizeAndFormatEvents([rawEvent])).toStrictEqual([ + { + cat: 'devtools.timeline', + s: 't', + ph: 'i', + name: 'TracingStartedInBrowser', + pid: 10_001, + tid: 0, + ts: 1_700_000_005_000_000, + args: { + data: { + frameTreeNodeId: 805_700, + frames: [ + { + frame: 'FRAME0P8057T0', + isInPrimaryMainFrame: true, + processId: 
8057, + url: 'trace.json', + }, + ], + }, + }, + }, + ]); + }); + + it('should handle multiple events with different pid/tid/ts/id2', () => { + const events = [ + { + cat: 'test', + ph: 'i' as const, + pid: 100, + tid: 5, + ts: 100, + name: 'first', + }, + { + cat: 'test', + ph: 'b' as const, + pid: 200, + tid: 3, + ts: 300, + name: 'second', + id2: { local: '0xabc' }, + }, + { + cat: 'test', + ph: 'b' as const, + pid: 150, + tid: 7, + ts: 200, + name: 'third', + id2: { local: '0xdef' }, + }, + ] as TraceEvent[]; + + expect(normalizeAndFormatEvents(events)).toStrictEqual([ + { + cat: 'test', + ph: 'i', + pid: 10_001, + tid: 2, + ts: 1_700_000_005_000_000, + name: 'first', + }, // pid 100->10001, tid 5->2 (sorted: 3->1, 5->2, 7->3) + { + cat: 'test', + ph: 'b', + pid: 10_003, + tid: 1, + ts: 1_700_000_005_000_200, + name: 'second', + id2: { local: '0x1' }, + }, // pid 200->10003, tid 3->1 + { + cat: 'test', + ph: 'b', + pid: 10_002, + tid: 3, + ts: 1_700_000_005_000_100, + name: 'third', + id2: { local: '0x2' }, + }, // pid 150->10002, tid 7->3 + ]); + }); + + it('should handle empty array', () => { + expect(normalizeAndFormatEvents([])).toStrictEqual([]); + }); + + it('should handle events with both args.detail and args.data.detail', () => { + const rawEvent: TraceEvent = { + cat: 'blink.user_timing', + ph: 'i', + name: 'test', + pid: 8057, + tid: 0, + ts: 1_769_814_970_883_535, + args: { + detail: '{"type":"mark"}', + data: { detail: '{"type":"span"}' }, + }, + }; + + expect(normalizeAndFormatEvents([rawEvent])).toStrictEqual([ + { + cat: 'blink.user_timing', + ph: 'i', + name: 'test', + pid: 10_001, + tid: 0, + ts: 1_700_000_005_000_000, + args: { + detail: { type: 'mark' }, + data: { detail: { type: 'span' } }, + }, + }, + ]); + }); +}); + +describe('loadAndOmitTraceJsonl', () => { + it('should load and normalize JSONL file', async () => { + vol.fromJSON( + { + 'trace.jsonl': + 
'{"pid":12345,"tid":999,"ts":1234567890,"name":"test"}\n{"pid":54321,"tid":888,"ts":9876543210,"name":"test2"}\n', + }, + MEMFS_VOLUME, + ); + + await expect(loadAndOmitTraceJsonl('trace.jsonl')).resolves.toStrictEqual([ + { pid: 10_001, tid: 2, ts: 1_700_000_005_000_000, name: 'test' }, // tid 999 maps to 2 (sorted: 888->1, 999->2) + { pid: 10_002, tid: 1, ts: 1_700_000_005_000_100, name: 'test2' }, // tid 888 maps to 1 + ]); + }); + + it('should decode args.detail and args.data.detail from JSONL', async () => { + vol.fromJSON( + { + 'trace.jsonl': + '{"pid":8057,"tid":0,"ts":1769814970883535,"args":{"data":{"detail":"{\\"devtools\\":{\\"dataType\\":\\"track-entry\\"}}"}}}\n{"pid":8057,"tid":0,"ts":1769814970883536,"args":{"detail":"{\\"devtools\\":{\\"dataType\\":\\"track-entry\\"}}"}}\n', + }, + MEMFS_VOLUME, + ); + + await expect(loadAndOmitTraceJsonl('trace.jsonl')).resolves.toStrictEqual([ + { + pid: 10_001, + tid: 0, + ts: 1_700_000_005_000_000, + args: { data: { detail: { devtools: { dataType: 'track-entry' } } } }, + }, + { + pid: 10_001, + tid: 0, + ts: 1_700_000_005_000_100, + args: { detail: { devtools: { dataType: 'track-entry' } } }, + }, + ]); + }); + + it('should use custom baseTimestampUs', async () => { + vol.fromJSON( + { + 'trace.jsonl': '{"ts":1234567890}\n', + }, + MEMFS_VOLUME, + ); + + await expect( + loadAndOmitTraceJsonl('trace.jsonl', { + baseTimestampUs: 2_000_000_000_000_000, + }), + ).resolves.toStrictEqual([{ ts: 2_000_000_000_000_000 }]); + }); +}); + +describe('loadAndOmitTraceJson', () => { + it('should load and normalize single trace container', async () => { + vol.fromJSON( + { + 'trace.json': JSON.stringify({ + traceEvents: [ + { pid: 8057, tid: 0, ts: 1_769_814_970_882_268, name: 'test' }, + ], + }), + }, + MEMFS_VOLUME, + ); + + await expect(loadAndOmitTraceJson('trace.json')).resolves.toStrictEqual({ + traceEvents: [ + { pid: 10_001, tid: 0, ts: 1_700_000_005_000_000, name: 'test' }, + ], + }); + }); + + it('should normalize 
metadata timestamps', async () => { + vol.fromJSON( + { + 'trace.json': JSON.stringify({ + metadata: { + generatedAt: '2025-01-01T00:00:00.000Z', + startTime: '2025-01-01T00:00:00.000Z', + other: 'value', + }, + traceEvents: [], + }), + }, + MEMFS_VOLUME, + ); + + const result = await loadAndOmitTraceJson('trace.json'); + expect(result).toStrictEqual({ + traceEvents: [], + metadata: { + generatedAt: '2026-01-28T14:29:27.995Z', + startTime: '2026-01-28T14:29:27.995Z', + other: 'value', + }, + }); + }); + + it('should use custom baseTimestampUs', async () => { + vol.fromJSON( + { + 'trace.json': JSON.stringify({ + traceEvents: [{ ts: 1_234_567_890 }], + }), + }, + MEMFS_VOLUME, + ); + + await expect( + loadAndOmitTraceJson('trace.json', { + baseTimestampUs: 2_000_000_000_000_000, + }), + ).resolves.toStrictEqual({ + traceEvents: [{ ts: 2_000_000_000_000_000 }], + }); + }); +}); diff --git a/packages/utils/src/lib/errors.unit.test.ts b/packages/utils/src/lib/errors.unit.test.ts index 6424819ae6..5534dacbe4 100644 --- a/packages/utils/src/lib/errors.unit.test.ts +++ b/packages/utils/src/lib/errors.unit.test.ts @@ -1,7 +1,7 @@ import ansis from 'ansis'; import { z } from 'zod'; import { SchemaValidationError } from '@code-pushup/models'; -import { stringifyError } from './errors.js'; +import { extendError, stringifyError } from './errors.js'; describe('stringifyError', () => { it('should use only message from plain Error instance', () => { @@ -113,3 +113,25 @@ describe('stringifyError', () => { ).toBe(`SchemaValidationError: Invalid ${ansis.bold('User')} […]`); }); }); + +describe('extendError', () => { + it('adds message, appends original error, and keeps cause', () => { + const original = new Error('boom'); + + const extended = extendError(original, 'wrap failed', { + appendOriginalMessage: true, + }); + + expect(extended.message).toBe('wrap failed\nboom'); + expect(extended.cause).toBe(original); + }); + + it('uses only the provided message by default', () => { + 
const original = new Error('boom'); + + const extended = extendError(original, 'wrap failed'); + + expect(extended.message).toBe('wrap failed'); + expect(extended.cause).toBe(original); + }); +}); diff --git a/packages/utils/src/lib/exit-process.int.test.ts b/packages/utils/src/lib/exit-process.int.test.ts index d915f6317f..ce64f675c5 100644 --- a/packages/utils/src/lib/exit-process.int.test.ts +++ b/packages/utils/src/lib/exit-process.int.test.ts @@ -48,8 +48,10 @@ describe('subscribeProcessExit', () => { (process as any).emit('uncaughtException', testError); - expect(onError).toHaveBeenCalledWith(testError, 'uncaughtException'); - expect(onError).toHaveBeenCalledOnce(); + expect(onError).toHaveBeenCalledExactlyOnceWith( + testError, + 'uncaughtException', + ); expect(onExit).not.toHaveBeenCalled(); }); @@ -60,8 +62,10 @@ describe('subscribeProcessExit', () => { (process as any).emit('unhandledRejection', testReason); - expect(onError).toHaveBeenCalledWith(testReason, 'unhandledRejection'); - expect(onError).toHaveBeenCalledOnce(); + expect(onError).toHaveBeenCalledExactlyOnceWith( + testReason, + 'unhandledRejection', + ); expect(onExit).not.toHaveBeenCalled(); }); @@ -70,8 +74,7 @@ describe('subscribeProcessExit', () => { (process as any).emit('SIGINT'); - expect(onExit).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(SIGNAL_EXIT_CODES().SIGINT, { + expect(onExit).toHaveBeenCalledExactlyOnceWith(SIGNAL_EXIT_CODES().SIGINT, { kind: 'signal', signal: 'SIGINT', }); @@ -83,11 +86,13 @@ describe('subscribeProcessExit', () => { (process as any).emit('SIGTERM'); - expect(onExit).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(SIGNAL_EXIT_CODES().SIGTERM, { - kind: 'signal', - signal: 'SIGTERM', - }); + expect(onExit).toHaveBeenCalledExactlyOnceWith( + SIGNAL_EXIT_CODES().SIGTERM, + { + kind: 'signal', + signal: 'SIGTERM', + }, + ); expect(onError).not.toHaveBeenCalled(); }); @@ -96,11 +101,13 @@ describe('subscribeProcessExit', () => { 
(process as any).emit('SIGQUIT'); - expect(onExit).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(SIGNAL_EXIT_CODES().SIGQUIT, { - kind: 'signal', - signal: 'SIGQUIT', - }); + expect(onExit).toHaveBeenCalledExactlyOnceWith( + SIGNAL_EXIT_CODES().SIGQUIT, + { + kind: 'signal', + signal: 'SIGQUIT', + }, + ); expect(onError).not.toHaveBeenCalled(); }); @@ -109,8 +116,7 @@ describe('subscribeProcessExit', () => { (process as any).emit('exit', 0); - expect(onExit).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(0, { kind: 'exit' }); + expect(onExit).toHaveBeenCalledExactlyOnceWith(0, { kind: 'exit' }); expect(onError).not.toHaveBeenCalled(); expect(processExitSpy).not.toHaveBeenCalled(); }); @@ -120,8 +126,7 @@ describe('subscribeProcessExit', () => { (process as any).emit('exit', 1); - expect(onExit).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(1, { kind: 'exit' }); + expect(onExit).toHaveBeenCalledExactlyOnceWith(1, { kind: 'exit' }); expect(onError).not.toHaveBeenCalled(); expect(processExitSpy).not.toHaveBeenCalled(); }); diff --git a/packages/utils/src/lib/exit-process.unit.test.ts b/packages/utils/src/lib/exit-process.unit.test.ts index 3226e650c5..d60ac2ffb5 100644 --- a/packages/utils/src/lib/exit-process.unit.test.ts +++ b/packages/utils/src/lib/exit-process.unit.test.ts @@ -49,8 +49,10 @@ describe('subscribeProcessExit', () => { (process as any).emit('uncaughtException', testError); - expect(onError).toHaveBeenCalledWith(testError, 'uncaughtException'); - expect(onError).toHaveBeenCalledOnce(); + expect(onError).toHaveBeenCalledExactlyOnceWith( + testError, + 'uncaughtException', + ); expect(onExit).not.toHaveBeenCalled(); }); @@ -61,8 +63,10 @@ describe('subscribeProcessExit', () => { (process as any).emit('unhandledRejection', testReason); - expect(onError).toHaveBeenCalledWith(testReason, 'unhandledRejection'); - expect(onError).toHaveBeenCalledOnce(); + expect(onError).toHaveBeenCalledExactlyOnceWith( + 
testReason, + 'unhandledRejection', + ); expect(onExit).not.toHaveBeenCalled(); }); @@ -73,8 +77,7 @@ describe('subscribeProcessExit', () => { (process as any).emit('SIGINT'); - expect(onExit).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(SIGNAL_EXIT_CODES().SIGINT, { + expect(onExit).toHaveBeenCalledExactlyOnceWith(SIGNAL_EXIT_CODES().SIGINT, { kind: 'signal', signal: 'SIGINT', }); @@ -89,11 +92,13 @@ describe('subscribeProcessExit', () => { (process as any).emit('SIGTERM'); - expect(onExit).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(SIGNAL_EXIT_CODES().SIGTERM, { - kind: 'signal', - signal: 'SIGTERM', - }); + expect(onExit).toHaveBeenCalledExactlyOnceWith( + SIGNAL_EXIT_CODES().SIGTERM, + { + kind: 'signal', + signal: 'SIGTERM', + }, + ); expect(onError).not.toHaveBeenCalled(); expect(processExitSpy).toHaveBeenCalledWith(SIGNAL_EXIT_CODES().SIGTERM); }); @@ -105,11 +110,13 @@ describe('subscribeProcessExit', () => { (process as any).emit('SIGQUIT'); - expect(onExit).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(SIGNAL_EXIT_CODES().SIGQUIT, { - kind: 'signal', - signal: 'SIGQUIT', - }); + expect(onExit).toHaveBeenCalledExactlyOnceWith( + SIGNAL_EXIT_CODES().SIGQUIT, + { + kind: 'signal', + signal: 'SIGQUIT', + }, + ); expect(onError).not.toHaveBeenCalled(); expect(processExitSpy).toHaveBeenCalledWith(SIGNAL_EXIT_CODES().SIGQUIT); }); @@ -121,8 +128,7 @@ describe('subscribeProcessExit', () => { (process as any).emit('SIGINT'); - expect(onExit).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(SIGNAL_EXIT_CODES().SIGINT, { + expect(onExit).toHaveBeenCalledExactlyOnceWith(SIGNAL_EXIT_CODES().SIGINT, { kind: 'signal', signal: 'SIGINT', }); @@ -135,11 +141,13 @@ describe('subscribeProcessExit', () => { (process as any).emit('SIGTERM'); - expect(onExit).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(SIGNAL_EXIT_CODES().SIGTERM, { - kind: 'signal', - signal: 'SIGTERM', - }); + 
expect(onExit).toHaveBeenCalledExactlyOnceWith( + SIGNAL_EXIT_CODES().SIGTERM, + { + kind: 'signal', + signal: 'SIGTERM', + }, + ); expect(onError).not.toHaveBeenCalled(); expect(processExitSpy).not.toHaveBeenCalled(); }); @@ -150,8 +158,7 @@ describe('subscribeProcessExit', () => { const exitCode = 42; (process as any).emit('exit', exitCode); - expect(onExit).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(exitCode, { kind: 'exit' }); + expect(onExit).toHaveBeenCalledExactlyOnceWith(exitCode, { kind: 'exit' }); expect(onError).not.toHaveBeenCalled(); expect(processExitSpy).not.toHaveBeenCalled(); }); @@ -165,13 +172,14 @@ describe('subscribeProcessExit', () => { (process as any).emit('uncaughtException', testError); - expect(onError).toHaveBeenCalledWith(testError, 'uncaughtException'); - expect(onError).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(1, { + expect(onError).toHaveBeenCalledExactlyOnceWith( + testError, + 'uncaughtException', + ); + expect(onExit).toHaveBeenCalledExactlyOnceWith(1, { kind: 'fatal', fatal: 'uncaughtException', }); - expect(onExit).toHaveBeenCalledOnce(); }); it('should use custom fatalExitCode when exitOnFatal is true', () => { @@ -188,13 +196,14 @@ describe('subscribeProcessExit', () => { (process as any).emit('uncaughtException', testError); - expect(onError).toHaveBeenCalledWith(testError, 'uncaughtException'); - expect(onError).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(42, { + expect(onError).toHaveBeenCalledExactlyOnceWith( + testError, + 'uncaughtException', + ); + expect(onExit).toHaveBeenCalledExactlyOnceWith(42, { kind: 'fatal', fatal: 'uncaughtException', }); - expect(onExit).toHaveBeenCalledOnce(); }); it('should call onExit with fatal reason for unhandledRejection when exitOnFatal is true', () => { @@ -206,13 +215,14 @@ describe('subscribeProcessExit', () => { (process as any).emit('unhandledRejection', testReason); - expect(onError).toHaveBeenCalledWith(testReason, 
'unhandledRejection'); - expect(onError).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(1, { + expect(onError).toHaveBeenCalledExactlyOnceWith( + testReason, + 'unhandledRejection', + ); + expect(onExit).toHaveBeenCalledExactlyOnceWith(1, { kind: 'fatal', fatal: 'unhandledRejection', }); - expect(onExit).toHaveBeenCalledOnce(); }); it('should have correct SIGINT exit code on Windows', () => { @@ -247,8 +257,7 @@ describe('subscribeProcessExit', () => { ).not.toThrow(); (process as any).emit('SIGINT'); - expect(onExit).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(SIGNAL_EXIT_CODES().SIGINT, { + expect(onExit).toHaveBeenCalledExactlyOnceWith(SIGNAL_EXIT_CODES().SIGINT, { kind: 'signal', signal: 'SIGINT', }); diff --git a/packages/utils/src/lib/performance-observer.ts b/packages/utils/src/lib/performance-observer.ts index 79446e9747..af5843629d 100644 --- a/packages/utils/src/lib/performance-observer.ts +++ b/packages/utils/src/lib/performance-observer.ts @@ -3,8 +3,6 @@ import { PerformanceObserver, performance, } from 'node:perf_hooks'; -import { isEnvVarEnabled } from './env.js'; -import { PROFILER_DEBUG_ENV_VAR } from './profiler/constants.js'; import type { AppendableSink } from './wal.js'; /** @@ -97,8 +95,10 @@ export type PerformanceObserverOptions = { /** * Whether to enable buffered observation mode. - * When true, captures all performance entries that occurred before observation started. - * When false, only captures entries after subscription begins. + * + * When true, captures all performance marks and measures that exist in the Node.js + * performance buffer at the time `subscribe()` is called using `performance.getEntriesByType()` + * (the native `buffered` option is unreliable in Node.js). * * @default true */ @@ -122,14 +122,12 @@ export type PerformanceObserverOptions = { * @default DEFAULT_MAX_QUEUE_SIZE (10000) */ maxQueueSize?: number; - /** - * Name of the environment variable to check for debug mode. 
-   * When the env var is set to 'true', encode failures create performance marks for debugging.
+  /** Whether debug mode is enabled for encode failures.
+   * When true, encode failures create performance marks for debugging.
    *
-   * @default 'CP_PROFILER_DEBUG'
    */
-  debugEnvVar?: string;
+  debug: boolean;
 };
 
 /**
@@ -151,7 +149,7 @@
  * - Queue cleared after successful batch writes
  *
  * - Item Disposition Scenarios 💥
- * - **Encode Failure**: ❌ Items lost when `encode()` throws. Creates perf mark if debug env var (specified by `debugEnvVar`) is set to 'true'.
+ * - **Encode Failure**: ❌ Items lost when `encode()` throws. Creates perf mark if the `debug` option is enabled.
  * - **Sink Write Failure**: 💾 Items stay in queue when sink write fails during flush
  * - **Sink Closed**: 💾 Items stay in queue when sink is closed during flush
  * - **Proactive Flush Throws**: 💾 Items stay in queue when `flush()` throws during threshold check
@@ -197,6 +195,49 @@ export class PerformanceObserverSink {
   /** Whether debug mode is enabled for encode failures */
   #debug: boolean;
 
+  /** Whether buffered entries have been captured at least once */
+  #capturedBufferedOnce: boolean = false;
+
+  private processPerformanceEntries(entries: PerformanceEntry[]) {
+    entries.forEach(entry => {
+      if (OBSERVED_TYPE_SET.has(entry.entryType as ObservedEntryType)) {
+        try {
+          const items = this.encode(entry);
+          items.forEach(item => {
+            // ❌ MAX QUEUE OVERFLOW
+            if (this.#queue.length >= this.#maxQueueSize) {
+              this.#dropped++; // Items are lost forever
+              return;
+            }
+
+            if (
+              this.#queue.length >=
+              this.#maxQueueSize - this.#flushThreshold
+            ) {
+              this.flush();
+            }
+            this.#queue.push(item);
+            this.#addedSinceLastFlush++;
+          });
+        } catch (error) {
+          // ❌ Encode failure: item lost forever as user has to fix encode function.
+ this.#dropped++; + if (this.#debug) { + try { + performance.mark(errorToPerfMark(error, entry)); + } catch { + // Ignore mark failures to prevent double errors + } + } + } + } + }); + + if (this.#addedSinceLastFlush >= this.#flushThreshold) { + this.flush(); + } + } + /** * Creates a new PerformanceObserverSink with the specified configuration. * @@ -210,7 +251,7 @@ export class PerformanceObserverSink { captureBufferedEntries, flushThreshold = DEFAULT_FLUSH_THRESHOLD, maxQueueSize = DEFAULT_MAX_QUEUE_SIZE, - debugEnvVar = PROFILER_DEBUG_ENV_VAR, + debug, } = options; this.#encodePerfEntry = encodePerfEntry; this.#sink = sink; @@ -218,15 +259,15 @@ export class PerformanceObserverSink { this.#maxQueueSize = maxQueueSize; validateFlushThreshold(flushThreshold, this.#maxQueueSize); this.#flushThreshold = flushThreshold; - this.#debug = isEnvVarEnabled(debugEnvVar); + this.#debug = debug; } /** * Returns whether debug mode is enabled for encode failures. * - * Debug mode is determined by the environment variable specified by `debugEnvVar` - * (defaults to 'CP_PROFILER_DEBUG'). When enabled, encode failures create - * performance marks for debugging. + * Debug mode is configured via the `debug` option passed to the + * PerformanceObserverSink constructor. When enabled, encode failures + * are recorded as performance marks for debugging. * * @returns true if debug mode is enabled, false otherwise */ @@ -273,9 +314,8 @@ export class PerformanceObserverSink { * * Creates a Node.js PerformanceObserver that monitors 'mark' and 'measure' entries. * The observer uses a bounded queue with proactive flushing to manage memory usage. - * When buffered mode is enabled, any existing buffered entries are immediately flushed. + * When buffered mode is enabled, existing entries are captured via `performance.getEntriesByType()` instead of the unreliable native `buffered` option. * If the sink is closed, items stay in the queue until reopened. 
- * */ subscribe(): void { if (this.#observer) { @@ -283,53 +323,24 @@ export class PerformanceObserverSink { } this.#observer = new PerformanceObserver(list => { - list.getEntries().forEach(entry => { - if (OBSERVED_TYPE_SET.has(entry.entryType as ObservedEntryType)) { - try { - const items = this.encode(entry); - items.forEach(item => { - // ❌ MAX QUEUE OVERFLOW - if (this.#queue.length >= this.#maxQueueSize) { - this.#dropped++; // Items are lost forever - return; - } - - if ( - this.#queue.length >= - this.#maxQueueSize - this.#flushThreshold - ) { - this.flush(); - } - this.#queue.push(item); - this.#addedSinceLastFlush++; - }); - } catch (error) { - // ❌ Encode failure: item lost forever as user has to fix encode function. - this.#dropped++; - if (this.#debug) { - try { - performance.mark(errorToPerfMark(error, entry)); - } catch { - // Ignore mark failures to prevent double errors - } - } - } - } - }); - - if (this.#addedSinceLastFlush >= this.#flushThreshold) { - this.flush(); - } + this.processPerformanceEntries(list.getEntries()); }); + // Manually capture buffered entries instead of the unreliable native buffered option. + // Only capture buffered entries on the first subscription to prevent duplicates + // when unsubscribe/resubscribe occurs. + if (this.#buffered && !this.#capturedBufferedOnce) { + const existingMarks = performance.getEntriesByType('mark'); + const existingMeasures = performance.getEntriesByType('measure'); + const allEntries = [...existingMarks, ...existingMeasures]; + this.processPerformanceEntries(allEntries); + this.#capturedBufferedOnce = true; + } + this.#observer.observe({ entryTypes: OBSERVED_TYPES, - buffered: this.#buffered, + // Note: buffered option intentionally omitted due to unreliability in Node.js. 
}); - - if (this.#buffered) { - this.flush(); - } } /** diff --git a/packages/utils/src/lib/performance-observer.unit.test.ts b/packages/utils/src/lib/performance-observer.unit.test.ts index 56c48b3333..70d6aa6243 100644 --- a/packages/utils/src/lib/performance-observer.unit.test.ts +++ b/packages/utils/src/lib/performance-observer.unit.test.ts @@ -69,6 +69,7 @@ describe('PerformanceObserverSink', () => { sink, encodePerfEntry, flushThreshold: 1, + debug: false, }; performance.clearMarks(); @@ -86,6 +87,7 @@ describe('PerformanceObserverSink', () => { new PerformanceObserverSink({ sink, encodePerfEntry, + debug: false, }), ).not.toThrow(); expect(MockPerformanceObserver.instances).toHaveLength(0); @@ -98,6 +100,7 @@ describe('PerformanceObserverSink', () => { ...options, captureBufferedEntries: true, flushThreshold: 10, + debug: false, }), ).not.toThrow(); expect(MockPerformanceObserver.instances).toHaveLength(0); @@ -146,35 +149,6 @@ describe('PerformanceObserverSink', () => { ); }); - it('internal PerformanceObserver should observe buffered by default', () => { - const observer = new PerformanceObserverSink(options); - - observer.subscribe(); - expect( - MockPerformanceObserver.lastInstance()?.observe, - ).toHaveBeenCalledWith( - expect.objectContaining({ - buffered: true, - }), - ); - }); - - it('internal PerformanceObserver should observe buffered if buffered option is provided', () => { - const observer = new PerformanceObserverSink({ - ...options, - captureBufferedEntries: true, - }); - - observer.subscribe(); - expect( - MockPerformanceObserver.lastInstance()?.observe, - ).toHaveBeenCalledWith( - expect.objectContaining({ - buffered: true, - }), - ); - }); - it('internal PerformanceObserver should process observed entries', () => { const observer = new PerformanceObserverSink({ ...options, @@ -301,6 +275,7 @@ describe('PerformanceObserverSink', () => { const observer = new PerformanceObserverSink({ sink, encodePerfEntry, + debug: false, }); expect(() => 
observer.flush()).not.toThrow(); @@ -313,6 +288,7 @@ describe('PerformanceObserverSink', () => { sink, encodePerfEntry, flushThreshold: 10, + debug: false, }); sink.open(); observer.subscribe(); @@ -344,6 +320,34 @@ describe('PerformanceObserverSink', () => { expect(MockPerformanceObserver.instances).toHaveLength(0); }); + it('captures buffered entries only once, even after unsubscribe/resubscribe', () => { + const observer = new PerformanceObserverSink({ + ...options, + captureBufferedEntries: true, + flushThreshold: 10, + }); + + performance.mark('buffered-mark-1'); + performance.mark('buffered-mark-2'); + performance.measure( + 'buffered-measure-1', + 'buffered-mark-1', + 'buffered-mark-2', + ); + + observer.subscribe(); + observer.flush(); + + expect(encodePerfEntry).toHaveBeenCalledTimes(3); + + encodePerfEntry.mockClear(); + observer.unsubscribe(); + observer.subscribe(); + observer.flush(); + + expect(encodePerfEntry).not.toHaveBeenCalled(); + }); + it('handles encodePerfEntry errors gracefully and drops items', () => { const failingEncode = vi.fn(() => { throw new Error('Encode failed'); @@ -353,6 +357,7 @@ describe('PerformanceObserverSink', () => { sink, encodePerfEntry: failingEncode, flushThreshold: 10, + debug: false, }); observer.subscribe(); @@ -366,35 +371,8 @@ describe('PerformanceObserverSink', () => { expect(stats.queued).toBe(0); }); - describe('debug mode with env var', () => { - const originalEnv = process.env.CP_PROFILER_DEBUG; - - beforeEach(() => { - // Restore original env before each test - if (originalEnv === undefined) { - // eslint-disable-next-line functional/immutable-data - delete process.env.CP_PROFILER_DEBUG; - } else { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = originalEnv; - } - }); - - afterEach(() => { - // Restore original env after each test - if (originalEnv === undefined) { - // eslint-disable-next-line functional/immutable-data - delete process.env.CP_PROFILER_DEBUG; - } else 
{ - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = originalEnv; - } - }); - - it('creates performance mark when encode fails and debug mode is enabled via env var', () => { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = 'true'; - + describe('debug mode', () => { + it('creates performance mark when encode fails and debug mode is enabled', () => { const failingEncode = vi.fn(() => { throw new Error('EncodeError'); }); @@ -403,6 +381,7 @@ describe('PerformanceObserverSink', () => { sink, encodePerfEntry: failingEncode, flushThreshold: 10, + debug: true, }); observer.subscribe(); @@ -423,9 +402,6 @@ describe('PerformanceObserverSink', () => { }); it('does not create performance mark when encode fails and debug mode is disabled', () => { - // eslint-disable-next-line functional/immutable-data - delete process.env.CP_PROFILER_DEBUG; - const failingEncode = vi.fn(() => { throw new Error('EncodeError'); }); @@ -434,6 +410,7 @@ describe('PerformanceObserverSink', () => { sink, encodePerfEntry: failingEncode, flushThreshold: 10, + debug: false, }); performance.clearMarks(); @@ -454,9 +431,6 @@ describe('PerformanceObserverSink', () => { }); it('handles encode errors for unnamed entries correctly', () => { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = 'true'; - const failingEncode = vi.fn(() => { throw new Error('EncodeError'); }); @@ -465,6 +439,7 @@ describe('PerformanceObserverSink', () => { sink, encodePerfEntry: failingEncode, flushThreshold: 10, + debug: true, }); observer.subscribe(); @@ -482,9 +457,6 @@ describe('PerformanceObserverSink', () => { }); it('handles non-Error objects thrown from encode function', () => { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = 'true'; - const failingEncode = vi.fn(() => { throw 'String error'; }); @@ -493,6 +465,7 @@ describe('PerformanceObserverSink', () => { 
sink, encodePerfEntry: failingEncode, flushThreshold: 10, + debug: true, }); observer.subscribe(); @@ -527,6 +500,7 @@ describe('PerformanceObserverSink', () => { sink, encodePerfEntry: failingEncode, flushThreshold: 10, + debug: false, }); observer.subscribe(); @@ -556,6 +530,7 @@ describe('PerformanceObserverSink', () => { sink: failingSink as any, encodePerfEntry, flushThreshold: 10, + debug: false, }); observer.subscribe(); @@ -582,6 +557,7 @@ describe('PerformanceObserverSink', () => { encodePerfEntry, maxQueueSize: 20, flushThreshold: 10, + debug: false, }); expect(observer.getStats()).toStrictEqual( @@ -597,6 +573,7 @@ describe('PerformanceObserverSink', () => { sink, encodePerfEntry, flushThreshold: 10, + debug: false, }); observer.subscribe(); @@ -618,6 +595,7 @@ describe('PerformanceObserverSink', () => { encodePerfEntry, maxQueueSize: smallQueueSize, flushThreshold: smallQueueSize, + debug: false, }); const flushSpy = vi.spyOn(observer, 'flush').mockImplementation(() => {}); @@ -646,6 +624,7 @@ describe('PerformanceObserverSink', () => { sink, encodePerfEntry, flushThreshold: 2, + debug: false, }); observer.subscribe(); @@ -682,6 +661,7 @@ describe('PerformanceObserverSink', () => { encodePerfEntry: (entry: PerformanceEntry) => [ `${entry.name}:${entry.duration}`, ], + debug: false, }); observer.subscribe(); @@ -700,6 +680,7 @@ describe('PerformanceObserverSink', () => { sink, encodePerfEntry, flushThreshold: 10, + debug: false, }); expect(observer.getStats().addedSinceLastFlush).toBe(0); @@ -735,75 +716,28 @@ describe('PerformanceObserverSink', () => { }); describe('debug getter', () => { - const originalEnv = process.env.CP_PROFILER_DEBUG; - - beforeEach(() => { - // eslint-disable-next-line functional/immutable-data - delete process.env.CP_PROFILER_DEBUG; - }); - - afterEach(() => { - if (originalEnv === undefined) { - // eslint-disable-next-line functional/immutable-data - delete process.env.CP_PROFILER_DEBUG; - } else { - // eslint-disable-next-line 
functional/immutable-data - process.env.CP_PROFILER_DEBUG = originalEnv; - } - }); - - it('returns false when debug env var is not set', () => { - const observer = new PerformanceObserverSink(options); - - expect(observer.debug).toBeFalse(); - }); - - it('returns true when debug env var is set to "true"', () => { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = 'true'; - - const observer = new PerformanceObserverSink(options); - - expect(observer.debug).toBeTrue(); - }); - - it('returns false when debug env var is set to a value other than "true"', () => { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = 'false'; - - const observer = new PerformanceObserverSink(options); - - expect(observer.debug).toBeFalse(); - }); - - it('returns false when debug env var is set to empty string', () => { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = ''; - - const observer = new PerformanceObserverSink(options); + it('returns false when debug is disabled', () => { + const observer = new PerformanceObserverSink({ + ...options, + debug: false, + }); expect(observer.debug).toBeFalse(); }); - it('respects custom debugEnvVar option', () => { - // eslint-disable-next-line functional/immutable-data - process.env.CUSTOM_DEBUG_VAR = 'true'; - + it('returns true when debug is enabled', () => { const observer = new PerformanceObserverSink({ ...options, - debugEnvVar: 'CUSTOM_DEBUG_VAR', + debug: true, }); expect(observer.debug).toBeTrue(); - - // eslint-disable-next-line functional/immutable-data - delete process.env.CUSTOM_DEBUG_VAR; }); - it('returns false when custom debugEnvVar is not set', () => { + it('returns false when debug is disabled via options', () => { const observer = new PerformanceObserverSink({ ...options, - debugEnvVar: 'CUSTOM_DEBUG_VAR', + debug: false, }); expect(observer.debug).toBeFalse(); diff --git a/packages/utils/src/lib/profiler.ts 
b/packages/utils/src/lib/profiler.ts new file mode 100644 index 0000000000..6764c79da2 --- /dev/null +++ b/packages/utils/src/lib/profiler.ts @@ -0,0 +1,13 @@ +import { getNodeJSProfiler } from './profiler/profiler-node.js'; + +export const profiler = getNodeJSProfiler({ + track: 'CLI', + prefix: 'cp', + tracks: { + cli: { + track: 'CLI', + trackGroup: 'CLI Group', + color: 'primary-dark', + }, + }, +}); diff --git a/packages/utils/src/lib/profiler/__snapshots__/buffered-test.json b/packages/utils/src/lib/profiler/__snapshots__/buffered-test.json new file mode 100644 index 0000000000..09b9497113 --- /dev/null +++ b/packages/utils/src/lib/profiler/__snapshots__/buffered-test.json @@ -0,0 +1 @@ +{"traceEvents":[{"cat":"devtools.timeline","ph":"i","name":"TracingStartedInBrowser","pid":10001,"tid":1,"ts":1700000005000000,"args":{"data":{"frameTreeNodeId":1000101,"frames":[{"frame":"FRAME0P10001T1","isInPrimaryMainFrame":true,"isOutermostMainFrame":true,"name":"","processId":10001,"url":"generated-trace"}],"persistentIds":true}}},{"cat":"devtools.timeline","pid":10001,"tid":1,"ts":1700000005000000,"ph":"X","name":"[trace padding start]","dur":20000,"args":{}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000100,"name":"write-buffered-j-jl:measure:start","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"dataType\":\"track-entry\",\"track\":\"Buffered Track\",\"trackGroup\":\"Buffered Track\",\"color\":\"tertiary\"}}"}}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000200,"name":"write-buffered-j-jl:measure","ph":"b","id2":{"local":"0x1"},"args":{"detail":"{\"devtools\":{\"dataType\":\"track-entry\",\"track\":\"Buffered Track\",\"trackGroup\":\"Buffered Track\",\"color\":\"tertiary\",\"tooltipText\":\"Buffered sync measurement returned :\\\"sync 
success\\\"\"}}"}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000300,"name":"write-buffered-j-jl:measure","ph":"e","id2":{"local":"0x1"},"args":{"detail":"{\"devtools\":{\"dataType\":\"track-entry\",\"track\":\"Buffered Track\",\"trackGroup\":\"Buffered Track\",\"color\":\"tertiary\",\"tooltipText\":\"Buffered sync measurement returned :\\\"sync success\\\"\"}}"}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000400,"name":"write-buffered-j-jl:measure:end","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"dataType\":\"track-entry\",\"track\":\"Buffered Track\",\"trackGroup\":\"Buffered Track\",\"color\":\"tertiary\"}}"}}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000500,"name":"write-buffered-j-jl:async-measure:start","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"dataType\":\"track-entry\",\"track\":\"Buffered Track\",\"trackGroup\":\"Buffered Track\",\"color\":\"tertiary\"}}"}}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000600,"name":"write-buffered-j-jl:async-measure","ph":"b","id2":{"local":"0x2"},"args":{"detail":"{\"devtools\":{\"dataType\":\"track-entry\",\"track\":\"Buffered Track\",\"trackGroup\":\"Buffered Track\",\"color\":\"tertiary\",\"tooltipText\":\"sync measurement returned :\\\"async success\\\"\"}}"}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000700,"name":"write-buffered-j-jl:async-measure","ph":"e","id2":{"local":"0x2"},"args":{"detail":"{\"devtools\":{\"dataType\":\"track-entry\",\"track\":\"Buffered Track\",\"trackGroup\":\"Buffered Track\",\"color\":\"tertiary\",\"tooltipText\":\"sync measurement returned :\\\"async success\\\"\"}}"}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000800,"name":"write-buffered-j-jl:async-measure:end","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"dataType\":\"track-entry\",\"track\":\"Buffered Track\",\"trackGroup\":\"Buffered 
Track\",\"color\":\"tertiary\"}}"}}},{"cat":"devtools.timeline","pid":10001,"tid":1,"ts":1700000005000900,"ph":"X","name":"[trace padding end]","dur":20000,"args":{}}],"metadata":{"source":"DevTools","startTime":"2026-01-28T14:29:27.995Z","hardwareConcurrency":1,"dataOrigin":"TraceEvents"}} \ No newline at end of file diff --git a/packages/utils/src/lib/profiler/__snapshots__/comprehensive-stats-trace-events.jsonl b/packages/utils/src/lib/profiler/__snapshots__/comprehensive-stats-trace-events.jsonl index 717dfddfaa..35eadc68de 100644 --- a/packages/utils/src/lib/profiler/__snapshots__/comprehensive-stats-trace-events.jsonl +++ b/packages/utils/src/lib/profiler/__snapshots__/comprehensive-stats-trace-events.jsonl @@ -1,8 +1,8 @@ {"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000000,"name":"stats-profiler:operation-1:start","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"track\":\"Stats\",\"dataType\":\"track-entry\"}}"}}} -{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000001,"name":"stats-profiler:operation-1","ph":"b","id2":{"local":"0x1"},"args":{"detail":"{\"devtools\":{\"track\":\"Stats\",\"dataType\":\"track-entry\"}}"}} -{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000002,"name":"stats-profiler:operation-1","ph":"e","id2":{"local":"0x1"},"args":{"detail":"{\"devtools\":{\"track\":\"Stats\",\"dataType\":\"track-entry\"}}"}} -{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000003,"name":"stats-profiler:operation-1:end","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"track\":\"Stats\",\"dataType\":\"track-entry\"}}"}}} -{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000004,"name":"stats-profiler:operation-2:start","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"track\":\"Stats\",\"dataType\":\"track-entry\"}}"}}} 
-{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000005,"name":"stats-profiler:operation-2","ph":"b","id2":{"local":"0x2"},"args":{"detail":"{\"devtools\":{\"track\":\"Stats\",\"dataType\":\"track-entry\"}}"}} -{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000006,"name":"stats-profiler:operation-2","ph":"e","id2":{"local":"0x2"},"args":{"detail":"{\"devtools\":{\"track\":\"Stats\",\"dataType\":\"track-entry\"}}"}} -{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000007,"name":"stats-profiler:operation-2:end","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"track\":\"Stats\",\"dataType\":\"track-entry\"}}"}}} +{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000100,"name":"stats-profiler:operation-1","ph":"b","id2":{"local":"0x1"},"args":{"detail":"{\"devtools\":{\"track\":\"Stats\",\"dataType\":\"track-entry\"}}"}} +{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000200,"name":"stats-profiler:operation-1","ph":"e","id2":{"local":"0x1"},"args":{"detail":"{\"devtools\":{\"track\":\"Stats\",\"dataType\":\"track-entry\"}}"}} +{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000300,"name":"stats-profiler:operation-1:end","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"track\":\"Stats\",\"dataType\":\"track-entry\"}}"}}} +{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000400,"name":"stats-profiler:operation-2:start","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"track\":\"Stats\",\"dataType\":\"track-entry\"}}"}}} +{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000500,"name":"stats-profiler:operation-2","ph":"b","id2":{"local":"0x2"},"args":{"detail":"{\"devtools\":{\"track\":\"Stats\",\"dataType\":\"track-entry\"}}"}} +{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000600,"name":"stats-profiler:operation-2","ph":"e","id2":{"local":"0x2"},"args":{"detail":"{\"devtools\":{\"track\":\"Stats\",\"dataType\":\"track-entry\"}}"}} 
+{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000700,"name":"stats-profiler:operation-2:end","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"track\":\"Stats\",\"dataType\":\"track-entry\"}}"}}} \ No newline at end of file diff --git a/packages/utils/src/lib/profiler/__snapshots__/custom-tracks-trace-events.jsonl b/packages/utils/src/lib/profiler/__snapshots__/custom-tracks-trace-events.jsonl index 4773d8e483..b492159cae 100644 --- a/packages/utils/src/lib/profiler/__snapshots__/custom-tracks-trace-events.jsonl +++ b/packages/utils/src/lib/profiler/__snapshots__/custom-tracks-trace-events.jsonl @@ -1,4 +1,4 @@ {"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000000,"name":"api-server:user-lookup:start","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"track\":\"cache\",\"dataType\":\"track-entry\"}}"}}} -{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000001,"name":"api-server:user-lookup","ph":"b","id2":{"local":"0x1"},"args":{"detail":"{\"devtools\":{\"track\":\"cache\",\"dataType\":\"track-entry\"}}"}} -{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000002,"name":"api-server:user-lookup","ph":"e","id2":{"local":"0x1"},"args":{"detail":"{\"devtools\":{\"track\":\"cache\",\"dataType\":\"track-entry\"}}"}} -{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000003,"name":"api-server:user-lookup:end","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"track\":\"cache\",\"dataType\":\"track-entry\"}}"}}} +{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000100,"name":"api-server:user-lookup","ph":"b","id2":{"local":"0x1"},"args":{"detail":"{\"devtools\":{\"track\":\"cache\",\"dataType\":\"track-entry\"}}"}} +{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000200,"name":"api-server:user-lookup","ph":"e","id2":{"local":"0x1"},"args":{"detail":"{\"devtools\":{\"track\":\"cache\",\"dataType\":\"track-entry\"}}"}} 
+{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000300,"name":"api-server:user-lookup:end","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"track\":\"cache\",\"dataType\":\"track-entry\"}}"}}} \ No newline at end of file diff --git a/packages/utils/src/lib/profiler/__snapshots__/debugMode-test.json b/packages/utils/src/lib/profiler/__snapshots__/debugMode-test.json new file mode 100644 index 0000000000..a1791c7d5e --- /dev/null +++ b/packages/utils/src/lib/profiler/__snapshots__/debugMode-test.json @@ -0,0 +1 @@ +{"traceEvents":[{"cat":"devtools.timeline","ph":"i","name":"TracingStartedInBrowser","pid":10001,"tid":1,"ts":1700000005000000,"args":{"data":{"frameTreeNodeId":1000101,"frames":[{"frame":"FRAME0P10001T1","isInPrimaryMainFrame":true,"isOutermostMainFrame":true,"name":"","processId":10001,"url":"generated-trace"}],"persistentIds":true}}},{"cat":"devtools.timeline","pid":10001,"tid":1,"ts":1700000005000000,"ph":"X","name":"[trace padding start]","dur":20000,"args":{}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000100,"name":"Enable profiler","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"dataType\":\"marker\",\"tooltipText\":\"set enable to true\"}}"}}},{"cat":"devtools.timeline","pid":10001,"tid":1,"ts":1700000005000200,"ph":"X","name":"[trace padding end]","dur":20000,"args":{}}],"metadata":{"source":"DevTools","startTime":"2026-01-28T14:29:27.995Z","hardwareConcurrency":1,"dataOrigin":"TraceEvents"}} \ No newline at end of file diff --git a/packages/utils/src/lib/profiler/__snapshots__/entries-write-to-shard.json b/packages/utils/src/lib/profiler/__snapshots__/entries-write-to-shard.json new file mode 100644 index 0000000000..bd6cfdc704 --- /dev/null +++ b/packages/utils/src/lib/profiler/__snapshots__/entries-write-to-shard.json @@ -0,0 +1 @@ 
+{"traceEvents":[{"cat":"devtools.timeline","ph":"i","name":"TracingStartedInBrowser","pid":10001,"tid":1,"ts":1700000005000000,"args":{"data":{"frameTreeNodeId":1000101,"frames":[{"frame":"FRAME0P10001T1","isInPrimaryMainFrame":true,"isOutermostMainFrame":true,"name":"","processId":10001,"url":"generated-trace"}],"persistentIds":true}}},{"cat":"devtools.timeline","pid":10001,"tid":1,"ts":1700000005000000,"ph":"X","name":"[trace padding start]","dur":20000,"args":{}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000100,"name":"Enable profiler","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"dataType\":\"marker\",\"tooltipText\":\"set enable to true\"}}"}}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000200,"name":"write-j-jl:sync-measure:start","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"track\":\"int-test-track\",\"dataType\":\"track-entry\"}}"}}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000300,"name":"write-j-jl:sync-measure","ph":"b","id2":{"local":"0x1"},"args":{"detail":"{\"devtools\":{\"track\":\"int-test-track\",\"dataType\":\"track-entry\"}}"}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000400,"name":"write-j-jl:sync-measure","ph":"e","id2":{"local":"0x1"},"args":{"detail":"{\"devtools\":{\"track\":\"int-test-track\",\"dataType\":\"track-entry\"}}"}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000500,"name":"write-j-jl:sync-measure:end","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"track\":\"int-test-track\",\"dataType\":\"track-entry\"}}"}}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000600,"name":"write-j-jl:async-measure:start","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"track\":\"int-test-track\",\"dataType\":\"track-entry\"}}"}}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000700,"name":"write-j-jl:async-measure","ph":"b","id2":{"local":"0x2"},"args":{"detail":"{\"devtools\":{\"track\":\"int-test-trac
k\",\"dataType\":\"track-entry\"}}"}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000800,"name":"write-j-jl:async-measure","ph":"e","id2":{"local":"0x2"},"args":{"detail":"{\"devtools\":{\"track\":\"int-test-track\",\"dataType\":\"track-entry\"}}"}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000900,"name":"write-j-jl:async-measure:end","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"track\":\"int-test-track\",\"dataType\":\"track-entry\"}}"}}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005001000,"name":"Disable profiler","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"dataType\":\"marker\",\"tooltipText\":\"set enable to false\"}}"}}},{"cat":"devtools.timeline","pid":10001,"tid":1,"ts":1700000005001100,"ph":"X","name":"[trace padding end]","dur":20000,"args":{}}],"metadata":{"source":"DevTools","startTime":"2026-01-28T14:29:27.995Z","hardwareConcurrency":1,"dataOrigin":"TraceEvents"}} \ No newline at end of file diff --git a/packages/utils/src/lib/profiler/__snapshots__/entries-write-to-shard.jsonl b/packages/utils/src/lib/profiler/__snapshots__/entries-write-to-shard.jsonl new file mode 100644 index 0000000000..cee32fc3eb --- /dev/null +++ b/packages/utils/src/lib/profiler/__snapshots__/entries-write-to-shard.jsonl @@ -0,0 +1,186 @@ +[ + { + "args": { + "data": { + "detail": { + "devtools": { + "dataType": "marker", + "tooltipText": "set enable to true", + }, + }, + }, + }, + "cat": "blink.user_timing", + "name": "Enable profiler", + "ph": "I", + "pid": 10001, + "tid": 1, + "ts": 1700000005000000, + }, + { + "args": { + "data": { + "detail": { + "devtools": { + "dataType": "track-entry", + "track": "int-test-track", + }, + }, + }, + }, + "cat": "blink.user_timing", + "name": "write-j-jl:sync-measure:start", + "ph": "I", + "pid": 10001, + "tid": 1, + "ts": 1700000005000100, + }, + { + "args": { + "detail": { + "devtools": { + "dataType": "track-entry", + "track": "int-test-track", + }, + }, + }, + "cat": 
"blink.user_timing", + "id2": { + "local": "0x1", + }, + "name": "write-j-jl:sync-measure", + "ph": "b", + "pid": 10001, + "tid": 1, + "ts": 1700000005000200, + }, + { + "args": { + "detail": { + "devtools": { + "dataType": "track-entry", + "track": "int-test-track", + }, + }, + }, + "cat": "blink.user_timing", + "id2": { + "local": "0x1", + }, + "name": "write-j-jl:sync-measure", + "ph": "e", + "pid": 10001, + "tid": 1, + "ts": 1700000005000300, + }, + { + "args": { + "data": { + "detail": { + "devtools": { + "dataType": "track-entry", + "track": "int-test-track", + }, + }, + }, + }, + "cat": "blink.user_timing", + "name": "write-j-jl:sync-measure:end", + "ph": "I", + "pid": 10001, + "tid": 1, + "ts": 1700000005000400, + }, + { + "args": { + "data": { + "detail": { + "devtools": { + "dataType": "track-entry", + "track": "int-test-track", + }, + }, + }, + }, + "cat": "blink.user_timing", + "name": "write-j-jl:async-measure:start", + "ph": "I", + "pid": 10001, + "tid": 1, + "ts": 1700000005000500, + }, + { + "args": { + "detail": { + "devtools": { + "dataType": "track-entry", + "track": "int-test-track", + }, + }, + }, + "cat": "blink.user_timing", + "id2": { + "local": "0x2", + }, + "name": "write-j-jl:async-measure", + "ph": "b", + "pid": 10001, + "tid": 1, + "ts": 1700000005000600, + }, + { + "args": { + "detail": { + "devtools": { + "dataType": "track-entry", + "track": "int-test-track", + }, + }, + }, + "cat": "blink.user_timing", + "id2": { + "local": "0x2", + }, + "name": "write-j-jl:async-measure", + "ph": "e", + "pid": 10001, + "tid": 1, + "ts": 1700000005000700, + }, + { + "args": { + "data": { + "detail": { + "devtools": { + "dataType": "track-entry", + "track": "int-test-track", + }, + }, + }, + }, + "cat": "blink.user_timing", + "name": "write-j-jl:async-measure:end", + "ph": "I", + "pid": 10001, + "tid": 1, + "ts": 1700000005000800, + }, + { + "args": { + "data": { + "detail": { + "devtools": { + "dataType": "marker", + "tooltipText": "set enable to 
false", + }, + }, + }, + }, + "cat": "blink.user_timing", + "name": "Disable profiler", + "ph": "I", + "pid": 10001, + "tid": 1, + "ts": 1700000005000900, + }, +] \ No newline at end of file diff --git a/packages/utils/src/lib/profiler/__snapshots__/sharded-path-trace-events.jsonl b/packages/utils/src/lib/profiler/__snapshots__/sharded-path-trace-events.jsonl index 00b2e36d8d..682c87b645 100644 --- a/packages/utils/src/lib/profiler/__snapshots__/sharded-path-trace-events.jsonl +++ b/packages/utils/src/lib/profiler/__snapshots__/sharded-path-trace-events.jsonl @@ -1,4 +1,4 @@ {"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000000,"name":"write-test:test-operation:start","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"track\":\"Test\",\"dataType\":\"track-entry\"}}"}}} -{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000001,"name":"write-test:test-operation","ph":"b","id2":{"local":"0x1"},"args":{"detail":"{\"devtools\":{\"track\":\"Test\",\"dataType\":\"track-entry\"}}"}} -{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000002,"name":"write-test:test-operation","ph":"e","id2":{"local":"0x1"},"args":{"detail":"{\"devtools\":{\"track\":\"Test\",\"dataType\":\"track-entry\"}}"}} -{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000003,"name":"write-test:test-operation:end","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"track\":\"Test\",\"dataType\":\"track-entry\"}}"}}} +{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000100,"name":"write-test:test-operation","ph":"b","id2":{"local":"0x1"},"args":{"detail":"{\"devtools\":{\"track\":\"Test\",\"dataType\":\"track-entry\"}}"}} +{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000200,"name":"write-test:test-operation","ph":"e","id2":{"local":"0x1"},"args":{"detail":"{\"devtools\":{\"track\":\"Test\",\"dataType\":\"track-entry\"}}"}} 
+{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000300,"name":"write-test:test-operation:end","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"track\":\"Test\",\"dataType\":\"track-entry\"}}"}}} \ No newline at end of file diff --git a/packages/utils/src/lib/profiler/constants.ts b/packages/utils/src/lib/profiler/constants.ts index 39395a9ed2..872761f558 100644 --- a/packages/utils/src/lib/profiler/constants.ts +++ b/packages/utils/src/lib/profiler/constants.ts @@ -1,3 +1,5 @@ +import path from 'node:path'; + /** * Environment variable name for enabling/disabling profiling globally. * When set to 'true', profiling is enabled. When set to 'false' or unset, profiling is disabled. @@ -32,6 +34,12 @@ export const SHARDED_WAL_COORDINATOR_ID_ENV_VAR = */ export const PROFILER_PERSIST_BASENAME = 'trace'; +/** + * Default output directory for persisted profiler files. + * Used as the parent directory for sharded WAL output (e.g., "tmp/profiles"). + */ +export const PROFILER_PERSIST_OUTDIR = path.join('tmp', 'profiles'); + /** * Name for current measure. * Used as the name for the sharded folder.
diff --git a/packages/utils/src/lib/profiler/profiler-node.int.test.ts b/packages/utils/src/lib/profiler/profiler-node.int.test.ts index e5267118e3..d1a310af3f 100644 --- a/packages/utils/src/lib/profiler/profiler-node.int.test.ts +++ b/packages/utils/src/lib/profiler/profiler-node.int.test.ts @@ -1,14 +1,19 @@ import fs from 'node:fs'; +import fsPromises from 'node:fs/promises'; import path from 'node:path'; -import { - awaitObserverCallbackAndFlush, - omitTraceJson, -} from '@code-pushup/test-utils'; +import { awaitObserverCallbackAndFlush } from '@code-pushup/test-utils'; +import { normalizeAndFormatEvents } from '../../../mocks/omit-trace-json.js'; import type { PerformanceEntryEncoder } from '../performance-observer.js'; import { ID_PATTERNS } from '../process-id.js'; +import { + PROFILER_DEBUG_ENV_VAR, + PROFILER_ENABLED_ENV_VAR, + PROFILER_PERSIST_OUTDIR, +} from './constants.js'; import { NodejsProfiler } from './profiler-node.js'; import { entryToTraceEvents } from './trace-file-utils.js'; import type { TraceEvent } from './trace-file.type.js'; +import { traceEventWalFormat } from './wal-json-trace.js'; describe('NodeJS Profiler Integration', () => { const traceEventEncoder: PerformanceEntryEncoder = @@ -19,19 +24,17 @@ describe('NodeJS Profiler Integration', () => { beforeEach(() => { performance.clearMarks(); performance.clearMeasures(); - vi.stubEnv('CP_PROFILING', undefined!); - vi.stubEnv('CP_PROFILER_DEBUG', undefined!); + vi.stubEnv(PROFILER_ENABLED_ENV_VAR, undefined!); + vi.stubEnv(PROFILER_DEBUG_ENV_VAR, undefined!); // Clean up trace files from previous test runs const traceFilesDir = path.join(process.cwd(), 'tmp', 'int', 'utils'); - // eslint-disable-next-line n/no-sync + if (fs.existsSync(traceFilesDir)) { - // eslint-disable-next-line n/no-sync const files = fs.readdirSync(traceFilesDir); // eslint-disable-next-line functional/no-loop-statements for (const file of files) { if (file.endsWith('.json') || file.endsWith('.jsonl')) { - // 
eslint-disable-next-line n/no-sync fs.unlinkSync(path.join(traceFilesDir, file)); } } @@ -40,8 +43,10 @@ describe('NodeJS Profiler Integration', () => { nodejsProfiler = new NodejsProfiler({ prefix: 'test', track: 'test-track', - encodePerfEntry: traceEventEncoder, - filename: path.join(process.cwd(), 'tmp', 'int', 'utils', 'trace.json'), + format: { + ...traceEventWalFormat, + encodePerfEntry: traceEventEncoder, + }, enabled: true, }); }); @@ -50,8 +55,8 @@ describe('NodeJS Profiler Integration', () => { if (nodejsProfiler && nodejsProfiler.state !== 'closed') { nodejsProfiler.close(); } - vi.stubEnv('CP_PROFILING', undefined!); - vi.stubEnv('CP_PROFILER_DEBUG', undefined!); + vi.stubEnv(PROFILER_ENABLED_ENV_VAR, undefined!); + vi.stubEnv(PROFILER_DEBUG_ENV_VAR, undefined!); }); it('should initialize with sink opened when enabled', () => { @@ -97,13 +102,7 @@ describe('NodeJS Profiler Integration', () => { }); it('should support custom tracks', async () => { - const traceTracksFile = path.join( - process.cwd(), - 'tmp', - 'int', - 'utils', - 'trace-tracks.json', - ); + const groupId = 'trace-tracks'; const profilerWithTracks = new NodejsProfiler({ prefix: 'api-server', track: 'HTTP', @@ -111,12 +110,18 @@ describe('NodeJS Profiler Integration', () => { db: { track: 'Database', color: 'secondary' }, cache: { track: 'Cache', color: 'primary' }, }, - encodePerfEntry: traceEventEncoder, - filename: traceTracksFile, + groupId, + format: { + ...traceEventWalFormat, + encodePerfEntry: traceEventEncoder, + }, enabled: true, }); - expect(profilerWithTracks.filePath).toBe(traceTracksFile); + expect(profilerWithTracks.filePath).toContainPath( + path.join(PROFILER_PERSIST_OUTDIR, groupId), + ); + expect(profilerWithTracks.filePath).toMatch(/\.jsonl$/); expect( profilerWithTracks.measure('user-lookup', () => 'user123', { @@ -127,9 +132,10 @@ describe('NodeJS Profiler Integration', () => { await awaitObserverCallbackAndFlush(profilerWithTracks); profilerWithTracks.close(); - // 
eslint-disable-next-line n/no-sync - const content = fs.readFileSync(traceTracksFile, 'utf8'); - const normalizedContent = omitTraceJson(content); + const shardText = ( + await fsPromises.readFile(profilerWithTracks.filePath, 'utf8') + ).trim(); + const normalizedContent = normalizeAndFormatEvents(shardText); await expect(normalizedContent).toMatchFileSnapshot( '__snapshots__/custom-tracks-trace-events.jsonl', ); @@ -139,15 +145,11 @@ describe('NodeJS Profiler Integration', () => { const bufferedProfiler = new NodejsProfiler({ prefix: 'buffered-test', track: 'Test', - encodePerfEntry: traceEventEncoder, captureBufferedEntries: true, - filename: path.join( - process.cwd(), - 'tmp', - 'int', - 'utils', - 'trace-buffered.json', - ), + format: { + ...traceEventWalFormat, + encodePerfEntry: traceEventEncoder, + }, enabled: true, }); @@ -166,16 +168,12 @@ describe('NodeJS Profiler Integration', () => { const statsProfiler = new NodejsProfiler({ prefix: 'stats-test', track: 'Stats', - encodePerfEntry: traceEventEncoder, maxQueueSize: 2, flushThreshold: 2, - filename: path.join( - process.cwd(), - 'tmp', - 'int', - 'utils', - 'trace-stats.json', - ), + format: { + ...traceEventWalFormat, + encodePerfEntry: traceEventEncoder, + }, enabled: true, }); @@ -193,20 +191,17 @@ describe('NodeJS Profiler Integration', () => { }); it('should provide comprehensive queue statistics via getStats', async () => { - const traceStatsFile = path.join( - process.cwd(), - 'tmp', - 'int', - 'utils', - 'trace-stats-comprehensive.json', - ); + const groupId = 'trace-stats-comprehensive'; const profiler = new NodejsProfiler({ prefix: 'stats-profiler', track: 'Stats', - encodePerfEntry: traceEventEncoder, maxQueueSize: 3, flushThreshold: 2, - filename: traceStatsFile, + groupId, + format: { + ...traceEventWalFormat, + encodePerfEntry: traceEventEncoder, + }, enabled: true, }); @@ -236,9 +231,10 @@ describe('NodeJS Profiler Integration', () => { profiler.flush(); profiler.close(); - // 
eslint-disable-next-line n/no-sync - const content = fs.readFileSync(traceStatsFile, 'utf8'); - const normalizedContent = omitTraceJson(content); + const shardText = ( + await fsPromises.readFile(profiler.filePath, 'utf8') + ).trim(); + const normalizedContent = normalizeAndFormatEvents(shardText); await expect(normalizedContent).toMatchFileSnapshot( '__snapshots__/comprehensive-stats-trace-events.jsonl', ); @@ -249,7 +245,10 @@ describe('NodeJS Profiler Integration', () => { const profiler = new NodejsProfiler({ prefix: 'sharded-test', track: 'Test', - encodePerfEntry: traceEventEncoder, + format: { + ...traceEventWalFormat, + encodePerfEntry: traceEventEncoder, + }, enabled: true, }); @@ -265,7 +264,7 @@ describe('NodeJS Profiler Integration', () => { expect(fileName).toMatch(/^trace\.\d{8}-\d{6}-\d{3}(?:\.\d+){3}\.jsonl$/); const groupIdDirPath = path.dirname(filePath); - // eslint-disable-next-line n/no-sync + expect(fs.existsSync(groupIdDirPath)).toBeTrue(); profiler.close(); @@ -275,7 +274,10 @@ describe('NodeJS Profiler Integration', () => { const profiler = new NodejsProfiler({ prefix: 'folder-test', track: 'Test', - encodePerfEntry: traceEventEncoder, + format: { + ...traceEventWalFormat, + encodePerfEntry: traceEventEncoder, + }, enabled: true, }); @@ -284,9 +286,9 @@ describe('NodeJS Profiler Integration', () => { const groupId = path.basename(dirPath); expect(groupId).toMatch(ID_PATTERNS.TIME_ID); - // eslint-disable-next-line n/no-sync + expect(fs.existsSync(dirPath)).toBeTrue(); - // eslint-disable-next-line n/no-sync + expect(fs.statSync(dirPath).isDirectory()).toBeTrue(); profiler.close(); @@ -296,7 +298,10 @@ describe('NodeJS Profiler Integration', () => { const profiler = new NodejsProfiler({ prefix: 'write-test', track: 'Test', - encodePerfEntry: traceEventEncoder, + format: { + ...traceEventWalFormat, + encodePerfEntry: traceEventEncoder, + }, enabled: true, }); @@ -305,10 +310,10 @@ describe('NodeJS Profiler Integration', () => { await 
awaitObserverCallbackAndFlush(profiler); profiler.close(); - const filePath = profiler.filePath; - // eslint-disable-next-line n/no-sync - const content = fs.readFileSync(filePath, 'utf8'); - const normalizedContent = omitTraceJson(content); + const shardText = ( + await fsPromises.readFile(profiler.filePath, 'utf8') + ).trim(); + const normalizedContent = normalizeAndFormatEvents(shardText); await expect(normalizedContent).toMatchFileSnapshot( '__snapshots__/sharded-path-trace-events.jsonl', ); diff --git a/packages/utils/src/lib/profiler/profiler-node.ts b/packages/utils/src/lib/profiler/profiler-node.ts index 80daac49f7..2613622691 100644 --- a/packages/utils/src/lib/profiler/profiler-node.ts +++ b/packages/utils/src/lib/profiler/profiler-node.ts @@ -5,49 +5,97 @@ import { type PerformanceObserverOptions, PerformanceObserverSink, } from '../performance-observer.js'; -import { - type Counter, - getUniqueInstanceId, - getUniqueTimeId, -} from '../process-id.js'; import { objectToEntries } from '../transform.js'; import { errorToMarkerPayload } from '../user-timing-extensibility-api-utils.js'; import type { ActionTrackEntryPayload, MarkerPayload, } from '../user-timing-extensibility-api.type.js'; -import { getShardedPath } from '../wal-sharded.js'; +import { ShardedWal } from '../wal-sharded.js'; import { type AppendableSink, + type WalFormat, type WalRecord, WriteAheadLogFile, } from '../wal.js'; import { PROFILER_DEBUG_ENV_VAR, PROFILER_ENABLED_ENV_VAR, + PROFILER_PERSIST_OUTDIR, + SHARDED_WAL_COORDINATOR_ID_ENV_VAR, } from './constants.js'; import { Profiler, type ProfilerOptions } from './profiler.js'; -import { traceEventWalFormat } from './wal-json-trace.js'; +import { entryToTraceEvents } from './trace-file-utils.js'; +import type { TraceEvent } from './trace-file.type.js'; +import { getTraceEventWalFormat } from './wal-json-trace.js'; /** - * Options for configuring a NodejsProfiler instance. - * - * Extends ProfilerOptions with a required sink parameter. 
+ * Strips encodePerfEntry from a profiler format option and returns the WalFormat part. + */ +function formatOptionToWalFormat( + option: Pick<PerformanceObserverOptions<WalRecord>, 'encodePerfEntry'> & + WalFormat, +): WalFormat { + const { encodePerfEntry: _e, ...format } = option; + return format as WalFormat; +} + +export type ProfilerBufferingOptions< + DomainEvents extends WalRecord = WalRecord, +> = Omit< + PerformanceObserverOptions<DomainEvents>, + 'sink' | 'debug' | 'encodePerfEntry' +>; + +/** + * Options for NodejsProfiler when using the default trace format (no custom format). + * When used, the profiler is NodejsProfiler<TraceEvent>. * * @template Tracks - Record type defining available track names and their configurations */ -export type NodejsProfilerOptions< - DomainEvents extends string | object, - Tracks extends Record<string, ActionTrackEntryPayload>, +export type NodejsProfilerOptionsDefault< + Tracks extends Record<string, ActionTrackEntryPayload> = Record< + string, + ActionTrackEntryPayload + >, > = ProfilerOptions<Tracks> & - Omit<PerformanceObserverOptions<DomainEvents>, 'sink'> & { + ProfilerBufferingOptions & { + /** + * Group ID for the sharded WAL (used as directory name and in final file path). + * When not set, a time-based ID is generated. + */ + groupId?: string; /** - * File path for the WriteAheadLogFile sink. - * If not provided, defaults to `trace.json` in the current working directory. + * Name of the environment variable to check for debug mode. + * When the env var is set to 'true', profiler state transitions create performance marks for debugging. * - * @default path.join(process.cwd(), 'trace.json') + * @default 'CP_PROFILER_DEBUG' */ - filename?: string; + debugEnvVar?: string; + }; + +/** + * Options for NodejsProfiler when providing a custom format. + * format is required; when used, the profiler is NodejsProfiler<DomainEvents>.
+ * + * @template DomainEvents - The type of domain-specific events (must extend TraceEvent & WalRecord) + * @template Tracks - Record type defining available track names and their configurations + */ +export type NodejsProfilerOptionsWithFormat< + DomainEvents extends TraceEvent & WalRecord, + Tracks extends Record<string, ActionTrackEntryPayload> = Record< + string, + ActionTrackEntryPayload + >, +> = ProfilerOptions<Tracks> & + ProfilerBufferingOptions<DomainEvents> & { + format: Pick<PerformanceObserverOptions<DomainEvents>, 'encodePerfEntry'> & + WalFormat; + /** + * Group ID for the sharded WAL (used as directory name and in final file path). + * When not set, a time-based ID is generated. + */ + groupId?: string; /** * Name of the environment variable to check for debug mode. * When the env var is set to 'true', profiler state transitions create performance marks for debugging. @@ -74,24 +122,18 @@ export type NodejsProfilerOptions< * @template Tracks - Record type defining available track names and their configurations */ export class NodejsProfiler< - DomainEvents extends WalRecord, + DomainEvents extends TraceEvent & WalRecord = TraceEvent, Tracks extends Record<string, ActionTrackEntryPayload> = Record< string, ActionTrackEntryPayload >, > extends Profiler { + #shardedWal: ShardedWal; #sink: AppendableSink; #performanceObserverSink: PerformanceObserverSink; #state: 'idle' | 'running' | 'closed' = 'idle'; #debug: boolean; #unsubscribeExitHandlers: (() => void) | undefined; - #shardCounter: Counter = { - next: (() => { - // eslint-disable-next-line functional/no-let - let count = 0; - return () => ++count; - })(), - }; /** * Creates a NodejsProfiler instance. @@ -99,45 +141,50 @@ export class NodejsProfiler< * @param options - Configuration options */ // eslint-disable-next-line max-lines-per-function - constructor(options: NodejsProfilerOptions<DomainEvents, Tracks>) { + constructor( + options: + | NodejsProfilerOptionsDefault<Tracks> + | NodejsProfilerOptionsWithFormat<DomainEvents, Tracks>, + ) { + const hasFormat = 'format' in options && options.format != null; + const parsedFormat: WalFormat = hasFormat + ?
formatOptionToWalFormat(options.format) + : getTraceEventWalFormat(); + const encodePerfEntry = + (hasFormat ? options.format.encodePerfEntry : undefined) ?? + entryToTraceEvents; + const { - encodePerfEntry, + format: _, captureBufferedEntries, flushThreshold, maxQueueSize, enabled, - filename, + groupId, debugEnvVar = PROFILER_DEBUG_ENV_VAR, ...profilerOptions - } = options; + } = options as NodejsProfilerOptionsDefault<Tracks> & { + format?: NodejsProfilerOptionsWithFormat<DomainEvents, Tracks>['format']; + }; const initialEnabled = enabled ?? isEnvVarEnabled(PROFILER_ENABLED_ENV_VAR); super({ ...profilerOptions, enabled: initialEnabled }); - - const walFormat = traceEventWalFormat(); - this.#sink = new WriteAheadLogFile({ - file: - filename ?? - path.join( - process.cwd(), - // @TODO remove in PR https://github.com/code-pushup/cli/pull/1231 in favour of class method getShardedFileName - getShardedPath({ - dir: 'tmp/profiles', - groupId: getUniqueTimeId(), - shardId: getUniqueInstanceId(this.#shardCounter), - format: walFormat, - }), - ), - codec: walFormat.codec, - }) as AppendableSink; this.#debug = isEnvVarEnabled(debugEnvVar); - this.#performanceObserverSink = new PerformanceObserverSink({ + this.#shardedWal = new ShardedWal({ + debug: this.#debug, + dir: path.join(process.cwd(), PROFILER_PERSIST_OUTDIR), + format: parsedFormat, + coordinatorIdEnvVar: SHARDED_WAL_COORDINATOR_ID_ENV_VAR, + ...(groupId != null && { groupId }), + }); + this.#sink = this.#shardedWal.shard(); + this.#performanceObserverSink = new PerformanceObserverSink({ sink: this.#sink, encodePerfEntry, captureBufferedEntries, flushThreshold, maxQueueSize, - debugEnvVar, + debug: this.#debug, }); this.#unsubscribeExitHandlers = subscribeProcessExit({ @@ -228,7 +275,7 @@ export class NodejsProfiler< * @param next - The target state to transition to * @throws {Error} If attempting to transition from 'closed' state or invalid transition */ - #transition(next: 'idle' | 'running' | 'closed'): void { + protected transition(next:
'idle' | 'running' | 'closed' | string): void { if (this.#state === next) { return; } @@ -261,13 +308,17 @@ throw new Error(`Invalid transition: ${this.#state} -> ${next}`); } - this.#state = next; + this.#state = next as 'idle' | 'running' | 'closed'; if (this.#debug) { this.#transitionMarker(transition); } } + #transition(next: 'idle' | 'running' | 'closed'): void { + this.transition(next); + } + /** * Closes profiler and releases resources. Idempotent, safe for exit handlers. * **Exit Handler Usage**: Call only this method from process exit handlers. @@ -290,6 +341,11 @@ return this.#state === 'running'; } + /** @returns Whether debug mode is enabled (uses NodejsProfiler's debug state). */ + override isDebugMode(): boolean { + return this.#debug; + } + /** Enables profiling (start/stop) */ override setEnabled(enabled: boolean): void { if (enabled) { @@ -303,6 +359,8 @@ get stats() { return { ...this.#performanceObserverSink.getStats(), + ...this.#shardedWal.stats, + shardPath: (this.#sink as WriteAheadLogFile).getPath(), debug: this.#debug, state: this.#state, walOpen: !this.#sink.isClosed(), @@ -322,3 +380,27 @@ return (this.#sink as WriteAheadLogFile).getPath(); } } + +export function getTraceEventProfilerFormat() { + return { + ...getTraceEventWalFormat(), + encodePerfEntry: entryToTraceEvents, + }; +} + +export function getNodeJSProfiler< + DomainEvents extends TraceEvent & WalRecord, + Tracks extends Record<string, ActionTrackEntryPayload>, +>( + options: NodejsProfilerOptionsWithFormat<DomainEvents, Tracks>, +): NodejsProfiler<DomainEvents, Tracks>; +export function getNodeJSProfiler< + DomainEvents extends TraceEvent & WalRecord, + Tracks extends Record<string, ActionTrackEntryPayload>, +>( + options: + | NodejsProfilerOptionsDefault<Tracks> + | NodejsProfilerOptionsWithFormat<DomainEvents, Tracks>, +): NodejsProfiler<DomainEvents, Tracks> { + return new NodejsProfiler(options); +} diff --git a/packages/utils/src/lib/profiler/profiler-node.unit.test.ts 
b/packages/utils/src/lib/profiler/profiler-node.unit.test.ts index 02c82f1731..229a4b4c1d 100644 --- a/packages/utils/src/lib/profiler/profiler-node.unit.test.ts +++ b/packages/utils/src/lib/profiler/profiler-node.unit.test.ts @@ -1,97 +1,231 @@ import path from 'node:path'; import { performance } from 'node:perf_hooks'; import { beforeEach, describe, expect, it, vi } from 'vitest'; +import { + awaitObserverCallbackAndFlush, + osAgnosticPath, +} from '@code-pushup/test-utils'; +import { + loadAndOmitTraceJson, + loadAndOmitTraceJsonl, +} from '../../../mocks/omit-trace-json.js'; import { MockTraceEventFileSink } from '../../../mocks/sink.mock'; +import { isEnvVarEnabled } from '../env.js'; import { subscribeProcessExit } from '../exit-process.js'; -import * as PerfObserverModule from '../performance-observer.js'; import type { PerformanceEntryEncoder } from '../performance-observer.js'; +import { ID_PATTERNS } from '../process-id.js'; import type { ActionTrackEntryPayload, UserTimingDetail, } from '../user-timing-extensibility-api.type.js'; import * as WalModule from '../wal.js'; -import { NodejsProfiler, type NodejsProfilerOptions } from './profiler-node.js'; -import { Profiler } from './profiler.js'; +import { + PROFILER_DEBUG_ENV_VAR, + PROFILER_PERSIST_BASENAME, + PROFILER_PERSIST_OUTDIR, + SHARDED_WAL_COORDINATOR_ID_ENV_VAR, +} from './constants.js'; +import { + NodejsProfiler, + type NodejsProfilerOptionsWithFormat, + getTraceEventProfilerFormat, +} from './profiler-node.js'; +import { Profiler, getProfilerId } from './profiler.js'; +import { entryToTraceEvents } from './trace-file-utils.js'; +import type { TraceEvent } from './trace-file.type.js'; +import { traceEventWalFormat } from './wal-json-trace.js'; vi.mock('../exit-process.js'); -const simpleEncoder: PerformanceEntryEncoder = entry => { +/** TraceEvent subtype for unit tests with a custom message field. 
*/ +type SimpleTraceEvent = TraceEvent & { message: string }; + +const simpleEncoder: PerformanceEntryEncoder = entry => { + const events = entryToTraceEvents(entry) as TraceEvent[]; if (entry.entryType === 'measure') { - return [`${entry.name}:${entry.duration.toFixed(2)}ms`]; + return events.map((e: TraceEvent) => ({ + ...e, + message: `${entry.name}:${(entry as { duration?: number }).duration?.toFixed(2) ?? '0'}ms`, + })); } - return []; + return events.map((e: TraceEvent) => ({ ...e, message: '' })); }; -describe('NodejsProfiler', () => { - const getNodejsProfiler = ( - overrides?: Partial< - NodejsProfilerOptions> - >, - ) => { - const sink = new MockTraceEventFileSink(); - const mockFilePath = - overrides?.filename ?? - '/test/tmp/profiles/20240101-120000-000/trace.20240101-120000-000.12345.1.1.jsonl'; - vi.spyOn(sink, 'open'); - vi.spyOn(sink, 'close'); - vi.spyOn(sink, 'getPath').mockReturnValue(mockFilePath); - - // Mock WriteAheadLogFile constructor to return our mock sink - vi.spyOn(WalModule, 'WriteAheadLogFile').mockImplementation( - () => sink as any, - ); - - const mockPerfObserverSink = { - subscribe: vi.fn(), - unsubscribe: vi.fn(() => { - mockPerfObserverSink.flush(); - }), - isSubscribed: vi.fn().mockReturnValue(false), - encode: vi.fn(), - flush: vi.fn(), - getStats: vi.fn().mockReturnValue({ - isSubscribed: false, - queued: 0, - dropped: 0, - written: 0, - maxQueueSize: 10_000, - flushThreshold: 20, - addedSinceLastFlush: 0, - buffered: true, - }), - }; - vi.spyOn(PerfObserverModule, 'PerformanceObserverSink').mockReturnValue( - mockPerfObserverSink as any, - ); - - const profiler = new NodejsProfiler({ - prefix: 'test', - track: 'test-track', +const resetEnv = () => { + // eslint-disable-next-line functional/immutable-data + delete process.env.DEBUG; + // eslint-disable-next-line functional/immutable-data + delete process.env.CP_PROFILING; + // eslint-disable-next-line functional/immutable-data, @typescript-eslint/no-dynamic-delete + delete 
process.env[SHARDED_WAL_COORDINATOR_ID_ENV_VAR]; +}; + +const expectRunning = (p: NodejsProfiler) => { + expect(p.state).toBe('running'); + expect(p.stats.walOpen).toBe(true); + expect(p.stats.isSubscribed).toBe(true); +}; + +const expectIdle = (p: NodejsProfiler) => { + expect(p.state).toBe('idle'); + expect(p.stats.walOpen).toBe(false); + expect(p.stats.isSubscribed).toBe(false); +}; + +const expectTransitionMarker = (name: string) => { + const marks = performance.getEntriesByType('mark'); + expect(marks.some(m => m.name === name)).toBe(true); +}; + +const expectNoTransitionMarker = (name: string) => { + const marks = performance.getEntriesByType('mark'); + expect(marks.some(m => m.name === name)).toBe(false); +}; + +const createProfiler = ( + options: + | string + | (Partial< + NodejsProfilerOptionsWithFormat< + TraceEvent, + Record + > + > & { measureName: string }), +): NodejsProfiler => { + const opts = typeof options === 'string' ? { measureName: options } : options; + const { measureName: _mn, ...rest } = opts; + return new NodejsProfiler({ + ...rest, + track: opts.track ?? 'int-test-track', + groupId: opts.measureName, + format: { + ...getTraceEventProfilerFormat(), + baseName: opts.format?.baseName ?? PROFILER_PERSIST_BASENAME, + }, + enabled: opts.enabled ?? true, + debug: opts.debug ?? isEnvVarEnabled(PROFILER_DEBUG_ENV_VAR), + }); +}; + +class TestNodejsProfiler extends NodejsProfiler { + forceTransition(next: 'idle' | 'running' | 'closed' | string) { + this.transition(next); + } +} + +const createTestProfiler = ( + options: + | string + | (Partial< + NodejsProfilerOptionsWithFormat< + TraceEvent, + Record + > + > & { measureName: string }), +): TestNodejsProfiler => { + const opts = typeof options === 'string' ? { measureName: options } : options; + const { measureName: _mn, ...rest } = opts; + return new TestNodejsProfiler({ + ...rest, + track: opts.track ?? 
'int-test-track', + groupId: opts.measureName, + format: { + ...traceEventWalFormat, + encodePerfEntry: entryToTraceEvents, + baseName: opts.format?.baseName ?? PROFILER_PERSIST_BASENAME, + }, + enabled: opts.enabled ?? true, + debug: opts.debug ?? isEnvVarEnabled(PROFILER_DEBUG_ENV_VAR), + }); +}; + +/** WalFormat for SimpleTraceEvent (codec decodes to SimpleTraceEvent). */ +const simpleTraceWalFormat: WalModule.WalFormat = { + ...traceEventWalFormat, + codec: { + encode: traceEventWalFormat.codec.encode, + decode: (d): SimpleTraceEvent => + traceEventWalFormat.codec.decode(d) as SimpleTraceEvent, + }, +}; + +const createSimpleProfiler = ( + overrides?: Partial< + NodejsProfilerOptionsWithFormat< + SimpleTraceEvent, + Record + > + > & { measureName?: string }, +): NodejsProfiler => { + const sink = new MockTraceEventFileSink(); + vi.spyOn(sink, 'open'); + vi.spyOn(sink, 'close'); + vi.spyOn(WalModule, 'WriteAheadLogFile').mockImplementation( + () => sink as any, + ); + const { measureName: _mn, ...restOverrides } = overrides ?? {}; + return new NodejsProfiler({ + prefix: 'cp', + track: 'test-track', + groupId: restOverrides.groupId ?? _mn ?? 'simple', + enabled: restOverrides.enabled ?? true, + debug: restOverrides.debug ?? isEnvVarEnabled(PROFILER_DEBUG_ENV_VAR), + format: { + ...simpleTraceWalFormat, encodePerfEntry: simpleEncoder, - ...overrides, - }); + baseName: restOverrides.format?.baseName ?? 
PROFILER_PERSIST_BASENAME, + walExtension: '.jsonl', + finalExtension: '.json', + ...restOverrides.format, + }, + ...restOverrides, + }); +}; - return { sink, perfObserverSink: mockPerfObserverSink, profiler }; +const captureExitHandlers = () => { + const mockSubscribeProcessExit = vi.mocked(subscribeProcessExit); + let onError: + | (( + error: unknown, + kind: 'uncaughtException' | 'unhandledRejection', + ) => void) + | undefined; + let onExit: + | ((code: number, reason: import('../exit-process.js').CloseReason) => void) + | undefined; + + mockSubscribeProcessExit.mockImplementation(options => { + onError = options?.onError; + onExit = options?.onExit; + return vi.fn(); + }); + + return { + get onError() { + return onError; + }, + get onExit() { + return onExit; + }, }; +}; - const originalEnv = process.env.CP_PROFILER_DEBUG; +describe('NodejsProfiler', () => { + const originalEnv = process.env.DEBUG; beforeEach(() => { performance.clearMarks(); performance.clearMeasures(); - // eslint-disable-next-line functional/immutable-data - delete process.env.CP_PROFILER_DEBUG; - // eslint-disable-next-line functional/immutable-data - delete process.env.CP_PROFILING; + resetEnv(); }); afterEach(() => { if (originalEnv === undefined) { // eslint-disable-next-line functional/immutable-data - delete process.env.CP_PROFILER_DEBUG; + delete process.env.DEBUG; } else { // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = originalEnv; + process.env.DEBUG = originalEnv; } }); @@ -101,7 +235,7 @@ describe('NodejsProfiler', () => { }); it('should have required static structure', () => { - const profiler = getNodejsProfiler().profiler; + const profiler = createProfiler('static-structure'); expect(typeof profiler.measure).toBe('function'); expect(typeof profiler.measureAsync).toBe('function'); expect(typeof profiler.marker).toBe('function'); @@ -117,17 +251,61 @@ describe('NodejsProfiler', () => { }); it('should initialize with sink opened when 
enabled is true', () => { - const { sink, perfObserverSink } = getNodejsProfiler({ enabled: true }); - expect(sink.isClosed()).toBe(false); - expect(sink.open).toHaveBeenCalledTimes(1); - expect(perfObserverSink.subscribe).toHaveBeenCalledTimes(1); + const profiler = createProfiler({ + measureName: 'init-enabled', + }); + expect(profiler.state).toBe('running'); + expect(profiler.stats.walOpen).toBe(true); + expect(profiler.stats.isSubscribed).toBe(true); }); + // eslint-disable-next-line vitest/expect-expect it('should initialize with sink closed when enabled is false', () => { - const { sink, perfObserverSink } = getNodejsProfiler({ enabled: false }); - expect(sink.isClosed()).toBe(true); - expect(sink.open).not.toHaveBeenCalled(); - expect(perfObserverSink.subscribe).not.toHaveBeenCalled(); + const profiler = createProfiler({ + measureName: 'init-disabled', + enabled: false, + }); + expectIdle(profiler); + }); + + it('should initialize as coordinator if env vars is undefined', async () => { + const profiler = createProfiler('is-coordinator'); + expect(profiler.stats.isCoordinator).toBe(true); + }); + + it('should finalize shard folder as coordinator', async () => { + const profiler = createProfiler('is-coordinator'); + expect(profiler.stats.isCoordinator).toBe(true); + profiler.marker('special-marker'); + profiler.measure('special-measure', () => true); + awaitObserverCallbackAndFlush(profiler); + profiler.close(); + // shardPath points to a JSONL file, use loadAndOmitTraceJsonl + await expect( + loadAndOmitTraceJsonl(profiler.stats.shardPath as `${string}.jsonl`), + ).resolves.not.toThrow(); + + // Final file may not exist in all test environments (e.g. 
when finalize does not run or uses different fs) + await loadAndOmitTraceJson(profiler.stats.finalFilePath).catch( + () => null, + ); + }); + + it('should NOT initialize as coordinator if env vars is defined', async () => { + vi.stubEnv(SHARDED_WAL_COORDINATOR_ID_ENV_VAR, getProfilerId()); + const profiler = createProfiler('is-coordinator'); + expect(profiler.stats.isCoordinator).toBe(false); + profiler.marker('special-marker'); + profiler.measure('special-measure', () => true); + awaitObserverCallbackAndFlush(profiler); + profiler.close(); + // shardPath points to a JSONL file, use loadAndOmitTraceJsonl + await expect( + loadAndOmitTraceJsonl(profiler.stats.shardPath as `${string}.jsonl`), + ).resolves.not.toThrow(); + await expect( + loadAndOmitTraceJson(profiler.stats.finalFilePath), + ).rejects.toThrow('no such file or directory'); }); }); @@ -136,142 +314,78 @@ describe('NodejsProfiler', () => { { name: 'idle → running', initial: false, - action: ( - p: NodejsProfiler>, - ) => p.setEnabled(true), - expected: { - state: 'running', - sinkOpen: 1, - sinkClose: 0, - subscribe: 1, - unsubscribe: 0, - }, + action: (p: NodejsProfiler) => p.setEnabled(true), + assert: expectRunning, }, { name: 'running → idle', initial: true, - action: ( - p: NodejsProfiler>, - ) => p.setEnabled(false), - expected: { - state: 'idle', - sinkOpen: 1, - sinkClose: 1, - subscribe: 1, - unsubscribe: 1, - }, + action: (p: NodejsProfiler) => p.setEnabled(false), + assert: expectIdle, }, { name: 'idle → closed', initial: false, - action: ( - p: NodejsProfiler>, - ) => p.close(), - expected: { - state: 'closed', - sinkOpen: 0, - sinkClose: 1, - subscribe: 0, - unsubscribe: 0, - }, + action: (p: NodejsProfiler) => p.close(), + assert: (p: NodejsProfiler) => expect(p.state).toBe('closed'), }, { name: 'running → closed', initial: true, - action: ( - p: NodejsProfiler>, - ) => p.close(), - expected: { - state: 'closed', - sinkOpen: 1, - sinkClose: 1, - subscribe: 1, - unsubscribe: 1, - }, + 
action: (p: NodejsProfiler) => p.close(), + assert: (p: NodejsProfiler) => expect(p.state).toBe('closed'), }, - ])('should handle $name transition', ({ initial, action, expected }) => { - const { sink, perfObserverSink, profiler } = getNodejsProfiler({ + ])('should handle $name transition', ({ initial, action, assert }) => { + const profiler = createProfiler({ + measureName: `state-transition-${initial ? 'running' : 'idle'}`, enabled: initial, }); action(profiler); - expect(profiler.state).toBe(expected.state); - expect(sink.open).toHaveBeenCalledTimes(expected.sinkOpen); - expect(sink.close).toHaveBeenCalledTimes(expected.sinkClose); - expect(perfObserverSink.subscribe).toHaveBeenCalledTimes( - expected.subscribe, - ); - expect(perfObserverSink.unsubscribe).toHaveBeenCalledTimes( - expected.unsubscribe, - ); + assert(profiler); }); it('should expose state via getter', () => { - const profiler = getNodejsProfiler({ enabled: false }).profiler; + const profiler = createProfiler({ + measureName: 'state-getter', + enabled: false, + }); - expect(profiler.state).toBe('idle'); + expectIdle(profiler); profiler.setEnabled(true); - expect(profiler.state).toBe('running'); + expectRunning(profiler); profiler.setEnabled(false); - expect(profiler.state).toBe('idle'); + expectIdle(profiler); profiler.close(); expect(profiler.state).toBe('closed'); }); + // eslint-disable-next-line vitest/expect-expect it('should maintain state invariant: running ⇒ sink open + observer subscribed', () => { - const { sink, perfObserverSink, profiler } = getNodejsProfiler({ + const profiler = createProfiler({ + measureName: 'state-invariant', enabled: false, }); - expect(profiler.state).toBe('idle'); - expect(sink.isClosed()).toBe(true); - expect(perfObserverSink.isSubscribed()).toBe(false); + expectIdle(profiler); profiler.setEnabled(true); - expect(profiler.state).toBe('running'); - expect(sink.isClosed()).toBe(false); - expect(sink.open).toHaveBeenCalledTimes(1); - 
expect(perfObserverSink.subscribe).toHaveBeenCalledTimes(1); + expectRunning(profiler); profiler.setEnabled(false); - expect(profiler.state).toBe('idle'); - expect(sink.isClosed()).toBe(true); - expect(sink.close).toHaveBeenCalledTimes(1); - expect(perfObserverSink.unsubscribe).toHaveBeenCalledTimes(1); + expectIdle(profiler); profiler.setEnabled(true); - expect(profiler.state).toBe('running'); - expect(sink.isClosed()).toBe(false); - expect(sink.open).toHaveBeenCalledTimes(2); - expect(perfObserverSink.subscribe).toHaveBeenCalledTimes(2); - }); - - it('#transition method should execute all operations in running->closed case', () => { - const { sink, perfObserverSink, profiler } = getNodejsProfiler({ - enabled: true, - }); - - const parentSetEnabledSpy = vi.spyOn(Profiler.prototype, 'setEnabled'); - - expect(profiler.state).toBe('running'); - - profiler.close(); - - expect(parentSetEnabledSpy).toHaveBeenCalledWith(false); - expect(perfObserverSink.unsubscribe).toHaveBeenCalledTimes(1); - expect(sink.close).toHaveBeenCalledTimes(1); - expect(profiler.state).toBe('closed'); - - parentSetEnabledSpy.mockRestore(); + expectRunning(profiler); }); it('is idempotent for repeated operations', () => { - const { sink, perfObserverSink, profiler } = getNodejsProfiler({ - enabled: true, + const profiler = createProfiler({ + measureName: 'idempotent-operations', }); profiler.setEnabled(true); @@ -281,14 +395,12 @@ describe('NodejsProfiler', () => { profiler.close(); profiler.close(); - expect(sink.open).toHaveBeenCalledTimes(1); - expect(sink.close).toHaveBeenCalledTimes(1); - expect(perfObserverSink.subscribe).toHaveBeenCalledTimes(1); - expect(perfObserverSink.unsubscribe).toHaveBeenCalledTimes(1); + expect(profiler.state).toBe('closed'); }); it('rejects all lifecycle changes after close', () => { - const { perfObserverSink, profiler } = getNodejsProfiler({ + const profiler = createProfiler({ + measureName: 'lifecycle-after-close', enabled: false, }); @@ -302,97 +414,137 @@ 
describe('NodejsProfiler', () => { ); profiler.flush(); - expect(perfObserverSink.flush).not.toHaveBeenCalled(); + expect(profiler.state).toBe('closed'); }); - it('throws error for invalid state transition (defensive code)', () => { - const profiler = getNodejsProfiler({ enabled: true }).profiler; - - expect(profiler.state).toBe('running'); + it('throws for invalid transitions', () => { + const profiler = createTestProfiler({ + measureName: 'invalid-transition', + enabled: false, + }); - // Test invalid transition through public API - trying to transition to an invalid state - // Since we can't access private methods, we test that the profiler maintains valid state - // Invalid transitions are prevented by the type system and runtime checks - expect(() => { - // This should not throw since we're using the public API correctly - profiler.setEnabled(false); - profiler.setEnabled(true); - }).not.toThrow(); + expect(() => profiler.forceTransition('invalid')).toThrow( + 'Invalid transition: idle -> invalid', + ); }); }); describe('profiling operations', () => { - it('should expose filePath getter', () => { - const { profiler } = getNodejsProfiler({ enabled: true }); - expect(profiler.filePath).toMatchPath( - '/test/tmp/profiles/20240101-120000-000/trace.20240101-120000-000.12345.1.1.jsonl', + it('should expose shardPath in stats', () => { + const profiler = createProfiler({ + measureName: 'filepath-getter', + }); + // When measureName is provided, it's used as the groupId directory + expect(profiler.stats.shardPath).toContainPath( + 'tmp/profiles/filepath-getter', ); + expect(profiler.stats.shardPath).toMatch(/\.jsonl$/); }); - it('should use provided filename when specified', () => { - const customPath = path.join(process.cwd(), 'custom-trace.json'); - const { profiler } = getNodejsProfiler({ - filename: customPath, + it('should use measureName for final file path', () => { + const profiler = createProfiler({ + measureName: 'custom-filename', }); - 
expect(profiler.filePath).toBe(customPath); + const shardPath = profiler.stats.shardPath; + // shardPath uses the shard ID format: baseName.shardId.jsonl + expect(shardPath).toContainPath('tmp/profiles/custom-filename'); + expect(shardPath).toMatch(/trace\.\d{8}-\d{6}-\d{3}(?:\.\d+){3}\.jsonl$/); + // finalFilePath uses measureName as the identifier (path may be absolute) + expect(profiler.stats.finalFilePath).toContain( + path.join( + PROFILER_PERSIST_OUTDIR, + 'custom-filename', + 'trace.custom-filename.json', + ), + ); }); it('should use sharded path when filename is not provided', () => { - const { profiler } = getNodejsProfiler(); - const filePath = profiler.filePath; - expect(filePath).toMatchPath( - '/test/tmp/profiles/20240101-120000-000/trace.20240101-120000-000.12345.1.1.jsonl', - ); + const profiler = createProfiler('sharded-path'); + const filePath = profiler.stats.shardPath; + // When measureName is provided, it's used as the groupId directory + expect(filePath).toContainPath('tmp/profiles/sharded-path'); + expect(filePath).toMatch(/\.jsonl$/); }); it('should perform measurements when enabled', () => { - const { profiler } = getNodejsProfiler({ enabled: true }); + const profiler = createProfiler({ + measureName: 'measurements-enabled', + }); const result = profiler.measure('test-op', () => 'success'); expect(result).toBe('success'); }); it('should skip sink operations when disabled', () => { - const { sink, profiler } = getNodejsProfiler({ enabled: false }); + const profiler = createProfiler({ + measureName: 'sink-disabled', + enabled: false, + }); const result = profiler.measure('disabled-op', () => 'success'); expect(result).toBe('success'); - expect(sink.getWrittenItems()).toHaveLength(0); + // When disabled, no entries should be written + expect(profiler.stats.written).toBe(0); }); it('get stats() getter should return current stats', () => { - const { profiler } = getNodejsProfiler({ enabled: false }); + const profiler = createProfiler({ + 
measureName: 'stats-getter', + enabled: false, + }); - expect(profiler.stats).toStrictEqual({ + const stats = profiler.stats; + // shardPath uses dynamic shard ID format, so we check it matches the pattern + // Remove ^ and $ anchors from INSTANCE_ID pattern since we're embedding it + const instanceIdPattern = ID_PATTERNS.INSTANCE_ID.source.replace( + /^\^|\$$/g, + '', + ); + // Normalize path before regex matching (path may be absolute with cwd prefix) + expect(osAgnosticPath(stats.shardPath)).toMatch( + new RegExp( + `(.*/)?tmp/profiles/stats-getter/trace\\.${instanceIdPattern}\\.jsonl$`, + ), + ); + expect(stats.finalFilePath).toContain( + path.join('stats-getter', 'trace.stats-getter.json'), + ); + expect(stats).toMatchObject({ state: 'idle', + debug: false, + shardCount: 0, + groupId: 'stats-getter', + isCoordinator: true, + isFinalized: false, + isCleaned: false, + shardFiles: [], walOpen: false, isSubscribed: false, queued: 0, dropped: 0, written: 0, + lastRecovery: [], maxQueueSize: 10_000, flushThreshold: 20, addedSinceLastFlush: 0, buffered: true, - debug: false, }); + expect(stats.shardPath).toBe(stats.shardPath); }); it('flush() should flush when profiler is running', () => { - const { perfObserverSink, profiler } = getNodejsProfiler({ - enabled: true, + const profiler = createProfiler({ + measureName: 'flush-running', }); - - expect(profiler.state).toBe('running'); - - profiler.flush(); - - expect(perfObserverSink.flush).toHaveBeenCalledTimes(1); + expect(() => profiler.flush()).not.toThrow(); }); it('should propagate errors from measure work function', () => { - const { profiler } = getNodejsProfiler({ enabled: true }); + const profiler = createProfiler({ + measureName: 'measure-error', + }); const error = new Error('Test error'); expect(() => { @@ -403,7 +555,9 @@ describe('NodejsProfiler', () => { }); it('should propagate errors from measureAsync work function', async () => { - const { profiler } = getNodejsProfiler({ enabled: true }); + const 
profiler = createProfiler({ + measureName: 'measure-async-error', + }); const error = new Error('Async test error'); await expect( @@ -414,7 +568,10 @@ describe('NodejsProfiler', () => { }); it('should skip measurement when profiler is not active', () => { - const { profiler } = getNodejsProfiler({ enabled: false }); + const profiler = createProfiler({ + measureName: 'skip-measurement-inactive', + enabled: false, + }); let workCalled = false; const result = profiler.measure('inactive-test', () => { @@ -427,7 +584,10 @@ describe('NodejsProfiler', () => { }); it('should skip async measurement when profiler is not active', async () => { - const { profiler } = getNodejsProfiler({ enabled: false }); + const profiler = createProfiler({ + measureName: 'skip-async-inactive', + enabled: false, + }); let workCalled = false; const result = await profiler.measureAsync( @@ -443,7 +603,10 @@ describe('NodejsProfiler', () => { }); it('should skip marker when profiler is not active', () => { - const { profiler } = getNodejsProfiler({ enabled: false }); + const profiler = createProfiler({ + measureName: 'skip-marker-inactive', + enabled: false, + }); expect(() => { profiler.marker('inactive-marker'); @@ -477,95 +640,64 @@ describe('NodejsProfiler', () => { describe('debug mode', () => { it('should initialize debug flag to false when env var not set', () => { - const { profiler } = getNodejsProfiler(); + const profiler = createProfiler('debug-flag-false'); const stats = profiler.stats; expect(stats.debug).toBe(false); }); - it('should initialize debug flag from CP_PROFILER_DEBUG env var when set', () => { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = 'true'; + it('should initialize debug flag from DEBUG env var when set', () => { + vi.stubEnv(PROFILER_DEBUG_ENV_VAR, 'true'); - const { profiler } = getNodejsProfiler(); + const profiler = createProfiler('debug-flag-true'); const stats = profiler.stats; expect(stats.debug).toBe(true); }); 
it('should expose debug flag via getter', () => { - const { profiler } = getNodejsProfiler(); - expect(profiler.debug).toBe(false); + const profiler = createProfiler('debug-getter-false'); + expect(profiler.isDebugMode()).toBe(false); + expect(profiler.stats.debug).toBe(false); - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = 'true'; - const { profiler: debugProfiler } = getNodejsProfiler(); - expect(debugProfiler.debug).toBe(true); + vi.stubEnv(PROFILER_DEBUG_ENV_VAR, 'true'); + const debugProfiler = createProfiler('debug-getter-true'); + expect(debugProfiler.isDebugMode()).toBe(true); + expect(debugProfiler.stats.debug).toBe(true); }); + // eslint-disable-next-line vitest/expect-expect it('should create transition marker when debug is enabled and transitioning to running', () => { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = 'true'; - const { profiler } = getNodejsProfiler({ enabled: false }); + vi.stubEnv(PROFILER_DEBUG_ENV_VAR, 'true'); + const profiler = createProfiler({ + measureName: 'debug-transition-marker', + enabled: false, + }); performance.clearMarks(); - profiler.setEnabled(true); - const marks = performance.getEntriesByType('mark'); - const transitionMark = marks.find(mark => mark.name === 'idle->running'); - expect(transitionMark).toBeDefined(); - expect(transitionMark?.name).toBe('idle->running'); - }); - - it('should not create transition marker when transitioning from running to idle (profiler disabled)', () => { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = 'true'; - const { profiler } = getNodejsProfiler({ enabled: true }); - - performance.clearMarks(); - - profiler.setEnabled(false); - - const marks = performance.getEntriesByType('mark'); - const transitionMark = marks.find(mark => mark.name === 'running->idle'); - expect(transitionMark).toBeUndefined(); + expectTransitionMarker('idle->running'); }); + // 
eslint-disable-next-line vitest/expect-expect it('does not emit transition markers unless debug is enabled', () => { - const { profiler } = getNodejsProfiler(); + const profiler = createProfiler('no-transition-markers'); performance.clearMarks(); - profiler.setEnabled(true); - expect( - performance - .getEntriesByType('mark') - .some(m => m.name.startsWith('idle->running')), - ).toBe(false); + expectNoTransitionMarker('idle->running'); }); it('should include stats in transition marker properties when transitioning to running', () => { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = 'true'; - const { profiler, perfObserverSink } = getNodejsProfiler({ + vi.stubEnv(PROFILER_DEBUG_ENV_VAR, 'true'); + const profiler = createProfiler({ + measureName: 'debug-transition-stats', enabled: false, }); - perfObserverSink.getStats.mockReturnValue({ - isSubscribed: true, - queued: 5, - dropped: 2, - written: 10, - maxQueueSize: 10_000, - flushThreshold: 20, - addedSinceLastFlush: 3, - buffered: true, - }); - performance.clearMarks(); - profiler.setEnabled(true); const marks = performance.getEntriesByType('mark'); @@ -583,152 +715,14 @@ describe('NodejsProfiler', () => { // eslint-disable-next-line vitest/max-nested-describe describe('setDebugMode', () => { it('should enable debug mode when called with true', () => { - const { profiler } = getNodejsProfiler(); - expect(profiler.debug).toBe(false); - - profiler.setDebugMode(true); - - expect(profiler.debug).toBe(true); - expect(profiler.stats.debug).toBe(true); - }); - - it('should disable debug mode when called with false', () => { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = 'true'; - const { profiler } = getNodejsProfiler(); - expect(profiler.debug).toBe(true); - - profiler.setDebugMode(false); - - expect(profiler.debug).toBe(false); - expect(profiler.stats.debug).toBe(false); - }); - - it('should create transition markers after enabling debug 
mode', () => { - const { profiler } = getNodejsProfiler({ enabled: false }); - expect(profiler.debug).toBe(false); - - performance.clearMarks(); - profiler.setEnabled(true); - expect( - performance - .getEntriesByType('mark') - .some(m => m.name.startsWith('idle->running')), - ).toBe(false); - - profiler.setEnabled(false); - profiler.setDebugMode(true); - performance.clearMarks(); - - profiler.setEnabled(true); - - const marks = performance.getEntriesByType('mark'); - const transitionMark = marks.find( - mark => mark.name === 'idle->running', - ); - expect(transitionMark).toBeDefined(); - expect(transitionMark?.name).toBe('idle->running'); - }); - - it('should stop creating transition markers after disabling debug mode', () => { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = 'true'; - const { profiler } = getNodejsProfiler({ enabled: false }); - expect(profiler.debug).toBe(true); - - profiler.setDebugMode(false); - performance.clearMarks(); - - profiler.setEnabled(true); - - expect( - performance - .getEntriesByType('mark') - .some(m => m.name.startsWith('idle->running')), - ).toBe(false); - }); - - it('should be idempotent when called multiple times with true', () => { - const { profiler } = getNodejsProfiler(); - expect(profiler.debug).toBe(false); - - profiler.setDebugMode(true); - profiler.setDebugMode(true); - profiler.setDebugMode(true); - - expect(profiler.debug).toBe(true); - expect(profiler.stats.debug).toBe(true); - }); - - it('should be idempotent when called multiple times with false', () => { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = 'true'; - const { profiler } = getNodejsProfiler(); - expect(profiler.debug).toBe(true); - - profiler.setDebugMode(false); - profiler.setDebugMode(false); - profiler.setDebugMode(false); - - expect(profiler.debug).toBe(false); + const profiler = createProfiler('set-debug-true'); + expect(profiler.isDebugMode()).toBe(false); 
         expect(profiler.stats.debug).toBe(false);
-      });
-
-      it('should work when profiler is in idle state', () => {
-        const { profiler } = getNodejsProfiler({ enabled: false });
-        expect(profiler.state).toBe('idle');
-        expect(profiler.debug).toBe(false);

         profiler.setDebugMode(true);
-        expect(profiler.debug).toBe(true);
-        expect(profiler.stats.debug).toBe(true);
-      });
-
-      it('should work when profiler is in running state', () => {
-        const { profiler } = getNodejsProfiler({ enabled: true });
-        expect(profiler.state).toBe('running');
-        expect(profiler.debug).toBe(false);
-        profiler.setDebugMode(true);
-        expect(profiler.debug).toBe(true);
+        expect(profiler.isDebugMode()).toBe(true);
         expect(profiler.stats.debug).toBe(true);
-
-        performance.clearMarks();
-        profiler.setEnabled(false);
-        profiler.setEnabled(true);
-
-        const marks = performance.getEntriesByType('mark');
-        const transitionMark = marks.find(
-          mark => mark.name === 'idle->running',
-        );
-        expect(transitionMark).toBeDefined();
-      });
-
-      it('should work when profiler is in closed state', () => {
-        const { profiler } = getNodejsProfiler({ enabled: false });
-        profiler.close();
-        expect(profiler.state).toBe('closed');
-        expect(profiler.debug).toBe(false);
-
-        profiler.setDebugMode(true);
-        expect(profiler.debug).toBe(true);
-        expect(profiler.stats.debug).toBe(true);
-      });
-
-      it('should toggle debug mode multiple times', () => {
-        const { profiler } = getNodejsProfiler({ enabled: false });
-
-        profiler.setDebugMode(true);
-        expect(profiler.debug).toBe(true);
-
-        profiler.setDebugMode(false);
-        expect(profiler.debug).toBe(false);
-
-        profiler.setDebugMode(true);
-        expect(profiler.debug).toBe(true);
-
-        profiler.setDebugMode(false);
-        expect(profiler.debug).toBe(false);
       });
     });
   });
@@ -736,60 +730,16 @@ describe('NodejsProfiler', () => {
   describe('exit handlers', () => {
     const mockSubscribeProcessExit = vi.mocked(subscribeProcessExit);

-    let capturedOnError:
-      | ((
-          error: unknown,
-          kind: 'uncaughtException' | 'unhandledRejection',
-        ) => void)
-      | undefined;
-    let capturedOnExit:
-      | ((
-          code: number,
-          reason: import('../exit-process.js').CloseReason,
-        ) => void)
-      | undefined;
-    const createProfiler = (
-      overrides?: Partial<
-        NodejsProfilerOptions>
-      >,
-    ) => {
-      const sink = new MockTraceEventFileSink();
-      vi.spyOn(sink, 'open');
-      vi.spyOn(sink, 'close');
-      vi.spyOn(WalModule, 'WriteAheadLogFile').mockImplementation(
-        () => sink as any,
-      );
-      return new NodejsProfiler({
-        prefix: 'cp',
-        track: 'test-track',
-        encodePerfEntry: simpleEncoder,
-        ...overrides,
-      });
-    };
-
-    let profiler: NodejsProfiler<
-      string,
-      Record
-    >;
-
     beforeEach(() => {
-      capturedOnError = undefined;
-      capturedOnExit = undefined;
-
-      mockSubscribeProcessExit.mockImplementation(options => {
-        capturedOnError = options?.onError;
-        capturedOnExit = options?.onExit;
-        return vi.fn();
-      });
       performance.clearMarks();
       performance.clearMeasures();
-      // eslint-disable-next-line functional/immutable-data
-      delete process.env.CP_PROFILING;
+      resetEnv();
     });

     it('installs exit handlers on construction', () => {
-      expect(() => createProfiler()).not.toThrow();
+      expect(() =>
+        createSimpleProfiler({ measureName: 'exit-handlers-install' }),
+      ).not.toThrow();

       expect(mockSubscribeProcessExit).toHaveBeenCalledWith({
         onError: expect.any(Function),
@@ -798,7 +748,9 @@ describe('NodejsProfiler', () => {
     });

     it('setEnabled toggles profiler state', () => {
-      profiler = createProfiler({ enabled: true });
+      const profiler = createSimpleProfiler({
+        measureName: 'exit-set-enabled',
+      });
       expect(profiler.isEnabled()).toBe(true);

       profiler.setEnabled(false);
@@ -809,10 +761,15 @@ describe('NodejsProfiler', () => {
     });

     it('marks fatal errors and shuts down profiler on uncaughtException', () => {
-      profiler = createProfiler({ enabled: true });
+      const handlers = captureExitHandlers();
+      expect(() =>
+        createSimpleProfiler({
+          measureName: 'exit-uncaught-exception',
+        }),
+      ).not.toThrow();
       const testError = new Error('Test fatal error');

-      capturedOnError?.call(profiler, testError, 'uncaughtException');
+      handlers.onError?.(testError, 'uncaughtException');

       expect(performance.getEntriesByType('mark')).toStrictEqual([
         {
@@ -836,14 +793,13 @@ describe('NodejsProfiler', () => {
     });

     it('marks fatal errors and shuts down profiler on unhandledRejection', () => {
-      profiler = createProfiler({ enabled: true });
+      const handlers = captureExitHandlers();
+      const profiler = createSimpleProfiler({
+        measureName: 'exit-unhandled-rejection',
+      });
       expect(profiler.isEnabled()).toBe(true);

-      capturedOnError?.call(
-        profiler,
-        new Error('Test fatal error'),
-        'unhandledRejection',
-      );
+      handlers.onError?.(new Error('Test fatal error'), 'unhandledRejection');

       expect(performance.getEntriesByType('mark')).toStrictEqual([
         {
@@ -867,11 +823,14 @@ describe('NodejsProfiler', () => {
     });

     it('exit handler shuts down profiler', () => {
-      profiler = createProfiler({ enabled: true });
+      const handlers = captureExitHandlers();
+      const profiler = createSimpleProfiler({
+        measureName: 'exit-handler-shutdown',
+      });
       const closeSpy = vi.spyOn(profiler, 'close');
       expect(profiler.isEnabled()).toBe(true);

-      capturedOnExit?.(0, { kind: 'exit' });
+      handlers.onExit?.(0, { kind: 'exit' });

       expect(profiler.isEnabled()).toBe(false);
       expect(closeSpy).toHaveBeenCalledTimes(1);
@@ -881,7 +840,10 @@ describe('NodejsProfiler', () => {
       const unsubscribeFn = vi.fn();
       mockSubscribeProcessExit.mockReturnValue(unsubscribeFn);

-      profiler = createProfiler({ enabled: false });
+      const profiler = createSimpleProfiler({
+        measureName: 'exit-close-unsubscribe',
+        enabled: false,
+      });
       expect(profiler.isEnabled()).toBe(false);

       expect(mockSubscribeProcessExit).toHaveBeenCalled();
diff --git a/packages/utils/src/lib/profiler/profiler.int.test.ts b/packages/utils/src/lib/profiler/profiler.int.test.ts
index 1ee4763d66..a21f07554c 100644
--- a/packages/utils/src/lib/profiler/profiler.int.test.ts
+++ b/packages/utils/src/lib/profiler/profiler.int.test.ts
@@ -1,72 +1,44 @@
-import type { ActionTrackEntryPayload } from '../user-timing-extensibility-api.type.js';
-import { Profiler } from './profiler.js';
+import type { ActionTrackConfigs } from '../user-timing-extensibility-api-utils.js';
+import { Profiler, type ProfilerOptions } from './profiler.js';

 describe('Profiler Integration', () => {
-  let profiler: Profiler>;
-
-  beforeEach(() => {
-    performance.clearMarks();
-    performance.clearMeasures();
-
-    profiler = new Profiler({
+  function profiler(opt?: ProfilerOptions): Profiler {
+    return new Profiler({
+      ...opt,
       prefix: 'cp',
       track: 'CLI',
       trackGroup: 'Code Pushup',
-      color: 'primary-dark',
+      enabled: true,
       tracks: {
         utils: { track: 'Utils', color: 'primary' },
-        core: { track: 'Core', color: 'primary-light' },
       },
-      enabled: true,
     });
+  }
+
+  beforeEach(() => {
+    performance.clearMarks();
+    performance.clearMeasures();
+    // Don't stub env var to undefined - let profiler respect enabled: true option
+    // The profiler constructor uses: enabled ?? isEnvVarEnabled(PROFILER_ENABLED_ENV_VAR)
+    // So if enabled is explicitly true, it will use that value
   });

   it('should create complete performance timeline for sync operation', () => {
+    const p = profiler();
     expect(
-      profiler.measure('sync-test', () =>
+      p.measure('sync-test', () =>
         Array.from({ length: 1000 }, (_, i) => i).reduce(
           (sum, num) => sum + num,
           0,
         ),
       ),
     ).toBe(499_500);
-
-    const marks = performance.getEntriesByType('mark');
-    const measures = performance.getEntriesByType('measure');
-
-    expect(marks).toStrictEqual(
-      expect.arrayContaining([
-        expect.objectContaining({
-          name: 'cp:sync-test:start',
-          detail: expect.objectContaining({
-            devtools: expect.objectContaining({ dataType: 'track-entry' }),
-          }),
-        }),
-        expect.objectContaining({
-          name: 'cp:sync-test:end',
-          detail: expect.objectContaining({
-            devtools: expect.objectContaining({ dataType: 'track-entry' }),
-          }),
-        }),
-      ]),
-    );
-
-    expect(measures).toStrictEqual(
-      expect.arrayContaining([
-        expect.objectContaining({
-          name: 'cp:sync-test',
-          duration: expect.any(Number),
-          detail: expect.objectContaining({
-            devtools: expect.objectContaining({ dataType: 'track-entry' }),
-          }),
-        }),
-      ]),
-    );
   });

   it('should create complete performance timeline for async operation', async () => {
+    const p = profiler();
     await expect(
-      profiler.measureAsync('async-test', async () => {
+      p.measureAsync('async-test', async () => {
         await new Promise(resolve => setTimeout(resolve, 10));
         return 'async-result';
       }),
@@ -106,8 +78,9 @@ describe('Profiler Integration', () => {
   });

   it('should handle nested measurements correctly', () => {
-    profiler.measure('outer', () => {
-      profiler.measure('inner', () => 'inner-result');
+    const p = profiler();
+    p.measure('outer', () => {
+      p.measure('inner', () => 'inner-result');
       return 'outer-result';
     });
@@ -134,7 +107,8 @@ describe('Profiler Integration', () => {
   });

   it('should create markers with proper metadata', () => {
-    profiler.marker('test-marker', {
+    const p = profiler();
+    p.marker('test-marker', {
       color: 'warning',
       tooltipText: 'Test marker tooltip',
       properties: [
@@ -165,131 +139,48 @@ describe('Profiler Integration', () => {
   });

   it('should create proper DevTools payloads for tracks', () => {
-    profiler.measure('track-test', (): string => 'result', {
+    const p = profiler();
+    p.measure('track-test', (): string => 'result', {
       success: result => ({
-        properties: [['result', result]],
-        tooltipText: 'Track test completed',
+        track: 'Track 1',
+        trackGroup: 'Group 1',
+        color: 'secondary-dark',
+        properties: [['secondary', result]],
+        tooltipText: 'Track test secondary',
       }),
     });

     const measures = performance.getEntriesByType('measure');
-    expect(measures).toStrictEqual(
+    expect(measures).toEqual(
       expect.arrayContaining([
         expect.objectContaining({
           name: 'cp:track-test',
-          detail: {
-            devtools: expect.objectContaining({
-              dataType: 'track-entry',
-              track: 'CLI',
-              trackGroup: 'Code Pushup',
-              color: 'primary-dark',
-              properties: [['result', 'result']],
-              tooltipText: 'Track test completed',
-            }),
-          },
-        }),
-      ]),
-    );
-  });
-
-  it('should merge track defaults with measurement options', () => {
-    profiler.measure('sync-op', () => 'sync-result', {
-      success: result => ({
-        properties: [
-          ['operation', 'sync'],
-          ['result', result],
-        ],
-      }),
-    });
-
-    const measures = performance.getEntriesByType('measure');
-    expect(measures).toStrictEqual(
-      expect.arrayContaining([
-        expect.objectContaining({
-          name: 'cp:sync-op',
-          detail: {
+          detail: expect.objectContaining({
             devtools: expect.objectContaining({
               dataType: 'track-entry',
-              track: 'CLI',
-              trackGroup: 'Code Pushup',
-              color: 'primary-dark',
-              properties: [
-                ['operation', 'sync'],
-                ['result', 'sync-result'],
-              ],
-            }),
-          },
-        }),
-      ]),
-    );
-  });
-
-  it('should mark errors with red color in DevTools', () => {
-    const error = new Error('Test error');
-
-    expect(() => {
-      profiler.measure('error-test', () => {
-        throw error;
-      });
-    }).toThrow(error);
-
-    const measures = performance.getEntriesByType('measure');
-    expect(measures).toStrictEqual(
-      expect.arrayContaining([
-        expect.objectContaining({
-          detail: {
-            devtools: expect.objectContaining({
-              color: 'error',
-              properties: expect.arrayContaining([
-                ['Error Type', 'Error'],
-                ['Error Message', 'Test error'],
-              ]),
-            }),
-          },
-        }),
-      ]),
-    );
-  });
-
-  it('should include error metadata in DevTools properties', () => {
-    const customError = new TypeError('Custom type error');
-
-    expect(() => {
-      profiler.measure('custom-error-test', () => {
-        throw customError;
-      });
-    }).toThrow(customError);
-
-    const measures = performance.getEntriesByType('measure');
-    expect(measures).toStrictEqual(
-      expect.arrayContaining([
-        expect.objectContaining({
-          detail: {
-            devtools: expect.objectContaining({
-              properties: expect.arrayContaining([
-                ['Error Type', 'TypeError'],
-                ['Error Message', 'Custom type error'],
-              ]),
+              track: 'Track 1',
+              trackGroup: 'Group 1',
+              color: 'secondary-dark',
+              properties: [['secondary', 'result']],
+              tooltipText: 'Track test secondary',
             }),
-          },
+          }),
         }),
       ]),
     );
   });

   it('should not create performance entries when disabled', async () => {
-    profiler.setEnabled(false);
+    const p = profiler();
+    p.setEnabled(false);

-    const syncResult = profiler.measure('disabled-sync', () => 'sync');
+    const syncResult = p.measure('disabled-sync', () => 'sync');
     expect(syncResult).toBe('sync');

-    const asyncResult = profiler.measureAsync(
-      'disabled-async',
-      async () => 'async',
-    );
+    const asyncResult = p.measureAsync('disabled-async', async () => 'async');
     await expect(asyncResult).resolves.toBe('async');

-    profiler.marker('disabled-marker');
+    p.marker('disabled-marker');

     expect(performance.getEntriesByType('mark')).toHaveLength(0);
     expect(performance.getEntriesByType('measure')).toHaveLength(0);
diff --git a/packages/utils/src/lib/profiler/profiler.ts b/packages/utils/src/lib/profiler/profiler.ts
index e2b2f3b88a..322b813d87 100644
--- a/packages/utils/src/lib/profiler/profiler.ts
+++ b/packages/utils/src/lib/profiler/profiler.ts
@@ -16,7 +16,10 @@ import type {
   DevToolsColor,
   EntryMeta,
 } from '../user-timing-extensibility-api.type.js';
-import { PROFILER_ENABLED_ENV_VAR } from './constants.js';
+import {
+  PROFILER_DEBUG_ENV_VAR,
+  PROFILER_ENABLED_ENV_VAR,
+} from './constants.js';

 /**
  * Generates a unique profiler ID based on performance time origin, process ID, thread ID, and instance count.
@@ -35,8 +38,6 @@ type ProfilerMeasureOptions =
   MeasureCtxOptions & {
     /** Custom track configurations that will be merged with default settings */
     tracks?: Record>;
-    /** Whether profiling should be enabled (defaults to CP_PROFILING env var) */
-    enabled?: boolean;
   };

 /**
@@ -44,6 +45,16 @@ type ProfilerMeasureOptions =
  */
 export type MarkerOptions = EntryMeta & { color?: DevToolsColor };

+export type ProfilerStateOptions = {
+  /** Whether profiling should be enabled (defaults to CP_PROFILING env var) */
+  enabled?: boolean;
+  /**
+   * When set to true, profiler creates debug logs in traces.
+   *
+   * @default false
+   */
+  debug?: boolean;
+};

 /**
  * Options for configuring a Profiler instance.
  *
@@ -59,7 +70,7 @@ export type MarkerOptions = EntryMeta & { color?: DevToolsColor };
  * @property tracks - Custom track configurations merged with defaults
  */
 export type ProfilerOptions =
-  ProfilerMeasureOptions;
+  ProfilerStateOptions & ProfilerMeasureOptions;

 /**
  * Performance profiler that creates structured timing measurements with Chrome DevTools Extensibility API payloads.
@@ -71,11 +82,24 @@ export type ProfilerOptions =
 export class Profiler {
   static instanceCount = 0;
   readonly id = getProfilerId();
+  /**
+   * Whether debug mode is enabled for profiler state transitions.
+   * When enabled, profiler state transitions create performance marks for debugging.
+   */
+  #debug: boolean = false;
   #enabled: boolean = false;
   readonly #defaults: ActionTrackEntryPayload;
   readonly tracks: Record | undefined;
   readonly #ctxOf: ReturnType;

+  /**
+   * Protected method to set debug mode state.
+   * Allows subclasses to update debug state.
+   */
+  protected setDebugState(debugMode: boolean): void {
+    this.#debug = debugMode;
+  }
+
   /**
    * Creates a new Profiler instance with the specified configuration.
    *
@@ -89,10 +113,11 @@ export class Profiler {
    *
    */
   constructor(options: ProfilerOptions) {
-    const { tracks, prefix, enabled, ...defaults } = options;
+    const { tracks, prefix, enabled, debug, ...defaults } = options;
     const dataType = 'track-entry';

     this.#enabled = enabled ?? isEnvVarEnabled(PROFILER_ENABLED_ENV_VAR);
+    this.#debug = debug ?? isEnvVarEnabled(PROFILER_DEBUG_ENV_VAR);
     this.#defaults = { ...defaults, dataType };
     this.tracks = tracks
       ? setupTracks({ ...defaults, dataType }, tracks)
@@ -128,6 +153,29 @@ export class Profiler {
     return this.#enabled;
   }

+  /**
+   * Sets debug mode state for this profiler.
+   *
+   * This means any future {@link Profiler} instantiations (including child processes) will use the same debug state.
+   *
+   * @param debugMode - Whether debug mode should be enabled
+   */
+  setDebugMode(debugMode: boolean): void {
+    process.env[PROFILER_DEBUG_ENV_VAR] = `${debugMode}`;
+    this.#debug = debugMode;
+  }
+
+  /**
+   * Checks whether debug mode is enabled
+   * (defaults to the CP_PROFILER_DEBUG environment variable).
+   *
+   * @returns Whether debug mode is currently enabled
+   */
+  isDebugMode(): boolean {
+    return this.#debug;
+  }
+
   /**
    * Creates a performance mark including payload for a Chrome DevTools 'marker' item.
   *
diff --git a/packages/utils/src/lib/profiler/profiler.unit.test.ts b/packages/utils/src/lib/profiler/profiler.unit.test.ts
index bf047e3fa3..c37b1b5203 100644
--- a/packages/utils/src/lib/profiler/profiler.unit.test.ts
+++ b/packages/utils/src/lib/profiler/profiler.unit.test.ts
@@ -1,7 +1,6 @@
 import { performance } from 'node:perf_hooks';
 import { threadId } from 'node:worker_threads';
 import { beforeEach, describe, expect, it, vi } from 'vitest';
-import type { ActionTrackEntryPayload } from '../user-timing-extensibility-api.type.js';
 import { Profiler, type ProfilerOptions, getProfilerId } from './profiler.js';

 vi.mock('../exit-process.js');
@@ -22,18 +21,15 @@ describe('Profiler', () => {
     new Profiler({
       prefix: 'cp',
       track: 'test-track',
+      enabled: true,
       ...overrides,
     });

-  let profiler: Profiler>;
-
   beforeEach(() => {
     performance.clearMarks();
     performance.clearMeasures();
     // eslint-disable-next-line functional/immutable-data
     delete process.env.CP_PROFILING;
-
-    profiler = getProfiler();
   });

   it('should create profiler instances', () => {
@@ -48,7 +44,7 @@ describe('Profiler', () => {
   });

   it('constructor should use defaults for measure', () => {
-    const customProfiler = getProfiler({ color: 'secondary', enabled: true });
+    const customProfiler = getProfiler({ color: 'secondary' });

     const result = customProfiler.measure('test-operation', () => 'success');
@@ -123,7 +119,17 @@ describe('Profiler', () => {
     });
   });

+  it('setDebugState should update debug flag in subclasses', () => {
+    const testProfiler = getProfiler({
+      prefix: 'cp',
+      track: 'test-track',
+      debug: true,
+    });
+    expect(testProfiler.isDebugMode()).toBe(true);
+  });
+
   it('isEnabled should set and get enabled state', () => {
+    const profiler = getProfiler({ enabled: false });
     expect(profiler.isEnabled()).toBe(false);

     profiler.setEnabled(true);
@@ -147,7 +153,7 @@ describe('Profiler', () => {
   });

   it('marker should execute without error when enabled', () => {
-    const enabledProfiler = getProfiler({ enabled: true });
+    const enabledProfiler = getProfiler();
     expect(() => {
       enabledProfiler.marker('test-marker', {
         color: 'primary',
@@ -175,7 +181,7 @@ describe('Profiler', () => {
   it('marker should execute without error when enabled with default color', () => {
     performance.clearMarks();

-    const profilerWithColor = getProfiler({ color: 'primary', enabled: true });
+    const profilerWithColor = getProfiler({ color: 'primary' });

     expect(() => {
       profilerWithColor.marker('test-marker-default-color', {
@@ -199,7 +205,7 @@ describe('Profiler', () => {
   });

   it('marker should execute without error when enabled with no default color', () => {
-    const profilerNoColor = getProfiler({ enabled: true });
+    const profilerNoColor = getProfiler();

     expect(() => {
       profilerNoColor.marker('test-marker-no-color', {
@@ -243,7 +249,7 @@ describe('Profiler', () => {
     performance.clearMarks();
     performance.clearMeasures();

-    const enabledProfiler = getProfiler({ enabled: true });
+    const enabledProfiler = getProfiler();
     const workFn = vi.fn(() => 'result');
     const result = enabledProfiler.measure('test-event', workFn, {
       color: 'primary',
@@ -291,6 +297,7 @@ describe('Profiler', () => {
   });

   it('measure should always execute work function', () => {
+    const profiler = getProfiler();
     const workFn = vi.fn(() => 'result');
     const result = profiler.measure('test-event', workFn);
@@ -299,6 +306,7 @@ describe('Profiler', () => {
   });

   it('measure should propagate errors when enabled', () => {
+    const profiler = getProfiler();
     const error = new Error('Test error');
     const workFn = vi.fn(() => {
       throw error;
@@ -309,6 +317,7 @@ describe('Profiler', () => {
   });

   it('measure should propagate errors', () => {
+    const profiler = getProfiler();
     const error = new Error('Test error');
     const workFn = vi.fn(() => {
       throw error;
@@ -319,7 +328,7 @@ describe('Profiler', () => {
   });

   it('measure should propagate errors when enabled and call error callback', () => {
-    const enabledProfiler = getProfiler({ enabled: true });
+    const enabledProfiler = getProfiler();
     const error = new Error('Enabled test error');
     const workFn = vi.fn(() => {
       throw error;
@@ -357,7 +366,7 @@ describe('Profiler', () => {
   });

   it('measureAsync should handle async operations correctly when enabled', async () => {
-    const enabledProfiler = getProfiler({ enabled: true });
+    const enabledProfiler = getProfiler();
     const workFn = vi.fn(async () => {
       await Promise.resolve();
       return 'async-result';
@@ -416,6 +425,7 @@ describe('Profiler', () => {
   });

   it('measureAsync should propagate async errors when enabled', async () => {
+    const profiler = getProfiler();
     const error = new Error('Async test error');
     const workFn = vi.fn(async () => {
       await Promise.resolve();
@@ -429,7 +439,7 @@ describe('Profiler', () => {
   });

   it('measureAsync should propagate async errors when enabled and call error callback', async () => {
-    const enabledProfiler = getProfiler({ enabled: true });
+    const enabledProfiler = getProfiler();
     const error = new Error('Enabled async test error');
     const workFn = vi.fn(async () => {
       await Promise.resolve();
diff --git a/packages/utils/src/lib/profiler/trace-file-utils.ts b/packages/utils/src/lib/profiler/trace-file-utils.ts
index 038c35b73e..8a7d89d47c 100644
--- a/packages/utils/src/lib/profiler/trace-file-utils.ts
+++ b/packages/utils/src/lib/profiler/trace-file-utils.ts
@@ -6,6 +6,7 @@ import type {
 } from 'node:perf_hooks';
 import { threadId } from 'node:worker_threads';
 import { defaultClock } from '../clock-epoch.js';
+import type { WalRecord } from '../wal.js';
 import type {
   TraceEvent,
   TraceEventContainer,
@@ -202,12 +203,14 @@ export const measureToSpanEvents = (
  * @param entry - Performance entry
  * @returns Array of trace events
  */
-export function entryToTraceEvents(entry: PerformanceEntry): TraceEvent[] {
+export function entryToTraceEvents(
+  entry: PerformanceEntry,
+): T[] {
   if (entry.entryType === 'mark') {
-    return [markToInstantEvent(entry as PerformanceMark)];
+    return [markToInstantEvent(entry as PerformanceMark) as T];
   }
   if (entry.entryType === 'measure') {
-    return measureToSpanEvents(entry as PerformanceMeasure);
+    return measureToSpanEvents(entry as PerformanceMeasure) as T[];
   }
   return [];
 }
@@ -263,7 +266,7 @@ export const decodeEvent = (e: TraceEvent): TraceEvent => {
  * @param event - Trace event to serialize
  * @returns JSON string representation of the encoded trace event
  */
-export const serializeTraceEvent = (event: TraceEvent): string =>
+export const serializeTraceEvent = (event: TraceEvent & WalRecord): string =>
   JSON.stringify(encodeEvent(event));

 /**
diff --git a/packages/utils/src/lib/profiler/wal-json-trace.ts b/packages/utils/src/lib/profiler/wal-json-trace.ts
index a0690d8f1e..70b7f4f439 100644
--- a/packages/utils/src/lib/profiler/wal-json-trace.ts
+++ b/packages/utils/src/lib/profiler/wal-json-trace.ts
@@ -1,5 +1,5 @@
 import { defaultClock } from '../clock-epoch.js';
-import type { InvalidEntry, WalFormat } from '../wal.js';
+import type { Codec, WalFormat, WalRecord } from '../wal.js';
 import { PROFILER_PERSIST_BASENAME } from './constants.js';
 import {
   complete,
@@ -57,30 +57,38 @@ export function generateTraceContent(
 /**
  * Codec for encoding and decoding trace events.
  * Encodes nested objects in args.detail and args.data.detail to JSON strings for storage.
+ * Decode returns TraceEvent; use with WalFormat when T extends TraceEvent.
  */
-export const traceEventCodec = {
+export const traceEventCodec: Codec<
+  TraceEvent & WalRecord,
+  string,
+  TraceEvent
+> = {
   encode: serializeTraceEvent,
   decode: deserializeTraceEvent,
 };

 /**
- * Creates a WAL (Write-Ahead Logging) format configuration for Chrome DevTools trace files.
- * Automatically finalizes shards into complete trace files with proper metadata and margin events.
- * @returns WalFormat configuration object with baseName, codec, extensions, and finalizer
+ * WAL format for Chrome DevTools trace files (codec decodes to TraceEvent).
+ * Use getTraceEventWalFormat() to get a format typed for subtype T.
  */
-export function traceEventWalFormat() {
-  return {
-    baseName: PROFILER_PERSIST_BASENAME,
-    walExtension: '.jsonl',
-    finalExtension: '.json',
-    codec: traceEventCodec,
-    finalizer: (
-      records: (TraceEvent | InvalidEntry)[],
-      metadata?: Record,
-    ) =>
-      generateTraceContent(
-        records.filter((r): r is TraceEvent => !('__invalid' in (r as object))),
-        metadata,
-      ),
-  } satisfies WalFormat;
+export const traceEventWalFormat: WalFormat<
+  TraceEvent & WalRecord,
+  TraceEvent
+> = {
+  baseName: PROFILER_PERSIST_BASENAME,
+  walExtension: '.jsonl',
+  finalExtension: '.json',
+  codec: traceEventCodec,
+  finalizer: generateTraceContent,
+};
+
+/**
+ * Returns the default trace format for use when no custom format is passed.
+ * Typed as WalFormat so it is assignable without casts when T extends TraceEvent.
+ */
+export function getTraceEventWalFormat<
+  T extends TraceEvent & WalRecord,
+>(): WalFormat {
+  return traceEventWalFormat;
 }
diff --git a/packages/utils/src/lib/profiler/wal-json-trace.unit.test.ts b/packages/utils/src/lib/profiler/wal-json-trace.unit.test.ts
index 8b25465f72..ae606eedfd 100644
--- a/packages/utils/src/lib/profiler/wal-json-trace.unit.test.ts
+++ b/packages/utils/src/lib/profiler/wal-json-trace.unit.test.ts
@@ -428,7 +428,7 @@ describe('generateTraceContent', () => {

 describe('traceEventWalFormat', () => {
   it('should create WAL format with default directory', () => {
-    const format = traceEventWalFormat();
+    const format = traceEventWalFormat;

     expect(format).toStrictEqual({
       baseName: 'trace',
@@ -443,7 +443,7 @@ describe('traceEventWalFormat', () => {
   });

   it('should create WAL format with consistent structure', () => {
-    const format = traceEventWalFormat();
+    const format = traceEventWalFormat;

     expect(format.baseName).toBe('trace');
     expect(format.walExtension).toBe('.jsonl');
@@ -451,7 +451,7 @@ describe('traceEventWalFormat', () => {
   });

   it('should encode and decode trace events correctly', () => {
-    const format = traceEventWalFormat();
+    const format = traceEventWalFormat;
     const testEvent: TraceEvent = {
       name: 'cp:test-event',
       ph: 'I',
@@ -472,7 +472,7 @@ describe('traceEventWalFormat', () => {
   });

   it('should maintain consistency through decode -> encode -> decode round-trip', () => {
-    const format = traceEventWalFormat();
+    const format = traceEventWalFormat;
     const originalEvent: TraceEvent = {
       name: 'cp:round-trip-test',
       ph: 'I',
@@ -498,7 +498,7 @@ describe('traceEventWalFormat', () => {
   });

   it('should finalize records into trace content', () => {
-    const format = traceEventWalFormat();
+    const format = traceEventWalFormat;
     const records: TraceEvent[] = [
       {
         name: 'cp:operation:start',
diff --git a/packages/utils/src/lib/user-timing-extensibility-api-utils.ts b/packages/utils/src/lib/user-timing-extensibility-api-utils.ts
index fedae9fa31..2eca4f3bf5 100644
--- a/packages/utils/src/lib/user-timing-extensibility-api-utils.ts
+++ b/packages/utils/src/lib/user-timing-extensibility-api-utils.ts
@@ -332,7 +332,7 @@ export function mergeDevtoolsPayload<
 }

 export type ActionTrackConfigs = Record<
   T,
-  ActionTrackEntryPayload
+  Omit
 >;

 /**
  * Sets up tracks with default values merged into each track.
diff --git a/packages/utils/src/lib/wal-sharded.ts b/packages/utils/src/lib/wal-sharded.ts
index 1a38238ead..7274b65c4b 100644
--- a/packages/utils/src/lib/wal-sharded.ts
+++ b/packages/utils/src/lib/wal-sharded.ts
@@ -81,32 +81,6 @@ export function getShardId(): string {
   return `${getUniqueTimeId()}.${process.pid}.${threadId}.${ShardedWalCounter.next()}`;
 }

-/**
- * @TODO remove in PR https://github.com/code-pushup/cli/pull/1231 in favour of class method getShardedFileName
- * Generates a path to a shard file using human-readable IDs.
- * Both groupId and shardId are already in readable date format.
- *
- * Example with groupId "20240101-120000-000" and shardId "20240101-120000-000.12345.1.1":
- * Full path: /base/20240101-120000-000/trace.20240101-120000-000.12345.1.1.log
- *
- * @param opt.dir - The directory to store the shard file
- * @param opt.format - The WalFormat to use for the shard file
- * @param opt.groupId - The human-readable group ID (yyyymmdd-hhmmss-ms format)
- * @param opt.shardId - The human-readable shard ID (readable-timestamp.pid.threadId.count format)
- * @returns The path to the shard file
- */
-export function getShardedPath(opt: {
-  dir?: string;
-  format: WalFormat;
-  groupId: string;
-  shardId: string;
-}): string {
-  const { dir = '', format, groupId, shardId } = opt;
-  const { baseName, walExtension } = format;
-
-  return path.join(dir, groupId, `${baseName}.${shardId}${walExtension}`);
-}
-
 /**
  * Sharded Write-Ahead Log manager for coordinating multiple WAL shards.
  * Handles distributed logging across multiple processes/files with atomic finalization.
@@ -122,7 +96,7 @@ export class ShardedWal {
   });
   readonly groupId: string;
   readonly #debug: boolean = false;
-  readonly #format: WalFormat;
+  readonly #format: WalFormat;
   readonly #dir: string = process.cwd();
   readonly #coordinatorIdEnvVar: string;
   #state: 'active' | 'finalized' | 'cleaned' = 'active';
@@ -172,7 +146,7 @@ export class ShardedWal {
   constructor(opt: {
     debug?: boolean;
     dir?: string;
-    format: WalFormat;
+    format: WalFormat;
     groupId?: string;
     coordinatorIdEnvVar: string;
     autoCoordinator?: boolean;
@@ -430,7 +404,6 @@ export class ShardedWal {
       isFinalized: this.isFinalized(),
       isCleaned: this.isCleaned(),
       finalFilePath: this.getFinalFilePath(),
-      shardFileCount: shardFilesList.length,
       shardFiles: shardFilesList,
     };
   }
diff --git a/packages/utils/src/lib/wal-sharded.unit.test.ts b/packages/utils/src/lib/wal-sharded.unit.test.ts
index 0e3b0992cc..716c989ab7 100644
--- a/packages/utils/src/lib/wal-sharded.unit.test.ts
+++ b/packages/utils/src/lib/wal-sharded.unit.test.ts
@@ -10,7 +10,6 @@ import {
   type WalFormat,
   type WalRecord,
   WriteAheadLogFile,
-  parseWalFormat,
   stringCodec,
 } from './wal.js';

@@ -18,7 +17,7 @@ const read = (p: string) => vol.readFileSync(p, 'utf8') as string;

 const getShardedWal = (overrides?: {
   dir?: string;
-  format?: Partial;
+  format?: Partial>;
   autoCoordinator?: boolean;
   groupId?: string;
 }) => {
@@ -26,10 +25,14 @@ const getShardedWal = (overrides?: {
   return new ShardedWal({
     debug: false,
     dir: '/test/shards',
-    format: parseWalFormat({
+    format: {
       baseName: 'test-wal',
+      walExtension: '.log',
+      finalExtension: '.json',
+      codec: stringCodec(),
+      finalizer: records => `${JSON.stringify(records)}\n`,
       ...format,
-    }),
+    },
     coordinatorIdEnvVar: SHARDED_WAL_COORDINATOR_ID_ENV_VAR,
     ...rest,
   });
diff --git a/packages/utils/src/lib/wal.ts b/packages/utils/src/lib/wal.ts
index 131f2c176f..276d5887c3 100644
--- a/packages/utils/src/lib/wal.ts
+++ b/packages/utils/src/lib/wal.ts
@@ -4,12 +4,17 @@ import path from 'node:path';

 /**
  * Codec for encoding/decoding values to/from strings for WAL storage.
  * Used to serialize/deserialize records written to and read from WAL files.
+ *
+ * @template I - Type accepted by encode (and usually decoded)
+ * @template O - Encoded string type (default string)
+ * @template D - Type returned by decode; defaults to I. Use a supertype of I when the codec
+ *   decodes to a base type (e.g. TraceEvent) and the format is used for subtypes (e.g. OtelTraceEvent).
  */
-export type Codec = {
+export type Codec = {
   /** Encode a value to a string for storage */
   encode: (v: I) => O;
-  /** Decode a string back to the original value type */
-  decode: (data: O) => I;
+  /** Decode a string back to a value (type D; when D extends I, decode returns a supertype of I) */
+  decode: (data: O) => D;
 };

 export type InvalidEntry = { __invalid: true; raw: O };
@@ -64,10 +69,9 @@ export type WalStats = {
   lastRecovery: RecoverResult> | null;
 };

-export const createTolerantCodec = (codec: {
-  encode: (v: I) => O;
-  decode: (d: O) => I;
-}): Codec, O> => {
+export const createTolerantCodec = (
+  codec: Codec,
+): Codec, O, D | InvalidEntry> => {
   const { encode, decode } = codec;

   return {
@@ -155,10 +159,11 @@ export class WriteAheadLogFile
    * Create a new WAL file instance.
    * @param options - Configuration options
    */
-  constructor(options: { file: string; codec: Codec }) {
+  /** Codec may decode to a supertype of T (e.g. TraceEvent when T is a narrower event type). */
+  constructor(options: { file: string; codec: Codec }) {
     this.#file = options.file;
     const c = createTolerantCodec(options.codec);
-    this.#decode = c.decode;
+    this.#decode = c.decode as Codec['decode'];
     this.#encode = c.encode;
   }

@@ -267,26 +272,25 @@ export type WalRecord = object | string;

 /**
  * Format descriptor that binds codec and file extension together.
  * Prevents misconfiguration by keeping related concerns in one object.
+ *
+ * @template T - Record type written to and read from the WAL
+ * @template DecodeOut - Type returned by codec.decode; defaults to T. Use a supertype of T
+ *   when the codec decodes to a base type (e.g. TraceEvent) and the format is used for subtypes.
  */
-export type WalFormat = {
+export type WalFormat = {
   /** Base name for the WAL (e.g., "trace") */
   baseName: string;
   /** Shard file extension (e.g., ".jsonl") */
   walExtension: string;
   /** Final file extension (e.g., ".json", ".trace.json") falls back to walExtension if not provided */
   finalExtension: string;
-  /** Codec for encoding/decoding records */
-  codec: Codec;
+  /** Codec for encoding/decoding records; decode may return DecodeOut (a supertype of T) */
+  codec: Codec;
   /** Finalizer for converting records to a string */
-  finalizer: (
-    records: (T | InvalidEntry)[],
-    opt?: Record,
-  ) => string;
+  finalizer: (records: T[], opt?: Record) => string;
 };

-export const stringCodec = <
-  T extends string | object = string,
->(): Codec => ({
+export const stringCodec = (): Codec => ({
   encode: v => (typeof v === 'string' ? v : JSON.stringify(v)),
   decode: v => {
     try {
@@ -297,52 +301,6 @@ export const stringCodec = (): Codec => ({
   },
 });

-/**
- * Parses a partial WalFormat configuration and returns a complete WalFormat object.
- * All fallback values are targeting string types.
- * - baseName defaults to 'wal'
- * - walExtension defaults to '.log'
- * - finalExtension defaults to '.log'
- * - codec defaults to stringCodec()
- * - finalizer defaults to encoding each record using codec.encode() and joining with newlines.
- *   For object types, this properly JSON-stringifies them (not [object Object]).
- *   InvalidEntry records use their raw string value directly.
- * @param format - Partial WalFormat configuration - * @returns Parsed WalFormat with defaults filled in - */ -export function parseWalFormat<T extends WalRecord = WalRecord>( - format: Partial<WalFormat<T>>, -): WalFormat<T> { - const { - baseName = 'wal', - walExtension = '.log', - finalExtension = walExtension, - codec = stringCodec(), - } = format; - - const finalizer = - format.finalizer ?? - ((records: (T | InvalidEntry)[]) => { - // Encode each record using the codec before joining. - // For object types, codec.encode() will JSON-stringify them properly. - // InvalidEntry records use their raw string value directly. - const encoded = records.map(record => - typeof record === 'object' && record != null && '__invalid' in record - ? (record as InvalidEntry).raw - : codec.encode(record as T), - ); - return `${encoded.join('\n')}\n`; - }); - - return { - baseName, - walExtension, - finalExtension, - codec, - finalizer, - } satisfies WalFormat<T>; -} - /** * NOTE: this helper is only used within the scope of wal and sharded wal logic. The rest of the repo avoids sync methods so it is not reusable. * Ensures a directory exists, creating it recursively if necessary using sync methods.
diff --git a/packages/utils/src/lib/wal.unit.test.ts b/packages/utils/src/lib/wal.unit.test.ts index 36f727b81f..535956fcc6 100644 --- a/packages/utils/src/lib/wal.unit.test.ts +++ b/packages/utils/src/lib/wal.unit.test.ts @@ -2,11 +2,9 @@ import { vol } from 'memfs'; import { MEMFS_VOLUME } from '@code-pushup/test-utils'; import { type Codec, - type InvalidEntry, WriteAheadLogFile, createTolerantCodec, filterValidRecords, - parseWalFormat, recoverFromContent, stringCodec, } from './wal.js'; @@ -496,112 +494,3 @@ describe('stringCodec', () => { expect(codec.decode('42')).toBe(42); }); }); - -describe('parseWalFormat', () => { - it('should apply all defaults when given empty config', () => { - const result = parseWalFormat({}); - - expect(result.baseName).toBe('wal'); - expect(result.walExtension).toBe('.log'); - expect(result.finalExtension).toBe('.log'); - expect(result.codec).toBeDefined(); - expect(typeof result.finalizer).toBe('function'); - }); - - it('should use provided baseName and default others', () => { - const result = parseWalFormat({ baseName: 'test' }); - - expect(result.baseName).toBe('test'); - expect(result.walExtension).toBe('.log'); - expect(result.finalExtension).toBe('.log'); - }); - - it('should use provided walExtension and default finalExtension to match', () => { - const result = parseWalFormat({ walExtension: '.wal' }); - - expect(result.walExtension).toBe('.wal'); - expect(result.finalExtension).toBe('.wal'); - }); - - it('should use provided finalExtension independently', () => { - const result = parseWalFormat({ - walExtension: '.wal', - finalExtension: '.json', - }); - - expect(result.walExtension).toBe('.wal'); - expect(result.finalExtension).toBe('.json'); - }); - - it('should use provided codec', () => { - const customCodec = stringCodec(); - const result = parseWalFormat({ codec: customCodec }); - - expect(result.codec).toBe(customCodec); - }); - - it('should use custom finalizer function', () => { - const customFinalizer = 
(records: any[]) => `custom: ${records.length}`; - const result = parseWalFormat({ finalizer: customFinalizer }); - - expect(result.finalizer(['a', 'b'])).toBe('custom: 2'); - }); - - it('should work with all custom parameters', () => { - const config = { - baseName: 'my-wal', - walExtension: '.wal', - finalExtension: '.json', - codec: stringCodec(), - finalizer: (records: any[]) => JSON.stringify(records), - }; - - const result = parseWalFormat(config); - - expect(result.baseName).toBe('my-wal'); - expect(result.walExtension).toBe('.wal'); - expect(result.finalExtension).toBe('.json'); - expect(result.codec).toBe(config.codec); - expect(result.finalizer(['test'])).toBe('["test"]'); - }); - - it('should use default finalizer when none provided', () => { - const result = parseWalFormat({ baseName: 'test' }); - expect(result.finalizer(['line1', 'line2'])).toBe('line1\nline2\n'); - expect(result.finalizer([])).toBe('\n'); - }); - - it('should encode objects to JSON strings in default finalizer', () => { - const result = parseWalFormat({ baseName: 'test' }); - const records = [ - { id: 1, name: 'test' }, - { id: 2, name: 'test2' }, - ]; - const output = result.finalizer(records); - expect(output).toBe('{"id":1,"name":"test"}\n{"id":2,"name":"test2"}\n'); - }); - - it('should handle InvalidEntry in default finalizer', () => { - const result = parseWalFormat({ baseName: 'test' }); - const records: (string | InvalidEntry)[] = [ - 'valid', - { __invalid: true, raw: 'invalid-raw' }, - 'also-valid', - ]; - const output = result.finalizer(records); - expect(output).toBe('valid\ninvalid-raw\nalso-valid\n'); - }); - - it('should encode objects correctly when using default type parameter', () => { - // Test parseWalFormat({}) with default type parameter (object) - const result = parseWalFormat({}); - const records = [ - { id: 1, name: 'test1' }, - { id: 2, name: 'test2' }, - ]; - const output = result.finalizer(records); - // Should be JSON strings, not [object Object] - 
expect(output).toBe('{"id":1,"name":"test1"}\n{"id":2,"name":"test2"}\n'); - expect(output).not.toContain('[object Object]'); - }); -}); diff --git a/testing/test-setup-config/src/lib/vitest-config-factory.ts b/testing/test-setup-config/src/lib/vitest-config-factory.ts index 0723f72b30..4f00b00315 100644 --- a/testing/test-setup-config/src/lib/vitest-config-factory.ts +++ b/testing/test-setup-config/src/lib/vitest-config-factory.ts @@ -13,11 +13,15 @@ function getIncludePatterns(kind: TestKind): string[] { switch (kind) { case 'unit': return [ + 'mocks/**/*.unit.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}', 'src/**/*.unit.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}', 'src/**/*.type.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}', ]; case 'int': - return ['src/**/*.int.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}']; + return [ + 'mocks/**/*.int.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}', + 'src/**/*.int.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}', + ]; case 'e2e': return ['tests/**/*.e2e.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}']; } diff --git a/testing/test-setup-config/src/lib/vitest-config-factory.unit.test.ts b/testing/test-setup-config/src/lib/vitest-config-factory.unit.test.ts index 41929603b5..a0d1a168b6 100644 --- a/testing/test-setup-config/src/lib/vitest-config-factory.unit.test.ts +++ b/testing/test-setup-config/src/lib/vitest-config-factory.unit.test.ts @@ -26,6 +26,7 @@ describe('createVitestConfig', () => { poolOptions: { threads: { singleThread: true } }, environment: 'node', include: [ + 'mocks/**/*.unit.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}', 'src/**/*.unit.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}', 'src/**/*.type.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}', ], @@ -126,7 +127,10 @@ describe('createVitestConfig', () => { test: expect.objectContaining({ reporters: ['basic'], globals: true, - include: ['src/**/*.int.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}'], + include: [ + 'mocks/**/*.int.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}', + 'src/**/*.int.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}', + ], globalSetup: 
['../../global-setup.ts'], coverage: expect.objectContaining({ reportsDirectory: '../../coverage/test-package/int-tests', @@ -243,10 +247,14 @@ describe('createVitestConfig', () => { const expectedIncludes = { unit: [ + 'mocks/**/*.unit.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}', 'src/**/*.unit.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}', 'src/**/*.type.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}', ], - int: ['src/**/*.int.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}'], + int: [ + 'mocks/**/*.int.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}', + 'src/**/*.int.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}', + ], e2e: ['tests/**/*.e2e.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}'], }; diff --git a/testing/test-utils/src/index.ts b/testing/test-utils/src/index.ts index 29e1d6a630..64ef298a80 100644 --- a/testing/test-utils/src/index.ts +++ b/testing/test-utils/src/index.ts @@ -11,5 +11,4 @@ export * from './lib/utils/create-npm-workshpace.js'; export * from './lib/utils/project-graph.js'; export * from './lib/utils/test-folder-setup.js'; export * from './lib/utils/profiler.mock.js'; -export * from './lib/utils/omit-trace-json.js'; export * from './lib/utils/plugin-setup-tree.mock.js'; diff --git a/testing/test-utils/src/lib/utils/omit-trace-json.ts b/testing/test-utils/src/lib/utils/omit-trace-json.ts deleted file mode 100644 index e45a72a51f..0000000000 --- a/testing/test-utils/src/lib/utils/omit-trace-json.ts +++ /dev/null @@ -1,207 +0,0 @@ -/** - * Normalizes trace JSONL files for deterministic snapshot testing. - * - * Replaces variable values (pid, tid, ts) with deterministic incremental values - * while preserving the original order of events. - * - * - Assigns incremental IDs to pid fields starting from 10001, 10002, etc. - * - Assigns incremental IDs to tid fields starting from 1, 2, etc. - * - Normalizes timestamps by sorting them first to determine incremental order, - * then mapping to incremental values starting from mocked epoch clock base, - * while preserving the original order of events in the output. 
- * - * @param jsonlContent - JSONL string content (one JSON object per line) or parsed JSON object/array - * @param baseTimestampUs - Base timestamp in microseconds to start incrementing from (default: 1_700_000_005_000_000) - * @returns Normalized JSONL string with deterministic pid, tid, and ts values - */ -export function omitTraceJson( - jsonlContent: string | object, - baseTimestampUs = 1_700_000_005_000_000, -): string { - if (typeof jsonlContent !== 'string') { - const eventsArray = Array.isArray(jsonlContent) - ? jsonlContent - : [jsonlContent]; - if (eventsArray.length === 0) { - return ''; - } - const events = eventsArray as TraceEvent[]; - return normalizeAndFormatEvents(events, baseTimestampUs); - } - - // Handle string input (JSONL format) - const trimmedContent = jsonlContent.trim(); - if (!trimmedContent) { - return jsonlContent; - } - - // Parse all events from JSONL - const events = trimmedContent - .split('\n') - .filter(Boolean) - .map(line => JSON.parse(line) as TraceEvent); - - if (events.length === 0) { - return jsonlContent; - } - - return normalizeAndFormatEvents(events, baseTimestampUs); -} - -/** - * Normalizes trace events and formats them as JSONL. - */ -function normalizeAndFormatEvents( - events: TraceEvent[], - baseTimestampUs: number, -): string { - if (events.length === 0) { - return ''; - } - - // Collect unique pid and tid values - type Accumulator = { - uniquePids: Set<number>; - uniqueTids: Set<number>; - timestamps: number[]; - uniqueLocalIds: Set<string>; - }; - - const { uniquePids, uniqueTids, timestamps, uniqueLocalIds } = - events.reduce<Accumulator>( - (acc, event) => { - const newUniquePids = new Set(acc.uniquePids); - const newUniqueTids = new Set(acc.uniqueTids); - const newUniqueLocalIds = new Set(acc.uniqueLocalIds); - - if (typeof event.pid === 'number') { - newUniquePids.add(event.pid); - } - if (typeof event.tid === 'number') { - newUniqueTids.add(event.tid); - } - - const newTimestamps = - typeof event.ts === 'number' - ?
 [...acc.timestamps, event.ts] - : acc.timestamps; - - // Collect id2.local values - if ( - event.id2 && - typeof event.id2 === 'object' && - 'local' in event.id2 && - typeof event.id2.local === 'string' - ) { - newUniqueLocalIds.add(event.id2.local); - } - - return { - uniquePids: newUniquePids, - uniqueTids: newUniqueTids, - timestamps: newTimestamps, - uniqueLocalIds: newUniqueLocalIds, - }; - }, - { - uniquePids: new Set<number>(), - uniqueTids: new Set<number>(), - timestamps: [] as number[], - uniqueLocalIds: new Set<string>(), - }, - ); - - // Create mappings: original value -> normalized incremental value - const pidMap = new Map<number, number>(); - const tidMap = new Map<number, number>(); - const localIdMap = new Map<string, string>(); - - // Sort unique values to ensure consistent mapping order - const sortedPids = [...uniquePids].sort((a, b) => a - b); - const sortedTids = [...uniqueTids].sort((a, b) => a - b); - const sortedLocalIds = [...uniqueLocalIds].sort(); - - // Map pids starting from 10001 - sortedPids.forEach((pid, index) => { - pidMap.set(pid, 10_001 + index); - }); - - // Map tids starting from 1 - sortedTids.forEach((tid, index) => { - tidMap.set(tid, 1 + index); - }); - - // Map local IDs starting from "0x1" - sortedLocalIds.forEach((localId, index) => { - localIdMap.set(localId, `0x${(index + 1).toString(16)}`); - }); - - // Sort timestamps to determine incremental order - const sortedTimestamps = [...timestamps].sort((a, b) => a - b); - - // Map timestamps incrementally starting from baseTimestampUs - const tsMap = sortedTimestamps.reduce((map, ts, index) => { - if (!map.has(ts)) { - return new Map(map).set(ts, baseTimestampUs + index); - } - return map; - }, new Map<number, number>()); - - // Normalize events while preserving original order - const normalizedEvents = events.map(event => { - const pidUpdate = - typeof event.pid === 'number' && pidMap.has(event.pid) - ? { pid: pidMap.get(event.pid)! } - : {}; - - const tidUpdate = - typeof event.tid === 'number' && tidMap.has(event.tid) - ? { tid: tidMap.get(event.tid)!
} - : {}; - - const tsUpdate = - typeof event.ts === 'number' && tsMap.has(event.ts) - ? { ts: tsMap.get(event.ts)! } - : {}; - - // Normalize id2.local if present - const id2Update = - event.id2 && - typeof event.id2 === 'object' && - 'local' in event.id2 && - typeof event.id2.local === 'string' && - localIdMap.has(event.id2.local) - ? { - id2: { - ...event.id2, - local: localIdMap.get(event.id2.local)!, - }, - } - : {}; - - return { - ...event, - ...pidUpdate, - ...tidUpdate, - ...tsUpdate, - ...id2Update, - }; - }); - - // Convert back to JSONL format - return `${normalizedEvents.map(event => JSON.stringify(event)).join('\n')}\n`; -} - -/** - * Trace event structure with pid, tid, ts, and id2.local fields. - */ -type TraceEvent = { - pid?: number; - tid?: number; - ts?: number; - id2?: { - local?: string; - [key: string]: unknown; - }; - [key: string]: unknown; -}; diff --git a/testing/test-utils/src/lib/utils/omit-trace-json.unit.test.ts b/testing/test-utils/src/lib/utils/omit-trace-json.unit.test.ts deleted file mode 100644 index dbf5a079ad..0000000000 --- a/testing/test-utils/src/lib/utils/omit-trace-json.unit.test.ts +++ /dev/null @@ -1,235 +0,0 @@ -import { omitTraceJson } from './omit-trace-json.js'; - -describe('omitTraceJson', () => { - it('should return empty string unchanged', () => { - expect(omitTraceJson('')).toBe(''); - }); - - it('should return whitespace-only string unchanged', () => { - expect(omitTraceJson(' \n\t ')).toBe(' \n\t '); - }); - - it('should return empty JSONL unchanged', () => { - expect(omitTraceJson('\n\n')).toBe('\n\n'); - }); - - it('should return minimal event unchanged', () => { - const input = '{"name":"test"}\n'; - expect(omitTraceJson(input)).toBe(input); - }); - - it('should normalize pid field starting from 10001', () => { - const result = omitTraceJson('{"pid":12345}\n'); - const parsed = JSON.parse(result.trim()); - expect(parsed.pid).toBe(10_001); - }); - - it('should normalize tid field starting from 1', () => { - 
const result = omitTraceJson('{"tid":999}\n'); - const parsed = JSON.parse(result.trim()); - expect(parsed.tid).toBe(1); - }); - - it('should normalize ts field with default baseTimestampUs', () => { - const result = omitTraceJson('{"ts":1234567890}\n'); - const parsed = JSON.parse(result.trim()); - expect(parsed.ts).toBe(1_700_000_005_000_000); - }); - - it('should normalize ts field with custom baseTimestampUs', () => { - const customBase = 2_000_000_000_000_000; - const result = omitTraceJson('{"ts":1234567890}\n', customBase); - const parsed = JSON.parse(result.trim()); - expect(parsed.ts).toBe(customBase); - }); - - it('should normalize id2.local field starting from 0x1', () => { - const result = omitTraceJson('{"id2":{"local":"0xabc123"}}\n'); - const parsed = JSON.parse(result.trim()); - expect(parsed.id2.local).toBe('0x1'); - }); - - it('should preserve event order when timestamps are out of order', () => { - const input = - '{"ts":300,"name":"third"}\n{"ts":100,"name":"first"}\n{"ts":200,"name":"second"}\n'; - const result = omitTraceJson(input); - const events = result - .trim() - .split('\n') - .map(line => JSON.parse(line)); - expect(events[0].name).toBe('third'); - expect(events[1].name).toBe('first'); - expect(events[2].name).toBe('second'); - expect(events[0].ts).toBe(1_700_000_005_000_002); - expect(events[1].ts).toBe(1_700_000_005_000_000); - expect(events[2].ts).toBe(1_700_000_005_000_001); - }); - - it('should preserve event order when PIDs are out of order', () => { - const input = - '{"pid":300,"name":"third"}\n{"pid":100,"name":"first"}\n{"pid":200,"name":"second"}\n'; - const result = omitTraceJson(input); - const events = result - .trim() - .split('\n') - .map(line => JSON.parse(line)); - expect(events[0].name).toBe('third'); - expect(events[1].name).toBe('first'); - expect(events[2].name).toBe('second'); - expect(events[0].pid).toBe(10_003); - expect(events[1].pid).toBe(10_001); - expect(events[2].pid).toBe(10_002); - }); - - it('should 
preserve event order when TIDs are out of order', () => { - const input = - '{"tid":30,"name":"third"}\n{"tid":10,"name":"first"}\n{"tid":20,"name":"second"}\n'; - const result = omitTraceJson(input); - const events = result - .trim() - .split('\n') - .map(line => JSON.parse(line)); - expect(events[0].name).toBe('third'); - expect(events[1].name).toBe('first'); - expect(events[2].name).toBe('second'); - expect(events[0].tid).toBe(3); - expect(events[1].tid).toBe(1); - expect(events[2].tid).toBe(2); - }); - - it('should preserve event order with mixed out-of-order fields', () => { - const input = - '{"pid":500,"tid":5,"ts":5000,"name":"e"}\n{"pid":100,"tid":1,"ts":1000,"name":"a"}\n{"pid":300,"tid":3,"ts":3000,"name":"c"}\n'; - const result = omitTraceJson(input); - const events = result - .trim() - .split('\n') - .map(line => JSON.parse(line)); - expect(events.map(e => e.name)).toEqual(['e', 'a', 'c']); - expect(events[0].pid).toBe(10_003); - expect(events[1].pid).toBe(10_001); - expect(events[2].pid).toBe(10_002); - }); - - it('should not normalize non-number pid values', () => { - const input = '{"pid":"string"}\n{"pid":null}\n'; - const result = omitTraceJson(input); - const events = result - .trim() - .split('\n') - .map(line => JSON.parse(line)); - expect(events[0].pid).toBe('string'); - expect(events[1].pid).toBeNull(); - }); - - it('should not normalize non-number tid values', () => { - const input = '{"tid":"string"}\n{"tid":null}\n'; - const result = omitTraceJson(input); - const events = result - .trim() - .split('\n') - .map(line => JSON.parse(line)); - expect(events[0].tid).toBe('string'); - expect(events[1].tid).toBeNull(); - }); - - it('should not normalize non-number ts values', () => { - const input = '{"ts":"string"}\n{"ts":null}\n'; - const result = omitTraceJson(input); - const events = result - .trim() - .split('\n') - .map(line => JSON.parse(line)); - expect(events[0].ts).toBe('string'); - expect(events[1].ts).toBeNull(); - }); - - it('should 
not normalize id2.local when id2 is missing', () => { - const input = '{"name":"test"}\n'; - const result = omitTraceJson(input); - const parsed = JSON.parse(result.trim()); - expect(parsed.id2).toBeUndefined(); - }); - - it('should not normalize id2.local when id2 is not an object', () => { - const input = '{"id2":"string"}\n{"id2":null}\n'; - const result = omitTraceJson(input); - const events = result - .trim() - .split('\n') - .map(line => JSON.parse(line)); - expect(events[0].id2).toBe('string'); - expect(events[1].id2).toBeNull(); - }); - - it('should not normalize id2.local when local is missing', () => { - const input = '{"id2":{"other":"value"}}\n'; - const result = omitTraceJson(input); - const parsed = JSON.parse(result.trim()); - expect(parsed.id2.local).toBeUndefined(); - expect(parsed.id2.other).toBe('value'); - }); - - it('should not normalize id2.local when local is not a string', () => { - const input = '{"id2":{"local":123}}\n{"id2":{"local":null}}\n'; - const result = omitTraceJson(input); - const events = result - .trim() - .split('\n') - .map(line => JSON.parse(line)); - expect(events[0].id2.local).toBe(123); - expect(events[1].id2.local).toBeNull(); - }); - - it('should map duplicate values to same normalized value', () => { - const input = '{"pid":100}\n{"pid":200}\n{"pid":100}\n'; - const result = omitTraceJson(input); - const events = result - .trim() - .split('\n') - .map(line => JSON.parse(line)); - expect(events[0].pid).toBe(10_001); - expect(events[1].pid).toBe(10_002); - expect(events[2].pid).toBe(10_001); - }); - - it('should handle duplicate timestamps correctly', () => { - const input = '{"ts":1000}\n{"ts":2000}\n{"ts":1000}\n'; - const result = omitTraceJson(input); - const events = result - .trim() - .split('\n') - .map(line => JSON.parse(line)); - expect(events[0].ts).toBe(1_700_000_005_000_000); - expect(events[1].ts).toBe(1_700_000_005_000_002); - expect(events[2].ts).toBe(1_700_000_005_000_000); - }); - - it('should preserve 
other id2 properties when normalizing local', () => { - const input = - '{"id2":{"local":"0xabc","other":"value","nested":{"key":123}}}\n'; - const result = omitTraceJson(input); - const parsed = JSON.parse(result.trim()); - expect(parsed.id2.local).toBe('0x1'); - expect(parsed.id2.other).toBe('value'); - expect(parsed.id2.nested).toEqual({ key: 123 }); - }); - - it('should map multiple id2.local values to incremental hex', () => { - const input = - '{"id2":{"local":"0xabc"}}\n{"id2":{"local":"0xdef"}}\n{"id2":{"local":"0x123"}}\n'; - const result = omitTraceJson(input); - const events = result - .trim() - .split('\n') - .map(line => JSON.parse(line)); - const locals = events.map(e => e.id2.local).sort(); - expect(locals).toEqual(['0x1', '0x2', '0x3']); - }); - - it('should output valid JSONL with trailing newline', () => { - const result = omitTraceJson('{"pid":123}\n'); - expect(result).toMatch(/\n$/); - expect(() => JSON.parse(result.trim())).not.toThrow(); - }); -});