Open Source · MPL-2.0

Healthcare Pipelines
Defined as Code.

Stop fighting legacy GUI-based engines. Build, version, and deploy healthcare integration pipelines using YAML and TypeScript.
AI-friendly by design.

$ npm install -g intu-dev

What is Integration as Code?

Every other part of your stack is defined as code—infrastructure, CI/CD, containers. Why are healthcare pipelines still configured in GUIs and stored in databases?

Legacy Approach

  • Pipelines configured in a GUI, stored in a database
  • No version control, no diffs, no code review
  • Legacy JavaScript (ES5 / Rhino) with no type safety
  • Heavy JVM runtime, manual heap tuning
  • AI cannot generate or modify pipeline definitions

Integration as Code

  • YAML config + TypeScript logic, in a Git repo
  • Branch, PR, diff, review, merge—standard dev workflow
  • Full TypeScript with npm, type safety, IDE autocomplete
  • Lightweight Go binary + Node.js. No JVM.
  • AI-friendly: structured YAML + typed TS are ideal for LLM generation

Three files. That's a pipeline.

A channel is a folder with a YAML config, a TypeScript transformer, and an optional validator. Everything lives in your Git repo.

src/channels/fhir-to-adt/channel.yaml (config)
id: fhir-to-adt
enabled: true

listener:
  type: fhir
  fhir:
    port: 8082
    base_path: /fhir/r4
    version: R4
    resources: [Patient]

transformer:
  runtime: node
  entrypoint: transformer.ts

destinations:
  - hl7-file-output

Declarative YAML: what connects to what, which protocol, which port.

src/channels/fhir-to-adt/transformer.ts (logic)
import type { Patient } from "fhir/r4";
import { Message } from "node-hl7-client";

// IntuMessage and IntuContext are ambient types provided by the intu runtime.

export function transform(
  msg: IntuMessage,
  ctx: IntuContext
): IntuMessage {
  const p = msg.body as Patient;
  const hl7 = new Message({
    messageHeader: {
      msh_9_1: "ADT",
      msh_9_2: "A08",
      msh_11_1: "P",
    },
  });

  hl7.addSegment("PID");
  hl7.set("PID.5.1", p.name?.[0]?.family);
  hl7.set("PID.5.2", p.name?.[0]?.given?.join(" "));

  return { body: hl7.toString() };
}

Pure TypeScript: full npm ecosystem, type safety, IDE support.
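The optional validator follows the same shape. A minimal sketch, assuming a validator exports a predicate that accepts or rejects a message before the transform stage (the function name, signature, and inline `PatientLike` type here are illustrative assumptions, not the documented API):

```typescript
// validator.ts — hypothetical sketch; the real intu validator contract may differ.
// Assumption: returning true accepts the message, false rejects it before transform.

// Minimal inline shape so the sketch is self-contained (stand-in for fhir/r4 Patient).
interface PatientLike {
  resourceType?: string;
  name?: { family?: string; given?: string[] }[];
}

export function validate(body: unknown): boolean {
  const p = body as PatientLike;
  // Reject anything that is not a Patient with at least one name entry.
  return p?.resourceType === "Patient" && (p.name?.length ?? 0) > 0;
}
```

Because validators are plain functions, they can be unit-tested with your usual TypeScript tooling, no running engine required.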

1. intu init my-project
   Scaffold a project with sample channels, npm deps, and Docker config.

2. intu c my-channel
   Add channels. Each one is a folder with YAML + TypeScript.

3. npm run dev
   Run the engine. Auto-compiles TypeScript. Hot-reloads on changes.

Built for production healthcare.

A production-grade engine with enterprise features—all open source.

Git-Native Workflow

Every pipeline is plain text files. Branch, diff, review, and merge your integration logic the same way you ship application code.

AI-Friendly by Design

Structured YAML + typed TypeScript are the formats LLMs generate best. Use AI assistants to create, modify, and debug your pipelines.

Sub-Millisecond Transforms

Go binary runtime with a pre-loaded Node.js worker pool. Modules are cached in V8 at startup. Transform execution is measured in microseconds.

Persistent Storage

Memory, Postgres, or S3 backends. Store messages at each pipeline stage for audit and replay. Configurable per-channel.

Retry & Dead-Letter Queue

Configurable retry with fixed, linear, or exponential backoff per destination. Failed messages route to a DLQ for inspection.
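A retry policy like this might be expressed per destination in the channel YAML. The key names below are illustrative assumptions, not the documented schema — only the backoff strategies and the DLQ behavior come from the text above:

```yaml
# Hypothetical fragment — key names are assumptions, not the documented schema.
destinations:
  - id: hl7-file-output
    retry:
      strategy: exponential   # fixed | linear | exponential
      max_attempts: 5
      initial_delay: 500ms
    dead_letter:
      enabled: true           # failed messages route to the DLQ for inspection
```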

Full Pipeline Stages

Preprocessor, Validator, Source Filter, Transformer, per-destination Filter & Transformer, Response Transformer, Postprocessor.

Observability

OpenTelemetry traces and metrics, Prometheus endpoint, and log transports for CloudWatch, Datadog, Sumo Logic, Elasticsearch.

Clustering & HA

Redis-based coordination with channel partitioning, message deduplication, and persistent destination queues.

Secrets & Auth

HashiCorp Vault, AWS Secrets Manager, GCP Secret Manager. LDAP and OIDC for dashboard authentication. RBAC and audit logging.

25 connectors. 8 data types.

Sources define where messages come from. Destinations define where they go. Mix and match across protocols.

Healthcare Data Types

HL7v2 · FHIR R4 · X12 EDI · CDA / CCDA · DICOM · JSON · XML · CSV

Full processing pipeline.

Every stage is optional except the transformer. Per-destination filters and transformers let you customize payloads for each output independently.

Source → Pre → Validate → Filter → Transform → D.Filter → D.Xform → Send → Response → Post
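A filter stage is just a predicate over the message. A hypothetical sketch of a per-destination filter (the exported function name, signature, and message shape are assumptions, not the documented API):

```typescript
// filter.ts — hypothetical sketch; the real intu filter contract may differ.
// Assumption: returning true forwards the message to this destination,
// false drops it for this destination only.

interface Hl7ishMessage {
  body: string;
}

export function filter(msg: Hl7ishMessage): boolean {
  // Only forward ADT^A08 events to this destination.
  return msg.body.includes("ADT^A08");
}
```

The same predicate shape would work for the source-level filter stage; only where it runs in the pipeline differs.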

Profile layering for every environment.

Base config in intu.yaml. Override per environment with intu.dev.yaml or intu.prod.yaml. Environment variables expand with ${VAR} syntax.

Storage mode, log transports, secrets provider, cluster config, and connector settings all vary by profile. One codebase, every environment.

# intu.prod.yaml
runtime:
  mode: cluster

message_storage:
  driver: postgres
  mode: full

logging:
  transports:
    - type: datadog
      datadog:
        api_key: ${DD_API_KEY}
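For contrast, a dev profile can override the same keys toward local defaults. This fragment is a hedged sketch: `driver: memory` comes from the storage backends listed above, but the `mode: standalone` value is an assumption about how a non-clustered runtime is named:

```yaml
# intu.dev.yaml — hypothetical dev override (key values are assumptions)
runtime:
  mode: standalone        # assumed single-process counterpart to "cluster"

message_storage:
  driver: memory          # in-memory backend for local development
  mode: full
```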

Honest comparisons.

We believe transparency builds trust. Read how intu stacks up—including where others still win.

Coming from Mirth Connect?

Move from legacy GUI-based engines to a modern, Git-native workflow. Go binary + Node.js worker pool, sub-millisecond transforms, and hot-reload out of the box.

$ intu import mirth channel.xml

Import your existing Mirth channel XML exports directly. Generates channel.yaml + transformer.ts automatically.

Feature parity where it matters

Same pipeline stages, same connector types, same healthcare protocols. Modern TypeScript instead of Rhino JS. Git instead of a database.

25 connectors · 8 data types · 0 license fees

Why we built it this way.

Architecture Decision Records explain the reasoning behind every major technical choice.