ETL flow intelligence platform

You have hundreds of flows.
Read them as code.

Convert visual ETL flows to clean SQL, AWS Glue jobs, PySpark, or dbt models. Drop your flow file and get production-ready code.

FLOW MAP (demo) · quarterly_performance_demo.tflx
2 sources · 4 transforms · 1 output
input
Orders_2017
Sales Workbook · 7 fields
input
Returns_2017
Sales Workbook · 2 fields
left join
Join returns
1 key matched
calculate
Calculate ship speed
Create ship_days
filter
Keep fast shipments
Keep rows by ship_days
aggregate
Summarize by region
2 groups · 2 metrics
output
Quarterly performance
Hyper extract
OUTPUT (demo) · quarterly_performance_demo.tflx
PostgreSQL
File parsed in browser · Only metadata sent, no raw data · Tableau Prep → PostgreSQL
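For a flow shaped like the demo above (left join → calculated field → filter → aggregate), the generated SQL might look roughly like the sketch below. Table names, column names, and the filter threshold are assumptions read off the flow map, not actual Deflows output, and SQLite stands in for PostgreSQL so the example runs anywhere.

```python
import sqlite3

# Tiny in-memory stand-in for the two demo sources. Schemas and values
# are invented for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders_2017 (order_id TEXT, region TEXT, order_date TEXT,
                          ship_date TEXT, sales REAL);
CREATE TABLE returns_2017 (order_id TEXT, returned TEXT);
INSERT INTO orders_2017 VALUES
  ('o1', 'East', '2017-01-02', '2017-01-04', 100.0),
  ('o2', 'East', '2017-01-03', '2017-01-12', 250.0),
  ('o3', 'West', '2017-02-01', '2017-02-03', 80.0);
INSERT INTO returns_2017 VALUES ('o2', 'Yes');
""")

# One CTE per flow step keeps the generated SQL readable and maps
# each clause back to a node in the flow map.
rows = conn.execute("""
WITH joined AS (          -- step: left join on order_id
  SELECT o.*, r.returned
  FROM orders_2017 o
  LEFT JOIN returns_2017 r ON o.order_id = r.order_id
),
calculated AS (           -- step: create ship_days
  SELECT *, CAST(julianday(ship_date) - julianday(order_date) AS INTEGER)
            AS ship_days
  FROM joined
),
filtered AS (             -- step: keep fast shipments (threshold assumed)
  SELECT * FROM calculated WHERE ship_days <= 5
)
SELECT region,            -- step: summarize by region
       COUNT(*)  AS n_orders,
       SUM(sales) AS total_sales
FROM filtered
GROUP BY region
ORDER BY region
""").fetchall()
print(rows)
```

The CTE-per-step layout is one reasonable way a converter could preserve the flow's structure in flat SQL; each intermediate stays nameable and testable on its own.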
The problem

Your flow library
is a black box

Every company accumulates ETL flows over the years. People build them, people leave, and the flows stay — but the knowledge doesn't.

When a new request comes in, nobody digs through 50 old flows to check if it's already been done. They build it again. Five versions of the same join, redundant maintenance, a flow library nobody trusts.

Deflows makes every flow readable, searchable, and optimizable.

Use cases
Before you build

Check if it already exists

New request for a regional report? Search your flow library first. Deflows finds the 3 flows that already do something similar.

New team member

Onboard in hours

New hire inherits 80 flows. Upload the batch and read a plain-language summary of the entire pipeline in 20 minutes.

Platform team

Audit & eliminate redundancy

Find the 12 flows doing the same join. Consolidate to 2 parameterized queries. Cut maintenance by 60%.

Migration

Unblock cloud migration

150 Prep flows standing between you and AWS? Convert them in an afternoon, then export dbt models plus schema.yml and sources.yml as your starting scaffold.

Privacy
File stays in your browser

Your .tfl or .tflx file is parsed entirely in the browser. Only structural metadata (table names, joins, formulas) is sent for code generation, never the original file.
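As an illustration, the structural payload described above might look like the sketch below. The field names and schema are assumptions for illustration, not Deflows' actual API; the point is that only names, joins, and formulas appear, never rows.

```python
import json

# Hypothetical example of the metadata a browser-side parser could send
# for code generation. The shape is invented; note there is no row data
# anywhere in the payload.
payload = {
    "flow": "quarterly_performance_demo.tflx",
    "sources": [
        {"name": "Orders_2017", "fields": 7},
        {"name": "Returns_2017", "fields": 2},
    ],
    "steps": [
        {"type": "left_join", "keys": ["order_id"]},
        {"type": "calculate", "field": "ship_days",
         "formula": "DATEDIFF('day', [Order Date], [Ship Date])"},
        {"type": "filter", "on": "ship_days"},
        {"type": "aggregate", "groups": 2, "metrics": 2},
    ],
    "target": "postgresql",
}
print(json.dumps(payload, indent=2))
```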

Connection hashing

Database addresses are hashed for grouping. We never see or store the originals.
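A minimal sketch of what that grouping could look like: a one-way hash over a normalized connection address, so identical connections group together server-side while the original host name never leaves the browser. The function name and normalization scheme are assumptions, not Deflows' actual implementation.

```python
import hashlib

def connection_fingerprint(host: str, port: int, database: str) -> str:
    # Normalize so trivially different spellings of the same connection
    # (case, stray whitespace) hash to the same fingerprint.
    normalized = f"{host.strip().lower()}:{port}/{database.strip().lower()}"
    # SHA-256 is one-way: the fingerprint groups flows by connection
    # without the original address being recoverable from it.
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()[:16]

a = connection_fingerprint("DB.internal.corp", 5432, "sales")
b = connection_fingerprint("db.internal.corp", 5432, "sales")
print(a == b)  # same connection, same fingerprint
```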

No row data in .tfl

Prep files contain metadata only — table names, formulas, joins. Not your business data.

Pricing

Start free, move to self-serve when you need more volume, and bring us in when the migration needs dedicated support.

Free
$0
Try before you commit.
1 flow without signup
3 flows/month with a free account
Convert + Document
PostgreSQL output only
Single-flow upload
Try now →
Pro
$39/mo
For individual data engineers and consultants.
50 flows / month
Convert + Document
Current targets: PostgreSQL, MySQL, BigQuery, PySpark, dbt, AWS Glue
Batch upload up to 20 flows
Export generated code or dbt scaffold
Email support
Sign in to upgrade →
Enterprise
Custom
For security, scale, and procurement.
Higher volume or unlimited usage
SSO / SAML
Security review
Private deployment / VPC discussion
Custom connectors and targets
Dedicated support
Contact us →
Migration Services
Custom

Need full delivery? We handle larger migration programs end-to-end. Roadmap targets include Snowflake, Databricks, and Apache Airflow.

Contact us →