📊 Dashboard

Loading
MCP Tools
--
Projects
--
Workflow Steps
--
Version

Recent Activity

Timestamp | Tool | Status
Loading...

Sample Data Files

Available in data/ — click a file to load it in the Tool Workbench.

Loading...

Quick Start

Get started with DataBridge AI:

Cost / Credit Tracker

Track LLM token usage and Snowflake credit consumption per workflow run.

--
LLM Calls
--
Total Tokens
--
LLM Cost (USD)
--
SF Credits
Run ID | LLM Calls | Tokens (in/out) | LLM $ | SF Credits | SF $ | Total $
No cost data yet — run a workflow with CostTracker enabled.
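The tracker's per-run rollup can be sketched as below. This is an illustrative stand-in, not the CostTracker implementation; the rates, field names, and the `run_cost` helper are all hypothetical.

```python
# Hypothetical sketch of per-run cost aggregation. Token rates and the
# Snowflake credit price below are invented for illustration only.

def run_cost(llm_calls, credit_price_usd=3.0,
             input_rate_per_1k=0.003, output_rate_per_1k=0.015):
    """Aggregate LLM token cost and Snowflake credit cost for one workflow run."""
    tokens_in = sum(c["tokens_in"] for c in llm_calls)
    tokens_out = sum(c["tokens_out"] for c in llm_calls)
    sf_credits = sum(c.get("sf_credits", 0.0) for c in llm_calls)
    llm_usd = tokens_in / 1000 * input_rate_per_1k + tokens_out / 1000 * output_rate_per_1k
    sf_usd = sf_credits * credit_price_usd
    return {
        "llm_calls": len(llm_calls),
        "total_tokens": tokens_in + tokens_out,
        "llm_usd": llm_usd,
        "sf_credits": sf_credits,
        "sf_usd": sf_usd,
        "total_usd": llm_usd + sf_usd,
    }

calls = [
    {"tokens_in": 1200, "tokens_out": 400, "sf_credits": 0.05},
    {"tokens_in": 800, "tokens_out": 300, "sf_credits": 0.0},
]
summary = run_cost(calls)
print(f"LLM ${summary['llm_usd']:.4f}  SF ${summary['sf_usd']:.2f}  total ${summary['total_usd']:.2f}")
```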

📡 Agent Communication Console

Communication Stream
[--:--:--.---] SYSTEM
Agent Communication Console initialized. Ready to process requests. Type a query below or launch an Autonomous Demo.

Session Stats

0
Messages
0
Agents

Active Agents

🎯 Orchestrator
📊 Data Agent
🔍 Cortex Analyst
🏗️ Hierarchy Builder
Quality Agent
dbt Agent
📚 Catalog Agent
✈️ Wright Agent

Autonomous Demos

Select a demo to see a hands-off walkthrough of DataBridge AI capabilities.
Step 0/0

🔧 Tool Workbench

Available Tools

  • Loading tools...

Select a Tool

Choose a tool from the list to configure and run it.

Output

// Tool output will appear here

📁 Hierarchy Viewer

Sample Hierarchy: Investment Property Financial Analysis DEMO

Commercial real estate investment property model with income statement, balance sheet, and financial analysis hierarchies.

Click a node in the hierarchy tree to view details.

graph TD
  ROOT[Investment Property Financial Analysis] --> IS[Income Statement]
  ROOT --> BS[Balance Sheet]
  ROOT --> FA[Financial Analysis Report]
  IS --> REV[Revenue]
  IS --> OPEX[Operating Expenses]
  IS --> NOI_C[Net Operating Income]
  REV --> RENT[Rental Income]
  REV --> CAM[CAM Reimbursements]
  REV --> OTH_R[Other Income]
  RENT --> BASE[Base Rent]
  RENT --> PCT[Percentage Rent]
  RENT --> PARK[Parking Revenue]
  BS --> ASSETS[Assets]
  BS --> LIAB[Liabilities]
  BS --> EQ[Owner Equity]
  FA --> NOI[NOI Analysis]
  FA --> CAP[Cap Rate Analysis]
  FA --> DCF[DCF Valuation]
  FA --> DSCR_N[Debt Service Coverage]

Saved Projects

  • Loading projects...

Select a Project

Choose a project from the list to view its details.

⚡ Workflow Editor

Tool Palette

    Workflow Steps

    Click tools to add steps to your workflow.

    ✈️ Wright Pipeline Builder

    Build hierarchy-driven data marts with the 4-object pipeline pattern. Configure each step and preview generated SQL.

    Pipeline Configuration

    VW_1: Translation View

    Translates ID_SOURCE column values to physical database columns using CASE statements.

    -- Click "Generate" to create VW_1 Translation View SQL

    DT_2: Granularity Table

    UNPIVOT operation to normalize data and apply exclusion filters.

    -- Click "Generate" to create DT_2 Granularity Table SQL

    DT_3A: Pre-Aggregation Fact

    UNION ALL branches for different join patterns. Each branch handles different dimension combinations.

    -- Click "Generate" to create DT_3A Pre-Aggregation SQL

    DT_3: Final Data Mart

    Final data mart with formula precedence cascade and surrogate key generation.

    -- Click "Generate" to create DT_3 Data Mart SQL
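The VW_1 translation idea above (mapping ID_SOURCE values to physical columns via CASE statements) can be sketched as a small SQL-string builder. This is a hypothetical illustration; the column names and the `build_translation_case` helper are invented and do not reflect Wright's actual generator.

```python
# Hypothetical sketch of a VW_1-style translation: build a CASE expression
# that maps ID_SOURCE values to physical columns. Names are invented.

def build_translation_case(mapping, id_col="ID_SOURCE"):
    """Return a SQL CASE expression translating id values to physical columns."""
    branches = "\n".join(
        f"    WHEN '{src}' THEN {col}" for src, col in mapping.items()
    )
    return f"CASE {id_col}\n{branches}\n    ELSE NULL\nEND"

sql = build_translation_case({"REV": "REVENUE_AMT", "OPEX": "OPEX_AMT"})
print(sql)
```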

    🔬 Researcher

    Run live demos against sample data using real MCP tools. Explore data quality, reconciliation, and schema analysis.

    How the Researcher Works

    The Researcher validates source data, compares datasets, and profiles quality — all from sample CSV files included with DataBridge AI.

    CE Tools (Free):
    load_csv, profile_data, compare_hashes, fuzzy_match_columns, detect_schema_drift
    Pro Tools (Licensed):
    analyze_book_with_researcher, compare_book_to_database, profile_book_sources
    Data Flow:
    CSV Files --> load_csv
    load_csv --> profile_data (stats)
    load_csv --> compare_hashes (diffs)
    load_csv --> fuzzy_match (matches)
    Two CSVs --> detect_schema_drift (changes)
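The last step in the flow above can be illustrated with a stdlib-only stand-in: compare two CSV headers and report added and removed columns. The real `detect_schema_drift` MCP tool's interface and output format are assumed to differ.

```python
# Illustrative stand-in for the detect_schema_drift step, standard library only.
import csv
import io

def detect_schema_drift(csv_a, csv_b):
    """Compare the header rows of two CSVs and report column changes."""
    cols_a = next(csv.reader(io.StringIO(csv_a)))
    cols_b = next(csv.reader(io.StringIO(csv_b)))
    return {
        "added": [c for c in cols_b if c not in cols_a],
        "removed": [c for c in cols_a if c not in cols_b],
        "unchanged": [c for c in cols_a if c in cols_b],
    }

old = "id,name,amount\n1,Acme,100\n"
new = "id,name,amount_usd,region\n1,Acme,100,US\n"
print(detect_schema_drift(old, new))
```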

    Live Demos

    Loading demos...

    Pro Researcher Tools

    Requires Pro License

    analyze_book_with_researcher

    Analyze a Book's data sources against a database connection

    compare_book_to_database

    Compare Book hierarchy against live database schema

    profile_book_sources

    Profile all data sources referenced by a Book

    ⚙️ Administration

    Configuration

    📚 Documentation & Help

    DataBridge AI v0.41.1

    A headless, MCP-native data and implementation engine with 433 tools across 28 modules. Tool availability is license-dependent (Community/Pro/Enterprise).

    Core Capabilities

    🔄 Data Reconciliation: Compare and validate data from CSV, SQL, PDF, JSON sources (38 tools)
    🏗️ Hierarchy Builder: Create and manage multi-level hierarchy projects with formulas (49 tools)
    🧬 BLCE Engine: Business logic extraction, Kimball modeling, DDL generation, deployment (72 tools, 21 phases)
    🧠 Cortex AI: Snowflake Cortex integration with natural language to SQL (26 tools)
    📊 Wright Module: Hierarchy-driven data mart generation with 4-object pipeline (31 tools)
    📚 Data Catalog: Centralized metadata registry with business glossary (19 tools)
    🔗 GraphRAG: Knowledge graph + vector search for explainable AI grounding (10 tools)
    📈 Observability: Metric recording, anomaly detection, asset health monitoring (15 tools)
    📦 Data Versioning: Dataset snapshots, diffs, and rollback (12 tools)
    🔍 Lineage Tracking: Column-level lineage and impact analysis (11 tools)
    ✅ Data Quality: Expectation suites and data contracts (7 tools)
    🛡️ DataShield: Offline data masking before AI processing
    🔧 dbt Integration: Generate dbt projects from hierarchies (8 tools)

    Quick Start

    # Install from PyPI (Community Edition)
    pip install databridge-ai

    # Or install Pro (requires license key)
    pip install databridge-ai-pro
    export DATABRIDGE_LICENSE_KEY="DB-PRO-..."

    # Run as MCP Server
    python -m src.server

    Architecture

    graph TD
      A[Claude / MCP Client] --> B[MCP Protocol]
      B --> C[DataBridge MCP Server 433 Tools]
      C --> D[Hierarchy Builder 49 tools]
      C --> E[Data Reconciliation 38 tools]
      C --> F[BLCE Engine 72 tools]
      C --> G[Wright Module 31 tools]
      C --> H[Cortex AI 26 tools]
      C --> I[Data Catalog 19 tools]
      C --> J[Observability 15 tools]
      C --> K[Other Modules]
      F --> L[(Snowflake)]
      G --> L
      H --> L
      D --> M[GraphRAG Store]
      F --> M
      I --> M

    All 28 Tool Categories (433 Total)

    Tool availability depends on your license tier: CE (Community), Pro, or Enterprise.

    Module | Tools | Tier | Key Tools
    File Discovery | 3 | CE | find_files, stage_file
    Data Reconciliation | 38 | CE | load_csv, profile_data, fuzzy_match_columns
    Hierarchy Builder | 49 | CE | create_hierarchy, import_flexible_hierarchy, export_hierarchy_csv
    Hierarchy-Graph Bridge | 5 | CE | hierarchy_graph_status, hierarchy_rag_search
    Templates / Skills / KB | 16 | CE | list_financial_templates, get_skill_prompt
    Git Automation | 4 | CE | commit_dbt_project, create_deployment_pr
    SQL Discovery | 2 | CE | sql_to_hierarchy, smart_analyze_sql
    Mapping Enrichment | 5 | CE | configure_mapping_enrichment, enrich_mapping_file
    BLCE Engine | 72 | CE | blce_parse_sql, blce_generate_ddl, blce_execute_ddl, model_ask
    AI Orchestrator | 16 | Pro | submit_orchestrated_task, register_agent
    Planner Agent | 11 | Pro | plan_workflow, suggest_agents
    Smart Recommendations | 5 | Pro | get_smart_recommendations, smart_import_csv
    Diff Utilities | 6 | CE | diff_text, diff_dicts, explain_diff
    Unified AI Agent | 10 | Pro | checkout_librarian_to_book, sync_book_and_librarian
    Cortex Agent | 12 | Pro | cortex_complete, cortex_reason
    Cortex Analyst | 14 | Pro | analyst_ask, create_semantic_model
    Console Dashboard | 5 | CE | start_console_server, broadcast_console_message
    dbt Integration | 8 | CE | create_dbt_project, generate_dbt_model
    Data Quality | 7 | CE | generate_expectation_suite, run_validation
    Wright Module | 31 | Pro | create_mart_config, generate_mart_pipeline, wright_from_hierarchy
    Lineage & Impact | 11 | Pro | track_column_lineage, analyze_change_impact
    Git / CI-CD | 12 | Pro | git_commit, github_create_pr
    Data Catalog | 19 | Pro | catalog_scan_connection, catalog_search
    Data Versioning | 12 | Pro | version_create, version_diff, version_rollback
    GraphRAG Engine | 10 | Pro | rag_search, rag_validate_output, rag_entity_extract
    Data Observability | 15 | Pro | obs_record_metric, obs_create_alert_rule
    Cortex Table Understanding | 5 | Pro | generate_table_understanding, batch_table_understanding
    AI Relationship Discovery | 8 | Pro | ai_analyze_schema, ai_detect_relationships
    Total | 433 | |

    Available Templates

    Accounting Domain

    Template ID | Name | Industry
    standard_pl | Standard P&L | General
    standard_bs | Standard Balance Sheet | General
    oil_gas_los | Oil & Gas LOS | Oil & Gas
    upstream_oil_gas_pl | Upstream Oil & Gas P&L | Oil & Gas - E&P
    manufacturing_pl | Industrial Manufacturing P&L | Manufacturing
    saas_pl | SaaS Company P&L | SaaS

    Operations Domain

    Template ID | Name | Industry
    geographic_hierarchy | Geographic Hierarchy | General
    department_hierarchy | Department Hierarchy | General
    upstream_field_hierarchy | Upstream Field Hierarchy | Oil & Gas
    fleet_hierarchy | Fleet & Route Hierarchy | Transportation

    ERP Data Model Templates (BLCE)

    Pre-built Kimball data model specs for common ERP systems. Used by the BLCE engine to generate dimension and fact tables automatically.

    ERP System | Config File | Pre-Built Dims | Pre-Built Facts
    Enertia | erp_configs/enertia.json | 14 | 8
    WolfePak | erp_configs/wolfepak.json | 12 | 7
    SAP (O&G) | erp_configs/sap_og.json | 18 | 12
    NetSuite | erp_configs/netsuite.json | 10 | 6
    QuickBooks | erp_configs/quickbooks.json | 6 | 3
    ProCount | erp_configs/procount.json | 8 | 5

    Built-in Skills

    Skill ID | Name | Industries | Capabilities
    financial-analyst | Financial Analyst | General | GL reconciliation, trial balance, bank rec, COA design
    fpa-oil-gas-analyst | FP&A Oil & Gas Analyst | Oil & Gas | LOS analysis, JIB, reserves, hedge accounting
    manufacturing-analyst | Manufacturing Analyst | Manufacturing | Standard costing, COGS, variances, inventory
    saas-metrics-analyst | SaaS Metrics Analyst | SaaS | ARR/MRR, cohorts, CAC/LTV, unit economics
    transportation-analyst | Transportation Analyst | Transportation | Operating ratio, fleet, lanes, driver metrics

    BLCE Auto-Generated Skills

    The BLCE engine automatically generates domain-specific skill prompts from each analysis run. Skills are reusable and shareable across projects.

    Skill Type | Generated From | Example
    Domain Expert | Normalized measures + governance metadata | "Revenue analysis for Enertia upstream O&G"
    Query Assistant | Bus matrix + model metadata | "Query the well production fact table"
    Report Builder | Report suggestions + templates | "Build a lease operating statement"

    API Reference

    MCP Configuration (Claude Desktop)

    {
      "mcpServers": {
        "DataBridge_AI": {
          "command": "python",
          "args": ["-m", "src.server"]
        }
      }
    }

    Programmatic Usage

    from src.server import mcp

    # Run as MCP server
    mcp.run()

    # Or get tools list
    tools = await mcp.get_tools()
    print(f"Loaded {len(tools)} tools")

    License Key System

    DataBridge uses a tiered license system. Community Edition is free; Pro and Enterprise require a license key.

    # License key format: DB-{TIER}-{CUSTOMER_ID}-{EXPIRY}-{SIGNATURE}
    # Example:
    export DATABRIDGE_LICENSE_KEY="DB-PRO-ACME001-20270209-a1b2c3d4e5f6"

    # Generate a license key (admin)
    python scripts/generate_license.py PRO CUSTOMER01 365

    # Check license status (MCP tool)
    get_license_status()
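The documented key format can be split into its parts as sketched below. Signature verification is deliberately omitted; how DataBridge actually validates signatures is not shown here, and the `parse_license_key` helper is illustrative only.

```python
# Sketch of parsing the documented DB-{TIER}-{CUSTOMER_ID}-{EXPIRY}-{SIGNATURE}
# format. Real validation (signature checking) is assumed to differ.
from datetime import date, datetime

def parse_license_key(key):
    prefix, tier, customer_id, expiry, signature = key.split("-")
    if prefix != "DB":
        raise ValueError("not a DataBridge key")
    expires = datetime.strptime(expiry, "%Y%m%d").date()
    return {
        "tier": tier,
        "customer_id": customer_id,
        "expires": expires,
        "expired": expires < date.today(),
        "signature": signature,
    }

info = parse_license_key("DB-PRO-ACME001-20270209-a1b2c3d4e5f6")
print(info["tier"], info["expires"])
```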

    Environment Variables

    Variable | Description | Default
    DATABRIDGE_LICENSE_KEY | License key for Pro/Enterprise features | - (CE mode)
    DATABRIDGE_LICENSE_SECRET | License signing secret (admin only) | -
    DATA_DIR | Data directory for projects | ./data
    NESTJS_BACKEND_URL | NestJS backend URL | http://localhost:8001
    NESTJS_API_KEY | API key for backend | -
    SNOWFLAKE_ACCOUNT | Snowflake account identifier | -
    SNOWFLAKE_USER | Snowflake authentication user | -
    DATABRIDGE_FUZZY_THRESHOLD | Fuzzy match score threshold (0-100) | 80
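Reading these variables with their documented defaults might look like the sketch below; the `load_config` helper is hypothetical and covers only a subset of the table.

```python
# Minimal sketch of loading a subset of the variables above with their
# documented defaults. The load_config helper is invented for illustration.
import os

def load_config(env=None):
    env = os.environ if env is None else env
    return {
        "license_key": env.get("DATABRIDGE_LICENSE_KEY"),  # absent -> CE mode
        "data_dir": env.get("DATA_DIR", "./data"),
        "backend_url": env.get("NESTJS_BACKEND_URL", "http://localhost:8001"),
        "fuzzy_threshold": int(env.get("DATABRIDGE_FUZZY_THRESHOLD", "80")),
    }

cfg = load_config({})  # empty environment: all defaults, CE mode
print(cfg)
```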

    Platform Architecture Diagrams

    BLCE 21-Phase Pipeline

    The Business Logic Comprehension Engine processes ERP data through 21 sequential phases, from intake to deployment.

    Intake & Discovery
    1. E2E Chain
    2. Intake
    3. Consultant
    4. Catalog
    5. Parse
    6. Reports
    7. Normalize
    Analysis & Modeling
    8. Hierarchy
    9. CrossRef
    10. Evidence
    11. Governance
    12. Model Gen
    13. Persist
    14. Bus Matrix
    Enrichment & Deploy
    15. Quality
    16. Skills
    17. AI Enrich
    18. Swarm
    19. Auto Build
    20. Artifacts
    21. Deploy

    Wright Pipeline Flow

    The Wright module generates a 4-object Snowflake Dynamic Table pipeline from hierarchy projects.

    graph LR
      H[Hierarchy Project] --> VW1[VW_1 Translation View]
      VW1 --> DT2[DT_2 Granularity Table]
      DT2 --> DT3A[DT_3A Pre-Aggregation]
      DT3A --> DT3[DT_3 Final Data Mart]
      DT3 --> SF[(Snowflake)]

    Cortex AI Pipeline

    Snowflake Cortex integration for AI-powered analytics with natural language queries.

    graph TD
      Q[Natural Language Query] --> CA[Cortex Agent]
      CA --> SM[Semantic Model]
      SM --> SQL[Generated SQL]
      SQL --> SF[(Snowflake)]
      SF --> R[Results]
      CA --> CR[Cortex Reason]
      CR --> I[Insights]

    Data Catalog & Observability

    Centralized metadata, lineage tracking, and real-time health monitoring.

    graph TD
      SC[Catalog Scanner] --> CAT[Data Catalog 19 tools]
      CAT --> LIN[Lineage Graph 11 tools]
      CAT --> GL[Business Glossary]
      OBS[Observability 15 tools] --> MET[Metrics Store]
      OBS --> ALR[Alert Rules]
      OBS --> AH[Asset Health]
      LIN --> GR[GraphRAG 10 tools]
      CAT --> GR

    Commercialization Tiers

    Three-tier licensing model with increasing tool counts and capabilities.

    graph TD
      CE[Community Edition ~128 tools Free - PyPI] --> PRO[Pro Edition ~369 tools Licensed - GitHub Packages]
      PRO --> ENT[Enterprise 433+ tools Custom Deploy]
      CE --> EX[Pro Examples 47 tests + 29 use cases]

    Changelog

    v0.41.1 - February 17, 2026

    • BLCE P5: DDL executor + deployment phase (phase 21)
    • 22 new tools added (tools 51-72), 5 new phases (17-21)
    • Auto-build pipeline: schema creation, DDL execution, validation
    • Swarm orchestration for parallel AI enrichment
    • Artifact bundle generation with rich HTML reports
    • Dashboard UI refresh with Architecture/Changelog tabs, BLCE Engine page
    • Total tool count: 433

    v0.41.0 - February 16, 2026

    • BLCE Engine launch: Business Logic Comprehension Engine
    • 50 initial tools across 16 phases
    • SQL parsing, measure normalization, cross-referencing
    • Evidence collection, governance metadata, model generation
    • Bus matrix generation, quality validation
    • 601 tests passing

    v0.40.0 - January 15, 2026

    • E2E Assessment Pipeline: 15-phase orchestrated workflow
    • DataShield UI: offline data masking before AI processing
    • Snowflake Connection Pool: singleton SSO auth for pipelines
    • Bulk VARIANT loader for Snowflake persistence
    • ERP config registry with auto-detect + Enertia preset
    • Report generator with KPI tiles, bus matrix, timeline

    v0.39.0 - December 2025

    • Data Observability: metric recording, anomaly detection, asset health
    • GraphRAG Engine: knowledge graph + vector search
    • Data Versioning: snapshots, diffs, and rollback
    • AI Relationship Discovery: schema analysis, naming patterns, FK detection
    • Cortex Table Understanding: AI-generated table summaries

    🧬 BLCE Engine

    The Business Logic Comprehension Engine (BLCE) is DataBridge AI's core analytical engine. It ingests raw ERP SQL views and tables, extracts business logic, normalizes measures, discovers hierarchies, and generates a complete Kimball-style data warehouse — all through a 21-phase automated pipeline.

    72
    MCP Tools
    21
    Pipeline Phases
    6
    ERP Templates
    17
    Pydantic Contracts

    21-Phase Pipeline

    Intake & Discovery
    1. E2E Chain
    2. Intake
    3. Consultant
    4. Catalog
    5. Parse
    6. Reports
    7. Normalize
    Analysis & Modeling
    8. Hierarchy
    9. CrossRef
    10. Evidence
    11. Governance
    12. Model Gen
    13. Persist
    14. Bus Matrix
    Enrichment & Deploy
    15. Quality
    16. Skills
    17. AI Enrich
    18. Swarm
    19. Auto Build
    20. Artifacts
    21. Deploy

    How It Works

    Phase Group | Phases | Purpose
    Intake & Discovery | 1-6 | Connect to ERP, catalog tables, parse SQL, identify reports
    Analysis & Normalization | 7-9 | Normalize measures, detect hierarchies, cross-reference
    Governance & Modeling | 10-14 | Collect evidence, apply governance, generate Kimball model, bus matrix
    Quality & Skills | 15-16 | Validate data quality, generate domain-specific AI skills
    Enrichment & Build | 17-21 | AI enrichment, swarm orchestration, auto-build DDL, deploy

    72 BLCE Tools by Function

    Parsing (8 tools)

    blce_parse_sql, blce_parse_batch, blce_parse_status, blce_detect_joins, blce_detect_measures, blce_detect_dimensions, blce_resolve_cte, blce_extract_filters

    Normalization (8 tools)

    blce_normalize_measures, blce_normalize_batch, blce_canonical_names, blce_detect_duplicates, blce_merge_measures, blce_classify_measures, blce_suggest_aggregations, blce_validate_normalization

    Evidence & Governance (8 tools)

    blce_collect_evidence, blce_score_confidence, blce_governance_check, blce_tag_pii, blce_lineage_trace, blce_audit_log, blce_provenance_report, blce_compliance_scan

    Workflow & Orchestration (8 tools)

    blce_run_phase, blce_run_pipeline, blce_phase_status, blce_resume_pipeline, blce_rollback_phase, blce_checkpoint, blce_validate_pipeline, blce_pipeline_report

    Agent & Swarm (12 tools)

    blce_agent_analyze, blce_agent_recommend, blce_agent_explain, blce_agent_validate, blce_agent_enrich, blce_agent_summarize, blce_swarm_dispatch, blce_swarm_status, blce_swarm_collect, blce_swarm_merge, blce_swarm_cancel, blce_swarm_report

    Client Interaction & Intake (12 tools)

    blce_consultant_intake, blce_consultant_questions, blce_consultant_summary, blce_client_objectives, blce_client_priorities, blce_client_review, model_ask, model_query_builder, model_explain, model_suggest, model_compare, model_validate

    Model Generation (10 tools)

    blce_generate_dim, blce_generate_fact, blce_generate_bridge, blce_generate_ddl, blce_generate_bus_matrix, blce_generate_star_schema, blce_validate_model, blce_model_diff, blce_model_export, blce_model_import

    BI Export & Deployment (6 tools)

    blce_export_sigma, blce_export_powerbi, blce_export_tableau, blce_cross_reference, blce_execute_ddl, blce_deploy_validate

    17 Pydantic Contracts

    BLCE uses strongly-typed Pydantic models at every phase boundary. Each contract validates data flowing between phases.

    Contract | Prefix | Purpose
    ParsedSQL | PSQL_ | Validated SQL parse tree with CTEs, joins, measures
    NormalizedMeasure | NM_ | Canonical measure with aggregation type, grain, units
    DetectedHierarchy | DH_ | Discovered hierarchy levels with parent-child links
    CrossReference | XR_ | Cross-table relationships with confidence scores
    EvidenceRecord | ER_ | Source evidence for each analytical decision
    GovernanceTag | GT_ | PII/sensitivity classification, retention policy
    DimensionSpec | DS_ | Kimball dimension definition with SCD type
    FactSpec | FS_ | Kimball fact table with grain, measures, FK links
    BusMatrixEntry | BM_ | Fact-dimension intersection for bus matrix
    QualityRule | QR_ | Data quality expectation with threshold
    SkillPrompt | SP_ | Generated AI skill with domain context
    EnrichmentResult | ENR_ | AI-enriched metadata and descriptions
    SwarmTask | ST_ | Parallel task definition for swarm orchestration
    DDLStatement | DDL_ | Generated CREATE TABLE/VIEW statement
    DeploymentPlan | DP_ | Ordered DDL execution plan with rollback
    ArtifactBundle | AB_ | HTML report, JSON metadata, diagram outputs
    PipelineState | PS_ | Checkpoint state for pipeline resume/rollback
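The idea of a typed phase-boundary record can be sketched with a stdlib dataclass; the real contracts are Pydantic models, and the field names below (beyond the NM_ prefix convention from the table) are invented for illustration.

```python
# Illustrative phase-boundary contract. The real NormalizedMeasure is a
# Pydantic model; this stdlib dataclass only shows the typed-record idea.
from dataclasses import dataclass

@dataclass(frozen=True)
class NormalizedMeasure:
    id: str           # canonical id, carrying the NM_ prefix per the table above
    name: str
    aggregation: str  # e.g. SUM, AVG (field names are hypothetical)
    grain: str
    units: str = "USD"

    def __post_init__(self):
        # Validate the documented prefix convention at construction time.
        if not self.id.startswith("NM_"):
            raise ValueError("NormalizedMeasure ids carry the NM_ prefix")

m = NormalizedMeasure(id="NM_REVENUE", name="Revenue", aggregation="SUM", grain="monthly")
print(m.id, m.aggregation)
```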

    ⬡ Hierarchy Builder

    Hierarchy Tree

    Select a project to view its hierarchy tree.

    Select a node from the tree to edit its details.