Development Commands
The development commands support local workflows: creating sample projects, importing existing dbt projects, running the semantic graph HTTP server, IDE integration via LSP, and scaffolding new warehouse adapter crates.
rocky playground
Create a self-contained sample project using DuckDB as the local execution engine. No warehouse credentials or external services are required. Useful for learning Rocky, testing model logic, and rapid prototyping.
```
rocky playground [path]
```

Arguments

| Argument | Type | Default | Description |
|---|---|---|---|
| path | string | rocky-playground | Directory name for the playground project. |
Examples
Create a playground with the default name:

```
rocky playground
Created rocky-playground/rocky.toml (DuckDB config)
Created rocky-playground/models/stg_orders.sql
Created rocky-playground/models/stg_orders.toml
Created rocky-playground/models/stg_customers.sql
Created rocky-playground/models/stg_customers.toml
Created rocky-playground/models/fct_revenue.sql
Created rocky-playground/models/fct_revenue.toml
Created rocky-playground/seeds/orders.csv
Created rocky-playground/seeds/customers.csv

Playground ready! Run:
  cd rocky-playground
  rocky compile
  rocky test
```

Create a playground with a custom name:

```
rocky playground my-experiment
Created my-experiment/rocky.toml (DuckDB config)
Created my-experiment/models/...

Playground ready! Run:
  cd my-experiment
  rocky compile
  rocky test
```

Related Commands
- rocky init — create a production project
- rocky test — run tests in the playground
- rocky compile — compile playground models
rocky import-dbt
Import an existing dbt project and convert it to Rocky models. Translates dbt SQL (Jinja + ref/source macros) and YAML config into Rocky’s .sql + .toml sidecar format.
```
rocky import-dbt [flags]
```

| Flag | Type | Default | Description |
|---|---|---|---|
| --dbt-project <PATH> | PathBuf | (required) | Path to the dbt project directory (containing dbt_project.yml). |
| --output <PATH> | PathBuf | models | Output directory for the generated Rocky model files. |
Examples
Import a dbt project:

```
rocky import-dbt --dbt-project ~/projects/acme-dbt
{
  "version": "0.1.0",
  "command": "import-dbt",
  "models_imported": 24,
  "sources_imported": 6,
  "warnings": [
    {
      "model": "stg_payments",
      "message": "custom Jinja macro 'cents_to_dollars' not translated, left as comment"
    },
    {
      "model": "fct_orders",
      "message": "incremental_strategy 'merge' converted to Rocky incremental with note"
    }
  ],
  "output_dir": "models/"
}
```

Import to a custom output directory:

```
rocky import-dbt --dbt-project ~/projects/acme-dbt --output src/models
{
  "version": "0.1.0",
  "command": "import-dbt",
  "models_imported": 24,
  "sources_imported": 6,
  "warnings": [],
  "output_dir": "src/models/"
}
```

Import and then compile to verify:

```
rocky import-dbt --dbt-project ~/projects/acme-dbt && rocky compile
```

Related Commands
- rocky compile — compile the imported models
- rocky init — create a new project first, then import
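Because import-dbt prints a machine-readable JSON report, the warnings can be triaged in a script, for example to list the models that need manual follow-up. A minimal Python sketch using the report fields shown in the example above (the parsing code itself is illustrative, not part of Rocky):

```python
import json

# JSON report as printed by `rocky import-dbt` (taken from the example above).
report = json.loads("""
{
  "version": "0.1.0",
  "command": "import-dbt",
  "models_imported": 24,
  "sources_imported": 6,
  "warnings": [
    {"model": "stg_payments",
     "message": "custom Jinja macro 'cents_to_dollars' not translated, left as comment"},
    {"model": "fct_orders",
     "message": "incremental_strategy 'merge' converted to Rocky incremental with note"}
  ],
  "output_dir": "models/"
}
""")

# Summarize the import and flag models that need manual review.
total = report["models_imported"] + report["sources_imported"]
needs_review = sorted({w["model"] for w in report["warnings"]})

print(f"imported {total} objects into {report['output_dir']}")
for model in needs_review:
    print(f"review needed: {model}")
```

Piping the command output straight into such a script (e.g. `rocky import-dbt ... | python triage.py`) makes a convenient pre-compile checklist.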
rocky serve
Start an HTTP API server that exposes the compiler’s semantic graph. Provides REST endpoints for model metadata, lineage, and compilation results. Useful for editor integrations, dashboards, and custom tooling.
```
rocky serve [flags]
```

| Flag | Type | Default | Description |
|---|---|---|---|
| --models <PATH> | PathBuf | models | Directory containing model files. |
| --contracts <PATH> | PathBuf | — | Directory containing data contract definitions. |
| --port <PORT> | u16 | 8080 | Port to listen on. |
| --watch | bool | false | Watch for file changes and auto-recompile. |
Examples
Start the server with defaults:

```
rocky serve
Compiled 14 models in 42ms
Listening on http://127.0.0.1:8080
Endpoints:
  GET /api/models        - List all compiled models
  GET /api/models/:name  - Get model details
  GET /api/lineage/:name - Column-level lineage
  GET /api/dag           - Full dependency graph
```

Start with file watching on a custom port:

```
rocky serve --port 3000 --watch
Compiled 14 models in 42ms
Watching models/ for changes...
Listening on http://127.0.0.1:3000
```

Start with contracts:

```
rocky serve --models src/models --contracts src/contracts --port 9090 --watch
```

Related Commands
- rocky lsp — IDE integration via Language Server Protocol
- rocky compile — one-shot compilation without the server
- rocky lineage — CLI lineage (the server exposes the same data via HTTP)
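Custom tooling can target the endpoints the server prints in its startup banner. A minimal Python sketch that builds the documented endpoint URLs and includes a fetch helper; the helper assumes a server is already running on the default address, and the response shapes are not specified here:

```python
import json
from urllib.parse import quote
from urllib.request import urlopen

BASE = "http://127.0.0.1:8080"  # default `rocky serve` address

# Endpoint paths as listed in the server's startup banner.
def models_url():
    return f"{BASE}/api/models"

def model_url(name):
    return f"{BASE}/api/models/{quote(name)}"

def lineage_url(name):
    return f"{BASE}/api/lineage/{quote(name)}"

def dag_url():
    return f"{BASE}/api/dag"

def fetch(url):
    """Fetch and JSON-decode one endpoint (requires a running `rocky serve`)."""
    with urlopen(url) as resp:
        return json.load(resp)

print(lineage_url("fct_revenue"))
```

With `--port 3000`, change `BASE` accordingly; the paths themselves do not change.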
rocky lsp
Start a Language Server Protocol server for IDE integration. Provides diagnostics, completions, hover information, and go-to-definition for Rocky SQL models.
```
rocky lsp
```

No command-specific flags. The LSP server communicates over stdin/stdout per the LSP specification.
Examples
Start the LSP server (typically called by an editor, not directly):

```
rocky lsp
```

Configure in VS Code (settings.json):

```json
{
  "rocky.lsp.path": "rocky",
  "rocky.lsp.args": ["lsp"]
}
```

Configure in Neovim (with lspconfig):

```lua
require('lspconfig').rocky.setup({
  cmd = { "rocky", "lsp" },
  filetypes = { "sql" },
  root_dir = function(fname)
    return require('lspconfig.util').root_pattern('rocky.toml')(fname)
  end,
})
```

Related Commands
- rocky serve — HTTP API server (alternative integration method)
- rocky compile — the LSP uses the same compilation engine
rocky init-adapter
Scaffold a new warehouse adapter crate. Creates the directory structure, Cargo.toml, and trait implementation stubs for building a custom adapter (e.g., BigQuery, Redshift, Snowflake).
```
rocky init-adapter <name>
```

Arguments

| Argument | Type | Default | Description |
|---|---|---|---|
| name | string | (required) | Adapter name (e.g., bigquery, redshift, snowflake). |
Examples
Scaffold a BigQuery adapter:

```
rocky init-adapter bigquery
Created crates/rocky-bigquery/Cargo.toml
Created crates/rocky-bigquery/src/lib.rs
Created crates/rocky-bigquery/src/connector.rs
Created crates/rocky-bigquery/src/auth.rs

Adapter scaffold ready at crates/rocky-bigquery/
Implement the WarehouseAdapter trait in src/connector.rs to get started.
```

Scaffold a Snowflake adapter:

```
rocky init-adapter snowflake
Created crates/rocky-snowflake/Cargo.toml
Created crates/rocky-snowflake/src/lib.rs
Created crates/rocky-snowflake/src/connector.rs
Created crates/rocky-snowflake/src/auth.rs

Adapter scaffold ready at crates/rocky-snowflake/
Implement the WarehouseAdapter trait in src/connector.rs to get started.
```

Related Commands
- rocky compile — compile models using the new adapter
- rocky validate — validate config after registering the adapter
rocky hooks
Manage lifecycle hooks configured in rocky.toml.
rocky hooks list
List all configured hooks.
```
rocky hooks list
```

rocky hooks test
Fire a synthetic test event to validate hook scripts.
```
rocky hooks test <EVENT>
```

| Argument | Type | Description |
|---|---|---|
| EVENT | string | Event name (e.g., pipeline_start, materialize_error) |
Examples
```
$ rocky hooks test pipeline_start
Firing test event: pipeline_start
Hook 'bash scripts/notify.sh': OK (exit 0, 120ms)
```

rocky validate-migration
Compare a dbt project against its Rocky import to verify correctness.
```
rocky validate-migration [flags]
```

| Flag | Type | Default | Description |
|---|---|---|---|
| --dbt-project | path | (required) | Path to the dbt project |
| --rocky-project | path | — | Path to the Rocky project (defaults to current directory) |
| --sample-size | number | — | Number of sample rows for data comparison |
Examples
```
$ rocky validate-migration --dbt-project ~/dbt-project
Validating 12 models...
  stg_customers: PASS (schema match, row count match)
  fct_orders: PASS (schema match, row count match)
  dim_products: WARN (column order differs)
```

rocky test-adapter
Run conformance tests against a warehouse adapter.
```
rocky test-adapter [flags]
```

| Flag | Type | Default | Description |
|---|---|---|---|
| --adapter | string | — | Built-in adapter name (databricks, snowflake, duckdb) |
| --command | string | — | Path to a process adapter binary |
| --adapter-config | string | — | JSON config to pass to the adapter |
Examples
```
$ rocky test-adapter --adapter duckdb
Running conformance tests for duckdb...
  19/19 core tests passed
  3/7 optional tests passed (4 skipped: not supported)
```

Related Commands
- rocky init-adapter — scaffold a new adapter crate
rocky doctor
Health checks for your Rocky project: config validation, state store integrity, adapter connectivity, pipeline consistency, state sync, and auth verification.
```
rocky doctor                 # Run all checks
rocky doctor --check config  # Run only the config check
rocky doctor --check auth    # Verify credentials + connectivity for all adapters
```

The auth check pings each registered warehouse adapter (via SELECT 1 or a cheaper adapter-specific query) and each discovery adapter, and reports per-adapter pass/fail with latency.
See the CLI Reference for the full check list and JSON output format.
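The SELECT 1 probe that the auth check runs is simple to picture in code. A minimal Python sketch, using an in-memory sqlite3 connection as a stand-in for a warehouse adapter (the probe function and its return shape are illustrative, not Rocky's actual implementation):

```python
import sqlite3
import time

def probe(conn):
    """Run a `SELECT 1` liveness query and time it, mirroring the
    per-adapter check performed by `rocky doctor --check auth`.
    Returns (passed, latency_in_ms)."""
    start = time.perf_counter()
    try:
        ok = conn.execute("SELECT 1").fetchone() == (1,)
    except Exception:
        ok = False
    return ok, (time.perf_counter() - start) * 1000.0

# sqlite3 stands in here for a real warehouse connection.
conn = sqlite3.connect(":memory:")
ok, ms = probe(conn)
print(f"adapter check: {'PASS' if ok else 'FAIL'} ({ms:.1f}ms)")
```

The try/except matters: a failed credential or network error should surface as a per-adapter FAIL, not crash the whole doctor run.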
rocky list
Inspect project contents without running a pipeline.
```
rocky list pipelines          # Pipeline definitions (type, adapters, depends_on)
rocky list adapters           # Adapter configurations (type, host)
rocky list models             # Transformation models (target, strategy, contract, deps)
rocky list sources            # Replication source configurations
rocky list deps <model>       # What this model depends on
rocky list consumers <model>  # What depends on this model
```

All subcommands support JSON output via --output json (short form: -o json). Models are discovered from the models/ directory (and immediate subdirectories, for the common models/{layer}/ layout).
See the CLI Reference for full examples and JSON output schemas.
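rocky list deps and rocky list consumers walk the same dependency graph in opposite directions: deps follows a model's edges upstream, consumers scans for models whose edges point at it. A minimal Python sketch of the two queries over a toy edge list (the graph and helpers are illustrative; model names are borrowed from the playground example):

```python
# Toy dependency graph: model -> models it depends on.
DEPS = {
    "stg_orders": [],
    "stg_customers": [],
    "fct_revenue": ["stg_orders", "stg_customers"],
}

def deps(model):
    """Direct upstream models (what `rocky list deps <model>` reports)."""
    return sorted(DEPS.get(model, []))

def consumers(model):
    """Direct downstream models (what `rocky list consumers <model>` reports)."""
    return sorted(m for m, ds in DEPS.items() if model in ds)

print(deps("fct_revenue"))      # upstream of fct_revenue
print(consumers("stg_orders"))  # downstream of stg_orders
```

The asymmetry explains the relative cost: deps is a direct lookup, while consumers requires scanning every model's edge list (or maintaining a reverse index).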