
Development Commands

The development commands support local workflows: creating sample projects, importing existing dbt projects, running the semantic graph HTTP server, integrating with IDEs via LSP, scaffolding new warehouse adapter crates, managing lifecycle hooks, validating dbt migrations, testing adapter conformance, and inspecting project health and contents.


Create a self-contained sample project using DuckDB as the local execution engine. No warehouse credentials or external services are required. Useful for learning Rocky, testing model logic, and rapid prototyping.

rocky playground [path]

| Argument | Type | Default | Description |
| --- | --- | --- | --- |
| `path` | string | `rocky-playground` | Directory name for the playground project. |

Create a playground with the default name:

rocky playground
Created rocky-playground/rocky.toml (DuckDB config)
Created rocky-playground/models/stg_orders.sql
Created rocky-playground/models/stg_orders.toml
Created rocky-playground/models/stg_customers.sql
Created rocky-playground/models/stg_customers.toml
Created rocky-playground/models/fct_revenue.sql
Created rocky-playground/models/fct_revenue.toml
Created rocky-playground/seeds/orders.csv
Created rocky-playground/seeds/customers.csv
Playground ready! Run:
cd rocky-playground
rocky compile
rocky test

Create a playground with a custom name:

rocky playground my-experiment
Created my-experiment/rocky.toml (DuckDB config)
Created my-experiment/models/...
Playground ready! Run:
cd my-experiment
rocky compile
rocky test

Import an existing dbt project and convert it to Rocky models. Translates dbt SQL (Jinja + ref/source macros) and YAML config into Rocky’s .sql + .toml sidecar format.

rocky import-dbt [flags]

| Flag | Type | Default | Description |
| --- | --- | --- | --- |
| `--dbt-project <PATH>` | PathBuf | (required) | Path to the dbt project directory (containing `dbt_project.yml`). |
| `--output <PATH>` | PathBuf | `models` | Output directory for the generated Rocky model files. |

Import a dbt project:

rocky import-dbt --dbt-project ~/projects/acme-dbt
{
  "version": "0.1.0",
  "command": "import-dbt",
  "models_imported": 24,
  "sources_imported": 6,
  "warnings": [
    { "model": "stg_payments", "message": "custom Jinja macro 'cents_to_dollars' not translated, left as comment" },
    { "model": "fct_orders", "message": "incremental_strategy 'merge' converted to Rocky incremental with note" }
  ],
  "output_dir": "models/"
}

Import to a custom output directory:

rocky import-dbt --dbt-project ~/projects/acme-dbt --output src/models
{
  "version": "0.1.0",
  "command": "import-dbt",
  "models_imported": 24,
  "sources_imported": 6,
  "warnings": [],
  "output_dir": "src/models/"
}

Import and then compile to verify:

rocky import-dbt --dbt-project ~/projects/acme-dbt && rocky compile
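
For reference, a generated sidecar might look like the sketch below. Every key shown here is an illustrative assumption, not the documented Rocky schema; inspect the files the importer actually writes.

```toml
# models/stg_orders.toml -- hypothetical sidecar sketch; all field names
# are assumptions for illustration, not the documented schema
[model]
materialization = "view"
depends_on = ["source.orders"]

[columns.order_id]
description = "Primary key of the order"
```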

Start an HTTP API server that exposes the compiler’s semantic graph. Provides REST endpoints for model metadata, lineage, and compilation results. Useful for editor integrations, dashboards, and custom tooling.

rocky serve [flags]

| Flag | Type | Default | Description |
| --- | --- | --- | --- |
| `--models <PATH>` | PathBuf | `models` | Directory containing model files. |
| `--contracts <PATH>` | PathBuf | (none) | Directory containing data contract definitions. |
| `--port <PORT>` | u16 | `8080` | Port to listen on. |
| `--watch` | bool | `false` | Watch for file changes and auto-recompile. |

Start the server with defaults:

rocky serve
Compiled 14 models in 42ms
Listening on http://127.0.0.1:8080
Endpoints:
GET /api/models - List all compiled models
GET /api/models/:name - Get model details
GET /api/lineage/:name - Column-level lineage
GET /api/dag - Full dependency graph
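
Every endpoint returns JSON, so responses compose with standard shell tooling. The sketch below extracts model names from the list endpoint; the `echo` stands in for `curl -s http://127.0.0.1:8080/api/models` so the snippet runs without a live server, and the payload shape is an illustrative assumption, not the documented schema.

```shell
# Stand-in for: curl -s http://127.0.0.1:8080/api/models
# (the response shape is assumed for illustration)
echo '{"models":[{"name":"stg_orders"},{"name":"fct_revenue"}]}' |
python3 -c '
import json, sys
for m in json.load(sys.stdin)["models"]:
    print(m["name"])
'
```

The same pattern works for the `/api/lineage/:name` and `/api/dag` endpoints.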

Start with file watching on a custom port:

rocky serve --port 3000 --watch
Compiled 14 models in 42ms
Watching models/ for changes...
Listening on http://127.0.0.1:3000

Start with contracts:

rocky serve --models src/models --contracts src/contracts --port 9090 --watch

Related commands:
  • rocky lsp — IDE integration via Language Server Protocol
  • rocky compile — one-shot compilation without the server
  • rocky lineage — CLI lineage (the server exposes the same data via HTTP)

Start a Language Server Protocol server for IDE integration. Provides diagnostics, completions, hover information, and go-to-definition for Rocky SQL models.

rocky lsp

No command-specific flags. The LSP server communicates over stdin/stdout per the LSP specification.
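
For a sense of what travels over that stdio channel: LSP messages are JSON-RPC 2.0 bodies prefixed with a `Content-Length` header and a blank line. The sketch below only builds an `initialize` frame; to probe a live server you would pipe it into `rocky lsp`.

```shell
# Build a minimal LSP initialize frame (JSON-RPC 2.0 over stdio).
# Header is the body's byte length, then CRLF CRLF, then the JSON body.
body='{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"capabilities":{}}}'
printf 'Content-Length: %d\r\n\r\n%s' "${#body}" "$body"
```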

Start the LSP server (typically called by an editor, not directly):

rocky lsp

Configure in VS Code (settings.json):

{
  "rocky.lsp.path": "rocky",
  "rocky.lsp.args": ["lsp"]
}

Configure in Neovim (with lspconfig):

require('lspconfig').rocky.setup({
  cmd = { "rocky", "lsp" },
  filetypes = { "sql" },
  root_dir = function(fname)
    return require('lspconfig.util').root_pattern('rocky.toml')(fname)
  end,
})

Related commands:
  • rocky serve — HTTP API server (alternative integration method)
  • rocky compile — the LSP uses the same compilation engine

Scaffold a new warehouse adapter crate. Creates the directory structure, Cargo.toml, and trait implementation stubs for building a custom adapter (e.g., BigQuery, Redshift, Snowflake).

rocky init-adapter <name>

| Argument | Type | Default | Description |
| --- | --- | --- | --- |
| `name` | string | (required) | Adapter name (e.g., `bigquery`, `redshift`, `snowflake`). |

Scaffold a BigQuery adapter:

rocky init-adapter bigquery
Created crates/rocky-bigquery/Cargo.toml
Created crates/rocky-bigquery/src/lib.rs
Created crates/rocky-bigquery/src/connector.rs
Created crates/rocky-bigquery/src/auth.rs
Adapter scaffold ready at crates/rocky-bigquery/
Implement the WarehouseAdapter trait in src/connector.rs to get started.

Scaffold a Snowflake adapter:

rocky init-adapter snowflake
Created crates/rocky-snowflake/Cargo.toml
Created crates/rocky-snowflake/src/lib.rs
Created crates/rocky-snowflake/src/connector.rs
Created crates/rocky-snowflake/src/auth.rs
Adapter scaffold ready at crates/rocky-snowflake/
Implement the WarehouseAdapter trait in src/connector.rs to get started.

Manage lifecycle hooks configured in rocky.toml.

List all configured hooks.

rocky hooks list

Fire a synthetic test event to validate hook scripts.

rocky hooks test <EVENT>

| Argument | Type | Description |
| --- | --- | --- |
| `EVENT` | string | Event name (e.g., `pipeline_start`, `materialize_error`) |

Test hooks for the pipeline_start event:

rocky hooks test pipeline_start
Firing test event: pipeline_start
Hook 'bash scripts/notify.sh': OK (exit 0, 120ms)
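
A minimal hook script might look like this sketch. The interface assumed here (the event name arriving in a `ROCKY_EVENT` environment variable) is an illustration, not a documented contract; since `hooks test` reports the script's exit code, exit non-zero to signal failure.

```shell
# scripts/notify.sh (sketch) -- $ROCKY_EVENT is an assumed interface,
# not a documented contract
event="${ROCKY_EVENT:-unknown}"
echo "hook fired for event: ${event}"
```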

Compare a dbt project against its Rocky import to verify correctness.

rocky validate-migration [flags]

| Flag | Type | Default | Description |
| --- | --- | --- | --- |
| `--dbt-project` | path | (required) | Path to the dbt project. |
| `--rocky-project` | path | current directory | Path to the Rocky project. |
| `--sample-size` | number | (none) | Number of sample rows for data comparison. |

Validate an imported project:

rocky validate-migration --dbt-project ~/dbt-project
Validating 12 models...
stg_customers: PASS (schema match, row count match)
fct_orders: PASS (schema match, row count match)
dim_products: WARN (column order differs)

Run conformance tests against a warehouse adapter.

rocky test-adapter [flags]

| Flag | Type | Default | Description |
| --- | --- | --- | --- |
| `--adapter` | string | (none) | Built-in adapter name (`databricks`, `snowflake`, `duckdb`). |
| `--command` | string | (none) | Path to a process adapter binary. |
| `--adapter-config` | string | (none) | JSON config to pass to the adapter. |

Run the conformance suite against the built-in DuckDB adapter:

rocky test-adapter --adapter duckdb
Running conformance tests for duckdb...
19/19 core tests passed
3/7 optional tests passed (4 skipped: not supported)

Health checks for your Rocky project: config validation, state store integrity, adapter connectivity, pipeline consistency, state sync, and auth verification.

rocky doctor # Run all checks
rocky doctor --check config # Run only the config check
rocky doctor --check auth # Verify credentials + connectivity for all adapters

The auth check pings each registered warehouse adapter (via SELECT 1 or a cheaper adapter-specific query) and each discovery adapter, reporting per-adapter pass/fail along with latency.

See the CLI Reference for the full check list and JSON output format.


Inspect project contents without running a pipeline.

rocky list pipelines # Pipeline definitions (type, adapters, depends_on)
rocky list adapters # Adapter configurations (type, host)
rocky list models # Transformation models (target, strategy, contract, deps)
rocky list sources # Replication source configurations
rocky list deps <model> # What this model depends on
rocky list consumers <model> # What depends on this model

All subcommands support JSON output via --output json (short form: -o json). Models are discovered from the models/ directory (and its immediate subdirectories, for the common models/{layer}/ layout).
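
JSON output makes the listings scriptable. In this sketch the `echo` stands in for `rocky list models -o json` so it runs without a project; the schema shown is a guess, see the CLI Reference for the real one.

```shell
# Stand-in for: rocky list models -o json
# (the schema is assumed for illustration)
echo '[{"name":"fct_revenue","strategy":"incremental"},{"name":"stg_orders","strategy":"view"}]' |
python3 -c '
import json, sys
for m in json.load(sys.stdin):
    if m["strategy"] == "incremental":
        print(m["name"])
'
```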

See the CLI Reference for full examples and JSON output schemas.