# CLI Commands

The Snowpack CLI provides three commands for interactive use. Run all commands
with `uv run snowpack` from the project root.
Every command accepts `--spark-host`, `--spark-port`, and `--catalog` flags.
These can also be set via the `SNOWPACK_SPARK_HOST`, `SNOWPACK_SPARK_PORT`, and
`SNOWPACK_CATALOG` environment variables. Explicit flags take precedence over
environment variables. The global `--verbose` flag must be passed before the
command name, for example `uv run snowpack --verbose health my_db my_table`.
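For instance, the same non-default Spark host can be supplied either way; when both are present, the explicit flag wins. (The hostnames below are illustrative, not real endpoints.)

```shell
# Set the host once for the whole shell session...
export SNOWPACK_SPARK_HOST=spark.internal.example.com
uv run snowpack tables

# ...or override it for a single invocation; the explicit flag
# takes precedence over the environment variable.
uv run snowpack tables --spark-host spark-staging.example.com
```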
## snowpack tables

Lists all Iceberg tables visible in the catalog, optionally filtered to a single database.
### Flags

| Flag | Default | Description |
|---|---|---|
| `--spark-host` | config default | Spark Thrift Server / Kyuubi hostname. |
| `--spark-port` | config default | Spark Thrift Server / Kyuubi port. |
| `--catalog` | config default | Iceberg catalog name. |
| `--database`, `-d` | (all) | Filter to a single database. |
### Examples

List all tables across every database:

```shell
uv run snowpack tables
```

List tables in a specific database:

```shell
uv run snowpack tables --database my_database
```

Use a non-default Spark host:

```shell
uv run snowpack tables --spark-host spark.internal.example.com
```

## snowpack health
Fetches live health metrics for a single table directly from the PyIceberg catalog. Returns small file count, snapshot count, manifest count, and position delete file count.
### Flags

| Flag | Default | Description |
|---|---|---|
| `--spark-host` | config default | Spark Thrift Server / Kyuubi hostname. |
| `--spark-port` | config default | Spark Thrift Server / Kyuubi port. |
| `--catalog` | config default | Iceberg catalog name. |
### Positional arguments

| Argument | Description |
|---|---|
| `database` | Database name. |
| `table` | Table name. |
### Examples

Check health for a single table:

```shell
uv run snowpack health my_database my_table
```

Verbose output with the full metric breakdown:

```shell
uv run snowpack --verbose health my_database my_table
```

## snowpack maintain
Runs one or more maintenance actions directly for a single table. This CLI path
bypasses the REST API, Postgres queue, KEDA workers, and job history. At least
one `--action` flag is required.
### Flags

| Flag | Default | Description |
|---|---|---|
| `--spark-host` | config default | Spark Thrift Server / Kyuubi hostname. |
| `--spark-port` | config default | Spark Thrift Server / Kyuubi port. |
| `--catalog` | config default | Iceberg catalog name. |
| `--action` | required | Maintenance action to run. Repeatable: pass multiple `--action` flags to run several actions in sequence. Valid values: `rewrite_data_files`, `rewrite_position_delete_files`, `expire_snapshots`, `rewrite_manifests`, `remove_orphan_files`. Actions are executed in Snowpack's fixed enum order. |
| `--target-file-size-mb` | 512 | Target file size in MB for compaction. |
| `--min-file-size-mb` | 384 | Minimum file size in MB. Files smaller than this are candidates for compaction. |
| `--dry-run` | false | Log the maintenance plan without executing it. |
### Positional arguments

| Argument | Description |
|---|---|
| `database` | Database name. |
| `table` | Table name. |
### Examples

Run only compaction and snapshot expiration:

```shell
uv run snowpack maintain \
  --action rewrite_data_files \
  --action expire_snapshots \
  my_database my_table
```

Dry run with custom file size targets:

```shell
uv run snowpack maintain \
  --dry-run \
  --action rewrite_data_files \
  --target-file-size-mb 256 \
  --min-file-size-mb 192 \
  my_database my_table
```

Verbose output against a remote Spark host:

```shell
uv run snowpack --verbose maintain \
  --spark-host spark.internal.example.com \
  --action rewrite_data_files \
  my_database my_table
```