CLI command reference¶
The following list shows all available commands in the Tinybird command-line interface, their options, and their arguments.
For examples of how to use them, see the Quick start guide, Data projects, and Common use cases.
tb auth¶
Configure your Tinybird authentication.
auth commands
Command | Description |
---|---|
info OPTIONS | Gets information about the authentication currently in use. |
ls OPTIONS | Lists the available regions to authenticate to. |
use OPTIONS REGION_NAME_OR_HOST_OR_ID | Switches to a different region. You can pass the region name, the region host URL, or the region index after listing the available regions with `tb auth ls`. |
The previous commands accept the following options:

- `--token INTEGER`: Use auth Token, defaults to the TB_TOKEN envvar, then to the .tinyb file.
- `--host TEXT`: Set a custom host if it's different than https://api.tinybird.co. Check this page for the available list of regions.
- `--region TEXT`: Set the region. Run `tb auth ls` to show available regions.
- `--connector [bigquery|snowflake]`: Set credentials for one of the supported connectors.
- `--interactive, -i`: Show available regions and select where to authenticate to.
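For example, assuming your admin Token is stored in a TB_TOKEN environment variable (the variable name is only an illustration), a typical authentication flow looks like this:

```bash
# Pick a region interactively and authenticate to it
tb auth -i

# Or authenticate non-interactively with a Token from the environment
tb auth --token "$TB_TOKEN"
```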
tb branch¶
Manage your Workspace branches.
Branch commands
Command | Description | Options
---|---|---
create BRANCH_NAME | Creates a new Branch in the current 'main' Workspace. |
current | Shows the Branch you're currently authenticated to. |
data | Performs a data branch operation to bring data into the current Branch. |
datasource copy DATA_SOURCE_NAME | Copies a Data Source from Main. |
ls | Lists all the Branches available. | `--sort / --no-sort`: Sorts the list of Branches by name. Disabled by default.
regression-tests | Regression test commands. |
regression-tests coverage PIPE_NAME | Runs regression tests using coverage requests for Branch vs. Main Workspace. It creates a regression-tests job. The argument supports regular expressions and defaults to '.*' if no Pipe name is provided. |
regression-tests last PIPE_NAME | Runs regression tests using the last requests for Branch vs. Main Workspace. It creates a regression-tests job. The argument supports regular expressions and defaults to '.*' if no Pipe name is provided. |
regression-tests manual PIPE_NAME | Runs regression tests using manual requests for Branch vs. Main Workspace. It creates a regression-tests job. The argument supports regular expressions and defaults to '.*' if no Pipe name is provided. |
rm [BRANCH_NAME_OR_ID] | Removes a Branch from the Workspace (not Main). It can't be recovered. | `--yes`: Doesn't ask for confirmation.
use [BRANCH_NAME_OR_ID] | Switches to another Branch. |
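As a sketch of a typical Branch workflow (the Branch name is hypothetical), you might run:

```bash
# Create a Branch, switch to it, and list all Branches sorted by name
tb branch create my_feature
tb branch use my_feature
tb branch ls --sort
```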
tb check¶
Checks file syntax.
It accepts a single option, `--debug`, which prints the internal representation.
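For instance, assuming `tb check` is given the path to a local datafile (the path below is hypothetical):

```bash
# Validate the datafile syntax and print its internal representation
tb check datasources/events.datasource --debug
```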
tb connection¶
Connection commands.
Command | Description | Options |
---|---|---|
create COMMAND [ARGS] | Creates a connection. Available subcommands, or types, are `bigquery`, `kafka`, `s3`, `s3_iamrole`, and `snowflake`. | See the next table.
ls [OPTIONS] | Lists connections. | `--connector TYPE`: Filters by connector. Available types are `bigquery`, `kafka`, `s3`, `s3_iamrole`, and `snowflake`.
rm [OPTIONS] CONNECTION_ID_OR_NAME | Removes a connection. | `--force BOOLEAN`: Forces connection removal even if there are Data Sources using it.
tb connection create¶
The following subcommands and settings are available for `tb connection create`:

Command | Description | Options
---|---|---
create bigquery [OPTIONS] | Creates a BigQuery connection. | `--no-validate`: Doesn't validate GCP permissions.
create kafka [OPTIONS] | Creates a Kafka connection. |
create s3 [OPTIONS] | Creates an S3 connection. |
create s3_iamrole [OPTIONS] | Creates an S3 connection (IAM role). |
create snowflake [OPTIONS] | Creates a Snowflake connection. |
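For example, using only the options documented above, you can create a BigQuery connection and then filter the connection list by type:

```bash
# Create a BigQuery connection without validating GCP permissions
tb connection create bigquery --no-validate

# List only Kafka connections
tb connection ls --connector kafka
```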
tb datasource¶
Data Source commands.
Command | Description | Options
---|---|---
analyze OPTIONS URL_OR_FILE | Analyzes a URL or a file before creating a new Data Source. |
append OPTIONS DATASOURCE_NAME URL | Appends data to an existing Data Source from a URL, a local file, or a connector. |
connect OPTIONS CONNECTION DATASOURCE_NAME | Deprecated. |
copy OPTIONS DATASOURCE_NAME | Copies a Data Source from Main. |
delete OPTIONS DATASOURCE_NAME | Deletes rows from a Data Source. |
generate OPTIONS FILENAMES | Generates a Data Source file based on a sample CSV file from local disk or a URL. | `--force`: Overrides existing files.
ls OPTIONS | Lists Data Sources. |
replace OPTIONS DATASOURCE_NAME URL | Replaces the data in a Data Source from a URL, a local file, or a connector. |
rm OPTIONS DATASOURCE_NAME | Deletes a Data Source. | `--yes`: Doesn't ask for confirmation.
share OPTIONS DATASOURCE_NAME WORKSPACE_NAME_OR_ID | Shares a Data Source. |
sync OPTIONS DATASOURCE_NAME | Syncs from the connector defined in the .datasource file. | `--yes`: Doesn't ask for confirmation.
truncate OPTIONS DATASOURCE_NAME | Truncates a Data Source. |
unshare OPTIONS DATASOURCE_NAME WORKSPACE_NAME_OR_ID | Unshares a Data Source. |
scheduling resume DATASOURCE_NAME | Resumes the scheduling of a Data Source. |
scheduling pause DATASOURCE_NAME | Pauses the scheduling of a Data Source. |
scheduling status DATASOURCE_NAME | Gets the scheduling status of a Data Source (paused or running). |
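As a sketch (the file path, URL, and Data Source name are hypothetical), a common flow is to generate a .datasource file from a sample and append more data to an existing Data Source:

```bash
# Generate a .datasource file from a sample CSV
tb datasource generate data/events_sample.csv

# Append data to an existing Data Source from a URL
tb datasource append events https://example.com/events.csv
```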
tb dependencies¶
Prints all Data Source dependencies.

It accepts the following options:

- `--no-deps`: Prints only Data Sources with no Pipes using them.
- `--match TEXT`: Retrieves any resource matching the pattern.
- `--pipe TEXT`: Retrieves any resource used by the Pipe.
- `--datasource TEXT`: Retrieves resources depending on this Data Source.
- `--check-for-partial-replace`: Retrieves the dependent Data Sources that would have their data replaced if a partial replace is executed in the selected Data Source.
- `--recursive`: Calculates recursive dependencies.
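For example, to see which Data Sources would be affected by a partial replace on a given Data Source (the name `events` is hypothetical):

```bash
tb dependencies --datasource events --check-for-partial-replace
```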
tb deploy¶
Deploys to Tinybird, pushing the resources changed since the previous release using Git.
These are the options available for the `deploy` command:

- `--dry-run`: Runs the command with static checks, without creating resources on the Tinybird account or causing any side effect. Doesn't check for runtime errors.
- `-f, --force`: Overrides Pipes when they already exist.
- `--override-datasource`: When pushing a Pipe with a materialized Node, if the target Data Source exists, it tries to override it.
- `--populate`: Populates materialized Nodes when pushing them.
- `--subset FLOAT`: Populates with a subset percent of the data (limited to a maximum of 2M rows). This is useful to quickly test a materialized Node with some data. The subset must be greater than 0 and lower than 0.1. A subset of 0.1 means 10% of the data in the source Data Source is used to populate the Materialized View. Use it together with `--populate`; it takes precedence over `--sql-condition`.
- `--sql-condition TEXT`: Populates with a SQL condition applied to the trigger Data Source of the Materialized View. For instance, `--sql-condition='date == toYYYYMM(now())'` populates taking all the rows from the trigger Data Source whose `date` is in the current month. Use it together with `--populate`. `--sql-condition` is not taken into account if the `--subset` param is present. Including in the `sql_condition` any column present in the Data Source `engine_sorting_key` makes the populate job process less data.
- `--unlink-on-populate-error`: If the populate job fails, the Materialized View is unlinked and new data isn't ingested there. The first time a populate job fails, the Materialized View is always unlinked.
- `--wait`: To be used along with `--populate`. Waits for populate jobs to finish, showing a progress bar. Disabled by default.
- `--yes`: Doesn't ask for confirmation.
- `--workspace_map TEXT..., --workspace TEXT...`: Adds a Workspace path to the list of external Workspaces. Usage: `--workspace name path/to/folder`.
- `--timeout FLOAT`: Timeout to use for the populate job.
- `--user_token TOKEN`: The user Token is required for sharing a Data Source that contains the SHARED_WITH entry.
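A minimal sketch of a typical deploy, using only the flags listed above:

```bash
# Validate the changes without creating anything
tb deploy --dry-run

# Deploy, populating materialized Nodes and waiting for the populate jobs
tb deploy --populate --wait --yes
```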
tb diff¶
Diffs local datafiles against the corresponding remote files in the Workspace.

It works like a regular `diff` command and is useful to know whether the remote resources have changed. Some caveats:

- Resources in the Workspace might mismatch due to slightly different SQL syntax, for instance a parenthesis mismatch, `INTERVAL` expressions, or changes in the schema definitions.
- If you didn't specify an `ENGINE_PARTITION_KEY` and `ENGINE_SORTING_KEY`, resources in the Workspace might have default ones.

The recommendation in these cases is to use `tb pull` to keep your local files in sync.

Remote files are downloaded and stored locally in a `.diff_tmp` directory; if you work with Git, you can add it to `.gitignore`.

The options for this command:

- `--fmt / --no-fmt`: Formats files before doing the diff. The default is True, so both files match the format.
- `--no-color`: Doesn't colorize the diff.
- `--no-verbose`: Lists the resources changed, not the content of the diff.
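For example, to get a quick summary of which resources differ from the Workspace without the full diff output:

```bash
tb diff --no-verbose --no-color
```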
tb fmt¶
Formats a .datasource, .pipe, or .incl file.

These are the options available for the `fmt` command:

- `--line-length INTEGER`: A number indicating the maximum characters per line in the Node SQL; lines are split based on the SQL syntax and the number of characters passed as a parameter.
- `--dry-run`: Doesn't ask to override the local file.
- `--yes`: Doesn't ask for confirmation to overwrite the local file.
- `--diff`: Formats the local file, prints the diff, and exits 1 if different, 0 if equal.
This command removes comments starting with # from the file, so use DESCRIPTION or a comment block instead:
Example comment block

```sql
%
{% comment this is a comment and fmt keeps it %}
SELECT
    {% comment this is another comment and fmt keeps it %}
    count() c
FROM stock_prices_1m
```
You can add `tb fmt` to your Git `pre-commit` hook to keep your files properly formatted. If the SQL formatting results aren't the ones you expect, you can disable it just for the blocks needed. Read how to disable fmt.
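For instance, in a CI job or a pre-commit hook you can use `--diff` to fail when a file isn't formatted (the Pipe path is hypothetical):

```bash
# Exits with 1 if the file would be reformatted, 0 otherwise
tb fmt --diff pipes/top_products.pipe
```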
tb init¶
Initializes folder layout.
It comes with these options:

- `--generate-datasources`: Generates Data Sources based on the CSV, NDJSON, and Parquet files in this folder.
- `--folder DIRECTORY`: Folder where datafiles are placed.
- `-f, --force`: Overrides existing files.
- `-ir, --ignore-remote`: Ignores remote files not present in the local data project on `tb init --git`.
- `--git`: Initializes the Workspace with Git commits.
- `--override-commit TEXT`: Use this option to manually override the reference commit of your Workspace. This is useful if a commit is not recognized in your Git log, such as after a force push (`git push -f`).
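For example, to bootstrap a data project in the current folder and generate Data Sources from the local CSV, NDJSON, or Parquet files:

```bash
tb init --generate-datasources
```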
tb job¶
Jobs commands.
Command | Description | Options |
---|---|---|
cancel JOB_ID | Tries to cancel a job. | None |
details JOB_ID | Gets details for any job created in the last 48h. | None |
ls [OPTIONS] | Lists jobs. | `--status` or `-s`: Shows results with the desired status (waiting, working, done, or error).
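For example, to inspect jobs that are still running and then fetch the details of one of them (the job ID is a placeholder):

```bash
tb job ls --status working
tb job details <JOB_ID>
```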
tb materialize¶
Analyzes the `node_name` SQL query to generate the .datasource and .pipe files needed to push a new Materialized View.

This command guides you through generating the Materialized View with the name TARGET_DATASOURCE; the only requirement is having a valid Pipe datafile locally. Use `tb pull` to download resources from your Workspace when needed.

It accepts these options:

- `--push-deps`: Pushes dependencies, disabled by default.
- `--workspace TEXT...`: Adds a Workspace path to the list of external Workspaces. Usage: `--workspace name path/to/folder`.
- `--no-versions`: When set, resource dependency versions are not used; it pushes the dependencies as-is.
- `--verbose`: Prints more log output.
- `--unlink-on-populate-error`: If the populate job fails, the Materialized View is unlinked and new data isn't ingested in it. The first time a populate job fails, the Materialized View is always unlinked.
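As a sketch, assuming `tb materialize` is given a local Pipe datafile as its argument (the path is hypothetical):

```bash
tb materialize pipes/sales_by_day.pipe
```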
tb pipe¶
Use the following commands to manage Pipes.
Command | Description | Options
---|---|---
append OPTIONS PIPE_NAME_OR_UID SQL | Appends a Node to a Pipe. |
copy pause OPTIONS PIPE_NAME_OR_UID | Pauses a running Copy Pipe. |
copy resume OPTIONS PIPE_NAME_OR_UID | Resumes a paused Copy Pipe. |
copy run OPTIONS PIPE_NAME_OR_UID | Runs an on-demand copy job. |
data OPTIONS PIPE_NAME_OR_UID PARAMETERS | Prints the data returned by a Pipe. You can pass query parameters to the command. |
generate OPTIONS NAME QUERY | Generates a Pipe file based on a SQL query. Example: `tb pipe generate my_pipe 'select * from existing_datasource'`. | `--force`: Overrides existing files.
ls OPTIONS | Lists Pipes. |
populate OPTIONS PIPE_NAME | Populates the result of a Materialized Node into the target Materialized View. |
publish OPTIONS PIPE_NAME_OR_ID NODE_UID | Changes the published Node of a Pipe. |
regression-test OPTIONS FILENAMES | Runs regression tests using the last requests. |
rm OPTIONS PIPE_NAME_OR_ID | Deletes a Pipe. PIPE_NAME_OR_ID can be either a Pipe name or ID in the Workspace, or a local path to a .pipe file. | `--yes`: Doesn't ask for confirmation.
set_endpoint OPTIONS PIPE_NAME_OR_ID NODE_UID | Same as 'publish': changes the published Node of a Pipe. |
sink run OPTIONS PIPE_NAME_OR_UID | Runs an on-demand sink job. |
stats OPTIONS PIPES | Prints Pipe stats for the last 7 days. | `--format [json]`: Forces the output format. To parse the output, use the `tb --no-version-warning pipe stats` option.
token_read OPTIONS PIPE_NAME | Retrieves a Token to read a Pipe. |
unlink OPTIONS PIPE_NAME NODE_UID | Unlinks the output of a Pipe, whatever its type: Materialized View, Copy Pipe, or Sink. |
unpublish OPTIONS PIPE_NAME NODE_UID | Unpublishes the endpoint of a Pipe. |
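For example, reusing the `generate` example from the table above and then checking the result:

```bash
# Generate a Pipe file from a SQL query
tb pipe generate my_pipe 'select * from existing_datasource'

# List the Pipes in the Workspace
tb pipe ls
```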
tb prompt¶
Provides instructions to configure the shell prompt for Tinybird CLI. See Configure your shell prompt.
tb pull¶
Retrieves the latest version of your project files from your Workspace.

It accepts these options:

- `--folder DIRECTORY`: Folder where files are placed.
- `--auto / --no-auto`: Saves datafiles automatically into their default directories (/datasources or /pipes). Default is True.
- `--match TEXT`: Retrieves any resource matching the pattern, for example `--match _test`.
- `-f, --force`: Overrides existing files.
- `--fmt`: Formats files, following the same format as `tb fmt`.
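For example, to download only the resources whose names match a pattern and format them like `tb fmt` would:

```bash
tb pull --match _test --fmt
```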
tb push¶
Pushes files to your Workspace.

You can use this command with these options:

- `--dry-run`: Runs the command with static checks, without creating resources on the Tinybird account or causing any side effect. Doesn't check for runtime errors.
- `--check / --no-check`: Enables or disables output checking. Enabled by default.
- `--push-deps`: Pushes dependencies, disabled by default.
- `--only-changes`: Pushes only the resources that have changed compared to the destination Workspace.
- `--debug`: Prints the internal representation. It can be combined with any command to get more information.
- `-f, --force`: Overrides Pipes when they already exist.
- `--override-datasource`: When pushing a Pipe with a materialized Node, if the target Data Source exists, it tries to override it.
- `--populate`: Populates materialized Nodes when pushing them.
- `--subset FLOAT`: Populates with a subset percent of the data (limited to a maximum of 2M rows). This is useful to quickly test a materialized Node with some data. The subset must be greater than 0 and lower than 0.1. A subset of 0.1 means 10 percent of the data in the source Data Source is used to populate the Materialized View. Use it together with `--populate`; it takes precedence over `--sql-condition`.
- `--sql-condition TEXT`: Populates with a SQL condition applied to the trigger Data Source of the Materialized View. For instance, `--sql-condition='date == toYYYYMM(now())'` populates taking all the rows from the trigger Data Source whose `date` is in the current month. Use it together with `--populate`. `--sql-condition` is not taken into account if the `--subset` param is present. Including in the `sql_condition` any column present in the Data Source `engine_sorting_key` makes the populate job process less data.
- `--unlink-on-populate-error`: If the populate job fails, the Materialized View is unlinked and new data isn't ingested in it. The first time a populate job fails, the Materialized View is always unlinked.
- `--fixtures`: Appends fixtures to Data Sources.
- `--wait`: To be used along with `--populate`. Waits for populate jobs to finish, showing a progress bar. Disabled by default.
- `--yes`: Doesn't ask for confirmation.
- `--only-response-times`: Checks only response times when force-pushing a Pipe.
- `--workspace TEXT..., --workspace_map TEXT...`: Adds a Workspace path to the list of external Workspaces. Usage: `--workspace name path/to/folder`.
- `--no-versions`: When set, resource dependency versions are not used; it pushes the dependencies as-is.
- `--timeout FLOAT`: Timeout to use for the populate job.
- `-l, --limit INTEGER RANGE`: Number of requests to validate [0<=x<=100].
- `--sample-by-params INTEGER RANGE`: When set, aggregates the `pipe_stats_rt` requests by `extractURLParameterNames(assumeNotNull(url))` and takes a sample of N requests for each combination [1<=x<=100].
- `-ff, --failfast`: When set, the checker exits as soon as one test fails.
- `--ignore-order`: When set, the checker ignores the order of list properties.
- `--validate-processed-bytes`: When set, the checker validates that the new version doesn't process more than 25% more data than the current version.
- `--user_token TEXT`: The user Token is required for sharing a Data Source that contains the SHARED_WITH entry.
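A minimal sketch of two common pushes (the Pipe path is hypothetical):

```bash
# Push a single Pipe, overriding the existing remote version
tb push pipes/top_products.pipe --force

# Push only what changed, including dependencies
tb push --only-changes --push-deps
```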
tb sql¶
Runs SQL queries over Data Sources and Pipes.
It accepts these options:

- `--rows_limit INTEGER`: Max number of rows retrieved.
- `--pipeline TEXT`: The name of the Pipe to run the SQL query against.
- `--pipe TEXT`: The path to the .pipe file to run the SQL query of a specific Node.
- `--node TEXT`: The Node name.
- `--format [json|csv|human]`: Output format.
- `--stats / --no-stats`: Shows or hides query stats.
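For example, assuming a Data Source named `events` exists in the Workspace (a hypothetical name):

```bash
tb sql "SELECT count() FROM events" --format json --no-stats
```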
tb test¶
Test commands.
Command | Description | Options |
---|---|---|
init | Initializes a file list with a simple test suite. | --force : Overrides existing files. |
parse [OPTIONS] [FILES] | Reads the contents of a test file list. | |
run [OPTIONS] [FILES] | Runs the whole test suite, a file, or a single test. |
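For example, to scaffold a simple test suite and run it:

```bash
tb test init
tb test run
```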
tb token¶
Manage your Workspace Tokens.
Command | Description | Options |
---|---|---|
copy OPTIONS TOKEN_ID | Copies a Token. |
ls OPTIONS | Lists Tokens. | `--match TEXT`: Retrieves any Token matching the pattern, for example `--match _test`.
refresh OPTIONS TOKEN_ID | Refreshes a Token. | `--yes`: Doesn't ask for confirmation.
rm OPTIONS TOKEN_ID | Removes a Token. | `--yes`: Doesn't ask for confirmation.
scopes OPTIONS TOKEN_ID | Lists the Token scopes. |
create static OPTIONS TOKEN_NAME | Creates a static Token that lasts forever. |
create jwt OPTIONS TOKEN_NAME | Creates a JWT Token with a fixed expiration time. |
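For example, to create a static Token and then list Tokens matching a pattern (the Token name is hypothetical):

```bash
tb token create static dashboard_read
tb token ls --match dashboard
```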
tb workspace¶
Manage your Workspaces.
Command | Description | Options |
---|---|---|
clear OPTIONS | Drops all the resources inside a project. This command is dangerous because it removes everything; use it with care. |
create OPTIONS WORKSPACE_NAME | Creates a new Workspace for your Tinybird user. |
current OPTIONS | Shows the Workspace you're currently authenticated to. |
delete OPTIONS WORKSPACE_NAME_OR_ID | Deletes a Workspace where you are an admin. |
ls OPTIONS | Lists all the Workspaces you have access to in the account you're currently authenticated to. |
members add OPTIONS MEMBERS_EMAILS | Adds members to the current Workspace. | `--user_token TEXT`: When passed, the CLI doesn't prompt for it.
members ls OPTIONS | Lists members in the current Workspace. |
members rm OPTIONS | Removes members from the current Workspace. | `--user_token TEXT`: When passed, the CLI doesn't prompt for it.
members set-role OPTIONS [guest|viewer|admin] MEMBERS_EMAILS | Sets the role for existing Workspace members. | `--user_token TEXT`: When passed, the CLI doesn't prompt for it.
use OPTIONS WORKSPACE_NAME_OR_ID | Switches to another Workspace. Use `tb workspace ls` to list the Workspaces you have access to. |
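For example, to see which Workspaces you can access and switch to one of them (the Workspace name is hypothetical):

```bash
tb workspace ls
tb workspace use my_workspace
```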
tb tag¶
Manage your Workspace tags.
Command | Description | Options |
---|---|---|
create TAG_NAME | Creates a tag in the current Workspace. |
ls | Lists all the tags of the current Workspace. |
ls TAG_NAME | Lists all the resources tagged with the given tag. |
rm TAG_NAME | Removes a tag from the current Workspace. Resources tagged with it are no longer tagged by it. | `--yes`: Doesn't ask for confirmation.
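For example, to create a tag and list the resources associated with it (the tag name is hypothetical):

```bash
tb tag create release_2024
tb tag ls release_2024
```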