New Docs structure: split Forward and Classic version¶
In March, we launched a new user experience called Forward. Since then, we've noticed some confusion when users try to distinguish documentation for this new version from documentation for our previous experience, which we call Classic.
To address this, we've completely restructured our documentation so that Forward and Classic are now self-contained versions, each with everything you need to understand Tinybird. The navigation is clearer, content is better organized, and you'll always know which version you're viewing.
To switch between versions, we've added a new selector in the left sidebar that makes it easy to jump between Forward and Classic documentation.

Create Kafka data sources with Forward CLI¶
We've added two new commands to the Tinybird Forward CLI to help you create a Kafka data source.
tb connection create kafka
This command will guide you through the process of creating a Kafka connection. You'll be asked to provide a name for the connection, and then the Kafka credentials to access your topics.
tb datasource create --kafka
This command will guide you through the process of creating a Kafka data source. You'll be asked to provide a name for the data source; if your connection is already created, you can then select your preferred topic. If not, you'll be asked to create a new connection first.
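Putting the two commands together, a first-time setup might look like the sketch below. Only the two commands themselves come from this release; the comments describe the interactive flow as explained above, and the exact prompts may differ in your CLI version.

```shell
# Create a named Kafka connection; the CLI prompts you for a
# connection name and the Kafka credentials for your topics.
tb connection create kafka

# Create a Kafka data source; if the connection above already
# exists, the CLI lets you pick a topic, otherwise it asks you
# to create a connection first.
tb datasource create --kafka
```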
Environment variables¶
When working locally, you can store secrets in .env.local or .env files. They will be loaded automatically when you run tb dev or tb build.

KAFKA_USERNAME=12345
KAFKA_PASSWORD=67890
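As a sketch of how those variables can be used, a connection file could reference them instead of hard-coding credentials. The setting names and the tb_secret template function below are assumptions drawn from Tinybird's connection and secrets documentation, not from this release note, so check the docs for the exact syntax in your version:

```
# kafka_sample.connection -- illustrative only; setting names
# and tb_secret usage follow the Kafka connection docs
TYPE kafka
KAFKA_BOOTSTRAP_SERVERS my-broker:9092
KAFKA_KEY {{ tb_secret("KAFKA_USERNAME") }}
KAFKA_SECRET {{ tb_secret("KAFKA_PASSWORD") }}
```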
Improvements and bug fixes¶
- [All]: Enabled ingestion burst limit handling for vCPU usage in Developer plans
- [All]: Fixed the counting logic for active copy pipe jobs when applying the limit
- [Forward]: Support stdin in the tb sql command, e.g. echo "select 1" | tb sql
- [Forward]: Improved validation for the IMPORT_FROM_TIMESTAMP parameter in S3 and GCS data sources to ensure the correct YYYY-MM-DDTHH:MM:SSZ format
- [Forward]: Fixed dependency tracking for pipes with UNION ALL statements. No data was affected, as the issue only prevented deployments