
Implementing test strategies

Intermediate

In the versioning your pipes guide, we learned how to use prefixes and versions as part of the usual development workflow for your API endpoints.

In this guide we'll talk about how to run automatic tests on your API endpoints as part of your continuous integration workflow.

Guide preparation

Let's start by pushing the ecommerce_data_project (we'll use the {% code-line %}--prefix pro{% code-line-end %} option to indicate it's used in production):
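(A sketch of the command, assuming you run it from the root of the project; the {% code-line %}--push-deps{% code-line-end %} flag is an assumption about how the project was originally pushed.)

```bash
# Push all the data sources and pipes of the project with the "pro" prefix;
# --push-deps also pushes any dependencies of the pipes
tb push --push-deps --prefix pro
```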

Regression tests

When one of your API endpoints is integrated into a production environment (a web or mobile application, a dashboard, etc.), you want to make sure that any change to the pipe doesn't change the output of the endpoint.

In other words, you want the same version of an API endpoint to return the same data for the same requests.

The CLI provides you with automatic regression tests any time you try to push the same version of a pipe. Let's see it with an example:

Imagine we have this version of our {% code-line %}top_products{% code-line-end %} pipe:
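(The actual SQL in the project may differ; the node below is a simplified, hypothetical version for illustration, with made-up table and column names.)

```sql
NODE endpoint
SQL >
    SELECT
        product_id,
        sum(sales) AS total_sales
    FROM events
    WHERE date >= today() - 7
    GROUP BY product_id
    ORDER BY total_sales DESC
    LIMIT 10
```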

And we want to parameterize the date filter to this:
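(The same hypothetical node, now using the pipe template syntax; the {% code-line %}%{% code-line-end %} on the first line of the SQL enables templating.)

```sql
NODE endpoint
SQL >
    %
    SELECT
        product_id,
        sum(sales) AS total_sales
    FROM events
    WHERE date >= today() - {{Int32(day, 7)}}
    GROUP BY product_id
    ORDER BY total_sales DESC
    LIMIT 10
```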

We've included a new parameter, {% code-line %}day{% code-line-end %}, with a default value of {% code-line %}7{% code-line-end %}. That means that, by default, the behaviour of the endpoint stays the same.

To illustrate the example, let's make a couple of requests to the API endpoint:
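(With {% code-line %}curl{% code-line-end %}, for example; the host, the prefixed pipe name and the {% code-line %}$TOKEN{% code-line-end %} placeholder are assumptions, so adjust them to your account.)

```bash
# Default request: day falls back to its default value of 7
curl "https://api.tinybird.co/v0/pipes/pro__top_products.json?token=$TOKEN"

# Explicit parameter: top products over the last 30 days
curl "https://api.tinybird.co/v0/pipes/pro__top_products.json?token=$TOKEN&day=30"
```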

Now let's try to override the endpoint:
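(The file path below is an assumption based on the usual project layout.)

```bash
# Pushing a pipe that already exists requires --force and
# triggers the automatic regression tests
tb push pipes/top_products.pipe --prefix pro --force
```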

The CLI takes the top 10 most requested API calls to the endpoint and compares the responses (both content and response times) of the current version of the endpoint and the version we are pushing.

In this case, all requests return the same data, so the endpoint is overridden.

As a test, let's change the default date range to the last day:
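(Only the default value of the {% code-line %}day{% code-line-end %} parameter changes, from 7 to 1; the rest of the node stays as in the hypothetical version above.)

```sql
    WHERE date >= today() - {{Int32(day, 1)}}
```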

And try to override it:
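(Same command as before.)

```bash
# The regression tests detect that the default response changed,
# so the push is rejected and the endpoint is left untouched
tb push pipes/top_products.pipe --prefix pro --force
```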

Since the default period changed, the response to the default request changed too, so the endpoint is not overridden. The CLI saved us from a possible regression.

Let's say we are sure the new response is correct, so we don't consider it a regression and want to force the override. We can do it like this:
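(If the flag name differs in your CLI version, check {% code-line %}tb push --help{% code-line-end %}.)

```bash
# --no-check skips the automatic regression tests when overriding
# the endpoint (flag name may vary by CLI version)
tb push pipes/top_products.pipe --prefix pro --force --no-check
```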

In this case, the regression tests won't be executed. Of course, you do this at your own risk, especially when the endpoints are in production!

Continuous integration

Regression tests are great for double-checking that your endpoints are still correct when overriding them in your Tinybird account.

When you are developing, you most likely want to validate your endpoints as well. In the same way that you write integration and acceptance tests for your source code, you can write tests for your endpoints to be run on each commit.

We will use GitHub Actions and the Tinybird CLI to illustrate how you can test your endpoints on any new commit to a pull request.

Configure the GitHub Action

Take a look at the GitHub Action from the ecommerce_data_project. On each push to a branch, it will do the following (see the workflow sketch after this list):

  1. Install the Tinybird CLI
  2. Print the CLI version
  3. Drop all the resources inside the project with the prefix (given by the name of the branch)
  4. Push the project again
  5. Run the test script, which compares the current result with the expected result.
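A minimal workflow along these lines could look like the sketch below; the branch-prefix handling, the {% code-line %}tb drop-prefix{% code-line-end %} command and the test script path are assumptions, so adapt them to your project.

```yaml
name: Tinybird CI

on: [push]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2

      - uses: actions/setup-python@v2
        with:
          python-version: '3.8'

      # 1. Install the Tinybird CLI
      - run: pip install tinybird-cli

      # 2. Print the CLI version
      - run: tb --version

      # Authenticate with the master token stored as a repository secret
      - run: tb auth --token ${{ secrets.tb_token }}

      # 3. Drop the resources pushed with the branch prefix
      #    (the drop-prefix command is an assumption; check `tb --help`)
      - run: tb drop-prefix ${GITHUB_REF##*/}

      # 4. Push the project again with the branch prefix
      - run: tb push --push-deps --prefix ${GITHUB_REF##*/}

      # 5. Run the test script (hypothetical path inside the repository)
      - run: ./scripts/exec_test.sh ${GITHUB_REF##*/}
```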

This action uses a {% code-line %}secrets.tb_token{% code-line-end %} secret that you have to configure with your Tinybird master token.

Configure the continuous integration tests

The GitHub Action will run a set of tests, each configured with two files: a {% code-line %}.test{% code-line-end %} file that calls the API endpoint, and a {% code-line %}.test.result{% code-line-end %} file with the expected output.

Let's see an example for the {% code-line %}top_products{% code-line-end %} API endpoint with the {% code-line %}day{% code-line-end %} parameter.

The {% code-line %}top_products.test{% code-line-end %} file is as follows:
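(A sketch of what such a file could contain, assuming it is a small shell script that receives the branch prefix as its first argument and reads the token from the environment; the exact format of the original test files may differ.)

```bash
# Call the endpoint for this branch's prefix and return the data as CSV
curl --silent "https://api.tinybird.co/v0/pipes/${1}__top_products.csv?token=${TB_TOKEN}&day=7"
```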

It calls the {% code-line %}top_products{% code-line-end %} pipe, prefixed with the name of the branch, and returns the data in {% code-line %}CSV{% code-line-end %} format.

The {% code-line %}top_products.test.result{% code-line-end %} file contains the expected result of the previous API call:
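(The column names and values below are purely hypothetical; your expected result is whatever your endpoint actually returns.)

```csv
"product_id","total_sales"
"sku_0001",153
"sku_0042",97
"sku_0007",85
```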

With this approach, the tests for your data project are integrated into your development process. Anytime you create a new branch, all you have to do, besides making the proper changes to your {% code-line %}.datasource{% code-line-end %} and {% code-line %}.pipe{% code-line-end %} files, is update your test files accordingly.
