Overview
After a task finishes extracting data, Figranium can automatically forward the result to a configured output destination. This creates a fully automated pipeline:
- Continuously populating a database with scraped data on a schedule.
- Building live dashboards that refresh from automated tasks.
- Feeding extracted data directly into other tools without custom ETL scripts.
Supported Providers
| Provider | Description |
|---|---|
| Baserow | Open-source no-code database. Appends rows to a table via the Baserow REST API. |
Baserow Integration
Baserow is a self-hosted, open-source alternative to Airtable. Figranium integrates with it to push extracted JSON data directly into a Baserow table.
Prerequisites
- A running Baserow instance (self-hosted or Baserow.io cloud).
- A Baserow API token with write access to the target table.
- The Table ID of the destination table in Baserow.
Step 1: Configure credentials
You can add credentials in two places:
From Settings:
- Go to Settings > API Keys.
- Under Database / Output, click Add API Key and select Baserow.
- Enter a Name, Base URL (e.g., https://api.baserow.io), and your API Token.
- Save the credential.
From a task:
- Open a task and go to Settings > Output.
- Click + New next to the Credential dropdown.
- Fill in the same fields and save.
Figranium validates the base URL to prevent requests to private or internal network addresses. If
ALLOW_PRIVATE_NETWORKS is set to false, URLs pointing to private IP ranges (such as 10.x.x.x, 172.16.x.x, or 192.168.x.x) are rejected. Redirect chains are also validated — if a Baserow instance returns a redirect, each hop is checked against the same blocklist. Additionally, sensitive headers like Authorization are automatically stripped if a redirect crosses to a different origin. See Security best practices for details.
Step 2: Configure the Task Output
- Open the task in the Task Editor.
- Go to Settings > Output.
- Select:
  - Provider: Baserow
  - Credential: The credential you just created.
  - Table ID: The numeric ID of the Baserow table to push data into.
  - On Error: ignore (default) or fail (marks the execution as failed if the push fails).
- Save the task.
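The base-URL validation described at the end of Step 1 can be sketched as follows. This is a simplified illustration, not Figranium's actual implementation: it only handles literal IP hosts, and omits the DNS resolution and redirect-chain checks a production guard needs.

```python
import ipaddress
from urllib.parse import urlparse

def is_blocked_url(url: str) -> bool:
    """Reject URLs whose host is a literal private, loopback, or
    link-local IP address (e.g. 10.x.x.x, 172.16.x.x, 192.168.x.x).

    Simplified sketch: a real check would also resolve DNS names
    and re-validate every hop of a redirect chain.
    """
    host = urlparse(url).hostname or ""
    try:
        ip = ipaddress.ip_address(host)
    except ValueError:
        return False  # not a literal IP; DNS resolution omitted here
    return ip.is_private or ip.is_loopback or ip.is_link_local
```

For example, `is_blocked_url("https://192.168.1.10/api")` returns True, while a public hostname such as `api.baserow.io` passes this literal-IP check.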
Step 3: Match Your Extraction Data to Table Columns
Figranium pushes data using Baserow’s user_field_names=true API parameter, which means you reference columns by their display name (not internal ID).
Your extraction script must return either:
- A single object (inserts one row), or
- An array of objects (inserts one row per element).
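For illustration, here are two hypothetical extraction scripts (the function names and field values are made up). The keys must match the Baserow columns' display names, since Figranium sends data with user_field_names=true:

```python
def extract_single() -> dict:
    """Return one object -> Figranium inserts a single row."""
    return {"Name": "Example Site", "URL": "https://example.com"}

def extract_many() -> list:
    """Return an array of objects -> one row inserted per element."""
    pages = [("Docs", "https://example.com/docs"),
             ("Blog", "https://example.com/blog")]
    return [{"Name": name, "URL": url} for name, url in pages]
```

Here the destination table is assumed to have columns displayed as "Name" and "URL".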
How Data is Sent
| Extracted data shape | Baserow API call |
|---|---|
| JSON object {} | Single row via POST /api/database/rows/table/{tableId}/ |
| JSON array [{}, {}] | Batch insert via POST /api/database/rows/table/{tableId}/batch/ |
Requests are authenticated with the header Authorization: Token <your-token>.
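To make the endpoint and auth header concrete, the following sketch builds (but does not send) a batch-insert request. The path and Token header follow the Baserow REST API; base_url, token, and table_id are placeholder inputs, and this is not Figranium's internal code:

```python
import json
import urllib.request

def build_batch_insert(base_url: str, token: str,
                       table_id: int, rows: list) -> urllib.request.Request:
    """Build a Baserow batch-insert request for a list of row objects."""
    url = (f"{base_url}/api/database/rows/table/{table_id}/"
           f"batch/?user_field_names=true")
    payload = json.dumps({"items": rows}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=payload,
        method="POST",
        headers={
            "Authorization": f"Token {token}",
            "Content-Type": "application/json",
        },
    )
```

A single-object push is the same pattern against the non-batch endpoint, with the object itself as the request body instead of an "items" wrapper.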
Error Handling
By default, push failures are ignored — the execution is marked as successful even if the Baserow push fails. This prevents network blips from causing false task failures. To make the task fail when the push fails, set On Error to fail in the output settings.
Check the execution logs for the output push status.
API Configuration
You can also configure output providers via the REST API: save a credential first, then reference it from the output field in your task definition.
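For illustration only (the field names below are assumptions for this sketch, not Figranium's documented API schema), the output field in a task definition might look like:

```json
{
  "output": {
    "provider": "baserow",
    "credentialId": "<credential-id>",
    "tableId": 42,
    "onError": "ignore"
  }
}
```

The values mirror the settings from Step 2: the provider, the saved credential, the numeric table ID, and the on-error behavior (ignore or fail).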