# BigQuery Adapter
Import datasets from BigQuery and export generated events back to BigQuery.
## Install

```bash
npm install @synode/adapter-bigquery @google-cloud/bigquery
```

Both @synode/core and @google-cloud/bigquery are peer dependencies.
## Exporting Events

Write generated events to a BigQuery table using `BigQueryAdapter`:

```typescript
import { BigQueryAdapter } from '@synode/adapter-bigquery';
import { generate, defineJourney, defineAdventure, defineAction } from '@synode/core';

const adapter = new BigQueryAdapter({
  projectId: 'my-gcp-project',
  datasetId: 'analytics',
  tableId: 'events',
  batchSize: 200,
  flushInterval: 3000,
});

await generate(journey, { users: 1000, adapter });
```

### BigQueryAdapterOptions
| Option | Type | Default | Description |
|---|---|---|---|
| `projectId` | string | required | GCP project ID |
| `datasetId` | string | required | BigQuery dataset ID |
| `tableId` | string | required | BigQuery table ID |
| `batchSize` | number | 100 | Events to buffer before inserting |
| `flushInterval` | number | 5000 | Max ms before flushing a partial batch |
| `autoCreateTable` | boolean | false | Create the table if missing |
| `transform` | (row) => row | none | Transform each row before insert |
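The way `batchSize` and `flushInterval` interact can be sketched as a simple buffer: a full batch is inserted immediately, while a partial batch is flushed after `flushInterval` ms. This is an illustrative model only; `BatchBuffer` and `insertAll` are hypothetical names, not the adapter's internals.

```typescript
type Row = Record<string, unknown>;

// Illustrative buffer: flush when batchSize rows accumulate, or after
// flushInterval ms if a partial batch is sitting in memory.
class BatchBuffer {
  private rows: Row[] = [];
  private timer: ReturnType<typeof setTimeout> | null = null;

  constructor(
    private batchSize: number,
    private flushInterval: number,
    private insertAll: (rows: Row[]) => Promise<void>, // stand-in for the BigQuery insert call
  ) {}

  async add(row: Row): Promise<void> {
    this.rows.push(row);
    if (this.rows.length >= this.batchSize) {
      await this.flush(); // full batch: insert immediately
    } else if (!this.timer) {
      // partial batch: schedule a flush after flushInterval ms
      this.timer = setTimeout(() => void this.flush(), this.flushInterval);
    }
  }

  async flush(): Promise<void> {
    if (this.timer) {
      clearTimeout(this.timer);
      this.timer = null;
    }
    if (this.rows.length === 0) return;
    const batch = this.rows;
    this.rows = [];
    await this.insertAll(batch);
  }
}
```

A larger `batchSize` reduces the number of insert requests at the cost of more events held in memory if generation is interrupted.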
## Row Format
Events are serialized as flat rows:
| Column | Type | Source |
|---|---|---|
| `id` | STRING | `event.id` |
| `user_id` | STRING | `event.userId` |
| `session_id` | STRING | `event.sessionId` |
| `name` | STRING | `event.name` |
| `timestamp` | STRING | `event.timestamp` (ISO 8601) |
| `payload` | STRING | `JSON.stringify(event.payload)` |
Use the `transform` option to customize the schema.
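The default mapping above, and a `transform` that builds on it, can be sketched as follows. The event shape and `toRow` here are hypothetical, written to match the table; the real event type comes from @synode/core and the default serializer is internal to the adapter.

```typescript
// Hypothetical event shape matching the columns in the table above.
interface SynodeEvent {
  id: string;
  userId: string;
  sessionId: string;
  name: string;
  timestamp: string; // ISO 8601
  payload: Record<string, unknown>;
}

// Default flat-row serialization, per the Row Format table.
function toRow(event: SynodeEvent): Record<string, unknown> {
  return {
    id: event.id,
    user_id: event.userId,
    session_id: event.sessionId,
    name: event.name,
    timestamp: event.timestamp,
    payload: JSON.stringify(event.payload),
  };
}

// Example transform: add a DATE-friendly column, e.g. for a
// date-partitioned table, while keeping the default columns.
const transform = (row: Record<string, unknown>) => ({
  ...row,
  event_date: String(row.timestamp).slice(0, 10), // "YYYY-MM-DD"
});
```

A transform like this would be passed as the `transform` option when constructing the adapter.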
## Importing Datasets
Load a BigQuery table as a synode dataset for use during generation:
```typescript
import { importFromBigQuery } from '@synode/adapter-bigquery';
import { generate } from '@synode/core';

const products = await importFromBigQuery({
  projectId: 'my-gcp-project',
  datasetId: 'ecommerce',
  tableId: 'products',
  id: 'products',
  name: 'Product Catalog',
  where: 'active = true',
  limit: 5000,
});

await generate(journey, {
  users: 1000,
  preloadedDatasets: [products],
  adapter,
});
```

### BigQueryImportOptions
| Option | Type | Default | Description |
|---|---|---|---|
| `projectId` | string | required | GCP project ID |
| `datasetId` | string | required | BigQuery dataset ID |
| `tableId` | string | required | Source table |
| `id` | string | required | Synode dataset ID |
| `name` | string | required | Synode dataset name |
| `where` | string | none | SQL WHERE clause |
| `limit` | number | unlimited | Max rows to import |
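Conceptually, `where` and `limit` narrow the rows read from the source table, as if appended to a `SELECT`. The sketch below illustrates that mapping; the actual adapter may read rows differently (for example via the table-data API or a parameterized query), so `buildQuery` is purely hypothetical.

```typescript
// Hypothetical query builder showing how the import options could map
// onto a SQL statement against the fully qualified table.
interface QueryOpts {
  projectId: string;
  datasetId: string;
  tableId: string;
  where?: string; // raw WHERE clause, e.g. 'active = true'
  limit?: number; // max rows to import
}

function buildQuery(opts: QueryOpts): string {
  let sql = `SELECT * FROM \`${opts.projectId}.${opts.datasetId}.${opts.tableId}\``;
  if (opts.where) sql += ` WHERE ${opts.where}`;
  if (opts.limit !== undefined) sql += ` LIMIT ${opts.limit}`;
  return sql;
}
```

Note that `where` is passed through as raw SQL, so it should only ever contain trusted, developer-authored filters.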
