"Upload data to a table" node

The integration between Synerise and Google BigQuery lets you export data collected in Synerise. By means of the Upload data to a table node, you can push data from Synerise as rows into a BigQuery table. You can use this connection in various scenarios, such as exporting transactions, event data, customer information (for example, marketing agreements), and the results of metrics, aggregates, expressions, reports, and more to Google BigQuery.

During the configuration of the node, you can use Jinjava inserts, which let you dynamically refer to profile attributes and event parameters in the workflow.

Important: This node is not optimized for batch operations that send large volumes of data (for example, updating all events for each profile). Use it to export changes within a single event, a profile attribute (for example, a marketing agreement), analysis results, and so on.


Node configuration

  1. Click Google BigQuery > Upload data to a table.
  2. Click Select connection.
  3. From the dropdown list, select the connection.

Create a connection

To allow the data exchange, establish a connection between Synerise and Google BigQuery.

  1. At the bottom of the Select connection dropdown list, click Add connection.
  2. On the pop-up, click Sign in with Google.
  3. Select a Google account which has write access to the dataset that contains your destination table. Learn about required permissions.
  4. Follow the instructions on the interface.
  5. After the successful authentication, click Next.
  6. In the Connection name field, enter the name of the connection.
    It’s used to find the connection on the list.
  7. Click Apply.
    Result: A connection is created and selected.

Define the integration settings

In this step, fill in the form that allows you to send data from Synerise to a table in Google BigQuery.

  1. In the Project ID field, enter the unique identifier of your project in Google BigQuery.
    You can learn how to find the project ID here.

  2. In the Dataset ID field, enter the unique identifier of the dataset in the BigQuery project.

  3. In the Table ID field, enter the unique identifier of the table in the dataset.

  4. In the Rows field, enter the JSON with the rows that you want to upload to the BigQuery table.

    • Optionally, you can add the insertId parameter to the request body. It is a property used by the BigQuery streaming API to ensure the uniqueness of records inserted into a BigQuery table; when you stream data into BigQuery, insertId acts as a unique identifier for each record. To generate the insertId value automatically, we recommend using the following Jinjava insert: {{ currentStep.actionId }}. It produces a unique value for each request sent from Synerise to BigQuery.
      Note: You can read more about insert ID in the Google BigQuery documentation.
    • Each object in the array of the request body inserts data into a row. The json property contains keys (column names) and values (column values). You can use Jinjava inserts as the column values to dynamically send data stored in Synerise. This way, Jinjava fills in the values (for example, ID, email address, expression/aggregate result) for each profile/item.
      Click here to see an example of the request body

            [
              {
                "insertId": "{{ currentStep.actionId }}",
                "json": {
                  "name": "{{ customer.firstname }}",
                  "email": "{{ customer.email }}"
                }
              }
            ]
  5. Confirm the settings by clicking Apply.
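Under the hood, rows entered in this form correspond to the request body of BigQuery's tabledata.insertAll streaming API. The following Python sketch is illustrative only: the build_insert_all_payload helper is our own invention (not part of Synerise or Google's SDK), shown here to make the expected row shape concrete.

```python
# Sketch: validate row objects and wrap them in the insertAll body shape
# ({"rows": [{"insertId": ..., "json": {...}}]}). The helper name and the
# validation rules are our own illustration.

def build_insert_all_payload(rows):
    """Check each row and wrap the list for a streaming insert."""
    for row in rows:
        if not isinstance(row.get("json"), dict):
            raise ValueError("each row needs a 'json' object mapping column names to values")
        insert_id = row.get("insertId")
        if insert_id is not None and not isinstance(insert_id, str):
            raise ValueError("'insertId', when present, must be a string")
    return {"rows": rows}

payload = build_insert_all_payload([
    {
        # In the node, {{ currentStep.actionId }} renders to a unique string.
        "insertId": "unique-action-id",
        "json": {"name": "Jane", "email": "jane@example.com"},
    }
])
```

Rows that omit insertId are still accepted; BigQuery then performs no best-effort de-duplication for them.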

Example of use

You can use the integration to send information about complaints to Google BigQuery. Every time a complaint event is generated, the data from the event is automatically sent to BigQuery. In this example, we use a custom event, complaint.filed. If you want to collect such an event in your workspace, you must implement it first.

Configuration of the workflow
  1. Start the workflow with the Profile Event node. In the configuration of the node:

    1. Enter the following title for the node: complaint.
    2. Select the event that signifies a customer’s complaint.
      Click here to see an example event and its parameters for an example customer

          {
              "time": "2023-02-15T15:24:49Z",
              "action": "complaint.filed",
              "label": "",
              "client": {
                  "id": 5092159999,
                  "email": "e0097757-d1e2-44ac-ba3c-d97979a354c1@anonymous.invalid",
                  "uuid": "e0097757-d1e2-44ac-ba3c-d97979a354c1"
              },
              "params": {
                  "eventCreateTime": "2023-02-15T15:25:08.861Z",
                  "name": "John",
                  "surname": "Doe",
                  "complaintNumber": "123",
                  "complaintText": "Hello, my order didn't arrive."
              }
          }

  2. As the next step of the workflow, select Upload data to a table node. In the configuration of the node:

    1. Fill out the form according to the instructions in the Define the integration settings section.
    2. In the Rows field, enter the JSON that extracts the name, surname, complaint number, and the contents of the complaint from the event selected in the Profile Event node. The example body contains the Automation inserts that retrieve data from the event included in the Profile Event node.
            [
              {
                "insertId": "{{ currentStep.actionId }}",
                "json": {
                  "name": "{{ automationPathSteps['complaint'].event.params.name }}",
                  "surname": "{{ automationPathSteps['complaint'].event.params.surname }}",
                  "complaintNumber": "{{ automationPathSteps['complaint'].event.params.complaintNumber }}",
                  "complaintText": "{{ automationPathSteps['complaint'].event.params.complaintText }}"
                }
              }
            ]
  3. As the finishing node, add the End node.
    Result: The event data is sent to a BigQuery table.

