BigQuery
Export Solvimon data to BigQuery
Use Solvimon's data seamlessly in your own BigQuery instance. This page describes the steps necessary to connect your BigQuery instance to Solvimon for data exporting.
Getting started
Requirements
The following access is required for us to set up a connection from Solvimon to your BigQuery instance.
- A Google Cloud project with BigQuery enabled.
- An existing BigQuery dataset to sync data to or the ability to create one.
- The ability to create a Google Cloud Service Account with BigQuery User and BigQuery Data Editor rights.
- The ability to create a Google Cloud Storage bucket on which you grant Storage Admin to the service account you created.
Step 1: Create Google Cloud Service Account
This step is optional if you already have a service account with the BigQuery User and BigQuery Data Editor roles, but we highly recommend a separate service account for ease of permissioning and auditing. A service account with these roles can run BigQuery jobs, write to BigQuery datasets, and read table metadata.
We recommend creating a Service Account by following GCP's guide for Creating a Service Account. Once you've created the Service Account, make sure to save its ID, as you will need to reference it when granting roles. Service Account IDs typically take the form {ACCOUNT_NAME}@{PROJECT_NAME}.iam.gserviceaccount.com. Subsequently, you will need to assign the BigQuery User and BigQuery Data Editor roles to the service account. This guide describes how to assign a role to the service account.
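If you prefer the gcloud CLI over the Cloud Console, the step above can be sketched as follows. The account name `solvimon-export` and project `my-project` are illustrative placeholders; substitute your own values.

```shell
# Create a dedicated service account (names are illustrative placeholders).
gcloud iam service-accounts create solvimon-export \
    --project=my-project \
    --display-name="Solvimon export"

# Grant the BigQuery User and BigQuery Data Editor roles on the project.
gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:solvimon-export@my-project.iam.gserviceaccount.com" \
    --role="roles/bigquery.user"

gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:solvimon-export@my-project.iam.gserviceaccount.com" \
    --role="roles/bigquery.dataEditor"
```

These commands require the gcloud CLI to be authenticated against your project with sufficient IAM permissions.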
Step 2: Generate a service account key
To use the created service account, we will need to generate a Service Account Key. You can follow this guide to generate one; ensure you select JSON as the Key type, and save the contents for the next step.
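Alternatively, the key can be generated with the gcloud CLI. The file name and account address below are illustrative placeholders.

```shell
# Generate a JSON key for the service account and save it locally.
# Keep this file secret; it grants access as the service account.
gcloud iam service-accounts keys create solvimon-key.json \
    --iam-account=solvimon-export@my-project.iam.gserviceaccount.com
```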
Step 3: Create Google Cloud Storage staging bucket
To insert Solvimon's data into your BigQuery instance, we require a staging bucket in your Google Cloud environment. This bucket is used to stage data, which is then written to your BigQuery dataset.
The following steps are necessary to create the storage bucket:
- Create a Cloud Storage bucket with the Protection Tools set to none or Object versioning. Make sure the bucket does not have a retention policy.
- Create an HMAC key and access ID.
- Grant the Storage Object Admin role to the Google Cloud Service Account.
**Storage bucket encryption:** Your bucket must be encrypted using a Google-managed encryption key (this is the default setting when creating a new bucket). We currently do not support buckets using customer-managed encryption keys (CMEK). You can view this setting under the "Configuration" tab of your GCS bucket, in the Encryption type row.
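The three bucket-related steps above can be sketched with the gcloud CLI as follows. The bucket name `solvimon-staging`, project `my-project`, location, and service account address are illustrative placeholders.

```shell
# 1. Create the staging bucket (default Google-managed encryption, no retention policy).
gcloud storage buckets create gs://solvimon-staging \
    --project=my-project \
    --location=europe-west1

# 2. Create an HMAC key for the service account. Note the access ID and
#    secret from the output; you will need to share them in Step 5.
gcloud storage hmac create \
    solvimon-export@my-project.iam.gserviceaccount.com

# 3. Grant the Storage Object Admin role on the bucket to the service account.
gcloud storage buckets add-iam-policy-binding gs://solvimon-staging \
    --member="serviceAccount:solvimon-export@my-project.iam.gserviceaccount.com" \
    --role="roles/storage.objectAdmin"
```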
Step 4: Create BigQuery dataset
This step is optional if you already have a BigQuery dataset to which we should export data. However, we recommend creating a separate dataset to ensure data is saved in the right place.
To create a BigQuery dataset you can follow this guide. Make sure you save the dataset name.
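Using the bq command-line tool, the dataset can be created as sketched below. The project ID, dataset ID, and location are illustrative placeholders; the location must match the Dataset Location you share in Step 5.

```shell
# Create the target dataset (IDs and location are illustrative placeholders).
bq mk --dataset \
    --location=europe-west1 \
    my-project:solvimon_export
```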
Step 5: Share the relevant information with Solvimon
Please share the following information with us through a secure channel:
| Field | Description |
|---|---|
| Project ID | The project ID to which the data should be exported. |
| Dataset ID | The dataset ID to which the data will be exported. |
| Dataset Location | The location of the dataset, e.g. US or europe-west1. |
| HMAC Key Access ID | Generated in the second step of Step 3; necessary to connect to your staging bucket. |
| HMAC Key Secret | Generated in the second step of Step 3; necessary to connect to your staging bucket. |
| GCS Bucket Name | The name of the bucket you created in the first step of Step 3. |
| Service Account Key JSON | The service account key JSON created in Step 2. |