Redshift
Export Solvimon data to Redshift
Solvimon can export its data directly to your Redshift cluster so you can use it seamlessly in your own environment. This page describes the steps necessary to connect your Redshift cluster to Solvimon for data exporting.
Getting started
Requirements
The following access is required for us to set up a connection from Solvimon to your Redshift cluster.
- An AWS account with access to create an S3 staging bucket.
- An existing Redshift cluster that is publicly accessible. Note that a Redshift cluster is publicly accessible when:
- You turned on the Publicly accessible cluster setting.
- You provisioned the cluster in a public subnet.
- Allow connections from Solvimon to your Redshift cluster. If your Redshift security posture uses IP whitelisting, you will need to whitelist our IPs.
- Please contact us if your Redshift cluster requires an SSH tunnel to connect. We'll work together to facilitate this.
Step 1: Set up user, database, and schema
This step is optional but highly recommended, as it allows for better permission control and auditing. Alternatively, you can use an existing user in your database with the required write access. This step assumes your data is located inside one database.
- Connect to your selected Redshift database using a SQL client. If you have multiple databases, you will have to perform these steps separately for each database.
- Create a Solvimon export user, replacing `{PASSWORD}` with your selected password. Make sure that you securely save this password. If you want to use an existing user, replace `solvimon_user` with your selected user in the subsequent steps.

```sql
CREATE USER solvimon_user PASSWORD '{PASSWORD}';
```

- Create a new database in which the Solvimon exports will be stored. This is optional; you can use an existing database as well. Make sure you replace `solvimon_exports` with your selected database in the subsequent steps.

```sql
CREATE DATABASE solvimon_exports;
```

- Create two new schemas in your database. This is optional; you can use existing schemas as well. Make sure you replace `solvimon_exports` and `solvimon_internal` with your selected schemas in the subsequent steps. Note that the internal schema is required for use as a staging schema.

```sql
CREATE SCHEMA solvimon_exports AUTHORIZATION solvimon_user;
CREATE SCHEMA solvimon_internal AUTHORIZATION solvimon_user;
```

- Create a new group (`solvimon_export_group`) with read and write access on your selected schemas.

```sql
CREATE GROUP solvimon_export_group;
GRANT CREATE ON DATABASE solvimon_exports TO GROUP solvimon_export_group;
GRANT USAGE, CREATE ON SCHEMA solvimon_exports TO GROUP solvimon_export_group;
GRANT USAGE, CREATE ON SCHEMA solvimon_internal TO GROUP solvimon_export_group;
GRANT SELECT ON TABLE SVV_TABLE_INFO TO GROUP solvimon_export_group;
ALTER DEFAULT PRIVILEGES IN SCHEMA solvimon_exports GRANT SELECT ON TABLES TO GROUP solvimon_export_group;
ALTER DEFAULT PRIVILEGES IN SCHEMA solvimon_internal GRANT SELECT ON TABLES TO GROUP solvimon_export_group;
```

- Execute the following query to add the created user to the created group.

```sql
ALTER GROUP solvimon_export_group ADD USER solvimon_user;
```

Note that the above assumes you have one schema under one database. If you have multiple schemas, you will have to execute the above for the other schemas as well.
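If you maintain several schemas, the per-schema grant statements can be generated programmatically instead of edited by hand. A minimal sketch in Python, assuming the example schema and group names used above (adjust the names to your own setup):

```python
# Generate the per-schema grant statements from Step 1 for a list of schemas.
# The schema and group names below are the examples from this page, not
# values Solvimon requires; replace them with your own.
GROUP = "solvimon_export_group"

def grants_for_schema(schema: str, group: str = GROUP) -> list[str]:
    """Return the GRANT / ALTER DEFAULT PRIVILEGES statements for one schema."""
    return [
        f"GRANT USAGE, CREATE ON SCHEMA {schema} TO GROUP {group};",
        f"ALTER DEFAULT PRIVILEGES IN SCHEMA {schema} "
        f"GRANT SELECT ON TABLES TO GROUP {group};",
    ]

if __name__ == "__main__":
    for schema in ["solvimon_exports", "solvimon_internal"]:
        print("\n".join(grants_for_schema(schema)))
```

Run the generated statements through your SQL client once per schema, as described above.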
Step 2: Create an S3 staging bucket
Currently we require a staging bucket in S3 to be available, so that we can write intermediate data to it before saving the data to your Redshift cluster. In this step we will show you how to create the bucket. Alternatively, you can also use an existing bucket.
- To create an S3 bucket, we refer to the AWS documentation, see here. Save the bucket name; in this example we will call it `solvimon_export_bucket`.
- (Optional) Place the S3 bucket and your Redshift cluster in the same region to save networking costs.
- Create a Solvimon IAM user that can be used to access the bucket. Please refer to this guide for the IAM user creation.
Create a permission policy
- Log in to Amazon AWS and open the IAM console.
- Go to `Policies` and then click `Create policy`.
- Select the `JSON` tab and paste the following JSON into the policy editor. Note that this policy grants not only the object-level actions the user needs, but also the bucket-level access required to manage the S3 bucket properly.
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ListObjectsInBucket",
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": ["arn:aws:s3:::solvimon_export_bucket"]
    },
    {
      "Sid": "AllObjectActions",
      "Effect": "Allow",
      "Action": "s3:*Object",
      "Resource": ["arn:aws:s3:::solvimon_export_bucket/*"]
    }
  ]
}
```

- Now click `Create policy`.
- In the IAM dashboard, go to `Users` and either create a new user or use an existing one.
- Go to `Add permissions` and select `Attach policies directly`, then find the policy you have just created.
- Once the user with its permissions is created, select the `Security credentials` tab and click `Create access key` to generate the keys. Here you obtain the `Access Key ID` and the `Secret Access Key` that you will need to submit to us.
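The policy above is parameterized only by the bucket name. If you script your IAM setup, a small helper can produce the policy document; this is a hypothetical sketch using the example bucket name from Step 2, not part of any Solvimon tooling:

```python
import json

def staging_bucket_policy(bucket: str) -> dict:
    """Build the S3 staging-bucket policy document shown above for a bucket name."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                # Bucket-level permission: list the bucket's contents.
                "Sid": "ListObjectsInBucket",
                "Effect": "Allow",
                "Action": ["s3:ListBucket"],
                "Resource": [f"arn:aws:s3:::{bucket}"],
            },
            {
                # Object-level permissions: get, put, and delete objects.
                "Sid": "AllObjectActions",
                "Effect": "Allow",
                "Action": "s3:*Object",
                "Resource": [f"arn:aws:s3:::{bucket}/*"],
            },
        ],
    }

if __name__ == "__main__":
    # Print the JSON to paste into the IAM policy editor.
    print(json.dumps(staging_bucket_policy("solvimon_export_bucket"), indent=2))
```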
Step 3: Allow connections from Solvimon
This step is only necessary when your Redshift security posture requires IP whitelisting. The following IP addresses will need to be whitelisted:
- 35.205.227.168 (test/sandbox)
- 104.155.0.1 (live)
To whitelist our static IPs, you will need to execute the following steps:
- Navigate to the Redshift console and select `Clusters`. Take note of your cluster name.
- Select the cluster that will receive the Solvimon exports. In the `General information` pane, record the `Endpoint` details. You may need to use the copy icon to copy the full details and discover the full endpoint and port number.
- Click the `Properties` tab. Scroll down to the `Network and security settings` section. In the `VPC security group` field, select a security group to open it.
- In the `Security Groups` window, select `Inbound rules`, then select `Edit inbound rules`. In the `Edit inbound rules` window, follow the steps below to create a custom TCP rule for each static IP:
  a. Select `Custom TCP` in the drop-down menu.
  b. Enter your Redshift port number (by default `5439`).
  c. Enter the static IP you have received from us.
  d. Select `Add rule`.
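Security group inbound rules expect the source as CIDR notation, so a single address is entered as a `/32` block. A quick sanity check with Python's standard `ipaddress` module, using our example IPs:

```python
import ipaddress

# Security group source fields take CIDR notation; a single IPv4 address
# corresponds to a /32 network containing exactly that one host.
SOLVIMON_IPS = {
    "test/sandbox": "35.205.227.168",
    "live": "104.155.0.1",
}

def as_cidr(ip: str) -> str:
    """Return the /32 CIDR block for a single IPv4 address."""
    return str(ipaddress.ip_network(f"{ip}/32"))

for env, ip in SOLVIMON_IPS.items():
    print(f"{env}: {as_cidr(ip)}")
```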
Step 4: Send the relevant information to Solvimon
The final step is to securely share the following information with Solvimon. If you have multiple databases, you will need to share this information for each database separately.
| Field | Description |
|---|---|
| Host | Endpoint of the Redshift cluster (must include the cluster ID and region, and end with `.redshift.amazonaws.com`). |
| Port | Port of the database (by default 5439). |
| Database | Name of the database, e.g. `solvimon_exports`. |
| Schema | Name of the schema to write to, e.g. `solvimon_exports`. |
| Username | The selected Redshift username, e.g. `solvimon_user`. |
| Password | The selected password for your Redshift user. |
| S3 Bucket Name | The name of the S3 staging bucket, e.g. `solvimon_export_bucket`. |
| S3 Bucket Region | The region where the S3 staging bucket is located, e.g. `eu-west-1`. |
| Access Key ID | This is the access key ID for the AWS user created to access the staging bucket. |
| Secret Access Key | This is the secret access key corresponding to the access key ID above. |
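Before submitting, the Host value can be sanity-checked against the expected shape (cluster ID, a generated suffix, the region, and the `.redshift.amazonaws.com` suffix). A hypothetical Python check; the pattern is a rough heuristic, not an authoritative AWS format specification:

```python
import re

# Expected shape: <cluster-id>.<suffix>.<region>.redshift.amazonaws.com
# This is a rough sanity check only; AWS does not guarantee this exact format.
ENDPOINT_RE = re.compile(
    r"^[a-z0-9-]+\.[a-z0-9]+\.[a-z]{2}(-[a-z]+)+-\d\.redshift\.amazonaws\.com$"
)

def looks_like_redshift_endpoint(host: str) -> bool:
    """Return True if host roughly matches the Redshift endpoint shape."""
    return ENDPOINT_RE.match(host) is not None

# Example with a hypothetical cluster endpoint:
print(looks_like_redshift_endpoint(
    "my-cluster.abc123xyz.eu-west-1.redshift.amazonaws.com"
))  # True
```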
As soon as we receive this information, we will reach out and work with you directly to set up the relevant data exports.