Redshift

Export Solvimon data to Redshift

Solvimon can export its data directly into your Redshift cluster, so you can use it seamlessly alongside your own data. This page describes the steps necessary to connect your Redshift cluster to Solvimon for data exporting.

Getting started

Requirements

The following access is required for us to set up a connection between Solvimon and your Redshift cluster.

  • An AWS account with access to create an S3 staging bucket.
  • An existing Redshift cluster that is publicly accessible. Note that a Redshift cluster is publicly accessible when:
    • You turned on the Publicly accessible cluster setting.
    • You provisioned the cluster in a public subnet.
  • Allow connections from Solvimon to your Redshift cluster. If your Redshift security posture uses IP whitelisting, you will need to whitelist our IPs.
  • Please contact us if connecting to your Redshift cluster requires an SSH tunnel. We'll work together to facilitate this.

Step 1: Set up user, database and schema

This step is optional but highly recommended, as it allows for better permission control and auditing. Alternatively, you can use an existing user in your database with the required write access. This step assumes your data is located inside one database.

  1. Connect to your selected Redshift database using a SQL client. If you have multiple databases, perform these steps separately for each database.
  2. Create a Solvimon export user, replacing {PASSWORD} with a password of your choosing. Make sure to save this password securely. If you want to use an existing user instead, replace solvimon_user with that user in the subsequent steps.
CREATE USER solvimon_user PASSWORD '{PASSWORD}';
  3. Create a new database in which the Solvimon exports will be stored. This is optional; you can use an existing database as well. Make sure you replace solvimon_exports with your selected database in subsequent steps.
CREATE DATABASE solvimon_exports;
  4. Create new schemas in your database. This is optional; you can use existing schemas as well. Make sure you replace solvimon_exports and solvimon_internal with your selected schemas in subsequent steps. Note that the internal schema is used as a staging schema.
CREATE SCHEMA solvimon_exports AUTHORIZATION solvimon_user;
CREATE SCHEMA solvimon_internal AUTHORIZATION solvimon_user;
  5. Create a new group (solvimon_export_group) with read and write access on your selected schemas.
CREATE GROUP solvimon_export_group;
GRANT CREATE ON DATABASE solvimon_exports TO GROUP solvimon_export_group;
GRANT USAGE, CREATE ON SCHEMA solvimon_exports TO GROUP solvimon_export_group;
GRANT USAGE, CREATE ON SCHEMA solvimon_internal TO GROUP solvimon_export_group;
GRANT SELECT ON TABLE SVV_TABLE_INFO TO GROUP solvimon_export_group;
ALTER DEFAULT PRIVILEGES IN SCHEMA solvimon_exports GRANT ALL ON TABLES TO GROUP solvimon_export_group;
ALTER DEFAULT PRIVILEGES IN SCHEMA solvimon_internal GRANT ALL ON TABLES TO GROUP solvimon_export_group;
  6. Execute the following query to add the created user to the created group.
ALTER GROUP solvimon_export_group ADD USER solvimon_user;

Note that the above assumes you have one schema under one database. If you have multiple schemas, you will have to execute the statements above for the other schemas as well.
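If you have many schemas, the per-schema grants from Step 1 can be generated programmatically instead of typed by hand. Below is a minimal sketch (Python; the function name is ours, and the group name is the example used in this guide, so replace both as needed):

```python
# Sketch: generate the per-schema grant statements from Step 1 for any
# number of schemas. Names are examples; substitute your own.
GROUP = "solvimon_export_group"

def grant_statements(schemas, group=GROUP):
    """Return the schema-level GRANT/ALTER statements for each schema."""
    statements = []
    for schema in schemas:
        statements.append(
            f"GRANT USAGE, CREATE ON SCHEMA {schema} TO GROUP {group};"
        )
        statements.append(
            f"ALTER DEFAULT PRIVILEGES IN SCHEMA {schema} "
            f"GRANT ALL ON TABLES TO GROUP {group};"
        )
    return statements

# Example: the two schemas used in this guide.
for stmt in grant_statements(["solvimon_exports", "solvimon_internal"]):
    print(stmt)
```

Run the printed statements in your SQL client, once per database.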

Step 2: Create an S3 staging bucket

Currently we require a staging bucket in S3 to be available, where intermediate data is written before it is saved to your Redshift cluster. In this step we show how to create the bucket. Alternatively, you can use an existing bucket.

  1. To create an S3 bucket, refer to the AWS documentation. Save the bucket name; in this example we will call it solvimon_export_bucket.
  2. (Optional) Place the S3 bucket and your Redshift cluster in the same region to save networking costs.
  3. Create a Solvimon IAM user that can be used to access the bucket. The section below walks through creating the permission policy and IAM user.

Create a permission policy

  1. Log in to Amazon AWS and open the IAM console.
  2. Go to Policies and then click Create policy.
  3. Select the JSON tab and paste the following JSON into the policy editor. Note that this policy grants not only listing of the bucket but also full object access, which is needed to manage the staging data properly.
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ListObjectsInBucket",
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": ["arn:aws:s3:::solvimon_export_bucket"]
        },
        {
            "Sid": "AllObjectActions",
            "Effect": "Allow",
            "Action": "s3:*Object",
            "Resource": ["arn:aws:s3:::solvimon_export_bucket/*"]
        }
    ]
}
  4. Click Create policy.
  5. In the IAM dashboard, go to Users and either create a new user or use an existing one.
  6. Go to Add Permissions, select Attach policies directly, and find the policy you just created.
  7. Once the user with its permissions is created, select the Security credentials tab and click Create access key to generate the keys. Here you obtain the Access Key ID and the Secret Access Key that you will need to submit to us.
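To avoid typos in the bucket ARNs, the policy JSON above can also be generated from the bucket name. A minimal sketch (Python, standard library only; the function name is ours, and the bucket name is the example from this guide):

```python
import json

def staging_bucket_policy(bucket):
    """Build the S3 permission policy from Step 2 for a given bucket name."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "ListObjectsInBucket",
                "Effect": "Allow",
                "Action": ["s3:ListBucket"],
                # Bucket-level ARN: needed for listing the bucket itself.
                "Resource": [f"arn:aws:s3:::{bucket}"],
            },
            {
                "Sid": "AllObjectActions",
                "Effect": "Allow",
                "Action": "s3:*Object",
                # Object-level ARN: note the trailing /* for objects.
                "Resource": [f"arn:aws:s3:::{bucket}/*"],
            },
        ],
    }

# Paste the printed JSON into the IAM policy editor.
print(json.dumps(staging_bucket_policy("solvimon_export_bucket"), indent=4))
```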

Step 3: Allow connections from Solvimon

This step is only necessary when your Redshift security posture requires IP whitelisting. The following IP addresses will need to be whitelisted:

  • 35.205.227.168 (test/sandbox)
  • 104.155.0.1 (live)

To whitelist our static IP you will need to execute the following steps:

  1. Navigate to the Redshift console and select Clusters. You will need to take note of your cluster name.
  2. Now select the cluster that Solvimon will export data to. In the General information pane, record the Endpoint details. You may need to use the copy icon to obtain the full endpoint and port number.
  3. Click the Properties tab. Scroll down to the Network and security settings section. In the VPC security group field, select a security group to open it.
  4. In the Security Groups window, select Inbound rules, then select Edit inbound rules. In the Edit inbound rules window, follow the steps below to create a custom TCP rule for each static IP:
    a. Select Custom TCP in the drop-down menu.
    b. Enter your Redshift port number (by default 5439).
    c. Enter the static IP you have received from us.
    d. Select Add rule.
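Once the inbound rule is in place, you can verify from your side that the cluster endpoint accepts TCP connections on the Redshift port. A minimal sketch (Python, standard library only; the function name and the example endpoint are placeholders, not values from this guide):

```python
import socket

def endpoint_is_reachable(host, port=5439, timeout=5):
    """Return True when a TCP connection to host:port succeeds."""
    try:
        # Covers DNS failures, refused connections and timeouts alike.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (placeholder endpoint; use your own cluster endpoint):
# endpoint_is_reachable("examplecluster.abc123.eu-west-1.redshift.amazonaws.com")
```

Note that this only confirms reachability from the machine running the check; Solvimon connects from the static IPs listed above, so the rule itself must still whitelist those.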

Step 4: Send the relevant information to Solvimon

The final step is to securely share the following information with Solvimon. If you have multiple databases, you will need to share this information for each database separately.

| Field | Description |
| --- | --- |
| Host | Endpoint of the Redshift cluster (must include the cluster-id and region, and end with .redshift.amazonaws.com). |
| Port | Port of the database (by default 5439). |
| Database | Name of the database, e.g. solvimon_exports. |
| Schema | Name of the schema to write to, e.g. solvimon_exports. |
| Username | The selected Redshift username, e.g. solvimon_user. |
| Password | The selected password for your Redshift user. |
| S3 Bucket Name | The name of the S3 staging bucket, e.g. solvimon_export_bucket. |
| S3 Bucket Region | The region where the S3 staging bucket is located, e.g. eu-west-1. |
| Access Key ID | The access key ID for the AWS user created to access the staging bucket. |
| Secret Access Key | The secret access key corresponding to the access key ID above. |
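The Host value tends to be the most error-prone field, since the console sometimes shows the endpoint with the port and database appended. A quick sanity check that a host matches the expected cluster-id.hash.region.redshift.amazonaws.com shape can be sketched as follows (Python; the function name and example host are placeholders of ours):

```python
SUFFIX = ".redshift.amazonaws.com"

def is_valid_redshift_host(host):
    """Check that host looks like a bare Redshift cluster endpoint:
    <cluster-id>.<hash>.<region> followed by .redshift.amazonaws.com,
    with no scheme, port or database path attached."""
    if not host.endswith(SUFFIX):
        return False
    # The remainder should be exactly cluster-id, hash and region.
    parts = host[: -len(SUFFIX)].split(".")
    return len(parts) == 3 and all(parts)

# Placeholder example endpoint:
print(is_valid_redshift_host("examplecluster.abc123.eu-west-1.redshift.amazonaws.com"))  # True
```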

As soon as we receive this information, we will reach out and work with you directly to set up the relevant data exports.