Overview
These instructions describe how to use a Netskope One DSPM-provided script to configure Netskope One DSPM connections for multiple AWS Redshift data stores at once.
Prerequisites
Running this script requires the following:
- Python (version 3 or higher) is installed locally; and
- The Netskope One DSPM application has network connectivity to the Data Stores being connected.
In addition, the following is required for each Data Store being connected:
- A properly configured Netskope One DSPM-specific service account; and
- The Data Store's connection endpoint.
Finally, each data store's credentials must be available in a published YAML file, as described in our Self-Managed Secrets for Data Store Credentials article.
Setup Input File
The script requires the following inputs, in order, supplied as comma-separated values on each row of a CSV text file (one row per Data Store):
Input | Value |
---|---|
Endpoint | The connection endpoint from the Prerequisites step above, plus the port number. For example, for an address like example-production.redshift.us-west-2.vpce.amazonaws.com, you would enter example-production.redshift.us-west-2.vpce.amazonaws.com:5439. 5439 is the default Redshift port number; if you are using a custom port number, be sure to substitute it here. |
Data Store Identifier | A friendly name describing this Data Store. This value is displayed in other Netskope One DSPM screens such as Policy Management and Classification Management. |
Infrastructure Connection Name | One of the AWS Accounts defined within the Infrastructure Connection screen. |
Your CSV file should not include headers.
For example, attempting to onboard a Data Store within the Netskope One DSPM application looks something like this:

Using this script to onboard that same Data Store would require your CSV text file to contain a row like the following. When you finish populating the CSV input file, save it to any accessible location.
example-production.redshift.us-west-2.vpce.amazonaws.com:5439,production,aws-sandbox
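Before running the script, you may want to sanity-check the input file. The sketch below is a minimal, illustrative example only (the function name is an assumption, and it is not part of the provided script); it verifies the three-column, headerless format described above and that each endpoint includes a port.

```python
import csv

def validate_input_file(path):
    """Return a list of (line_number, problem) tuples for rows that do not
    match the expected format: endpoint:port,identifier,connection-name."""
    problems = []
    with open(path, newline="") as f:
        for lineno, row in enumerate(csv.reader(f), start=1):
            # Each row must have exactly three non-empty values.
            if len(row) != 3 or not all(col.strip() for col in row):
                problems.append((lineno, "expected 3 non-empty values"))
                continue
            # The first value (Endpoint) must end in :<port>.
            host, _, port = row[0].strip().rpartition(":")
            if not host or not port.isdigit():
                problems.append((lineno, "endpoint must include a port, e.g. host:5439"))
    return problems
```

An empty result means every row matches the expected shape; any listed line numbers should be corrected before running the onboarding script.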
Run Script
- Open the command line interface (CLI).
- Enter the following command to download the automation script to your system:
wget https://dasera-release.s3.us-west-2.amazonaws.com/redshift_onboarding_vault.py
- If necessary, navigate to the directory where the script was downloaded.
- Enter the following command to run the script:
python3 redshift_onboarding_vault.py
- When prompted, enter the following parameters:
Parameter | Value |
---|---|
CSV file name | Relative path of the CSV text file defined in the Setup Input File section above. |
Netskope One DSPM user name | Any Netskope One DSPM Platform User with RBAC permission to connect Data Stores. This user’s name will be associated with any User Activity Log entries generated by this operation. |
Password | The password corresponding to the Netskope One DSPM user name above. |
Netskope One DSPM endpoint | Your tenant URL, including the protocol. For example, if your tenant is accessed using https://example.dasera.io, your value is https://example.dasera.io. |
At this point, the script will use each row of the CSV text file and attempt to connect each Data Store. For each row processed, the script will output a response code: 200 indicates a successful connection, while any other value indicates an issue that should be investigated further. For more information, see the Review Results section below.
When complete, the script will output the following message (where # matches the number of rows within your CSV text file):
Onboarded # Data Stores.
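The 200-versus-anything-else rule above can be sketched as a small helper. This is an illustrative snippet only; the function name and the (row, code) input shape are assumptions, not part of the provided script.

```python
def summarize_response_codes(results):
    """Split per-row response codes into successes and failures.

    results: iterable of (row_number, response_code) pairs.
    A code of 200 means the Data Store connected successfully; any
    other value flags a row that should be investigated further.
    """
    ok = [row for row, code in results if code == 200]
    failed = [(row, code) for row, code in results if code != 200]
    return ok, failed

# Example: rows 1 and 3 succeeded, row 2 needs investigation.
# summarize_response_codes([(1, 200), (2, 403), (3, 200)])
# → ([1, 3], [(2, 403)])
```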
Review Results
Validate Data Store States
This script will attempt to connect all Data Stores within your CSV input file, but it does not validate that the Data Stores are properly configured (see the Prerequisites section above). Possible issues include:
- The service account credentials supplied were incorrect;
- The service account is missing one or more required permissions; and/or
- Netskope One DSPM does not have network connectivity to the Data Store.
In such events, the Netskope One DSPM application will:
- Display a red status indicator within the Data Stores > Data Store Inventory screen > Connected tab; and
- Record one or more messages in the Activity Logs screen > System Activity tab, each starting with “Netskope One DSPM encountered an error while scanning data warehouse”, followed by the Data Store Identifier and additional details.


After you remediate any of these issues within your Data Store, Netskope One DSPM will recognize those fixes at the time of the next scheduled scan.
Configure Individual Features
All Data Stores connected by this script will use the following default configurations. Some of these values can be overridden by navigating to the Data Stores > Data Store Inventory screen, then editing the Data Store in question:
Feature | Default Value | Can Be Overridden? |
---|---|---|
Scan Frequency | Once daily | Yes. Can be changed to any other available frequency. |
Data Sets | All data sets. | Yes. Can be changed to any lesser combination. |
Discovery | Enabled | No (always on) |
Privilege Analysis | Enabled | Yes |
Classification | Enabled | Yes |
Data In Use Monitoring | Enabled. Query logging must be configured before this capability can be enabled. | Yes |
Automation | Enabled | No (always on) |