
01-18-2023 07:33 AM
We have a Redshift cluster that we would like to load data into. Using the Redshift - Insert snap is simple, but it is prohibitively slow and isn’t the recommended way to load data into Redshift. We would like to use the Redshift - Bulk Load snap, but we are running into a few issues when setting up the Redshift Account in SnapLogic.
- Our understanding is that using IAM Roles for authentication is NOT possible on a Cloudplex. Is this true? If so, this is a huge issue.
- If we can’t use an IAM Role for authentication, the only other option is an AWS Access Key with its secret and token. The main issue with this approach is that the tokens are temporary and last at most a few hours. How can we use an AWS Access Key with its secret and token without having to refresh the token every 15 minutes? This doesn’t seem workable.
Any help would be great. Thanks!
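
For context, my rough understanding is that a Redshift bulk load boils down to staging the data in S3 and then issuing a COPY that reads it back using the account’s AWS credentials. The sketch below is just that mental model (bucket, table, and connection details are placeholders, not SnapLogic’s actual internals), mainly to show where the Access Key, Secret, and optional Token would fit:

```python
# Rough sketch of what a Redshift bulk load amounts to, assuming boto3 and
# psycopg2 are available. Bucket, table, and connection details are placeholders.
import boto3
import psycopg2

ACCESS_KEY = "AKIA..."          # permanent IAM user key
SECRET_KEY = "..."
BUCKET = "my-staging-bucket"    # placeholder
KEY = "staging/orders.csv"      # placeholder

# 1. Stage the data in S3 using the same credentials the Redshift account uses.
s3 = boto3.client("s3", aws_access_key_id=ACCESS_KEY,
                  aws_secret_access_key=SECRET_KEY)
s3.upload_file("orders.csv", BUCKET, KEY)

# 2. Issue a COPY so Redshift pulls the staged file directly from S3.
#    A SESSION_TOKEN clause is only needed for temporary (STS) credentials.
copy_sql = f"""
    COPY public.orders
    FROM 's3://{BUCKET}/{KEY}'
    ACCESS_KEY_ID '{ACCESS_KEY}'
    SECRET_ACCESS_KEY '{SECRET_KEY}'
    FORMAT AS CSV;
"""

with psycopg2.connect(host="my-cluster.example.redshift.amazonaws.com",
                      port=5439, dbname="dev", user="loader",
                      password="...") as conn:
    with conn.cursor() as cur:
        cur.execute(copy_sql)
```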

01-18-2023 09:07 PM
@ptaylor None of our S3 buckets require an STS token. We have other third-party tools that use the “bulk load”/COPY action in Redshift with only an Access Key and Secret Key. I’m not sure why it isn’t working in SnapLogic.
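
For what it’s worth, here is how we sanity-check the key pair outside of SnapLogic. This is a minimal sketch assuming boto3, and the bucket name is a placeholder; it just lists the staging bucket with only the Access Key and Secret, no token:

```python
# Minimal check (assuming boto3) that a permanent access key/secret pair can
# read the staging bucket on its own -- no STS session token involved.
import boto3

s3 = boto3.client(
    "s3",
    aws_access_key_id="AKIA...",      # placeholder permanent IAM user key
    aws_secret_access_key="...",      # placeholder secret
)

# If this succeeds, the key pair itself is fine, so the Token field in the
# SnapLogic account can simply be left blank.
resp = s3.list_objects_v2(Bucket="my-staging-bucket", MaxKeys=1)
print(resp.get("KeyCount", 0))
```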

01-19-2023 08:57 AM
Well, in that case, in what way isn’t it working? Are you getting an error? If so, please share the details.

01-20-2023 10:48 AM
We think we figured out what was happening. Company policy requires us to rotate our Access Keys and Secrets every 180 days for security purposes. When we rotated them, a value had somehow been left in the Token field on the Redshift Account. We tried again, updating the Access Key and Secret and then clearing out the Token field, and that seemed to do the trick. We’re not sure if this is a long-term fix or if it will fail again after a certain amount of time, but we are good to go for now.
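
In hindsight this makes sense: if a session token is supplied at all, AWS validates it against the key pair, so a token left over from the old keys fails even though the new Access Key and Secret are valid. A minimal sketch of the behavior (assuming boto3; the bucket name is a placeholder):

```python
# Why a leftover Token value breaks things: when a session token is provided,
# AWS validates it together with the key pair, so a stale token from a previous
# rotation fails even though the new key/secret are valid.
import boto3
from botocore.exceptions import ClientError

def can_list_bucket(access_key, secret_key, session_token=None):
    s3 = boto3.client(
        "s3",
        aws_access_key_id=access_key,
        aws_secret_access_key=secret_key,
        aws_session_token=session_token,   # None == leave the Token field blank
    )
    try:
        s3.list_objects_v2(Bucket="my-staging-bucket", MaxKeys=1)
        return True
    except ClientError as err:
        # Typically an invalid/expired token error when a stale token is sent.
        print(err.response["Error"]["Code"])
        return False

# New key/secret with no token: works. Same key/secret plus the old token: fails.
```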

01-20-2023 11:58 AM
That’s great to hear! I’m glad it’s working for you now.
It would help if the UI made it possible to tell whether an encrypted field actually has a value. Currently it looks the same either way: it shows “Value is encrypted” even when the field is empty.

01-20-2023 12:12 PM
@ptaylor I agree. That would definitely be helpful.
