I am using Pentaho 8.3 and running the job through a shell script on a Linux EC2 dev server. Does the S3 File Output step need the AWS credentials set in the profile or in .aws/config, please? Does Pentaho need the full credentials set in .aws/credentials, please? I am getting this error:

S3 file output.0 - Caused by: org.apache.http.conn.ConnectTimeoutException: Connect to dat-xxx-dev-logs.s3.us-west-2.amazonaws.com:443 [dat-xxx-dev-logs.s3.us-west-2.amazonaws.com/220.127.116.11] failed: connect timed out

Regards,
Vince
Hi Vince,

Try adding

echo "currentuser=$USER" > /tmp/awscheck.log
echo "homedir=$HOME" >> /tmp/awscheck.log
aws configure list >> /tmp/awscheck.log

to the shell script, then check the output in /tmp/awscheck.log, just to confirm that all the environment variables have been set on the EC2 side as you expect (bash can be tricky when running scripts remotely).

Alternatively, try adding the appropriate variables at the head of the script:

export AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE
export AWS_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
export AWS_DEFAULT_REGION=us-west-2
The S3 File Output step provides credentials to the Amazon Web Services SDK for Java using a credential provider chain. The default credential provider chain looks for AWS credentials in the following order (we use option 5):

1. Environment variables: AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY
2. Java system properties: aws.accessKeyId and aws.secretKey
3. The default credential profiles file: typically ~/.aws/credentials
4. Amazon ECS container credentials
5. Instance profile credentials: delivered through the Amazon EC2 metadata service, and usable on EC2 instances with an assigned instance role
The S3 File Output step can use any of these methods to authenticate AWS credentials. For more information on setting up AWS credentials, see Working with AWS Credentials.
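Since option 5 depends on the EC2 metadata service, a quick sanity check, run on the instance itself, is to query the metadata endpoint directly and confirm a role is actually attached. This is only a sketch: it assumes IMDSv1 is allowed (an IMDSv2-only instance would need a session token first).

```shell
# Sketch: confirm the EC2 metadata service is serving role credentials.
# Assumes IMDSv1 access is permitted on the instance.
IMDS="http://169.254.169.254/latest/meta-data/iam/security-credentials/"
ROLE=$(curl -sf --connect-timeout 2 -m 4 "$IMDS" || true)
if [ -n "$ROLE" ]; then
  echo "instance role attached: $ROLE"
  # The SDK reads temporary keys from this same endpoint:
  curl -sf --connect-timeout 2 -m 4 "$IMDS$ROLE"
else
  echo "no instance role visible (not on EC2, or IMDS blocked)"
fi
```

If no role shows up here, the credential chain can never reach option 5, whatever Pentaho does.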
------------------------------
Andrew Cave
Systems Engineer
BizCubed Pty Ltd
Australia

Original Message:
Sent: 05-13-2022 04:22
From: Vince Popplewell
Subject: s3 file output Step connect timed out us-west-2
Thank you so much for your reply. I have been on holiday and missed this update after being given other work to do. I have used similar Pentaho S3 steps to run aws s3 mv and aws s3 cp utility statements, but we do not want to use batch mode.
The requirement here is for real-time logging, in fact.
A work-around could be writing to a temporary table and using the aws s3 cp utility, but the requirement is real time, not batch.
We want to use option 5 from the supporting documentation:
The S3 File Output step provides credentials to the Amazon Web Services SDK for Java using a credential provider chain. The default credential provider chain looks for AWS credentials:
Vince,

You might be struggling with falling through the credential providers. I seem to remember having an old credentials file that was stopping me even with IAM roles in place. I'm not on AWS anymore to test, and my memory may be faulty. I do know that I have been able to add an S3 role to an EC2 instance and then access it via the CLI as Andrew showed. You might want to make sure that there are no variables set and no credentials files in place that might be sending you down the wrong authentication pathway.
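One way to rule that out is a quick scan for competing credential sources that sit earlier in the chain than the instance profile. This is a sketch; the helper names (check_file, check_var) are illustrative, not part of any AWS tooling.

```shell
#!/bin/sh
# Sketch: report every credential source the SDK/CLI might pick up
# before falling through to instance profile credentials.
# Helper names are ours, purely for illustration.
check_file() {
  if [ -f "$1" ]; then echo "found: $1"; else echo "absent: $1"; fi
}
check_var() {
  if [ -n "$(printenv "$1")" ]; then echo "set: $1"; else echo "unset: $1"; fi
}

check_file "$HOME/.aws/credentials"
check_file "$HOME/.aws/config"
for v in AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY AWS_SESSION_TOKEN AWS_PROFILE; do
  check_var "$v"
done
```

Anything reported as found or set takes precedence over the instance role, so clear it (or confirm it is correct) before blaming the role itself.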
You could also turn on debug logging for com.amazonaws.auth.AWSCredentialsProviderChain, which logs each provider the chain tries and why it was skipped.
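As a sketch of what that might look like: assuming your Pentaho install uses a log4j 1.x log4j.xml (location varies by version and install), a category entry along these lines would surface the chain's decisions in the Pentaho log.

```xml
<!-- Sketch: enable DEBUG for the AWS credential chain in log4j.xml.
     Assumes the log4j 1.x format; adjust accordingly if your
     Pentaho version ships log4j2. -->
<category name="com.amazonaws.auth">
  <priority value="DEBUG"/>
</category>
```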