I am trying to build my first transformation that reads from an AWS S3 bucket (in my own AWS account), using the Hadoop File Input step from the Big Data category
Spoon version: 9.3.0.0-428
On the step's 'File' tab, I set the following:
Environment: S3
File/Folder: s3://(Access_key):(Secret_access_key)@s3/xx-xxxx-demo/Xx.csv
Clicking the 'Show file content' button retrieves the file content correctly
On the 'Content' tab, I set the following:
Filetype: CSV
Separator: ,
Format: mixed
On the 'Fields' tab:
Clicking 'Get Fields' displays the scan results for all fields correctly
Clicking the 'Preview rows' button produces this ERROR:
…
Transformation detected one or more steps with errors
…
Couldn't open file #0 : "s3:/(Access_key):(Secret_access_key)@s3/ xx-xxxx-demo/Xx.csv"
org.apache.commons.vfs2.FileSystemException: Incorrect file system URI s3://" in name "s3:// xx-xxxx-demo", was expecting "s3:/ (Access_key):(Secret_access_key)@s3/s3"
…
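In case it's relevant: I wondered whether special characters in the credentials (a secret key can contain `/` or `+`) might be confusing the VFS URI parser, since the exception shows the URI being split in an odd place. This is only my guess, not something I've confirmed in the docs. A small sketch of percent-encoding the credentials before embedding them in the URI (the credentials and helper name here are made up for illustration):

```python
from urllib.parse import quote

def s3_vfs_uri(access_key, secret_key, bucket, key):
    # Percent-encode the credentials so characters such as '/' or '+'
    # in the secret key cannot be mistaken for URI delimiters.
    ak = quote(access_key, safe="")
    sk = quote(secret_key, safe="")
    return f"s3://{ak}:{sk}@s3/{bucket}/{key}"

# Hypothetical (fake) credentials, for illustration only:
print(s3_vfs_uri("AKIAEXAMPLE", "abc/def+ghi", "xx-xxxx-demo", "Xx.csv"))
# → s3://AKIAEXAMPLE:abc%2Fdef%2Bghi@s3/xx-xxxx-demo/Xx.csv
```

If anyone can confirm whether the encoded form is what the step expects, that would help.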
Thanks to you all in advance