Hitachi Content Platform

 HCI File Migration

  • Object Storage
  • Hitachi Content Intelligence HCI
  • Hitachi Data Ingestor HDI
Herman Botha posted 02-12-2019 09:59

Hi All

I want to write a pipeline to migrate a folder from an NFS mount to HCP S3. I have two questions:

1. Has a plugin been developed to connect to an NFS source, or do I still need to mount the NFS export on all of the HCI nodes and use the Local File System connector? If I do, does anyone have the procedure to mount it?

2. Has anyone done file migrations who can share their transformation? It would be nice not to do all the work from scratch.

The migration needs to keep the subfolder structure in HCP the same as on the file server.

Regards

Herman Botha


Jonathan Chinitz

#1 -- requires using the LFS (Local File System) connector and mounting the NFS export locally on each node; a minimal mount sketch follows.
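
[Editor's note] For anyone after the mount procedure itself, here is a minimal sketch of what that looks like on one node; run the equivalent on every node in the cluster. The export path, mount point, and mount options are placeholders, not anything HCI-specific, so substitute your own. Mounting read-only is a sensible default for a migration source.

```python
# Hypothetical sketch: mount an NFS export on one HCI node (run as root).
# The export, mount point, and options below are placeholders.
import subprocess

NFS_EXPORT = "filer.example.com:/exports/data"   # hypothetical NFS export
MOUNT_POINT = "/mnt/nfs-migration"               # use the same path on every node

# Create the mount point, then mount the export read-only.
subprocess.run(["mkdir", "-p", MOUNT_POINT], check=True)
subprocess.run(
    ["mount", "-t", "nfs", "-o", "ro,vers=3", NFS_EXPORT, MOUNT_POINT],
    check=True,
)
# For a mount that survives reboots, add an equivalent /etc/fstab entry, e.g.:
#   filer.example.com:/exports/data  /mnt/nfs-migration  nfs  ro,vers=3  0 0
```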

#2 -- you can actually do this with an empty pipeline. Define the LFS and S3 Compatible data connections. Define your workflow to have an input from the LFS connector, a pipeline with no stages, and an output that uses the S3 Compatible connector with the OutputFile action. You should check that the fields required by the S3 connector are provided by the input connector.
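
[Editor's note] This is not part of the HCI configuration itself, but for readers who want to see what that empty pipeline amounts to in code terms, here is a minimal boto3 sketch. The endpoint URL, bucket (namespace) name, and credential environment variables are all hypothetical placeholders. It walks the local NFS mount and uploads each file with its relative path as the object key, which is how the file server's subfolder structure ends up preserved in HCP.

```python
# Hypothetical sketch of the migration: walk the mounted NFS folder and
# write each file to an HCP S3-compatible bucket, keeping the relative
# path as the object key. All names below are placeholder assumptions.
import os
import boto3

SOURCE_ROOT = "/mnt/nfs-migration"                     # local NFS mount
ENDPOINT = "https://namespace.tenant.hcp.example.com"  # hypothetical HCP endpoint
BUCKET = "namespace"                                   # HCP namespace exposed as an S3 bucket

s3 = boto3.client(
    "s3",
    endpoint_url=ENDPOINT,
    aws_access_key_id=os.environ["HCP_ACCESS_KEY"],      # placeholder env vars
    aws_secret_access_key=os.environ["HCP_SECRET_KEY"],
)

for dirpath, _dirnames, filenames in os.walk(SOURCE_ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        # The path relative to the mount root becomes the object key
        # (e.g. sub/dir/file.txt), so HCP mirrors the folder layout.
        key = os.path.relpath(path, SOURCE_ROOT).replace(os.sep, "/")
        s3.upload_file(path, BUCKET, key)
```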

Herman Botha

Thanks Jon. When I create the LFS connection, do I have to mount the NFS export on all of the HCI servers?

Jonathan Chinitz

Yes, you do.

Troy Myers

The workflow from the attached demo is currently in the HALO lab and can be exported.

NAS to HCP migrations-20171222 1636-1.mp4