The HDFS Load use case accelerator demonstrates how to load TPC-H supplier data from the local file system to HDFS. This single-stream load method is useful for small files; for large files, see DMX-h Use Case Accelerator: HDFS Load Parallel.
The source file is defined as a UNIX text file residing on the local file system.
The target is also defined as a UNIX text file, specified with an HDFS connection so that the data is loaded directly to HDFS.
Since the target file resides in HDFS, you can run this use case accelerator on any Linux system that has an HDFS client configured to connect to a Hadoop cluster.
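For context, the equivalent single-stream copy can be performed outside of DMX-h with the standard Hadoop client. The paths and file name below are hypothetical examples for illustration only; the accelerator itself defines the source and target within the DMExpress job. Running these commands requires an HDFS client configured against a Hadoop cluster.

```shell
# Hypothetical HDFS target directory and local TPC-H supplier file;
# substitute paths appropriate to your environment.
hdfs dfs -mkdir -p /user/dmxhdemo/tpch
hdfs dfs -put supplier.tbl /user/dmxhdemo/tpch/

# Confirm that the file was loaded:
hdfs dfs -ls /user/dmxhdemo/tpch
```

Like the accelerator's single-stream method, `hdfs dfs -put` reads the local file serially, which is appropriate for small files.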
The following attachments are available for running this UCA:
See the Guide to DMX-h ETL Use Case Accelerators for an overview of how the set of use case accelerators is organized and how to run them.
For general guidance on developing and running DMX-h ETL solutions, see Developing DMX-h ETL Jobs and Running DMX-h ETL Jobs in the DMExpress Help.
Copyright © 2016 Syncsort. All rights reserved.