public class Jobs
extends java.lang.Object
Constructor | Description
---|---
Jobs(XdEnvironment environment) | Initializes the instance with the given XdEnvironment.
Modifier and Type | Method | Description
---|---|---
FileJdbcJob | fileJdbcJob() | Creates an instance of the FileJdbc job with the default target dir, file name, table name, and column names.
FilePollHdfsJob | filePollHdfsJob(java.lang.String names) | Creates an instance of the FilePollHdfsJob using defaults.
FtpHdfsJob | ftpHdfsJob() | Creates an FtpHdfsJob fixture using defaults.
HdfsJdbcJob | hdfsJdbcJob() | Creates an instance of the HdfsJdbc job with the default target dir, file name, table name, and column names.
HdfsMongoDbJob | hdfsMongoDb() | Creates a new HdfsMongoDbJob that constructs a job reading data from HDFS and writing to MongoDB, using default values.
IncrementalJdbcHdfsJob | incrementalJdbcHdfsJob() | Creates an instance of the incremental JdbcHdfs job with the default HDFS target dir, file name, and source SQL statement.
JdbcHdfsJob | jdbcHdfsJob() | Creates an instance of the JdbcHdfs job with the default HDFS target dir, file name, and source SQL statement.
PartitionedJdbcHdfsJob | partitionedJdbcHdfsJob() | Creates an instance of the PartitionedJdbcHdfsJob with the default HDFS target dir, file name, table, column names, partition column, and partitions.
SparkAppJob | sparkAppJob() | Creates a new SparkAppJob fixture instance that is configured via property files or environment variables.
public Jobs(XdEnvironment environment)
Parameters:
environment - the XdEnvironment the fixtures are initialized with

public FileJdbcJob fileJdbcJob()
See FileJdbcJob for default values.

public HdfsJdbcJob hdfsJdbcJob()
See HdfsJdbcJob for default values.

public JdbcHdfsJob jdbcHdfsJob()
See JdbcHdfsJob for default values.

public IncrementalJdbcHdfsJob incrementalJdbcHdfsJob()
See IncrementalJdbcHdfsJob for default values.

public PartitionedJdbcHdfsJob partitionedJdbcHdfsJob()
See PartitionedJdbcHdfsJob for default values.

public FilePollHdfsJob filePollHdfsJob(java.lang.String names)
Parameters:
names - a comma-delimited list of column names

public HdfsMongoDbJob hdfsMongoDb()
public FtpHdfsJob ftpHdfsJob()
public SparkAppJob sparkAppJob()
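Of the fixtures above, filePollHdfsJob(names) is the only one that takes an argument: names packs the column list into a single comma-delimited string. A sketch of the expected format, with made-up column names:

```java
// The names argument is one string holding all column names,
// separated by commas (no spaces).
String names = "data,name,id";

// The individual column names can be recovered with a simple split.
String[] columns = names.split(",");
// columns is now {"data", "name", "id"}
```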