Before we dive deeper into the details of creating Tasks, we need to understand the typical lifecycle for tasks in the context of Spring Cloud Data Flow:
Register a Task App with the App Registry using the Spring Cloud Data Flow Shell app register command. You must provide a unique name and a URI that can be resolved to the app artifact. For the type, specify "task". Here are a few examples:
dataflow:>app register --name task1 --type task --uri maven://com.example:mytask:1.0.2
dataflow:>app register --name task2 --type task --uri file:///Users/example/mytask-1.0.2.jar
dataflow:>app register --name task3 --type task --uri http://example.com/mytask-1.0.2.jar
When providing a URI with the maven scheme, the format should conform to the following:
maven://<groupId>:<artifactId>[:<extension>[:<classifier>]]:<version>
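For example, the following are all well-formed URIs under that format (the coordinates themselves are hypothetical); the first omits the optional extension and classifier, the second adds the extension, and the third specifies both:
maven://com.example:mytask:1.0.2
maven://com.example:mytask:jar:1.0.2
maven://com.example:mytask:jar:exec:1.0.2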
If you would like to register multiple apps at one time, you can store them in a properties file where the keys are formatted as <type>.<name> and the values are the URIs. For example, this would be a valid properties file:
task.foo=file:///tmp/foo.jar
task.bar=file:///tmp/bar.jar
Then use the app import command and provide the location of the properties file via --uri:
app import --uri file:///tmp/task-apps.properties
For convenience, static files with application URIs (for both maven and docker) are available for all the out-of-the-box Task app-starters. You can point to one of these files and import all the application URIs in bulk. Otherwise, as explained in the previous paragraphs, you can register the apps individually or maintain a custom properties file with only the required application URIs in it. It is recommended, however, to keep a "focused" list of desired application URIs in a custom properties file.
List of available static property files:
Maven based Task Applications: http://bit.ly/task-applications-maven
Docker based Task Applications: http://bit.ly/task-applications-docker
For example, if you would like to register all out-of-the-box task applications in bulk, you can do so with the following command:
dataflow:>app import --uri http://bit.ly/task-applications-maven
You can also pass the --local option (which is true by default) to indicate whether the properties file location should be resolved within the shell process itself. If the location should be resolved from the Data Flow Server process, specify --local false.
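For example, to have the Data Flow Server (rather than the shell) resolve a properties file, assuming the hypothetical path /tmp/task-apps.properties exists on the server's filesystem:
dataflow:>app import --uri file:///tmp/task-apps.properties --local false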
When using either app register or app import, if a task app is already registered with the provided name, it will not be overridden by default. If you would like to override the pre-existing task app, include the --force option.
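For example, to re-register task1 with a newer (hypothetical) version of the same artifact:
dataflow:>app register --name task1 --type task --uri maven://com.example:mytask:1.0.3 --force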
Note: In some cases the Resource is resolved on the server side, whereas in others the URI will be passed to a runtime container instance where it is resolved. Consult the specific documentation of each Data Flow Server for more detail.
Create a Task Definition from a Task App by providing a definition name as well as properties that apply to the task execution. Creating a task definition can be done via the RESTful API or the shell. To create a task definition using the shell, use the task create command. For example:
dataflow:>task create mytask --definition "timestamp --format=\"yyyy\""
Created new task 'mytask'
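The equivalent RESTful call posts the name and definition to the Data Flow Server. The following is a minimal sketch, assuming the server is running at its default address of http://localhost:9393:
curl http://localhost:9393/tasks/definitions -X POST \
  --data-urlencode "name=mytask" \
  --data-urlencode "definition=timestamp --format=yyyy"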
A listing of the current task definitions can be obtained via the RESTful API or the shell. To get the task definition list using the shell, use the task list command. For example:
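dataflow:>task list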
An ad-hoc task can be launched via the RESTful API or via the shell. To launch an ad-hoc task via the shell, use the task launch command. For example:
dataflow:>task launch mytask
Launched task 'mytask'
When a task is launched, any properties that need to be passed as command-line arguments to the task application can be set as follows:
dataflow:>task launch mytask --arguments "--server.port=8080,--foo=bar"
Once the task is launched, the state of the task is stored in a relational database. The state includes:
Task Name
Start Time
End Time
Exit Code
Exit Message
Last Updated Time
Parameters
A user can check the status of their task executions via the RESTful API or the shell. To display the latest task executions via the shell, use the task execution list command.
To get a list of task executions for just one task definition, add --name and the task definition name, for example task execution list --name foo. To retrieve full details for a task execution, use the task display command with the id of the task execution, for example task display --id 549.
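The same execution information is also exposed over the RESTful API. The following is a minimal sketch, assuming the server is running at its default address:
curl http://localhost:9393/tasks/executions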
Destroying a Task Definition will remove the definition from the definition repository. This can be done via the RESTful API or via the shell. To destroy a task via the shell, use the task destroy command. For example:
dataflow:>task destroy mytask
Destroyed task 'mytask'
The task execution information for previously launched tasks for the definition will remain in the task repository.
Note: This will not stop any currently executing tasks for this definition; it just removes the definition.