37. The Lifecycle of a Task

Before we dive deeper into the details of creating Tasks, we need to understand the typical lifecycle for tasks in the context of Spring Cloud Data Flow:

  1. Register a Task App
  2. Create a Task Definition
  3. Launch a Task
  4. Task Execution
  5. Destroy a Task Definition

37.1 Creating a custom Task Application

While Spring Cloud Task does provide a number of out-of-the-box applications (via the spring-cloud-task-app-starters), most task applications will be custom developed. To create a custom task application:

  1. Create a new project via Spring Initializr, either through the website or your IDE, making sure to select the following starters:

    1. Cloud Task - This dependency is the spring-cloud-starter-task.
    2. JDBC - This dependency is the spring-boot-starter-jdbc.
  2. Within your new project, create a new class that will serve as your main class:

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;

// @EnableTask activates Spring Cloud Task, recording this application's
// executions in the task repository.
@EnableTask
@SpringBootApplication
public class MyTask {

    public static void main(String[] args) {
        SpringApplication.run(MyTask.class, args);
    }
}
  3. With this, you’ll need one or more CommandLineRunner or ApplicationRunner beans within your application. You can either implement your own or use the ones provided by Spring Boot (there is one for running batch jobs, for example); see the sketch after this list.
  4. Packaging your application into an über jar is done via the standard Spring Boot conventions.
  5. The packaged application can be registered and deployed as noted below.
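
As a minimal illustration, the following runner simply prints a message and exits; the class name and message are illustrative, not part of any starter app:

import org.springframework.boot.CommandLineRunner;
import org.springframework.stereotype.Component;

@Component
public class HelloRunner implements CommandLineRunner {

    // Runs once the application context is up; when all runners complete and
    // the context closes, Spring Cloud Task records the execution as finished.
    @Override
    public void run(String... args) throws Exception {
        System.out.println("Hello from my task!");
    }
}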

37.2 Registering a Task Application

Register a Task App with the App Registry using the Spring Cloud Data Flow Shell app register command. You must provide a unique name and a URI that can be resolved to the app artifact. For the type, specify "task". Here are a few examples:

dataflow:>app register --name task1 --type task --uri maven://com.example:mytask:1.0.2

dataflow:>app register --name task2 --type task --uri file:///Users/example/mytask-1.0.2.jar

dataflow:>app register --name task3 --type task --uri http://example.com/mytask-1.0.2.jar

When providing a URI with the maven scheme, the format should conform to the following:

maven://<groupId>:<artifactId>[:<extension>[:<classifier>]]:<version>
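
For example, a hypothetical artifact with group ID com.example, artifact ID mytask, extension jar, classifier exec, and version 1.0.2 would be expressed as:

maven://com.example:mytask:jar:exec:1.0.2

When the optional extension and classifier are omitted, the URI reduces to the maven://com.example:mytask:1.0.2 form shown in the earlier examples.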

If you would like to register multiple apps at one time, you can store them in a properties file where the keys are formatted as <type>.<name> and the values are the URIs. For example, this would be a valid properties file:

task.foo=file:///tmp/foo.jar
task.bar=file:///tmp/bar.jar

Then use the app import command and provide the location of the properties file via --uri:

dataflow:>app import --uri file:///tmp/task-apps.properties

For convenience, static files with application-URIs (for both maven and docker) are available for all the out-of-the-box Task app-starters. You can point to such a file and import all the application-URIs in bulk. Otherwise, as explained above, you can register the apps individually or maintain your own custom properties file with only the required application-URIs in it. It is recommended, however, to keep a "focused" list of desired application-URIs in a custom properties file.


For example, if you would like to register all out-of-the-box task applications in bulk, you can do so with the following command:

dataflow:>app import --uri http://bit.ly/1-0-1-GA-task-applications-maven

You can also pass the --local option (which is TRUE by default) to indicate whether the properties file location should be resolved within the shell process itself. If the location should be resolved from the Data Flow Server process, specify --local false.
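
For example, to have the Data Flow Server (rather than the shell) resolve the properties file from the earlier example:

dataflow:>app import --uri file:///tmp/task-apps.properties --local false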

When using either app register or app import, if a task app is already registered with the provided name, it will not be overridden by default. If you would like to override the pre-existing task app, then include the --force option.
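
For example, to overwrite the earlier task1 registration with a newer artifact (the version shown is illustrative):

dataflow:>app register --name task1 --type task --uri maven://com.example:mytask:1.0.3 --force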

Note: In some cases the Resource is resolved on the server side, whereas in others the URI is passed to a runtime container instance, where it is resolved. Consult the specific documentation of each Data Flow Server for more detail.

37.3 Creating a Task

Create a Task Definition from a Task App by providing a definition name as well as properties that apply to the task execution. Creating a task definition can be done via the RESTful API or the shell. To create a task definition using the shell, use the task create command. For example:

dataflow:>task create mytask --definition "timestamp --format=\"yyyy\""
 Created new task 'mytask'

A listing of the current task definitions can be obtained via the RESTful API or the shell. To get the task definition list using the shell, use the task list command.
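
For example:

dataflow:>task list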

37.4 Launching a Task

An ad hoc task can be launched via the RESTful API or the shell. To launch an ad hoc task via the shell, use the task launch command. For example:

dataflow:>task launch mytask
 Launched task 'mytask'

When a task is launched, any properties that need to be passed as command line arguments to the task application can be set when launching the task, as follows:

dataflow:>task launch mytask --arguments "--server.port=8080,--foo=bar"

Additional properties meant for a TaskLauncher itself can be passed in using the --properties option. The format of this option is a comma-delimited string of properties prefixed with app.<task definition name>.<property>.

If the actual property is prefixed with spring.cloud.deployer, it is passed to the TaskLauncher as a deployment property, and its meaning may be TaskLauncher implementation-specific. Other properties are passed to the TaskLauncher as application properties, and it is up to the implementation to choose how those are passed into the actual task application. For example:

dataflow:>task launch mytask --properties "app.timestamp.spring.cloud.deployer.foo1=bar1,app.timestamp.foo2=bar2"

37.5 Reviewing Task Executions

Once the task is launched, the state of the task is stored in a relational database. The state includes:

  • Task Name
  • Start Time
  • End Time
  • Exit Code
  • Exit Message
  • Last Updated Time
  • Parameters

A user can check the status of their task executions via the RESTful API or the shell. To display the latest task executions via the shell, use the task execution list command.

To get a list of task executions for just one task definition, add --name and the task definition name, for example: task execution list --name foo. To retrieve full details for a task execution, use the task display command with the id of the task execution, for example: task display --id 549.
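
Task execution state can also be read programmatically from the task repository. The following is a minimal sketch using Spring Cloud Task's TaskExplorer; the class shown here is hypothetical, and the exact TaskExplorer and TaskExecution method names should be verified against the spring-cloud-task version in use:

import org.springframework.cloud.task.repository.TaskExecution;
import org.springframework.cloud.task.repository.TaskExplorer;
import org.springframework.data.domain.PageRequest;

public class TaskExecutionReport {

    // TaskExplorer is auto-configured by Spring Cloud Task and can be injected.
    private final TaskExplorer taskExplorer;

    public TaskExecutionReport(TaskExplorer taskExplorer) {
        this.taskExplorer = taskExplorer;
    }

    public void printLatestExecutions() {
        // Read the first page of recorded executions (PageRequest.of requires
        // Spring Data 2.x; older versions use new PageRequest(0, 20) instead).
        for (TaskExecution execution : taskExplorer.findAll(PageRequest.of(0, 20))) {
            System.out.printf("%s: exit code %s, started %s%n",
                    execution.getTaskName(),
                    execution.getExitCode(),
                    execution.getStartTime());
        }
    }
}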

37.6 Destroying a Task

Destroying a Task Definition will remove the definition from the definition repository. This can be done via the RESTful API or the shell. To destroy a task via the shell, use the task destroy command. For example:

dataflow:>task destroy mytask
 Destroyed task 'mytask'

The task execution information for previously launched tasks for the definition will remain in the task repository.

Note: This does not stop any currently executing tasks for this definition; it only removes the task definition from the database.