Running a task application within Spring Cloud Data Flow goes through a slightly different lifecycle than running a stream application. Both types of applications need to be registered with the appropriate artifact coordinates. Both need a definition created via the SCDF DSL. However, that’s where the similarities end.
With stream-based applications, you "deploy" them with the intent that they run until they are undeployed. A stream definition is deployed only once (it can be scaled, but deployed only as one instance of the stream as a whole). Tasks, by contrast, are launched. A single task definition can be launched many times. With each launch, the task starts, executes, and shuts down, and PCF cleans up the resources once the shutdown has occurred. The following sections outline the process of creating, launching, destroying, and viewing tasks.
As with streams, creating a task application is done via the SCDF DSL or through the dashboard. To create a task definition in SCDF, you either develop a task application or use one of the out-of-the-box task app-starters. The Maven coordinates of the task application must be registered with SCDF. For details on how to register task applications, see the register task applications section of the core documentation.
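For reference, registration from the SCDF shell looks like the following sketch. The group and artifact coordinates are those of the timestamp app-starter; the version placeholder is intentional, so substitute the release available in your environment:

```
dataflow:>app register --name timestamp --type task --uri maven://org.springframework.cloud.task.app:timestamp-task:<version>
```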
Let’s see an example that uses the out-of-the-box timestamp task application.
```
dataflow:>task create --name foo --definition "timestamp"
Created new task 'foo'
```
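The timestamp application itself does nothing more than log the current date and time and exit. Conceptually, its core behavior is equivalent to this one-line sketch (shown here with standard `date` syntax rather than the Java `yyyy-MM-dd HH:mm:ss` pattern the starter uses):

```shell
# Print the current date and time, then exit -- the essence of the timestamp task.
date '+%Y-%m-%d %H:%M:%S'
```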
| Note |
| --- |
| Tasks in SCDF do not require explicit deployment. Instead, they are launched, and there are several ways to launch them; refer to this section for more details. |
Unlike streams, tasks in SCDF require an explicit launch trigger, or they can be kicked off manually.
```
dataflow:>task launch foo
Launched task 'foo'
```
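The `task launch` command also accepts optional deployment properties and command-line arguments at launch time. A hedged sketch follows; the `app.timestamp.format` property assumes the timestamp starter exposes a `format` option, so check your starter's documentation before relying on it:

```
dataflow:>task launch foo --properties "app.timestamp.format=yyyy" --arguments "--logging.level.root=DEBUG"
```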
As previously mentioned, the cf CLI is the way to interact with tasks on PCF, including viewing their logs. To view the logs while a task is executing, use the following command, where `foo` is the name of the task you are executing:
```
cf v3-logs foo
Tailing logs for app foo...
....
....
....
....
2016-08-19T09:44:49.11-0700 [APP/TASK/bar1/0]OUT 2016-08-19 16:44:49.111  INFO 7 --- [           main] o.s.c.t.a.t.TimestampTaskApplication     : Started TimestampTaskApplication in 2.734 seconds (JVM running for 3.288)
2016-08-19T09:44:49.13-0700 [APP/TASK/bar1/0]OUT Exit status 0
2016-08-19T09:44:49.19-0700 [APP/TASK/bar1/0]OUT Destroying container
2016-08-19T09:44:50.41-0700 [APP/TASK/bar1/0]OUT Successfully destroyed container
```
| Note |
| --- |
| Logs are only viewable through the cf CLI while the app is running. Historic logs are not available. |
Listing tasks is as simple as:
```
dataflow:>task list
╔══════════════════════╤═════════════════════════╤═══════════╗
║      Task Name       │     Task Definition     │Task Status║
╠══════════════════════╪═════════════════════════╪═══════════╣
║foo                   │timestamp                │complete   ║
╚══════════════════════╧═════════════════════════╧═══════════╝
```
If you’d like to view the execution details of the launched task, you could do the following.
```
dataflow:>task execution list
╔════════════════════════╤══╤═════════════════════════╤═════════════════════════╤════════╗
║       Task Name        │ID│       Start Time        │        End Time         │  Exit  ║
║                        │  │                         │                         │  Code  ║
╠════════════════════════╪══╪═════════════════════════╪═════════════════════════╪════════╣
║foo                     │1 │Fri Aug 19 09:44:49 PDT  │Fri Aug 19 09:44:49 PDT  │0       ║
╚════════════════════════╧══╧═════════════════════════╧═════════════════════════╧════════╝
```
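To drill into a single execution (its arguments, exit message, and timing), the SCDF shell also offers a status command keyed by the execution ID shown in the list above. This is a sketch; the exact command name and options vary across SCDF versions, so verify against your shell's `help`:

```
dataflow:>task execution status --id 1
```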
Destroying the task application from SCDF removes the task definition from the task repository.
```
dataflow:>task destroy foo
Destroyed task 'foo'
dataflow:>task list
╔═════════╤═══════════════╤═══════════╗
║Task Name│Task Definition│Task Status║
╚═════════╧═══════════════╧═══════════╝
```
Currently, Spring Cloud Data Flow does not delete tasks deployed on a Cloud Foundry instance once they have been pushed. The only way to do this today is via the cf CLI, on a Cloud Foundry instance version 1.9 or above, in two steps:

1. Obtain a list of the apps by using the `cf apps` command.
2. Delete the task application by using the `cf delete <task-name>` command.