2. Deploying Spring Cloud Data Flow Local Server

  1. Download the Spring Cloud Data Flow Server and Shell apps:

    wget http://repo.spring.io/release/org/springframework/cloud/spring-cloud-dataflow-server-local/1.2.3.RELEASE/spring-cloud-dataflow-server-local-1.2.3.RELEASE.jar
    
    wget http://repo.spring.io/release/org/springframework/cloud/spring-cloud-dataflow-shell/1.2.3.RELEASE/spring-cloud-dataflow-shell-1.2.3.RELEASE.jar
  2. Launch the Data Flow Server:

    1. Since the Data Flow Server is a Spring Boot application, you can run it just by using java -jar.

      $ java -jar spring-cloud-dataflow-server-local-1.2.3.RELEASE.jar
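      By default, the server starts on port 9393. If that port is already taken on your machine, you can pass a different port as a standard Spring Boot property (a sketch; 9494 is just an example value):

      $ java -jar spring-cloud-dataflow-server-local-1.2.3.RELEASE.jar --server.port=9494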
  3. Launch the shell:

    $ java -jar spring-cloud-dataflow-shell-1.2.3.RELEASE.jar

    If the Data Flow Server and shell are not running on the same host, point the shell to the Data Flow server URL:

    server-unknown:>dataflow config server http://198.51.100.0
    Successfully targeted http://198.51.100.0
    dataflow:>
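    Alternatively, if your shell version supports the dataflow.uri property, you can target the server directly when launching the shell (a sketch using the same example address as above):

    $ java -jar spring-cloud-dataflow-shell-1.2.3.RELEASE.jar --dataflow.uri=http://198.51.100.0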

    By default, the application registry will be empty. If you would like to register all out-of-the-box stream applications built with the Kafka binder in bulk, you can do so with the following command. For more details, review how to register applications.

    dataflow:>app import --uri http://bit.ly/Bacon-RELEASE-stream-applications-kafka-10-maven
    [Note]Note

    Depending on your environment, you may need to configure the Data Flow Server to point to a custom Maven repository location or configure proxy settings. See Section 2.1, “Maven Configuration” for more information.
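    If you prefer not to import every application in bulk, individual applications can be registered one at a time with the app register command. The Maven coordinates below are illustrative only; substitute the artifact versions that match your environment:

    dataflow:>app register --name http --type source --uri maven://org.springframework.cloud.stream.app:http-source-kafka-10:1.2.0.RELEASE
    dataflow:>app register --name log --type sink --uri maven://org.springframework.cloud.stream.app:log-sink-kafka-10:1.2.0.RELEASE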

  4. You can now use the shell commands to list available applications (sources/processors/sinks) and create streams. For example:

    dataflow:> stream create --name httptest --definition "http --server.port=9000 | log" --deploy
    [Note]Note

    Wait until the apps have actually been deployed successfully before posting data. The log output of the Data Flow server shows the location of the log files for the http and log applications; tail each application's log file to verify that it has started.
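    You can also check the deployment from the shell itself; the stream list command reports each stream's definition and status:

    dataflow:>stream list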

    Now post some data:

dataflow:> http post --target http://localhost:9000 --data "hello world"

Look to see whether hello world ended up in the log file of the log application.
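For example, once you have found the log file location in the Data Flow server output, you can tail it while posting data (the path below is only a placeholder for whatever your server reports):

$ tail -f <log-file-location-reported-by-the-server>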

[Note]Note

When deploying locally, each app (and each app instance, in the case of count > 1) gets a dynamically assigned server.port unless you explicitly assign one with --server.port=x. In both cases, this setting is propagated as a configuration property and overrides any lower-level setting you may have used (e.g. in application.yml files).

[Tip]Tip

In case you encounter unexpected errors when executing shell commands, you can retrieve more detailed error information by setting the exception logging level to WARNING in logback.xml:

<logger name="org.springframework.shell.core.JLineShellComponent.exceptions" level="WARNING"/>

2.1 Maven Configuration

If you want to override specific Maven configuration properties (remote repositories, proxies, and so on) or run the Data Flow Server behind a proxy, you need to specify those properties as command line arguments when starting the Data Flow Server. For example:

$ java -jar spring-cloud-dataflow-server-local-1.2.3.RELEASE.jar --maven.localRepository=mylocal \
--maven.remote-repositories.repo1.url=https://repo1 \
--maven.remote-repositories.repo1.auth.username=user1 \
--maven.remote-repositories.repo1.auth.password=pass1 \
--maven.remote-repositories.repo2.url=https://repo2 --maven.proxy.host=proxy1 \
--maven.proxy.port=9010 --maven.proxy.auth.username=proxyuser1 \
--maven.proxy.auth.password=proxypass1

By default, the protocol is set to http. You can omit the auth properties if the proxy doesn’t need a username and password. Also, maven.localRepository is set to ${user.home}/.m2/repository/ by default. As in the example above, the remote repositories can be specified along with their authentication (if needed). If the remote repositories are behind a proxy, the proxy properties can be specified as shown above.

As these are Spring Boot @ConfigurationProperties, you can also specify them as environment variables, e.g. MAVEN_REMOTE_REPOSITORIES_REPO1_URL. Another common option is to set the properties using the SPRING_APPLICATION_JSON environment variable. An example of how the JSON is structured is shown below:

$ SPRING_APPLICATION_JSON='{ "maven": { "local-repository": null,
"remote-repositories": { "repo1": { "url": "https://repo1", "auth": { "username": "repo1user", "password": "repo1pass" } }, "repo2": { "url": "https://repo2" } },
"proxy": { "host": "proxyhost", "port": 9018, "auth": { "username": "proxyuser", "password": "proxypass" } } } }' java -jar spring-cloud-dataflow-server-local-{project-version}.jar