Download the Spring Cloud Data Flow Server and Shell apps:
wget http://repo.spring.io/milestone/org/springframework/cloud/spring-cloud-dataflow-server-local/1.1.0.M2/spring-cloud-dataflow-server-local-1.1.0.M2.jar
wget http://repo.spring.io/milestone/org/springframework/cloud/spring-cloud-dataflow-shell/1.1.0.M2/spring-cloud-dataflow-shell-1.1.0.M2.jar
Launch the Data Flow Server
Since the Data Flow Server is a Spring Boot application, you can run it just by using java -jar.
$ java -jar spring-cloud-dataflow-server-local-1.1.0.M2.jar
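As a quick sanity check, you can hit the server's REST API root; the local server listens on port 9393 by default, and the root resource returns a list of links to the API endpoints (exact output varies by version):

$ curl http://localhost:9393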
Running with Custom Maven Settings and/or Behind a Proxy

If you want to override specific maven configuration properties (remote repositories, etc.) and/or run the Data Flow Server behind a proxy, you need to specify those properties as command line arguments when starting the Data Flow Server. For example:
$ java -jar spring-cloud-dataflow-server-local-1.1.0.M2.jar --maven.localRepository=mylocal \
--maven.remote-repositories.repo1.url=https://repo1 \
--maven.remote-repositories.repo1.auth.username=user1 \
--maven.remote-repositories.repo1.auth.password=pass1 \
--maven.remote-repositories.repo2.url=https://repo2 \
--maven.proxy.host=proxy1 --maven.proxy.port=9010 \
--maven.proxy.auth.username=proxyuser1 --maven.proxy.auth.password=proxypass1
By default, the protocol is set to http. You can omit the auth properties if the proxy doesn't need a username and password.

By default, the maven localRepository is set to ${user.home}/.m2/repository/, and repo.spring.io/libs-snapshot will be the only remote repository. As in the example above, remote repositories can be specified along with their authentication (if needed). If the remote repositories are behind a proxy, the proxy properties can be specified as above.
If you want to pass these properties through the environment instead, set them in the SPRING_APPLICATION_JSON environment variable, as shown below:
$ SPRING_APPLICATION_JSON='{ "maven": { "local-repository": null,
"remote-repositories": { "repo1": { "url": "https://repo1", "auth": { "username": "repo1user", "password": "repo1pass" } }, "repo2": { "url": "https://repo2" } },
"proxy": { "host": "proxyhost", "port": 9018, "auth": { "username": "proxyuser", "password": "proxypass" } } } }' java -jar spring-cloud-dataflow-server-local-{project-version}.jar
Launch the shell:
$ java -jar spring-cloud-dataflow-shell-1.1.0.M2.jar
If the Data Flow Server and shell are not running on the same host, point the shell to the Data Flow server:
server-unknown:>dataflow config server http://dataflow-server.cfapps.io
Successfully targeted http://dataflow-server.cfapps.io
dataflow:>
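If your server is instead running locally, target it the same way; 9393 is the server's default port, so adjust the URL if you changed it:

server-unknown:>dataflow config server http://localhost:9393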
By default, the application registry is empty. If you would like to register all out-of-the-box stream applications built with the Kafka binder in bulk, you can do so with the following command. For more details, review how to register applications.
dataflow:>app import --uri http://bit.ly/1-0-4-GA-stream-applications-kafka-maven
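To verify the import, you can list the registered applications; the command below prints a table of sources, processors, and sinks, with the exact contents depending on the bundle you imported:

dataflow:>app list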
You can now use the shell commands to list available applications (sources/processors/sinks) and create streams. For example:
dataflow:> stream create --name httptest --definition "http --server.port=9000 | log" --deploy
Note: You will need to wait a little while until the apps are actually deployed successfully before posting data. Look in the log file of the Data Flow server for the location of the log files for the http and log applications.
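One way to tell when the stream is up, rather than guessing, is the stream list command; the status column should eventually report the stream as deployed (exact wording may vary across versions):

dataflow:>stream list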
Now post some data:
dataflow:> http post --target http://localhost:9000 --data "hello world"
Look to see if hello world ended up in the log files for the log application.
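The actual log directory is generated at deployment time under the system temp directory, so the path below is purely hypothetical; the Data Flow server's console output shows the real location. Tailing the log sink's stdout would then look something like:

$ tail -f /tmp/spring-cloud-dataflow-123456/httptest.log/stdout_0.log   # hypothetical path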
Note: When deploying locally, each app (and each app instance, in case of count > 1) gets a dynamically assigned server.port unless you explicitly assign one with --server.port=x, as is done for the http source in the stream definition above.
Tip: In case you encounter unexpected errors when executing shell commands, you can retrieve more detailed error information by setting the exception logging level to WARNING in logback.xml:

<logger name="org.springframework.shell.core.JLineShellComponent.exceptions" level="WARNING"/>
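A minimal logback.xml carrying that logger might look like the sketch below; including Spring Boot's base configuration is one common baseline, and your own setup may differ:

<configuration>
    <!-- Pull in Spring Boot's default logging setup (one common baseline) -->
    <include resource="org/springframework/boot/logging/logback/base.xml"/>
    <!-- Surface shell exceptions at WARNING so failed commands print details -->
    <logger name="org.springframework.shell.core.JLineShellComponent.exceptions" level="WARNING"/>
</configuration>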