It is only necessary to quote parameter values if they contain spaces or the | character. Here, the transform processor is being passed a SpEL expression that will be applied to any data it encounters:
transform --expression='new StringBuilder(payload).reverse()'
If the parameter value needs to embed a single quote, use two single quotes:
// Query is: Select * from /Customers where name='Smith'
scan --query='Select * from /Customers where name=''Smith'''
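To see what the reversal expression above does on its own, outside of any stream, here is a minimal standalone Java sketch that evaluates it with Spring's SpEL parser. The Msg class is only a hypothetical stand-in for the incoming message, so that SpEL can resolve payload; it is not a Data Flow type.

import org.springframework.expression.Expression;
import org.springframework.expression.spel.standard.SpelExpressionParser;

public class ReverseExpressionDemo {

    // Hypothetical stand-in for the incoming message; SpEL resolves 'payload'
    // against this root object's getPayload() accessor.
    public static class Msg {
        public String getPayload() {
            return "hello";
        }
    }

    public static void main(String[] args) {
        Expression exp = new SpelExpressionParser()
                .parseExpression("new StringBuilder(payload).reverse()");
        // Prints "olleh": the expression is evaluated against the payload.
        System.out.println(exp.getValue(new Msg()));
    }
}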
There is a Spring Shell-based client that talks to the Data Flow Server, which is responsible for parsing the DSL. In turn, applications may have application properties that rely on embedded languages, such as the Spring Expression Language.
The shell, Data Flow DSL parser, and SpEL have rules about how they handle quotes and how syntax escaping works. When combined together, confusion may arise. This section explains the rules that apply and provides examples of the most complicated situations you will encounter when all three components are involved.
It is not always that complicated, though. If you do not use the Data Flow shell (for example, if you use the REST API directly), or if application properties are not SpEL expressions, the escaping rules are simpler.
Arguably, the most complex component when it comes to quotes is the shell. The rules can be laid out quite simply, though:
* A shell command is made of keys (--foo) and corresponding values. There is a special, key-less mapping, though, which is described below.
* A value cannot normally contain spaces, unless it is surrounded by quotes (either single ['] or double ["] quotes).
* When surrounded by quotes, a value can embed a literal quote of the same kind by prefixing it with a backslash (\).
* Other escapes are available, such as \t, \n, \r, \f and unicode escapes of the form \uxxxx.
* Finally, key-less arguments are handled in a special way: they do not need quoting to contain spaces.
For example, the shell supports the ! command to execute native shell commands. The ! command accepts a single, key-less argument. This is why the following works:
dataflow:>! rm foo
The argument here is the whole rm foo string, which is passed as-is to the underlying shell.
As another example, the following commands are strictly equivalent, and the argument value is foo (without the quotes):
dataflow:>stream destroy foo
dataflow:>stream destroy --name foo
dataflow:>stream destroy "foo"
dataflow:>stream destroy --name "foo"
At the parser level (that is, inside the body of a stream or task definition), the rules are as follows:
* Option values are normally parsed until the first space character.
* They can be made of literal strings, though, surrounded by single or double quotes.
* To embed such a quote, use two consecutive quotes of the desired kind.
As such, the values of the --expression option to the filter application are semantically equivalent in the following examples:
filter --expression=payload>5
filter --expression="payload>5"
filter --expression='payload>5'
filter --expression='payload > 5'
Arguably, the last one is more readable. It is made possible thanks to the surrounding quotes. The actual expression is payload > 5 (without quotes).
Now, let's imagine we want to test against string messages. If we'd like to compare the payload to the SpEL literal string foo, we could use the following:
filter --expression=payload=='foo'
filter --expression='payload == ''foo'''
filter --expression='payload == "foo"'
In the preceding examples:
* The first form works because there are no spaces. It is not very legible, though.
* The second form uses single quotes to protect the whole argument, so the actual single quotes inside it need to be doubled.
* SpEL recognizes String literals with either single or double quotes, so the last form is arguably the most readable.
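If you want to check the SpEL side of these equivalences in isolation, the following standalone sketch (with a hypothetical Msg class standing in for the incoming message) shows that the single-quoted and double-quoted literals denote the same String:

import org.springframework.expression.Expression;
import org.springframework.expression.spel.standard.SpelExpressionParser;

public class FilterLiteralDemo {

    // Hypothetical stand-in for the incoming message.
    public static class Msg {
        private final String payload;
        Msg(String payload) { this.payload = payload; }
        public String getPayload() { return payload; }
    }

    public static void main(String[] args) {
        SpelExpressionParser parser = new SpelExpressionParser();
        // Single- and double-quoted forms denote the same SpEL String literal.
        Expression single = parser.parseExpression("payload == 'foo'");
        Expression dbl = parser.parseExpression("payload == \"foo\"");
        System.out.println(single.getValue(new Msg("foo"), Boolean.class)); // true
        System.out.println(dbl.getValue(new Msg("foo"), Boolean.class));    // true
        System.out.println(single.getValue(new Msg("bar"), Boolean.class)); // false
    }
}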
Please note that the preceding examples are to be considered outside of the shell (for example, when calling the REST API directly). When entered inside the shell, chances are that the whole stream definition will itself be inside double quotes, which would need escaping. The whole example then becomes the following:
dataflow:>stream create foo --definition "http | filter --expression=payload=='foo' | log"
dataflow:>stream create foo --definition "http | filter --expression='payload == ''foo''' | log"
dataflow:>stream create foo --definition "http | filter --expression='payload == \"foo\"' | log"
The last piece of the puzzle is about SpEL expressions. Many applications accept options that are to be interpreted as SpEL expressions and, as seen above, String literals are handled in a special way there, too. The rules are as follows:
* Literals can be enclosed in either single or double quotes.
* Quotes need to be doubled to embed a literal quote. Single quotes inside double quotes need no special treatment, and vice versa.
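As a purely illustrative sketch (not tied to any particular application), the following shows how SpEL itself applies these rules, including quote doubling and mixing the two kinds of quotes:

import org.springframework.expression.spel.standard.SpelExpressionParser;

public class SpelLiteralQuotingDemo {

    public static void main(String[] args) {
        SpelExpressionParser parser = new SpelExpressionParser();
        // Doubling the delimiter embeds a literal quote of the same kind.
        System.out.println(parser.parseExpression("'It''s'").getValue(String.class));             // It's
        System.out.println(parser.parseExpression("\"say \"\"hi\"\"\"").getValue(String.class));  // say "hi"
        // The other kind of quote needs no special treatment inside a literal.
        System.out.println(parser.parseExpression("'a \"quoted\" word'").getValue(String.class)); // a "quoted" word
    }
}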
As a last example, assume you want to use the transform processor. This processor accepts an expression option, which is a SpEL expression. It is evaluated against the incoming message, with a default of payload (which forwards the message payload untouched).
It is important to understand that the following are equivalent:
transform --expression=payload
transform --expression='payload'
but very different from the following:
transform --expression="'payload'" transform --expression='''payload'''
and other variations.
The first series will simply evaluate to the message payload, while the latter examples will evaluate to the literal string payload (again, without quotes).
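To make the difference concrete, here is a minimal standalone sketch (again with a hypothetical Msg class standing in for the incoming message) that evaluates both forms with the SpEL parser:

import org.springframework.expression.spel.standard.SpelExpressionParser;

public class PayloadVsLiteralDemo {

    // Hypothetical stand-in for the incoming message.
    public static class Msg {
        public String getPayload() {
            return "some data";
        }
    }

    public static void main(String[] args) {
        SpelExpressionParser parser = new SpelExpressionParser();
        // A property reference: evaluates to the message payload.
        System.out.println(parser.parseExpression("payload").getValue(new Msg()));   // some data
        // A String literal: evaluates to the word 'payload' itself.
        System.out.println(parser.parseExpression("'payload'").getValue(new Msg())); // payload
    }
}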
As a last, complete example, let's review how one could force the transformation of all messages to the string literal hello world, by creating a stream in the context of the Data Flow shell:
dataflow:>stream create foo --definition "http | transform --expression='''hello world''' | log"
dataflow:>stream create foo --definition "http | transform --expression='\"hello world\"' | log"
dataflow:>stream create foo --definition "http | transform --expression=\"'hello world'\" | log"
In the preceding examples:
* The first form uses single quotes around the string at the Data Flow parser level, but they need to be doubled because we are inside a string literal (started by the very first single quote after the equals sign).
* The second and third forms use single and double quotes, respectively, to encompass the whole string at the Data Flow parser level. Hence, the other kind of quote can be used inside the string. The whole thing is inside the --definition argument, though, which uses double quotes at the shell level, so those double quotes have to be escaped.