Uses of Interface
org.springframework.batch.item.file.transform.LineTokenizer
Packages that use LineTokenizer

  org.springframework.batch.item.file.builder
      Builders for file item readers and writers.
  org.springframework.batch.item.file.mapping
      Infrastructure implementations of I/O file support mapping concerns.
  org.springframework.batch.item.file.transform
      Infrastructure implementations of I/O file support transform concerns.
Uses of LineTokenizer in org.springframework.batch.item.file.builder
Methods in org.springframework.batch.item.file.builder with parameters of type LineTokenizer

  FlatFileItemReaderBuilder<T>  FlatFileItemReaderBuilder.lineTokenizer(LineTokenizer tokenizer)
      A LineTokenizer implementation to be used.
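As a sketch of how this builder method is typically wired (assuming Spring Batch 5.x and Spring Core on the classpath; the reader name, column names, and in-memory CSV data are invented for illustration):

```java
import org.springframework.batch.item.ExecutionContext;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.builder.FlatFileItemReaderBuilder;
import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;
import org.springframework.core.io.ByteArrayResource;

public class BuilderExample {

    // Reads the first "name" column value from a hypothetical in-memory CSV resource.
    static String firstName() throws Exception {
        DelimitedLineTokenizer tokenizer = new DelimitedLineTokenizer();
        tokenizer.setNames("id", "name");

        FlatFileItemReader<String> reader = new FlatFileItemReaderBuilder<String>()
                .name("fruitReader")                              // required: used for restart state keys
                .resource(new ByteArrayResource("1,apple\n2,banana\n".getBytes()))
                .lineTokenizer(tokenizer)                         // the LineTokenizer hook listed above
                .fieldSetMapper(fieldSet -> fieldSet.readString("name"))
                .build();

        reader.open(new ExecutionContext());
        try {
            return reader.read();                                 // item mapped from the first line
        } finally {
            reader.close();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(firstName());
    }
}
```

Passing a LineTokenizer this way is an alternative to the builder's delimited()/fixedLength() shortcuts, and is the route to take when a custom tokenizer is needed.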
Uses of LineTokenizer in org.springframework.batch.item.file.mapping
Methods in org.springframework.batch.item.file.mapping with parameters of type LineTokenizer

  void  DefaultLineMapper.setLineTokenizer(LineTokenizer tokenizer)

Method parameters in org.springframework.batch.item.file.mapping with type arguments of type LineTokenizer

  void  PatternMatchingCompositeLineMapper.setTokenizers(Map<String, LineTokenizer> tokenizers)
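A minimal sketch of the DefaultLineMapper collaboration, assuming Spring Batch 5.x; the two-column record layout and sample line are invented. The mapper delegates to the injected LineTokenizer to split the line into a FieldSet, then to a FieldSetMapper to bind it to an item:

```java
import org.springframework.batch.item.file.mapping.DefaultLineMapper;
import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;

public class MapperExample {

    // Maps one raw line to an item: tokenize first, then bind the FieldSet.
    static String mapName(String line) throws Exception {
        DelimitedLineTokenizer tokenizer = new DelimitedLineTokenizer();
        tokenizer.setNames("id", "name");

        DefaultLineMapper<String> mapper = new DefaultLineMapper<>();
        mapper.setLineTokenizer(tokenizer);                        // the setter listed above
        mapper.setFieldSetMapper(fieldSet -> fieldSet.readString("name"));

        return mapper.mapLine(line, 1);                            // line number is used for error reporting
    }

    public static void main(String[] args) throws Exception {
        System.out.println(mapName("7,plum"));
    }
}
```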
Uses of LineTokenizer in org.springframework.batch.item.file.transform
Classes in org.springframework.batch.item.file.transform that implement LineTokenizer

  AbstractLineTokenizer
      Abstract class handling common concerns of various LineTokenizer implementations, such as dealing with names and the actual construction of a FieldSet.
  DelimitedLineTokenizer
      A LineTokenizer implementation that splits the input String on a configurable delimiter.
  FixedLengthTokenizer
      Tokenizer used to process data obtained from files with a fixed-length format.
  PatternMatchingCompositeLineTokenizer
      A LineTokenizer implementation that stores a mapping of String patterns to delegate LineTokenizers.
  RegexLineTokenizer
      Line tokenizer using a regular expression to filter out data (by using matching and non-matching groups).

Method parameters in org.springframework.batch.item.file.transform with type arguments of type LineTokenizer

  void  PatternMatchingCompositeLineTokenizer.setTokenizers(Map<String, LineTokenizer> tokenizers)
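The composite pattern above can be sketched as follows (assuming Spring Batch 5.x; the "CUST"/"TXN" record prefixes and column names are a hypothetical multi-record layout). Each wildcard pattern selects a delegate tokenizer for the lines it matches:

```java
import java.util.HashMap;
import java.util.Map;

import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;
import org.springframework.batch.item.file.transform.FieldSet;
import org.springframework.batch.item.file.transform.LineTokenizer;
import org.springframework.batch.item.file.transform.PatternMatchingCompositeLineTokenizer;

public class CompositeExample {

    // Tokenizes a "CUST"-prefixed line with the customer layout and reads its name column.
    static String customerName(String line) {
        DelimitedLineTokenizer customer = new DelimitedLineTokenizer();
        customer.setNames("type", "id", "name");
        DelimitedLineTokenizer transaction = new DelimitedLineTokenizer();
        transaction.setNames("type", "amount");

        Map<String, LineTokenizer> tokenizers = new HashMap<>();
        tokenizers.put("CUST*", customer);      // wildcard patterns select the delegate
        tokenizers.put("TXN*", transaction);

        PatternMatchingCompositeLineTokenizer composite = new PatternMatchingCompositeLineTokenizer();
        composite.setTokenizers(tokenizers);    // the setter listed above

        FieldSet fieldSet = composite.tokenize(line);
        return fieldSet.readString("name");
    }

    public static void main(String[] args) {
        System.out.println(customerName("CUST,7,Jane"));
    }
}
```

A line that matches none of the registered patterns causes the composite to fail, so the pattern set should cover every record type in the file.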