Dynamically processing multiple batch files and generating corresponding output files with Spring Batch
I've recently started learning Spring Batch in order to make use of some of its more advanced features like asynchronous batch processing, job stopping, and scheduling to replace some existing batch processing functionality, and implement new batch processing functionality. Right now, I'm trying to figure out how to dynamically process multiple batch files and generate a "receipt" file for each input file, and I'm confused by some of the design decisions of the Spring Batch architects. It seems like in order to process a simple flat CSV file and generate the output, I'm going to have to manually hack the beans in my ApplicationContext and manually set their "resource" properties at runtime in order to achieve what I want with the FlatFileItemReader and FlatFileItemWriter. This is neither safe nor good practice for something that is allegedly a multi-threaded, high-performance batch processing framework. Am I just missing something?
As it turns out, I was misunderstanding some of the documentation. Spring Batch introduces an extra bean scope, "step", which creates a new instance of the reader and writer beans named in the step configuration for each step execution, and late-binds their properties using Spring EL (for example, pulling a file path out of the job parameters). So the resource can be set per execution without hacking the ApplicationContext. As for getting the multiple batches going, I just wound up configuring a single job definition and pushing the multiple-file handling into code, launching the job once per input file with different job parameters.
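A minimal sketch of that idea, assuming Java-based configuration, a pass-through String item type, and job parameter names of my own choosing ("inputFile", "receiptFile"); it is an illustration of step scope and late binding, not the original poster's actual configuration:

import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.FlatFileItemWriter;
import org.springframework.batch.item.file.mapping.PassThroughLineMapper;
import org.springframework.batch.item.file.transform.PassThroughLineAggregator;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.FileSystemResource;

@Configuration
public class ReceiptJobConfig {

    // Step-scoped: a fresh reader is created for every step execution,
    // and the resource path is resolved from the job parameters at that point.
    @Bean
    @StepScope
    public FlatFileItemReader<String> reader(
            @Value("#{jobParameters['inputFile']}") String inputFile) {
        FlatFileItemReader<String> reader = new FlatFileItemReader<>();
        reader.setResource(new FileSystemResource(inputFile));
        reader.setLineMapper(new PassThroughLineMapper()); // each CSV line as a raw String
        return reader;
    }

    // Same pattern for the "receipt" output file.
    @Bean
    @StepScope
    public FlatFileItemWriter<String> writer(
            @Value("#{jobParameters['receiptFile']}") String receiptFile) {
        FlatFileItemWriter<String> writer = new FlatFileItemWriter<>();
        writer.setResource(new FileSystemResource(receiptFile));
        writer.setLineAggregator(new PassThroughLineAggregator<>());
        return writer;
    }
}

The "multiple batch handling in code" part then amounts to looping over the input files and calling jobLauncher.run(...) once per file, passing the input and receipt paths as job parameters so each execution gets its own reader and writer instances.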
Seems like you might use the MultiResourceItemReader to specify multiple resources and set the FlatFileItemReader as the MultiResourceItemReader's delegate.
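A minimal sketch of that suggestion, assuming Java config, a pass-through String item type, and a hypothetical "file:input/*.csv" pattern of my own choosing:

import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.MultiResourceItemReader;
import org.springframework.batch.item.file.mapping.PassThroughLineMapper;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.core.io.Resource;

public class MultiFileReaderConfig {

    @Bean
    public MultiResourceItemReader<String> multiReader(
            @Value("file:input/*.csv") Resource[] inputFiles) {
        // The delegate knows how to parse one flat file at a time.
        FlatFileItemReader<String> delegate = new FlatFileItemReader<>();
        delegate.setLineMapper(new PassThroughLineMapper());

        // The MultiResourceItemReader hands each matched resource to the delegate in turn.
        MultiResourceItemReader<String> reader = new MultiResourceItemReader<>();
        reader.setResources(inputFiles);
        reader.setDelegate(delegate);
        return reader;
    }
}

One caveat worth noting: this reads all the matched files as a single item stream, so on its own it does not produce one receipt file per input file; for that, the step-scoped one-job-execution-per-file approach described above is a closer fit.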