I have a Spring Batch job with a sequence of Steps (each with an ItemReader, an optional ItemProcessor, and an ItemWriter). There are multiple Steps, and most of them use FlatFileItemReader/FlatFileItemWriter to read a CSV file and perform some operation on each item (a single row).
Now I need to read a CSV file as a whole and build a Map object that the later batch Steps need. I was thinking of using a custom Tasklet for this: read the CSV file, build the Map, and put it in the ExecutionContext so the later Steps can read it. Is this a good solution? If anybody has done something similar, I would appreciate any input.
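For reference, the core of that Tasklet is just the file-to-Map parsing, which the Tasklet's execute() method would run before storing the result in the ExecutionContext. A minimal plain-Java sketch of that parsing step (the two-column layout and the key-in-column-0 assumption are made up for illustration):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.util.LinkedHashMap;
import java.util.Map;

public class CsvToMap {
    // Parse simple "key,value" rows into a Map. Inside a custom Tasklet this
    // would run in execute(), and the resulting Map would be put into the
    // job ExecutionContext for later Steps to read.
    static Map<String, String> readCsv(BufferedReader reader) throws IOException {
        Map<String, String> map = new LinkedHashMap<>();
        String line;
        while ((line = reader.readLine()) != null) {
            if (line.trim().isEmpty()) {
                continue;                      // skip blank lines
            }
            String[] cols = line.split(",", 2); // assumes a two-column file
            if (cols.length < 2) {
                continue;                      // skip malformed rows
            }
            map.put(cols[0].trim(), cols[1].trim());
        }
        return map;
    }

    public static void main(String[] args) throws IOException {
        String csv = "id1,alpha\nid2,beta\n";
        Map<String, String> map = readCsv(new BufferedReader(new StringReader(csv)));
        System.out.println(map); // {id1=alpha, id2=beta}
    }
}
```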
Mark Spritzler wrote:Yes, you can implement a custom ItemWriter that just writes to that Map object, then put that Map into the JobRepository so that it is available to other steps.
Remember that the job repository (and I'm assuming you mean the ExecutionContext, either Step- or Job-scoped) gets committed to the database at each commit interval (depending on the type, Step or Job), so you're effectively writing the entire Map to the database at every commit point.
That may or may not be a problem, depending on how big your file is.
An alternative would be to stage the records in a database table and then update them as needed as you move through the steps ...
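If you do go the custom-ItemWriter route, Spring Batch ships an ExecutionContextPromotionListener for exactly this hand-off: the writer puts the Map into the *step* ExecutionContext (it can grab the StepExecution via an @BeforeStep method), and the listener copies the named key up to the *job* ExecutionContext when the step finishes, so later Steps can read it. A sketch of the listener configuration (the "rowMap" key is a hypothetical name, not something from the framework):

```java
import org.springframework.batch.core.listener.ExecutionContextPromotionListener;

// Registered on the step that builds the Map. After the step completes,
// the listener promotes the "rowMap" entry from the step ExecutionContext
// to the job ExecutionContext, where subsequent steps can look it up.
public ExecutionContextPromotionListener promotionListener() {
    ExecutionContextPromotionListener listener = new ExecutionContextPromotionListener();
    listener.setKeys(new String[] { "rowMap" }); // "rowMap" is a made-up key
    return listener;
}
```

Note the serialization caveat above still applies: whatever goes into the ExecutionContext gets persisted to the job repository, so a large Map means a large serialized blob per commit.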