...
[12:27:01 CST(-0600)] <dmccallum54> so in my mind, at this point, the question is whether liquibase really buys us anything, or if it's simpler to just go straight DDL for both framework and staging tables
[12:28:15 CST(-0600)] <dmccallum54> because we don't really have a need for "migrations" in any of our custom tables, and SpringBatch doesn't use liquibase for migrations of its framework tables, I'm actually leaning toward just maintaining possibly db-specific DDL files in our source, using db-specific tools to generate those files from existing, "known-good" SSP installs
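For instance, a Postgres-side capture of staging-table DDL from a known-good install might look like the sketch below (the table name, database name, and output path are hypothetical); a SQL Server install could use SSMS's "Generate Scripts" task for the same purpose:

    pg_dump --schema-only --table=external_person_stage ssp > src/main/resources/ddl/postgres/staging-tables.sql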
[12:30:13 CST(-0600)] <TonyUnicon> ok
[12:31:19 CST(-0600)] <TonyUnicon> so I'll start with that
[12:31:51 CST(-0600)] <dmccallum54> https://github.com/spring-projects/spring-batch/blob/master/spring-batch-samples/src/main/resources/data-source-context.xml
[12:32:16 CST(-0600)] <dmccallum54> here's what i was referring to with the cryptic comment about the "jdbc namespace"
[12:32:35 CST(-0600)] <dmccallum54> see the "jdbc:initialize-database" element at the top?
[12:34:37 CST(-0600)] <TonyUnicon> yeah
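For reference, that element takes a data-source and a list of scripts to run at context startup. A minimal sketch, with hypothetical script locations standing in for the db-specific DDL files discussed above (assumes xmlns:jdbc="http://www.springframework.org/schema/jdbc" is declared on the beans element):

    <jdbc:initialize-database data-source="dataSource">
        <!-- run the framework and staging-table DDL when the context starts -->
        <jdbc:script location="classpath:ddl/${db.vendor}/framework-tables.sql"/>
        <jdbc:script location="classpath:ddl/${db.vendor}/staging-tables.sql"/>
    </jdbc:initialize-database>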
[12:46:34 CST(-0600)] <dmccallum54> TonyUnicon it also occurred to me that if we take this staging table approach, it might make sense to split the "writeFilteredItems" step into 2 steps… one for writing to the stage tables and one for the upserts into the "live" tables
[12:47:18 CST(-0600)] <dmccallum54> the write to the stage tables seems like it's right in SpringBatch's wheelhouse
[12:47:51 CST(-0600)] <TonyUnicon> right
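A rough sketch of that two-step split in Spring Batch job XML (all ids and bean names are hypothetical):

    <batch:job id="importJob">
        <!-- step 1: chunk-oriented write of the filtered items into the stage tables -->
        <batch:step id="writeToStagingTables" next="upsertIntoLiveTables">
            <batch:tasklet>
                <batch:chunk reader="filteredItemReader" writer="stagingTableWriter"
                             commit-interval="100"/>
            </batch:tasklet>
        </batch:step>
        <!-- step 2: custom tasklet that upserts from the stage tables into the live tables -->
        <batch:step id="upsertIntoLiveTables">
            <batch:tasklet ref="upsertTasklet"/>
        </batch:step>
    </batch:job>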
[12:48:00 CST(-0600)] <dmccallum54> the upserts would be more of a custom Tasklet that just executes a couple of db statements
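A minimal sketch of such a Tasklet, assuming hypothetical stage/live table names and Postgres-flavored upsert SQL (the exact statements would be db-specific, per the DDL discussion above):

    import org.springframework.batch.core.StepContribution;
    import org.springframework.batch.core.scope.context.ChunkContext;
    import org.springframework.batch.core.step.tasklet.Tasklet;
    import org.springframework.batch.repeat.RepeatStatus;
    import org.springframework.jdbc.core.JdbcTemplate;

    public class UpsertTasklet implements Tasklet {

        private JdbcTemplate jdbcTemplate;

        public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext)
                throws Exception {
            // update live rows that already have a staged counterpart, keyed on the natural key
            jdbcTemplate.update(
                "update external_person set first_name = s.first_name, last_name = s.last_name"
                + " from external_person_stage s where external_person.school_id = s.school_id");
            // insert staged rows that have no live counterpart yet
            jdbcTemplate.update(
                "insert into external_person (school_id, first_name, last_name)"
                + " select s.school_id, s.first_name, s.last_name from external_person_stage s"
                + " where not exists (select 1 from external_person p where p.school_id = s.school_id)");
            return RepeatStatus.FINISHED;
        }

        public void setJdbcTemplate(JdbcTemplate jdbcTemplate) {
            this.jdbcTemplate = jdbcTemplate;
        }
    }

Wired up as a bean, this would be what the "upsertTasklet" ref in the job sketch above points at.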