1. In the following example, we will read data from multiple CSV files and then write it to a Derby database.
1) To read from multiple CSV files, we use "org.springframework.batch.item.file.MultiResourceItemReader" as the reader,
and configure a FlatFileItemReader as its delegate, which reads one CSV file at a time.
2) To write to Derby, we use "org.springframework.batch.item.database.JdbcBatchItemWriter" as the writer,
and configure derbyDataSource as its data source.
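For reference, an input file such as csv/input/domain-1.csv (the file name and values here are illustrative, not taken from the original project) would contain comma-separated id, username, and password fields, matching the column names configured in the tokenizer:

```
1,john doe,secret1
2,jane roe,secret2
3,sam poe,secret3
```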
2. The project consists of the following files.
1) pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>edu.xmu.spring.batch</groupId>
    <artifactId>spring-batch</artifactId>
    <version>0.0.1-SNAPSHOT</version>

    <properties>
        <spring.version>3.2.2.RELEASE</spring.version>
        <spring.batch.version>2.2.0.RELEASE</spring.batch.version>
        <junit.version>4.11</junit.version>
    </properties>

    <dependencies>
        <!-- Spring Core -->
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-core</artifactId>
            <version>${spring.version}</version>
        </dependency>
        <!-- Spring JDBC, for database access -->
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-jdbc</artifactId>
            <version>${spring.version}</version>
        </dependency>
        <!-- Derby database driver -->
        <dependency>
            <groupId>org.apache.derby</groupId>
            <artifactId>derby</artifactId>
            <version>10.10.1.1</version>
        </dependency>
        <!-- Spring Batch dependencies -->
        <dependency>
            <groupId>org.springframework.batch</groupId>
            <artifactId>spring-batch-core</artifactId>
            <version>${spring.batch.version}</version>
        </dependency>
        <dependency>
            <groupId>org.springframework.batch</groupId>
            <artifactId>spring-batch-infrastructure</artifactId>
            <version>${spring.batch.version}</version>
        </dependency>
    </dependencies>
</project>
2) domain.ddl
DROP TABLE ROOT.DOMAIN;

CREATE TABLE ROOT.DOMAIN(
    ID INT NOT NULL,
    USERNAME VARCHAR(100),
    PASSWORD VARCHAR(100),
    DATE_ADDED DATE
);
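One way to execute this DDL is with Derby's interactive ij tool; a possible session, assuming derby.jar and derbytools.jar are on the classpath and the database path matches the one configured in database.xml, looks like this:

```
java -cp derby.jar;derbytools.jar org.apache.derby.tools.ij
ij> connect 'jdbc:derby:C:/Documents and Settings/******/MyDB;create=true';
ij> run 'domain.ddl';
ij> exit;
```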
3) Domain.java
package edu.xmu.spring.batch.model;

import java.util.Date;

public class Domain {
    int id;
    String username;
    String password;
    Date dateAdded;

    public int getId() {
        return id;
    }

    public void setId(int id) {
        this.id = id;
    }

    public String getUsername() {
        return username;
    }

    public void setUsername(String username) {
        this.username = username;
    }

    public String getPassword() {
        return password;
    }

    public void setPassword(String password) {
        this.password = password;
    }

    public Date getDateAdded() {
        return dateAdded;
    }

    public void setDateAdded(Date dateAdded) {
        this.dateAdded = dateAdded;
    }
}
4) context.xml
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.springframework.org/schema/beans
        http://www.springframework.org/schema/beans/spring-beans-3.2.xsd">

    <import resource="database.xml" />
    <import resource="../jobs/spring-batch-job-derby.xml" />

    <bean id="jobLauncher" class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
        <property name="jobRepository" ref="jobRepository" />
    </bean>

    <bean id="jobRepository" class="org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean">
        <property name="transactionManager" ref="transactionManager" />
    </bean>

</beans>
5) database.xml
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
    xmlns:jdbc="http://www.springframework.org/schema/jdbc"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.springframework.org/schema/beans
        http://www.springframework.org/schema/beans/spring-beans-3.2.xsd
        http://www.springframework.org/schema/jdbc
        http://www.springframework.org/schema/jdbc/spring-jdbc-3.2.xsd">

    <!-- connect to database -->
    <bean id="derbyDataSource" class="org.springframework.jdbc.datasource.DriverManagerDataSource">
        <property name="driverClassName" value="org.apache.derby.jdbc.EmbeddedDriver" />
        <property name="url" value="jdbc:derby:C:/Documents and Settings/******/MyDB" />
        <property name="username" value="******" />
        <property name="password" value="******" />
    </bean>

    <bean id="transactionManager"
        class="org.springframework.batch.support.transaction.ResourcelessTransactionManager" />

</beans>
6) spring-batch-job-derby.xml
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
    xmlns:batch="http://www.springframework.org/schema/batch"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.springframework.org/schema/beans
        http://www.springframework.org/schema/beans/spring-beans-3.2.xsd
        http://www.springframework.org/schema/batch
        http://www.springframework.org/schema/batch/spring-batch-2.2.xsd">

    <bean id="domain" class="edu.xmu.spring.batch.model.Domain" scope="prototype" />

    <batch:job id="readMultiFileJob">
        <batch:step id="readAndWriteCsvFile">
            <batch:tasklet>
                <batch:chunk reader="multiResourceReader" writer="derbyItemWriter"
                    processor="flatFileItemProcessor" commit-interval="10" />
            </batch:tasklet>
        </batch:step>
    </batch:job>

    <bean id="multiResourceReader" class="org.springframework.batch.item.file.MultiResourceItemReader">
        <property name="resources" value="file:csv/input/domain-*.csv" />
        <property name="delegate" ref="flatFileItemReader" />
    </bean>

    <bean id="flatFileItemReader" class="org.springframework.batch.item.file.FlatFileItemReader">
        <property name="lineMapper">
            <bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
                <property name="lineTokenizer">
                    <bean class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
                        <property name="names" value="id, username, password" />
                    </bean>
                </property>
                <property name="fieldSetMapper">
                    <bean class="org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper">
                        <property name="prototypeBeanName" value="domain" />
                    </bean>
                </property>
            </bean>
        </property>
    </bean>

    <bean id="flatFileItemProcessor" class="edu.xmu.spring.batch.processor.CustomCSVItemProcessor" />

    <bean id="derbyItemWriter" class="org.springframework.batch.item.database.JdbcBatchItemWriter">
        <property name="dataSource" ref="derbyDataSource" />
        <property name="sql">
            <value>
                <![CDATA[
                    INSERT INTO ROOT.DOMAIN(ID, USERNAME, PASSWORD, DATE_ADDED)
                    VALUES (:id, :username, :password, :dateAdded)
                ]]>
            </value>
        </property>
        <property name="itemSqlParameterSourceProvider">
            <bean class="org.springframework.batch.item.database.BeanPropertyItemSqlParameterSourceProvider" />
        </property>
    </bean>

</beans>
7) CustomCSVItemProcessor.java
package edu.xmu.spring.batch.processor;

import java.util.Calendar;
import java.util.Date;

import org.springframework.batch.item.ItemProcessor;

import edu.xmu.spring.batch.model.Domain;

public class CustomCSVItemProcessor implements ItemProcessor<Domain, Domain> {
    @Override
    public Domain process(Domain item) throws Exception {
        Date date = Calendar.getInstance().getTime();
        item.setUsername(item.getUsername().replace(',', '-').replaceAll(" ", ""));
        item.setPassword(item.getPassword().replace(',', '-'));
        item.setDateAdded(date);
        return item;
    }
}
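The string cleanup performed in process() can be sanity-checked in isolation. The standalone sketch below (not part of the project; the class name and sample value are invented for illustration) applies the same replace/replaceAll calls that the processor uses:

```java
public class CleanupDemo {
    public static void main(String[] args) {
        // Same transformation as CustomCSVItemProcessor applies to usernames:
        // commas become dashes, then all spaces are removed.
        String username = "john, doe";
        String cleaned = username.replace(',', '-').replaceAll(" ", "");
        System.out.println(cleaned); // prints "john-doe"
    }
}
```

Note that replace(char, char) substitutes every occurrence without regex, while replaceAll uses a regex pattern; for a single-space literal the two behave the same here.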
8) App.java
package edu.xmu.spring.batch;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.context.support.ClassPathXmlApplicationContext;

public class App {
    public static void main(String[] args) {
        ClassPathXmlApplicationContext context = new ClassPathXmlApplicationContext(
                new String[] { "spring/batch/config/context.xml" });

        JobLauncher jobLauncher = (JobLauncher) context.getBean("jobLauncher");
        Job job = (Job) context.getBean("readMultiFileJob");

        JobExecution execution;
        try {
            execution = jobLauncher.run(job, new JobParameters());
            System.out.println("Exit Status : " + execution.getStatus());
        } catch (Exception e) {
            e.printStackTrace();
        }
        System.out.println("Done");
        context.close();
    }
}
Reference Links:
1) http://www.mkyong.com/spring-batch/spring-batch-example-csv-file-to-database/