
How to use JdbcTemplate in a multithreaded environment?

I'm trying to use Spring's JdbcTemplate with a SimpleAsyncTaskExecutor so that concurrent connections to the DB can be made and all of the data can be inserted into the related table in less time than in a single-threaded environment.

I'm using the following configuration; however, it doesn't speed up my application.

The only clue I could find is that the bean "campaignProductDBWriter" is constructed only once, whereas I was expecting 10 separate instances to be created since I set "throttle-limit" to 10 in the tasklet.

What am I doing wrong? Any help or suggestions will be greatly appreciated.

Regards,

<bean id="dataSourceProduct"
  class="org.springframework.jdbc.datasource.DriverManagerDataSource"
  p:driverClassName="${jdbc.driverClassName}" p:url="${jdbc.url.product}"
  p:username="${jdbc.username.product}" p:password="${jdbc.password.product}" 
/>

<bean id="jdbcTemplateProduct" class="org.springframework.jdbc.core.JdbcTemplate">
  <property name="dataSource" ref="dataSourceProduct" />
</bean>

<bean id="simpleTaskExecutor" class="org.springframework.core.task.SimpleAsyncTaskExecutor" >
  <property name="concurrencyLimit" value="-1" />
</bean>

<batch:job id="sampleJob" restartable="true"  incrementer="dynamicJobParameters">             
  <batch:step id="mapMZList">
    <batch:tasklet allow-start-if-complete="true" task-executor="simpleTaskExecutor" throttle-limit="10">                     
      <batch:chunk reader="campaignProductItemReader" processor="campaignProductProcessor" writer="campaignProductDBWriter" commit-interval="5000"/>        
    </batch:tasklet>
  </batch:step>                 
</batch:job>

<bean id="campaignProductDBWriter" class="com.falcon.cc.job.step.CampaignProductWriter">
  <property name="jdbcTemplate" ref="jdbcTemplateProduct" />
</bean>


<bean id="campaignProductItemReader" class="com.falcon.cc.job.step.FlatFileSynchronizedItemReader" scope="step">    
  <property name="resource" value="file:#{jobParameters['input.TEST_FILE.path']}"/>

  <property name="lineMapper">
    <bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">        
      <property name="lineTokenizer">       
        <bean class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
          <property name="delimiter" value=";"/>            
          <property name="names" value="approvalStatus,validFrom,validTo"/>
        </bean>
      </property>
      <property name="fieldSetMapper">
        <bean class="com.falcon.cc.mapper.CampaignProductFieldSetMapper" />
      </property>
    </bean>
  </property>
</bean>


This is not a problem with your Spring config, or with how you're using JdbcTemplate, which is just a thin, stateless wrapper around the JDBC API.

The most likely explanation is that your bottleneck is the database, not your code. It's entirely possible that running multiple concurrent operations against the database is no faster than doing them one at a time.

There could be several reasons for this, such as database locking, or just lack of raw I/O performance.

When considering using multi-threading to improve performance, you have to be sure where your bottlenecks are. If your code isn't the bottleneck, then making it multi-threaded isn't going to make things any faster.
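To illustrate the point about JdbcTemplate: the sketch below shows a writer sharing a single JdbcTemplate across the threads the step starts, with no synchronization needed. It is not your actual CampaignProductWriter; the CampaignProduct item type, its getters, and the table and column names are assumptions based on the field names in your reader config. It also inserts each chunk as one JDBC batch, so the database is handed as little per-chunk work as possible; if the step is still slow with that in place, the database itself (locking, raw I/O) is the limiting factor, as described above.

import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

import org.springframework.batch.item.ItemWriter;
import org.springframework.jdbc.core.BatchPreparedStatementSetter;
import org.springframework.jdbc.core.JdbcTemplate;

// Hypothetical item type matching the fields mapped by the reader.
class CampaignProduct {
    private String approvalStatus;
    private String validFrom;
    private String validTo;
    public String getApprovalStatus() { return approvalStatus; }
    public String getValidFrom() { return validFrom; }
    public String getValidTo() { return validTo; }
}

public class CampaignProductWriter implements ItemWriter<CampaignProduct> {

    // A single shared JdbcTemplate is safe here: it keeps no per-call state and
    // obtains a connection from the DataSource for each operation.
    private JdbcTemplate jdbcTemplate;

    public void setJdbcTemplate(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    public void write(final List<? extends CampaignProduct> items) throws Exception {
        // Insert the whole chunk as one JDBC batch instead of one statement per item.
        jdbcTemplate.batchUpdate(
                "INSERT INTO campaign_product (approval_status, valid_from, valid_to) VALUES (?, ?, ?)",
                new BatchPreparedStatementSetter() {
                    public void setValues(PreparedStatement ps, int i) throws SQLException {
                        CampaignProduct item = items.get(i);
                        ps.setString(1, item.getApprovalStatus());
                        ps.setString(2, item.getValidFrom());
                        ps.setString(3, item.getValidTo());
                    }
                    public int getBatchSize() {
                        return items.size();
                    }
                });
    }
}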


When Spring's context is initialized, it creates all the singleton beans declared in it. The definition

<bean id="campaignProductDBWriter" class="com.falcon.cc.job.step.CampaignProductWriter">
  <property name="jdbcTemplate" ref="jdbcTemplateProduct" />
</bean>

therefore results in Spring creating exactly one instance of CampaignProductWriter, because the default bean scope is singleton. To get a new instance of the bean per request, its scope has to be prototype.
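A quick way to see the difference is to look the bean up twice from the context, as in the small sketch below (the configuration file name is a placeholder for your actual file): with the default singleton scope both lookups return the same object, whereas with scope="prototype" on the bean definition each getBean() call produces a new instance.

import org.springframework.context.support.ClassPathXmlApplicationContext;

public class ScopeCheck {
    public static void main(String[] args) {
        // "job-context.xml" is a placeholder for your actual configuration file.
        ClassPathXmlApplicationContext ctx =
                new ClassPathXmlApplicationContext("job-context.xml");

        Object w1 = ctx.getBean("campaignProductDBWriter");
        Object w2 = ctx.getBean("campaignProductDBWriter");

        // Prints "true" with the default singleton scope;
        // prints "false" once the bean is declared with scope="prototype".
        System.out.println(w1 == w2);

        ctx.close();
    }
}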

