Posted to notifications@shardingsphere.apache.org by GitBox <gi...@apache.org> on 2022/10/19 08:40:05 UTC

[GitHub] [shardingsphere] tq02ksu opened a new issue, #21649: mysql batch insert can not work on shardingsphere-jdbc-core 5.2.0

tq02ksu opened a new issue, #21649:
URL: https://github.com/apache/shardingsphere/issues/21649

   ## Bug Report
   
   **For English only**, other languages will not be accepted.
   
   Before reporting a bug, make sure you have:
   
   - Searched open and closed [GitHub issues](https://github.com/apache/shardingsphere/issues).
   - Read documentation: [ShardingSphere Doc](https://shardingsphere.apache.org/document/current/en/overview).
   
   Please pay attention to the issues you submit, because we may need more details.
   If there is no response and we cannot reproduce the issue with the current information, we will **close it**.
   
   Please answer these questions before submitting your issue. Thanks!
   
   ### Which version of ShardingSphere did you use?
   shardingsphere-jdbc-core 5.2.0
   
   ### Which project did you use? ShardingSphere-JDBC or ShardingSphere-Proxy?
   ShardingSphere-JDBC
   
   ### Expected behavior
   I perform DB operations via Hibernate batch insert. I expect the SQL insert to execute without exceptions.
   
   
   
   ### Actual behavior
   I perform DB operations via Hibernate batch insert.
   
   While executing the batch insert, the SQL statement is like `insert into xx (f1,f2,f3) values (?,?,?)`, and an exception like the following is thrown:
   ```
   Caused by: java.sql.SQLException: Parameter index out of range (1 > number of parameters, which is 0).
           at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:129)
           at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:97)
           at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:89)
           at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:63)
   ```
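   
   For context, Hibernate's JDBC batching ultimately goes through `PreparedStatement#addBatch`/`executeBatch`, so the same failure can be reproduced without Hibernate. Below is a minimal sketch (not from the original report): `dataSource` is assumed to be the ShardingSphere data source configured further down, and `xx(f1, f2, f3)` is the hypothetical table from the error message.
   ```java
   import javax.sql.DataSource;
   import java.sql.Connection;
   import java.sql.PreparedStatement;
   import java.sql.SQLException;
   
   public final class BatchInsertRepro {
   
       // Replays the PreparedStatement batching path that Hibernate batch insert relies on.
       public static void runBatchInsert(final DataSource dataSource) throws SQLException {
           String sql = "insert into xx (f1, f2, f3) values (?, ?, ?)";
           try (Connection connection = dataSource.getConnection();
                PreparedStatement statement = connection.prepareStatement(sql)) {
               for (int i = 0; i < 3; i++) {
                   statement.setLong(1, i);
                   statement.setString(2, "value-" + i);
                   statement.setString(3, "value-" + i);
                   statement.addBatch();
               }
               // With 5.2.0 and the sharding configuration below, this is where the
               // "Parameter index out of range (1 > number of parameters, which is 0)" exception surfaces.
               statement.executeBatch();
           }
       }
   }
   ```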
   
   ### Reason analysis (if you can)
   I guess the root cause is that the `ShardingInsertValuesToken` class does not implement the toString() method.
   After reloading the class with a toString() implementation, the problem was resolved.
   ```java
       @Override
       public String toString() {
           return getInsertValues().stream().map(Objects::toString).collect(Collectors.joining(","));
       }
   ```
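   
   To illustrate the suspected mechanism, here is a standalone sketch (not ShardingSphere code; `InsertValuesToken` below is a stand-in class, not the real `ShardingInsertValuesToken`): when the rewriter splices a token into the output SQL by string concatenation and the token class does not override toString(), the `Object` default of `ClassName@hashCode` ends up in the statement text.
   ```java
   public final class MissingToStringDemo {
   
       // Stand-in token that deliberately does NOT override toString(), like the 5.2.0 token class.
       static final class InsertValuesToken {
           final String values = "(?, ?, ?)";
       }
   
       public static void main(final String[] args) {
           InsertValuesToken token = new InsertValuesToken();
           // Concatenation calls String.valueOf(token), which falls back to Object#toString(),
           // so this prints something like "insert into xx (f1,f2,f3) values MissingToStringDemo$InsertValuesToken@1b6d3586"
           // instead of the "(?, ?, ?)" stored in the token.
           System.out.println("insert into xx (f1,f2,f3) values " + token);
       }
   }
   ```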
   
   I debugged the `PrepareStatementImpl` instance within the mysql-connector lib; the actual SQL is
   `insert into xx (f1,f2,f3) values org.apache.shardingsphere.sharding.rewrite.token.pojo.ShardingInsertValuesToken@XXX`
   
   The problem occurs during the SQL rewrite process, and by debugging more deeply I found that it happens when no database sharding is configured.
   
   The invocation path is something like:
   ```
   SQLRouteEngine -> PartialSQLRouteExecutor#route
   ```
   
   
   ### Steps to reproduce the behavior, such as: SQL to execute, sharding rule configuration, when exception occur etc.
   MultiDatabaseShardingSphereDataSource.java
   ```java
   package app.support;
   
   import org.apache.shardingsphere.driver.jdbc.adapter.AbstractDataSourceAdapter;
   import org.apache.shardingsphere.driver.jdbc.context.JDBCContext;
   import org.apache.shardingsphere.driver.state.DriverStateContext;
   import org.apache.shardingsphere.infra.config.database.DatabaseConfiguration;
   import org.apache.shardingsphere.infra.config.database.impl.DataSourceProvidedDatabaseConfiguration;
   import org.apache.shardingsphere.infra.config.mode.ModeConfiguration;
   import org.apache.shardingsphere.infra.config.rule.RuleConfiguration;
   import org.apache.shardingsphere.infra.config.rule.scope.GlobalRuleConfiguration;
   import org.apache.shardingsphere.infra.instance.metadata.InstanceMetaData;
   import org.apache.shardingsphere.infra.instance.metadata.InstanceMetaDataBuilderFactory;
   import org.apache.shardingsphere.mode.manager.ContextManager;
   import org.apache.shardingsphere.mode.manager.ContextManagerBuilderFactory;
   import org.apache.shardingsphere.mode.manager.ContextManagerBuilderParameter;
   
   import javax.sql.DataSource;
   import java.sql.Connection;
   import java.sql.SQLException;
   import java.util.*;
   import java.util.function.Function;
   import java.util.stream.Collectors;
   
   /**
    * ShardingSphere data source supporting multiple databases.
    */
   public class MultiDatabaseShardingSphereDataSource extends AbstractDataSourceAdapter implements AutoCloseable {
   
       private final String databaseName;
   
       private final ContextManager contextManager;
   
       private final JDBCContext jdbcContext;
   
       public MultiDatabaseShardingSphereDataSource(final String databaseName, final ModeConfiguration modeConfig, final Map<String, DataSource> dataSourceMap,
                                       final Collection<RuleConfiguration> ruleConfigs, final Properties props) throws SQLException {
           String[] databaseNames = databaseName.split(",");
           this.databaseName = databaseNames[0];
           contextManager = createContextManager(databaseNames, modeConfig, dataSourceMap, ruleConfigs, null == props ? new Properties() : props);
           jdbcContext = new JDBCContext(contextManager.getDataSourceMap(databaseNames[0]));
       }
   
       private ContextManager createContextManager(final String[] databaseNames, final ModeConfiguration modeConfig, final Map<String, DataSource> dataSourceMap,
                                                   final Collection<RuleConfiguration> ruleConfigs, final Properties props) throws SQLException {
           InstanceMetaData instanceMetaData = InstanceMetaDataBuilderFactory.create("JDBC", -1);
           Collection<RuleConfiguration> globalRuleConfigs = ruleConfigs.stream().filter(each -> each instanceof GlobalRuleConfiguration).collect(Collectors.toList());
           Collection<RuleConfiguration> databaseRuleConfigs = new LinkedList<>(ruleConfigs);
           databaseRuleConfigs.removeAll(globalRuleConfigs);
           Map<String, DatabaseConfiguration> databaseConfigs = Arrays.stream(databaseNames)
                   .collect(Collectors.toMap(Function.identity(),
                           e -> new DataSourceProvidedDatabaseConfiguration(dataSourceMap, databaseRuleConfigs)));
           ContextManagerBuilderParameter parameter = new ContextManagerBuilderParameter(modeConfig, databaseConfigs,
                   globalRuleConfigs, props, Collections.emptyList(), instanceMetaData);
           return ContextManagerBuilderFactory.getInstance(modeConfig).build(parameter);
       }
   
       @Override
       public Connection getConnection() throws SQLException {
           return DriverStateContext.getConnection(databaseName, contextManager, jdbcContext);
       }
   
       @Override
       public Connection getConnection(final String username, final String password) throws SQLException {
           return getConnection();
       }
   
       /**
        * Close data sources.
        *
        * @param dataSourceNames data source names to be closed
        * @throws Exception exception
        */
       public void close(final Collection<String> dataSourceNames) throws Exception {
           Map<String, DataSource> dataSourceMap = contextManager.getDataSourceMap(databaseName);
           for (String each : dataSourceNames) {
               close(dataSourceMap.get(each));
           }
           contextManager.close();
       }
   
       private void close(final DataSource dataSource) throws Exception {
           if (dataSource instanceof AutoCloseable) {
               ((AutoCloseable) dataSource).close();
           }
       }
   
       @Override
       public void close() throws Exception {
           close(contextManager.getDataSourceMap(databaseName).keySet());
       }
   
       @Override
       public int getLoginTimeout() throws SQLException {
           Map<String, DataSource> dataSourceMap = contextManager.getDataSourceMap(databaseName);
           return dataSourceMap.isEmpty() ? 0 : dataSourceMap.values().iterator().next().getLoginTimeout();
       }
   
       @Override
       public void setLoginTimeout(final int seconds) throws SQLException {
           for (DataSource each : contextManager.getDataSourceMap(databaseName).values()) {
               each.setLoginTimeout(seconds);
           }
       }
   }
   ```
   DataSource:
   ```java
       @Bean
       DataSource dataSource(Environment environment) {
           BindResult<Map<String, DataSourceProperties>> bindResult = Binder.get(environment)
                   .bind("abacus.datasource", Bindable.mapOf(String.class, DataSourceProperties.class));
           if (!bindResult.isBound()) {
               throw new IllegalArgumentException("abacus.datasource must be specified");
           }
   
           Map<String, DataSource> dataSourceMap = bindResult.get().entrySet().stream()
                   .collect(Collectors.toMap(Map.Entry::getKey, e -> e.getValue().initializeDataSourceBuilder().build()));
   
           ModeConfiguration modeConfig = new ModeConfiguration("Standalone", null, false);
   
           // table rule:
           ParameterizedType type = TypeUtils.parameterize(Map.class, String.class, TypeUtils.parameterize(List.class, String.class));
           ResolvableType tableRuleConfigType = ResolvableType.forType(type);
           BindResult<Map<String, List<String>>> tableRulesResult = Binder.get(environment).bind("abacus.table-rules",
                   Bindable.of(tableRuleConfigType));
   
           List<ShardingTableRuleConfiguration> tableRules = tableRulesResult.get().entrySet().stream()
                   .flatMap(entry -> {
                       String db = entry.getKey();
                       return entry.getValue().stream().map(table -> new ShardingTableRuleConfiguration(table,
                               db + "." + table.replaceFirst("^([^.]+[.])*", "")));
                   })
                   .collect(Collectors.toList());
   
           BindResult<List<String>> bindingTables = Binder.get(environment).bind("abacus.binding-tables", Bindable.listOf(String.class));
   
           try {
               ShardingRuleConfiguration rule = new ShardingRuleConfiguration();
               rule.setTables(tableRules);
               rule.setBindingTableGroups(bindingTables.get());
               Properties props = new Properties();
               props.put("sql-show", true);
   //            return ShardingSphereDataSourceFactory.createDataSource("abacus", modeConfig, dataSourceMap,
   //                    Collections.singletonList(rule), props);
               return new MultiDatabaseShardingSphereDataSource("abacus,acornhc_account", modeConfig, dataSourceMap,
                       Collections.singletonList(rule), props);
           } catch (SQLException e) {
               throw new RuntimeException(e);
           }
       }
   ```
   
   application.yml:
   ```yaml
   abacus:
     datasource:
       default:
         url: jdbc:mysql://ip1:port/abacus
         username: xx
         password: aa
       my-cat-account:
         url: jdbc:mysql://ip2:port/
         username: xx
         password: xx
     table-rules:
       default:
         - abacus.tag
         - abacus.tag_binding
         - abacus.tag_category
       my-cat-account:
         - member
   
     binding-tables:
       - abacus.tag,abacus.tag_binding
       - abacus.tag_category,abacus.tag
   
   ```
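   
   For clarity, given the `abacus.table-rules` and `abacus.binding-tables` sections above, the bean code effectively builds the following table rules (the `replaceFirst` call strips the `abacus.` prefix, so each actual data node is `<data-source-name>.<table>`). A hand-written equivalent, shown only to make that mapping explicit:
   ```java
   import org.apache.shardingsphere.sharding.api.config.ShardingRuleConfiguration;
   import org.apache.shardingsphere.sharding.api.config.rule.ShardingTableRuleConfiguration;
   
   final class DerivedShardingRule {
   
       // Equivalent of what dataSource(Environment) derives from the application.yml above.
       static ShardingRuleConfiguration build() {
           ShardingRuleConfiguration rule = new ShardingRuleConfiguration();
           rule.getTables().add(new ShardingTableRuleConfiguration("abacus.tag", "default.tag"));
           rule.getTables().add(new ShardingTableRuleConfiguration("abacus.tag_binding", "default.tag_binding"));
           rule.getTables().add(new ShardingTableRuleConfiguration("abacus.tag_category", "default.tag_category"));
           rule.getTables().add(new ShardingTableRuleConfiguration("member", "my-cat-account.member"));
           rule.getBindingTableGroups().add("abacus.tag,abacus.tag_binding");
           rule.getBindingTableGroups().add("abacus.tag_category,abacus.tag");
           return rule;
       }
   }
   ```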
   
   ### Example codes for reproducing this issue (such as a GitHub link).
   



[GitHub] [shardingsphere] TeslaCN closed issue #21649: mysql batch insert can not work on shardingsphere-jdbc-core 5.2.0

Posted by GitBox <gi...@apache.org>.
TeslaCN closed issue #21649: mysql batch insert can not work on shardingsphere-jdbc-core 5.2.0
URL: https://github.com/apache/shardingsphere/issues/21649



[GitHub] [shardingsphere] TeslaCN commented on issue #21649: mysql batch insert can not work on shardingsphere-jdbc-core 5.2.0

Posted by GitBox <gi...@apache.org>.
TeslaCN commented on issue #21649:
URL: https://github.com/apache/shardingsphere/issues/21649#issuecomment-1321536554

   Duplicate of https://github.com/apache/shardingsphere/issues/18456.
   The `toString` stuff has been fixed in 5.2.1.

