Posted to notifications@shardingsphere.apache.org by GitBox <gi...@apache.org> on 2022/05/30 12:59:47 UTC

[GitHub] [shardingsphere] zjcnb opened a new issue, #18075: Scaling throws exception when ALTER SHARDING TABLE RULE

zjcnb opened a new issue, #18075:
URL: https://github.com/apache/shardingsphere/issues/18075

   ## Bug Report
   
   **For English only**; reports in other languages will not be accepted.
   
   Before reporting a bug, make sure you have:
   
   - Searched open and closed [GitHub issues](https://github.com/apache/shardingsphere/issues).
   - Read documentation: [ShardingSphere Doc](https://shardingsphere.apache.org/document/current/en/overview).
   
   Please pay attention to the issues you submit, because we may need more details. 
   If there is no further response and we cannot reproduce the problem with the current information, we will **close it**.
   
   Please answer these questions before submitting your issue. Thanks!
   
   ### Which version of ShardingSphere did you use?
   
   `master`
   
   ### Which project did you use? ShardingSphere-JDBC or ShardingSphere-Proxy?
   
   `ShardingSphere-Proxy`
   
   ### Expected behavior
   
   No error message
   
   ### Actual behavior
   
   ```
   [INFO ] 2022-05-30 20:50:14.838 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-2] o.a.s.d.p.c.i.AbstractImporter - write, get FinishedRecord, break
   [INFO ] 2022-05-30 20:50:14.838 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-2] o.a.s.d.p.c.i.AbstractImporter - importer write done, rowCount=2, finishedByBreak=true
   [INFO ] 2022-05-30 20:50:14.838 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-4] o.a.s.d.p.core.task.InventoryTask - importer onSuccess, taskId=ds_0.t_order_0#0
   [INFO ] 2022-05-30 20:50:14.839 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-5] o.a.s.d.p.core.task.InventoryTask - importer onSuccess, taskId=ds_0.t_order_item_0#0
   [INFO ] 2022-05-30 20:50:14.839 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-0] o.a.s.d.p.core.task.InventoryTask - importer future done
   [INFO ] 2022-05-30 20:50:14.839 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-2] o.a.s.d.p.core.task.InventoryTask - importer future done
   [INFO ] 2022-05-30 20:50:14.841 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-3] o.a.s.d.p.c.i.AbstractImporter - write, get FinishedRecord, break
   [INFO ] 2022-05-30 20:50:14.841 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-1] o.a.s.d.p.c.i.AbstractImporter - write, get FinishedRecord, break
   [INFO ] 2022-05-30 20:50:14.841 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-3] o.a.s.d.p.c.i.AbstractImporter - importer write done, rowCount=3, finishedByBreak=true
   [INFO ] 2022-05-30 20:50:14.841 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-1] o.a.s.d.p.c.i.AbstractImporter - importer write done, rowCount=3, finishedByBreak=true
   [INFO ] 2022-05-30 20:50:14.842 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-6] o.a.s.d.p.core.task.InventoryTask - importer onSuccess, taskId=ds_0.t_order_item_2#0
   [INFO ] 2022-05-30 20:50:14.842 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-7] o.a.s.d.p.core.task.InventoryTask - importer onSuccess, taskId=ds_0.t_order_2#0
   [INFO ] 2022-05-30 20:50:14.842 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-3] o.a.s.d.p.core.task.InventoryTask - importer future done
   [INFO ] 2022-05-30 20:50:14.842 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-1] o.a.s.d.p.core.task.InventoryTask - importer future done
   [INFO ] 2022-05-30 20:50:14.843 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-6] o.a.s.d.p.s.r.RuleAlteredJobScheduler - onSuccess, all inventory tasks finished.
   [INFO ] 2022-05-30 20:50:14.843 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-7] o.a.s.d.p.s.r.RuleAlteredJobScheduler - onSuccess, all inventory tasks finished.
   [INFO ] 2022-05-30 20:50:14.843 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-6] o.a.s.d.p.s.r.RuleAlteredJobScheduler - -------------- Start incremental task --------------
   [INFO ] 2022-05-30 20:50:14.844 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-7] o.a.s.d.p.s.r.RuleAlteredJobScheduler - job status already EXECUTE_INCREMENTAL_TASK, ignore
   [INFO ] 2022-05-30 20:50:14.844 [ShardingSphere-Scaling-Incremental-0130317c30317c3054317c74657374-0] o.a.s.d.p.a.e.AbstractLifecycleExecutor - start lifecycle executor: org.apache.shardingsphere.data.pipeline.core.task.IncrementalTask@529fa392
   [INFO ] 2022-05-30 20:50:14.845 [ShardingSphere-Scaling-Incremental-0130317c30317c3054317c74657374-1] o.a.s.d.p.a.e.AbstractLifecycleExecutor - start lifecycle executor: org.apache.shardingsphere.data.pipeline.mysql.importer.MySQLImporter@647941e6
   [INFO ] 2022-05-30 20:50:14.845 [ShardingSphere-Scaling-Incremental-0130317c30317c3054317c74657374-1] o.a.s.d.p.c.i.AbstractImporter - importer write
   [INFO ] 2022-05-30 20:50:14.845 [ShardingSphere-Scaling-Incremental-0130317c30317c3054317c74657374-2] o.a.s.d.p.a.e.AbstractLifecycleExecutor - start lifecycle executor: org.apache.shardingsphere.data.pipeline.mysql.importer.MySQLImporter@28e949f3
   [INFO ] 2022-05-30 20:50:14.845 [ShardingSphere-Scaling-Incremental-0130317c30317c3054317c74657374-2] o.a.s.d.p.c.i.AbstractImporter - importer write
   [INFO ] 2022-05-30 20:50:14.845 [ShardingSphere-Scaling-Incremental-0130317c30317c3054317c74657374-3] o.a.s.d.p.a.e.AbstractLifecycleExecutor - start lifecycle executor: org.apache.shardingsphere.data.pipeline.mysql.importer.MySQLImporter@74293cbf
   [INFO ] 2022-05-30 20:50:14.845 [ShardingSphere-Scaling-Incremental-0130317c30317c3054317c74657374-3] o.a.s.d.p.c.i.AbstractImporter - importer write
   [INFO ] 2022-05-30 20:50:14.846 [ShardingSphere-Scaling-Incremental-0130317c30317c3054317c74657374-0] o.a.s.d.p.a.e.AbstractLifecycleExecutor - start lifecycle executor: org.apache.shardingsphere.data.pipeline.mysql.ingest.MySQLIncrementalDumper@7904bb3e
   [INFO ] 2022-05-30 20:50:14.846 [ShardingSphere-Scaling-Incremental-0130317c30317c3054317c74657374-0] o.a.s.d.p.m.i.MySQLIncrementalDumper - incremental dump, jdbcUrl=jdbc:mysql://192.168.10.18:33307/original_1?rewriteBatchedStatements=true&serverTimezone=UTC&yearIsDateType=false&useSSL=false
   [ERROR] 2022-05-30 20:50:14.906 [nioEventLoopGroup-4-1] o.a.s.d.p.m.i.client.MySQLClient - protocol resolution error
   io.netty.handler.codec.DecoderException: java.lang.UnsupportedOperationException: Do not parse binlog event fully, eventHeader: MySQLBinlogEventHeader(timestamp=1653560367, eventType=15, serverId=133307, eventSize=121, logPos=0, flags=0), remaining packet 3a9d9af2
   	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:480)
   	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:279)
   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
   	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
   	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:327)
   	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:314)
   	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:435)
   	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:279)
   	at io.netty.handler.codec.ByteToMessageCodec.channelRead(ByteToMessageCodec.java:103)
   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
   	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
   	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
   	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
   	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
   	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722)
   	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658)
   	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584)
   	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
   	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
   	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
   	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
   	at java.base/java.lang.Thread.run(Thread.java:833)
   Caused by: java.lang.UnsupportedOperationException: Do not parse binlog event fully, eventHeader: MySQLBinlogEventHeader(timestamp=1653560367, eventType=15, serverId=133307, eventSize=121, logPos=0, flags=0), remaining packet 3a9d9af2
   	at org.apache.shardingsphere.data.pipeline.mysql.ingest.client.netty.MySQLBinlogEventPacketDecoder.decode(MySQLBinlogEventPacketDecoder.java:89)
   	at io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:510)
   	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:449)
   	... 25 common frames omitted
   [INFO ] 2022-05-30 20:50:14.906 [nioEventLoopGroup-4-1] o.a.s.d.p.m.i.client.MySQLClient - reconnect mysql client.
   
   
   
   
   
   [INFO ] 2022-05-30 20:50:16.496 [Thread-9] o.a.s.d.p.s.r.RuleAlteredJobPreparer - unlocked, lockName=prepare-0130317c30317c3054317c74657374
   [INFO ] 2022-05-30 20:50:16.517 [Thread-9] o.a.s.d.p.c.m.l.PipelineTableMetaDataLoader - loadTableMetaData, schemaNameFinal=null, tableNamePattern=t_order_1, result={t_order_1=PipelineTableMetaData(name=t_order_1, columnMetaDataMap={order_id=PipelineColumnMetaData(ordinalPosition=1, name=order_id, dataType=4, dataTypeName=INT, nullable=false, primaryKey=true), user_id=PipelineColumnMetaData(ordinalPosition=2, name=user_id, dataType=4, dataTypeName=INT, nullable=false, primaryKey=false), status=PipelineColumnMetaData(ordinalPosition=3, name=status, dataType=12, dataTypeName=VARCHAR, nullable=true, primaryKey=false)}, columnNames=[order_id, user_id, status], primaryKeyColumns=[order_id], uniqueIndexes=[org.apache.shardingsphere.data.pipeline.core.metadata.model.PipelineIndexMetaData@106fc0bf])}, cost time=14 ms
   [INFO ] 2022-05-30 20:50:16.519 [Thread-9] o.a.s.d.p.s.r.p.InventoryTaskSplitter - getPositionByPrimaryKeyRange, endId is 0, break, tableName=t_order_1, primaryKey=order_id, beginId=6
   [INFO ] 2022-05-30 20:50:16.530 [Thread-9] o.a.s.d.p.c.m.l.PipelineTableMetaDataLoader - loadTableMetaData, schemaNameFinal=null, tableNamePattern=t_order_3, result={t_order_3=PipelineTableMetaData(name=t_order_3, columnMetaDataMap={order_id=PipelineColumnMetaData(ordinalPosition=1, name=order_id, dataType=4, dataTypeName=INT, nullable=false, primaryKey=true), user_id=PipelineColumnMetaData(ordinalPosition=2, name=user_id, dataType=4, dataTypeName=INT, nullable=false, primaryKey=false), status=PipelineColumnMetaData(ordinalPosition=3, name=status, dataType=12, dataTypeName=VARCHAR, nullable=true, primaryKey=false)}, columnNames=[order_id, user_id, status], primaryKeyColumns=[order_id], uniqueIndexes=[org.apache.shardingsphere.data.pipeline.core.metadata.model.PipelineIndexMetaData@4085a818])}, cost time=11 ms
   [INFO ] 2022-05-30 20:50:16.531 [Thread-9] o.a.s.d.p.s.r.p.InventoryTaskSplitter - getPositionByPrimaryKeyRange, endId is 0, break, tableName=t_order_3, primaryKey=order_id, beginId=4
   [INFO ] 2022-05-30 20:50:16.545 [Thread-9] o.a.s.d.p.c.m.l.PipelineTableMetaDataLoader - loadTableMetaData, schemaNameFinal=null, tableNamePattern=t_order_item_1, result={t_order_item_1=PipelineTableMetaData(name=t_order_item_1, columnMetaDataMap={item_id=PipelineColumnMetaData(ordinalPosition=1, name=item_id, dataType=4, dataTypeName=INT, nullable=false, primaryKey=true), order_id=PipelineColumnMetaData(ordinalPosition=2, name=order_id, dataType=4, dataTypeName=INT, nullable=false, primaryKey=false), user_id=PipelineColumnMetaData(ordinalPosition=3, name=user_id, dataType=4, dataTypeName=INT, nullable=false, primaryKey=false), status=PipelineColumnMetaData(ordinalPosition=4, name=status, dataType=12, dataTypeName=VARCHAR, nullable=true, primaryKey=false), creation_date=PipelineColumnMetaData(ordinalPosition=5, name=creation_date, dataType=91, dataTypeName=DATE, nullable=true, primaryKey=false)}, columnNames=[item_id, order_id, user_id, status, creation_date], primaryKeyColumns=[i
 tem_id], uniqueIndexes=[org.apache.shardingsphere.data.pipeline.core.metadata.model.PipelineIndexMetaData@25aff154])}, cost time=14 ms
   [INFO ] 2022-05-30 20:50:16.546 [Thread-9] o.a.s.d.p.s.r.p.InventoryTaskSplitter - getPositionByPrimaryKeyRange, endId is 0, break, tableName=t_order_item_1, primaryKey=item_id, beginId=6
   [INFO ] 2022-05-30 20:50:16.559 [Thread-9] o.a.s.d.p.c.m.l.PipelineTableMetaDataLoader - loadTableMetaData, schemaNameFinal=null, tableNamePattern=t_order_item_3, result={t_order_item_3=PipelineTableMetaData(name=t_order_item_3, columnMetaDataMap={item_id=PipelineColumnMetaData(ordinalPosition=1, name=item_id, dataType=4, dataTypeName=INT, nullable=false, primaryKey=true), order_id=PipelineColumnMetaData(ordinalPosition=2, name=order_id, dataType=4, dataTypeName=INT, nullable=false, primaryKey=false), user_id=PipelineColumnMetaData(ordinalPosition=3, name=user_id, dataType=4, dataTypeName=INT, nullable=false, primaryKey=false), status=PipelineColumnMetaData(ordinalPosition=4, name=status, dataType=12, dataTypeName=VARCHAR, nullable=true, primaryKey=false), creation_date=PipelineColumnMetaData(ordinalPosition=5, name=creation_date, dataType=91, dataTypeName=DATE, nullable=true, primaryKey=false)}, columnNames=[item_id, order_id, user_id, status, creation_date], primaryKeyColumns=[i
 tem_id], uniqueIndexes=[org.apache.shardingsphere.data.pipeline.core.metadata.model.PipelineIndexMetaData@23424faa])}, cost time=13 ms
   [INFO ] 2022-05-30 20:50:16.560 [Thread-9] o.a.s.d.p.s.r.p.InventoryTaskSplitter - getPositionByPrimaryKeyRange, endId is 0, break, tableName=t_order_item_3, primaryKey=item_id, beginId=4
   [INFO ] 2022-05-30 20:50:16.560 [Thread-9] o.a.s.d.p.s.r.RuleAlteredJobPreparer - prepare, jobId=0130317c30317c3054317c74657374, shardingItem=1, inventoryTasks=[InventoryTask(taskId=ds_1.t_order_1#0, position=i,0,5), InventoryTask(taskId=ds_1.t_order_3#0, position=i,0,3), InventoryTask(taskId=ds_1.t_order_item_1#0, position=i,0,5), InventoryTask(taskId=ds_1.t_order_item_3#0, position=i,0,3)], incrementalTasks=[IncrementalTask(taskId=ds_1)]
   [INFO ] 2022-05-30 20:50:16.563 [Thread-9] o.a.s.d.p.s.r.RuleAlteredJobScheduler - -------------- Start inventory task --------------
   [INFO ] 2022-05-30 20:50:16.564 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-0] o.a.s.d.p.a.e.AbstractLifecycleExecutor - start lifecycle executor: org.apache.shardingsphere.data.pipeline.core.task.InventoryTask@77e3cc33
   [INFO ] 2022-05-30 20:50:16.564 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-1] o.a.s.d.p.a.e.AbstractLifecycleExecutor - start lifecycle executor: org.apache.shardingsphere.data.pipeline.core.task.InventoryTask@7b12f15d
   [INFO ] 2022-05-30 20:50:16.565 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-2] o.a.s.d.p.a.e.AbstractLifecycleExecutor - start lifecycle executor: org.apache.shardingsphere.data.pipeline.core.task.InventoryTask@45a33fa9
   [INFO ] 2022-05-30 20:50:16.565 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-0] o.a.s.d.p.a.e.AbstractLifecycleExecutor - start lifecycle executor: org.apache.shardingsphere.data.pipeline.mysql.ingest.MySQLInventoryDumper@7d1ba743
   [INFO ] 2022-05-30 20:50:16.565 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-0] o.a.s.d.p.a.e.AbstractLifecycleExecutor - start lifecycle executor: org.apache.shardingsphere.data.pipeline.mysql.importer.MySQLImporter@6747aa25
   [INFO ] 2022-05-30 20:50:16.565 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-0] o.a.s.d.p.c.i.d.AbstractInventoryDumper - inventory dump, uniqueKeyDataType=4, firstSQL=SELECT * FROM `t_order_1` WHERE `order_id` >= ? AND `order_id` <= ? ORDER BY `order_id` ASC LIMIT ?, laterSQL=SELECT * FROM `t_order_1` WHERE `order_id` > ? AND `order_id` <= ? ORDER BY `order_id` ASC LIMIT ?, position=i,0,5
   [INFO ] 2022-05-30 20:50:16.565 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-0] o.a.s.d.p.c.i.AbstractImporter - importer write
   [INFO ] 2022-05-30 20:50:16.565 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-3] o.a.s.d.p.a.e.AbstractLifecycleExecutor - start lifecycle executor: org.apache.shardingsphere.data.pipeline.core.task.InventoryTask@5b48906b
   [INFO ] 2022-05-30 20:50:16.565 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-1] o.a.s.d.p.a.e.AbstractLifecycleExecutor - start lifecycle executor: org.apache.shardingsphere.data.pipeline.mysql.ingest.MySQLInventoryDumper@71ea9a2d
   [INFO ] 2022-05-30 20:50:16.565 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-1] o.a.s.d.p.c.i.d.AbstractInventoryDumper - inventory dump, uniqueKeyDataType=4, firstSQL=SELECT * FROM `t_order_3` WHERE `order_id` >= ? AND `order_id` <= ? ORDER BY `order_id` ASC LIMIT ?, laterSQL=SELECT * FROM `t_order_3` WHERE `order_id` > ? AND `order_id` <= ? ORDER BY `order_id` ASC LIMIT ?, position=i,0,3
   [INFO ] 2022-05-30 20:50:16.565 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-2] o.a.s.d.p.a.e.AbstractLifecycleExecutor - start lifecycle executor: org.apache.shardingsphere.data.pipeline.mysql.ingest.MySQLInventoryDumper@4f5b54d9
   [INFO ] 2022-05-30 20:50:16.565 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-1] o.a.s.d.p.a.e.AbstractLifecycleExecutor - start lifecycle executor: org.apache.shardingsphere.data.pipeline.mysql.importer.MySQLImporter@7d08e610
   [INFO ] 2022-05-30 20:50:16.566 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-2] o.a.s.d.p.c.i.d.AbstractInventoryDumper - inventory dump, uniqueKeyDataType=4, firstSQL=SELECT * FROM `t_order_item_1` WHERE `item_id` >= ? AND `item_id` <= ? ORDER BY `item_id` ASC LIMIT ?, laterSQL=SELECT * FROM `t_order_item_1` WHERE `item_id` > ? AND `item_id` <= ? ORDER BY `item_id` ASC LIMIT ?, position=i,0,5
   [INFO ] 2022-05-30 20:50:16.566 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-1] o.a.s.d.p.c.i.AbstractImporter - importer write
   [INFO ] 2022-05-30 20:50:16.566 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-3] o.a.s.d.p.a.e.AbstractLifecycleExecutor - start lifecycle executor: org.apache.shardingsphere.data.pipeline.mysql.ingest.MySQLInventoryDumper@51b0fb1b
   [INFO ] 2022-05-30 20:50:16.566 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-3] o.a.s.d.p.c.i.d.AbstractInventoryDumper - inventory dump, uniqueKeyDataType=4, firstSQL=SELECT * FROM `t_order_item_3` WHERE `item_id` >= ? AND `item_id` <= ? ORDER BY `item_id` ASC LIMIT ?, laterSQL=SELECT * FROM `t_order_item_3` WHERE `item_id` > ? AND `item_id` <= ? ORDER BY `item_id` ASC LIMIT ?, position=i,0,3
   [INFO ] 2022-05-30 20:50:16.566 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-2] o.a.s.d.p.a.e.AbstractLifecycleExecutor - start lifecycle executor: org.apache.shardingsphere.data.pipeline.mysql.importer.MySQLImporter@388d4882
   [INFO ] 2022-05-30 20:50:16.566 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-3] o.a.s.d.p.a.e.AbstractLifecycleExecutor - start lifecycle executor: org.apache.shardingsphere.data.pipeline.mysql.importer.MySQLImporter@21fb12ed
   [INFO ] 2022-05-30 20:50:16.566 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-2] o.a.s.d.p.c.i.AbstractImporter - importer write
   [INFO ] 2022-05-30 20:50:16.566 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-3] o.a.s.d.p.c.i.AbstractImporter - importer write
   [INFO ] 2022-05-30 20:50:16.567 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-0] o.a.s.d.p.c.i.d.AbstractInventoryDumper - inventory dump done, round=3, maxUniqueKeyValue=Optional.empty
   [INFO ] 2022-05-30 20:50:16.567 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-0] o.a.s.d.p.c.i.d.AbstractInventoryDumper - inventory dump, before put FinishedRecord
   [INFO ] 2022-05-30 20:50:16.570 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-2] o.a.s.d.p.c.i.d.AbstractInventoryDumper - inventory dump done, round=3, maxUniqueKeyValue=Optional.empty
   [INFO ] 2022-05-30 20:50:16.570 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-2] o.a.s.d.p.c.i.d.AbstractInventoryDumper - inventory dump, before put FinishedRecord
   [INFO ] 2022-05-30 20:50:16.572 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-3] o.a.s.d.p.c.i.d.AbstractInventoryDumper - inventory dump done, round=3, maxUniqueKeyValue=Optional.empty
   [INFO ] 2022-05-30 20:50:16.572 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-3] o.a.s.d.p.c.i.d.AbstractInventoryDumper - inventory dump, before put FinishedRecord
   [INFO ] 2022-05-30 20:50:16.572 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-1] o.a.s.d.p.c.i.d.AbstractInventoryDumper - inventory dump done, round=3, maxUniqueKeyValue=Optional.empty
   [INFO ] 2022-05-30 20:50:16.572 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-1] o.a.s.d.p.c.i.d.AbstractInventoryDumper - inventory dump, before put FinishedRecord
   [INFO ] 2022-05-30 20:50:29.837 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-1] o.a.s.d.p.c.i.AbstractImporter - write, get FinishedRecord, break
   [INFO ] 2022-05-30 20:50:29.837 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-1] o.a.s.d.p.c.i.AbstractImporter - importer write done, rowCount=2, finishedByBreak=true
   [INFO ] 2022-05-30 20:50:29.838 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-4] o.a.s.d.p.core.task.InventoryTask - importer onSuccess, taskId=ds_1.t_order_3#0
   [INFO ] 2022-05-30 20:50:29.838 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-1] o.a.s.d.p.core.task.InventoryTask - importer future done
   [INFO ] 2022-05-30 20:50:29.841 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-3] o.a.s.d.p.c.i.AbstractImporter - write, get FinishedRecord, break
   [INFO ] 2022-05-30 20:50:29.841 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-3] o.a.s.d.p.c.i.AbstractImporter - importer write done, rowCount=2, finishedByBreak=true
   [INFO ] 2022-05-30 20:50:29.842 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-5] o.a.s.d.p.core.task.InventoryTask - importer onSuccess, taskId=ds_1.t_order_item_3#0
   [INFO ] 2022-05-30 20:50:29.842 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-3] o.a.s.d.p.core.task.InventoryTask - importer future done
   [INFO ] 2022-05-30 20:50:29.844 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-2] o.a.s.d.p.c.i.AbstractImporter - write, get FinishedRecord, break
   [INFO ] 2022-05-30 20:50:29.844 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-0] o.a.s.d.p.c.i.AbstractImporter - write, get FinishedRecord, break
   [INFO ] 2022-05-30 20:50:29.844 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-2] o.a.s.d.p.c.i.AbstractImporter - importer write done, rowCount=3, finishedByBreak=true
   [INFO ] 2022-05-30 20:50:29.844 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-0] o.a.s.d.p.c.i.AbstractImporter - importer write done, rowCount=3, finishedByBreak=true
   [INFO ] 2022-05-30 20:50:29.845 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-6] o.a.s.d.p.core.task.InventoryTask - importer onSuccess, taskId=ds_1.t_order_item_1#0
   [INFO ] 2022-05-30 20:50:29.845 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-2] o.a.s.d.p.core.task.InventoryTask - importer future done
   [INFO ] 2022-05-30 20:50:29.845 [ShardingSphere-Scaling-Importer-0130317c30317c3054317c74657374-7] o.a.s.d.p.core.task.InventoryTask - importer onSuccess, taskId=ds_1.t_order_1#0
   [INFO ] 2022-05-30 20:50:29.845 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-0] o.a.s.d.p.core.task.InventoryTask - importer future done
   [INFO ] 2022-05-30 20:50:29.845 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-6] o.a.s.d.p.s.r.RuleAlteredJobScheduler - onSuccess, all inventory tasks finished.
   [INFO ] 2022-05-30 20:50:29.845 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-6] o.a.s.d.p.s.r.RuleAlteredJobScheduler - -------------- Start incremental task --------------
   [INFO ] 2022-05-30 20:50:29.846 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-7] o.a.s.d.p.s.r.RuleAlteredJobScheduler - onSuccess, all inventory tasks finished.
   [INFO ] 2022-05-30 20:50:29.846 [ShardingSphere-Scaling-Inventory-0130317c30317c3054317c74657374-7] o.a.s.d.p.s.r.RuleAlteredJobScheduler - job status already EXECUTE_INCREMENTAL_TASK, ignore
   [INFO ] 2022-05-30 20:50:29.846 [ShardingSphere-Scaling-Incremental-0130317c30317c3054317c74657374-0] o.a.s.d.p.a.e.AbstractLifecycleExecutor - start lifecycle executor: org.apache.shardingsphere.data.pipeline.core.task.IncrementalTask@afed38c
   [INFO ] 2022-05-30 20:50:29.846 [ShardingSphere-Scaling-Incremental-0130317c30317c3054317c74657374-1] o.a.s.d.p.a.e.AbstractLifecycleExecutor - start lifecycle executor: org.apache.shardingsphere.data.pipeline.mysql.importer.MySQLImporter@6f6ed929
   [INFO ] 2022-05-30 20:50:29.847 [ShardingSphere-Scaling-Incremental-0130317c30317c3054317c74657374-1] o.a.s.d.p.c.i.AbstractImporter - importer write
   [INFO ] 2022-05-30 20:50:29.847 [ShardingSphere-Scaling-Incremental-0130317c30317c3054317c74657374-2] o.a.s.d.p.a.e.AbstractLifecycleExecutor - start lifecycle executor: org.apache.shardingsphere.data.pipeline.mysql.importer.MySQLImporter@58a5b9e2
   [INFO ] 2022-05-30 20:50:29.847 [ShardingSphere-Scaling-Incremental-0130317c30317c3054317c74657374-2] o.a.s.d.p.c.i.AbstractImporter - importer write
   [INFO ] 2022-05-30 20:50:29.847 [ShardingSphere-Scaling-Incremental-0130317c30317c3054317c74657374-0] o.a.s.d.p.a.e.AbstractLifecycleExecutor - start lifecycle executor: org.apache.shardingsphere.data.pipeline.mysql.ingest.MySQLIncrementalDumper@391aef07
   [INFO ] 2022-05-30 20:50:29.847 [ShardingSphere-Scaling-Incremental-0130317c30317c3054317c74657374-0] o.a.s.d.p.m.i.MySQLIncrementalDumper - incremental dump, jdbcUrl=jdbc:mysql://192.168.10.18:33307/original_2?rewriteBatchedStatements=true&serverTimezone=UTC&yearIsDateType=false&useSSL=false
   [INFO ] 2022-05-30 20:50:29.847 [ShardingSphere-Scaling-Incremental-0130317c30317c3054317c74657374-3] o.a.s.d.p.a.e.AbstractLifecycleExecutor - start lifecycle executor: org.apache.shardingsphere.data.pipeline.mysql.importer.MySQLImporter@4d753e7c
   [INFO ] 2022-05-30 20:50:29.847 [ShardingSphere-Scaling-Incremental-0130317c30317c3054317c74657374-3] o.a.s.d.p.c.i.AbstractImporter - importer write
   [ERROR] 2022-05-30 20:50:29.863 [nioEventLoopGroup-5-1] o.a.s.d.p.m.i.client.MySQLClient - protocol resolution error
   io.netty.handler.codec.DecoderException: java.lang.UnsupportedOperationException: Do not parse binlog event fully, eventHeader: MySQLBinlogEventHeader(timestamp=1653560367, eventType=15, serverId=133307, eventSize=121, logPos=0, flags=0), remaining packet 3a9d9af2
   	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:480)
   	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:279)
   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
   	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
   	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:327)
   	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:314)
   	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:435)
   	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:279)
   	at io.netty.handler.codec.ByteToMessageCodec.channelRead(ByteToMessageCodec.java:103)
   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
   	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
   	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
   	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
   	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
   	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
   	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722)
   	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658)
   	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584)
   	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
   	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
   	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
   	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
   	at java.base/java.lang.Thread.run(Thread.java:833)
   Caused by: java.lang.UnsupportedOperationException: Do not parse binlog event fully, eventHeader: MySQLBinlogEventHeader(timestamp=1653560367, eventType=15, serverId=133307, eventSize=121, logPos=0, flags=0), remaining packet 3a9d9af2
   	at org.apache.shardingsphere.data.pipeline.mysql.ingest.client.netty.MySQLBinlogEventPacketDecoder.decode(MySQLBinlogEventPacketDecoder.java:89)
   	at io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:510)
   	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:449)
   	... 25 common frames omitted
   [INFO ] 2022-05-30 20:50:29.863 [nioEventLoopGroup-5-1] o.a.s.d.p.m.i.client.MySQLClient - reconnect mysql client.
   [INFO ] 2022-05-30 20:50:45.042 [_finished_check_Worker-1] o.a.s.d.p.core.job.FinishedCheckJob - completionDetector not configured, auto switch will not be enabled. You could query job progress and switch config manually with DistSQL.
   [INFO ] 2022-05-30 20:50:50.033 [_finished_check_Worker-1] o.a.s.d.p.core.job.FinishedCheckJob - completionDetector not configured, auto switch will not be enabled. You could query job progress and switch config manually with DistSQL.
   [INFO ] 2022-05-30 20:51:00.222 [_finished_check_Worker-1] o.a.s.d.p.core.job.FinishedCheckJob - completionDetector not configured, auto switch will not be enabled. You could query job progress and switch config manually with DistSQL.
   [INFO ] 2022-05-30 20:51:10.215 [_finished_check_Worker-1] o.a.s.d.p.core.job.FinishedCheckJob - completionDetector not configured, auto switch will not be enabled. You could query job progress and switch config manually with DistSQL.
   [INFO ] 2022-05-30 20:51:20.220 [_finished_check_Worker-1] o.a.s.d.p.core.job.FinishedCheckJob - completionDetector not configured, auto switch will not be enabled. You could query job progress and switch config manually with DistSQL.
   
   ```


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: notifications-unsubscribe@shardingsphere.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [shardingsphere] sandynz closed issue #18075: Scaling throws exception when ALTER SHARDING TABLE RULE

Posted by GitBox <gi...@apache.org>.
sandynz closed issue #18075: Scaling throws exception when ALTER SHARDING TABLE RULE
URL: https://github.com/apache/shardingsphere/issues/18075




[GitHub] [shardingsphere] sandynz commented on issue #18075: Scaling throws exception when ALTER SHARDING TABLE RULE

Posted by GitBox <gi...@apache.org>.
sandynz commented on issue #18075:
URL: https://github.com/apache/shardingsphere/issues/18075#issuecomment-1141143023

   In `MySQLBinlogEventPacketDecoder.java`:
   ```java
       @Override
       protected void decode(final ChannelHandlerContext ctx, final ByteBuf in, final List<Object> out) {
           MySQLPacketPayload payload = new MySQLPacketPayload(in, ctx.channel().attr(CommonConstants.CHARSET_ATTRIBUTE_KEY).get());
           skipSequenceId(payload);
           checkError(payload);
           MySQLBinlogEventHeader binlogEventHeader = new MySQLBinlogEventHeader(payload);
           removeChecksum(binlogEventHeader.getEventType(), in);
   ...
           if (in.isReadable()) {
               throw new UnsupportedOperationException(String.format("Do not parse binlog event fully, eventHeader: %s, remaining packet %s", binlogEventHeader, readRemainPacket(payload)));
           }
       }
   ```
   Some bytes are left over after parsing, but that does not necessarily indicate a problem; they could be parsed on the next call.
   
   We need to verify `MySQLBinlogEventHeader(timestamp=1653560367, eventType=15, serverId=133307, eventSize=121, logPos=0, flags=0), remaining packet 3a9d9af2`.
   
   `eventType=15` means `FORMAT_DESCRIPTION_EVENT(0x0f)`. Maybe it should be parsed within the current method's decode loop.
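   The idea above can be sketched without Netty. This is a minimal, hypothetical illustration (the fixed 4-byte length prefix, `HEADER_SIZE`, and the `decode` signature are assumptions, not the real binlog wire format): instead of throwing when bytes remain after parsing one event, the decoder loops, consumes as many complete events as the buffer holds, and defers any trailing partial event to the next `decode` call.
   
   ```java
   import java.nio.ByteBuffer;
   import java.util.ArrayList;
   import java.util.List;
   
   // Sketch: loop over the buffer instead of throwing on leftover bytes.
   public class BinlogDecodeLoopSketch {
       static final int HEADER_SIZE = 4; // assumed: each event is prefixed with a 4-byte size
   
       // Returns the payload sizes of fully parsed events; leaves partial data in the buffer.
       static List<Integer> decode(ByteBuffer in) {
           List<Integer> events = new ArrayList<>();
           while (in.remaining() >= HEADER_SIZE) {
               in.mark();
               int eventSize = in.getInt();
               if (in.remaining() < eventSize) {
                   in.reset(); // partial event: wait for more bytes instead of throwing
                   break;
               }
               in.position(in.position() + eventSize); // consume the event body
               events.add(eventSize);
           }
           return events;
       }
   
       public static void main(String[] args) {
           ByteBuffer buf = ByteBuffer.allocate(64);
           buf.putInt(3).put(new byte[]{1, 2, 3}); // one complete event
           buf.putInt(5).put(new byte[]{9, 9});    // partial event: 3 body bytes missing
           buf.flip();
           List<Integer> parsed = decode(buf);
           System.out.println(parsed);          // [3]
           System.out.println(buf.remaining()); // 6 bytes deferred (header + 2 body bytes)
       }
   }
   ```
   
   The same deferral would apply to `FORMAT_DESCRIPTION_EVENT`: leftover bytes after one event may simply be the start of the next one.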
   

