Posted to user@flink.apache.org by godfrey he <go...@gmail.com> on 2020/02/25 14:15:27 UTC

Re: TIME/TIMESTAMP parse in Flink TABLE/SQL API

Hi, I find that JsonRowDeserializationSchema only supports date-time values with a
timezone, according to RFC 3339. So you need to add a timezone to the time data (like
14:02:00Z) and the timestamp data (like 2019-07-09T02:02:00.040Z). Hope it can help
you.
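
For example, the record from the quoted message below would then need to carry values roughly like this (a sketch based on the note above; only the time and timestamp values change):

```
{
  "server_date": "2019-07-09",
  "server_time": "14:02:00Z",
  "reqsndtime_c": "2019-07-09T02:02:00.040Z"
}
```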

Bests,
godfrey

Outlook <ni...@outlook.com> wrote on Tue, Feb 25, 2020 at 5:49 PM:

> By the way, my Flink version is 1.10.0.
>
>  Original Message
> Sender: Outlook <ni...@outlook.com>
> Recipient: user <us...@flink.apache.org>
> Date: Tuesday, Feb 25, 2020 17:43
> *Subject:* TIME/TIMESTAMP parse in Flink TABLE/SQL API
>
> Hi all,
>
> I read JSON data from Kafka and print it to the console. When I do this, an
> error occurs during time/timestamp deserialization.
>
> json data in Kafka:
>
> ```
> {
> "server_date": "2019-07-09",
> "server_time": "14:02:00",
> "reqsndtime_c": "2019-07-09 02:02:00.040"
> }
> ```
>
> flink code:
>
> ```
> bsTableEnv.connect(
>     new Kafka()
>         .version("universal")
>         .topic("xxx")
>         .property("bootstrap.servers", "localhost:9092")
>         .property("zookeeper.connect", "localhost:2181")
>         .property("group.id", "g1")
>         .startFromEarliest()
> ).withFormat(
>     new Json()
>         .failOnMissingField(false)
> ).withSchema(
>     new Schema()
>         .field("server_date", DataTypes.DATE())
>         .field("server_time", DataTypes.TIME())
>         .field("reqsndtime_c", DataTypes.TIMESTAMP(3))
> ).inAppendMode()
> .createTemporaryTable("xxx");
> ```
>
>
> server_date with DataTypes.DATE() is OK, but server_time with DataTypes.TIME()
> and reqsndtime_c with DataTypes.TIMESTAMP(3) cause errors. If I change them
> to DataTypes.STRING(), everything is OK.
>
> Error message:
> ```
> Exception in thread "main" java.util.concurrent.ExecutionException: org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 395d1ba3d41f92734d4ef25aa5f9b4a1)
> at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
> at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
> at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1640)
> at org.apache.flink.streaming.api.environment.LocalStreamEnvironment.execute(LocalStreamEnvironment.java:74)
> at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1620)
> at org.apache.flink.table.planner.delegation.StreamExecutor.execute(StreamExecutor.java:42)
> at org.apache.flink.table.api.internal.TableEnvironmentImpl.execute(TableEnvironmentImpl.java:643)
> at cn.com.agree.Main.main(Main.java:122)
> Caused by: org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 395d1ba3d41f92734d4ef25aa5f9b4a1)
> at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:112)
> at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:602)
> at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:577)
> at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
> at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1962)
> at org.apache.flink.runtime.concurrent.FutureUtils$1.onComplete(FutureUtils.java:874)
> at akka.dispatch.OnComplete.internal(Future.scala:264)
> at akka.dispatch.OnComplete.internal(Future.scala:261)
> at akka.dispatch.japi$CallbackBridge.apply(Future.scala:191)
> at akka.dispatch.japi$CallbackBridge.apply(Future.scala:188)
> at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:36)
> at org.apache.flink.runtime.concurrent.Executors$DirectExecutionContext.execute(Executors.java:74)
> at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:44)
> at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:252)
> at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:572)
> at akka.pattern.PipeToSupport$PipeableFuture$$anonfun$pipeTo$1.applyOrElse(PipeToSupport.scala:22)
> at akka.pattern.PipeToSupport$PipeableFuture$$anonfun$pipeTo$1.applyOrElse(PipeToSupport.scala:21)
> at scala.concurrent.Future$$anonfun$andThen$1.apply(Future.scala:436)
> at scala.concurrent.Future$$anonfun$andThen$1.apply(Future.scala:435)
> at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:36)
> at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
> at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:91)
> at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply(BatchingExecutor.scala:91)
> at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply(BatchingExecutor.scala:91)
> at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
> at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:90)
> at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40)
> at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:44)
> at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
> at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
> at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
> at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
> at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:147)
> at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:110)
> ... 31 more
> Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
> at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:110)
> at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:76)
> at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:192)
> at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:186)
> at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:180)
> at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:484)
> at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:380)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:279)
> at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:194)
> at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
> at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
> at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
> at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
> at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
> at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
> at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
> at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
> at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
> at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
> at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
> at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
> at akka.actor.ActorCell.invoke(ActorCell.scala:561)
> at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
> at akka.dispatch.Mailbox.run(Mailbox.scala:225)
> at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
> ... 4 more
> Caused by: java.io.IOException: Failed to deserialize JSON object.
> at org.apache.flink.formats.json.JsonRowDeserializationSchema.deserialize(JsonRowDeserializationSchema.java:133)
> at org.apache.flink.formats.json.JsonRowDeserializationSchema.deserialize(JsonRowDeserializationSchema.java:76)
> at org.apache.flink.streaming.connectors.kafka.internals.KafkaDeserializationSchemaWrapper.deserialize(KafkaDeserializationSchemaWrapper.java:45)
> at org.apache.flink.streaming.connectors.kafka.internal.KafkaFetcher.runFetchLoop(KafkaFetcher.java:140)
> at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase.run(FlinkKafkaConsumerBase.java:715)
> at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:100)
> at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:63)
> at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:196)
> Caused by: java.time.format.DateTimeParseException: Text '14:02:00' could not be parsed at index 8
> at java.time.format.DateTimeFormatter.parseResolved0(DateTimeFormatter.java:1949)
> at java.time.format.DateTimeFormatter.parse(DateTimeFormatter.java:1777)
> at org.apache.flink.formats.json.JsonRowDeserializationSchema.convertToLocalTime(JsonRowDeserializationSchema.java:390)
> at org.apache.flink.formats.json.JsonRowDeserializationSchema.lambda$wrapIntoNullableConverter$d586c97$1(JsonRowDeserializationSchema.java:236)
> at org.apache.flink.formats.json.JsonRowDeserializationSchema.convertField(JsonRowDeserializationSchema.java:439)
> at org.apache.flink.formats.json.JsonRowDeserializationSchema.lambda$assembleRowConverter$77f7700$1(JsonRowDeserializationSchema.java:418)
> at org.apache.flink.formats.json.JsonRowDeserializationSchema.lambda$wrapIntoNullableConverter$d586c97$1(JsonRowDeserializationSchema.java:236)
> at org.apache.flink.formats.json.JsonRowDeserializationSchema.deserialize(JsonRowDeserializationSchema.java:131)
> ... 7 more
>
> Process finished with exit code 1
> ```
>
> reqsndtime_c with DataTypes.TIMESTAMP(3) throws a similar exception. According to
> the doc, the DataTypes.TIME() value range is from {@code 00:00:00} to {@code 23:59:59},
> and the DataTypes.TIMESTAMP value range is from {@code 0000-01-01 00:00:00.000000000}
> to {@code 9999-12-31 23:59:59.999999999}. My values are within those ranges, so I
> don't know why this fails. I also saw that this may be a bug in Java 8, but after
> changing the JDK to 11 the error still occurs.
>
> Can someone give me some help? Thanks in advance.
>

Re: TIME/TIMESTAMP parse in Flink TABLE/SQL API

Posted by Leonard Xu <xb...@gmail.com>.
Hi, Outlook
Godfrey is right: your JSON message should follow the JSON Schema date-time format[1] so that Flink can parse it.
You can use the following code to produce a JSON date-time string.
```
// imports: java.text.DateFormat, java.text.SimpleDateFormat, java.util.Date, java.util.TimeZone
long time = System.currentTimeMillis();
DateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'");
dateFormat.setTimeZone(TimeZone.getTimeZone("UTC")); // the pattern's literal 'Z' means UTC, so format in UTC
Date date = new Date(time);
String jsonSchemaDate = dateFormat.format(date); // e.g. "2019-07-09T02:02:00.040Z"
```
[1] https://json-schema.org/understanding-json-schema/reference/string.html#dates-and-times
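
If you prefer java.time, here is a minimal sketch that produces RFC 3339 style values for both the TIME and TIMESTAMP(3) fields from the original message; the class name and formatter patterns are illustrative, derived from godfrey's examples (14:02:00Z and 2019-07-09T02:02:00.040Z):

```
import java.time.LocalDateTime;
import java.time.LocalTime;
import java.time.format.DateTimeFormatter;

public class Rfc3339Values {
    public static void main(String[] args) {
        // Append a literal 'Z', matching the examples given earlier in this thread.
        DateTimeFormatter timeFmt = DateTimeFormatter.ofPattern("HH:mm:ss'Z'");
        DateTimeFormatter tsFmt = DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'");

        // Values taken from the sample record in the original message.
        String serverTime = LocalTime.of(14, 2, 0).format(timeFmt);                          // "14:02:00Z"
        String reqsndtime = LocalDateTime.of(2019, 7, 9, 2, 2, 0, 40_000_000).format(tsFmt); // "2019-07-09T02:02:00.040Z"

        System.out.println(serverTime);
        System.out.println(reqsndtime);
    }
}
```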
