Posted to issues@flink.apache.org by "Leonard Xu (Jira)" <ji...@apache.org> on 2022/11/01 09:39:00 UTC

[jira] [Updated] (FLINK-6158) when creating the typeinfo of varchar, sometimes Flink takes it as long or String

     [ https://issues.apache.org/jira/browse/FLINK-6158?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Leonard Xu updated FLINK-6158:
------------------------------
    Labels: starter  (was: newbie)

> when creating the typeinfo of varchar, sometimes Flink takes it as long or String
> ----------------------------------------------------------------------------------
>
>                 Key: FLINK-6158
>                 URL: https://issues.apache.org/jira/browse/FLINK-6158
>             Project: Flink
>          Issue Type: Bug
>          Components: API / Type Serialization System, Table SQL / API
>    Affects Versions: 1.2.0
>         Environment: Linux, Flink with Scala 2.10
>            Reporter: naveen holla U
>            Priority: Major
>              Labels: starter
>   Original Estimate: 3h
>  Remaining Estimate: 3h
>
> I am trying to get the column type metadata from a Vertica database using a prepared statement.
> One of the columns in the Vertica table is of type VARCHAR, so to use createTypeInformation I
> have to call it as createTypeInformation[String], but this gives the error below.
> I am absolutely sure the column is a VARCHAR, so what should I do?
> (A minimal sketch of declaring the row type explicitly follows the error log below.)
> Connected to JobManager at Actor[akka.tcp://flink@localhost:40021/user/jobmanager#1647776425]
> 03/22/2017 15:12:35     Job execution switched to status RUNNING.
> 03/22/2017 15:12:35     DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:395) (org.apache.flink.api.java.io.jdbc.JDBCInputFormat))(1/1) switched to SCHEDULED
> 03/22/2017 15:12:35     DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:395) (org.apache.flink.api.java.io.jdbc.JDBCInputFormat))(1/1) switched to DEPLOYING
> 03/22/2017 15:12:36     DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:395) (org.apache.flink.api.java.io.jdbc.JDBCInputFormat))(1/1) switched to RUNNING
> 03/22/2017 15:12:36     DataSource (at org.apache.flink.api.scala.ExecutionEnvironment.createInput(ExecutionEnvironment.scala:395) (org.apache.flink.api.java.io.jdbc.JDBCInputFormat))(1/1) switched to FAILED
> java.lang.ClassCastException: java.lang.String cannot be cast to java.lang.Long
>         at org.apache.flink.api.common.typeutils.base.LongSerializer.serialize(LongSerializer.java:27)
>         at org.apache.flink.api.java.typeutils.runtime.RowSerializer.serialize(RowSerializer.java:152)
>         at org.apache.flink.api.java.typeutils.runtime.RowSerializer.serialize(RowSerializer.java:37)
>         at org.apache.flink.runtime.plugable.SerializationDelegate.write(SerializationDelegate.java:56)
>         at org.apache.flink.runtime.io.network.api.serialization.SpanningRecordSerializer.addRecord(SpanningRecordSerializer.java:93)
>         at org.apache.flink.runtime.io.network.api.writer.RecordWriter.sendToTarget(RecordWriter.java:114)
>         at org.apache.flink.runtime.io.network.api.writer.RecordWriter.emit(RecordWriter.java:89)
>         at org.apache.flink.runtime.operators.shipping.OutputCollector.collect(OutputCollector.java:65)
>         at org.apache.flink.runtime.operators.util.metrics.CountingCollector.collect(CountingCollector.java:35)
>         at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:168)
>         at org.apache.flink.runtime.taskmanager.Task.run(Task.java:670)
>         at java.lang.Thread.run(Thread.java:745)
> 03/22/2017 15:12:36     Job execution switched to status FAILING.
> java.lang.ClassCastException: java.lang.String cannot be cast to java.lang.Long
>         at org.apache.flink.api.common.typeutils.base.LongSerializer.serialize(LongSerializer.java:27)
>         at org.apache.flink.api.java.typeutils.runtime.RowSerializer.serialize(RowSerializer.java:152)
>         at org.apache.flink.api.java.typeutils.runtime.RowSerializer.serialize(RowSerializer.java:37)
>         at org.apache.flink.runtime.plugable.SerializationDelegate.write(SerializationDelegate.java:56)
>         at org.apache.flink.runtime.io.network.api.serialization.SpanningRecordSerializer.addRecord(SpanningRecordSerializer.java:93)
>         at org.apache.flink.runtime.io.network.api.writer.RecordWriter.sendToTarget(RecordWriter.java:114)
>         at org.apache.flink.runtime.io.network.api.writer.RecordWriter.emit(RecordWriter.java:89)
>         at org.apache.flink.runtime.operators.shipping.OutputCollector.collect(OutputCollector.java:65)
>         at org.apache.flink.runtime.operators.util.metrics.CountingCollector.collect(CountingCollector.java:35)
>         at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:168)
>         at org.apache.flink.runtime.taskmanager.Task.run(Task.java:670)
>         at java.lang.Thread.run(Thread.java:745)
> 03/22/2017 15:12:36     DataSink (collect())(1/1) switched to CANCELED
> 03/22/2017 15:12:36     Job execution switched to status FAILED.
> org.apache.flink.client.program.ProgramInvocationException: The program execution failed: Job execution failed.
>         at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:427)
>         at org.apache.flink.client.program.StandaloneClusterClient.submitJob(StandaloneClusterClient.java:101)
>         at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:400)
>         at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:387)
>         at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:362)
>         at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:211)
>         at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:188)
>         at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:172)
>         at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:926)
>         at org.apache.flink.api.java.DataSet.collect(DataSet.java:410)
>         at org.apache.flink.api.java.DataSet.print(DataSet.java:1605)
>         at org.apache.flink.api.scala.DataSet.print(DataSet.scala:1726)
>         at .<init>(<console>:78)
>         at .<clinit>(<console>)
>         at .<init>(<console>:7)
>         at .<clinit>(<console>)
>         at $print(<console>)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:734)
>         at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:983)
>         at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:573)
>         at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:604)
>         at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:568)
>         at scala.tools.nsc.interpreter.ILoop.reallyInterpret$1(ILoop.scala:760)
>         at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:805)
>         at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:717)
>         at scala.tools.nsc.interpreter.ILoop.processLine$1(ILoop.scala:581)
>         at scala.tools.nsc.interpreter.ILoop.innerLoop$1(ILoop.scala:588)
>         at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:591)
>         at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:882)
>         at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:837)
>         at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:837)
>         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>         at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:837)
>         at org.apache.flink.api.scala.FlinkShell$.startShell(FlinkShell.scala:217)
>         at org.apache.flink.api.scala.FlinkShell$.main(FlinkShell.scala:136)
>         at org.apache.flink.api.scala.FlinkShell.main(FlinkShell.scala)
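>
> A minimal sketch of the explicit-type approach I would expect to work (driver class, JDBC URL,
> query, and column names are placeholders; it also assumes the 1.2-era flink-jdbc builder
> exposes setRowTypeInfo):
>
>     import org.apache.flink.api.scala._
>     import org.apache.flink.api.common.typeinfo.{BasicTypeInfo, TypeInformation}
>     import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>     import org.apache.flink.api.java.typeutils.RowTypeInfo
>     import org.apache.flink.types.Row
>
>     val env = ExecutionEnvironment.getExecutionEnvironment
>
>     // One field per column of the SELECT list, in order, so the VARCHAR column
>     // is declared as a String rather than whatever Flink would otherwise infer.
>     val rowTypeInfo = new RowTypeInfo(
>       BasicTypeInfo.LONG_TYPE_INFO,    // placeholder id column (BIGINT)
>       BasicTypeInfo.STRING_TYPE_INFO)  // the VARCHAR column
>
>     // Make the same type information available to the Scala API's createInput,
>     // which would otherwise derive a TypeInformation on its own.
>     implicit val rowInfo: TypeInformation[Row] = rowTypeInfo
>
>     val inputFormat = JDBCInputFormat.buildJDBCInputFormat()
>       .setDrivername("com.vertica.jdbc.Driver")    // placeholder driver class
>       .setDBUrl("jdbc:vertica://host:5433/mydb")   // placeholder connection URL
>       .setQuery("SELECT id, name FROM my_table")   // placeholder query
>       .setRowTypeInfo(rowTypeInfo)
>       .finish()
>
>     val rows = env.createInput(inputFormat)
>     rows.print()
>
> With the declared field types matching the actual column types, the RowSerializer should pick a
> StringSerializer for the VARCHAR field, and the ClassCastException above should not occur.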



--
This message was sent by Atlassian Jira
(v8.20.10#820010)