Posted to issues@flink.apache.org by "Bob Lau (JIRA)" <ji...@apache.org> on 2018/05/02 01:47:00 UTC

[jira] [Commented] (FLINK-9273) Class cast exception

    [ https://issues.apache.org/jira/browse/FLINK-9273?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16460389#comment-16460389 ] 

Bob Lau commented on FLINK-9273:
--------------------------------

[~StephanEwen]  I want to transform a DataStream<Row> into the table environment, and I register the DataStream via registerDataStream in Scala as follows.

The messages I receive from the MQ are JSON strings. The whole code looks like this:

 
{code:java}
public static DataStream<Row> deserializationToRow(DataStream<String> input, String[] fields, TypeInformation<?>[] typeInfos, Boolean arrayFlag) {

    DataStream<Row> out = input.flatMap(new FlatMapFunction<String, Row>() {

        /**  */
        private static final long serialVersionUID = 1L;

        @Override
        public void flatMap(String input, Collector<Row> collector) {
            Row row = null;
            try {
                Map<String, Object> map = JSON.parseObject(input, Map.class);
                row = convertMapToRow(map, fields);
                collector.collect(row);
            } catch (JSONException e) {
                List<Map> mapList = JSON.parseArray(input, Map.class);
                if (mapList.size() > 0) {
                    for (Map<String, Object> o : mapList) {
                        row = convertMapToRow(o, fields);
                        collector.collect(row); // The exception will happen here
                    }
                }
            } catch (Exception e) {
            }
        }
    });

    return out;
}

private static Row convertMapToRow(Map<String, Object> map, String[] fields) {
    int colSize = fields.length;
    Row row = new Row(colSize);
    for (int i = 0; i < colSize; i++) {
        row.setField(i, map.get(fields[i]));
    }
    return row;
}
{code}
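The stack trace below points at the likely root cause: the parsed JSON map hands back raw values (often Strings), which convertMapToRow puts into the Row unchanged, while the registered TypeInformation for that field says Long, so LongSerializer.copy fails downstream. A minimal sketch of one possible workaround is to coerce each map value to the class the schema expects before calling setField; the coerce helper and class name here are hypothetical, not part of the code above:

```java
import java.util.HashMap;
import java.util.Map;

public class RowConversionSketch {

    // Hypothetical helper: coerce a raw JSON value to the type the
    // registered table schema expects, instead of inserting it as-is.
    static Object coerce(Object raw, Class<?> target) {
        if (raw == null || target.isInstance(raw)) {
            return raw; // already null or already the right type
        }
        String s = raw.toString();
        if (target == Long.class)    return Long.valueOf(s);
        if (target == Integer.class) return Integer.valueOf(s);
        if (target == Double.class)  return Double.valueOf(s);
        if (target == String.class)  return s;
        return raw; // unknown target type: leave the value unchanged
    }

    public static void main(String[] args) {
        Map<String, Object> map = new HashMap<>();
        map.put("id", "42"); // JSON parsing delivered the value as a String
        Object v = coerce(map.get("id"), Long.class);
        System.out.println(v.getClass().getSimpleName() + " " + v); // Long 42
    }
}
```

In convertMapToRow this would mean passing the expected classes (derived from the typeInfos array) and calling something like row.setField(i, coerce(map.get(fields[i]), expected[i])) so the Row contents actually match the declared row type.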

 

 

> Class cast exception
> --------------------
>
>                 Key: FLINK-9273
>                 URL: https://issues.apache.org/jira/browse/FLINK-9273
>             Project: Flink
>          Issue Type: Bug
>          Components: DataStream API, Streaming, Table API & SQL
>    Affects Versions: 1.5.0
>            Reporter: Bob Lau
>            Priority: Major
>
> Exception stack is as follows:
> org.apache.flink.runtime.client.JobExecutionException: java.lang.ClassCastException: java.lang.String cannot be cast to java.lang.Long
> at org.apache.flink.runtime.minicluster.MiniCluster.executeJobBlocking(MiniCluster.java:621)
> at org.apache.flink.streaming.api.environment.LocalStreamEnvironment.execute(LocalStreamEnvironment.java:121)
> at com.xxxx.tysc.job.service.SubmitJobService.submitJobToLocal(SubmitJobService.java:385)
> at com.xxxx.tysc.rest.JobSubmitController$3.run(JobSubmitController.java:114)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.ClassCastException: java.lang.String cannot be cast to java.lang.Long
> at org.apache.flink.api.common.typeutils.base.LongSerializer.copy(LongSerializer.java:27)
> at org.apache.flink.api.java.typeutils.runtime.RowSerializer.copy(RowSerializer.java:95)
> at org.apache.flink.api.java.typeutils.runtime.RowSerializer.copy(RowSerializer.java:46)
> at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.pushToOperator(OperatorChain.java:558)
> at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:535)
> at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:515)
> at org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:679)
> at org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:657)
> at org.apache.flink.streaming.api.operators.StreamMap.processElement(StreamMap.java:41)
> at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.pushToOperator(OperatorChain.java:560)
> at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:535)
> at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:515)
> at org.apache.flink.streaming.runtime.tasks.OperatorChain$BroadcastingOutputCollector.collect(OperatorChain.java:630)
> at org.apache.flink.streaming.runtime.tasks.OperatorChain$BroadcastingOutputCollector.collect(OperatorChain.java:583)
> at org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:679)
> at org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:657)
> at org.apache.flink.streaming.api.operators.StreamSourceContexts$NonTimestampContext.collect(StreamSourceContexts.java:104)
> at org.apache.flink.streaming.api.operators.StreamSourceContexts$NonTimestampContext.collectWithTimestamp(StreamSourceContexts.java:111)
> at org.apache.flink.streaming.connectors.kafka.internals.AbstractFetcher.emitRecordWithTimestamp(AbstractFetcher.java:396)
> at org.apache.flink.streaming.connectors.kafka.internal.Kafka010Fetcher.emitRecord(Kafka010Fetcher.java:89)
> at org.apache.flink.streaming.connectors.kafka.internal.Kafka09Fetcher.runFetchLoop(Kafka09Fetcher.java:154)
> at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase.run(FlinkKafkaConsumerBase.java:738)
> at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:87)
> at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:56)
> at org.apache.flink.streaming.runtime.tasks.SourceStreamTask.run(SourceStreamTask.java:99)
> at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:307)
> at org.apache.flink.runtime.taskmanager.Task.run(Task.java:702)
> ... 1 more



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)