Posted to issues@eagle.apache.org by "zhiwen wang (JIRA)" <ji...@apache.org> on 2018/06/14 03:35:00 UTC

[jira] [Commented] (EAGLE-1091) serialize exception while processing field which length more than 64k

    [ https://issues.apache.org/jira/browse/EAGLE-1091?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16511951#comment-16511951 ] 

zhiwen wang commented on EAGLE-1091:
------------------------------------

Exception thrown from here:

public class StringSerializer implements Serializer<String> {
    @Override
    public void serialize(String value, DataOutput dataOutput) throws IOException {
        dataOutput.writeUTF(value);
    }

    @Override
    public String deserialize(DataInput dataInput) throws IOException {
        return dataInput.readUTF();
    }
}
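
For context on why writeUTF() fails here (standard JDK behaviour, nothing Eagle-specific): DataOutput.writeUTF() prefixes the value with an unsigned 16-bit length, so any string whose modified-UTF-8 encoding exceeds 65535 bytes throws UTFDataFormatException, which Guava's ByteArrayDataOutputStream then wraps in the AssertionError shown in the stack trace below. A minimal sketch that reproduces the limit outside Eagle (the class name WriteUtfLimitDemo is only for illustration):

import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.util.Arrays;

public class WriteUtfLimitDemo {
    public static void main(String[] args) throws Exception {
        // Build a string whose encoded form is well above the 65535-byte writeUTF() limit.
        char[] chars = new char[70 * 1024];
        Arrays.fill(chars, 'a');
        String tooLong = new String(chars);

        DataOutputStream out = new DataOutputStream(new ByteArrayOutputStream());
        // writeUTF() writes an unsigned 16-bit length prefix first, so this throws
        // java.io.UTFDataFormatException: encoded string too long
        out.writeUTF(tooLong);
    }
}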
 
and I fixed it on my side by ignoring values whose length is more than 64 KB, for example:
 
if (value.getBytes().length >= 64 * 1024) {
    dataOutput.writeUTF("");
} else {
    dataOutput.writeUTF(value);
}

Does anyone have a better way?
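
In case it helps: one possible direction (only a sketch, not tested against the Eagle codebase; the class name LongStringSerializer is made up) is to avoid writeUTF() altogether and write an explicit int length followed by the raw UTF-8 bytes. That removes the 64 KB ceiling and keeps long values intact instead of dropping them; it also sidesteps the fact that the getBytes() check above is only approximate, since writeUTF()'s limit applies to the modified-UTF-8 encoding rather than the default-charset byte length.

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

// Hypothetical alternative to StringSerializer: a 4-byte length prefix plus raw
// UTF-8 bytes, so the value size is bounded by Integer.MAX_VALUE instead of 64 KB.
public class LongStringSerializer implements Serializer<String> {
    @Override
    public void serialize(String value, DataOutput dataOutput) throws IOException {
        byte[] bytes = value.getBytes(StandardCharsets.UTF_8);
        dataOutput.writeInt(bytes.length);
        dataOutput.write(bytes);
    }

    @Override
    public String deserialize(DataInput dataInput) throws IOException {
        byte[] bytes = new byte[dataInput.readInt()];
        dataInput.readFully(bytes);
        return new String(bytes, StandardCharsets.UTF_8);
    }
}

The obvious caveat is that this changes the serialized format, so every component that writes or reads these events would have to be upgraded together.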

> serialize exception while processing field which length more than 64k
> ---------------------------------------------------------------------
>
>                 Key: EAGLE-1091
>                 URL: https://issues.apache.org/jira/browse/EAGLE-1091
>             Project: Eagle
>          Issue Type: Bug
>          Components: Core::Alert Engine
>    Affects Versions: v0.5.0, v0.5.1
>            Reporter: zhiwen wang
>            Assignee: Edward Zhang
>            Priority: Major
>
> java.lang.AssertionError: java.io.UTFDataFormatException: encoded string too long: 228413 bytes
>  at com.google.common.io.ByteStreams$ByteArrayDataOutputStream.writeUTF(ByteStreams.java:530) ~[stormjar.jar:?]
>  at org.apache.eagle.alert.engine.serialization.impl.StringSerializer.serialize(StringSerializer.java:28) ~[stormjar.jar:?]
>  at org.apache.eagle.alert.engine.serialization.impl.StringSerializer.serialize(StringSerializer.java:25) ~[stormjar.jar:?]
>  at org.apache.eagle.alert.engine.serialization.impl.StreamEventSerializer.serialize(StreamEventSerializer.java:79) ~[stormjar.jar:?]
>  at org.apache.eagle.alert.engine.serialization.impl.PartitionedEventSerializerImpl.serialize(PartitionedEventSerializerImpl.java:74) ~[stormjar.jar:?]
>  at org.apache.eagle.alert.engine.serialization.impl.PartitionedEventSerializerImpl.serialize(PartitionedEventSerializerImpl.java:81) ~[stormjar.jar:?]
>  at org.apache.eagle.alert.engine.spout.SpoutOutputCollectorWrapper.emit(SpoutOutputCollectorWrapper.java:156) ~[stormjar.jar:?]
>  at org.apache.storm.kafka.PartitionManager.next(PartitionManager.java:160) ~[stormjar.jar:?]
>  at org.apache.storm.kafka.KafkaSpout.nextTuple(KafkaSpout.java:135) ~[stormjar.jar:?]
>  at org.apache.eagle.alert.engine.spout.CorrelationSpout.nextTuple(CorrelationSpout.java:174) ~[stormjar.jar:?]
>  at org.apache.storm.daemon.executor$fn__6505$fn__6520$fn__6551.invoke(executor.clj:651) ~[storm-core-1.0.1.2.5.5.0-157.jar:1.0.1.2.5.5.0-157]
>  at org.apache.storm.util$async_loop$fn__554.invoke(util.clj:484) [storm-core-1.0.1.2.5.5.0-157.jar:1.0.1.2.5.5.0-157]
>  at clojure.lang.AFn.run(AFn.java:22) [clojure-1.7.0.jar:?]
>  at java.lang.Thread.run(Thread.java:748) [?:1.8.0_131]
> Caused by: java.io.UTFDataFormatException: encoded string too long: 228413 bytes
>  at java.io.DataOutputStream.writeUTF(DataOutputStream.java:364) ~[?:1.8.0_131]
>  at java.io.DataOutputStream.writeUTF(DataOutputStream.java:323) ~[?:1.8.0_131]
>  at com.google.common.io.ByteStreams$ByteArrayDataOutputStream.writeUTF(ByteStreams.java:528) ~[stormjar.jar:?]
>  ... 13 more



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)