Posted to user@flink.apache.org by Felipe Quirce <pi...@gmail.com> on 2018/11/29 10:55:22 UTC
Check-pointing error
Hi,
I have run into a problem during checkpointing.
Could anyone help me debug it?
Exception:
> 2018-11-29 11:31:00,448 INFO  org.apache.flink.runtime.executiongraph.ExecutionGraph - keyedstats-processor-165 -> map2alert-165 -> Process -> Sink: sink-level165 (1/2) (d860069560a4e3e6a62a450c9e3fa699) switched from RUNNING to FAILED.
> java.io.IOException: Exception while applying AggregateFunction in aggregating state
>     at org.apache.flink.runtime.state.heap.HeapAggregatingState.add(HeapAggregatingState.java:107)
>     at org.apache.flink.streaming.runtime.operators.windowing.WindowOperator.processElement(WindowOperator.java:391)
>     at org.apache.flink.streaming.runtime.io.StreamInputProcessor.processInput(StreamInputProcessor.java:202)
>     at org.apache.flink.streaming.runtime.tasks.OneInputStreamTask.run(OneInputStreamTask.java:105)
>     at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:300)
>     at org.apache.flink.runtime.taskmanager.Task.run(Task.java:711)
>     at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.ArrayIndexOutOfBoundsException
Thanks in advance,
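One thing worth noting about this trace: the IOException at HeapAggregatingState.add is only a wrapper. The state backend applies the user's AggregateFunction (plus a serializer copy of the accumulator) and rethrows any failure with this message, so the real problem is whatever appears under "Caused by". A minimal sketch of that wrapping pattern (illustrative types only, not Flink's actual classes):

```java
import java.io.IOException;

public class AggregatingStateSketch {

    // Illustrative stand-in for Flink's AggregateFunction; not the real interface.
    interface AggregateFunction<IN, ACC> {
        ACC add(IN value, ACC accumulator);
    }

    // Models the wrapping visible at HeapAggregatingState.add(...): any failure
    // while applying the function (or copying the accumulator) surfaces as this
    // IOException, with the original exception preserved as the cause.
    static <IN, ACC> ACC addToState(AggregateFunction<IN, ACC> fn, IN value, ACC acc)
            throws IOException {
        try {
            return fn.add(value, acc);
        } catch (Exception e) {
            throw new IOException(
                    "Exception while applying AggregateFunction in aggregating state", e);
        }
    }

    public static void main(String[] args) throws IOException {
        AggregateFunction<Integer, Long> sum = (v, a) -> a + v;
        System.out.println(addToState(sum, 5, 10L)); // prints 15

        // A failing function (standing in for the serializer-copy failure) shows
        // how the root cause ends up under "Caused by":
        AggregateFunction<Integer, Long> broken = (v, a) -> {
            throw new ArrayIndexOutOfBoundsException("-1");
        };
        try {
            addToState(broken, 1, 0L);
        } catch (IOException e) {
            System.out.println(e.getMessage());
            System.out.println("Caused by: " + e.getCause());
        }
    }
}
```

So when debugging, the frames above the "Caused by" line mostly locate where in the operator the failure happened; the frames below it locate what failed.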
Re: Check-pointing error
Posted by Congxian Qiu <qc...@gmail.com>.
Hi, Felipe Quirce
Could you reproduce this in standalone mode? Or could you share your code?
Best
Congxian
--
Blog: http://www.klion26.com
GTalk: qcx978132955
Follow your heart
Re: Check-pointing error
Posted by Felipe Quirce <pi...@gmail.com>.
Hi Chesnay,
I tried version 1.7.0 and hit the same error.
2018-11-30 13:13:00,718 INFO  org.apache.flink.runtime.taskmanager.Task - keyedstats-processor-165 -> map2alert-165 -> Process -> Sink: sink-level165 (1/4) (a972c963d4ee576a88c9116e946eec62) switched from RUNNING to FAILED.
java.io.IOException: Exception while applying AggregateFunction in
aggregating state
at org.apache.flink.runtime.state.heap.HeapAggregatingState.add(HeapAggregatingState.java:107)
at org.apache.flink.streaming.runtime.operators.windowing.WindowOperator.processElement(WindowOperator.java:391)
at org.apache.flink.streaming.runtime.io.StreamInputProcessor.processInput(StreamInputProcessor.java:202)
at org.apache.flink.streaming.runtime.tasks.OneInputStreamTask.run(OneInputStreamTask.java:105)
at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:300)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:704)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ArrayIndexOutOfBoundsException: -1
at com.esotericsoftware.kryo.util.IntArray.pop(IntArray.java:157)
at com.esotericsoftware.kryo.Kryo.reference(Kryo.java:822)
at com.esotericsoftware.kryo.serializers.CollectionSerializer.copy(CollectionSerializer.java:129)
at com.esotericsoftware.kryo.serializers.CollectionSerializer.copy(CollectionSerializer.java:22)
at com.esotericsoftware.kryo.Kryo.copy(Kryo.java:862)
at org.apache.flink.api.java.typeutils.runtime.kryo.KryoSerializer.copy(KryoSerializer.java:217)
at org.apache.flink.api.java.typeutils.runtime.PojoSerializer.copy(PojoSerializer.java:243)
at org.apache.flink.api.scala.typeutils.TraversableSerializer$$anonfun$copy$1.apply(TraversableSerializer.scala:69)
at org.apache.flink.api.scala.typeutils.TraversableSerializer$$anonfun$copy$1.apply(TraversableSerializer.scala:69)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.flink.api.scala.typeutils.TraversableSerializer.copy(TraversableSerializer.scala:69)
at org.apache.flink.api.scala.typeutils.TraversableSerializer.copy(TraversableSerializer.scala:33)
at org.apache.flink.api.scala.typeutils.CaseClassSerializer.copy(CaseClassSerializer.scala:101)
at org.apache.flink.api.scala.typeutils.CaseClassSerializer.copy(CaseClassSerializer.scala:32)
at org.apache.flink.api.scala.typeutils.TraversableSerializer$$anonfun$copy$1.apply(TraversableSerializer.scala:69)
at org.apache.flink.api.scala.typeutils.TraversableSerializer$$anonfun$copy$1.apply(TraversableSerializer.scala:69)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at org.apache.flink.api.scala.typeutils.TraversableSerializer.copy(TraversableSerializer.scala:69)
at org.apache.flink.api.scala.typeutils.TraversableSerializer.copy(TraversableSerializer.scala:33)
at org.apache.flink.runtime.state.heap.CopyOnWriteStateTable.transform(CopyOnWriteStateTable.java:465)
at org.apache.flink.runtime.state.heap.CopyOnWriteStateTable.transform(CopyOnWriteStateTable.java:341)
at org.apache.flink.runtime.state.heap.HeapAggregatingState.add(HeapAggregatingState.java:105)
... 6 more
2018-11-30 13:13:00,719 INFO  org.apache.flink.runtime.taskmanager.Task - Freeing task resources for keyedstats-processor-165 -> map2alert-165 -> Process -> Sink: sink-level165 (1/4) (a972c963d4ee576a88c9116e946eec62).
2018-11-30 13:13:00,748 INFO org.apache.flink.runtime.taskmanager.Task
Thanks
Re: Check-pointing error
Posted by Chesnay Schepler <ch...@apache.org>.
Would it be possible for you to try this with 1.6-SNAPSHOT? This issue
may have been fixed with https://issues.apache.org/jira/browse/FLINK-10839.
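For readers of the archive: the details of FLINK-10839 are not quoted in this thread, so independent of that fix, the Kryo frames in the stack trace are worth a comment. A Kryo instance keeps mutable internal state (including the reference stack whose IntArray.pop fails here), and Kryo instances are not thread-safe. The usual defensive pattern when such a serializer must serve multiple threads is one instance per thread, for example via ThreadLocal. A self-contained sketch, with a hypothetical FakeSerializer standing in for any non-thread-safe serializer:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class PerThreadSerializerDemo {

    // Stand-in for a non-thread-safe serializer (a real Kryo instance keeps
    // mutable internal state such as its reference stack). Purely illustrative.
    public static final class FakeSerializer {
        private int depth = 0; // pretend internal mutable state

        public String copy(String value) {
            depth++;
            try {
                return new String(value.toCharArray());
            } finally {
                depth--;
            }
        }
    }

    // One serializer per thread instead of one shared instance.
    private static final ThreadLocal<FakeSerializer> SERIALIZERS =
            ThreadLocal.withInitial(FakeSerializer::new);

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        List<Future<String>> copies = new ArrayList<>();
        for (int i = 0; i < 100; i++) {
            final String record = "record-" + i;
            // Each worker thread gets its own FakeSerializer from the ThreadLocal.
            copies.add(pool.submit(() -> SERIALIZERS.get().copy(record)));
        }
        for (int i = 0; i < 100; i++) {
            if (!copies.get(i).get().equals("record-" + i)) {
                throw new AssertionError("copy mismatch at " + i);
            }
        }
        pool.shutdown();
        System.out.println("100 concurrent copies ok");
    }
}
```

Whether the bug here is in Flink's serializer lifecycle or elsewhere is exactly what the linked ticket would settle; the sketch only shows the general isolation pattern.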
Re: Check-pointing error
Posted by Felipe Quirce <pi...@gmail.com>.
Hi
I'm using Flink 1.6.2, and the full stack trace is:
java.io.IOException: Exception while applying AggregateFunction in aggregating state
    at org.apache.flink.runtime.state.heap.HeapAggregatingState.add(HeapAggregatingState.java:107)
    at org.apache.flink.streaming.runtime.operators.windowing.WindowOperator.processElement(WindowOperator.java:391)
    at org.apache.flink.streaming.runtime.io.StreamInputProcessor.processInput(StreamInputProcessor.java:202)
    at org.apache.flink.streaming.runtime.tasks.OneInputStreamTask.run(OneInputStreamTask.java:105)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:300)
    at org.apache.flink.runtime.taskmanager.Task.run(Task.java:711)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ArrayIndexOutOfBoundsException: -1
    at com.esotericsoftware.kryo.util.IntArray.pop(IntArray.java:157)
    at com.esotericsoftware.kryo.Kryo.reference(Kryo.java:822)
    at com.esotericsoftware.kryo.serializers.CollectionSerializer.copy(CollectionSerializer.java:129)
    at com.esotericsoftware.kryo.serializers.CollectionSerializer.copy(CollectionSerializer.java:22)
    at com.esotericsoftware.kryo.Kryo.copy(Kryo.java:862)
    at org.apache.flink.api.java.typeutils.runtime.kryo.KryoSerializer.copy(KryoSerializer.java:217)
    at org.apache.flink.api.java.typeutils.runtime.PojoSerializer.copy(PojoSerializer.java:239)
    at org.apache.flink.api.scala.typeutils.TraversableSerializer$$anonfun$copy$1.apply(TraversableSerializer.scala:69)
    at org.apache.flink.api.scala.typeutils.TraversableSerializer$$anonfun$copy$1.apply(TraversableSerializer.scala:69)
    at scala.collection.immutable.List.foreach(List.scala:392)
    at org.apache.flink.api.scala.typeutils.TraversableSerializer.copy(TraversableSerializer.scala:69)
    at org.apache.flink.api.scala.typeutils.TraversableSerializer.copy(TraversableSerializer.scala:33)
    at org.apache.flink.api.scala.typeutils.CaseClassSerializer.copy(CaseClassSerializer.scala:101)
    at org.apache.flink.api.scala.typeutils.CaseClassSerializer.copy(CaseClassSerializer.scala:32)
    at org.apache.flink.api.scala.typeutils.TraversableSerializer$$anonfun$copy$1.apply(TraversableSerializer.scala:69)
    at org.apache.flink.api.scala.typeutils.TraversableSerializer$$anonfun$copy$1.apply(TraversableSerializer.scala:69)
    at scala.collection.Iterator$class.foreach(Iterator.scala:891)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at org.apache.flink.api.scala.typeutils.TraversableSerializer.copy(TraversableSerializer.scala:69)
    at org.apache.flink.api.scala.typeutils.TraversableSerializer.copy(TraversableSerializer.scala:33)
    at org.apache.flink.runtime.state.heap.CopyOnWriteStateTable.transform(CopyOnWriteStateTable.java:465)
    at org.apache.flink.runtime.state.heap.CopyOnWriteStateTable.transform(CopyOnWriteStateTable.java:341)
    at org.apache.flink.runtime.state.heap.HeapAggregatingState.add(HeapAggregatingState.java:105)
    ... 6 more
Thanks
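One way to read the "Caused by" chain in this trace: ArrayIndexOutOfBoundsException: -1 inside IntArray.pop means Kryo's internal reference stack was popped while already empty, i.e. its push/pop calls became unbalanced. That pattern typically points at a shared Kryo instance being used concurrently or re-entrantly during the copy, rather than at the user's AggregateFunction. This is a diagnosis hypothesis, not a confirmed root cause; the underflow mechanics themselves can be shown in a self-contained toy (plain Java, no Kryo):

```java
public class StackUnderflowDemo {

    // A toy int stack, loosely modelled on Kryo's IntArray: pop() on an empty
    // stack indexes items[-1] and throws ArrayIndexOutOfBoundsException, just
    // like the IntArray.pop frame in the trace.
    public static final class IntStack {
        private int[] items = new int[8];
        private int size = 0;

        public void push(int v) {
            if (size == items.length) {
                items = java.util.Arrays.copyOf(items, size * 2);
            }
            items[size++] = v;
        }

        public int pop() {
            return items[--size]; // size == 0 -> items[-1] -> AIOOBE
        }
    }

    public static void main(String[] args) {
        IntStack refs = new IntStack();
        refs.push(1);                   // balanced use: one push ...
        System.out.println(refs.pop()); // ... one pop, prints 1

        // Unbalanced use: an extra pop with nothing pushed, as could happen when
        // two callers interleave on one shared, non-thread-safe instance.
        try {
            refs.pop();
        } catch (ArrayIndexOutOfBoundsException e) {
            System.out.println("underflow: " + e.getClass().getSimpleName());
        }
    }
}
```

The toy shows why the failing index is exactly -1: the first illegal pop decrements size from 0 to -1 before indexing.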
Re: Check-pointing error
Posted by Chesnay Schepler <ch...@apache.org>.
Please provide the full exception stack trace and the version of Flink you are using.