Posted to issues@systemml.apache.org by "Matthias Boehm (JIRA)" <ji...@apache.org> on 2018/02/28 02:31:00 UTC
[jira] [Created] (SYSTEMML-2162) Deserialization of compressed blocks w/ shared dictionary fails in Spark
Matthias Boehm created SYSTEMML-2162:
----------------------------------------
Summary: Deserialization of compressed blocks w/ shared dictionary fails in Spark
Key: SYSTEMML-2162
URL: https://issues.apache.org/jira/browse/SYSTEMML-2162
Project: SystemML
Issue Type: Bug
Reporter: Matthias Boehm
For very large datasets that exceed aggregate cluster memory, we automatically apply compression. However, when blocks with shared DDC1 dictionaries are evicted and subsequently deserialized, the deserialization fails due to a mismatching number of bytes, i.e., the read path consumes a different number of bytes than the write path produced.
{code}
java.lang.IllegalStateException: unread block data
at java.io.ObjectInputStream$BlockDataInputStream.setBlockDataMode(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1383)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
{code}
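This "unread block data" failure is the typical symptom of an asymmetric Externalizable implementation: if readExternal consumes fewer (or more) bytes than writeExternal wrote — e.g., because a shared dictionary was written once but the read path expects it per block — ObjectInputStream finds leftover block data when it moves on to the next object. As a minimal sketch of the invariant involved (using a hypothetical SharedDictBlock class, not the actual SystemML CompressedMatrixBlock code), the write and read paths must mirror each other field for field:

```java
import java.io.*;
import java.util.Arrays;

// Hypothetical sketch, not the actual SystemML classes: an Externalizable
// block holding a DDC1-style dictionary plus per-cell codes. The invariant
// that this bug report is about: readExternal must consume exactly the
// bytes that writeExternal produced; any mismatch surfaces later as
// java.lang.IllegalStateException: unread block data.
public class SharedDictBlock implements Externalizable {
    double[] dict;  // dictionary of distinct values (shared across blocks in SystemML)
    byte[] codes;   // one code per cell, indexing into dict

    public SharedDictBlock() {} // no-arg constructor required by Externalizable

    public SharedDictBlock(double[] dict, byte[] codes) {
        this.dict = dict;
        this.codes = codes;
    }

    @Override
    public void writeExternal(ObjectOutput out) throws IOException {
        out.writeInt(dict.length);
        for (double d : dict)
            out.writeDouble(d);
        out.writeInt(codes.length);
        out.write(codes);
    }

    @Override
    public void readExternal(ObjectInput in) throws IOException {
        // Symmetric to writeExternal: same fields, same order, same sizes.
        dict = new double[in.readInt()];
        for (int i = 0; i < dict.length; i++)
            dict[i] = in.readDouble();
        codes = new byte[in.readInt()];
        in.readFully(codes);
    }

    // Round-trip helpers through Java serialization, as Spark's
    // JavaSerializer would exercise them on eviction/restore.
    public static byte[] toBytes(SharedDictBlock b) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(b);
        }
        return bos.toByteArray();
    }

    public static SharedDictBlock fromBytes(byte[] bytes)
            throws IOException, ClassNotFoundException {
        try (ObjectInputStream ois =
                new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return (SharedDictBlock) ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        SharedDictBlock b = new SharedDictBlock(
            new double[]{1.0, 2.5}, new byte[]{0, 1, 1, 0});
        SharedDictBlock r = fromBytes(toBytes(b));
        System.out.println(Arrays.equals(b.dict, r.dict)
            && Arrays.equals(b.codes, r.codes));
    }
}
```

With shared dictionaries the write path presumably serializes the dictionary only once and flags subsequent blocks to reuse it; the read path then has to honor that flag and skip the dictionary bytes accordingly, otherwise the byte counts diverge exactly as in the trace above.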
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)