Posted to commits@druid.apache.org by GitBox <gi...@apache.org> on 2019/08/16 19:41:09 UTC

[GitHub] [incubator-druid] drnushooz opened a new issue #8325: InvalidTypeIdException while deserializing Granularity JSON

URL: https://github.com/apache/incubator-druid/issues/8325
 
 
   I am getting `com.fasterxml.jackson.databind.exc.InvalidTypeIdException: Missing type id when trying to resolve subtype of [simple type, class org.apache.druid.java.util.common.granularity.PeriodGranularity]: missing type id property 'type'` when deserializing a `Granularity`.
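   
   For context, Druid registers `PeriodGranularity` under the Jackson type id `period`, so the `type`-as-property deserializer expects a JSON object carrying a `type` field; the second stack trace below shows the input was actually the bare string `"YEAR"`, which has no such field. A quick way to see the expected wire form (a sketch, assuming Druid's `DefaultObjectMapper` is on the classpath; the exact fields printed are an assumption):
   
   ```scala
   import org.apache.druid.jackson.DefaultObjectMapper
   import org.apache.druid.java.util.common.granularity.Granularities
   
   object ShowWireForm {
     def main(args: Array[String]): Unit = {
       val mapper = new DefaultObjectMapper()
       // Prints the object form carrying the "type" id that the polymorphic
       // deserializer needs, e.g. {"type":"period","period":"P1Y",...}
       println(mapper.writeValueAsString(Granularities.YEAR))
     }
   }
   ```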
   
   ### Affected Version
   
   0.15.0-incubating
   
   ### Description
   I am working on porting https://github.com/metamx/druid-spark-batch to the latest version of Druid, and a couple of unit tests fail while trying to deserialize a `Granularity` in `DateBucketPartitioner`. My code is located at https://github.com/drnushooz/druid-spark-batch/tree/druid-version-upgrade, and the exact location of the exception is https://github.com/drnushooz/druid-spark-batch/blob/druid-version-upgrade/src/main/scala/org/apache/druid/indexer/spark/DateBucketPartitioner.scala#L41.
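   
   For reference, the same Jackson failure mode can be reproduced without any Druid classes. The toy hierarchy below stands in for `Granularity`/`PeriodGranularity` (hypothetical names; jackson-module-scala is assumed on the classpath):
   
   ```scala
   import com.fasterxml.jackson.annotation.{JsonSubTypes, JsonTypeInfo}
   import com.fasterxml.jackson.databind.ObjectMapper
   import com.fasterxml.jackson.module.scala.DefaultScalaModule
   
   // Stand-in for the Granularity hierarchy: the subtype id lives in a
   // "type" property, as with Druid's granularity mixin.
   @JsonTypeInfo(use = JsonTypeInfo.Id.NAME, property = "type")
   @JsonSubTypes(Array(new JsonSubTypes.Type(value = classOf[PeriodLike], name = "period")))
   abstract class GranLike
   case class PeriodLike(period: String) extends GranLike
   
   object Repro {
     def main(args: Array[String]): Unit = {
       val mapper = new ObjectMapper().registerModule(DefaultScalaModule)
       // Succeeds: the object form carries the "type" id.
       mapper.readValue("""{"type": "period", "period": "P1Y"}""", classOf[GranLike])
       // Throws InvalidTypeIdException ("missing type id property 'type'"):
       // a bare JSON string has nowhere to carry the id.
       mapper.readValue("\"YEAR\"", classOf[GranLike])
     }
   }
   ```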
   
   The exception below is thrown when I run `"TestSparkDruidIndexer.The spark indexer return proper DataSegments"`, located at https://github.com/drnushooz/druid-spark-batch/blob/druid-version-upgrade/src/test/scala/org/apache/druid/indexer/spark/TestSparkDruidIndexer.scala#L48:
   
    ```
    Caused by: com.fasterxml.jackson.databind.exc.InvalidTypeIdException: Missing type id when trying to resolve subtype of [simple type, class org.apache.druid.java.util.common.granularity.PeriodGranularity]: missing type id property 'type'
    at [Source: UNKNOWN; line: 1, column: 1]
   	at com.fasterxml.jackson.databind.exc.InvalidTypeIdException.from(InvalidTypeIdException.java:43)
   	at com.fasterxml.jackson.databind.DeserializationContext.missingTypeIdException(DeserializationContext.java:1645)
   	at com.fasterxml.jackson.databind.DeserializationContext.handleMissingTypeId(DeserializationContext.java:1218)
   	at com.fasterxml.jackson.databind.jsontype.impl.TypeDeserializerBase._handleMissingTypeId(TypeDeserializerBase.java:300)
   	at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer._deserializeTypedUsingDefaultImpl(AsPropertyTypeDeserializer.java:164)
   	at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer.deserializeTypedFromObject(AsPropertyTypeDeserializer.java:88)
   	at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeWithType(BeanDeserializerBase.java:1178)
   	at com.fasterxml.jackson.databind.deser.impl.TypeWrappedDeserializer.deserialize(TypeWrappedDeserializer.java:68)
   	at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4013)
   	at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3004)
   	at org.apache.druid.indexer.spark.SerializedJson.fillFromMap(SerializedJson.scala:81)
   	at org.apache.druid.indexer.spark.SerializedJson.readObject(SerializedJson.scala:51)
   	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   	at java.lang.reflect.Method.invoke(Method.java:498)
   	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1170)
   	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2178)
   	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
   	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
   	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
   	at org.apache.druid.indexer.spark.DateBucketPartitioner.readObject(DateBucketPartitioner.scala:42)
   	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   	at java.lang.reflect.Method.invoke(Method.java:498)
   	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1170)
   	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2178)
   	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
   	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
   	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
   	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
   	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
   	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
   	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
   	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
   	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
   	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
   	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
   	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
   	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
   	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:88)
   	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
   	at org.apache.spark.scheduler.Task.run(Task.scala:121)
   	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
   	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
   	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
   	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
   	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at java.lang.Thread.run(Thread.java:748)
    ```
   
    Another variation of the same stack trace, in which the offending source is visible as the bare string `"YEAR"`:
   ```
   org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0 (TID 1, localhost, executor driver): com.fasterxml.jackson.databind.exc.InvalidTypeIdException: Missing type id when trying to resolve subtype of [simple type, class org.apache.druid.java.util.common.granularity.PeriodGranularity]: missing type id property 'type'
    at [Source: (String)""YEAR""; line: 1, column: 1]
   	at com.fasterxml.jackson.databind.exc.InvalidTypeIdException.from(InvalidTypeIdException.java:43)
   	at com.fasterxml.jackson.databind.DeserializationContext.missingTypeIdException(DeserializationContext.java:1645)
   	at com.fasterxml.jackson.databind.DeserializationContext.handleMissingTypeId(DeserializationContext.java:1218)
   	at com.fasterxml.jackson.databind.jsontype.impl.TypeDeserializerBase._handleMissingTypeId(TypeDeserializerBase.java:300)
   	at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer._deserializeTypedUsingDefaultImpl(AsPropertyTypeDeserializer.java:164)
   	at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer.deserializeTypedFromObject(AsPropertyTypeDeserializer.java:88)
   	at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeWithType(BeanDeserializerBase.java:1178)
   	at com.fasterxml.jackson.databind.deser.impl.TypeWrappedDeserializer.deserialize(TypeWrappedDeserializer.java:68)
   	at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4013)
   	at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3004)
   	at org.apache.druid.indexer.spark.SerializedJson.fillFromMap(SerializedJson.scala:81)
   	at org.apache.druid.indexer.spark.SerializedJson.readObject(SerializedJson.scala:51)
   	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   	at java.lang.reflect.Method.invoke(Method.java:498)
   	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1170)
   	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2178)
   	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
   	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
   	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
   	at org.apache.druid.indexer.spark.DateBucketPartitioner.readObject(DateBucketPartitioner.scala:42)
   	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   	at java.lang.reflect.Method.invoke(Method.java:498)
   	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1170)
   	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2178)
   	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
   	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
   	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
   	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
   	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
   	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
   	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
   	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
   	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
   	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
   	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
   	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
   	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
   	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:88)
   	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
   	at org.apache.spark.scheduler.Task.run(Task.scala:121)
   	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
   	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
   	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
   	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
   	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
   	at java.lang.Thread.run(Thread.java:748)
   ```
    I looked through GranularitySpec.scala to check whether the input schema was at fault, but that does not seem to be the case. What am I doing wrong when deserializing `Granularity`?
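    
    In case it is related: my reading of the trace is that `SerializedJson` records the runtime class (`PeriodGranularity`) and deserializes against it, while the serialized form is the base-type string `"YEAR"` (that reading is an assumption, not confirmed). One workaround I plan to try is forcing the round trip through the abstract `Granularity` type instead; a sketch, again assuming Druid's `DefaultObjectMapper`:
    
    ```scala
    import org.apache.druid.jackson.DefaultObjectMapper
    import org.apache.druid.java.util.common.granularity.{Granularities, Granularity}
    
    object GranularityRoundTrip {
      def main(args: Array[String]): Unit = {
        val mapper = new DefaultObjectMapper()
    
        // Serialize against the declared base type, not the runtime subtype,
        // so the polymorphic "type" id is written out.
        val json = mapper.writerFor(classOf[Granularity])
          .writeValueAsString(Granularities.YEAR)
    
        // Deserialize against the base type as well, letting the subtype
        // resolver dispatch on the "type" field instead of failing the way
        // readValue(..., classOf[PeriodGranularity]) does on a bare string.
        val back = mapper.readValue(json, classOf[Granularity])
        println(back)
      }
    }
    ```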
