Posted to issues@spark.apache.org by "John Ferguson (JIRA)" <ji...@apache.org> on 2016/04/26 20:15:13 UTC

[jira] [Resolved] (SPARK-14919) Spark cannot be used with software that requires jackson-databind 2.6+: RDDOperationScope

     [ https://issues.apache.org/jira/browse/SPARK-14919?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

John Ferguson resolved SPARK-14919.
-----------------------------------
    Resolution: Not A Problem

Although it is not optimal, we can force the required Jackson dependencies, specifically the Scala module, to be up to date by declaring them in the POM of the application consuming Spark.  This was not immediately obvious without digging into a lot of other documentation, such as: https://github.com/FasterXML/jackson-module-scala/issues/177
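
For example, a minimal sketch of that workaround in the consuming application's POM (the 2.6.5 version and the _2.10 Scala suffix are illustrative assumptions, not tested recommendations):

    <dependencyManagement>
      <dependencies>
        <!-- Pin a single Jackson version across Spark and the front end -->
        <dependency>
          <groupId>com.fasterxml.jackson.core</groupId>
          <artifactId>jackson-databind</artifactId>
          <version>2.6.5</version>
        </dependency>
        <!-- The Scala module must match jackson-databind's major.minor;
             adjust the _2.10 suffix to your Scala version -->
        <dependency>
          <groupId>com.fasterxml.jackson.module</groupId>
          <artifactId>jackson-module-scala_2.10</artifactId>
          <version>2.6.5</version>
        </dependency>
      </dependencies>
    </dependencyManagement>

Entries in dependencyManagement override the versions Maven would otherwise resolve for transitive dependencies, so Spark's Jackson artifacts and the front end's are forced onto the same release.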

However, given that Jackson keeps moving forward with little regard for how its changes affect legacy code, this issue may well return.
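
For reference, a minimal sketch of the kind of call that trips over this (assuming Spark 1.6.x in local mode; the input path is hypothetical). Any RDD-creating API routed through withScope will do, since that is where RDDOperationScope.fromJson invokes Jackson:

    import org.apache.spark.{SparkConf, SparkContext}

    object JacksonRepro {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("jackson-repro").setMaster("local[*]")
        val sc = new SparkContext(conf)
        // textFile -> SparkContext.withScope -> RDDOperationScope.fromJson,
        // where Jackson deserializes {"id":"0","name":"textFile"} and fails
        // if jackson-databind 2.6+ is paired with an older jackson-module-scala
        val lines = sc.textFile("/tmp/example.txt") // hypothetical path
        println(lines.count())
        sc.stop()
      }
    }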

> Spark cannot be used with software that requires jackson-databind 2.6+: RDDOperationScope
> -----------------------------------------------------------------------------------------
>
>                 Key: SPARK-14919
>                 URL: https://issues.apache.org/jira/browse/SPARK-14919
>             Project: Spark
>          Issue Type: Bug
>          Components: Input/Output
>    Affects Versions: 1.6.1
>         Environment: Linux, OSX
>            Reporter: John Ferguson
>
> When using Spark 1.4.x or Spark 1.6.1 in an application that has a front end requiring jackson-databind 2.6+, we see the following exceptions:
> Subset of stack trace:
> ==================
> com.fasterxml.jackson.databind.JsonMappingException: Could not find creator property with name 'id' (in class org.apache.spark.rdd.RDDOperationScope)
>  at [Source: {"id":"0","name":"textFile"}; line: 1, column: 1]
>   at com.fasterxml.jackson.databind.JsonMappingException.from(JsonMappingException.java:148)
>   at com.fasterxml.jackson.databind.DeserializationContext.mappingException(DeserializationContext.java:843)
>   at com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.addBeanProps(BeanDeserializerFactory.java:533)
>   at com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.buildBeanDeserializer(BeanDeserializerFactory.java:220)
>   at com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.createBeanDeserializer(BeanDeserializerFactory.java:143)
>   at com.fasterxml.jackson.databind.deser.DeserializerCache._createDeserializer2(DeserializerCache.java:405)
>   at com.fasterxml.jackson.databind.deser.DeserializerCache._createDeserializer(DeserializerCache.java:354)
>   at com.fasterxml.jackson.databind.deser.DeserializerCache._createAndCache2(DeserializerCache.java:262)
>   at com.fasterxml.jackson.databind.deser.DeserializerCache._createAndCacheValueDeserializer(DeserializerCache.java:242)
>   at com.fasterxml.jackson.databind.deser.DeserializerCache.findValueDeserializer(DeserializerCache.java:143)
>   at com.fasterxml.jackson.databind.DeserializationContext.findRootValueDeserializer(DeserializationContext.java:439)
>   at com.fasterxml.jackson.databind.ObjectMapper._findRootDeserializer(ObjectMapper.java:3664)
>   at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3556)
>   at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2576)
>   at org.apache.spark.rdd.RDDOperationScope$.fromJson(RDDOperationScope.scala:85)
>   at org.apache.spark.rdd.RDDOperationScope$$anonfun$5.apply(RDDOperationScope.scala:136)
>   at org.apache.spark.rdd.RDDOperationScope$$anonfun$5.apply(RDDOperationScope.scala:136)
>   at scala.Option.map(Option.scala:145)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:136)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
>   at org.apache.spark.SparkContext.withScope(SparkContext.scala:714)
>   at org.apache.spark.SparkContext.hadoopFile(SparkContext.scala:1011)
>   at org.apache.spark.SparkContext$$anonfun$textFile$1.apply(SparkContext.scala:832)
>   at org.apache.spark.SparkContext$$anonfun$textFile$1.apply(SparkContext.scala:830)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
>   at org.apache.spark.SparkContext.withScope(SparkContext.scala:714)
>   at org.apache.spark.SparkContext.textFile(SparkContext.scala:830)


