Posted to issues@spark.apache.org by "Dev Lakhani (JIRA)" <ji...@apache.org> on 2015/09/24 16:46:04 UTC

[jira] [Created] (SPARK-10798) JsonMappingException with Spark Context Parallelize

Dev Lakhani created SPARK-10798:
-----------------------------------

             Summary: JsonMappingException with Spark Context Parallelize
                 Key: SPARK-10798
                 URL: https://issues.apache.org/jira/browse/SPARK-10798
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 1.5.0
          Environment: Linux, Java 1.8.0_40
            Reporter: Dev Lakhani


When trying to create an RDD of Rows using a Java Spark Context:

List<Row> rows = new Vector<Row>();
rows.add(RowFactory.create("test"));
javaSparkContext.parallelize(rows);
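
For reference, a self-contained version of the same reproduction might look like the sketch below; the SparkConf setup, local master URL, app name and class name are assumptions added for illustration, not part of the original snippet:

import java.util.List;
import java.util.Vector;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;

public class ParallelizeRepro {
    public static void main(String[] args) {
        // Local master URL and app name are assumptions for a minimal standalone run.
        SparkConf conf = new SparkConf().setMaster("local[*]").setAppName("SPARK-10798-repro");
        JavaSparkContext javaSparkContext = new JavaSparkContext(conf);

        List<Row> rows = new Vector<Row>();
        rows.add(RowFactory.create("test"));

        // The exception below is thrown from inside parallelize, before the RDD is used.
        javaSparkContext.parallelize(rows);

        javaSparkContext.stop();
    }
}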

I get:

com.fasterxml.jackson.databind.JsonMappingException: (None,None) (of class scala.Tuple2) (through reference chain: org.apache.spark.rdd.RDDOperationScope["parent"])
               at com.fasterxml.jackson.databind.JsonMappingException.wrapWithPath(JsonMappingException.java:210)
               at com.fasterxml.jackson.databind.JsonMappingException.wrapWithPath(JsonMappingException.java:177)
               at com.fasterxml.jackson.databind.ser.std.StdSerializer.wrapAndThrow(StdSerializer.java:187)
               at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:647)
               at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:152)
               at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider.serializeValue(DefaultSerializerProvider.java:128)
               at com.fasterxml.jackson.databind.ObjectMapper._configAndWriteValue(ObjectMapper.java:2881)
               at com.fasterxml.jackson.databind.ObjectMapper.writeValueAsString(ObjectMapper.java:2338)
               at org.apache.spark.rdd.RDDOperationScope.toJson(RDDOperationScope.scala:50)
               at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:141)
               at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
               at org.apache.spark.SparkContext.withScope(SparkContext.scala:700)
               at org.apache.spark.SparkContext.parallelize(SparkContext.scala:714)
               at org.apache.spark.api.java.JavaSparkContext.parallelize(JavaSparkContext.scala:145)
               at org.apache.spark.api.java.JavaSparkContext.parallelize(JavaSparkContext.scala:157)
               ...
Caused by: scala.MatchError: (None,None) (of class scala.Tuple2)
               at com.fasterxml.jackson.module.scala.ser.OptionSerializer$$anonfun$serialize$1.apply$mcV$sp(OptionSerializerModule.scala:32)
               at com.fasterxml.jackson.module.scala.ser.OptionSerializer$$anonfun$serialize$1.apply(OptionSerializerModule.scala:32)
               at com.fasterxml.jackson.module.scala.ser.OptionSerializer$$anonfun$serialize$1.apply(OptionSerializerModule.scala:32)
               at scala.Option.getOrElse(Option.scala:120)
               at com.fasterxml.jackson.module.scala.ser.OptionSerializer.serialize(OptionSerializerModule.scala:31)
               at com.fasterxml.jackson.module.scala.ser.OptionSerializer.serialize(OptionSerializerModule.scala:22)
               at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:505)
               at com.fasterxml.jackson.module.scala.ser.OptionPropertyWriter.serializeAsField(OptionSerializerModule.scala:128)
               at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:639)
               ... 19 more

I've tried updating jackson-module-scala to 2.6.1, but I get the same issue. This happens in local mode with Java 1.8.0_40.
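
Since the failure happens inside jackson-module-scala's OptionSerializer, one possibility (an assumption on my part, not confirmed) is that mismatched Jackson versions end up on the classpath. A quick way to check which jars the Jackson classes are actually loaded from at runtime is a small probe like this:

import com.fasterxml.jackson.databind.ObjectMapper;

public class JacksonClasspathCheck {
    public static void main(String[] args) throws Exception {
        // Print the jar each Jackson class is loaded from, to spot a version mismatch
        // between jackson-databind and jackson-module-scala on the classpath.
        System.out.println("jackson-databind loaded from:     "
                + ObjectMapper.class.getProtectionDomain().getCodeSource().getLocation());

        // Class.forName avoids a compile-time dependency on jackson-module-scala.
        Class<?> scalaModule =
                Class.forName("com.fasterxml.jackson.module.scala.DefaultScalaModule");
        System.out.println("jackson-module-scala loaded from: "
                + scalaModule.getProtectionDomain().getCodeSource().getLocation());
    }
}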
 


