Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2016/11/07 02:31:58 UTC

[jira] [Created] (SPARK-18295) Match up to_json to from_json in null safety

Hyukjin Kwon created SPARK-18295:
------------------------------------

             Summary: Match up to_json to from_json in null safety
                 Key: SPARK-18295
                 URL: https://issues.apache.org/jira/browse/SPARK-18295
             Project: Spark
          Issue Type: Bug
          Components: SQL
            Reporter: Hyukjin Kwon


Calling {{to_json}} on a row whose struct value is null throws a NullPointerException, whereas {{from_json}} handles null input safely (returning null). {{to_json}} should match this null safety:

{code}
scala> val df = Seq(Some(Tuple1(Tuple1(1))), None).toDF("a")
df: org.apache.spark.sql.DataFrame = [a: struct<_1: int>]

scala> df.show()
+----+
|   a|
+----+
| [1]|
|null|
+----+


scala> df.select(to_json($"a")).show()
java.lang.NullPointerException
  at org.apache.spark.sql.catalyst.json.JacksonGenerator.org$apache$spark$sql$catalyst$json$JacksonGenerator$$writeFields(JacksonGenerator.scala:138)
  at org.apache.spark.sql.catalyst.json.JacksonGenerator$$anonfun$write$1.apply$mcV$sp(JacksonGenerator.scala:194)
  at org.apache.spark.sql.catalyst.json.JacksonGenerator.org$apache$spark$sql$catalyst$json$JacksonGenerator$$writeObject(JacksonGenerator.scala:131)
  at org.apache.spark.sql.catalyst.json.JacksonGenerator.write(JacksonGenerator.scala:193)
  at org.apache.spark.sql.catalyst.expressions.StructToJson.eval(jsonExpressions.scala:544)
  at org.apache.spark.sql.catalyst.expressions.Alias.eval(namedExpressions.scala:142)
  at org.apache.spark.sql.catalyst.expressions.InterpretedProjection.apply(Projection.scala:48)
  at org.apache.spark.sql.catalyst.expressions.InterpretedProjection.apply(Projection.scala:30)
  at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
{code}
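A minimal sketch of the intended behavior in plain Scala (an illustration only, not Spark's actual {{JacksonGenerator}} code): generating JSON from a missing row should yield null, mirroring {{from_json}}'s null-in/null-out semantics, rather than dereferencing the row and throwing. The helper name {{toJsonNullSafe}} is hypothetical:

```scala
object NullSafeToJsonSketch {
  // Hypothetical helper (not a Spark API): serialize an optional
  // struct-like value to a JSON string. A None row yields null,
  // matching from_json's null safety, instead of raising an NPE.
  def toJsonNullSafe(row: Option[Int]): String = row match {
    case Some(v) => s"""{"_1":$v}"""
    case None    => null // null in, null out
  }

  def main(args: Array[String]): Unit = {
    Seq(Some(1), None).map(toJsonNullSafe).foreach(println)
  }
}
```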



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
