Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2017/08/09 13:27:00 UTC

[jira] [Comment Edited] (SPARK-21677) json_tuple throws NullPointerException when column is null as string type.

    [ https://issues.apache.org/jira/browse/SPARK-21677?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16119891#comment-16119891 ] 

Hyukjin Kwon edited comment on SPARK-21677 at 8/9/17 1:26 PM:
--------------------------------------------------------------

cc [~viirya], I remember your mentee was checking through JSON-related code paths. Does this make sense to you, and would you be interested in this? I don't have time to work on this and am currently fighting with an AppVeyor time-limit issue.


was (Author: hyukjin.kwon):
cc [~viirya], I remember your mentee was checking through JSON related code paths. Does this make sense to you and would you be interested in this?

> json_tuple throws NullPointerException when column is null as string type.
> --------------------------------------------------------------------------
>
>                 Key: SPARK-21677
>                 URL: https://issues.apache.org/jira/browse/SPARK-21677
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.0
>            Reporter: Hyukjin Kwon
>            Priority: Minor
>              Labels: Starter
>
> I was testing {{json_tuple}} before using it to extract values from JSONs in my testing cluster, but I found it can sometimes throw a {{NullPointerException}}, as shown below:
> {code}
> scala> Seq(("""{"Hyukjin": 224, "John": 1225}""")).toDS.selectExpr("json_tuple(value, trim(' Hyukjin    '))").show()
> +---+
> | c0|
> +---+
> |224|
> +---+
> scala> Seq(("""{"Hyukjin": 224, "John": 1225}""")).toDS.selectExpr("json_tuple(value, trim(' Jackson    '))").show()
> +----+
> |  c0|
> +----+
> |null|
> +----+
> scala> Seq(("""{"Hyukjin": 224, "John": 1225}""")).toDS.selectExpr("json_tuple(value, trim(null))").show()
> ...
> java.lang.NullPointerException
> 	at org.apache.spark.sql.catalyst.expressions.JsonTuple$$anonfun$foldableFieldNames$1.apply(jsonExpressions.scala:367)
> 	at org.apache.spark.sql.catalyst.expressions.JsonTuple$$anonfun$foldableFieldNames$1.apply(jsonExpressions.scala:366)
> 	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
> 	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
> 	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
> 	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
> 	at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
> 	at scala.collection.AbstractTraversable.map(Traversable.scala:104)
> 	at org.apache.spark.sql.catalyst.expressions.JsonTuple.foldableFieldNames$lzycompute(jsonExpressions.scala:366)
> 	at org.apache.spark.sql.catalyst.expressions.JsonTuple.foldableFieldNames(jsonExpressions.scala:365)
> 	at org.apache.spark.sql.catalyst.expressions.JsonTuple.constantFields$lzycompute(jsonExpressions.scala:373)
> 	at org.apache.spark.sql.catalyst.expressions.JsonTuple.constantFields(jsonExpressions.scala:373)
> 	at org.apache.spark.sql.catalyst.expressions.JsonTuple.org$apache$spark$sql$catalyst$expressions$JsonTuple$$parseRow(jsonExpressions.scala:417)
> 	at org.apache.spark.sql.catalyst.expressions.JsonTuple$$anonfun$eval$4.apply(jsonExpressions.scala:401)
> 	at org.apache.spark.sql.catalyst.expressions.JsonTuple$$anonfun$eval$4.apply(jsonExpressions.scala:400)
> 	at org.apache.spark.util.Utils$.tryWithResource(Utils.scala:2559)
> 	at org.apache.spark.sql.catalyst.expressions.JsonTuple.eval(jsonExpressions.scala:400)
> {code}
> It sounds like we should either show an explicit error message or return {{NULL}}.
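
The stack trace points at {{foldableFieldNames}} ({{jsonExpressions.scala:366-367}}), where each foldable field-name expression is evaluated and converted to a string; a {{null}} literal then blows up on the string conversion. A minimal pure-Scala sketch of the null-safe alternative, assuming nothing about the real Catalyst classes ({{FieldExpr}} and the method shape here are hypothetical, for illustration only):

{code}
// Hypothetical stand-in for a foldable Catalyst expression whose
// eval() may return null (e.g. json_tuple(value, trim(null))).
case class FieldExpr(value: Any) {
  def eval(): Any = value
}

// Null-safe variant of the pattern at fault: wrap the eval() result in
// Option so a null field name folds to None instead of throwing an NPE.
// Callers can then emit NULL for None, matching the unknown-field case.
def foldableFieldNames(fields: Seq[FieldExpr]): Seq[Option[String]] =
  fields.map { f =>
    Option(f.eval()).map(_.toString) // null -> None, "Hyukjin" -> Some("Hyukjin")
  }
{code}

With this shape, {{json_tuple(value, trim(null))}} would behave like the unknown-field case above and return {{null}} for that column rather than failing the query.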



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
