Posted to issues@spark.apache.org by "Cédric Chantepie (Jira)" <ji...@apache.org> on 2022/10/06 12:49:00 UTC

[jira] [Updated] (SPARK-40678) ArrayType is not properly supported

     [ https://issues.apache.org/jira/browse/SPARK-40678?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Cédric Chantepie updated SPARK-40678:
-------------------------------------
    Description: 
Values with `ArrayType` are no longer properly supported: calling `Row.json` on a row that contains an array of structs fails; e.g.

{noformat}
import org.apache.spark.sql.SparkSession

case class KeyValue(key: String, value: Array[Byte])

val spark = SparkSession.builder().master("local[1]").appName("test").getOrCreate()

import spark.implicits._

val df = Seq(Array(KeyValue("foo", "bar".getBytes))).toDF()

// Row.json is called on each row and throws for the array<struct<key,value>> column
df.foreach(r => println(r.json))
{noformat}

Expected:

{noformat}
[{foo, bar}]
{noformat}

Encountered:

{noformat}
java.lang.IllegalArgumentException: Failed to convert value ArraySeq([foo,[B@dcdb68f]) (class of class scala.collection.mutable.ArraySeq$ofRef}) with the type of ArrayType(Seq(StructField(key,StringType,false), StructField(value,BinaryType,false)),true) to JSON.
	at org.apache.spark.sql.Row.toJson$1(Row.scala:604)
	at org.apache.spark.sql.Row.jsonValue(Row.scala:613)
	at org.apache.spark.sql.Row.jsonValue$(Row.scala:552)
	at org.apache.spark.sql.catalyst.expressions.GenericRow.jsonValue(rows.scala:166)
	at org.apache.spark.sql.Row.json(Row.scala:535)
	at org.apache.spark.sql.Row.json$(Row.scala:535)
	at org.apache.spark.sql.catalyst.expressions.GenericRow.json(rows.scala:166)
{noformat}
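
One possible workaround (a sketch, not a fix for `Row.json` itself) is to serialize the array column with the built-in `to_json` function, which converts the column inside the query and does not go through `Row.json`. This assumes the goal is only to print each row as JSON, and relies on `toDF()` naming the single column `value` by default:

{noformat}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, to_json}

case class KeyValue(key: String, value: Array[Byte])

val spark = SparkSession.builder().master("local[1]").appName("test").getOrCreate()

import spark.implicits._

// Same data as the repro above: a single column named "value"
// with type array<struct<key:string,value:binary>>
val df = Seq(Array(KeyValue("foo", "bar".getBytes))).toDF()

// to_json converts the array column to a JSON string within the query,
// so Row.json is never invoked
df.select(to_json(col("value")).as("json")).collect().foreach(r => println(r.getString(0)))
{noformat}

Note that Spark's JSON writer emits binary fields as base64, so the `value` field appears encoded rather than as raw text.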

  was:
Values with `ArrayType` are no longer properly supported; e.g.

```
import org.apache.spark.sql.SparkSession

case class KeyValue(key: String, value: Array[Byte])

val spark = SparkSession.builder().master("local[1]").appName("test").getOrCreate()

import spark.implicits._

val df = Seq(Array(KeyValue("foo", "bar".getBytes))).toDF()

df.foreach(r => println(r.json))
```

Expected:

```
[{foo, bar}]
```

Encountered:

```
java.lang.IllegalArgumentException: Failed to convert value ArraySeq([foo,[B@dcdb68f]) (class of class scala.collection.mutable.ArraySeq$ofRef}) with the type of ArrayType(Seq(StructField(key,StringType,false), StructField(value,BinaryType,false)),true) to JSON.
	at org.apache.spark.sql.Row.toJson$1(Row.scala:604)
	at org.apache.spark.sql.Row.jsonValue(Row.scala:613)
	at org.apache.spark.sql.Row.jsonValue$(Row.scala:552)
	at org.apache.spark.sql.catalyst.expressions.GenericRow.jsonValue(rows.scala:166)
	at org.apache.spark.sql.Row.json(Row.scala:535)
	at org.apache.spark.sql.Row.json$(Row.scala:535)
	at org.apache.spark.sql.catalyst.expressions.GenericRow.json(rows.scala:166)
```


> ArrayType is not properly supported
> -----------------------------------
>
>                 Key: SPARK-40678
>                 URL: https://issues.apache.org/jira/browse/SPARK-40678
>             Project: Spark
>          Issue Type: Bug
>          Components: Input/Output
>    Affects Versions: 3.2.0
>            Reporter: Cédric Chantepie
>            Priority: Major
>
> Values with `ArrayType` are no longer properly supported: calling `Row.json` on a row that contains an array of structs fails; e.g.
> {noformat}
> import org.apache.spark.sql.SparkSession
> case class KeyValue(key: String, value: Array[Byte])
> val spark = SparkSession.builder().master("local[1]").appName("test").getOrCreate()
> import spark.implicits._
> val df = Seq(Array(KeyValue("foo", "bar".getBytes))).toDF()
> df.foreach(r => println(r.json))
> {noformat}
> Expected:
> {noformat}
> [{foo, bar}]
> {noformat}
> Encountered:
> {noformat}
> java.lang.IllegalArgumentException: Failed to convert value ArraySeq([foo,[B@dcdb68f]) (class of class scala.collection.mutable.ArraySeq$ofRef}) with the type of ArrayType(Seq(StructField(key,StringType,false), StructField(value,BinaryType,false)),true) to JSON.
> 	at org.apache.spark.sql.Row.toJson$1(Row.scala:604)
> 	at org.apache.spark.sql.Row.jsonValue(Row.scala:613)
> 	at org.apache.spark.sql.Row.jsonValue$(Row.scala:552)
> 	at org.apache.spark.sql.catalyst.expressions.GenericRow.jsonValue(rows.scala:166)
> 	at org.apache.spark.sql.Row.json(Row.scala:535)
> 	at org.apache.spark.sql.Row.json$(Row.scala:535)
> 	at org.apache.spark.sql.catalyst.expressions.GenericRow.json(rows.scala:166)
> {noformat}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org