Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2023/01/12 05:25:00 UTC

[jira] [Created] (SPARK-42015) Support struct as a key in map

Hyukjin Kwon created SPARK-42015:
------------------------------------

             Summary: Support struct as a key in map
                 Key: SPARK-42015
                 URL: https://issues.apache.org/jira/browse/SPARK-42015
             Project: Spark
          Issue Type: Sub-task
          Components: Connect
    Affects Versions: 3.4.0
            Reporter: Hyukjin Kwon


{code}
pyspark/sql/tests/test_serde.py:54 (SerdeParityTests.test_struct_in_map)
self = <pyspark.sql.tests.connect.test_parity_serde.SerdeParityTests testMethod=test_struct_in_map>

    def test_struct_in_map(self):
        d = [Row(m={Row(i=1): Row(s="")})]
>       df = self.spark.createDataFrame(d).toDF()

../test_serde.py:57: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
../../connect/session.py:278: in createDataFrame
    _table = pa.Table.from_pylist([row.asDict(recursive=True) for row in _data])
pyarrow/table.pxi:3700: in pyarrow.lib.Table.from_pylist
    ???
pyarrow/table.pxi:5221: in pyarrow.lib._from_pylist
    ???
pyarrow/table.pxi:3575: in pyarrow.lib.Table.from_arrays
    ???
pyarrow/table.pxi:1383: in pyarrow.lib._sanitize_arrays
    ???
pyarrow/table.pxi:1364: in pyarrow.lib._schema_from_arrays
    ???
pyarrow/array.pxi:320: in pyarrow.lib.array
    ???
pyarrow/array.pxi:39: in pyarrow.lib._sequence_to_array
    ???
pyarrow/error.pxi:144: in pyarrow.lib.pyarrow_internal_check_status
    ???
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

>   ???
E   pyarrow.lib.ArrowTypeError: Expected dict key of type str or bytes, got 'Row'

pyarrow/error.pxi:123: ArrowTypeError
{code}
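For context on the traceback above: the failure originates in `session.py`, where `Row.asDict(recursive=True)` converts nested `Row` *values* into dicts but passes dict *keys* through untouched. The map key therefore reaches `pa.Table.from_pylist` still as a `Row`, and PyArrow requires struct/dict keys to be `str` or `bytes`. A minimal stdlib sketch of this key/value asymmetry (the `Row` class below is a hypothetical simplification for illustration, not PySpark's actual implementation):

```python
# Simplified stand-in for pyspark.sql.Row (illustrative only): a tuple of
# named fields with a recursive asDict(), mirroring the asymmetry that
# dict values are converted while dict keys are not.
class Row(tuple):
    def __new__(cls, **kwargs):
        row = super().__new__(cls, tuple(kwargs.values()))
        row.__fields__ = list(kwargs)
        return row

    def asDict(self, recursive=False):
        def conv(obj):
            if isinstance(obj, Row):
                return obj.asDict(True)
            if isinstance(obj, dict):
                # Only the *values* are converted here; keys pass through
                # unchanged, so a Row used as a map key stays a Row.
                return {k: conv(v) for k, v in obj.items()}
            return obj

        if recursive:
            return {f: conv(v) for f, v in zip(self.__fields__, self)}
        return dict(zip(self.__fields__, self))


# Mirror the failing test data: a map whose key is itself a struct.
d = Row(m={Row(i=1): Row(s="")})
converted = d.asDict(recursive=True)

key = next(iter(converted["m"]))
print(type(key).__name__)       # → Row  (not str/bytes)
print(converted["m"][key])      # → {'s': ''}  (the value *was* converted)
```

The sketch shows why PyArrow raises `ArrowTypeError: Expected dict key of type str or bytes, got 'Row'`: the dict handed to `from_pylist` still contains a `Row` key, so fixing this on the Connect path means converting (or rejecting) struct-typed map keys before the data is handed to Arrow.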



