Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2022/11/01 22:16:00 UTC
[jira] [Commented] (SPARK-40990) DataFrame creation from 2d NumPy array with arbitrary columns
[ https://issues.apache.org/jira/browse/SPARK-40990?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17627356#comment-17627356 ]
Apache Spark commented on SPARK-40990:
--------------------------------------
User 'xinrong-meng' has created a pull request for this issue:
https://github.com/apache/spark/pull/38473
> DataFrame creation from 2d NumPy array with arbitrary columns
> -------------------------------------------------------------
>
> Key: SPARK-40990
> URL: https://issues.apache.org/jira/browse/SPARK-40990
> Project: Spark
> Issue Type: Sub-task
> Components: PySpark
> Affects Versions: 3.4.0
> Reporter: Xinrong Meng
> Priority: Major
>
> Currently, DataFrame creation from a 2d NumPy ndarray works only when the array has exactly 2 columns. We should provide complete support for DataFrame creation from a 2d ndarray with an arbitrary number of columns.
> For example, the following should work:
>
> {code:java}
> >>> spark.createDataFrame(np.arange(100).reshape([10,10])).show()
> +---+---+---+---+---+---+---+---+---+---+
> | _1| _2| _3| _4| _5| _6| _7| _8| _9|_10|
> +---+---+---+---+---+---+---+---+---+---+
> |  0|  1|  2|  3|  4|  5|  6|  7|  8|  9|
> | 10| 11| 12| 13| 14| 15| 16| 17| 18| 19|
> | 20| 21| 22| 23| 24| 25| 26| 27| 28| 29|
> | 30| 31| 32| 33| 34| 35| 36| 37| 38| 39|
> | 40| 41| 42| 43| 44| 45| 46| 47| 48| 49|
> | 50| 51| 52| 53| 54| 55| 56| 57| 58| 59|
> | 60| 61| 62| 63| 64| 65| 66| 67| 68| 69|
> | 70| 71| 72| 73| 74| 75| 76| 77| 78| 79|
> | 80| 81| 82| 83| 84| 85| 86| 87| 88| 89|
> | 90| 91| 92| 93| 94| 95| 96| 97| 98| 99|
> +---+---+---+---+---+---+---+---+---+---+
> {code}
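Until native support lands, one workaround is to convert the ndarray to a list of plain Python rows first, since createDataFrame already accepts a list of tuples with any number of columns. A minimal sketch (the SparkSession variable `spark` is assumed, as in the example above; the Spark calls are commented out so the conversion can be checked standalone):

```python
import numpy as np

# Build the same 10x10 array as in the example above.
arr = np.arange(100).reshape([10, 10])

# Convert each ndarray row to a plain Python tuple of ints;
# createDataFrame accepts a list of tuples regardless of column count.
rows = [tuple(int(v) for v in row) for row in arr]

# With a running SparkSession named `spark` (assumed), this works
# for any number of columns:
# df = spark.createDataFrame(rows)
# df.show()
```

This sidesteps the 2-column restriction by never passing the ndarray itself to createDataFrame.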
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org