Posted to issues@spark.apache.org by "Douglas Moore (Jira)" <ji...@apache.org> on 2022/11/01 02:59:00 UTC

[jira] [Commented] (SPARK-37697) Make it easier to convert numpy arrays to Spark Dataframes

    [ https://issues.apache.org/jira/browse/SPARK-37697?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17626928#comment-17626928 ] 

Douglas Moore commented on SPARK-37697:
---------------------------------------

[~XinrongM] I can confirm the one-dimensional case works on Spark 3.4.0 but not on 3.3.x:
{code:python}
import numpy
# Works on Spark 3.4.0; fails on 3.3.x with the schema-inference error quoted below
df = spark.createDataFrame(numpy.arange(10))
df = spark.createDataFrame(numpy.arange(10.))
{code}
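
For clusters still on 3.3.x, a version-guarded fallback through pandas should work on both lines of versions. This is my own sketch, not something from the ticket, and assumes an active SparkSession named spark:
{code:python}
import numpy as np
import pandas as pd

# Spark >= 3.4.0 accepts the 1-D ndarray directly; older versions raise
# "Can not infer schema for type: <class 'numpy.int64'>", so go through pandas there.
if tuple(int(p) for p in spark.version.split(".")[:2]) >= (3, 4):
    df = spark.createDataFrame(np.arange(10))
else:
    df = spark.createDataFrame(pd.DataFrame(np.arange(10)))
{code}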
 

The two-dimensional case still needs a little more work:
{code:python}
import numpy as np

df = spark.createDataFrame(np.reshape(np.arange(100.), [10, 10]))
# ValueError: Shape of passed values is (10, 10), indices imply (10, 2)
{code}

With pandas, however, we can get that 2-D array into a Spark DataFrame:

{code:python}
import pandas as pd
spark.createDataFrame(pd.DataFrame(np.reshape(np.arange(100.), [10, 10]))).display()
{code}
 

!image-2022-10-31-22-49-37-356.png|width=776,height=233!
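
Another possible workaround, sketched by me rather than taken from the ticket, is to go through plain Python lists; the c0..c9 column names are arbitrary:
{code:python}
import numpy as np

# tolist() turns the 2-D ndarray into a list of row lists, which
# createDataFrame accepts directly; column names are supplied explicitly.
arr = np.reshape(np.arange(100.), [10, 10])
df = spark.createDataFrame(arr.tolist(), schema=[f"c{i}" for i in range(arr.shape[1])])
df.show(3)
{code}
This avoids the pandas dependency, at the cost of materialising the array as Python objects on the driver.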

> Make it easier to convert numpy arrays to Spark Dataframes
> ----------------------------------------------------------
>
>                 Key: SPARK-37697
>                 URL: https://issues.apache.org/jira/browse/SPARK-37697
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>    Affects Versions: 3.1.2
>            Reporter: Douglas Moore
>            Priority: Major
>         Attachments: image-2022-10-31-22-49-37-356.png
>
>
> Make it easier to convert numpy arrays to dataframes.
> Often we receive errors:
>  
> {code:java}
> df = spark.createDataFrame(numpy.arange(10))
> Can not infer schema for type: <class 'numpy.int64'>
> {code}
>  
> OR
> {code:java}
> df = spark.createDataFrame(numpy.arange(10.))
> Can not infer schema for type: <class 'numpy.float64'>
> {code}
>  
> Today (Spark 3.x) we have to:
> {code:java}
> spark.createDataFrame(pd.DataFrame(numpy.arange(10.))) {code}
> Make this easier with a direct conversion from Numpy arrays to Spark Dataframes.


