Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2018/01/10 00:04:00 UTC

[jira] [Assigned] (SPARK-23018) PySpark createDataFrame causes Pandas warning of assignment to a copy of a reference

     [ https://issues.apache.org/jira/browse/SPARK-23018?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-23018:
------------------------------------

    Assignee: Apache Spark

> PySpark createDataFrame causes Pandas warning of assignment to a copy of a reference
> ------------------------------------------------------------------------------------
>
>                 Key: SPARK-23018
>                 URL: https://issues.apache.org/jira/browse/SPARK-23018
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.3.0
>            Reporter: Bryan Cutler
>            Assignee: Apache Spark
>
> When calling {{SparkSession.createDataFrame}} with a Pandas DataFrame as input (with Arrow disabled), a Pandas warning is raised when the input DataFrame is a slice:
> {noformat}
> In [1]: import numpy as np
>    ...: import pandas as pd
>    ...: pdf = pd.DataFrame(np.random.rand(100, 2))
>    ...: 
> In [2]: df = spark.createDataFrame(pdf[:10])
> /home/bryan/git/spark/python/pyspark/sql/session.py:476: SettingWithCopyWarning: 
> A value is trying to be set on a copy of a slice from a DataFrame.
> Try using .loc[row_indexer,col_indexer] = value instead
> See the caveats in the documentation: http://pandas.pydata.org/pandas-docs/stable/indexing.html#indexing-view-versus-copy
>   pdf[column] = s
> {noformat}
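> The same warning can be reproduced with pandas alone, which suggests the trigger is the column assignment {{pdf[column] = s}} on a sliced DataFrame rather than anything Spark-specific. A minimal sketch, independent of Spark:
> {noformat}
> import numpy as np
> import pandas as pd
>
> pdf = pd.DataFrame(np.random.rand(100, 2))
> sliced = pdf[:10]          # slicing returns an object pandas tracks as a potential view of pdf
> sliced[0] = sliced[0] * 2  # column assignment on the slice raises SettingWithCopyWarning
> {noformat}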
> This doesn't seem to cause a bug in this case, but it might for others. The warning could be avoided by assigning the series back only when it is a timestamp field that was actually modified, as sketched below.
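> A minimal sketch of that idea (illustrative helper name, not the actual session.py code): convert tz-aware timestamp columns as before, but only assign a series back when it is a timestamp field that was converted, so a sliced DataFrame with no timestamp columns is never written to:
> {noformat}
> import pandas as pd
>
> def _localize_timestamp_columns(pdf, timezone="UTC"):
>     for column in pdf.columns:
>         series = pdf[column]
>         if pd.api.types.is_datetime64tz_dtype(series):
>             # Only a modified timestamp series is assigned back to the frame.
>             pdf[column] = series.dt.tz_convert(timezone).dt.tz_localize(None)
>     return pdf
> {noformat}
> A slice that does contain tz-aware timestamp columns would still warn; copying the input first ({{pdf = pdf.copy()}}) would silence it in all cases, at the cost of an extra copy.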



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org