Posted to issues@spark.apache.org by "Yihong He (Jira)" <ji...@apache.org> on 2022/03/09 18:14:00 UTC

[jira] [Updated] (SPARK-38484) Move usage logging instrumentation util functions from pandas module to pyspark.util module

     [ https://issues.apache.org/jira/browse/SPARK-38484?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yihong He updated SPARK-38484:
------------------------------
    Description: It would be helpful to attach the usage logger to other modules (e.g. sql) besides pandas, but those modules should not have to depend on the pandas module in order to use the instrumentation utils (e.g. _wrap_function, _wrap_property, ...) that it contains. So we need to move the usage logging instrumentation util functions from the pandas module to the pyspark.util module.  (was: It would be helpful to attach the usage logger to other modules (e.g. sql) besides pandas. In order to instrument other modules, we may need to move the usage logging instrumentation util functions from the pandas module to the pyspark.util module.)

> Move usage logging instrumentation util functions from pandas module to pyspark.util module
> -------------------------------------------------------------------------------------------
>
>                 Key: SPARK-38484
>                 URL: https://issues.apache.org/jira/browse/SPARK-38484
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>    Affects Versions: 3.2.1
>            Reporter: Yihong He
>            Priority: Minor
>
> It would be helpful to attach the usage logger to other modules (e.g. sql) besides pandas, but those modules should not have to depend on the pandas module in order to use the instrumentation utils (e.g. _wrap_function, _wrap_property, ...) that it contains. So we need to move the usage logging instrumentation util functions from the pandas module to the pyspark.util module.
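
A minimal sketch of what such module-agnostic wrappers could look like once hosted in pyspark.util, assuming a usage logger object that exposes success/failure hooks; the names, signatures, and logger interface below are illustrative assumptions, not the actual SPARK-38484 implementation:

import functools
import time

# Hypothetical module-agnostic instrumentation helpers; the real usage logger
# API in PySpark may differ.
def _wrap_function(class_name, function_name, func, logger):
    """Return func wrapped so that each call is reported to the usage logger."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            result = func(*args, **kwargs)
        except Exception as ex:
            # Hypothetical failure hook on the usage logger.
            logger.log_failure(class_name, function_name, ex, time.perf_counter() - start)
            raise
        # Hypothetical success hook on the usage logger.
        logger.log_success(class_name, function_name, time.perf_counter() - start)
        return result
    return wrapper

def _wrap_property(class_name, property_name, prop, logger):
    """Return a property whose getter is instrumented via _wrap_function."""
    return property(_wrap_function(class_name, property_name, prop.fget, logger))

Shaped this way, the helpers carry no import-time dependency on pyspark.pandas, so pyspark.sql (or any other module) could attach the same usage logger.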



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org