Posted to issues@spark.apache.org by "ASF GitHub Bot (JIRA)" <ji...@apache.org> on 2018/12/13 17:12:00 UTC

[jira] [Commented] (SPARK-26364) Clean up import statements in pandas udf tests

    [ https://issues.apache.org/jira/browse/SPARK-26364?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16720407#comment-16720407 ] 

ASF GitHub Bot commented on SPARK-26364:
----------------------------------------

icexelloss opened a new pull request #23314: [SPARK-26364][PYTHON][TESTING] Clean up imports in test_pandas_udf*
URL: https://github.com/apache/spark/pull/23314
 
 
   ## What changes were proposed in this pull request?
   
   Clean up unconditional import statements and move them to the top.
   
   Conditional imports (pandas, numpy, pyarrow) are left as-is.
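   
   For illustration, here is a minimal sketch of the pattern, not the actual Spark test code: names such as `have_pandas`, `have_pyarrow`, and `PandasUDFImportTests` are placeholders. Unconditional imports live at the top of the file, while optional dependencies are probed once and used to skip tests.
   
   ```python
   import unittest
   
   # Unconditional imports (standard library, always available) stay at the top.
   from decimal import Decimal  # noqa: F401
   
   # Conditional imports: optional dependencies are probed once; tests that
   # need them are skipped when they are missing.
   try:
       import pandas  # noqa: F401
       have_pandas = True
   except ImportError:
       have_pandas = False
   
   try:
       import pyarrow  # noqa: F401
       have_pyarrow = True
   except ImportError:
       have_pyarrow = False
   
   
   @unittest.skipIf(not have_pandas or not have_pyarrow,
                    "pandas and pyarrow are required for pandas UDF tests")
   class PandasUDFImportTests(unittest.TestCase):
       def test_optional_deps_present(self):
           # Placeholder test; the real test_pandas_udf* suites exercise
           # pandas_udf behavior here.
           self.assertTrue(have_pandas and have_pyarrow)
   
   
   if __name__ == "__main__":
       unittest.main()
   ```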
   
   ## How was this patch tested?
   
   Existing tests.
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


> Clean up import statements in pandas udf tests
> ----------------------------------------------
>
>                 Key: SPARK-26364
>                 URL: https://issues.apache.org/jira/browse/SPARK-26364
>             Project: Spark
>          Issue Type: Improvement
>          Components: Tests
>    Affects Versions: 2.4.0
>            Reporter: Li Jin
>            Priority: Minor
>
> Per the discussion in [https://github.com/apache/spark/pull/22305/files#r241215618], we should clean up the import statements in test_pandas_udf* and move them to the top.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org