Posted to issues@spark.apache.org by "Maciej Szymkiewicz (JIRA)" <ji...@apache.org> on 2019/07/18 11:43:00 UTC

[jira] [Updated] (SPARK-28439) pyspark.sql.functions.array_repeat should support Column as count argument

     [ https://issues.apache.org/jira/browse/SPARK-28439?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Maciej Szymkiewicz updated SPARK-28439:
---------------------------------------
    Description: 
In Scala, Spark supports the
{code:java}
(Column, Column) => Column
{code}
variant of array_repeat; however, PySpark doesn't:

{code:java}
>>> import pyspark
>>> from pyspark.sql import functions as f
>>> pyspark.__version__
'3.0.0.dev0'
>>> f.array_repeat(f.col("foo"), f.col("bar"))
Traceback (most recent call last):
...
TypeError: Column is not iterable
{code}

  was:
In Scala, Spark supports the
{code:java}
(Column, Column) => Column
{code}
variant of array_repeat; however, PySpark doesn't:

{code:java}
>>> import pyspark
>>> from pyspark.sql import functions as f
>>> pyspark.__version__
'3.0.0.dev0'
>>> f.array_repeat(f.col("foo"), f.col("bar"))
Traceback (most recent call last):
...
TypeError: Column is not iterable
{code}


> pyspark.sql.functions.array_repeat should support Column as count argument
> --------------------------------------------------------------------------
>
>                 Key: SPARK-28439
>                 URL: https://issues.apache.org/jira/browse/SPARK-28439
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark, SQL
>    Affects Versions: 2.4.0, 3.0.0
>            Reporter: Maciej Szymkiewicz
>            Priority: Minor
>
> In Scala, Spark supports the
> {code:java}
> (Column, Column) => Column
> {code}
> variant of array_repeat; however, PySpark doesn't:
> {code:java}
> >>> import pyspark
> >>> from pyspark.sql import functions as f
> >>> pyspark.__version__
> '3.0.0.dev0'
> >>> f.array_repeat(f.col("foo"), f.col("bar"))
> Traceback (most recent call last):
> ...
> TypeError: Column is not iterable
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org