Posted to user@spark.apache.org by issues solution <is...@gmail.com> on 2017/05/04 07:55:22 UTC

Create multiple columns in PySpark in one shot

Hi,

How can we create multiple columns iteratively? I mean, how can we create
empty (zero-filled) columns inside a loop? With

 for i in listl:
     df = df.withColumn(i, F.lit(0))

we get a StackOverflowError.

How can we do the same thing with a list of columns, something like:

 df.select([F.col(i).lit(0) for i in df.columns])

I know that last line is not correct, but it shows the kind of result I expect.

Thanks in advance.
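
For reference, a minimal PySpark sketch of the one-shot approach being asked
about, assuming listl holds the (hypothetical) names of the new zero-filled
columns:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([(1, "x"), (2, "y")], ["id", "value"])

# Hypothetical list of new column names, each to be filled with 0
listl = ["a", "b", "c"]

# Build all the literal columns first and add them in a single select,
# instead of growing the query plan with one withColumn call per column
df = df.select("*", *[F.lit(0).alias(name) for name in listl])
df.show()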

Re: Create multiple columns in PySpark in one shot

Posted by Rick Moritz <ra...@gmail.com>.
In Scala you can first define your columns, and then use the
list-to-vararg-expander :_*  in a select call, something like this:

import org.apache.spark.sql.functions.{col, lit}

// Build the zero-valued columns up front, then expand the list into select
val cols = colnames.map(col).map(column => lit(0).as(column.toString))
dF.select(cols: _*)

I assume something similar should be possible in Java as well; from your
snippet it's unclear which programming language you're actually using.
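
Since the original snippet looks like PySpark, a rough Python equivalent of
the same pattern, assuming colnames is the list of column names as in the
Scala sketch, could be:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([(1, "x"), (2, "y")], ["id", "value"])
colnames = df.columns

# Replace every column with a zero literal of the same name in one select;
# Python's * unpacking plays the role of Scala's cols: _*
cols = [F.lit(0).alias(name) for name in colnames]
df.select(*cols).show()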

