Posted to user@spark.apache.org by "Devi P.V" <de...@gmail.com> on 2020/09/04 07:11:22 UTC

Iterating all columns in a pyspark dataframe

Hi all,
What is the best approach for iterating over all columns in a PySpark
dataframe? I want to apply some conditions to every column in the dataframe.
Currently I am using a for loop for the iteration. Is that good practice
with Spark? I am using Spark 3.0.
Please advise.

Thanks,
Devi

Re: Iterating all columns in a pyspark dataframe

Posted by Sean Owen <sr...@gmail.com>.
Do you need to iterate at all? You can always write a function over
all columns, df.columns. You can operate on a whole Row at a time too.

On Fri, Sep 4, 2020 at 2:11 AM Devi P.V <de...@gmail.com> wrote:
>
> Hi all,
> What is the best approach for iterating over all columns in a pyspark dataframe? I want to apply some conditions to all columns in the dataframe. Currently I am using a for loop for iteration. Is it good practice while using Spark? I am using Spark 3.0.
> Please advise.
>
> Thanks,
> Devi
>

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org