Posted to issues@spark.apache.org by "Joyce Arruda Recacho (Jira)" <ji...@apache.org> on 2022/02/14 15:24:00 UTC

[jira] [Created] (SPARK-38208) 'Column' object is not callable

Joyce Arruda Recacho created SPARK-38208:
--------------------------------------------

             Summary: 'Column' object is not callable
                 Key: SPARK-38208
                 URL: https://issues.apache.org/jira/browse/SPARK-38208
             Project: Spark
          Issue Type: Bug
          Components: Deploy
    Affects Versions: 3.2.1
            Reporter: Joyce Arruda Recacho


Hi guys, I have a simple dataframe and I am trying to create a new column.

This is its schema:

 

>>>> df_operation_event_sellers.schema 

Out[69]: StructType(List(StructField(id,StringType,true),StructField(account_id,StringType,true),StructField(p_tenant_id,StringType,true),StructField(vendor_id,StringType,true),StructField(amount,DecimalType(38,18),true),StructField(operation_type,StringType,true),StructField(reference_id,StringType,true),StructField(date,TimestampType,true),StructField(carrier_id,StringType,true),StructField(account_number,StringType,true),StructField(data_source,StringType,true),StructField(entity,StringType,true),StructField(ingestion_date,DateType,true),StructField(event_type,StringType,false),StructField(amount_new,DecimalType(38,18),true),StructField(date_new,IntegerType,true),StructField(row_num,IntegerType,true)))

 

>>> Command used to create the new column:

df_operation_event_sellers= df_operation_event_sellers.withColumn('flag_first_selling',when(col('row_num') == 1,'YES').instead('NO'))



ISSUE >>>>>>>>>>>> TypeError: 'Column' object is not callable

 

What is happening?

P.S. I have created other columns the same way successfully.

--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org