Posted to issues@spark.apache.org by "Narendra (JIRA)" <ji...@apache.org> on 2018/02/01 01:18:00 UTC

[jira] [Created] (SPARK-23291) SparkR : substr : In SparkR dataframe , starting and ending position arguments in "substr" is giving wrong result when the position is greater than 1

Narendra created SPARK-23291:
--------------------------------

             Summary: SparkR : substr : In SparkR dataframe , starting and ending position arguments in "substr" is giving wrong result  when the position is greater than 1
                 Key: SPARK-23291
                 URL: https://issues.apache.org/jira/browse/SPARK-23291
             Project: Spark
          Issue Type: Bug
          Components: SparkR
    Affects Versions: 2.2.1
            Reporter: Narendra


Defect Description :

-----------------------------

For example, an input string "2017-12-01" is read into a SparkR DataFrame "df" with column name "col1".
 The target is to create a new column named "col2" holding the value "12", which is inside the string. "12" can be extracted with starting position "6" and ending position "7"
 (the starting position of the first character is considered to be "1").

But the code that currently has to be written is:
 
 df <- withColumn(df, "col2", substr(df$col1, 7, 8))

Observe that the first argument to the "substr" API, which indicates the starting position, has to be given as "7".
 Also observe that the second argument, which indicates the ending position, has to be given as "8".

i.e. the number that must be passed to indicate a position is the actual position + 1.

Expected behavior :

----------------------------

The code that should need to be written is:
 
 df <- withColumn(df, "col2", substr(df$col1, 6, 7))


Note :

-----------
 This defect is observed only when the starting position is greater than 1.
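The off-by-one above can be reproduced, and worked around, in a short SparkR session. This is a sketch assuming a local Spark 2.2.1 installation; the DataFrame and column names match the example in the report. The workaround via the SQL "substring(str, pos, len)" expression is an assumption: it takes a length rather than an ending position and goes through the SQL parser, so it does not appear to be affected by the SparkR-side shift.

```r
library(SparkR)
sparkR.session()  # assumes a local Spark 2.2.1 installation

df <- createDataFrame(data.frame(col1 = "2017-12-01", stringsAsFactors = FALSE))

# Affected API: to extract "12" (positions 6-7), the arguments must be
# shifted by +1, as described in the report above
df <- withColumn(df, "col2", substr(df$col1, 7, 8))

# Assumed workaround: the Spark SQL substring(str, pos, len) expression,
# which uses the true 1-based position and a length of 2
df <- withColumn(df, "col3", expr("substring(col1, 6, 2)"))

head(df)
```

Both "col2" and "col3" should show "12" on an affected build; once the defect is fixed, substr(df$col1, 6, 7) is the call that should produce it.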



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org