Posted to reviews@spark.apache.org by viirya <gi...@git.apache.org> on 2018/05/07 08:39:47 UTC

[GitHub] spark pull request #21249: [SPARK-23291][R][FOLLOWUP] Update SparkR migratio...

Github user viirya commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21249#discussion_r186361818
  
    --- Diff: docs/sparkr.md ---
    @@ -664,6 +664,6 @@ You can inspect the search path in R with [`search()`](https://stat.ethz.ch/R-ma
  - For `summary`, an option for the statistics to compute has been added. Its output is changed from that of `describe`.
      - A warning can be raised if versions of SparkR package and the Spark JVM do not match.
     
    -## Upgrading to Spark 2.4.0
    +## Upgrading to SparkR 2.3.1 and above
     
    - - The `start` parameter of `substr` method was wrongly subtracted by one, previously. In other words, the index specified by `start` parameter was considered as 0-base. This can lead to inconsistent substring results and also does not match with the behaviour with `substr` in R. It has been fixed so the `start` parameter of `substr` method is now 1-base, e.g., therefore to get the same result as `substr(df$a, 2, 5)`, it should be changed to `substr(df$a, 1, 4)`.
     + - In SparkR 2.3.0 and earlier, the `start` parameter of the `substr` method was wrongly subtracted by one. In other words, the index specified by the `start` parameter was treated as 0-based. This could lead to inconsistent substring results and also did not match the behaviour of `substr` in R. In version 2.3.1 and later, this has been fixed so the `start` parameter of the `substr` method is now 1-based. As an example, `substr(lit('abcdef'), 2, 4)` would result in `abc` in SparkR 2.3.0, and the result would be `bcd` in SparkR 2.3.1.
    --- End diff --
    
    nit: ```the result would be `bcd` in SparkR 2.3.1 and above.```
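
    To make the migration note concrete, here is a minimal sketch of the new 1-based behaviour (the session setup and the data frame/column names are illustrative, not taken from the pull request):

    ```R
    library(SparkR)
    sparkR.session()

    # Illustrative single-column data frame.
    df <- createDataFrame(data.frame(a = "abcdef", stringsAsFactors = FALSE))

    # `start` is now 1-based, matching base R's substr():
    # characters 2 through 4 of "abcdef" are "bcd".
    head(select(df, substr(df$a, 2, 4)))

    # In SparkR 2.3.0 and earlier the same call returned "abc",
    # because `start` was wrongly shifted down by one (0-based).
    ```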


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org