Posted to commits@sedona.apache.org by ji...@apache.org on 2022/01/12 03:12:18 UTC

[incubator-sedona] branch master updated: [DOCS] Fix typo in Raster & Databricks doc (#571)

This is an automated email from the ASF dual-hosted git repository.

jiayu pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-sedona.git


The following commit(s) were added to refs/heads/master by this push:
     new 7225629  [DOCS] Fix typo in Raster & Databricks doc (#571)
7225629 is described below

commit 7225629b3721615c3c8818753ae1eb148a431695
Author: Karthick Narendran <ka...@gmail.com>
AuthorDate: Wed Jan 12 01:52:03 2022 +0000

    [DOCS] Fix typo in Raster & Databricks doc (#571)
---
 docs/api/sql/Raster-operators.md | 2 +-
 docs/setup/databricks.md         | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/api/sql/Raster-operators.md b/docs/api/sql/Raster-operators.md
index ccb5677..6c747b3 100644
--- a/docs/api/sql/Raster-operators.md
+++ b/docs/api/sql/Raster-operators.md
@@ -24,7 +24,7 @@ Since: `v1.1.0`
 Spark SQL example:
 ```Scala
 
-val subtractDF = spark.sql("select RS_Subtract(band1, band2) as differenceOfOfBands from dataframe")
+val subtractDF = spark.sql("select RS_SubtractBands(band1, band2) as differenceOfOfBands from dataframe")
 
 ```
 
diff --git a/docs/setup/databricks.md b/docs/setup/databricks.md
index a6960f1..4adfb18 100644
--- a/docs/setup/databricks.md
+++ b/docs/setup/databricks.md
@@ -105,7 +105,7 @@ spark.kryo.registrator org.apache.sedona.core.serde.SedonaKryoRegistrator
 
 From your cluster configuration (`Cluster` -> `Edit` -> `Configuration` -> `Advanced options` -> `Init Scripts`) add the newly created init script 
 ```
-/dbfs/FileStore/sedona/sedona-init.sh
+dbfs:/FileStore/sedona/sedona-init.sh
 ```
 
 *Note: You need to install the sedona libraries via init script because the libraries installed via UI are installed after the cluster has already started, and therefore the classes specified by the config `spark.sql.extensions`, `spark.serializer`, and `spark.kryo.registrator` are not available at startup time.*
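For context on the path fix above (`dbfs:/FileStore/sedona/sedona-init.sh`): a cluster-scoped init script of this kind runs before Spark starts and typically copies pre-staged jars into the directory Databricks scans at startup. The snippet below is a hypothetical sketch only; the DBFS staging path and the jar wildcard are illustrative assumptions, not contents of this commit:

```shell
#!/bin/bash
# Hypothetical Databricks init script sketch (paths are assumptions).
# DBFS is mounted at /dbfs on cluster nodes, so files staged under
# dbfs:/FileStore/sedona/ appear locally at /dbfs/FileStore/sedona/.
# Copy the staged Sedona jars into the directory Databricks loads jars
# from at cluster startup, so spark.sql.extensions / spark.serializer /
# spark.kryo.registrator can resolve their classes immediately.
cp /dbfs/FileStore/sedona/*.jar /databricks/jars/
```

The script itself is then referenced in the cluster UI by its `dbfs:/...` URI, as the corrected line in the diff shows, while the paths inside the script use the local `/dbfs/...` mount.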