Posted to commits@phoenix.apache.org by an...@apache.org on 2018/09/20 17:52:32 UTC
svn commit: r1841484 - in /phoenix/site: publish/phoenix_spark.html
source/src/site/markdown/phoenix_spark.md
Author: ankit
Date: Thu Sep 20 17:52:31 2018
New Revision: 1841484
URL: http://svn.apache.org/viewvc?rev=1841484&view=rev
Log:
update save api when using spark dataframe (Sandeep Nemuri)
Modified:
phoenix/site/publish/phoenix_spark.html
phoenix/site/source/src/site/markdown/phoenix_spark.md
Modified: phoenix/site/publish/phoenix_spark.html
URL: http://svn.apache.org/viewvc/phoenix/site/publish/phoenix_spark.html?rev=1841484&r1=1841483&r2=1841484&view=diff
==============================================================================
--- phoenix/site/publish/phoenix_spark.html (original)
+++ phoenix/site/publish/phoenix_spark.html Thu Sep 20 17:52:31 2018
@@ -1,7 +1,7 @@
<!DOCTYPE html>
<!--
- Generated by Apache Maven Doxia at 2018-06-10
+ Generated by Apache Maven Doxia at 2018-09-20
Rendered using Reflow Maven Skin 1.1.0 (http://andriusvelykis.github.io/reflow-maven-skin)
-->
<html xml:lang="en" lang="en">
@@ -324,8 +324,16 @@ val df = sqlContext.load("org.apach
"zkUrl" -> hbaseConnectionString))
// Save to OUTPUT_TABLE
-df.save("org.apache.phoenix.spark", SaveMode.Overwrite, Map("table" -> "OUTPUT_TABLE",
- "zkUrl" -> hbaseConnectionString))
+df.saveToPhoenix(Map("table" -> "OUTPUT_TABLE", "zkUrl" -> hbaseConnectionString))
+
+or
+
+df.write \
+ .format("org.apache.phoenix.spark") \
+ .mode("overwrite") \
+ .option("table", "OUTPUT_TABLE") \
+ .option("zkUrl", "localhost:2181") \
+ .save()
</pre>
</div>
</div>
Modified: phoenix/site/source/src/site/markdown/phoenix_spark.md
URL: http://svn.apache.org/viewvc/phoenix/site/source/src/site/markdown/phoenix_spark.md?rev=1841484&r1=1841483&r2=1841484&view=diff
==============================================================================
--- phoenix/site/source/src/site/markdown/phoenix_spark.md (original)
+++ phoenix/site/source/src/site/markdown/phoenix_spark.md Thu Sep 20 17:52:31 2018
@@ -169,8 +169,16 @@ val df = sqlContext.load("org.apache.pho
"zkUrl" -> hbaseConnectionString))
// Save to OUTPUT_TABLE
-df.save("org.apache.phoenix.spark", SaveMode.Overwrite, Map("table" -> "OUTPUT_TABLE",
- "zkUrl" -> hbaseConnectionString))
+df.saveToPhoenix(Map("table" -> "OUTPUT_TABLE", "zkUrl" -> hbaseConnectionString))
+
+or
+
+df.write \
+ .format("org.apache.phoenix.spark") \
+ .mode("overwrite") \
+ .option("table", "OUTPUT_TABLE") \
+ .option("zkUrl", "localhost:2181") \
+ .save()
```
### PySpark
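
Both save paths added by this commit take the same two options, `table` and `zkUrl`. A minimal plain-Scala sketch of assembling that shared option map (no Spark session required); `PhoenixSaveOptions` is a hypothetical helper for illustration, not part of the phoenix-spark API:

```scala
// Hypothetical helper (not part of the phoenix-spark API) that builds the
// option map shared by both save styles shown in the diff:
//   df.saveToPhoenix(Map("table" -> ..., "zkUrl" -> ...))
//   df.write.format("org.apache.phoenix.spark")
//     .option("table", ...).option("zkUrl", ...).save()
object PhoenixSaveOptions {
  // "table" and "zkUrl" are the option keys used in the committed examples.
  def apply(table: String, zkUrl: String): Map[String, String] =
    Map("table" -> table, "zkUrl" -> zkUrl)

  def main(args: Array[String]): Unit = {
    val opts = PhoenixSaveOptions("OUTPUT_TABLE", "localhost:2181")
    println(opts("table"))
    println(opts("zkUrl"))
  }
}
```

With either API, the connector receives the same table name and ZooKeeper quorum; the `df.write` form simply passes them one `option(...)` call at a time instead of as a single `Map`.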