Posted to commits@spark.apache.org by gu...@apache.org on 2023/03/15 12:04:58 UTC

[spark] branch branch-3.4 updated: [SPARK-42496][CONNECT][DOCS][FOLLOW-UP] Addressing feedback to remove last ">>>" and adding type(spark) example

This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch branch-3.4
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.4 by this push:
     new 0f71caa186a [SPARK-42496][CONNECT][DOCS][FOLLOW-UP] Addressing feedback to remove last ">>>" and adding type(spark) example
0f71caa186a is described below

commit 0f71caa186abb86e1e7ae7c2c9de3e6bed841966
Author: Allan Folting <al...@databricks.com>
AuthorDate: Wed Mar 15 21:04:29 2023 +0900

    [SPARK-42496][CONNECT][DOCS][FOLLOW-UP] Addressing feedback to remove last ">>>" and adding type(spark) example
    
    ### What changes were proposed in this pull request?
    This removes the last ">>>" prompt in a Python code example, based on review feedback, and adds `type(spark)` as an example of checking whether a session uses Spark Connect.
    
    ### Why are the changes needed?
    To help readers determine whether a session uses Spark Connect, and to remove an unnecessary extra line for cleaner reading.
    
    ### Does this PR introduce _any_ user-facing change?
    Yes, it updates user-facing documentation.
    
    ### How was this patch tested?
    Built the doc website locally and checked the pages.
    PRODUCTION=1 SKIP_RDOC=1 bundle exec jekyll build
    
    Closes #40435 from allanf-db/connect_docs.
    
    Authored-by: Allan Folting <al...@databricks.com>
    Signed-off-by: Hyukjin Kwon <gu...@apache.org>
    (cherry picked from commit 8860f69455e5a722626194c4797b4b42cccd4510)
    Signed-off-by: Hyukjin Kwon <gu...@apache.org>
---
 docs/spark-connect-overview.md | 26 ++++++++++++++++++++------
 1 file changed, 20 insertions(+), 6 deletions(-)

diff --git a/docs/spark-connect-overview.md b/docs/spark-connect-overview.md
index f942a884873..55cc825a148 100644
--- a/docs/spark-connect-overview.md
+++ b/docs/spark-connect-overview.md
@@ -145,8 +145,11 @@ And start the Spark shell as usual:
 ./bin/pyspark
 {% endhighlight %}
 
-The PySpark shell is now connected to Spark using Spark Connect as indicated in the welcome
-message.
+The PySpark shell is now connected to Spark using Spark Connect as indicated in the welcome message:
+
+{% highlight python %}
+Client connected to the Spark Connect server at localhost
+{% endhighlight %}
 </div>
 
 </div>
@@ -180,14 +183,27 @@ illustrated here.
 <div data-lang="python"  markdown="1">
 To launch the PySpark shell with Spark Connect, simply include the `remote`
 parameter and specify the location of your Spark server. We are using `localhost`
-in this example to connect to the local Spark server we started previously.
+in this example to connect to the local Spark server we started previously:
 
 {% highlight bash %}
 ./bin/pyspark --remote "sc://localhost"
 {% endhighlight %}
 
 And you will notice that the PySpark shell welcome message tells you that
-you have connected to Spark using Spark Connect.
+you have connected to Spark using Spark Connect:
+
+{% highlight python %}
+Client connected to the Spark Connect server at localhost
+{% endhighlight %}
+
+You can also check the Spark session type. If it includes `.connect.` you
+are using Spark Connect as shown in this example:
+
+{% highlight python %}
+SparkSession available as 'spark'.
+>>> type(spark)
+<class 'pyspark.sql.connect.session.SparkSession'>
+{% endhighlight %}
 
 Now you can run PySpark code in the shell to see Spark Connect in action:
 
@@ -202,8 +218,6 @@ Now you can run PySpark code in the shell to see Spark Connect in action:
 |  1|Sarah|
 |  2|Maria|
 +---+-----+
-
->>>
 {% endhighlight %}
 </div>
 

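The docs change above suggests inspecting `type(spark)` and looking for `.connect.` in the class path. As a minimal sketch of that same check in script form (the helper name `is_spark_connect_session` is hypothetical, not part of the patch; it only assumes the class naming shown in the diff, e.g. `pyspark.sql.connect.session.SparkSession`):

```python
def is_spark_connect_session(session) -> bool:
    """Heuristic from the docs: a Spark Connect session's class lives
    under a ``.connect.`` module path (e.g.
    ``pyspark.sql.connect.session.SparkSession``)."""
    cls = type(session)
    # Build the fully qualified class name and look for ".connect." in it.
    return ".connect." in f"{cls.__module__}.{cls.__qualname__}"
```

In a PySpark shell this amounts to the interactive check shown in the patch: `type(spark)` printing `<class 'pyspark.sql.connect.session.SparkSession'>` indicates a Spark Connect session.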
