Posted to commits@spark.apache.org by pw...@apache.org on 2014/09/08 05:56:16 UTC

git commit: [SPARK-938][doc] Add OpenStack Swift support

Repository: spark
Updated Branches:
  refs/heads/master f25bbbdb3 -> eddfeddac


[SPARK-938][doc] Add OpenStack Swift support

See compiled doc at
http://people.apache.org/~rxin/tmp/openstack-swift/_site/storage-openstack-swift.html

This is based on #1010. Closes #1010.

Author: Reynold Xin <rx...@apache.org>
Author: Gil Vernik <gi...@il.ibm.com>

Closes #2298 from rxin/openstack-swift and squashes the following commits:

ff4e394 [Reynold Xin] Two minor comments from Patrick.
279f6de [Reynold Xin] core-sites -> core-site
dfb8fea [Reynold Xin] Updated based on Gil's suggestion.
846f5cb [Reynold Xin] Added a link from overview page.
0447c9f [Reynold Xin] Removed sample code.
e9c3761 [Reynold Xin] Merge pull request #1010 from gilv/master
9233fef [Gil Vernik] Fixed typos
6994827 [Gil Vernik] Merge pull request #1 from rxin/openstack
ac0679e [Reynold Xin] Fixed an unclosed tr.
47ce99d [Reynold Xin] Merge branch 'master' into openstack
cca7192 [Gil Vernik] Removed white spaces from pom.xml
99f095d [Reynold Xin] Pending openstack changes.
eb22295 [Reynold Xin] Merge pull request #1010 from gilv/master
39a9737 [Gil Vernik] Spark integration with Openstack Swift
c977658 [Gil Vernik] Merge branch 'master' of https://github.com/gilv/spark
2aba763 [Gil Vernik] Fix to docs/openstack-integration.md
9b625b5 [Gil Vernik] Merge branch 'master' of https://github.com/gilv/spark
eff538d [Gil Vernik] SPARK-938 - Openstack Swift object storage support
ce483d7 [Gil Vernik] SPARK-938 - Openstack Swift object storage support
b6c37ef [Gil Vernik] Openstack Swift support


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/eddfedda
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/eddfedda
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/eddfedda

Branch: refs/heads/master
Commit: eddfeddac19870fc265ef406d87e1c3db9b54249
Parents: f25bbbd
Author: Reynold Xin <rx...@apache.org>
Authored: Sun Sep 7 20:56:04 2014 -0700
Committer: Patrick Wendell <pw...@gmail.com>
Committed: Sun Sep 7 20:56:04 2014 -0700

----------------------------------------------------------------------
 docs/index.md                   |   2 +
 docs/storage-openstack-swift.md | 152 +++++++++++++++++++++++++++++++++++
 2 files changed, 154 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/eddfedda/docs/index.md
----------------------------------------------------------------------
diff --git a/docs/index.md b/docs/index.md
index 4ac0982..7fe6b43 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -103,6 +103,8 @@ options for deployment:
 * [Security](security.html): Spark security support
 * [Hardware Provisioning](hardware-provisioning.html): recommendations for cluster hardware
 * [3<sup>rd</sup> Party Hadoop Distributions](hadoop-third-party-distributions.html): using common Hadoop distributions
+* Integration with other storage systems:
+  * [OpenStack Swift](storage-openstack-swift.html)
 * [Building Spark with Maven](building-with-maven.html): build Spark using the Maven system
 * [Contributing to Spark](https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark)
 

http://git-wip-us.apache.org/repos/asf/spark/blob/eddfedda/docs/storage-openstack-swift.md
----------------------------------------------------------------------
diff --git a/docs/storage-openstack-swift.md b/docs/storage-openstack-swift.md
new file mode 100644
index 0000000..c39ef1c
--- /dev/null
+++ b/docs/storage-openstack-swift.md
@@ -0,0 +1,152 @@
+---
+layout: global
+title: Accessing OpenStack Swift from Spark
+---
+
+Spark's support for Hadoop InputFormat allows it to process data in OpenStack Swift using the
+same URI formats as in Hadoop. You can specify a path in Swift as input through a 
+URI of the form <code>swift://container.PROVIDER/path</code>. You will also need to set your 
+Swift security credentials, through <code>core-site.xml</code> or via
+<code>SparkContext.hadoopConfiguration</code>.
+The current Swift driver requires Swift to use the Keystone authentication method.
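+
+For example, once the credentials are configured (see below), a minimal
+<code>spark-shell</code> sketch could look like the following. The container name
+<code>logs</code>, the provider name <code>SparkTest</code>, and the object name
+<code>data.txt</code> are illustrative assumptions only:
+
+{% highlight scala %}
+// Hypothetical spark-shell sketch: assumes a container named "logs" under the
+// provider "SparkTest", with Keystone credentials already set in core-site.xml.
+val lines = sc.textFile("swift://logs.SparkTest/data.txt")
+
+// Count the lines of the Swift object, forcing the data to actually be read.
+lines.count()
+{% endhighlight %}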
+
+# Configuring Swift for Better Data Locality
+
+Although not mandatory, it is recommended to configure Swift's proxy server with
+<code>list_endpoints</code> to achieve better data locality. More information is
+[available here](https://github.com/openstack/swift/blob/master/swift/common/middleware/list_endpoints.py).
+
+
+# Dependencies
+
+The Spark application should include the <code>hadoop-openstack</code> dependency.
+For example, for Maven support, add the following to the <code>pom.xml</code> file:
+
+{% highlight xml %}
+<dependencies>
+  ...
+  <dependency>
+    <groupId>org.apache.hadoop</groupId>
+    <artifactId>hadoop-openstack</artifactId>
+    <version>2.3.0</version>
+  </dependency>
+  ...
+</dependencies>
+{% endhighlight %}
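+
+If the application is built with sbt rather than Maven, a roughly equivalent declaration
+(an illustrative sketch, not taken from this guide) would be:
+
+{% highlight scala %}
+// build.sbt -- illustrative only; match the hadoop-openstack version to your Hadoop version.
+libraryDependencies += "org.apache.hadoop" % "hadoop-openstack" % "2.3.0"
+{% endhighlight %}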
+
+
+# Configuration Parameters
+
+Create <code>core-site.xml</code> and place it inside Spark's <code>conf</code> directory.
+There are two main categories of parameters that should be configured: the declaration of the
+Swift driver, and the parameters required by Keystone.
+
+Configuration of Hadoop to use the Swift file system is achieved via the following property:
+
+<table class="table">
+<tr><th>Property Name</th><th>Value</th></tr>
+<tr>
+  <td><code>fs.swift.impl</code></td>
+  <td><code>org.apache.hadoop.fs.swift.snative.SwiftNativeFileSystem</code></td>
+</tr>
+</table>
+
+Additional parameters are required by Keystone (v2.0) and should be provided to the Swift driver. These
+parameters are used to authenticate with Keystone in order to access Swift. The following table
+lists the mandatory Keystone parameters. <code>PROVIDER</code> can be any name.
+
+<table class="table">
+<tr><th>Property Name</th><th>Meaning</th><th>Required</th></tr>
+<tr>
+  <td><code>fs.swift.service.PROVIDER.auth.url</code></td>
+  <td>Keystone Authentication URL</td>
+  <td>Mandatory</td>
+</tr>
+<tr>
+  <td><code>fs.swift.service.PROVIDER.auth.endpoint.prefix</code></td>
+  <td>Keystone endpoints prefix</td>
+  <td>Optional</td>
+</tr>
+<tr>
+  <td><code>fs.swift.service.PROVIDER.tenant</code></td>
+  <td>Tenant</td>
+  <td>Mandatory</td>
+</tr>
+<tr>
+  <td><code>fs.swift.service.PROVIDER.username</code></td>
+  <td>Username</td>
+  <td>Mandatory</td>
+</tr>
+<tr>
+  <td><code>fs.swift.service.PROVIDER.password</code></td>
+  <td>Password</td>
+  <td>Mandatory</td>
+</tr>
+<tr>
+  <td><code>fs.swift.service.PROVIDER.http.port</code></td>
+  <td>HTTP port</td>
+  <td>Mandatory</td>
+</tr>
+<tr>
+  <td><code>fs.swift.service.PROVIDER.region</code></td>
+  <td>Keystone region</td>
+  <td>Mandatory</td>
+</tr>
+<tr>
+  <td><code>fs.swift.service.PROVIDER.public</code></td>
+  <td>Indicates if all URLs are public</td>
+  <td>Mandatory</td>
+</tr>
+</table>
+
+For example, assume <code>PROVIDER=SparkTest</code> and that Keystone contains a user <code>tester</code> with password <code>testing</code>
+defined for the tenant <code>test</code>. Then <code>core-site.xml</code> should include:
+
+{% highlight xml %}
+<configuration>
+  <property>
+    <name>fs.swift.impl</name>
+    <value>org.apache.hadoop.fs.swift.snative.SwiftNativeFileSystem</value>
+  </property>
+  <property>
+    <name>fs.swift.service.SparkTest.auth.url</name>
+    <value>http://127.0.0.1:5000/v2.0/tokens</value>
+  </property>
+  <property>
+    <name>fs.swift.service.SparkTest.auth.endpoint.prefix</name>
+    <value>endpoints</value>
+  </property>
+  <property>
+    <name>fs.swift.service.SparkTest.http.port</name>
+    <value>8080</value>
+  </property>
+  <property>
+    <name>fs.swift.service.SparkTest.region</name>
+    <value>RegionOne</value>
+  </property>
+  <property>
+    <name>fs.swift.service.SparkTest.public</name>
+    <value>true</value>
+  </property>
+  <property>
+    <name>fs.swift.service.SparkTest.tenant</name>
+    <value>test</value>
+  </property>
+  <property>
+    <name>fs.swift.service.SparkTest.username</name>
+    <value>tester</value>
+  </property>
+  <property>
+    <name>fs.swift.service.SparkTest.password</name>
+    <value>testing</value>
+  </property>
+</configuration>
+{% endhighlight %}
+
+Notice that
+<code>fs.swift.service.PROVIDER.tenant</code>,
+<code>fs.swift.service.PROVIDER.username</code>, and
+<code>fs.swift.service.PROVIDER.password</code> contain sensitive information, so keeping them in
+<code>core-site.xml</code> is not always a good approach.
+We suggest keeping these parameters in <code>core-site.xml</code> only for testing purposes, when running Spark
+via <code>spark-shell</code>.
+For job submissions they should be provided via <code>SparkContext.hadoopConfiguration</code>, as sketched below.
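+
+For example, a hypothetical sketch of supplying the same <code>SparkTest</code> parameters
+programmatically, where <code>sc</code> is an existing <code>SparkContext</code> (for instance
+in <code>spark-shell</code>) and the values are the placeholders from the example above:
+
+{% highlight scala %}
+// Hypothetical sketch: set the Swift/Keystone parameters on the Hadoop configuration
+// used by this SparkContext instead of keeping them in core-site.xml.
+val hadoopConf = sc.hadoopConfiguration
+hadoopConf.set("fs.swift.impl",
+  "org.apache.hadoop.fs.swift.snative.SwiftNativeFileSystem")
+hadoopConf.set("fs.swift.service.SparkTest.auth.url", "http://127.0.0.1:5000/v2.0/tokens")
+hadoopConf.set("fs.swift.service.SparkTest.auth.endpoint.prefix", "endpoints")
+hadoopConf.set("fs.swift.service.SparkTest.http.port", "8080")
+hadoopConf.set("fs.swift.service.SparkTest.region", "RegionOne")
+hadoopConf.set("fs.swift.service.SparkTest.public", "true")
+hadoopConf.set("fs.swift.service.SparkTest.tenant", "test")
+hadoopConf.set("fs.swift.service.SparkTest.username", "tester")
+hadoopConf.set("fs.swift.service.SparkTest.password", "testing")
+
+// After this, swift:// paths under the SparkTest provider can be read as usual, e.g.:
+// sc.textFile("swift://logs.SparkTest/data.txt")
+{% endhighlight %}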


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org