Posted to user@spark.apache.org by Nirav Patel <np...@xactlycorp.com> on 2016/01/29 04:36:44 UTC

Spark 1.5.2 - Programmatically launching spark on yarn-client mode

Hi, we were using Spark 1.3.1 and launching our Spark jobs in yarn-client mode
programmatically by creating SparkConf and SparkContext objects manually. This
was inspired by the Spark self-contained application example here:

https://spark.apache.org/docs/1.5.2/quick-start.html#self-contained-applications

The only additional configuration we provided was YARN-related: executor
instances, cores, memory, extraJavaOptions, etc.
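
For reference, a minimal sketch of roughly how we build the context; the app
name and the concrete values below are illustrative, not our exact code:

import org.apache.spark.{SparkConf, SparkContext}

object YarnClientLauncher {
  def main(args: Array[String]): Unit = {
    // Build the configuration programmatically instead of going through spark-submit.
    val sparkConf = new SparkConf()
      .setMaster("yarn-client")                      // same master we used on 1.3.1
      .setAppName("MyYarnApp")                       // illustrative app name
      .set("spark.executor.instances", "4")          // YARN-related settings we pass
      .set("spark.executor.cores", "2")
      .set("spark.executor.memory", "2g")
      .set("spark.executor.extraJavaOptions", "-XX:+UseG1GC")

    // This is the line that now fails on 1.5.2.
    val sparkContext = new SparkContext(sparkConf)

    // ... run jobs with sparkContext ...
    sparkContext.stop()
  }
}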

However, after upgrading to Spark 1.5.2, the application breaks on the line
`val sparkContext = new SparkContext(sparkConf)`:

16/01/28 17:38:35 ERROR util.Utils: Uncaught exception in thread main
java.lang.NullPointerException
    at org.apache.spark.network.netty.NettyBlockTransferService.close(NettyBlockTransferService.scala:152)
    at org.apache.spark.storage.BlockManager.stop(BlockManager.scala:1228)
    at org.apache.spark.SparkEnv.stop(SparkEnv.scala:100)
    at org.apache.spark.SparkContext$$anonfun$stop$12.apply$mcV$sp(SparkContext.scala:1749)
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1185)
    at org.apache.spark.SparkContext.stop(SparkContext.scala:1748)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:593)


*In the YARN container logs I see the following:*

16/01/28 17:38:29 INFO yarn.ApplicationMaster: Registered signal handlers for [TERM, HUP, INT]
Unknown/unsupported param List(--properties-file, /tmp/hadoop-xactly/nm-local-dir/usercache/xactly/appcache/application_1453752281504_3427/container_1453752281504_3427_01_000002/__spark_conf__/__spark_conf__.properties)

Usage: org.apache.spark.deploy.yarn.ApplicationMaster [options]
Options:
  --jar JAR_PATH       Path to your application's JAR file
  --class CLASS_NAME   Name of your application's main class
  --primary-py-file    A main Python file
  --py-files PY_FILES  Comma-separated list of .zip, .egg, or .py files to
                       place on the PYTHONPATH for Python apps.
  --args ARGS          Arguments to be passed to your application's main class.
                       Multiple invocations are possible, each will be
                       passed in order.
  --num-executors NUM    Number of executors to start (Default: 2)
  --executor-cores NUM   Number of cores for the executors (Default: 1)
  --executor-memory MEM  Memory per executor (e.g. 1000M, 2G) (Default: 1G)



So is this approach still supposed to work? Or must I use the SparkLauncher
class with Spark 1.5.2?
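
If SparkLauncher is now the expected route, the minimal sketch below is roughly
what I have in mind; the Spark home, jar path and class name are placeholders,
not our actual setup:

import org.apache.spark.launcher.SparkLauncher

object LauncherExample {
  def main(args: Array[String]): Unit = {
    // Launch the application as a child spark-submit process.
    val process = new SparkLauncher()
      .setSparkHome("/opt/spark-1.5.2")              // placeholder path
      .setAppResource("/path/to/my-app.jar")         // placeholder jar
      .setMainClass("com.example.MyYarnApp")         // placeholder class
      .setMaster("yarn-client")
      .setConf("spark.executor.instances", "4")
      .setConf("spark.executor.memory", "2g")
      .launch()

    // Wait for the child process and report its exit code.
    val exitCode = process.waitFor()
    println(s"Spark application finished with exit code $exitCode")
  }
}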

Thanks

Nirav


Re: Spark 1.5.2 - Programmatically launching spark on yarn-client mode

Posted by Nirav Patel <np...@xactlycorp.com>.
Thanks Ted. In my application jar there were no Spark 1.3.1 artifacts.
Anyhow, I got it working via the Oozie spark action.

On Thu, Jan 28, 2016 at 7:42 PM, Ted Yu <yu...@gmail.com> wrote:

> Looks like '--properties-file' is no longer supported.
>
> Was it possible that Spark 1.3.1 artifact / dependency leaked into your
> app ?
>
> Cheers


Re: Spark 1.5.2 - Programmatically launching spark on yarn-client mode

Posted by Ted Yu <yu...@gmail.com>.
Looks like '--properties-file' is no longer supported.

Was it possible that a Spark 1.3.1 artifact / dependency leaked into your app?
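
One quick way to check (just a sketch; adjust for your build) is to print which
jar SparkContext is actually loaded from and the version it reports:

import org.apache.spark.SparkContext

object ClasspathCheck {
  def main(args: Array[String]): Unit = {
    // The jar that SparkContext was loaded from; a leftover 1.3.1 artifact
    // on the classpath would show up here.
    val source = classOf[SparkContext].getProtectionDomain.getCodeSource
    println(s"SparkContext loaded from: ${source.getLocation}")

    // The advertised version should match the jar above.
    println(s"Spark version on classpath: ${org.apache.spark.SPARK_VERSION}")
  }
}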

Cheers
