Posted to user@spark.apache.org by Peter Griessl <gr...@ihs.ac.at> on 2016/09/07 12:02:43 UTC

No SparkR on Mesos?

Hello,

does SparkR really not work (yet?) on Mesos (Spark 2.0 on Mesos 1.0)?

$ /opt/spark/bin/sparkR

R version 3.3.1 (2016-06-21) -- "Bug in Your Hair"
Copyright (C) 2016 The R Foundation for Statistical Computing
Platform: x86_64-pc-linux-gnu (64-bit)
Launching java with spark-submit command /opt/spark/bin/spark-submit   "sparkr-shell" /tmp/RtmpPYVJxF/backend_port338581f434
Error: SparkR is not supported for Mesos cluster.
Error in sparkR.sparkContext(master, appName, sparkHome, sparkConfigMap,  :
  JVM is not ready after 10 seconds


I couldn't find any information on this subject in the docs - am I missing something?

Thanks for any hints,
Peter

Re: No SparkR on Mesos?

Posted by Timothy Chen <tn...@gmail.com>.
Python should be supported; I tested it, and the patches should already be merged in 1.6.2.

Tim

> On Sep 8, 2016, at 1:20 AM, Michael Gummelt <mg...@mesosphere.io> wrote:
> 
> Quite possibly.  I've never used it.  I know Python was "unsupported" for a while, which turned out to mean there was a silly conditional that would fail the submission, even though all the support was there.  Could be the same for R.  Can you submit a JIRA?
> 
>> On Wed, Sep 7, 2016 at 5:02 AM, Peter Griessl <gr...@ihs.ac.at> wrote:
>> [original message from Peter Griessl quoted in full - trimmed]

Re: No SparkR on Mesos?

Posted by Felix Cheung <fe...@hotmail.com>.
This is correct - SparkR does not yet fully work on Mesos. JIRAs and contributions are welcome!





On Wed, Sep 7, 2016 at 10:21 AM -0700, "Michael Gummelt" <mg...@mesosphere.io> wrote:

Quite possibly.  I've never used it.  I know Python was "unsupported" for a while, which turned out to mean there was a silly conditional that would fail the submission, even though all the support was there.  Could be the same for R.  Can you submit a JIRA?

On Wed, Sep 7, 2016 at 5:02 AM, Peter Griessl <gr...@ihs.ac.at> wrote:
[original message from Peter Griessl quoted in full - trimmed]

Re: No SparkR on Mesos?

Posted by Michael Gummelt <mg...@mesosphere.io>.
Quite possibly.  I've never used it.  I know Python was "unsupported" for a
while, which turned out to mean there was a silly conditional that would
fail the submission, even though all the support was there.  Could be the
same for R.  Can you submit a JIRA?
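
For illustration, the kind of guard described above would look roughly like this - a hypothetical Scala sketch, not the actual SparkSubmit code (the object and method names are made up):

object SubmitGuard {
  // Hypothetical sketch: a blanket language/cluster-manager check that rejects the
  // submission up front, regardless of whether the runtime support actually exists.
  def validate(clusterManager: String, isR: Boolean): Unit = {
    if (clusterManager == "mesos" && isR) {
      throw new IllegalArgumentException("SparkR is not supported for Mesos cluster.")
    }
  }
}

Since spark-submit exits immediately, the sparkr-shell backend never comes up, which is presumably why the R side then reports the secondary "JVM is not ready after 10 seconds" error.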

On Wed, Sep 7, 2016 at 5:02 AM, Peter Griessl <gr...@ihs.ac.at> wrote:

> [original message quoted in full - trimmed]



-- 
Michael Gummelt
Software Engineer
Mesosphere

Failed to open native connection to Cassandra at

Posted by muhammet pakyürek <mp...@hotmail.com>.
How do I solve the problem below?

py4j.protocol.Py4JJavaError: An error occurred while calling o33.load.
: java.io.IOException: Failed to open native connection to Cassandra at {127.0.1.1}:9042
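
A common cause of this error is the connector falling back to the loopback alias 127.0.1.1 rather than an address Cassandra actually listens on. A minimal, hedged sketch assuming the DataStax spark-cassandra-connector, with placeholder host, keyspace and table names:

import org.apache.spark.sql.SparkSession

object CassandraConnectCheck {
  def main(args: Array[String]): Unit = {
    // Placeholder address: use the host Cassandra binds rpc_address to,
    // not the 127.0.1.1 alias that Debian-style /etc/hosts entries resolve to.
    val spark = SparkSession.builder()
      .appName("cassandra-connect-check")
      .config("spark.cassandra.connection.host", "10.0.0.5")
      .getOrCreate()

    val df = spark.read
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "my_keyspace", "table" -> "my_table"))
      .load()
    df.show(5)
    spark.stop()
  }
}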



Re: No SparkR on Mesos?

Posted by ray <su...@163.com>.
Hi, Rodrick,

Interesting. SparkR is expected not to work on Mesos due to a lack of Mesos support in some places, and it has not been tested yet.

Have you modified the Spark source code yourself? Have you deployed the Spark binary distribution on all slave nodes and set “spark.mesos.executor.home” to point to it?

It would be cool if you could contribute a patch :)
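
For reference, a minimal sketch of that setting from the JVM side; the Mesos master URL and paths below are placeholders, not values from this thread:

import org.apache.spark.{SparkConf, SparkContext}

object MesosExecutorHomeCheck {
  def main(args: Array[String]): Unit = {
    // spark.mesos.executor.home should point to the directory where the Spark
    // binary distribution is unpacked on every slave node.
    val conf = new SparkConf()
      .setAppName("mesos-executor-home-check")
      .setMaster("mesos://zk://mesos-master:2181/mesos")
      .set("spark.mesos.executor.home", "/opt/spark")
    val sc = new SparkContext(conf)
    println(sc.parallelize(1 to 100).count())  // trivial job to confirm executors start
    sc.stop()
  }
}

The same property can also be set in conf/spark-defaults.conf, or passed to sparkR / spark-submit with --conf spark.mesos.executor.home=/opt/spark.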

From:  <us...@spark.apache.org> on behalf of Rodrick Brown
Date:  Thursday, September 8, 2016 at 09:46
To:  Peter Griessl
Cc:  "user@spark.apache.org"
Subject:  Re: No SparkR on Mesos?

[quoted message from Rodrick Brown trimmed - the full message appears in his post below]


Re: No SparkR on Mesos?

Posted by Rodrick Brown <ro...@orchardplatform.com>.
We've been using SparkR on Mesos for quite some time with no issues.


[fedora@prod-rstudio-1 ~]$ /opt/spark-1.6.1/bin/sparkR

R version 3.3.0 (2016-05-03) -- "Supposedly Educational"
Copyright (C) 2016 The R Foundation for Statistical Computing
Platform: x86_64-redhat-linux-gnu (64-bit)

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.

  Natural language support but running in an English locale

R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R or R packages in publications.

Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for an HTML browser interface to help.
Type 'q()' to quit R.

Launching java with spark-submit command /opt/spark-1.6.1/bin/spark-submit
  "sparkr-shell" /tmp/Rtmphk5zxe/backend_port11f8414240b65
16/09/08 01:44:04 INFO SparkContext: Running Spark version 1.6.1
16/09/08 01:44:04 WARN NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
16/09/08 01:44:05 INFO SecurityManager: Changing view acls to: fedora
16/09/08 01:44:05 INFO SecurityManager: Changing modify acls to: fedora
16/09/08 01:44:05 INFO SecurityManager: SecurityManager: authentication
disabled; ui acls disabled; users with view permissions: Set(fedora); users
with modify permissions: Set(fedora)
16/09/08 01:44:05 INFO Utils: Successfully started service 'sparkDriver' on
port 39193.
16/09/08 01:44:05 INFO Slf4jLogger: Slf4jLogger started
16/09/08 01:44:05 INFO Remoting: Starting remoting
16/09/08 01:44:05 INFO Remoting: Remoting started; listening on addresses
:[akka.tcp://sparkDriverActorSystem@172.1.34.13:44212]
16/09/08 01:44:05 INFO Utils: Successfully started service
'sparkDriverActorSystem' on port 44212.
16/09/08 01:44:05 INFO SparkEnv: Registering MapOutputTracker
16/09/08 01:44:05 INFO SparkEnv: Registering BlockManagerMaster
16/09/08 01:44:05 INFO DiskBlockManager: Created local directory at
/home/fedora/spark-tmp-73604/blockmgr-2928edf7-635e-45ca-83ed-8dc1de50b141
16/09/08 01:44:05 INFO MemoryStore: MemoryStore started with capacity 3.4 GB
16/09/08 01:44:05 INFO SparkEnv: Registering OutputCommitCoordinator
16/09/08 01:44:05 INFO Utils: Successfully started service 'SparkUI' on
port 4040.
16/09/08 01:44:05 INFO SparkUI: Started SparkUI at http://172.1.34.13:4040
16/09/08 01:44:06 INFO Executor: Starting executor ID driver on host
localhost
16/09/08 01:44:06 INFO Utils: Successfully started service
'org.apache.spark.network.netty.NettyBlockTransferService' on port 45678.
16/09/08 01:44:06 INFO NettyBlockTransferService: Server created on 45678
16/09/08 01:44:06 INFO BlockManager: external shuffle service port = 31338
16/09/08 01:44:06 INFO BlockManagerMaster: Trying to register BlockManager
16/09/08 01:44:06 INFO BlockManagerMasterEndpoint: Registering block
manager localhost:45678 with 3.4 GB RAM, BlockManagerId(driver, localhost,
45678)
16/09/08 01:44:06 INFO BlockManagerMaster: Registered BlockManager

 Welcome to
    ____              __
   / __/__  ___ _____/ /__
  _\ \/ _ \/ _ `/ __/  '_/
 /___/ .__/\_,_/_/ /_/\_\   version  1.6.1
    /_/


 Spark context is available as sc, SQL context is available as sqlContext
>



On Wed, Sep 7, 2016 at 8:02 AM, Peter Griessl <gr...@ihs.ac.at> wrote:

> [original message quoted in full - trimmed]



-- 


Rodrick Brown / DevOps

9174456839 / rodrick@orchardplatform.com

Orchard Platform
101 5th Avenue, 4th Floor, New York, NY
