Posted to user@livy.apache.org by andrew shved <an...@gmail.com> on 2018/12/23 15:42:12 UTC

livy with sparkR issue

I've been struggling with Zeppelin + Livy + SparkR integration for days. I got
livy.pyspark and livy.spark working with no issues, but with livy.sparkr I get:

18/12/23 15:05:24 INFO SparkEntries: Created Spark session (with Hive
support).
Exception in thread "SparkR backend" java.lang.ClassCastException:
scala.Tuple2 cannot be cast to java.lang.Integer
    at scala.runtime.BoxesRunTime.unboxToInt(BoxesRunTime.java:101)
    at
org.apache.livy.repl.SparkRInterpreter$$anon$1.run(SparkRInterpreter.scala:83)
all the time. :disappointed: I'm running out of things to try.
A simple spark.R script works.

Any ideas or advice would be appreciated. Thank you!

Re: livy with sparkR issue

Posted by Jiang Jacky <ji...@gmail.com>.
You can't cast a scala.Tuple2 straight to an Integer. Extract the integer from the Tuple2 first, then cast that in your workflow.
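In JVM terms, Jacky's suggestion looks like this (an illustrative Java sketch using a stand-in pair type; Livy's actual code differs):

```java
import java.util.AbstractMap.SimpleEntry;

public class Tuple2CastDemo {
    public static void main(String[] args) {
        // A pair, standing in for the scala.Tuple2 in the stack trace.
        Object result = new SimpleEntry<>(10001, "extra");

        // Casting the pair itself to Integer fails at runtime.
        try {
            Integer port = (Integer) result;
            System.out.println(port);
        } catch (ClassCastException e) {
            System.out.println("cannot cast a pair straight to Integer");
        }

        // Extract the element you want first, then cast that.
        int port = (Integer) ((SimpleEntry<?, ?>) result).getKey();
        System.out.println(port); // prints 10001
    }
}
```

The catch here is that in this thread the cast happens inside Livy's own SparkRInterpreter, so the user can't apply this fix from notebook code.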



Re: livy with sparkR issue

Posted by Jeff Zhang <zj...@gmail.com>.
Sorry, my mistake in the last email. Only SparkR before 2.3.0 is supported.

https://github.com/apache/zeppelin/blob/master/spark/interpreter/src/main/java/org/apache/zeppelin/spark/SparkVersion.java#L88
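The linked check boils down to a lexicographic major/minor version comparison; a minimal sketch (hypothetical helper, not the actual Zeppelin code):

```java
public class VersionGate {
    // True if version (major, minor) is older than the cutoff (cutMajor, cutMinor).
    static boolean before(int major, int minor, int cutMajor, int cutMinor) {
        return major < cutMajor || (major == cutMajor && minor < cutMinor);
    }

    public static void main(String[] args) {
        // Gating SparkR support on "before 2.4", for example:
        System.out.println(before(2, 3, 2, 4)); // true  -> would be supported
        System.out.println(before(2, 4, 2, 4)); // false -> would be rejected
    }
}
```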



-- 
Best Regards

Jeff Zhang

Re: livy with sparkR issue

Posted by andrew shved <an...@gmail.com>.
Actually I get the same error even when I do something dead simple like
below. I ran the same commands in sparkR directly and it worked. Does Livy
simply not work with sparkR? This is with 2.3.1. It is a bit concerning
that nothing really works via Livy while everything works directly via
sparkR; doesn't that point to a Livy issue?

%sparkr
df <- createDataFrame(sqlContext, faithful)
head(df)


Re: livy with sparkR issue

Posted by andrew shved <an...@gmail.com>.
I switched to Spark 2.3.1
and still get the same error message in the logs
when running:
%sparkr
sql("select * from a")
See the full log below. How would I change this to get a result back?

18/12/24 01:10:55 INFO RSCDriver: Connecting to:
ip-172-31-29-242.ca-central-1.compute.internal:10001
18/12/24 01:10:55 INFO RSCDriver: Starting RPC server...
18/12/24 01:10:55 INFO RpcServer: Connected to the port 10003
18/12/24 01:10:55 WARN RSCConf: Your hostname,
ip-172-31-29-242.ca-central-1.compute.internal, resolves to a loopback
address, but we couldn't find any external IP address!
18/12/24 01:10:55 WARN RSCConf: Set livy.rsc.rpc.server.address if you
need to bind to another address.
18/12/24 01:10:55 INFO RSCDriver: Received job request
f2d02219-7386-4cd7-8cb3-c5de4254d405
18/12/24 01:10:55 INFO RSCDriver: SparkContext not yet up, queueing job request.
18/12/24 01:10:58 INFO SparkEntries: Starting Spark context...
18/12/24 01:10:58 INFO SparkContext: Running Spark version 2.3.1
18/12/24 01:10:58 INFO SparkContext: Submitted application: livy-session-0
18/12/24 01:10:58 INFO SecurityManager: Changing view acls to: livy
18/12/24 01:10:58 INFO SecurityManager: Changing modify acls to: livy
18/12/24 01:10:58 INFO SecurityManager: Changing view acls groups to:
18/12/24 01:10:58 INFO SecurityManager: Changing modify acls groups to:
18/12/24 01:10:58 INFO SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users  with view
permissions: Set(livy); groups with view permissions: Set(); users
with modify permissions: Set(livy); groups with modify permissions:
Set()
18/12/24 01:10:58 INFO Utils: Successfully started service
'sparkDriver' on port 35045.
18/12/24 01:10:58 INFO SparkEnv: Registering MapOutputTracker
18/12/24 01:10:58 INFO SparkEnv: Registering BlockManagerMaster
18/12/24 01:10:58 INFO BlockManagerMasterEndpoint: Using
org.apache.spark.storage.DefaultTopologyMapper for getting topology
information
18/12/24 01:10:58 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
18/12/24 01:10:58 INFO DiskBlockManager: Created local directory at
/mnt/tmp/blockmgr-ba12ad73-dcd8-4216-8ee3-649db6d5002d
18/12/24 01:10:58 INFO MemoryStore: MemoryStore started with capacity 413.9 MB
18/12/24 01:10:58 INFO SparkEnv: Registering OutputCommitCoordinator
18/12/24 01:10:58 INFO Utils: Successfully started service 'SparkUI'
on port 4040.
18/12/24 01:10:58 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started
at http://ip-172-31-29-242.ca-central-1.compute.internal:4040
18/12/24 01:10:58 INFO SparkContext: Added JAR
file:/usr/lib/livy/rsc-jars/netty-all-4.0.37.Final.jar at
spark://ip-172-31-29-242.ca-central-1.compute.internal:35045/jars/netty-all-4.0.37.Final.jar
with timestamp 1545613858546
18/12/24 01:10:58 INFO SparkContext: Added JAR
file:/usr/lib/livy/rsc-jars/livy-rsc-0.5.0-incubating.jar at
spark://ip-172-31-29-242.ca-central-1.compute.internal:35045/jars/livy-rsc-0.5.0-incubating.jar
with timestamp 1545613858547
18/12/24 01:10:58 INFO SparkContext: Added JAR
file:/usr/lib/livy/rsc-jars/livy-api-0.5.0-incubating.jar at
spark://ip-172-31-29-242.ca-central-1.compute.internal:35045/jars/livy-api-0.5.0-incubating.jar
with timestamp 1545613858547
18/12/24 01:10:58 INFO SparkContext: Added JAR
file:/usr/lib/livy/repl_2.11-jars/livy-core_2.11-0.5.0-incubating.jar
at spark://ip-172-31-29-242.ca-central-1.compute.internal:35045/jars/livy-core_2.11-0.5.0-incubating.jar
with timestamp 1545613858547
18/12/24 01:10:58 INFO SparkContext: Added JAR
file:/usr/lib/livy/repl_2.11-jars/livy-repl_2.11-0.5.0-incubating.jar
at spark://ip-172-31-29-242.ca-central-1.compute.internal:35045/jars/livy-repl_2.11-0.5.0-incubating.jar
with timestamp 1545613858547
18/12/24 01:10:58 INFO SparkContext: Added JAR
file:/usr/lib/livy/repl_2.11-jars/commons-codec-1.9.jar at
spark://ip-172-31-29-242.ca-central-1.compute.internal:35045/jars/commons-codec-1.9.jar
with timestamp 1545613858547
18/12/24 01:10:58 INFO Utils: Using initial executors = 0, max of
spark.dynamicAllocation.initialExecutors,
spark.dynamicAllocation.minExecutors and spark.executor.instances
18/12/24 01:10:59 INFO RMProxy: Connecting to ResourceManager at
ip-172-31-29-242.ca-central-1.compute.internal/172.31.29.242:8032
18/12/24 01:10:59 INFO Client: Requesting a new application from
cluster with 1 NodeManagers
18/12/24 01:10:59 INFO Client: Verifying our application has not
requested more than the maximum memory capability of the cluster
(57344 MB per container)
18/12/24 01:10:59 INFO Client: Will allocate AM container, with 896 MB
memory including 384 MB overhead
18/12/24 01:10:59 INFO Client: Setting up container launch context for our AM
18/12/24 01:10:59 INFO Client: Setting up the launch environment for
our AM container
18/12/24 01:10:59 INFO Client: Preparing resources for our AM container
18/12/24 01:10:59 WARN Client: Neither spark.yarn.jars nor
spark.yarn.archive is set, falling back to uploading libraries under
SPARK_HOME.
18/12/24 01:11:01 INFO Client: Uploading resource
file:/mnt/tmp/spark-9bbe6c97-c7d6-4ef7-8d0c-9bea77319679/__spark_libs__7081303208980234686.zip
-> hdfs://ip-172-31-29-242.ca-central-1.compute.internal:8020/user/livy/.sparkStaging/application_1545610352780_0008/__spark_libs__7081303208980234686.zip
18/12/24 01:11:01 INFO Client: Uploading resource
file:/usr/lib/livy/rsc-jars/netty-all-4.0.37.Final.jar ->
hdfs://ip-172-31-29-242.ca-central-1.compute.internal:8020/user/livy/.sparkStaging/application_1545610352780_0008/netty-all-4.0.37.Final.jar
18/12/24 01:11:01 INFO Client: Uploading resource
file:/usr/lib/livy/rsc-jars/livy-rsc-0.5.0-incubating.jar ->
hdfs://ip-172-31-29-242.ca-central-1.compute.internal:8020/user/livy/.sparkStaging/application_1545610352780_0008/livy-rsc-0.5.0-incubating.jar
18/12/24 01:11:01 INFO Client: Uploading resource
file:/usr/lib/livy/rsc-jars/livy-api-0.5.0-incubating.jar ->
hdfs://ip-172-31-29-242.ca-central-1.compute.internal:8020/user/livy/.sparkStaging/application_1545610352780_0008/livy-api-0.5.0-incubating.jar
18/12/24 01:11:02 INFO Client: Uploading resource
file:/usr/lib/livy/repl_2.11-jars/livy-core_2.11-0.5.0-incubating.jar
-> hdfs://ip-172-31-29-242.ca-central-1.compute.internal:8020/user/livy/.sparkStaging/application_1545610352780_0008/livy-core_2.11-0.5.0-incubating.jar
18/12/24 01:11:02 INFO Client: Uploading resource
file:/usr/lib/livy/repl_2.11-jars/livy-repl_2.11-0.5.0-incubating.jar
-> hdfs://ip-172-31-29-242.ca-central-1.compute.internal:8020/user/livy/.sparkStaging/application_1545610352780_0008/livy-repl_2.11-0.5.0-incubating.jar
18/12/24 01:11:02 INFO Client: Uploading resource
file:/usr/lib/livy/repl_2.11-jars/commons-codec-1.9.jar ->
hdfs://ip-172-31-29-242.ca-central-1.compute.internal:8020/user/livy/.sparkStaging/application_1545610352780_0008/commons-codec-1.9.jar
18/12/24 01:11:02 INFO Client: Uploading resource
file:/etc/spark/conf/hive-site.xml ->
hdfs://ip-172-31-29-242.ca-central-1.compute.internal:8020/user/livy/.sparkStaging/application_1545610352780_0008/hive-site.xml
18/12/24 01:11:02 INFO Client: Uploading resource
file:/usr/lib/spark/R/lib/sparkr.zip#sparkr ->
hdfs://ip-172-31-29-242.ca-central-1.compute.internal:8020/user/livy/.sparkStaging/application_1545610352780_0008/sparkr.zip
18/12/24 01:11:02 INFO Client: Uploading resource
file:/usr/lib/spark/python/lib/pyspark.zip ->
hdfs://ip-172-31-29-242.ca-central-1.compute.internal:8020/user/livy/.sparkStaging/application_1545610352780_0008/pyspark.zip
18/12/24 01:11:02 INFO Client: Uploading resource
file:/usr/lib/spark/python/lib/py4j-0.10.7-src.zip ->
hdfs://ip-172-31-29-242.ca-central-1.compute.internal:8020/user/livy/.sparkStaging/application_1545610352780_0008/py4j-0.10.7-src.zip
18/12/24 01:11:02 INFO Client: Uploading resource
file:/mnt/tmp/spark-9bbe6c97-c7d6-4ef7-8d0c-9bea77319679/__spark_conf__7275621142666935300.zip
-> hdfs://ip-172-31-29-242.ca-central-1.compute.internal:8020/user/livy/.sparkStaging/application_1545610352780_0008/__spark_conf__.zip
18/12/24 01:11:02 INFO SecurityManager: Changing view acls to: livy
18/12/24 01:11:02 INFO SecurityManager: Changing modify acls to: livy
18/12/24 01:11:02 INFO SecurityManager: Changing view acls groups to:
18/12/24 01:11:02 INFO SecurityManager: Changing modify acls groups to:
18/12/24 01:11:02 INFO SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users  with view
permissions: Set(livy); groups with view permissions: Set(); users
with modify permissions: Set(livy); groups with modify permissions:
Set()
18/12/24 01:11:02 INFO Client: Submitting application
application_1545610352780_0008 to ResourceManager
18/12/24 01:11:02 INFO YarnClientImpl: Submitted application
application_1545610352780_0008
18/12/24 01:11:02 INFO SchedulerExtensionServices: Starting Yarn
extension services with app application_1545610352780_0008 and
attemptId None
18/12/24 01:11:03 INFO Client: Application report for
application_1545610352780_0008 (state: ACCEPTED)
18/12/24 01:11:03 INFO Client:
	 client token: N/A
	 diagnostics: AM container is launched, waiting for AM container to
Register with RM
	 ApplicationMaster host: N/A
	 ApplicationMaster RPC port: -1
	 queue: default
	 start time: 1545613862223
	 final status: UNDEFINED
	 tracking URL:
http://ip-172-31-29-242.ca-central-1.compute.internal:20888/proxy/application_1545610352780_0008/
	 user: livy
18/12/24 01:11:04 INFO Client: Application report for
application_1545610352780_0008 (state: ACCEPTED)
18/12/24 01:11:05 INFO Client: Application report for
application_1545610352780_0008 (state: ACCEPTED)
18/12/24 01:11:06 INFO YarnClientSchedulerBackend: Add WebUI Filter.
org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter,
Map(PROXY_HOSTS -> ip-172-31-29-242.ca-central-1.compute.internal,
PROXY_URI_BASES ->
http://ip-172-31-29-242.ca-central-1.compute.internal:20888/proxy/application_1545610352780_0008),
/proxy/application_1545610352780_0008
18/12/24 01:11:06 INFO JettyUtils: Adding filter:
org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
18/12/24 01:11:06 INFO Client: Application report for
application_1545610352780_0008 (state: ACCEPTED)
18/12/24 01:11:06 INFO YarnSchedulerBackend$YarnSchedulerEndpoint:
ApplicationMaster registered as
NettyRpcEndpointRef(spark-client://YarnAM)
18/12/24 01:11:07 INFO Client: Application report for
application_1545610352780_0008 (state: RUNNING)
18/12/24 01:11:07 INFO Client:
	 client token: N/A
	 diagnostics: N/A
	 ApplicationMaster host: 172.31.29.242
	 ApplicationMaster RPC port: 0
	 queue: default
	 start time: 1545613862223
	 final status: UNDEFINED
	 tracking URL:
http://ip-172-31-29-242.ca-central-1.compute.internal:20888/proxy/application_1545610352780_0008/
	 user: livy
18/12/24 01:11:07 INFO YarnClientSchedulerBackend: Application
application_1545610352780_0008 has started running.
18/12/24 01:11:07 INFO Utils: Successfully started service
'org.apache.spark.network.netty.NettyBlockTransferService' on port
45861.
18/12/24 01:11:07 INFO NettyBlockTransferService: Server created on
ip-172-31-29-242.ca-central-1.compute.internal:45861
18/12/24 01:11:07 INFO BlockManager: Using
org.apache.spark.storage.RandomBlockReplicationPolicy for block
replication policy
18/12/24 01:11:07 INFO BlockManagerMaster: Registering BlockManager
BlockManagerId(driver, ip-172-31-29-242.ca-central-1.compute.internal,
45861, None)
18/12/24 01:11:07 INFO BlockManagerMasterEndpoint: Registering block
manager ip-172-31-29-242.ca-central-1.compute.internal:45861 with
413.9 MB RAM, BlockManagerId(driver,
ip-172-31-29-242.ca-central-1.compute.internal, 45861, None)
18/12/24 01:11:07 INFO BlockManagerMaster: Registered BlockManager
BlockManagerId(driver, ip-172-31-29-242.ca-central-1.compute.internal,
45861, None)
18/12/24 01:11:07 INFO BlockManager: external shuffle service port = 7337
18/12/24 01:11:07 INFO BlockManager: Initialized BlockManager:
BlockManagerId(driver, ip-172-31-29-242.ca-central-1.compute.internal,
45861, None)
18/12/24 01:11:07 INFO EventLoggingListener: Logging events to
hdfs:/var/log/spark/apps/application_1545610352780_0008
18/12/24 01:11:07 INFO Utils: Using initial executors = 0, max of
spark.dynamicAllocation.initialExecutors,
spark.dynamicAllocation.minExecutors and spark.executor.instances
18/12/24 01:11:07 INFO YarnClientSchedulerBackend: SchedulerBackend is
ready for scheduling beginning after reached
minRegisteredResourcesRatio: 0.8
18/12/24 01:11:07 INFO SparkEntries: Spark context finished
initialization in 9581ms
18/12/24 01:11:07 INFO SparkEntries: Created Spark session (with Hive support).
Exception in thread "SparkR backend" java.lang.ClassCastException:
scala.Tuple2 cannot be cast to java.lang.Integer
	at scala.runtime.BoxesRunTime.unboxToInt(BoxesRunTime.java:101)
	at org.apache.livy.repl.SparkRInterpreter$$anon$1.run(SparkRInterpreter.scala:83)



Re: livy with sparkR issue

Posted by Jeff Zhang <zj...@gmail.com>.
This is because Livy 0.5 doesn't support Spark 2.4: Spark 2.4 changed its
SparkR-related method signatures. I'm afraid you have to downgrade to
Spark 2.3.x.
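The failure mode Jeff describes, a backend method whose return type changed from an int to a pair while the caller still unboxes the reflective result to an int, can be sketched like this (illustrative Java; the class and method names are stand-ins, not Livy's or Spark's actual API):

```java
import java.lang.reflect.Method;
import java.util.AbstractMap.SimpleEntry;

public class SignatureChangeDemo {
    // Stand-in for the old backend: init() hands back just a port number.
    static class OldBackend {
        public Object init() { return 10001; }
    }

    // Stand-in for the new backend: init() now hands back a (port, secret) pair.
    static class NewBackend {
        public Object init() { return new SimpleEntry<>(10001, "auth-secret"); }
    }

    // A caller written against the old signature assumes the result unboxes to an int.
    static int initPort(Object backend) throws Exception {
        Method m = backend.getClass().getMethod("init");
        return (Integer) m.invoke(backend); // ClassCastException if a pair comes back
    }

    public static void main(String[] args) throws Exception {
        System.out.println(initPort(new OldBackend())); // prints 10001
        try {
            initPort(new NewBackend());
        } catch (ClassCastException e) {
            // Mirrors "scala.Tuple2 cannot be cast to java.lang.Integer"
            System.out.println("ClassCastException, as in the Livy log");
        }
    }
}
```

Because the cast lives in Livy's interpreter, not in user code, only a Livy upgrade or a Spark downgrade resolves it.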



-- 
Best Regards

Jeff Zhang

Re: livy with sparkR issue

Posted by andrew shved <an...@gmail.com>.
Spark 2.4.0, sorry.
Zeppelin 0.8.0
Livy 0.5

Regular livy.sparkr commands like
1+1 work; the issue appears when Spark comes into play.


Re: livy with sparkR issue

Posted by andrew shved <an...@gmail.com>.
0.5 with spark 2.4.9 on AWS EMR


Re: livy with sparkR issue

Posted by Jeff Zhang <zj...@gmail.com>.
Which version of Livy do you use?



-- 
Best Regards

Jeff Zhang