Posted to users@zeppelin.apache.org by Ben Vogan <be...@shopkick.com> on 2017/05/14 12:16:53 UTC

Illegal Inheritance error

Hi all,

I've been using Zeppelin for a couple of weeks now with a stable
configuration, but all of a sudden I am getting "Illegal inheritance"
errors like so:

 INFO [2017-05-14 03:25:32,678] ({pool-2-thread-56}
Paragraph.java[jobRun]:362) - run paragraph 20170514-032326_663206142 using
livy org.apache.zeppelin.interpreter.LazyOpenInterpreter@505a171c
 WARN [2017-05-14 03:25:33,696] ({pool-2-thread-56}
NotebookServer.java[afterStatusChange]:2058) - Job
20170514-032326_663206142 is finished, status: ERROR, exception: null,
result: %text <console>:4: error: illegal inheritance;

It happens across multiple notebooks and across both my spark and livy
interpreters.  I don't know where to look for more information about what
is wrong.  I don't see any errors in spark/yarn at all.  The driver got
created, but it looks like no jobs were ever submitted to spark.

Help would be greatly appreciated.

Thanks,

-- 
*BENJAMIN VOGAN* | Data Platform Team Lead

<http://www.shopkick.com/>
<https://www.facebook.com/shopkick> <https://www.instagram.com/shopkick/>
<https://www.pinterest.com/shopkick/> <https://twitter.com/shopkickbiz>
<https://www.linkedin.com/company-beta/831240/?pathWildcard=831240>

Re: Illegal Inheritance error

Posted by Jeff Zhang <zj...@gmail.com>.
You should put hive-site.xml in SPARK_CONF_DIR; the file-not-found failure
is due to a Spark bug:

https://issues.apache.org/jira/browse/SPARK-18160
https://issues.cloudera.org/browse/LIVY-298

I have a workaround for you: if your cluster is not large, install Spark on
all the nodes and put hive-site.xml in SPARK_CONF_DIR.
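A minimal sketch of that workaround, under placeholder paths (the real
HIVE_CONF and SPARK_CONF_DIR locations depend on your install; nothing here
is taken from this cluster):

```shell
# Sketch of the workaround under placeholder paths; adjust HIVE_CONF and
# SPARK_CONF_DIR to your install, and repeat on every node (e.g. via scp
# or your configuration-management tool).
HIVE_CONF=${HIVE_CONF:-./demo/hive/conf}
SPARK_CONF_DIR=${SPARK_CONF_DIR:-./demo/spark/conf}
mkdir -p "$HIVE_CONF" "$SPARK_CONF_DIR"
# On a real node hive-site.xml already exists; a stub stands in for the demo.
[ -f "$HIVE_CONF/hive-site.xml" ] || printf '<configuration/>\n' > "$HIVE_CONF/hive-site.xml"
# The actual fix: make hive-site.xml visible under SPARK_CONF_DIR.
cp "$HIVE_CONF/hive-site.xml" "$SPARK_CONF_DIR/hive-site.xml"
```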




Ben Vogan <be...@shopkick.com> wrote on Tue, May 16, 2017 at 3:52 PM:

> I know - this is driving me crazy.  It was working fine and without me
> touching any of it (zeppelin/livy/spark/yarn), it broke.  And with no
> errors in spark/yarn or livy.  I see a warning in the livy log about the
> hive-site.xml not being found.  In the interpreter configuration I have
> tried setting livy.repl.enableHiveContext to both true and false and it
> still appears to always create a hive context.  I'm not sure how to get rid
> of that warning - I tried putting the hive-site.xml into the spark conf
> directory on the livy host but that broke things entirely on the yarn side
> - claimed the file didn't exist (maybe it would have to be put on all the
> yarn executor hosts as well?).
>
> *17/05/16 19:47:39 WARN InteractiveSession$: Enable HiveContext but no
> hive-site.xml found under classpath or user request.*
> 17/05/16 19:47:39 INFO InteractiveSession$: Creating LivyClient for
> sessionId: 26
> 17/05/16 19:47:39 WARN RSCConf: Your hostname,
> jarvis-hue002.internal.shopkick.com, resolves to a loopback address, but
> we couldn't find any external IP address!
> 17/05/16 19:47:39 WARN RSCConf: Set livy.rsc.rpc.server.address if you
> need to bind to another address.
> 17/05/16 19:47:39 INFO InteractiveSessionManager: Registering new session
> 26
> 17/05/16 19:47:39 INFO ContextLauncher: WARNING: User-defined SPARK_HOME
> (/opt/cloudera/parcels/CDH-5.7.0-1.cdh5.7.0.p0.45/lib/spark) overrides
> detected (/opt/cloudera/parcels/CDH/lib/spark).
> 17/05/16 19:47:39 INFO ContextLauncher: WARNING: Running spark-class from
> user-defined location.
> 17/05/16 19:47:41 INFO ContextLauncher: 17/05/16 19:47:41 INFO
> yarn.Client: Requesting a new application from cluster with 12 NodeManagers
> 17/05/16 19:47:41 INFO ContextLauncher: 17/05/16 19:47:41 INFO
> yarn.Client: Verifying our application has not requested more than the
> maximum memory capability of the cluster (38912 MB per container)
> 17/05/16 19:47:41 INFO ContextLauncher: 17/05/16 19:47:41 INFO
> yarn.Client: Will allocate AM container, with 4505 MB memory including 409
> MB overhead
> 17/05/16 19:47:41 INFO ContextLauncher: 17/05/16 19:47:41 INFO
> yarn.Client: Setting up container launch context for our AM
> 17/05/16 19:47:41 INFO ContextLauncher: 17/05/16 19:47:41 INFO
> yarn.Client: Setting up the launch environment for our AM container
> 17/05/16 19:47:41 INFO ContextLauncher: 17/05/16 19:47:41 INFO
> yarn.Client: Preparing resources for our AM container
> 17/05/16 19:47:41 INFO ContextLauncher: 17/05/16 19:47:41 INFO
> yarn.Client: Uploading resource
> file:/services/livy-server/livy-server-current/rsc-jars/livy-api-0.3.0.jar
> ->
> hdfs://jarvis-nameservice001/user/hdfs/.sparkStaging/application_1494373289850_0386/livy-api-0.3.0.jar
> 17/05/16 19:47:41 INFO ContextLauncher: 17/05/16 19:47:41 INFO
> yarn.Client: Uploading resource
> file:/services/livy-server/livy-server-current/rsc-jars/livy-rsc-0.3.0.jar
> ->
> hdfs://jarvis-nameservice001/user/hdfs/.sparkStaging/application_1494373289850_0386/livy-rsc-0.3.0.jar
> 17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO
> yarn.Client: Uploading resource
> file:/services/livy-server/livy-server-current/rsc-jars/netty-all-4.0.29.Final.jar
> ->
> hdfs://jarvis-nameservice001/user/hdfs/.sparkStaging/application_1494373289850_0386/netty-all-4.0.29.Final.jar
> 17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO
> yarn.Client: Source and destination file systems are the same. Not copying
> hdfs://jarvis-nameservice001/jarvis_pipelines/vertica-jdbc-7.1.2-0.jar
> 17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO
> yarn.Client: Source and destination file systems are the same. Not copying
> hdfs://jarvis-nameservice001/jarvis_pipelines/shopkick-data-pipeline.jar
> 17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO
> yarn.Client: Uploading resource
> file:/services/livy-server/livy-server-current/repl_2.10-jars/commons-codec-1.9.jar
> ->
> hdfs://jarvis-nameservice001/user/hdfs/.sparkStaging/application_1494373289850_0386/commons-codec-1.9.jar
> 17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO
> yarn.Client: Uploading resource
> file:/services/livy-server/livy-server-current/repl_2.10-jars/livy-repl_2.10-0.3.0.jar
> ->
> hdfs://jarvis-nameservice001/user/hdfs/.sparkStaging/application_1494373289850_0386/livy-repl_2.10-0.3.0.jar
> 17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO
> yarn.Client: Uploading resource
> file:/services/livy-server/livy-server-current/repl_2.10-jars/livy-core_2.10-0.3.0.jar
> ->
> hdfs://jarvis-nameservice001/user/hdfs/.sparkStaging/application_1494373289850_0386/livy-core_2.10-0.3.0.jar
> 17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO
> yarn.Client: Uploading resource
> file:/tmp/spark-48e81b56-c827-4227-a969-45022ec71175/__spark_conf__6757250222634589720.zip
> ->
> hdfs://jarvis-nameservice001/user/hdfs/.sparkStaging/application_1494373289850_0386/__spark_conf__6757250222634589720.zip
> 17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO
> spark.SecurityManager: Changing view acls to: hdfs
> 17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO
> spark.SecurityManager: Changing modify acls to: hdfs
> 17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO
> spark.SecurityManager: SecurityManager: authentication disabled; ui acls
> disabled; users with view permissions: Set(hdfs); users with modify
> permissions: Set(hdfs)
> 17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO
> yarn.Client: Submitting application 386 to ResourceManager
> 17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO
> impl.YarnClientImpl: Submitted application application_1494373289850_0386
> 17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO
> yarn.Client: Application report for application_1494373289850_0386 (state:
> ACCEPTED)
> 17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO yarn.Client:
> 17/05/16 19:47:42 INFO ContextLauncher:          client token: N/A
> 17/05/16 19:47:42 INFO ContextLauncher:          diagnostics: N/A
> 17/05/16 19:47:42 INFO ContextLauncher:          ApplicationMaster host:
> N/A
> 17/05/16 19:47:42 INFO ContextLauncher:          ApplicationMaster RPC
> port: -1
> 17/05/16 19:47:42 INFO ContextLauncher:          queue: root.hdfs
> 17/05/16 19:47:42 INFO ContextLauncher:          start time: 1494964062698
> 17/05/16 19:47:42 INFO ContextLauncher:          final status: UNDEFINED
>
>
> On Mon, May 15, 2017 at 6:53 PM, Jeff Zhang <zj...@gmail.com> wrote:
>
>>
>> It is weird that the yarn app log shows the SQLContext is created
>> successfully, but on the zeppelin side it shows the error "Fail to create
>> SQLContext"
>>
>> Ben Vogan <be...@shopkick.com> wrote on Mon, May 15, 2017 at 8:07 PM:
>>
>>> I am using 0.7.1 and I checked the yarn app log and don't see any
>>> errors.  It looks like this:
>>>
>>> 17/05/16 00:04:12 INFO yarn.ApplicationMaster: Registered signal handlers for [TERM, HUP, INT]
>>> 17/05/16 00:04:13 INFO yarn.ApplicationMaster: ApplicationAttemptId: appattempt_1494373289850_0336_000001
>>> 17/05/16 00:04:13 INFO spark.SecurityManager: Changing view acls to: yarn,hdfs
>>> 17/05/16 00:04:13 INFO spark.SecurityManager: Changing modify acls to: yarn,hdfs
>>> 17/05/16 00:04:13 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn, hdfs); users with modify permissions: Set(yarn, hdfs)
>>> 17/05/16 00:04:13 INFO yarn.ApplicationMaster: Starting the user application in a separate Thread
>>> 17/05/16 00:04:13 INFO yarn.ApplicationMaster: Waiting for spark context initialization
>>> 17/05/16 00:04:13 INFO yarn.ApplicationMaster: Waiting for spark context initialization ...
>>> 17/05/16 00:04:14 INFO driver.RSCDriver: Connecting to: jarvis-hue002.internal.shopkick.com:40819
>>> 17/05/16 00:04:14 INFO driver.RSCDriver: Starting RPC server...
>>> 17/05/16 00:04:14 WARN rsc.RSCConf: Your hostname, jarvis-yarn008.internal.shopkick.com, resolves to a loopback address, but we couldn't find any external IP address!
>>> 17/05/16 00:04:14 WARN rsc.RSCConf: Set livy.rsc.rpc.server.address if you need to bind to another address.
>>> 17/05/16 00:04:14 INFO driver.RSCDriver: Received job request cd7d1356-709d-4674-a85c-21edade2c38d
>>> 17/05/16 00:04:14 INFO driver.RSCDriver: SparkContext not yet up, queueing job request.
>>> 17/05/16 00:04:17 INFO spark.SparkContext: Running Spark version 1.6.0
>>> 17/05/16 00:04:17 INFO spark.SecurityManager: Changing view acls to: yarn,hdfs
>>> 17/05/16 00:04:17 INFO spark.SecurityManager: Changing modify acls to: yarn,hdfs
>>> 17/05/16 00:04:17 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn, hdfs); users with modify permissions: Set(yarn, hdfs)
>>> 17/05/16 00:04:17 INFO util.Utils: Successfully started service 'sparkDriver' on port 53267.
>>> 17/05/16 00:04:18 INFO slf4j.Slf4jLogger: Slf4jLogger started
>>> 17/05/16 00:04:18 INFO Remoting: Starting remoting
>>> 17/05/16 00:04:18 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@10.19.194.147:38037]
>>> 17/05/16 00:04:18 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkDriverActorSystem@10.19.194.147:38037]
>>> 17/05/16 00:04:18 INFO util.Utils: Successfully started service 'sparkDriverActorSystem' on port 38037.
>>> 17/05/16 00:04:18 INFO spark.SparkEnv: Registering MapOutputTracker
>>> 17/05/16 00:04:18 INFO spark.SparkEnv: Registering BlockManagerMaster
>>> 17/05/16 00:04:18 INFO storage.DiskBlockManager: Created local directory at /yarn/nm/usercache/hdfs/appcache/application_1494373289850_0336/blockmgr-f46429a6-7466-42c1-bd79-9ddf6ec61cb4
>>> 17/05/16 00:04:18 INFO storage.MemoryStore: MemoryStore started with capacity 1966.1 MB
>>> 17/05/16 00:04:18 INFO spark.SparkEnv: Registering OutputCommitCoordinator
>>> 17/05/16 00:04:18 INFO ui.JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
>>> 17/05/16 00:04:18 INFO util.Utils: Successfully started service 'SparkUI' on port 49024.
>>> 17/05/16 00:04:18 INFO ui.SparkUI: Started SparkUI at http://10.19.194.147:49024
>>> 17/05/16 00:04:18 INFO spark.SparkContext: Added JAR file:/services/livy-server/livy-server-current/rsc-jars/livy-api-0.3.0.jar at spark://10.19.194.147:53267/jars/livy-api-0.3.0.jar with timestamp 1494893058608
>>> 17/05/16 00:04:18 INFO spark.SparkContext: Added JAR file:/services/livy-server/livy-server-current/rsc-jars/livy-rsc-0.3.0.jar at spark://10.19.194.147:53267/jars/livy-rsc-0.3.0.jar with timestamp 1494893058609
>>> 17/05/16 00:04:18 INFO spark.SparkContext: Added JAR file:/services/livy-server/livy-server-current/rsc-jars/netty-all-4.0.29.Final.jar at spark://10.19.194.147:53267/jars/netty-all-4.0.29.Final.jar with timestamp 1494893058609
>>> 17/05/16 00:04:18 INFO spark.SparkContext: Added JAR hdfs://jarvis-nameservice001/jarvis_pipelines/vertica-jdbc-7.1.2-0.jar at hdfs://jarvis-nameservice001/jarvis_pipelines/vertica-jdbc-7.1.2-0.jar with timestamp 1494893058609
>>> 17/05/16 00:04:18 INFO spark.SparkContext: Added JAR hdfs://jarvis-nameservice001/jarvis_pipelines/shopkick-data-pipeline.jar at hdfs://jarvis-nameservice001/jarvis_pipelines/shopkick-data-pipeline.jar with timestamp 1494893058609
>>> 17/05/16 00:04:18 INFO spark.SparkContext: Added JAR file:/services/livy-server/livy-server-current/repl_2.10-jars/commons-codec-1.9.jar at spark://10.19.194.147:53267/jars/commons-codec-1.9.jar with timestamp 1494893058609
>>> 17/05/16 00:04:18 INFO spark.SparkContext: Added JAR file:/services/livy-server/livy-server-current/repl_2.10-jars/livy-repl_2.10-0.3.0.jar at spark://10.19.194.147:53267/jars/livy-repl_2.10-0.3.0.jar with timestamp 1494893058609
>>> 17/05/16 00:04:18 INFO spark.SparkContext: Added JAR file:/services/livy-server/livy-server-current/repl_2.10-jars/livy-core_2.10-0.3.0.jar at spark://10.19.194.147:53267/jars/livy-core_2.10-0.3.0.jar with timestamp 1494893058609
>>> 17/05/16 00:04:18 INFO cluster.YarnClusterScheduler: Created YarnClusterScheduler
>>> 17/05/16 00:04:18 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 57551.
>>> 17/05/16 00:04:18 INFO netty.NettyBlockTransferService: Server created on 57551
>>> 17/05/16 00:04:18 INFO storage.BlockManager: external shuffle service port = 7337
>>> 17/05/16 00:04:18 INFO storage.BlockManagerMaster: Trying to register BlockManager
>>> 17/05/16 00:04:18 INFO storage.BlockManagerMasterEndpoint: Registering block manager 10.19.194.147:57551 with 1966.1 MB RAM, BlockManagerId(driver, 10.19.194.147, 57551)
>>> 17/05/16 00:04:18 INFO storage.BlockManagerMaster: Registered BlockManager
>>> 17/05/16 00:04:19 INFO scheduler.EventLoggingListener: Logging events to hdfs://jarvis-nameservice001/user/spark/applicationHistory/application_1494373289850_0336_1
>>> 17/05/16 00:04:19 INFO cluster.YarnClusterSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
>>> 17/05/16 00:04:19 INFO cluster.YarnClusterScheduler: YarnClusterScheduler.postStartHook done
>>> 17/05/16 00:04:19 INFO cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(spark://YarnAM@10.19.194.147:53267)
>>> 17/05/16 00:04:19 INFO yarn.YarnRMClient: Registering the ApplicationMaster
>>> 17/05/16 00:04:19 INFO yarn.ApplicationMaster: Started progress reporter thread with (heartbeat : 3000, initial allocation : 200) intervals
>>> 17/05/16 00:04:19 INFO hive.HiveContext: Initializing execution hive, version 1.1.0
>>> 17/05/16 00:04:19 INFO client.ClientWrapper: Inspected Hadoop version: 2.6.0-cdh5.7.0
>>> 17/05/16 00:04:19 INFO client.ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0-cdh5.7.0
>>> 17/05/16 00:04:20 INFO hive.metastore: Trying to connect to metastore with URI thrift://jarvis-hdfs003.internal.shopkick.com:9083
>>> 17/05/16 00:04:20 INFO hive.metastore: Opened a connection to metastore, current connections: 1
>>> 17/05/16 00:04:20 INFO hive.metastore: Connected to metastore.
>>> 17/05/16 00:04:20 INFO session.SessionState: Created HDFS directory: file:/yarn/nm/usercache/hdfs/appcache/application_1494373289850_0336/container_e14_1494373289850_0336_01_000001/tmp/spark-2217d267-a3c0-4cf4-9565-45f80517d41c/scratch/hdfs
>>> 17/05/16 00:04:20 INFO session.SessionState: Created local directory: /yarn/nm/usercache/hdfs/appcache/application_1494373289850_0336/container_e14_1494373289850_0336_01_000001/tmp/yarn
>>> 17/05/16 00:04:20 INFO session.SessionState: Created local directory: /yarn/nm/usercache/hdfs/appcache/application_1494373289850_0336/container_e14_1494373289850_0336_01_000001/tmp/478f39e9-5295-4e8e-97aa-40b5828f9440_resources
>>> 17/05/16 00:04:20 INFO session.SessionState: Created HDFS directory: file:/yarn/nm/usercache/hdfs/appcache/application_1494373289850_0336/container_e14_1494373289850_0336_01_000001/tmp/spark-2217d267-a3c0-4cf4-9565-45f80517d41c/scratch/hdfs/478f39e9-5295-4e8e-97aa-40b5828f9440
>>> 17/05/16 00:04:20 INFO session.SessionState: Created local directory: /yarn/nm/usercache/hdfs/appcache/application_1494373289850_0336/container_e14_1494373289850_0336_01_000001/tmp/yarn/478f39e9-5295-4e8e-97aa-40b5828f9440
>>> 17/05/16 00:04:20 INFO session.SessionState: Created HDFS directory: file:/yarn/nm/usercache/hdfs/appcache/application_1494373289850_0336/container_e14_1494373289850_0336_01_000001/tmp/spark-2217d267-a3c0-4cf4-9565-45f80517d41c/scratch/hdfs/478f39e9-5295-4e8e-97aa-40b5828f9440/_tmp_space.db
>>> 17/05/16 00:04:20 INFO session.SessionState: No Tez session required at this point. hive.execution.engine=mr.
>>> 17/05/16 00:04:20 INFO repl.SparkInterpreter: Created sql context (with Hive support).
>>>
>>>
>>> On Mon, May 15, 2017 at 5:43 PM, Jeff Zhang <zj...@gmail.com> wrote:
>>>
>>>>
>>>> Which version of zeppelin do you use? And can you check the yarn app
>>>> log?
>>>>
>>>>
>>>> Ben Vogan <be...@shopkick.com> wrote on Mon, May 15, 2017 at 5:56 PM:
>>>>
>>>>> Hi all,
>>>>>
>>>>> For some reason today I'm getting a stack:
>>>>>
>>>>> org.apache.zeppelin.livy.LivyException: Fail to create
>>>>> SQLContext,<console>:4: error: illegal inheritance;
>>>>> at
>>>>> org.apache.zeppelin.livy.LivySparkSQLInterpreter.open(LivySparkSQLInterpreter.java:76)
>>>>> at
>>>>> org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
>>>>> at
>>>>> org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:483)
>>>>> at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
>>>>> at
>>>>> org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
>>>>> at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>>>>> at
>>>>> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
>>>>> at
>>>>> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>>>>> at
>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>
>>>>> On the Livy server I see no errors and there is an open session on
>>>>> yarn.
>>>>>
>>>>> Some help on this would be greatly appreciated!
>>>>>
>>>>> --Ben
>>>>>
>>>>> On Sun, May 14, 2017 at 6:16 AM, Ben Vogan <be...@shopkick.com> wrote:
>>>>>
>>>>>> Hi all,
>>>>>>
>>>>>> I've been using Zeppelin for a couple of weeks now with a stable
>>>>>> configuration, but all of a sudden I am getting "Illegal inheritance"
>>>>>> errors like so:
>>>>>>
>>>>>>  INFO [2017-05-14 03:25:32,678] ({pool-2-thread-56}
>>>>>> Paragraph.java[jobRun]:362) - run paragraph 20170514-032326_663206142 using
>>>>>> livy org.apache.zeppelin.interpreter.LazyOpenInterpreter@505a171c
>>>>>>  WARN [2017-05-14 03:25:33,696] ({pool-2-thread-56}
>>>>>> NotebookServer.java[afterStatusChange]:2058) - Job
>>>>>> 20170514-032326_663206142 is finished, status: ERROR, exception: null,
>>>>>> result: %text <console>:4: error: illegal inheritance;
>>>>>>
>>>>>> It happens across multiple notebooks and across both my spark and livy
>>>>>> interpreters.  I don't know where to look for more information about what
>>>>>> is wrong.  I don't see any errors in spark/yarn at all.  The driver got
>>>>>> created, but it looks like no jobs were ever submitted to spark.
>>>>>>
>>>>>> Help would be greatly appreciated.
>>>>>>
>>>>>> Thanks,
>>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>
>>>
>>>
>>>
>>
>
>
>

Re: Illegal Inheritance error

Posted by Ben Vogan <be...@shopkick.com>.
I know - this is driving me crazy.  It was working fine and without me
touching any of it (zeppelin/livy/spark/yarn), it broke.  And with no
errors in spark/yarn or livy.  I see a warning in the livy log about the
hive-site.xml not being found.  In the interpreter configuration I have
tried setting livy.repl.enableHiveContext to both true and false and it
still appears to always create a hive context.  I'm not sure how to get rid
of that warning - I tried putting the hive-site.xml into the spark conf
directory on the livy host but that broke things entirely on the yarn side
- claimed the file didn't exist (maybe it would have to be put on all the
yarn executor hosts as well?).

*17/05/16 19:47:39 WARN InteractiveSession$: Enable HiveContext but no
hive-site.xml found under classpath or user request.*
17/05/16 19:47:39 INFO InteractiveSession$: Creating LivyClient for
sessionId: 26
17/05/16 19:47:39 WARN RSCConf: Your hostname,
jarvis-hue002.internal.shopkick.com, resolves to a loopback address, but we
couldn't find any external IP address!
17/05/16 19:47:39 WARN RSCConf: Set livy.rsc.rpc.server.address if you need
to bind to another address.
17/05/16 19:47:39 INFO InteractiveSessionManager: Registering new session 26
17/05/16 19:47:39 INFO ContextLauncher: WARNING: User-defined SPARK_HOME
(/opt/cloudera/parcels/CDH-5.7.0-1.cdh5.7.0.p0.45/lib/spark) overrides
detected (/opt/cloudera/parcels/CDH/lib/spark).
17/05/16 19:47:39 INFO ContextLauncher: WARNING: Running spark-class from
user-defined location.
17/05/16 19:47:41 INFO ContextLauncher: 17/05/16 19:47:41 INFO yarn.Client:
Requesting a new application from cluster with 12 NodeManagers
17/05/16 19:47:41 INFO ContextLauncher: 17/05/16 19:47:41 INFO yarn.Client:
Verifying our application has not requested more than the maximum memory
capability of the cluster (38912 MB per container)
17/05/16 19:47:41 INFO ContextLauncher: 17/05/16 19:47:41 INFO yarn.Client:
Will allocate AM container, with 4505 MB memory including 409 MB overhead
17/05/16 19:47:41 INFO ContextLauncher: 17/05/16 19:47:41 INFO yarn.Client:
Setting up container launch context for our AM
17/05/16 19:47:41 INFO ContextLauncher: 17/05/16 19:47:41 INFO yarn.Client:
Setting up the launch environment for our AM container
17/05/16 19:47:41 INFO ContextLauncher: 17/05/16 19:47:41 INFO yarn.Client:
Preparing resources for our AM container
17/05/16 19:47:41 INFO ContextLauncher: 17/05/16 19:47:41 INFO yarn.Client:
Uploading resource
file:/services/livy-server/livy-server-current/rsc-jars/livy-api-0.3.0.jar
->
hdfs://jarvis-nameservice001/user/hdfs/.sparkStaging/application_1494373289850_0386/livy-api-0.3.0.jar
17/05/16 19:47:41 INFO ContextLauncher: 17/05/16 19:47:41 INFO yarn.Client:
Uploading resource
file:/services/livy-server/livy-server-current/rsc-jars/livy-rsc-0.3.0.jar
->
hdfs://jarvis-nameservice001/user/hdfs/.sparkStaging/application_1494373289850_0386/livy-rsc-0.3.0.jar
17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO yarn.Client:
Uploading resource
file:/services/livy-server/livy-server-current/rsc-jars/netty-all-4.0.29.Final.jar
->
hdfs://jarvis-nameservice001/user/hdfs/.sparkStaging/application_1494373289850_0386/netty-all-4.0.29.Final.jar
17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO yarn.Client:
Source and destination file systems are the same. Not copying
hdfs://jarvis-nameservice001/jarvis_pipelines/vertica-jdbc-7.1.2-0.jar
17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO yarn.Client:
Source and destination file systems are the same. Not copying
hdfs://jarvis-nameservice001/jarvis_pipelines/shopkick-data-pipeline.jar
17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO yarn.Client:
Uploading resource
file:/services/livy-server/livy-server-current/repl_2.10-jars/commons-codec-1.9.jar
->
hdfs://jarvis-nameservice001/user/hdfs/.sparkStaging/application_1494373289850_0386/commons-codec-1.9.jar
17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO yarn.Client:
Uploading resource
file:/services/livy-server/livy-server-current/repl_2.10-jars/livy-repl_2.10-0.3.0.jar
->
hdfs://jarvis-nameservice001/user/hdfs/.sparkStaging/application_1494373289850_0386/livy-repl_2.10-0.3.0.jar
17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO yarn.Client:
Uploading resource
file:/services/livy-server/livy-server-current/repl_2.10-jars/livy-core_2.10-0.3.0.jar
->
hdfs://jarvis-nameservice001/user/hdfs/.sparkStaging/application_1494373289850_0386/livy-core_2.10-0.3.0.jar
17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO yarn.Client:
Uploading resource
file:/tmp/spark-48e81b56-c827-4227-a969-45022ec71175/__spark_conf__6757250222634589720.zip
->
hdfs://jarvis-nameservice001/user/hdfs/.sparkStaging/application_1494373289850_0386/__spark_conf__6757250222634589720.zip
17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO
spark.SecurityManager: Changing view acls to: hdfs
17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO
spark.SecurityManager: Changing modify acls to: hdfs
17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO
spark.SecurityManager: SecurityManager: authentication disabled; ui acls
disabled; users with view permissions: Set(hdfs); users with modify
permissions: Set(hdfs)
17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO yarn.Client:
Submitting application 386 to ResourceManager
17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO
impl.YarnClientImpl: Submitted application application_1494373289850_0386
17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO yarn.Client:
Application report for application_1494373289850_0386 (state: ACCEPTED)
17/05/16 19:47:42 INFO ContextLauncher: 17/05/16 19:47:42 INFO yarn.Client:
17/05/16 19:47:42 INFO ContextLauncher:          client token: N/A
17/05/16 19:47:42 INFO ContextLauncher:          diagnostics: N/A
17/05/16 19:47:42 INFO ContextLauncher:          ApplicationMaster host: N/A
17/05/16 19:47:42 INFO ContextLauncher:          ApplicationMaster RPC
port: -1
17/05/16 19:47:42 INFO ContextLauncher:          queue: root.hdfs
17/05/16 19:47:42 INFO ContextLauncher:          start time: 1494964062698
17/05/16 19:47:42 INFO ContextLauncher:          final status: UNDEFINED
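For reference, the property Ben mentions toggling can be set either in
Zeppelin's interpreter settings for the livy group or in livy.conf on the
Livy server; an illustrative fragment (the value shown is an example, not
taken from this cluster):

```properties
# Whether the Livy REPL should create a HiveContext for sessions.
# Set in Zeppelin's livy interpreter properties or in livy.conf.
livy.repl.enableHiveContext = false
```

The warning at the top of the log above indicates the session still
attempted to enable a HiveContext but found no hive-site.xml on the
classpath.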


On Mon, May 15, 2017 at 6:53 PM, Jeff Zhang <zj...@gmail.com> wrote:

>
> It is weird that the yarn app log shows the SQLContext is created
> successfully, but on the zeppelin side it shows the error "Fail to create
> SQLContext"
>
> Ben Vogan <be...@shopkick.com> wrote on Mon, May 15, 2017 at 8:07 PM:
>
>> I am using 0.7.1 and I checked the yarn app log and don't see any
>> errors.  It looks like this:
>>
>> 17/05/16 00:04:12 INFO yarn.ApplicationMaster: Registered signal handlers for [TERM, HUP, INT]
>> 17/05/16 00:04:13 INFO yarn.ApplicationMaster: ApplicationAttemptId: appattempt_1494373289850_0336_000001
>> 17/05/16 00:04:13 INFO spark.SecurityManager: Changing view acls to: yarn,hdfs
>> 17/05/16 00:04:13 INFO spark.SecurityManager: Changing modify acls to: yarn,hdfs
>> 17/05/16 00:04:13 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn, hdfs); users with modify permissions: Set(yarn, hdfs)
>> 17/05/16 00:04:13 INFO yarn.ApplicationMaster: Starting the user application in a separate Thread
>> 17/05/16 00:04:13 INFO yarn.ApplicationMaster: Waiting for spark context initialization
>> 17/05/16 00:04:13 INFO yarn.ApplicationMaster: Waiting for spark context initialization ...
>> 17/05/16 00:04:14 INFO driver.RSCDriver: Connecting to: jarvis-hue002.internal.shopkick.com:40819
>> 17/05/16 00:04:14 INFO driver.RSCDriver: Starting RPC server...
>> 17/05/16 00:04:14 WARN rsc.RSCConf: Your hostname, jarvis-yarn008.internal.shopkick.com, resolves to a loopback address, but we couldn't find any external IP address!
>> [... yarn application log snipped; it appears in full in Ben's message further down this page ...]
>>
>>
>> On Mon, May 15, 2017 at 5:43 PM, Jeff Zhang <zj...@gmail.com> wrote:
>>
>>>
>>> Which version of zeppelin do you use ? And can you check the yarn app
>>> log ?
>>>
>>>
>>> Ben Vogan <be...@shopkick.com> wrote on Mon, May 15, 2017 at 5:56 PM:
>>>
>>>> Hi all,
>>>>
>>>> For some reason today I'm getting a stack:
>>>>
>>>> org.apache.zeppelin.livy.LivyException: Fail to create SQLContext,<console>:4: error: illegal inheritance;
>>>> at org.apache.zeppelin.livy.LivySparkSQLInterpreter.open(LivySparkSQLInterpreter.java:76)
>>>> at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
>>>> at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:483)
>>>> at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
>>>> at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
>>>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>>>> at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
>>>> at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
>>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
>>>> On the Livy server I see no errors and there is an open session on yarn.
>>>>
>>>> Some help on this would be greatly appreciated!
>>>>
>>>> --Ben
>>>>
>>>> On Sun, May 14, 2017 at 6:16 AM, Ben Vogan <be...@shopkick.com> wrote:
>>>>
>>>>> Hi all,
>>>>>
>>>>> I've been using Zeppelin for a couple of weeks now with a stable
>>>>> configuration, but all of a sudden I am getting "Illegal inheritance"
>>>>> errors like so:
>>>>>
>>>>>  INFO [2017-05-14 03:25:32,678] ({pool-2-thread-56}
>>>>> Paragraph.java[jobRun]:362) - run paragraph 20170514-032326_663206142 using
>>>>> livy org.apache.zeppelin.interpreter.LazyOpenInterpreter@505a171c
>>>>>  WARN [2017-05-14 03:25:33,696] ({pool-2-thread-56}
>>>>> NotebookServer.java[afterStatusChange]:2058) - Job
>>>>> 20170514-032326_663206142 is finished, status: ERROR, exception: null,
>>>>> result: %text <console>:4: error: illegal inheritance;
>>>>>
>>>>> It happens across multiple notebooks and across both my spark and livy
>>>>> interpreters.  I don't know where to look for more information about what
>>>>> is wrong.  I don't see any errors in spark/yarn at all.  The driver got
>>>>> created, but it looks like no jobs were ever submitted to spark.
>>>>>
>>>>> Help would be greatly appreciated.
>>>>>
>>>>> Thanks,
>>>>>
>>>>> --
>>>>> *BENJAMIN VOGAN* | Data Platform Team Lead
>>>>>
>>>>> <http://www.shopkick.com/>
>>>>> <https://www.facebook.com/shopkick>
>>>>> <https://www.instagram.com/shopkick/>
>>>>> <https://www.pinterest.com/shopkick/>
>>>>> <https://twitter.com/shopkickbiz>
>>>>> <https://www.linkedin.com/company-beta/831240/?pathWildcard=831240>
>>>>>
>>>>
>>>>
>>>>
>>>>
>>>
>>
>>
>>
>



Re: Illegal Inheritance error

Posted by Jeff Zhang <zj...@gmail.com>.
It is weird that the yarn app log shows the SQLContext is created
successfully, but on the Zeppelin side it reports "Fail to create
SQLContext".
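
One way to reconcile the two views is to ask Livy directly what state it thinks the session is in. Livy 0.3 exposes a REST API (`GET /sessions` lists sessions; `GET /sessions/{id}/log` returns captured output). The sketch below runs against a canned payload shaped like that response rather than a live server; the field names (`state`, `appId`, `log`) follow the Livy REST API docs, and the sample values are illustrative, not real server output.

```python
import json

# Canned payload shaped like Livy 0.3's GET /sessions response
# (illustrative data only -- not output from a real server).
sample = json.loads("""
{"from": 0, "total": 2, "sessions": [
  {"id": 0, "state": "idle", "appId": "application_1494373289850_0335", "log": []},
  {"id": 1, "state": "error", "appId": "application_1494373289850_0336",
   "log": ["<console>:4: error: illegal inheritance;"]}
]}
""")

def unhealthy_sessions(payload):
    """Return (id, appId, last log line) for each session in a bad state."""
    bad = []
    for s in payload["sessions"]:
        if s["state"] in ("error", "dead", "killed"):
            last_line = s["log"][-1] if s["log"] else ""
            bad.append((s["id"], s["appId"], last_line))
    return bad

for sid, app_id, line in unhealthy_sessions(sample):
    print(f"session {sid} ({app_id}): {line}")
```

Against a live server you would fetch the JSON with `urllib.request.urlopen("http://<livy-host>:8998/sessions")` (8998 is Livy's default port) instead of using the canned payload.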

Ben Vogan <be...@shopkick.com> wrote on Mon, May 15, 2017 at 8:07 PM:

> I am using 0.7.1 and I checked the yarn app log and don't see any errors.
> It looks like this:
>
> [... yarn application log snipped; quoted in full in Ben's own message below ...]
>
>
> On Mon, May 15, 2017 at 5:43 PM, Jeff Zhang <zj...@gmail.com> wrote:
>
>>
>> Which version of zeppelin do you use ? And can you check the yarn app log
>> ?
>>
>>
>> Ben Vogan <be...@shopkick.com> wrote on Mon, May 15, 2017 at 5:56 PM:
>>
>>> Hi all,
>>>
>>> For some reason today I'm getting a stack:
>>>
>>> org.apache.zeppelin.livy.LivyException: Fail to create SQLContext,<console>:4: error: illegal inheritance;
>>> at org.apache.zeppelin.livy.LivySparkSQLInterpreter.open(LivySparkSQLInterpreter.java:76)
>>> at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
>>> at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:483)
>>> at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
>>> at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
>>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>>> at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
>>> at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> On the Livy server I see no errors and there is an open session on yarn.
>>>
>>> Some help on this would be greatly appreciated!
>>>
>>> --Ben
>>>
>>> On Sun, May 14, 2017 at 6:16 AM, Ben Vogan <be...@shopkick.com> wrote:
>>>
>>>> Hi all,
>>>>
>>>> I've been using Zeppelin for a couple of weeks now with a stable
>>>> configuration, but all of a sudden I am getting "Illegal inheritance"
>>>> errors like so:
>>>>
>>>>  INFO [2017-05-14 03:25:32,678] ({pool-2-thread-56}
>>>> Paragraph.java[jobRun]:362) - run paragraph 20170514-032326_663206142 using
>>>> livy org.apache.zeppelin.interpreter.LazyOpenInterpreter@505a171c
>>>>  WARN [2017-05-14 03:25:33,696] ({pool-2-thread-56}
>>>> NotebookServer.java[afterStatusChange]:2058) - Job
>>>> 20170514-032326_663206142 is finished, status: ERROR, exception: null,
>>>> result: %text <console>:4: error: illegal inheritance;
>>>>
>>>> It happens across multiple notebooks and across both my spark and livy
>>>> interpreters.  I don't know where to look for more information about what
>>>> is wrong.  I don't see any errors in spark/yarn at all.  The driver got
>>>> created, but it looks like no jobs were ever submitted to spark.
>>>>
>>>> Help would be greatly appreciated.
>>>>
>>>> Thanks,
>>>>
>>>>
>>>
>>>
>>>
>>>
>>
>
>
>
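
The thread never shows the full compiler message; in Scala, `illegal inheritance` normally comes with a second line naming the offending class and its self-type, and that detail tends to land in the Zeppelin interpreter logs rather than the YARN application log. A small sketch for pulling it out; the log directory and the `zeppelin-interpreter-*.log` naming are assumptions based on a default Zeppelin install, so adjust them for your deployment.

```shell
#!/bin/sh
# Search the Zeppelin log directory for the full "illegal inheritance"
# compiler error, with a few lines of context after each hit.
# ZEPPELIN_LOG_DIR and the file-name pattern are assumptions from a
# default install -- adjust for your deployment.
ZEPPELIN_LOG_DIR="${ZEPPELIN_LOG_DIR:-/var/log/zeppelin}"
grep -n -A3 "illegal inheritance" \
  "$ZEPPELIN_LOG_DIR"/zeppelin-interpreter-*.log 2>/dev/null \
  || echo "no 'illegal inheritance' lines under $ZEPPELIN_LOG_DIR"
```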

Re: Illegal Inheritance error

Posted by Ben Vogan <be...@shopkick.com>.
I am using 0.7.1 and I checked the yarn app log and don't see any errors.
It looks like this:

17/05/16 00:04:12 INFO yarn.ApplicationMaster: Registered signal handlers for [TERM, HUP, INT]
17/05/16 00:04:13 INFO yarn.ApplicationMaster: ApplicationAttemptId: appattempt_1494373289850_0336_000001
17/05/16 00:04:13 INFO spark.SecurityManager: Changing view acls to: yarn,hdfs
17/05/16 00:04:13 INFO spark.SecurityManager: Changing modify acls to: yarn,hdfs
17/05/16 00:04:13 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn, hdfs); users with modify permissions: Set(yarn, hdfs)
17/05/16 00:04:13 INFO yarn.ApplicationMaster: Starting the user application in a separate Thread
17/05/16 00:04:13 INFO yarn.ApplicationMaster: Waiting for spark context initialization
17/05/16 00:04:13 INFO yarn.ApplicationMaster: Waiting for spark context initialization ...
17/05/16 00:04:14 INFO driver.RSCDriver: Connecting to: jarvis-hue002.internal.shopkick.com:40819
17/05/16 00:04:14 INFO driver.RSCDriver: Starting RPC server...
17/05/16 00:04:14 WARN rsc.RSCConf: Your hostname, jarvis-yarn008.internal.shopkick.com, resolves to a loopback address, but we couldn't find any external IP address!
17/05/16 00:04:14 WARN rsc.RSCConf: Set livy.rsc.rpc.server.address if you need to bind to another address.
17/05/16 00:04:14 INFO driver.RSCDriver: Received job request cd7d1356-709d-4674-a85c-21edade2c38d
17/05/16 00:04:14 INFO driver.RSCDriver: SparkContext not yet up, queueing job request.
17/05/16 00:04:17 INFO spark.SparkContext: Running Spark version 1.6.0
17/05/16 00:04:17 INFO spark.SecurityManager: Changing view acls to: yarn,hdfs
17/05/16 00:04:17 INFO spark.SecurityManager: Changing modify acls to: yarn,hdfs
17/05/16 00:04:17 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn, hdfs); users with modify permissions: Set(yarn, hdfs)
17/05/16 00:04:17 INFO util.Utils: Successfully started service 'sparkDriver' on port 53267.
17/05/16 00:04:18 INFO slf4j.Slf4jLogger: Slf4jLogger started
17/05/16 00:04:18 INFO Remoting: Starting remoting
17/05/16 00:04:18 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@10.19.194.147:38037]
17/05/16 00:04:18 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkDriverActorSystem@10.19.194.147:38037]
17/05/16 00:04:18 INFO util.Utils: Successfully started service 'sparkDriverActorSystem' on port 38037.
17/05/16 00:04:18 INFO spark.SparkEnv: Registering MapOutputTracker
17/05/16 00:04:18 INFO spark.SparkEnv: Registering BlockManagerMaster
17/05/16 00:04:18 INFO storage.DiskBlockManager: Created local directory at /yarn/nm/usercache/hdfs/appcache/application_1494373289850_0336/blockmgr-f46429a6-7466-42c1-bd79-9ddf6ec61cb4
17/05/16 00:04:18 INFO storage.MemoryStore: MemoryStore started with capacity 1966.1 MB
17/05/16 00:04:18 INFO spark.SparkEnv: Registering OutputCommitCoordinator
17/05/16 00:04:18 INFO ui.JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
17/05/16 00:04:18 INFO util.Utils: Successfully started service 'SparkUI' on port 49024.
17/05/16 00:04:18 INFO ui.SparkUI: Started SparkUI at http://10.19.194.147:49024
17/05/16 00:04:18 INFO spark.SparkContext: Added JAR file:/services/livy-server/livy-server-current/rsc-jars/livy-api-0.3.0.jar at spark://10.19.194.147:53267/jars/livy-api-0.3.0.jar with timestamp 1494893058608
17/05/16 00:04:18 INFO spark.SparkContext: Added JAR file:/services/livy-server/livy-server-current/rsc-jars/livy-rsc-0.3.0.jar at spark://10.19.194.147:53267/jars/livy-rsc-0.3.0.jar with timestamp 1494893058609
17/05/16 00:04:18 INFO spark.SparkContext: Added JAR file:/services/livy-server/livy-server-current/rsc-jars/netty-all-4.0.29.Final.jar at spark://10.19.194.147:53267/jars/netty-all-4.0.29.Final.jar with timestamp 1494893058609
17/05/16 00:04:18 INFO spark.SparkContext: Added JAR hdfs://jarvis-nameservice001/jarvis_pipelines/vertica-jdbc-7.1.2-0.jar at hdfs://jarvis-nameservice001/jarvis_pipelines/vertica-jdbc-7.1.2-0.jar with timestamp 1494893058609
17/05/16 00:04:18 INFO spark.SparkContext: Added JAR hdfs://jarvis-nameservice001/jarvis_pipelines/shopkick-data-pipeline.jar at hdfs://jarvis-nameservice001/jarvis_pipelines/shopkick-data-pipeline.jar with timestamp 1494893058609
17/05/16 00:04:18 INFO spark.SparkContext: Added JAR file:/services/livy-server/livy-server-current/repl_2.10-jars/commons-codec-1.9.jar at spark://10.19.194.147:53267/jars/commons-codec-1.9.jar with timestamp 1494893058609
17/05/16 00:04:18 INFO spark.SparkContext: Added JAR file:/services/livy-server/livy-server-current/repl_2.10-jars/livy-repl_2.10-0.3.0.jar at spark://10.19.194.147:53267/jars/livy-repl_2.10-0.3.0.jar with timestamp 1494893058609
17/05/16 00:04:18 INFO spark.SparkContext: Added JAR file:/services/livy-server/livy-server-current/repl_2.10-jars/livy-core_2.10-0.3.0.jar at spark://10.19.194.147:53267/jars/livy-core_2.10-0.3.0.jar with timestamp 1494893058609
17/05/16 00:04:18 INFO cluster.YarnClusterScheduler: Created YarnClusterScheduler
17/05/16 00:04:18 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 57551.
17/05/16 00:04:18 INFO netty.NettyBlockTransferService: Server created on 57551
17/05/16 00:04:18 INFO storage.BlockManager: external shuffle service port = 7337
17/05/16 00:04:18 INFO storage.BlockManagerMaster: Trying to register BlockManager
17/05/16 00:04:18 INFO storage.BlockManagerMasterEndpoint: Registering block manager 10.19.194.147:57551 with 1966.1 MB RAM, BlockManagerId(driver, 10.19.194.147, 57551)
17/05/16 00:04:18 INFO storage.BlockManagerMaster: Registered BlockManager
17/05/16 00:04:19 INFO scheduler.EventLoggingListener: Logging events to hdfs://jarvis-nameservice001/user/spark/applicationHistory/application_1494373289850_0336_1
17/05/16 00:04:19 INFO cluster.YarnClusterSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
17/05/16 00:04:19 INFO cluster.YarnClusterScheduler: YarnClusterScheduler.postStartHook done
17/05/16 00:04:19 INFO cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(spark://YarnAM@10.19.194.147:53267)
17/05/16 00:04:19 INFO yarn.YarnRMClient: Registering the ApplicationMaster
17/05/16 00:04:19 INFO yarn.ApplicationMaster: Started progress reporter thread with (heartbeat : 3000, initial allocation : 200) intervals
17/05/16 00:04:19 INFO hive.HiveContext: Initializing execution hive, version 1.1.0
17/05/16 00:04:19 INFO client.ClientWrapper: Inspected Hadoop version: 2.6.0-cdh5.7.0
17/05/16 00:04:19 INFO client.ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0-cdh5.7.0
17/05/16 00:04:20 INFO hive.metastore: Trying to connect to metastore with URI thrift://jarvis-hdfs003.internal.shopkick.com:9083
17/05/16 00:04:20 INFO hive.metastore: Opened a connection to metastore, current connections: 1
17/05/16 00:04:20 INFO hive.metastore: Connected to metastore.
17/05/16 00:04:20 INFO session.SessionState: Created HDFS directory: file:/yarn/nm/usercache/hdfs/appcache/application_1494373289850_0336/container_e14_1494373289850_0336_01_000001/tmp/spark-2217d267-a3c0-4cf4-9565-45f80517d41c/scratch/hdfs
17/05/16 00:04:20 INFO session.SessionState: Created local directory: /yarn/nm/usercache/hdfs/appcache/application_1494373289850_0336/container_e14_1494373289850_0336_01_000001/tmp/yarn
17/05/16 00:04:20 INFO session.SessionState: Created local directory: /yarn/nm/usercache/hdfs/appcache/application_1494373289850_0336/container_e14_1494373289850_0336_01_000001/tmp/478f39e9-5295-4e8e-97aa-40b5828f9440_resources
17/05/16 00:04:20 INFO session.SessionState: Created HDFS directory: file:/yarn/nm/usercache/hdfs/appcache/application_1494373289850_0336/container_e14_1494373289850_0336_01_000001/tmp/spark-2217d267-a3c0-4cf4-9565-45f80517d41c/scratch/hdfs/478f39e9-5295-4e8e-97aa-40b5828f9440
17/05/16 00:04:20 INFO session.SessionState: Created local directory: /yarn/nm/usercache/hdfs/appcache/application_1494373289850_0336/container_e14_1494373289850_0336_01_000001/tmp/yarn/478f39e9-5295-4e8e-97aa-40b5828f9440
17/05/16 00:04:20 INFO session.SessionState: Created HDFS directory: file:/yarn/nm/usercache/hdfs/appcache/application_1494373289850_0336/container_e14_1494373289850_0336_01_000001/tmp/spark-2217d267-a3c0-4cf4-9565-45f80517d41c/scratch/hdfs/478f39e9-5295-4e8e-97aa-40b5828f9440/_tmp_space.db
17/05/16 00:04:20 INFO session.SessionState: No Tez session required at this point. hive.execution.engine=mr.
17/05/16 00:04:20 INFO repl.SparkInterpreter: Created sql context (with Hive support).


On Mon, May 15, 2017 at 5:43 PM, Jeff Zhang <zj...@gmail.com> wrote:

>
> Which version of zeppelin do you use ? And can you check the yarn app log ?
>
>
> Ben Vogan <be...@shopkick.com> wrote on Mon, May 15, 2017 at 5:56 PM:
>
>> Hi all,
>>
>> For some reason today I'm getting a stack:
>>
>> org.apache.zeppelin.livy.LivyException: Fail to create
>> SQLContext,<console>:4: error: illegal inheritance;
>> at org.apache.zeppelin.livy.LivySparkSQLInterpreter.open(LivySparkSQLInterpreter.java:76)
>> at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
>> at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:483)
>> at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
>> at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>> at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
>> at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>> at java.lang.Thread.run(Thread.java:745)
>>
>> On the Livy server I see no errors and there is an open session on yarn.
>>
>> Some help on this would be greatly appreciated!
>>
>> --Ben
>>
>> On Sun, May 14, 2017 at 6:16 AM, Ben Vogan <be...@shopkick.com> wrote:
>>
>>> Hi all,
>>>
>>> I've been using Zeppelin for a couple of weeks now with a stable
>>> configuration, but all of a sudden I am getting "Illegal inheritance"
>>> errors like so:
>>>
>>>  INFO [2017-05-14 03:25:32,678] ({pool-2-thread-56}
>>> Paragraph.java[jobRun]:362) - run paragraph 20170514-032326_663206142 using
>>> livy org.apache.zeppelin.interpreter.LazyOpenInterpreter@505a171c
>>>  WARN [2017-05-14 03:25:33,696] ({pool-2-thread-56} NotebookServer.java[afterStatusChange]:2058)
>>> - Job 20170514-032326_663206142 is finished, status: ERROR, exception:
>>> null, result: %text <console>:4: error: illegal inheritance;
>>>
>>> It happens across multiple notebooks and across both my spark and livy
>>> interpreters.  I don't know where to look for more information about what
>>> is wrong.  I don't see any errors in spark/yarn at all.  The driver got
>>> created, but it looks like no jobs were ever submitted to spark.
>>>
>>> Help would be greatly appreciated.
>>>
>>> Thanks,
>>>
>>> --
>>> *BENJAMIN VOGAN* | Data Platform Team Lead
>>>
>>> <http://www.shopkick.com/>
>>> <https://www.facebook.com/shopkick>
>>> <https://www.instagram.com/shopkick/>
>>> <https://www.pinterest.com/shopkick/> <https://twitter.com/shopkickbiz>
>>> <https://www.linkedin.com/company-beta/831240/?pathWildcard=831240>
>>>
>>
>>
>>
>> --
>> *BENJAMIN VOGAN* | Data Platform Team Lead
>>
>>
>


-- 
*BENJAMIN VOGAN* | Data Platform Team Lead


Re: Illegal Inheritance error

Posted by Jeff Zhang <zj...@gmail.com>.
Which version of zeppelin do you use ? And can you check the yarn app log ?


Ben Vogan <be...@shopkick.com> wrote on Mon, May 15, 2017 at 5:56 PM:

> Hi all,
>
> For some reason today I'm getting a stack:
>
> org.apache.zeppelin.livy.LivyException: Fail to create
> SQLContext,<console>:4: error: illegal inheritance;
> at org.apache.zeppelin.livy.LivySparkSQLInterpreter.open(LivySparkSQLInterpreter.java:76)
> at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
> at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:483)
> at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
> at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
> at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
>
> On the Livy server I see no errors and there is an open session on yarn.
>
> Some help on this would be greatly appreciated!
>
> --Ben
>
> On Sun, May 14, 2017 at 6:16 AM, Ben Vogan <be...@shopkick.com> wrote:
>
>> Hi all,
>>
>> I've been using Zeppelin for a couple of weeks now with a stable
>> configuration, but all of a sudden I am getting "Illegal inheritance"
>> errors like so:
>>
>>  INFO [2017-05-14 03:25:32,678] ({pool-2-thread-56}
>> Paragraph.java[jobRun]:362) - run paragraph 20170514-032326_663206142 using
>> livy org.apache.zeppelin.interpreter.LazyOpenInterpreter@505a171c
>>  WARN [2017-05-14 03:25:33,696] ({pool-2-thread-56}
>> NotebookServer.java[afterStatusChange]:2058) - Job
>> 20170514-032326_663206142 is finished, status: ERROR, exception: null,
>> result: %text <console>:4: error: illegal inheritance;
>>
>> It happens across multiple notebooks and across both my spark and livy
>> interpreters.  I don't know where to look for more information about what
>> is wrong.  I don't see any errors in spark/yarn at all.  The driver got
>> created, but it looks like no jobs were ever submitted to spark.
>>
>> Help would be greatly appreciated.
>>
>> Thanks,
>>
>> --
>> *BENJAMIN VOGAN* | Data Platform Team Lead
>>
>>
>
>
>
> --
> *BENJAMIN VOGAN* | Data Platform Team Lead
>
>

Re: Illegal Inheritance error

Posted by Ben Vogan <be...@shopkick.com>.
Hi all,

For some reason today I'm getting a stack:

org.apache.zeppelin.livy.LivyException: Fail to create
SQLContext,<console>:4: error: illegal inheritance;
at org.apache.zeppelin.livy.LivySparkSQLInterpreter.open(LivySparkSQLInterpreter.java:76)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:483)
at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

On the Livy server I see no errors and there is an open session on yarn.

Some help on this would be greatly appreciated!

--Ben

On Sun, May 14, 2017 at 6:16 AM, Ben Vogan <be...@shopkick.com> wrote:

> Hi all,
>
> I've been using Zeppelin for a couple of weeks now with a stable
> configuration, but all of a sudden I am getting "Illegal inheritance"
> errors like so:
>
>  INFO [2017-05-14 03:25:32,678] ({pool-2-thread-56}
> Paragraph.java[jobRun]:362) - run paragraph 20170514-032326_663206142 using
> livy org.apache.zeppelin.interpreter.LazyOpenInterpreter@505a171c
>  WARN [2017-05-14 03:25:33,696] ({pool-2-thread-56} NotebookServer.java[afterStatusChange]:2058)
> - Job 20170514-032326_663206142 is finished, status: ERROR, exception:
> null, result: %text <console>:4: error: illegal inheritance;
>
> It happens across multiple notebooks and across both my spark and livy
> interpreters.  I don't know where to look for more information about what
> is wrong.  I don't see any errors in spark/yarn at all.  The driver got
> created, but it looks like no jobs were ever submitted to spark.
>
> Help would be greatly appreciated.
>
> Thanks,
>
> --
> *BENJAMIN VOGAN* | Data Platform Team Lead
>
>



-- 
*BENJAMIN VOGAN* | Data Platform Team Lead
