Posted to issues@ambari.apache.org by "Eric Yang (JIRA)" <ji...@apache.org> on 2017/08/15 15:26:00 UTC

[jira] [Resolved] (AMBARI-21608) Spark shell is not working after upgrade

     [ https://issues.apache.org/jira/browse/AMBARI-21608?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Eric Yang resolved AMBARI-21608.
--------------------------------
    Resolution: Implemented

The root cause is that Hive cannot locate the JAR files listed on HIVE_AUX_JARS_PATH.  Hive then throws a write error, which surfaces as a permission-denied failure.  This issue is resolved by the fix for AMBARI-21607.
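
For reference, a quick way to see whether the upgrade left stale entries on the path is to list each one and check that it still exists.  A minimal diagnostic sketch (the class name and separator handling are assumptions, not Ambari or Hive code; it assumes HIVE_AUX_JARS_PATH is exported in the environment, e.g. via hive-env.sh):

{code}
import java.io.File;

public class CheckAuxJarsPath {
    public static void main(String[] args) {
        String path = System.getenv("HIVE_AUX_JARS_PATH");
        if (path == null || path.isEmpty()) {
            System.out.println("HIVE_AUX_JARS_PATH is not set");
            return;
        }
        // Entries may be comma- or colon-separated depending on how the
        // variable was assembled; split on both and report missing files.
        for (String entry : path.split("[,:]")) {
            File f = new File(entry.trim());
            System.out.printf("%-60s %s%n", entry, f.exists() ? "OK" : "MISSING");
        }
    }
}
{code}

Any MISSING entry after an IOP-to-HDP upgrade points at the stale path configuration that AMBARI-21607 corrects.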

> Spark shell is not working after upgrade
> ----------------------------------------
>
>                 Key: AMBARI-21608
>                 URL: https://issues.apache.org/jira/browse/AMBARI-21608
>             Project: Ambari
>          Issue Type: Bug
>    Affects Versions: 2.5.2
>            Reporter: Eric Yang
>             Fix For: 2.5.2
>
>
> Spark shell does not work after an IOP to HDP upgrade.  This error message shows up when running spark-shell (a minimal sketch of the failing call follows the trace):
> {code}
> 17/07/28 20:50:46 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
> 17/07/28 20:50:46 INFO SessionState: Created local directory: /tmp/cedf0cab-747a-45e8-8c36-53a304027587_resources
> 17/07/28 20:50:46 INFO SessionState: Created HDFS directory: /tmp/hive/spark/cedf0cab-747a-45e8-8c36-53a304027587
> 17/07/28 20:50:46 INFO SessionState: Created local directory: /tmp/spark/cedf0cab-747a-45e8-8c36-53a304027587
> 17/07/28 20:50:46 INFO SessionState: Created HDFS directory: /tmp/hive/spark/cedf0cab-747a-45e8-8c36-53a304027587/_tmp_space.db
> java.lang.RuntimeException: java.lang.RuntimeException: java.io.IOException: Permission denied
> 	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
> 	at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:209)
> 	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:238)
> 	at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:225)
> 	at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:215)
> 	at org.apache.spark.sql.hive.HiveContext.functionRegistry$lzycompute(HiveContext.scala:480)
> 	at org.apache.spark.sql.hive.HiveContext.functionRegistry(HiveContext.scala:479)
> 	at org.apache.spark.sql.UDFRegistration.<init>(UDFRegistration.scala:40)
> 	at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:330)
> 	at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:90)
> 	at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> 	at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
> 	at $iwC$$iwC.<init>(<console>:15)
> 	at $iwC.<init>(<console>:24)
> 	at <init>(<console>:26)
> 	at .<init>(<console>:30)
> 	at .<clinit>(<console>)
> 	at .<init>(<console>:7)
> 	at .<clinit>(<console>)
> 	at $print(<console>)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
> 	at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
> 	at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
> 	at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
> 	at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
> 	at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
> 	at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
> 	at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
> 	at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
> 	at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
> 	at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
> 	at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
> 	at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
> 	at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
> 	at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
> 	at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
> 	at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
> 	at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
> 	at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
> 	at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
> 	at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
> 	at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
> 	at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
> 	at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
> 	at org.apache.spark.repl.Main$.main(Main.scala:31)
> 	at org.apache.spark.repl.Main.main(Main.scala)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:750)
> 	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
> 	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: java.lang.RuntimeException: java.io.IOException: Permission denied
> 	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:515)
> 	... 62 more
> Caused by: java.io.IOException: Permission denied
> 	at java.io.UnixFileSystem.createFileExclusively(Native Method)
> 	at java.io.File.createTempFile(File.java:2024)
> 	at org.apache.hadoop.hive.ql.session.SessionState.createTempFile(SessionState.java:818)
> 	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:513)
> 	... 62 more
> <console>:16: error: not found: value sqlContext
>          import sqlContext.implicits._
>                 ^
> <console>:16: error: not found: value sqlContext
>          import sqlContext.sql
>                 ^
> {code}
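
The bottom of the trace is a plain java.io.File.createTempFile failure: on Unix, createFileExclusively throws an IOException with the message "Permission denied" when the target directory is not writable by the current user.  A minimal standalone sketch that reproduces the same exception (the directory path and file suffix here are illustrative only, not the paths Hive uses; run as a non-root user):

{code}
import java.io.File;
import java.io.IOException;

public class TempFileRepro {
    public static void main(String[] args) {
        // Illustrative directory; Hive fails in its local session scratch dir.
        File dir = new File("/tmp/unwritable-demo");
        dir.mkdirs();
        dir.setWritable(false);          // has no effect for root, so run unprivileged
        try {
            File.createTempFile("session", ".pipeout", dir);
        } catch (IOException e) {
            // Prints a "Permission denied" message, matching the "Caused by" above.
            System.out.println(e.getMessage());
        } finally {
            dir.setWritable(true);
            dir.delete();
        }
    }
}
{code}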



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)