Posted to issues@spark.apache.org by "ABHISHEK KUMAR GUPTA (JIRA)" <ji...@apache.org> on 2018/11/01 18:20:00 UTC
[jira] [Created] (SPARK-25912) Starting history server throws an exception when the local store directory is enabled
ABHISHEK KUMAR GUPTA created SPARK-25912:
--------------------------------------------
Summary: Starting history server throws an exception when the local store directory is enabled
Key: SPARK-25912
URL: https://issues.apache.org/jira/browse/SPARK-25912
Project: Spark
Issue Type: Bug
Components: Spark Core
Affects Versions: 2.3.2
Reporter: ABHISHEK KUMAR GUPTA
To reproduce the issue:
1) Create a local store folder, /opt/localDir, with no write or execute permission
2) In spark-defaults.conf, add the configuration:
spark.history.store.path=/opt/localDir
3) Start the history server
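The permission state set up in step 1 can be sketched in Java. This is a hypothetical, self-contained illustration (not Spark code); it uses a temp directory instead of /opt/localDir so it runs anywhere with a POSIX filesystem:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.attribute.PosixFilePermission;
import java.nio.file.attribute.PosixFilePermissions;
import java.util.Set;

public class LocalDirRepro {
    // Equivalent of "chmod a-wx" in step 1: keep only the owner's read bit.
    static String stripWriteAndExecute(Path dir) throws IOException {
        Set<PosixFilePermission> readOnly = PosixFilePermissions.fromString("r--------");
        Files.setPosixFilePermissions(dir, readOnly);
        return PosixFilePermissions.toString(Files.getPosixFilePermissions(dir));
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("localDir");
        System.out.println("permissions: " + stripWriteAndExecute(dir));
        // LevelDB's first action on open is roughly to create a LOCK file in
        // this directory; without write and execute permission on the
        // directory that fails with "Permission denied" (unless run as root).
    }
}
```

With the directory in this state, any attempt to create listing.ldb/LOCK inside it is denied, which is what the stack trace below shows.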
Output:
{code:java}
2018-11-01 23:42:06 INFO FsHistoryProvider:54 - History server ui acls disabled; users with admin permissions: ; groups with admin permissions
2018-11-01 23:42:07 WARN FsHistoryProvider:87 - Failed to load disk store /opt/localDir/listing.ldb :
org.fusesource.leveldbjni.internal.NativeDB$DBException: IO error: /opt/localDir/listing.ldb/LOCK: Permission denied
at org.fusesource.leveldbjni.internal.NativeDB.checkStatus(NativeDB.java:200)
at org.fusesource.leveldbjni.internal.NativeDB.open(NativeDB.java:218)
at org.fusesource.leveldbjni.JniDBFactory.open(JniDBFactory.java:168)
at org.apache.spark.util.kvstore.LevelDB.<init>(LevelDB.java:80)
at org.apache.spark.status.KVUtils$.open(KVUtils.scala:60)
at org.apache.spark.deploy.history.FsHistoryProvider$$anonfun$3.apply(FsHistoryProvider.scala:137)
at org.apache.spark.deploy.history.FsHistoryProvider$$anonfun$3.apply(FsHistoryProvider.scala:130)
at scala.Option.map(Option.scala:146)
at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:130)
at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:84)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.deploy.history.HistoryServer$.main(HistoryServer.scala:281)
at org.apache.spark.deploy.history.HistoryServer.main(HistoryServer.scala)
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.deploy.history.HistoryServer$.main(HistoryServer.scala:281)
at org.apache.spark.deploy.history.HistoryServer.main(HistoryServer.scala)
Caused by: org.fusesource.leveldbjni.internal.NativeDB$DBException: IO error: /opt/localDir/listing.ldb/LOCK: Permission denied
at org.fusesource.leveldbjni.internal.NativeDB.checkStatus(NativeDB.java:200)
at org.fusesource.leveldbjni.internal.NativeDB.open(NativeDB.java:218)
at org.fusesource.leveldbjni.JniDBFactory.open(JniDBFactory.java:168)
at org.apache.spark.util.kvstore.LevelDB.<init>(LevelDB.java:80)
at org.apache.spark.status.KVUtils$.open(KVUtils.scala:60)
at org.apache.spark.deploy.history.FsHistoryProvider$$anonfun$3.apply(FsHistoryProvider.scala:150)
at org.apache.spark.deploy.history.FsHistoryProvider$$anonfun$3.apply(FsHistoryProvider.scala:130)
at scala.Option.map(Option.scala:146)
at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:130)
at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:84)
... 6 more
{code}
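The failure surfaces deep inside leveldbjni as a NativeDB$DBException on the LOCK file. A minimal sketch of a pre-flight check (a hypothetical illustration, not Spark's actual code; class and method names are invented) that would turn this into a clear configuration error before LevelDB is opened:

```java
import java.io.File;

public class StorePathCheck {
    // Fail fast with a readable message if the configured store directory
    // cannot be used, instead of letting LevelDB hit LOCK: Permission denied.
    static void requireUsable(File dir) {
        if (!dir.isDirectory()) {
            throw new IllegalArgumentException(dir + " is not a directory");
        }
        if (!dir.canWrite() || !dir.canExecute()) {
            throw new IllegalArgumentException(
                dir + " must be writable and executable by the history server user");
        }
    }

    public static void main(String[] args) {
        File dir = new File(args.length > 0 ? args[0] : System.getProperty("java.io.tmpdir"));
        requireUsable(dir);
        System.out.println(dir + " is usable as spark.history.store.path");
    }
}
```

Note that File.canWrite()/canExecute() reflect the effective user, so a check like this run as root would pass even on a read-only directory; it is a sketch of the idea, not a complete fix.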
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)